By some estimates, 90 percent of all the data in the world was created in the past two years, so it’s easy to understand why data storage is a growing concern. The Internet of Everything is seeing to that.
We’re drowning in data. Whether it’s kept for Big Data analytics to extract intelligence and make discoveries, to comply with data retention requirements imposed by government agencies, to guard intellectual property and trade secrets, or simply to conduct day-to-day business, data storage and security are beyond the capabilities of many companies to deal with on their own.
Virtualization, cloud data storage and storage technologies such as flash can provide some respite. Still, classifying one’s data stores for tiering, optimizing applications for virtualization (if they can be optimized) or defining data security and compliance requirements present challenges of their own … challenges that must be addressed, especially when considering outsourcing.
The advantages of outsourcing data storage to the cloud are many, not the least of which is that you don’t have to invest in and manage the storage infrastructure. Also, capacity scaling or decommissioning is a simple exercise … there when you need, gone when you don’t. Leave technology evaluations and new product integration to the provider. Once you’ve defined the parameters for storing, accessing and protecting your different classes of data, it’s the provider’s responsibility to deliver the infrastructure and technology needed to achieve your requirements.
The I/O Challenge
With the rapid increase in data generation and storage requirements, I/O bottlenecks are increasingly an issue. Response times for any application, virtualized or not, can slow to the point that productivity suffers. If latency grows too high, some applications can shut down altogether.
The traditional response has been to throw more hardware at the problem: installing faster interconnects or upgrading the SAN itself. After all that effort and expense, application performance may still not improve appreciably, or at all, if the problem lies with the application itself.
“Most I/O issues are traced back to code and/or application configuration issues,” says Andrew Mametz, Peak 10 vice president of service delivery. “Things like non-optimized queries or inefficient database application settings often create I/O lag. However, that being said, there are applications and environments that are not ideal for virtualization regardless of modifications.”
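The kind of code-level I/O lag Mametz describes often takes the form of the classic “N+1 query” anti-pattern, where an application issues one small query per record instead of a single set-based query. The sketch below illustrates the difference using an in-memory SQLite database with a hypothetical orders schema (all table and function names here are illustrative, not from any particular application):

```python
import sqlite3

# In-memory demo database with a hypothetical schema, for illustration only
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    CREATE TABLE order_items (id INTEGER PRIMARY KEY, order_id INTEGER, qty INTEGER);
""")
conn.executemany("INSERT INTO orders (customer_id) VALUES (?)",
                 [(c,) for c in range(100)])
conn.executemany("INSERT INTO order_items (order_id, qty) VALUES (?, ?)",
                 [(o + 1, 2) for o in range(100)])

def items_n_plus_one(conn):
    """Anti-pattern: one query per order means many small I/O round trips."""
    totals = {}
    for (order_id,) in conn.execute("SELECT id FROM orders"):
        row = conn.execute(
            "SELECT SUM(qty) FROM order_items WHERE order_id = ?", (order_id,)
        ).fetchone()
        totals[order_id] = row[0] or 0
    return totals

def items_single_join(conn):
    """Better: one set-based query lets the database engine do the work."""
    return {order_id: total or 0 for order_id, total in conn.execute(
        "SELECT o.id, SUM(i.qty) FROM orders o "
        "LEFT JOIN order_items i ON i.order_id = o.id GROUP BY o.id"
    )}
```

Both functions return the same totals, but the first issues 101 queries where the second issues one. No amount of faster disk fully compensates for that extra chatter, which is why tuning the query often beats upgrading the SAN.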
Can bottlenecks occur in the cloud? Yes. However, provisioning adequate storage resources to accommodate a customer’s needs, both for capacity and performance, need not be a problem. Issues arise when information about performance requirements is incomplete. This gets back to understanding data prioritization and use before settling into the cloud. “The more information we have on I/O requirements, the better our recommendation will be,” says Andrew. “Tools and calculators are a tremendous help for uncovering those needs, as well as figuring out what may not work before it can become an issue.”
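The “calculators” Andrew mentions typically start from a workload profile and estimate the IOPS a volume must sustain. A minimal sketch of that arithmetic is below; the function name, parameters and write-penalty values are assumptions for illustration, not any provider’s actual sizing formula:

```python
def required_iops(transactions_per_sec, reads_per_txn, writes_per_txn,
                  write_penalty=2):
    """Rough front-end IOPS estimate for sizing a storage request.

    write_penalty models RAID write amplification (commonly cited as
    2 for RAID 10 and 4 for RAID 5); values here are illustrative.
    """
    read_iops = transactions_per_sec * reads_per_txn
    write_iops = transactions_per_sec * writes_per_txn * write_penalty
    return read_iops + write_iops

# Example: 200 transactions/s, 4 reads and 1 write each, RAID 10 penalty
print(required_iops(200, 4, 1))  # 200*4 + 200*1*2 = 1200
```

Even a back-of-the-envelope figure like this, shared with the provider up front, is exactly the kind of information that turns a guess into a sound recommendation.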
Acting proactively can help mitigate performance issues. For example, software-based tools can help achieve greater scale and speed within the cloud storage infrastructure. These tools allow storage volumes to grow and shrink in a much more manageable way than the traditional LUN sizing most storage administrators are familiar with today.
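To see why software-defined volumes are more manageable than fixed LUNs, consider a toy model of thin provisioning, where a volume’s logical size is just metadata and physical capacity is drawn from a shared pool only as data is written. Everything here (class names, sizes) is a simplified illustration, not any vendor’s implementation:

```python
class Pool:
    """Shared physical capacity backing many thin volumes."""
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.allocated_gb = 0

    def allocate(self, gb):
        if self.allocated_gb + gb > self.physical_gb:
            raise ValueError("pool exhausted")
        self.allocated_gb += gb

class ThinVolume:
    """Toy thin-provisioned volume: consumes pool space only when written."""
    def __init__(self, pool, logical_gb):
        self.pool = pool
        self.logical_gb = logical_gb
        self.used_gb = 0

    def write(self, gb):
        if self.used_gb + gb > self.logical_gb:
            raise ValueError("exceeds logical size")
        self.pool.allocate(gb)   # physical space claimed only on write
        self.used_gb += gb

    def resize(self, new_logical_gb):
        # Resizing is a metadata change, not a data migration as with
        # a traditional fixed-size LUN.
        if new_logical_gb < self.used_gb:
            raise ValueError("cannot shrink below used space")
        self.logical_gb = new_logical_gb

pool = Pool(physical_gb=1000)
vol = ThinVolume(pool, logical_gb=500)  # logical size exceeds actual use
vol.write(50)
vol.resize(800)                         # grows instantly, no re-carving
print(pool.allocated_gb)                # only 50 GB actually consumed
```

The point of the sketch: growing the volume cost nothing physically, which is what makes capacity “there when you need, gone when you don’t” practical for a provider.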
“Another area where we leverage software and the underlying algorithms is for auto-tiering,” Andrew says. “Auto-tiering means that the storage system itself will automatically move storage workloads from slower-moving disks onto higher-performance technologies. Because not all workloads are equal and because not all workloads are 24/7, this tool enables high I/O workloads to move in and out of performance disks on an as-needed basis.”
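A greatly simplified auto-tiering pass might look like the following sketch, which promotes busy workloads to flash and demotes idle ones to capacity disk. The thresholds and tier names are illustrative assumptions; real arrays work on per-extent heat maps sampled over time rather than whole workloads:

```python
def retier(workloads, hot_threshold=5000, cold_threshold=500):
    """Toy auto-tiering pass over a dict of {name: {"iops": ..., "tier": ...}}.

    Promote workloads above hot_threshold to flash, demote those below
    cold_threshold to capacity disk, and leave everything else in place.
    """
    decisions = {}
    for name, stats in workloads.items():
        if stats["iops"] >= hot_threshold and stats["tier"] != "flash":
            decisions[name] = "flash"       # hot: promote
        elif stats["iops"] <= cold_threshold and stats["tier"] != "capacity":
            decisions[name] = "capacity"    # cold: demote
        else:
            decisions[name] = stats["tier"]  # warm: stays put
    return decisions

workloads = {
    "oltp-db":       {"iops": 9000, "tier": "capacity"},
    "nightly-batch": {"iops": 100,  "tier": "flash"},
    "file-share":    {"iops": 1200, "tier": "capacity"},
}
print(retier(workloads))
# {'oltp-db': 'flash', 'nightly-batch': 'capacity', 'file-share': 'capacity'}
```

Because the nightly batch job is only hot for a few hours, a policy like this keeps the expensive flash tier free for whichever workload needs it right now, which is the “as-needed basis” Andrew describes.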
Whether storage requirements call for high performance, performance or capacity (or all three), providing an accurate profile will go a long way toward delivering the optimum cloud solution (or internal solution, for that matter). You can also be sure requirements will grow and change continually in this data-obsessed world.
Offloading the storage portion of IT operations to a capable cloud provider will give your data professionals the bandwidth they need to put data to work solving business issues, improving competitive position and serving customers.