Cloud Storage to the Rescue?

July 13, 2009
Most organizations are being inundated with data -- or, more accurately, with the storage required to house and protect the many redundant copies of data that continue to be generated unabated.

In these times of cost reduction and consolidation, we must ask ourselves whether there are better ways to manage the glut. Unfortunately, companies have been grappling with this issue for more than a decade. And thus far, the technology options have come up short in addressing the core issues plaguing storage management: unbridled data growth, poor resource utilization and ineffective planning. The latest technology with promise is cloud storage. Will it work any better than the others?

Thinking back to the early days of SANs, one of the primary justifications for moving away from siloed direct-attached storage in favor of storage arrays and SAN infrastructure was the promise of greater efficiency through improved resource utilization. While SANs might have brought other benefits, such as improved availability and recoverability, utilization continues to be less than optimal in many environments.

The next big initiative was information life-cycle management. By devising a strategy for storage allocation and distribution based on the business value of data, the theory went, we could reduce quantities of expensive high-end storage and thereby shrink costs. The result: A lot of organizations bought additional tiers of storage, but not a lot of savings materialized -- at least not nearly the amounts anticipated.

A more recent technology enhancement is thin provisioning, which, although promising, remains a somewhat niche technology because of current application and operating system constraints. Another is data de-duplication, which was designed primarily to achieve a more favorable cost point for disk backup compared with tape. Both thin provisioning and data de-duplication will help drive storage efficiency in the future.

Let's not lose sight of the fact that at the same time organizations have been struggling with storage efficiency issues, the cost of an actual device to store data has continued to fall, and fall fast. So why is it so difficult to get our arms around this problem?

To a large extent, the answer lies in the continued lack of comprehensive storage management policies -- and data management policies -- within most organizations. That problem is exacerbated by the lack of metrics, reporting and trend analysis around data and storage usage. Consider, for example, the purging of data. Like a diamond, a piece of data, once created, is forever. It is typically stored, backed up, replicated and, perhaps, archived (all of which require more storage). But the likelihood that it will actually be purged is very low.

Interestingly enough, the sad state of storage management might represent a significant opportunity for cloud storage. The cloud could serve as a secondary or, more likely, tertiary tier of storage. Data -- mostly unstructured -- could then be relegated to the cloud, either manually or with an automated data mover based on aging and access policies. In addition to freeing up capacity and slowing the rate of equipment acquisition, this data would no longer need to be backed up or replicated, so the multiplier effect of redundant copies would be eliminated. Furthermore, if cloud service providers are truly service-oriented, they might well provide more comprehensive service-level agreements and reporting metrics on this data than are available internally.
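To make the idea of an automated data mover concrete, here is a minimal sketch of an age-based sweep, assuming an S3-compatible object store reached through boto3. The policy values (AGE_DAYS), the local path, the bucket name and the helper functions (is_stale, sweep) are all hypothetical illustrations, not anything prescribed by a particular product.

```python
# Minimal sketch of an age-based data mover (hypothetical policy values).
# Walks a filesystem tree, finds files not accessed within AGE_DAYS,
# copies them to an S3-compatible object store, then frees the local copy.
import os
import time

import boto3  # assumes credentials for an S3-compatible cloud target

AGE_DAYS = 180                 # aging policy: untouched for ~6 months
ROOT = "/data/unstructured"    # hypothetical local share to sweep
BUCKET = "archive-tier"        # hypothetical cloud bucket

s3 = boto3.client("s3")


def is_stale(path: str, age_days: int = AGE_DAYS) -> bool:
    """True if the file has not been accessed within the policy window."""
    cutoff = time.time() - age_days * 86400
    return os.stat(path).st_atime < cutoff


def sweep(root: str = ROOT) -> None:
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if not is_stale(path):
                continue
            key = os.path.relpath(path, root)   # preserve layout in the bucket
            s3.upload_file(path, BUCKET, key)   # relegate to the cloud tier
            os.remove(path)                     # free local capacity

if __name__ == "__main__":
    sweep()
```

A production mover would of course need more than this: verification that the upload succeeded before deleting anything, handling of filesystems mounted without access-time tracking, and a stub or symlink left behind so users can still find relocated files.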

Clearly, moving data into the cloud isn't something to be undertaken lightly. There is much to consider, including security, availability, access and control. And it's important to keep in mind that although the cloud might offer attractive pricing for some classes of data, to drive more systemic changes in cost, cloud storage must be accompanied by a well-formulated storage and data management strategy.

James Damoulakis is chief technology officer at GlassHouse Technologies Inc., an IT infrastructure consulting and services firm.