Danny Milrad

Use What You Have, Buy What You Need

How to Increase the Utilization of Existing Storage Resources

Companies today are very aware of the high costs associated with managing stored data and keeping this data available to business-critical applications. These management costs are escalating at a time when corporate IT organizations are looking to streamline operations to ensure that infrastructure investments lead to increases in productivity and profitability. At the same time, the pressure to manage more infrastructure resources with fewer people is at an all-time high.

Highly centralized data centers serving the needs of both internal departments and external customers are now regarded as the path to address these problems by centralizing procurement and the administration of complex systems. These data centers inherently contain massive amounts of storage, on the order of tens to hundreds of terabytes, and a heterogeneous set of server platforms suited to the needs of each application or department.

The one-size-fits-all approach to storage no longer works. Storage requirements now vary by application and even by the user of an application. File storage, for instance, has different requirements for performance, recoverability, and scalability than customer-facing Web content, internal e-mail, or mission-critical database instances. Because requirements track the relative value of data to the business, they have become diverse and complex.

This article will shed light on how to get the most out of your existing storage infrastructure, how to scale storage resources for future growth (cost effectively), and how to build an information infrastructure that is secure, manageable, and reliable enough to support mission-critical applications.
Improving Storage Utilization Drives Down Costs and Complexity

As the cost of storage hardware continues to decline, many organizations simply choose to buy more storage as their data requirements increase. Unfortunately, such a tactic isn’t a long-term solution. Besides the immediately apparent hardware and labor costs of such an approach, adding capacity also demands additional floor space, real estate, maintenance, and administrators. And every added disk array is an added potential point of system failure.

Worse, as companies add storage capacity, they accumulate systems from various hardware vendors, each with its own operating software and utilities. The environment becomes more complex with each addition, and this complexity adds expense. The IT department must have administrators who are proficient in multiple storage technologies and ensure that they are available at all times to address problems when they arise.

The most cost-effective storage, consequently, is the storage that has already been purchased. Although the concept sounds simple, analyst studies show that companies typically buy and deploy excess capacity, leaving their utilization rates between 20% and 40% (Gartner 2005 Data Center Conference). This means the real total cost of ownership of storage is far higher than anticipated, which undermines any chance of seeing a return on investment.

Understanding how to increase the utilization of existing storage resources, and justifying new purchases against that baseline, should be the primary goal before evaluating and buying any new storage. Not only does this ease the pain of data growth, but it also provides important benefits for data backup and recovery. In addition, it aligns IT with changing business needs.

Defining Storage Utilization

Companies that feel they have high levels of storage utilization probably haven’t run the numbers lately. They may look at how full their disk arrays are and assume that because they’re at 70% or 80% capacity their storage utilization is acceptable. But simply reviewing overall disk capacity use fails to address what’s being stored.

Most storage devices contain a large percentage of data that's of little or no immediate business value. Much of it may be non-business-related files such as MP3s. More often, many of the files are duplicates, old, or rarely accessed. Of the files that are clearly business-related, many may not have been used in the last 90 days or even the past year. So the question to ask is: What percentage of the capacity being used has ongoing business value?

When companies analyze storage utilization from this standpoint, the results are usually surprising, if not shocking. One financial institution recently determined, after careful analysis, that its actual storage utilization was only 8%. Other companies have confirmed estimates in the single digits as well. The challenge, then, is to better manage where and how this information is stored.
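One practical way to run these numbers is to scan a volume and classify used capacity by last-modified time. The sketch below is a hypothetical illustration, not any vendor's tool; the 90-day window and the modification-time heuristic are assumptions standing in for a real business-value classification:

```python
import os
import time

def active_utilization(root, window_days=90):
    """Walk a directory tree and return the fraction of used bytes
    belonging to files modified within the last `window_days`."""
    cutoff = time.time() - window_days * 86400
    total = active = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            total += st.st_size
            if st.st_mtime >= cutoff:
                active += st.st_size
    return active / total if total else 0.0
```

A report like this, broken out by file type or owner, is often the first step toward the single-digit findings described above.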

Virtualization Enables Storage Pooling for Better Utilization

The ability to pool storage into logical volumes has been around for some time, and yet the technology is still somewhat underutilized. Consider a situation in which a particular disk array (A) is only 50% full. If an application on another array (B) needs more storage capacity, it cannot draw on A's unused space, so the administrator must consider buying more capacity while array A sits half-empty.

Storage virtualization helps resolve the utilization problem outlined above by enabling administrators to pool all storage into logical groups that can be reallocated quickly or in real-time based on demand. The best virtualization software can do this across any storage array from a variety of vendors running under a variety of operating systems from a single management interface.

When storage resources are virtualized, they appear to administrators as a single resource. For example, two 72GB drives can be combined to create a virtual 144GB disk or volume. Data can be moved transparently across vendors and operating systems to utilize available capacity. Storage management tools also enable IT shops to classify data by age or type so that less valuable or less current data can be moved automatically to less costly storage (more about this tiered approach below). Storage utilization improves. Capital costs shrink. Additionally, new tools enable the migration of data between operating systems – from AIX to Linux, for example, or from Linux to Solaris.
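In schematic terms, a virtualized pool aggregates the capacity of heterogeneous arrays behind one allocation interface. The toy model below is illustrative only; real volume managers operate at the block level and handle striping, mapping, and failure domains. The array and volume names are made up for the example:

```python
class StoragePool:
    """Toy model of storage pooling: heterogeneous arrays are
    aggregated, and logical volumes are carved from total free space."""

    def __init__(self):
        self.arrays = {}   # array name -> physical capacity in GB
        self.volumes = {}  # volume name -> allocated size in GB

    def add_array(self, name, capacity_gb):
        self.arrays[name] = capacity_gb

    @property
    def free_gb(self):
        return sum(self.arrays.values()) - sum(self.volumes.values())

    def create_volume(self, name, size_gb):
        if size_gb > self.free_gb:
            raise ValueError("insufficient free capacity in pool")
        self.volumes[name] = size_gb

# Two 72GB devices from different vendors back one 144GB logical volume.
pool = StoragePool()
pool.add_array("array_a", 72)
pool.add_array("array_b", 72)
pool.create_volume("data_vol", 144)  # spans both physical arrays
```

The point of the abstraction is that applications request capacity from the pool, not from a specific array, so array A's idle space is no longer stranded.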

Not only does storage pooling improve utilization, but administrators also become more productive and can spend more time on other tasks, such as building business applications.

Creating a Tiered Storage Infrastructure To Improve Utilization

Another useful response to the utilization problem has been to segregate data into multiple tiers according to the cost of the hardware, freeing up expensive high-performance storage such as Fibre Channel arrays by migrating older, less-used data to lower-cost storage such as SATA. This data migration can be done based on the age, size, owner, or other attributes of the file. And it can be done in reverse if a file that was once unimportant suddenly becomes very important.
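Such a migration can be expressed as a simple rule over file attributes. The sketch below is a hypothetical policy engine, not any particular product's feature; the tier directories and the 90-day threshold are assumptions for the example:

```python
import os
import shutil
import time

def migrate_by_age(primary, secondary, max_age_days=90):
    """Move files not modified within `max_age_days` from the
    primary (expensive) tier to the secondary (cheap) tier.
    Returns the names of the files that were moved."""
    cutoff = time.time() - max_age_days * 86400
    moved = []
    for name in os.listdir(primary):
        src = os.path.join(primary, name)
        if os.path.isfile(src) and os.stat(src).st_mtime < cutoff:
            shutil.move(src, os.path.join(secondary, name))
            moved.append(name)
    return moved
```

Running the same kind of rule in the opposite direction re-promotes data that has become active again, which is the "in reverse" case mentioned above.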

Tiered architectures can reduce storage capital and operating expenses by hosting less-critical and stale data on lower-cost devices. A tiered storage strategy allows snapshot backups and point-in-time copies to be hosted on multiple tiers, replicates mirrored data to less costly storage, and uses dynamic storage tiering for active, policy-based movement of data.

Tiering storage is about recapturing high-rent primary disk space and redirecting data that doesn’t belong on a higher class of storage to secondary or tertiary targets. Implementing a tiered storage infrastructure enables organizations to utilize existing resources better, reduce management complexities, and reduce overall costs.


More Stories By Danny Milrad

Danny Milrad is senior product marketing manager of the Storage and Server Management Group at Symantec.
