Solving Storage Headaches: Assessing and Benchmarking for Best Practices
Learn how to accurately benchmark your data storage operations through a 360-degree review to cut costs while avoiding unintended consequences such as data loss or security exposure.
Storage is consuming a greater portion of the IT budget than ever before. According to Forrester Research, Inc., storage has grown from 10% of the IT budget in 2007 to 17%. Innovations such as deduplication and thin provisioning reduce total storage consumed, and server virtualization has helped enterprises spend less on storage. But these reductions have not offset the space needed for additional snapshots, replication (the most significant driver of storage growth rates), disk-to-disk backup and the like.
While storage growth averages a 50% to 60% compound annual growth rate (CAGR), storage prices on a per-GB basis continue to decline at 35% to 40% annually. Taken together, this means IT organizations that can limit storage growth to 45% to 50% can hold their storage budget flat.
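This arithmetic is easy to sanity-check: year-over-year storage spend is capacity growth multiplied by the per-GB price change. The sketch below uses the percentage ranges quoted above as illustrative inputs, not figures from any specific environment:

```python
def budget_multiplier(capacity_growth: float, price_decline: float) -> float:
    """Year-over-year change in storage spend: capacity grows while
    cost per GB falls. A result of 1.0 means a flat budget."""
    return (1 + capacity_growth) * (1 - price_decline)

# Unchecked 60% growth against a 35% price decline still inflates spend:
print(round(budget_multiplier(0.60, 0.35), 3))  # 1.04 -> budget grows ~4%/yr

# Limiting growth to ~50% holds spend roughly flat:
print(round(budget_multiplier(0.50, 0.35), 3))  # 0.975 -> essentially flat
```

The multiplier makes the trade-off explicit: every point of avoided capacity growth compounds against the falling per-GB price.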
Another major concern for most organizations is data protection, such as backup and recovery. Industry data indicate that a 15% to 25% nightly backup failure rate is not uncommon, whereas best practice is less than 5% failure. To counter this deficiency and to improve recovery service levels, most IT organizations have implemented aggressive data replication strategies, as well as disk-to-disk backup technologies. The result has been significantly improved recovery point objectives (RPO) and recovery time objectives (RTO). But these recovery improvements contribute to the explosion of total data stored.
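The failure-rate benchmark is simple to track nightly against the 5% best-practice threshold. A minimal sketch, with illustrative job counts:

```python
def backup_failure_rate(failed_jobs: int, total_jobs: int) -> float:
    """Fraction of nightly backup jobs that failed."""
    if total_jobs == 0:
        raise ValueError("no backup jobs recorded")
    return failed_jobs / total_jobs

BEST_PRACTICE_THRESHOLD = 0.05  # best practice: under 5% nightly failures

# Hypothetical night: 180 failures out of 1,000 jobs.
rate = backup_failure_rate(failed_jobs=180, total_jobs=1000)
status = "within" if rate < BEST_PRACTICE_THRESHOLD else "above"
print(f"{rate:.0%} failure rate; {status} best practice")
# 18% failure rate; above best practice
```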
Comprehensive Storage Assessment
Continuous improvement of the storage environment requires constant balancing. Optimizing one area in the absence of a comprehensive view may result in negative consequences in other areas. A 360-degree storage review is the best assurance against such unintended consequences.
Use a comprehensive, metric-based assessment methodology that examines every aspect of the storage environment within the context of business operations and adjacent systems. This enables you to determine current effectiveness and develop a set of recommendations to improve and optimize your storage environment. Benchmark data highlights areas that demand attention, such as mismatched hardware infrastructure and immature operational processes. This data can also identify ticking time bombs, such as inadequate data protection, SLA vulnerabilities and out-of-control data growth.
We recommend that seven major areas be examined: capacity management, performance, infrastructure, data protection, compliance, reporting, and operations. Each area has its own set of best practice benchmarks, while each discipline must be examined in the context of the others.
A comprehensive assessment methodology has three primary phases: kick-off, data collection, and data analysis.
The kick-off stage includes preliminary activities such as determining the key stakeholders, mapping the architecture of the IT infrastructure, identifying which reports are currently available and analyzing drivers for growth, both from a business and a technology standpoint. The objective is to set the stage for data collection.
The ensuing data collection segment must be meticulous and comprehensive in order to answer the following storage questions:
- What, if any, are the critical vulnerabilities? These will take the highest economic and resource priority.
- What improvements can be made at little or no cost (i.e., the "low hanging fruit")?
- What infrastructure adjustments need to be made; that is, what areas are out of compliance with best practices?
- What operational adjustments need to be made to improve operational maturity?
The data collection stage typically includes gathering questionnaires and conducting interviews with the key stakeholders. Existing reporting tools are examined, and secondary tools are deployed if needed.
The data analysis stage typically includes:
- Utilization of physical hardware.
- Classification of infrastructure based upon desired future state.
- Ease and risk associated with either migration or consolidation.
- Cost benefit analysis, if possible.
- Initial transition plan, timeline and suggested future state.
Organizations should not focus on the absolute benchmark score, but rather on the relative improvement over time. Best practices are an ideal, but continuous improvement provides provable, measurable progress that contributes to the bottom line.
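The emphasis on relative improvement can be made concrete by scoring each of the seven assessment areas at successive reviews and tracking the percentage change. The scores below are hypothetical illustrations, not part of any published scoring scheme:

```python
# Hypothetical benchmark scores (0-100) from two successive assessments,
# one per assessment area named in the methodology above.
baseline = {"capacity": 55, "performance": 70, "infrastructure": 65,
            "data_protection": 40, "compliance": 60, "reporting": 50,
            "operations": 45}
current  = {"capacity": 62, "performance": 72, "infrastructure": 66,
            "data_protection": 55, "compliance": 61, "reporting": 58,
            "operations": 52}

def relative_improvement(before: dict, after: dict) -> dict:
    """Fractional change per area -- the trend matters more than the raw score."""
    return {area: (after[area] - before[area]) / before[area]
            for area in before}

# Report areas from most to least improved.
for area, delta in sorted(relative_improvement(baseline, current).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{area:16s} {delta:+.1%}")
```

Ranking areas by relative gain, rather than comparing raw scores to an ideal, keeps attention on where the continuous-improvement effort is actually paying off.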
Read the complete white paper Solving Storage Headaches: Assessing and Benchmarking for Best Practices (PDF) or learn more about Cognizant's IT infrastructure services practice.