View Full Version : Reducing data footprint to better manage storage requirements
01-19-2012, 11:22 AM
What are some of the techniques and best practices for reducing the data footprint of an organization to better manage storage requirements?
01-23-2012, 04:07 AM
We all know how important data is to every business, and data volumes grow rapidly every moment.
Storage costs are roughly $6,000 per TB of data per year, so we definitely need good techniques to mitigate the data footprint; this is what we call data footprint reduction (DFR).
1) Deduplication (dedupe) removes duplicate copies so that only the refined, unique data is stored.
2) CAS (content-addressed storage) is used for archiving non-operational data. We need similar approaches to archive operational data based on how old that data is.
Data that is inactive in production should be archived locally so it can be retrieved quickly when needed.
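To make the dedupe and CAS ideas above concrete, here is a toy sketch of a content-addressed store in Python: chunks are keyed by their SHA-256 hash, so identical chunks are stored once and duplicates cost only a reference. The class name, fixed 4 KiB chunk size, and in-memory dictionary are all illustrative assumptions, not a production design (real systems use variable-size chunking and persistent indexes).

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are kept only once."""

    def __init__(self):
        self.chunks = {}        # SHA-256 digest -> chunk bytes
        self.logical_bytes = 0  # bytes written by clients
        self.stored_bytes = 0   # bytes actually kept on "disk"

    def write(self, data, chunk_size=4096):
        """Split data into fixed-size chunks and store each by its hash."""
        digests = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.logical_bytes += len(chunk)
            if digest not in self.chunks:      # new content: store it
                self.chunks[digest] = chunk
                self.stored_bytes += len(chunk)
            digests.append(digest)             # duplicate: reference only
        return digests

    def read(self, digests):
        """Reassemble the original data from its chunk digests."""
        return b"".join(self.chunks[d] for d in digests)


store = DedupStore()
ref = store.write(b"A" * 8192)   # two identical 4 KiB chunks
store.write(b"A" * 4096)         # a duplicate of an existing chunk
assert store.read(ref) == b"A" * 8192
print(store.logical_bytes, store.stored_bytes)  # 12288 logical, 4096 stored
```

This also shows why dedupe and CAS pair naturally: once data is addressed by content, duplicate elimination falls out of the addressing scheme itself.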
01-25-2012, 07:50 AM
With the world's continued obsession with creating data, the problem of a big data footprint is not only about reducing it but also about managing it efficiently. IT execs around the world are focusing more and more on this point. The floods in Thailand have had an impact on the prices of HDDs, and the usual downward trend in storage acquisition costs is being bucked, for this year at least.
Data de-duplication is an ideal option for data at rest, but not for active data, as the dedupe overhead will have a negative performance impact and reduce the ability to scale.
CAS is a very good option, especially if one considers that most of an organisation's data is static; the active archive strategy holds merit.
Active data footprints are potentially difficult to reduce, but it is possible to manage them more efficiently. Technologies such as Thin Provisioning and Dynamic Tiering can make a significant saving on the utilization of Tier 1 storage. Savings of 20% - 30% can be achieved by reclaiming over-allocated or unused storage. Based on the numbers given above, the cost saving for 100TB of historical storage capacity could be as high as $180,000 worth of storage per year, from day one. This is without taking into account future savings from efficient capacity management ($540,000 over 3 years). The ROI for this type of technology is becoming more compelling.
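As a quick sanity check on those figures, the arithmetic works out from the $6,000 per TB per year cost quoted earlier in the thread (using the 30% upper end of the reclamation range):

```python
cost_per_tb_year = 6000   # $ per TB per year, figure quoted earlier in the thread
capacity_tb = 100         # historical Tier 1 capacity in the example
reclaim_rate = 0.30       # upper end of the 20% - 30% reclamation range

saved_tb = capacity_tb * reclaim_rate          # 30 TB reclaimed
saving_per_year = saved_tb * cost_per_tb_year
print(saving_per_year)       # 180000.0 -> $180,000 per year
print(saving_per_year * 3)   # 540000.0 -> $540,000 over 3 years
```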
Combining Storage Virtualization with these technologies allows the organisation to extend the life of older storage arrays and enables those arrays to benefit from the latest storage technologies.
For more information have a look at Hu's Blog.