September 8, 2011

Managing the Data Boom

By Joe Maglitta

“Data deluge.” “Data tsunami.” “Data explosion.” Over the last couple of years we’ve heard much about the boom, uh, rapid growth of enterprise data. It’s tempting to tune out, but don’t. Managing budget-killing information growth really is critical to enterprise sustainability. Here are two proven strategies.

But first, a quick look at the challenges. Growing data volume inevitably means adding costly storage and associated infrastructure. Copies needed for quality assurance, test/development and reporting also consume more space. So organizations face not just the cost of maintaining a large database, but everything around it as well. Plus, bloated databases can choke performance of ERP, CRM and other key enterprise apps.

Part of the problem lies with SQL itself. The database language is not very good at bulk modification: mass deletes and updates are logged and locked row by row, so purging old records is slow and disruptive. IT often tries to speed things up by putting larger databases on bigger servers. Usually this just increases software licensing fees.
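To see the pain firsthand, consider what a mass purge of old rows typically looks like. A naive DELETE can hold locks and flood the transaction log for hours, so DBAs usually chunk it. Here's a minimal sketch in DB2-style SQL with made-up names (orders, order_date); other databases use LIMIT or TOP in place of FETCH FIRST:

    -- Purge rows older than seven years, 10,000 at a time.
    -- Run repeatedly until no rows are deleted; committing each
    -- chunk keeps locks and log space bounded.
    DELETE FROM (
        SELECT 1 FROM orders
        WHERE order_date < CURRENT DATE - 7 YEARS
        FETCH FIRST 10000 ROWS ONLY
    );
    COMMIT;

Workable, but slow, script-heavy and easy to get wrong. No wonder bigger hardware looks tempting.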

So what can be done?

Strategy 1. Data Archiving
It’s estimated that less than 10 percent of companies have formal archive/deletion policies. Keeping everything forever may seem the simplest approach. But it’s expensive. And potentially dangerous: such information is discoverable and can be used against you in lawsuits and investigations.

Enterprises must identify a data archiving solution that best suits their business needs. A recent Gartner report recommends that every application have a retention policy specifying when data should be deleted. The firm also advises businesses to require that all new applications have an archiving plan. Smart archiving also improves storage asset utilization; identifies opportunities to reclaim, consolidate and optimize storage resources; and reduces operational and management costs by requiring fewer disparate devices and software tools. An archiving strategy can also be implemented with storage compression in mind, so that an archive copy takes up just 10 percent of the space of a production copy.
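Stripped to its essence, the archive-then-purge pattern behind most archiving tools looks something like the sketch below. The table names (orders, orders_archive) and the seven-year retention window are illustrative; real archiving products layer referential-integrity handling, batching and restore capabilities on top of this.

    -- Copy rows past their retention window into the archive table...
    INSERT INTO orders_archive
        SELECT * FROM orders
        WHERE order_date < CURRENT DATE - 7 YEARS;

    -- ...then remove them from production (in chunks, as shown earlier).
    DELETE FROM orders
    WHERE order_date < CURRENT DATE - 7 YEARS;
    COMMIT;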

Strategy 2. Data Compression
Sure, it’s been around a while. But compression is a simple, powerful remedy for data overload. The reason: fewer bits are needed to encode the same information. A recent report by researcher ITG, commissioned by IBM, suggests that most organizations can expect capacity reductions of 50 percent or more using data compression.

So compression sounds like a no-brainer. But what about the performance hit from using system resources to compress and decompress data? CPU usage does rise, but the I/O savings more than make up for it: because modern CPUs are far faster than the I/O channels feeding them, trading cycles for fewer reads and writes is a good performance bargain. To capture the full benefit, compress all objects in the database, including indexes and temporary data.
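As one concrete example, in IBM's DB2 (9.7 or later) turning on compression for an existing table and its indexes takes just a few statements. The table and index names here are made up, and other databases offer similar switches, such as Oracle's COMPRESS clause and SQL Server's DATA_COMPRESSION option:

    -- Flag the table and an index for compression...
    ALTER TABLE orders COMPRESS YES;
    ALTER INDEX orders_by_date COMPRESS YES;

    -- ...then reorganize so existing data is compressed and the
    -- compression dictionary is built. New rows compress on insert.
    CALL SYSPROC.ADMIN_CMD('REORG TABLE orders');
    CALL SYSPROC.ADMIN_CMD('REORG INDEXES ALL FOR TABLE orders');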

Data bloat can be a budget buster. Taming it calls for shrewd tactics for optimizing data storage and managing data growth.

Want to learn more? On Sept. 13 at 12 p.m. EDT I’ll be hosting a webcast entitled “Taming Data Growth Made Easy.” The one-hour live event will feature experts from IBM and top implementation partner Estuate. We’ll talk in more detail about the fixes above and about consolidating databases and warehouses, virtualizing storage and freeing up high-value storage space.

You can register here.
