6 Bad Habits of Data Management

By: John Sharpe In: Data Backup & Archiving On: Jul 01, 2016

As an IT professional, you’re probably guilty of at least one or two transgressions when it comes to managing your company’s data. But can any single bad habit cause a company irreparable harm? Unfortunately, yes. At the very least, these habits can cost your organisation time and money.

Here are six bad habits of data management you should be aware of:

1. Keeping Too Much Data

With the rise of Big Data, IoT and social media, organisations are collecting and storing more data than ever. According to a 2015 IDC report, 42% of companies surveyed had no formal archiving process and chose, instead, to keep everything. This habit incurs extra costs, both to store the data and to continuously back it up or archive it.

So how do you know when you’re keeping too much? According to Forbes, “most CIOs and all general counsels know intuitively that half or more of stored data is debris.” The 2012 Compliance, Governance and Oversight Counsel (CGOC) Summit found that as much as 69% of corporate information had “no business, legal or regulatory value.” Nebulous dark data, often composed of semi-structured and unstructured data types, further complicates the issue. While the keep-everything mindset may seem prudent for IT organisations, this form of data hoarding quickly becomes unsustainable.

2. Keeping Too Little Data

While your organisation can’t keep everything, you still need to take steps to store what is important or what you are legally required to retain. Unfortunately, two issues can arise. The first: deletion sometimes occurs once data ages past a certain point, and if organisations are not careful, this accidental deletion may remove data they still need for legal, regulatory or business reasons.

The CGOC summit also found that 25% of corporate information had “current business value,” 5% was classified as an official record and about 1% was subject to litigation hold.

The second issue stems from incomplete backup or disaster recovery procedures. Here, critical applications, key files or data sets may be left unprotected because backups do not reflect the latest changes to systems or applications. It pays to verify restore capabilities via periodic trial runs, especially for mission-critical applications.
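
One lightweight way to make those trial runs routine is to script them. The sketch below is a minimal example, assuming a file-level backup that records per-file checksums in a manifest at backup time; the paths and manifest format are hypothetical and would need to be adapted to your actual backup tooling.

```python
"""Minimal sketch of a periodic restore drill, assuming a file-level backup
whose checksums were recorded in a manifest at backup time. Paths and the
manifest format are hypothetical; adapt to your backup tooling."""

import hashlib
import json
import shutil
from pathlib import Path

BACKUP_ROOT = Path("/backups/latest")          # hypothetical backup location
MANIFEST = BACKUP_ROOT / "manifest.json"       # {"relative/path": "sha256", ...}
SCRATCH = Path("/tmp/restore-drill")           # scratch area for the trial restore

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def trial_restore() -> list[str]:
    """Restore each file to scratch and return the paths whose checksums differ."""
    expected = json.loads(MANIFEST.read_text())
    failures = []
    for rel_path, recorded_hash in expected.items():
        target = SCRATCH / rel_path
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(BACKUP_ROOT / rel_path, target)   # the "restore" step
        if sha256(target) != recorded_hash:
            failures.append(rel_path)
    return failures

if __name__ == "__main__":
    bad = trial_restore()
    print("Restore drill passed" if not bad else f"Restore drill FAILED for: {bad}")
```

Scheduling something like this monthly, and treating any mismatch as an incident, is usually enough to catch backups that have quietly fallen behind production.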

3. Keeping Data All in One Place

Disaster recovery and business continuity efforts typically involve planning for local disruptions or outages as well as for site-wide and regional downtime events. When it comes to site-wide disasters, conventional wisdom suggests having multiple backup copies available for restore in other locations. This includes backup copies stored off site, even in a different region. Some organisations use cloud backup, while others rely on backup tapes stored securely off site. To avoid rolling corruption or viruses that infect the latest backups, many organisations also retain previous backup tapes remotely, fully detached from potential network corruption.

4. Keeping Data Inefficiently

This bad habit has two parts: one applies to inefficient data storage management practices, the other to one or more inefficient IT processes. The first affects organisations that keep data wherever it was first created, even when multiple copies of the same file exist or the file is ageing and has not been accessed in over two years. A simple scan, like the sketch below, can reveal how much of this duplication and staleness has accumulated.
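
This is a minimal sketch under assumed conditions: a hypothetical file share mounted at /mnt/department-share, with identical copies detected by content hash and staleness measured against the two-year threshold mentioned above.

```python
"""Minimal sketch for flagging duplicate and long-untouched files on a
hypothetical share; the two-year threshold mirrors the example above."""

import hashlib
import time
from collections import defaultdict
from pathlib import Path

SHARE = Path("/mnt/department-share")      # hypothetical file share
TWO_YEARS = 2 * 365 * 24 * 3600            # access-age threshold in seconds

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

duplicates = defaultdict(list)   # content hash -> paths with identical content
stale = []                       # files not accessed in over two years

now = time.time()
for path in SHARE.rglob("*"):
    if not path.is_file():
        continue
    duplicates[file_hash(path)].append(path)
    if now - path.stat().st_atime > TWO_YEARS:
        stale.append(path)

for content_hash, copies in duplicates.items():
    if len(copies) > 1:
        print(f"{len(copies)} identical copies: {copies}")
print(f"{len(stale)} files untouched for more than two years")
```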

The second type of inefficient data management occurs in the technical software, server and storage stack. Some organisations do not take full advantage of the space-saving features available to them, such as compression or deduplication. Such features can dramatically shrink the capacity footprint of stored or protected data, thereby reducing the amount of storage required.
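
To get a feel for how much deduplication alone might save, you can measure a data set before enabling anything. The sketch below is illustrative only: it chunks files at an assumed fixed 4 MB size, hashes each chunk and compares unique bytes to raw bytes. Real deduplication engines typically use variable-size chunking and add compression on top, so treat the output as a rough lower bound.

```python
"""Minimal sketch of how deduplication shrinks a capacity footprint: chunk
files at a fixed size, hash each chunk, and compare unique bytes to raw bytes.
The directory and 4 MB chunk size are illustrative assumptions."""

import hashlib
from pathlib import Path

DATASET = Path("/mnt/backup-staging")   # hypothetical directory to measure
CHUNK_SIZE = 4 * 1024 * 1024            # fixed-size chunks; real systems often vary this

unique_chunks = {}     # chunk hash -> chunk length in bytes
raw_bytes = 0

for path in DATASET.rglob("*"):
    if not path.is_file():
        continue
    with path.open("rb") as fh:
        while chunk := fh.read(CHUNK_SIZE):
            raw_bytes += len(chunk)
            unique_chunks.setdefault(hashlib.sha256(chunk).hexdigest(), len(chunk))

deduped_bytes = sum(unique_chunks.values())
ratio = raw_bytes / deduped_bytes if deduped_bytes else 1.0
print(f"Raw: {raw_bytes / 1e9:.1f} GB, after dedup: {deduped_bytes / 1e9:.1f} GB "
      f"(~{ratio:.1f}:1 reduction, before any compression)")
```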

5. Keeping Data on the Wrong Type of Storage Media

Not all data warrants storage on an organisation’s most deluxe system. To face the data onslaught (or the “zettabyte apocalypse”), many organisations have begun to implement tiered storage and data protection architectures.

This might mean that only mission-critical and/or frequently accessed data resides on costly Tier 1 storage systems. To implement tiered storage or data protection, organisations typically apply manual or automated policy rules to their data. Such rules determine when it makes sense to move or migrate certain types of data to lower-cost disk or tape “tiers,” or even when it makes sense to archive or delete certain data.
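
As a rough illustration of what such an automated rule might look like, here is a minimal sketch keyed on last access time. The thresholds, tier names and volume path are assumptions made for the example; a real policy engine would also weigh business value, retention schedules and legal holds before moving or deleting anything.

```python
"""Minimal sketch of an automated tiering rule keyed on last access time.
Thresholds, tier names and the volume path are illustrative assumptions."""

import time
from pathlib import Path

SHARE = Path("/mnt/tier1-volume")        # hypothetical Tier 1 file system
DAY = 24 * 3600

RULES = [                                # (max age in days, recommended placement)
    (30, "keep on Tier 1 (primary flash/disk)"),
    (365, "migrate to Tier 2 (lower-cost disk)"),
    (3 * 365, "archive to tape or a cloud archive tier"),
]
DEFAULT_ACTION = "candidate for deletion review"

def placement(last_access: float, now: float) -> str:
    """Map a file's access age onto the first matching tiering rule."""
    age_days = (now - last_access) / DAY
    for max_days, action in RULES:
        if age_days <= max_days:
            return action
    return DEFAULT_ACTION

now = time.time()
for path in SHARE.rglob("*"):
    if path.is_file():
        print(f"{path}: {placement(path.stat().st_atime, now)}")
```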

6. Keeping Data Without Regard to Its Lifecycle Needs

Bad Habit #6 actually runs through the previous five bad habits of data management. After all, each of these errors of commission or omission can be prevented once you decide to handle your data based on its lifecycle needs. This means you must address your data’s needs from cradle (when the data is first created) to grave (when the data has outlived its usefulness and can be safely deleted). At first, managing data across its complete lifecycle might appear to be no small feat. However, outside experts can help make the initial process less painful and much more rewarding.
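
In practice, “cradle to grave” usually means attaching a retention class to data at creation and computing a disposal date from it. The sketch below is a minimal illustration under assumed data classes and retention periods; real schedules come from your legal, regulatory and business requirements, and a legal hold must always override them.

```python
"""Minimal sketch of lifecycle ("cradle to grave") bookkeeping: each data
class carries a retention period from creation, after which the item becomes
eligible for defensible deletion. Classes and periods are illustrative only."""

from datetime import date, timedelta

# Hypothetical retention schedule (data class -> retention period from creation).
RETENTION_SCHEDULE = {
    "financial_record": timedelta(days=7 * 365),
    "project_file": timedelta(days=3 * 365),
    "system_log": timedelta(days=90),
}

def disposal_date(data_class: str, created: date) -> date:
    """Return the earliest date the item can be safely deleted."""
    return created + RETENTION_SCHEDULE[data_class]

def is_disposable(data_class: str, created: date, on_legal_hold: bool = False) -> bool:
    """An item is disposable once retention has lapsed and no legal hold applies."""
    return not on_legal_hold and date.today() >= disposal_date(data_class, created)

print(disposal_date("system_log", date(2016, 4, 1)))          # 2016-06-30
print(is_disposable("financial_record", date(2010, 7, 1)))    # depends on today's date
```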

Now that you’re familiar with the six bad habits of data management, get ready to test your data management IQ with our new guide.



About the author

John Sharpe

John Sharpe is Director of Product Management for Iron Mountain’s Data Management business. In this role, he is responsible for developing and implementing strategies for backup, disaster recovery, and archiving. Creating new offerings that will allow our customers to extract more value from their media, whether new or archival, is central to Mr. Sharpe’s work. His primary overall objective is to ensure that Iron Mountain is a trusted information partner for our customers – and much more than a storage vendor. Mr. Sharpe has over 15 years of experience in engineering, corporate strategy, and product management. He holds a BA in computer science from Boston College and an MBA in finance from Yale.