– Jim McGann, Vice President, Index Engines, says:
It seems like everyone’s talking about unstructured data lately – the cost, the risk, the massive growth – but little is being done to control it, even as compliance regulations multiply and data breaches grow more frequent every year.
Many organizations have been too busy playing catch-up to get ahead of their unstructured data. Bogged down by legal cases, rising budgets and confusing regulations, they are now looking for the highest-impact projects that will deliver an immediate ROI.
While no single project will guarantee compliance or prevent a devastating breach, there are six crucial data projects that should not be left off the schedule in 2014.
- Proactively classify data based on risk: Getting ahead of the data tsunami by understanding what you have, where it is located and what may become a liability is a goal of many organizations. Now is the time to make that a reality. Unstructured user data must be analyzed and classified based on risk and exposure. For example, spreadsheets owned by the R&D department related to the ‘Jones project’ can be found and preserved in a legal hold archive; that content is no longer safe sitting on a user share server with only daily backups being made.
- Audit and secure PII data: Personally Identifiable Information (PII), such as Social Security and credit card numbers, is hidden throughout your network. This content is a liability and can result in high-profile data breaches if not properly managed. Search for user files and email containing PII and migrate this content to an archive, encrypt it, or purge it from the network (a minimal pattern-scanning sketch follows this list). Organizations can then ensure they are safe from hidden PII and determine the best disposition strategy for this content.
- Support the development of data policy: Classifying user data based on insightful metadata properties allows more meaningful strategies to be deployed. Classifying data by department or group within the organization, then by age and access times, and finally by owner allows disposition strategies to be developed and executed (see the classification sketch after this list). Additionally, policies can be refined and updated based on the actual classification of what exists within the data center.
- Migrate aged and outdated content offline: As user data ages it languishes online and is slowly forgotten. Classify this data by age and access times so that a migration policy can be developed. Migrating this content off the primary network onto a cloud repository, archive or offline environment allows it to be managed and pruned as the data loses its value.
- Remediate aged data on legacy backup tapes: Every organization has legacy backup tapes generated for disaster recovery purposes. These tapes contain decades of copies of user-generated files and email. Compliance, legal and regulatory requirements mandate that the content on these tapes be managed, since it can be requested at any time. Profiling legacy tapes, extracting what is required, and archiving it for future access not only streamlines legal and eDiscovery requests but also reduces offsite storage expenses.
- Audit records management policies: A records policy is only effective if it is properly enforced. Because a data center is complex and many people are involved in managing its content, regularly auditing networks and servers to ensure compliance with policy is critical. Without regular audits, data that should no longer be available can easily linger, causing challenges with legal strategies. If a company has a policy that no PSTs are allowed on the network, it can be a nasty surprise if one is uncovered during discovery (a simple PST audit sketch follows this list).
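To make the PII audit item concrete, here is a minimal Python sketch of the kind of pattern scan it describes: walking a file share and flagging plain-text files that appear to contain Social Security or credit card numbers. The share path, file types and regular expressions are illustrative assumptions, not the author's product; purpose-built indexing tools would also handle email stores, binary formats and validation.

```python
# Minimal PII scan sketch: walk a file share and flag text files that appear
# to contain Social Security or credit card numbers. Path, file types and
# patterns are illustrative assumptions, not a production scanner.
import os
import re

SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")      # e.g. 123-45-6789
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")    # loose 13-16 digit match

def scan_share(root, extensions=(".txt", ".csv", ".log")):
    """Yield (path, hit_type) for files whose text matches a PII pattern."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(extensions):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", errors="ignore") as handle:
                    text = handle.read()
            except OSError:
                continue  # unreadable file: skip; a real tool would log it
            if SSN_RE.search(text):
                yield path, "possible SSN"
            elif CARD_RE.search(text):
                yield path, "possible card number"

if __name__ == "__main__":
    # Hypothetical share path; flagged files become candidates for archiving,
    # encryption or defensible deletion, as the article suggests.
    for path, hit in scan_share("/mnt/user_share"):
        print(f"{hit}: {path}")
```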
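The policy-development and migration items both rest on classifying files by metadata such as owner and last access time. The sketch below, using an assumed share path and age thresholds, shows one way to bucket files by owning user and access age so that disposition policies and migration candidates can be counted; it is an illustration of the idea, not a description of any particular product.

```python
# Minimal metadata classification sketch: bucket files on a share by owner
# (POSIX uid) and by age derived from last access time. The share path and
# the age thresholds are illustrative assumptions.
import os
import time
from collections import Counter

def age_label(atime, now):
    """Coarse age bucket from last access time (seconds since epoch)."""
    years = (now - atime) / (365 * 24 * 3600)
    if years < 1:
        return "accessed < 1 year ago"
    if years < 3:
        return "accessed 1-3 years ago"
    return "accessed > 3 years ago"  # candidates for archive or cloud tier

def classify(root):
    now = time.time()
    counts = Counter()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                info = os.stat(path)
            except OSError:
                continue
            # st_uid is the owning user on POSIX; mapping owners to departments
            # or groups would come from a directory service in practice.
            counts[(info.st_uid, age_label(info.st_atime, now))] += 1
    return counts

if __name__ == "__main__":
    for (uid, bucket), count in sorted(classify("/mnt/user_share").items()):
        print(f"owner uid {uid}, {bucket}: {count} files")
```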
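Finally, a small sketch of the records-policy audit example: scanning assumed network share paths for PST files that violate a "no PSTs on the network" policy. A real audit would also cover email servers, archives and retention metadata.

```python
# Minimal records-policy audit sketch: walk network shares and report any PST
# files, the policy violation the article cites as a discovery surprise.
# Share paths are illustrative assumptions.
import os

def find_policy_violations(roots, banned_extensions=(".pst",)):
    """Yield full paths of files whose extension violates the records policy."""
    for root in roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.lower().endswith(banned_extensions):
                    yield os.path.join(dirpath, name)

if __name__ == "__main__":
    shares = ["/mnt/user_share", "/mnt/dept_share"]  # hypothetical audit scope
    for path in find_policy_violations(shares):
        print("policy violation (PST found):", path)
```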
The benefits of managing unstructured data include reduced risk, faster eDiscovery and regulatory compliance. With finances already tight and data growing rapidly, don’t leave these projects off the schedule in 2014.
Jim may be reached at Jim.McGann@IndexEngines.com