For Cost-Conscious Compliance Reporting, Rethink Your Data Retention Capability

The staffing costs required to generate reports for compliance audits are high, but if you have suitable access to your data, the time required to generate the reports themselves is not necessarily to blame. Today, the real challenge in compliance reporting is the cost of retaining data. In this post, we'll review the built-in data retention limitations of traditional data logging and database activity monitoring tools, explain what most organizations must do to retain data today and why that approach is not sustainable, and give an overview of available solutions so you can decide which data retention approach is most cost-effective and best positions you to meet current and future compliance reporting requirements.

In many industries today, compliance regulations require organizations to be able to produce audit trails from up to five years of data logs. Traditional native data logging, auditing, and monitoring methods are expensive and complex. Database monitoring systems in particular generate a tremendous amount of data as they oversee data access and, specifically, privileged user activity. Data volumes can easily grow into many terabytes, creating significant logistical and cost challenges for organizations that need to collect and manage this data effectively. Current Database Activity Monitoring (DAM) tools and other legacy solutions have severe data retention limitations, so many enterprises fall back on Tier 2 and Tier 3 storage, or even tape archives, to retain data for compliance auditing.

This is not an isolated problem. According to a report by Seagate, 73% of IT leaders say their organization is hampered by data retention costs. To confront these collection, management, and retention challenges, organizations must take advantage of next-generation data retention technology that leverages advances in storage efficiency, scalability, and query performance.

Failure to do so will, sooner rather than later, result in unsustainable data retention costs and insufficient compliance reporting. The most straightforward solution is to reduce the cost and footprint of database activity logging and monitoring so you can retain more data cost-effectively.

What you should be asking from your compliance reporting solution

To comply with regulations today, you need compressed, cold, cheap storage of multi-year legacy data that is still easy to access for fast, robust audit reporting. Your organization has neither the weeks required to extract legacy data from Tier 2 and Tier 3 storage or tape archives nor the staff availability to manage the process. Most solution providers offer robust reporting capabilities on par with the industry gold standard, but these reports cover only brief periods, typically 30-60 days at most. Beyond that window, the time and effort required to produce robust reports increase dramatically: restoring older data from archives for reporting can take weeks to build a sandbox, restore the data, and run the reports. You should run likely reporting scenarios against your legacy data to determine how much it would cost to produce standard compliance reports in this environment.

Another tactic some organizations use to control and simplify data retention is a Security Information and Event Management (SIEM) tool like Splunk, ingesting all their data into one location as part of an overall data security strategy. In theory, this is a good idea: storing these logs in Splunk does enable compliance with regulatory record retention requirements. In practice, however, doing so can represent a high proportion of an organization's overall data ingestion costs, because the cost of Splunk for individual organizations is determined by the volume of data ingested into the platform (GB/day). An organization with an annual term license priced at $0.88 per GB of index volume that ingests 2,600 GB of data per day spends over $835,000 annually on data retention in Splunk, not including the staff costs of managing the Splunk instance.
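The arithmetic behind that figure is straightforward. A back-of-the-envelope sketch, using the example license price and ingestion volume above (actual Splunk pricing varies by contract):

```python
# Back-of-the-envelope annual cost of volume-based Splunk ingestion,
# using the example figures from the text above.
PRICE_PER_GB = 0.88      # USD per GB under the example annual term license
DAILY_INGEST_GB = 2600   # GB ingested per day
DAYS_PER_YEAR = 365

annual_cost = PRICE_PER_GB * DAILY_INGEST_GB * DAYS_PER_YEAR
print(f"Annual Splunk ingestion cost: ${annual_cost:,.0f}")
# 0.88 * 2,600 * 365 = $835,120 per year
```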

What a cost-effective compliance reporting solution looks like

Imperva Data Security Fabric (DSF) automates and simplifies regulatory compliance activities and provides superior long-term retention of live audit data. It consolidates years' worth of database activity in Imperva's internal data lake. DSF's highly efficient NoSQL column store deduplicates and compresses data at a rate of over 50,000 events per core, and it moves data seamlessly between storage tiers (hot storage, S3 buckets, Amazon S3 Glacier, and so on) for cost avoidance. All data is available at any time: because data is "always live" rather than archived, robust reporting on multi-year data is available in minutes. APIs give the SOC, DBAs, forensics teams, and others controlled access through their tool of choice, so you can easily demonstrate compliance and conduct forensics with real-time, interactive, multi-year data exploration.
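To see why deduplication shrinks repetitive audit data so dramatically, consider this toy sketch. The event fields and counting approach are purely illustrative, not Imperva DSF internals: identical events collapse into one stored record plus an occurrence count.

```python
# Toy illustration of event deduplication before long-term storage.
# The (db_user, verb, table) event shape is hypothetical.
from collections import Counter

raw_events = [
    ("appuser", "SELECT", "orders"),    # the same routine query, repeated
    ("appuser", "SELECT", "orders"),
    ("appuser", "SELECT", "orders"),
    ("dba01",   "UPDATE", "salaries"),  # privileged activity stays distinct
]

deduped = Counter(raw_events)           # one row per unique event, plus a count
for event, count in deduped.items():
    print(event, "x", count)

reduction = 1 - len(deduped) / len(raw_events)
print(f"Stored rows reduced by {reduction:.0%}")
# 4 raw events collapse to 2 stored rows: a 50% reduction in this toy case
```

Real database audit streams are far more repetitive than this toy example, which is why columnar, deduplicated storage achieves such large reductions in practice.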

Imperva DSF takes advantage of next-generation data warehousing technology that leverages advances in storage efficiency, scalability, and query performance. By incorporating Imperva DSF into its architecture, a typical enterprise can reduce its hardware footprint (and associated costs) by more than 25% while simplifying the flow of data activity collection and accommodating retention of much larger datasets. The net result is substantial savings on infrastructure costs, plus a single-source, accessible data repository ideally suited to letting a variety of users and use cases easily leverage both current and retained activity data.

Imperva DSF makes it less costly to manage legacy data with Splunk

Organizations that use Splunk can save significantly on ingestion costs by using Imperva DSF to normalize, compress, and filter raw activity logs before Splunk ingests them. Pre-processing with Imperva DSF means Splunk needs to index only 5-30% of the original raw activity log data. In addition, Imperva DSF provides a virtual index that lets Splunk access the normalized data and run native Splunk jobs without the need to ingest, and pay for, that data. You get the best of both worlds: Splunk runs better and your compliance costs drop, while your organization retains the same ability to comply with record retention requirements.
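Plugging the 5-30% range into the earlier licensing example shows the scale of the potential savings (a rough sketch, reusing the assumed $0.88/GB price and 2,600 GB/day volume from above):

```python
# Rough annual Splunk ingestion cost if pre-processing cuts the
# indexed volume to 5-30% of the raw logs, reusing the earlier
# example figures ($0.88/GB license, 2,600 GB/day raw volume).
PRICE_PER_GB = 0.88
DAILY_RAW_GB = 2600
DAYS_PER_YEAR = 365

costs = {
    fraction: PRICE_PER_GB * DAILY_RAW_GB * fraction * DAYS_PER_YEAR
    for fraction in (0.05, 0.30)
}
for fraction, cost in costs.items():
    print(f"Index {fraction:.0%} of raw logs -> ${cost:,.0f}/year")
# versus roughly $835,000/year for ingesting the full raw volume
```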

The reduction in the number of full-time employees (FTEs) required to manage fewer servers and the cost savings on data retention alone justify the investment in Imperva DSF and compress time to value from years to weeks.

Contact us and find out how we can help control data retention costs and make compliance reporting more efficient.