Test Data Manager


Data breaches on the rise – so why not go synthetic?

By Anon Anon posted Feb 18, 2016 11:49 AM

  

Data breaches are becoming more prevalent, and the legislation governing them should be made clearer and tougher. That is the message of the California Data Breach Report [1], published last month by the Attorney General of California, Kamala D. Harris.

 

The report describes how the scale of data breaches is increasing at an alarming rate. In 2013, there were 167 reported data breaches in California, which affected more than 18.5 million of its 38 million consumers – a massive increase of 600% compared to the numbers affected in 2012.

 

Meanwhile, the cost of such data breaches is on the rise. In 2013, the average cost of a data breach rose by 15% compared to 2012, reaching $5.85 million in the US, while the average cost of human error was $160 per record. [2]

 

The Attorney General’s report describes how human error and deliberate misuse accounted for 22% of data breaches in California – corresponding to roughly 4 million records. In the world’s 8th largest economy, then, at $160 per record, human mishandling of data equates to costs of roughly $652 million. And that covers only just over one fifth of reported breaches.

 

In spite of this stringent legislation, the Attorney General does not think it is tough enough. She calls for stronger breach notice laws, a clarification of the roles and responsibilities of data owners and processors, and for the enforcement of final breach reporting. These suggestions each echo the upcoming EU data protection directive, due for enforcement in 2016, which will see maximum fines soar to €100,000,000, or 5% of annual turnover.

The view of legislators on both sides of the Atlantic, then, is that organizations can no longer dodge the issue of data security.

 

The 88.9% of US organizations that were not fully PCI compliant at their annual baseline assessment in 2013 (Verizon, 2014 PCI Compliance Report), for example, will have to rethink their approach to sensitive data in order to avoid hefty punishments. The EU directive will also apply to data processors outside the EU if they process the data of, or provide a service to, EU residents.

 

The Attorney General’s report further notes that a large portion (7.5 million) of the 18.5 million Californians affected by data breaches were involved in the Target data breach, which exposed 40 million credit/debit card details, along with the names, addresses, phone numbers and email addresses of up to 70 million customers.[3]

 

This breach involved the use of third party vendors, and was partly a consequence of ceding certain security rights to them.[4] The use of third party vendors increases the cost of a data breach considerably – by as much as $43 per record (Ponemon, 2013). With the data outsourcing market set to soar between now and 2018 (Frost & Sullivan, 2014), there seems to be little reason left for organizations not to implement better data security.

 

Synthetic data generation offers one way to ensure compliance with existing and upcoming legislation, even when outsourcing. Using data with all the characteristics of production data, but none of the sensitive content, mitigates the risk of the human errors and process failures that lead to data breaches.
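As a concrete illustration of the idea – and only an illustration: the field names and the Luhn-valid card generator below are a minimal sketch using Python's standard library, not CA Test Data Manager's actual engine – a synthetic record can mimic the shape and validation rules of production data while containing nothing real:

```python
import random

# Illustrative sketch only: a toy generator showing "realistic format,
# no real content". All names and values here are fabricated.

FIRST_NAMES = ["Alice", "Bob", "Carmen", "Deepak"]
LAST_NAMES = ["Nguyen", "Okafor", "Silva", "Young"]

def luhn_check_digit(partial: str) -> str:
    """Compute the Luhn check digit so generated numbers pass format validation."""
    total = 0
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:  # these positions are doubled once the check digit is appended
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def synthetic_card(rng: random.Random, prefix: str = "4") -> str:
    """A 16-digit, Luhn-valid card number that belongs to no real account."""
    body = prefix + "".join(str(rng.randrange(10)) for _ in range(15 - len(prefix)))
    return body + luhn_check_digit(body)

def synthetic_record(rng: random.Random) -> dict:
    """One customer record with production-like shape but fabricated values."""
    first, last = rng.choice(FIRST_NAMES), rng.choice(LAST_NAMES)
    return {
        "name": f"{first} {last}",
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "card_number": synthetic_card(rng),
    }

if __name__ == "__main__":
    rng = random.Random(2016)  # seeded, so runs are reproducible
    print(synthetic_record(rng))
```

Because the generated card numbers pass the same check-digit validation as real ones, downstream test systems behave as they would in production, yet a leak of this data would expose no actual customer.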

 

Synthetic data generation is the only way to eliminate the risk of a data breach entirely, and it must be the way forward for financial institutions. A further advantage is that data can be built to meet specific requirements: realistic data can be created quickly and sent to outsourcers and third party vendors without sensitive production data ever changing hands.

 

To learn more about how synthetic data generation can ensure compliance, read the CA Test Data Manager Data Sheet.

 

References:

[1] The full California Data Breach Report can be downloaded here.

[2] http://www.ponemon.org/blog/ponemon-institute-releases-2014-cost-of-data-breach-global-analysis

[3] http://www.ponemon.org/blog/ponemon-institute-releases-2014-cost-of-data-breach-global-analysis

[4] http://datalossdb.org/index/largest
