The General Data Protection Regulation (GDPR) is increasingly causing headaches for senior IT management. In the media the main focus is on consent, but there are many other areas that need attention. One of them is application testing.

With our experience in large mainframe IT organizations, we know that mainframe applications are often complex and that mainframe skills are becoming increasingly scarce. A lack of documentation, combined with the integration of increasingly complex and sophisticated customer-facing applications driven by mobile, is making mainframe application testing complicated.

For this reason, many companies have turned to outsourcing to keep all of their critical applications ticking. Working with a third party makes testing even more important. To ensure that an application will not cause problems when it is integrated into the real-world IT environment, you need to know how it will behave. The application testing environment therefore needs to reflect reality as closely as possible, which means using customer data to test the applications. However, fears over data protection are rightly standing in the way.

Choosing Security Over Quality

According to a global CIO survey commissioned by Compuware and conducted by Vanson Bourne, many companies are failing to use customer data out of fear of data protection laws. In fact, almost a third (30 percent) of respondents said they do not provide their outsourcer with customer data at all. This can have a serious impact on quality, forcing companies to increase investment in internal QA teams and subjecting them to higher error and bug rates in application code. As a result, troubleshooting and fixing bugs takes longer, delaying the launch of new applications and driving up the overall cost of mainframe application testing, development and maintenance.

On the other hand, many companies are choosing quality over security. According to the CIO survey, 20 percent of companies do not mask their data before passing it to outsourcers for use in application testing, as they fear doing so will impact the quality of their QA processes. While a poorly tested application could cause an expensive system failure, the long-term reputational damage from a data breach can be just as serious, not to mention the fines that can be incurred from falling foul of data protection legislation.

New GDPR Legislation

The GDPR, adopted by the EU Parliament in April 2016, states that customer data can only be used for the purposes for which it was provided to the organization, and testing is rarely one of them. The new law also holds both the data controller and the data processor accountable for a data breach, which for obvious reasons affects the relationship with outsourcers.

Big fines can be incurred in the event of a breach if a company is found to be non-compliant. Yet despite this, 43 percent of the CIO survey respondents that share customer data are unsure about the data protection laws and regulations in place, and therefore do not know whether they are exposing their company to risk. Moreover, 87 percent of companies that share unprotected customer data believe they can rely on NDAs to protect them in a worst-case scenario. This is a risky strategy; in reality, how enforceable is an NDA? If a breach were to occur, would customers accept that all a company had put in place to protect their data was a piece of paper? It is also worth considering whether, even if the outsourcing company upholds the NDA, it would be much of a deterrent to an individual.

How Masking Sensitive Data Can Help With Application Testing

In an attempt to side-step issues around data protection while still enabling customer data to be utilised in a test environment, many companies have turned to data masking as a solution. Data masking obscures or replaces sensitive fields of information so that test data can no longer identify the customer, while remaining realistic enough to test against.
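To make this concrete, the short Python sketch below shows one way a masking step might work: sensitive fields are replaced with fictitious, format-preserving values before the data is handed to a test environment. The record layout, field names and masking rules here are illustrative assumptions, not a description of any particular product's approach.

```python
import hashlib
import random

# Hypothetical customer record; the field names are illustrative only.
record = {
    "customer_id": "C-48213",
    "name": "Jane Smith",
    "email": "jane.smith@example.com",
    "card_number": "4556737586899855",
}

def mask_name(name: str) -> str:
    """Replace a real name with a deterministic pseudonym, so the same
    input always maps to the same masked value across test runs."""
    digest = hashlib.sha256(name.encode()).hexdigest()[:8]
    return f"Customer-{digest}"

def mask_email(email: str) -> str:
    """Keep the email format but discard the real address."""
    local = hashlib.sha256(email.encode()).hexdigest()[:10]
    return f"{local}@test.invalid"

def mask_card(card: str) -> str:
    """Preserve length and the last four digits, replacing the
    identifying digits with random ones."""
    masked = "".join(str(random.randint(0, 9)) for _ in card[:-4])
    return masked + card[-4:]

masked = {
    "customer_id": record["customer_id"],  # non-sensitive key kept for referential integrity
    "name": mask_name(record["name"]),
    "email": mask_email(record["email"]),
    "card_number": mask_card(record["card_number"]),
}

print(masked)
```

Keeping non-sensitive keys intact, as in this sketch, is what lets masked records still join up across tables and systems during testing.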

Businesses increasingly expect IT to respond rapidly with new, high-quality applications that deliver a competitive edge, so time-to-value in application development is a key priority. Companies cannot afford delays caused by test data creation and need solutions that speed up the process while still delivering high quality. They should automate and simplify the creation of test data, taking an approach that keeps the quality of application testing high and remains compliant, in a timely way.

Taking an automated approach to test data management allows companies to scramble, translate, generate, age, analyse and validate test data swiftly and safely. By doing so, companies can create and protect test data while ensuring consistency across multiple data types in both mainframe and distributed applications. In addition, by automating the test data production process, companies can incorporate consistent business rules into a single interface. This helps standardise data manipulation and enhance productivity, allowing specialised disguise routines to be created easily when needed.
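As a rough illustration of what "consistent business rules" and "disguise routines" can look like in practice, the Python sketch below registers a small set of rules once and applies them to every record, with a simple validation pass to confirm that no sensitive value survives. The field names, rules and checks are assumptions made for the sake of the example; a real test data management tool would offer far more than this.

```python
from datetime import date, timedelta
import random

# Hypothetical disguise routines, registered once against field names so
# every record is transformed by the same business rules.
RULES = {
    "surname": lambda v: random.choice(["Alpha", "Bravo", "Carter", "Delta"]),
    "postcode": lambda v: v[:2] + "0 0AA",  # keep the area prefix, drop the rest
    "date_of_birth": lambda v: v - timedelta(days=random.randint(1, 365)),  # "age" the date
}

def disguise(record: dict) -> dict:
    """Apply every registered rule to the matching fields of a record."""
    out = dict(record)
    for field, rule in RULES.items():
        if field in out:
            out[field] = rule(out[field])
    return out

def validate(masked: dict, original: dict) -> bool:
    """Simple check that no sensitive value survived the disguise step."""
    return all(masked[f] != original[f] for f in RULES if f in original)

production_rows = [
    {"surname": "Smith", "postcode": "SW1A 1AA", "date_of_birth": date(1980, 5, 17)},
    {"surname": "Jones", "postcode": "M1 2AB", "date_of_birth": date(1975, 11, 2)},
]

test_rows = [disguise(row) for row in production_rows]
assert all(validate(t, p) for t, p in zip(test_rows, production_rows))
print(test_rows)
```

Because the rules live in one place, the same disguise logic can be reused across every extract, which is what keeps masked test data consistent between runs and between teams.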

By taking this approach to test data generation, organizations can make test data anonymous while still allowing development teams to test applications and remain productive. And because the process is automated, rather than relying on a specialist team to extract production data and build test data by hand, it is simpler and more cost-effective. There is no doubt that the trade-off between security and quality is a difficult one, but by taking this approach organizations can create a culture of agile development while keeping costs low and shielding the company from the risk of costly data breaches.