The future of test data management

Iain Chidgey, VP and General Manager, International, Delphix, reviews the key challenges facing testers and developers following the TEST Focus Groups event.

In today’s culture of immediacy, speed to market has never been more important to ensure organisations can conquer the competition and deliver on the promise of high quality business applications. Yet, grappling with a plethora of issues, the software testing world is rife with delay‑inducing challenges.

From access to quality data and difficulties in ensuring collaboration between teams, to deploying staff with the right skills to effect change, many have hailed agile and DevOps methodologies as the answer to these challenges. When speaking to a number of industry insiders, however, we found out exactly which bottlenecks are currently holding them back from success.

Gaining access to accurate, high quality data is like chasing gold dust

While having access to secure data on demand is a critical success factor in delivering timely and accurate releases, confessions from the testing community have shown that testing environments are regularly limited by data issues. In fact, recent research from Delphix has shown that staff are waiting a week or longer to refresh non‑production environments from production. These difficulties in managing test data have also led to bugs creeping into development cycles and impacting the bottom line.

Working together doesn’t always go to plan

Enabling teams to achieve mutual goals is also an area for improvement, with feedback indicating that when processes and outcomes don’t align, testing and development teams traditionally descend into a finger‑pointing exercise. Getting data into the hands of those who need it, when they need it, is an ongoing challenge, and collaboration has not traditionally been considered a prerequisite for success. As time is lost while teams work independently without common goals, a cultural dilemma is emerging.

Automation is the aim, not the reality

While collaboration between development and operations is important, leading organisations also encourage developers to embrace ops functions and deliver scale through automation. From software architecture and design to system admin and production support, success is synonymous with a style of IT management and implementation that places emphasis on automation and iterative delivery of software. Unfortunately, legacy infrastructure often doesn’t support modern approaches and deployment automation, and there is still a large portion of the market that has yet to fully embrace this practice and remove delay and inefficiency.

The need for speed means sacrificing on security – or does it?

Despite facing unprecedented risk of data loss, teams are opting for agility over security, deciding to move faster rather than waiting for data to be masked and secured for use in non‑production environments. As businesses expand the population of employees with access to full production data, it becomes harder to ensure data is safeguarded and protocols are adhered to.

Measurement is the missing link

While automation and collaboration are challenges impacting speed to market, they aren’t themes that can provide a tangible metric for measurement. Across the board, the way we measure productivity and success today is varied and wide ranging. From test coverage to customer feedback and the volume of bugs found in production, teams struggle to define and measure their productivity effectively.

The golden ticket

Of all the challenges our industry leaders have raised, all share one underlying connection: data. The ability to copy, secure and deliver data on demand is a critical success factor for business. Only by making the underlying data more agile can businesses reduce delays. This is where data virtualisation can step in to do the heavy lifting.

By virtualising at the data level, copies no longer need to be full duplicates; instead, data blocks can be shared. This means environments can be served up in minutes, not months. Data sets can be refreshed and reset on demand, and environments can be bookmarked and shared between users. This reduces the time it takes to provision data for applications and limits the hand‑offs between teams competing for data access.

Self‑service and automation then take precedence, empowering users to copy and share data without fear, which in turn fosters collaboration as data can finally be in the hands of those who need it, when they need it. And because data can be masked during delivery without any need for manual intervention, organisations can finally control data access and ensure security policy is up to scratch without creating bottlenecks to speed.
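To make the block‑sharing idea concrete, the sketch below is a simplified, hypothetical illustration in Python of the general copy‑on‑write technique, not Delphix’s actual implementation or API. It shows how a virtual copy can reference the blocks of a source snapshot and store only the blocks it changes, which is why provisioning or refreshing a copy can complete in minutes regardless of the size of the underlying data set.

```python
# Simplified copy-on-write sketch: a virtual copy shares the source's
# blocks and stores only the blocks it modifies. Hypothetical illustration
# of the general technique, not a real data-virtualisation product API.

class Snapshot:
    """An immutable, point-in-time set of data blocks."""
    def __init__(self, blocks):
        self.blocks = dict(blocks)   # block_id -> bytes

class VirtualCopy:
    """A writable copy that shares unchanged blocks with its snapshot."""
    def __init__(self, snapshot):
        self.snapshot = snapshot
        self.overrides = {}          # only modified blocks consume new space

    def read(self, block_id):
        # Prefer a locally modified block, otherwise read the shared one.
        return self.overrides.get(block_id, self.snapshot.blocks.get(block_id))

    def write(self, block_id, data):
        # Writes never touch the shared snapshot (copy-on-write).
        self.overrides[block_id] = data

    def refresh(self, new_snapshot):
        # A "refresh" simply repoints the copy at a newer snapshot and
        # discards local changes - no bulk data movement is required.
        self.snapshot = new_snapshot
        self.overrides.clear()

# Provisioning a test environment is just creating a lightweight object:
prod = Snapshot({1: b"customers", 2: b"orders"})
test_env = VirtualCopy(prod)
test_env.write(2, b"masked-orders")       # tester's change, stored locally
assert test_env.read(1) == b"customers"   # block 1 is still shared
```

In the same spirit, masking can be applied as blocks are delivered to a copy rather than as a separate manual step, which is how secure data can still arrive on demand.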

Edited for web by Jordan Platt
