Using analytics to improve testing and application delivery

Oded A. Tankus, Project Manager, Assure, discusses analytics and its implications for testing process improvement and application delivery.

The TEST Focus Groups conference was held on March 22nd at the Park Inn Hotel, London. Led by Assure’s CEO, David White, three roundtable discussions took place on the subject of ‘Using Analytics to Improve Testing and Application Delivery.’ Assure was approached to lead these roundtable discussions and welcomed 28 professionals from a variety of industries, most of whom held executive or managerial roles (68%); 50% of the participants came from the financial industry.

In the three roundtable sessions we touched on all aspects of analytics in the testing process and its implications for testing process improvement and application delivery.

This article summarises, in Q&A form, some of the topics that were discussed, and describes the participants’ best practices, insights and experiences: the metrics they used and the insights they gained in improving the test process and shortening application delivery time without compromising quality.

Organisational and Management Support

How can we get organisational and management support for the QA analytics initiative?

The organisation defines a framework that drives the QA improvement function. Management understands that their success is directly proportional to the quality of the delivered applications. Management always needs to balance between committing resources to deliver quality applications and lowering QA costs. Budgets are justified by continuously demonstrating the quality of application delivery through efficient quality processes.

The Analytics Process

How can we streamline the analytics process, maximising the extraction of meaningful insights?

The analytics process must be structured and well planned. The following are some of the points discussed:

1. Analytics process components:

  • A communications platform.
  • A feedback system.
  • Training.
  • A text analytics engine (a toy sketch follows this list).

2. Real time versus right time – information and insights must be provided at the right time and not necessarily continuously.

3. Insights – the process of interpretation must be standardised so that different managers reach the same conclusions after analysing the same data.
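
As a toy illustration of the text analytics component listed under point 1, the following Python sketch tallies keyword themes across defect summaries. The keyword list, field choices and data are illustrative assumptions only, not features of any particular tool discussed at the roundtables.

    from collections import Counter
    import re

    # Hypothetical defect summaries exported from a test management tool.
    defect_summaries = [
        "Login page times out under load",
        "Payment total rounding error on checkout",
        "Login button unresponsive after session timeout",
    ]

    # Illustrative keyword themes a simple text analytics engine might track.
    keywords = {"login", "timeout", "payment", "checkout", "load"}

    counts = Counter()
    for summary in defect_summaries:
        tokens = set(re.findall(r"[a-z]+", summary.lower()))
        counts.update(tokens & keywords)

    # Report the most frequent themes so reviewers work from a shared view of the data.
    for keyword, count in counts.most_common():
        print(f"{keyword}: {count}")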

Analytics

What should be in the analytics toolbox?

Tools that extract meaningful insights and support:

  • Descriptive analytics – creating simple counts, distributions and visualisations that describe your data (a brief sketch follows this list).
  • Predictive analytics – predicting organisational and process behaviour, e.g., can you predict at the beginning of a release how it will end?
  • Prescriptive analytics – prescribing corrective actions and suggesting mitigations once aberrations, risks, etc. have been identified.
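
As a minimal sketch of the descriptive end of this toolbox (first bullet above), the Python snippet below computes simple counts and a severity distribution from a handful of hypothetical defect records; the field names and data are assumptions made purely for illustration.

    from collections import Counter

    # Hypothetical defect records; field names are illustrative assumptions.
    defects = [
        {"id": 1, "severity": "critical", "status": "open"},
        {"id": 2, "severity": "major", "status": "closed"},
        {"id": 3, "severity": "minor", "status": "open"},
        {"id": 4, "severity": "major", "status": "open"},
    ]

    # Descriptive analytics: simple counts and distributions describing the data.
    total = len(defects)
    open_count = sum(1 for d in defects if d["status"] == "open")
    severity_distribution = Counter(d["severity"] for d in defects)

    print(f"Total defects: {total}")
    print(f"Open defects: {open_count}")
    for severity, count in severity_distribution.most_common():
        print(f"  {severity}: {count} ({count / total:.0%})")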

Dashboarding

What is the best way of developing and designing a dashboard?

‘Dashboarding’ is defined as the development of an effective dashboard – a tool that supports the process of gaining insights, allowing the information consumer to think business instead of data. Some considerations for dashboarding:

  1. Follow an accepted methodology for dashboarding.
  2. The dashboard must support a specific process/goal in order to be useful.
  3. Do not mix strategic metrics with operational metrics, since this causes ambiguity and confusion.
  4. Provide drill down capabilities to better focus on and target your issues. Define threshold indicators and colour coding to add intelligence.
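
The threshold indicators and colour coding mentioned in point 4 can be as simple as mapping a metric value onto a red/amber/green status. The sketch below is a minimal, hypothetical example; the metric names and boundary values are assumptions, not recommendations.

    # Hypothetical RAG (red/amber/green) thresholds for two quality metrics.
    # The metric names and boundary values are illustrative assumptions.
    THRESHOLDS = {
        "defect_reopen_rate": (0.05, 0.10),  # (green ceiling, amber ceiling); lower is better
        "test_pass_rate": (0.95, 0.90),      # (green floor, amber floor); higher is better
    }

    def rag_status(metric: str, value: float) -> str:
        """Map a metric value to a red/amber/green indicator for the dashboard."""
        green, amber = THRESHOLDS[metric]
        if green >= amber:  # higher-is-better metric
            if value >= green:
                return "green"
            return "amber" if value >= amber else "red"
        # lower-is-better metric
        if value <= green:
            return "green"
        return "amber" if value <= amber else "red"

    print(rag_status("defect_reopen_rate", 0.07))  # amber
    print(rag_status("test_pass_rate", 0.88))      # red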

Metrics Methodology

What is the best way to develop ‘meaningful’ metrics?

The de facto methodology that provides the most cost-effective list of metrics is the GQ(I)M (goal-question-indicator-metric) methodology, where:

  • Goals need to be defined based on organisationally adopted quality frameworks (TMMI, ISO, IEEE, etc.).
  • Questions need to be defined so that their answers meet the goals.
  • Defined metrics are a natural outcome of the questions.
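
A GQ(I)M derivation can be recorded as a simple traceable structure from goal down to metric. The Python sketch below shows one hypothetical chain; the goal, question, indicator and metric wording is invented for illustration and is not prescribed by the methodology itself.

    # One hypothetical GQ(I)M chain: goal -> question -> indicator -> metrics.
    # All names and wording below are illustrative assumptions.
    gqim = {
        "goal": "Shorten application delivery time without compromising quality",
        "questions": [
            {
                "question": "Are defects being resolved quickly enough to keep the release on track?",
                "indicator": "Trend of average open-to-close time per severity",
                "metrics": [
                    "average open-to-close time (days), by severity",
                    "number of defects open longer than 10 days",
                ],
            }
        ],
    }

    # Walk the chain top-down so every metric remains traceable to a goal.
    print(f"Goal: {gqim['goal']}")
    for q in gqim["questions"]:
        print(f"  Question:  {q['question']}")
        print(f"  Indicator: {q['indicator']}")
        for metric in q["metrics"]:
            print(f"    Metric: {metric}")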

Data

What are the major issues and misconceptions surrounding the data asset?

The two major issues are:

1. Data quality and standardisation:

  • Organisations are not aware of the low quality of their data.
  • Analytics will not solve the data quality issue, but only highlight where data quality is weak.
  • Improve the quality of the data asset to form a trustworthy base for analytics (a simple completeness check is sketched after this list).
  • Data quality issues are treated through process improvement initiatives.

2. Data structures:

  • Data must have a stable and consistent unified data structure, defined in entity-relationship diagrams (ERDs) that reflect business entities, relationships and attributes.
  • Data structures inherently contain natural hierarchies, which translate into the filtering and drill-down capabilities used when dashboarding.
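
As referenced under point 1 above, a basic completeness check over the records feeding the analytics can highlight where data quality is weak without pretending to fix it. The sketch below is a minimal example; the required fields and records are hypothetical.

    # Hypothetical defect records exported from a test management tool.
    records = [
        {"id": "D-101", "severity": "major", "status": "open", "component": "payments"},
        {"id": "D-102", "severity": "", "status": "open", "component": "login"},
        {"id": "D-103", "severity": "minor", "status": "closed", "component": None},
    ]

    # Fields the analytics depends on; chosen here purely for illustration.
    REQUIRED_FIELDS = ["severity", "status", "component"]

    # Flag records with missing or empty required fields rather than silently dropping them.
    issues = {}
    for record in records:
        missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if missing:
            issues[record["id"]] = missing

    completeness = 1 - len(issues) / len(records)
    print(f"Complete records: {completeness:.0%}")
    for defect_id, fields in issues.items():
        print(f"  {defect_id} is missing: {', '.join(fields)}")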

Processes

On which processes do organisations focus when considering the analytics initiative?

The three major quality processes that directly impact the analytics initiative are:

1. Requirements management – requirements must be modelled. Requirements modelling techniques include use cases and ERDs. In the absence of good requirements, test case documentation is used.

2. Test automation – used to decrease software delivery time by decreasing the time and effort allocated to quality activities. Effective test planning can dramatically simplify the test automation effort. Considerations for test automation include:

  • Identify high risk business areas.
  • Do not try to automate everything. Define clear criteria for automation.
  • Automate test cases that consistently pass or consistently fail, and test cases with a high number of runs (a candidate-selection sketch follows this list).

3. Defect management – effective management of the defect lifecycle will vastly increase the quality of delivered applications. Metrics in this domain are usually filtered by defect severity and include:

  • Counts and average times between defect statuses to identify bottlenecks.
  • End‑to‑end times along the defect status network, e.g., open‑close.
  • Counts and average times between cycles of inefficiencies on the network – e.g., number of defects and average times on the open‑reject‑open path.
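
To illustrate the automation criteria under point 2, the first sketch below scores test cases from their run history and surfaces stable, frequently run cases as automation candidates; the run data and thresholds are hypothetical.

    # Hypothetical manual test run history: test case id -> list of outcomes.
    run_history = {
        "TC-001": ["pass"] * 40,                     # stable, frequently run
        "TC-002": ["fail"] * 12,                     # consistently failing, frequently run
        "TC-003": ["pass", "fail", "pass", "fail"],  # flaky, rarely run
    }

    MIN_RUNS = 10    # illustrative threshold: only frequently run cases are worth automating
    STABILITY = 0.9  # illustrative threshold: share of runs with the dominant outcome

    candidates = []
    for test_id, outcomes in run_history.items():
        if len(outcomes) < MIN_RUNS:
            continue
        dominant = max(outcomes.count("pass"), outcomes.count("fail")) / len(outcomes)
        if dominant >= STABILITY:  # "always passes" or "always fails"
            candidates.append(test_id)

    print("Automation candidates:", candidates)  # ['TC-001', 'TC-002']

To illustrate the defect lifecycle metrics under point 3, the second sketch computes the average open-to-close time and counts open-reject-open cycles from status transition logs; the log format and data are again assumptions for illustration.

    from datetime import datetime

    # Hypothetical status transition logs per defect: (timestamp, status).
    transitions = {
        "D-201": [("2024-03-01", "open"), ("2024-03-04", "fixed"), ("2024-03-05", "closed")],
        "D-202": [("2024-03-01", "open"), ("2024-03-02", "rejected"), ("2024-03-03", "open"),
                  ("2024-03-09", "closed")],
    }

    def days_between(start: str, end: str) -> int:
        return (datetime.strptime(end, "%Y-%m-%d") - datetime.strptime(start, "%Y-%m-%d")).days

    open_to_close = []
    reopen_cycles = 0
    for history in transitions.values():
        statuses = [status for _, status in history]
        # End-to-end time along the open -> close path.
        if statuses[0] == "open" and statuses[-1] == "closed":
            open_to_close.append(days_between(history[0][0], history[-1][0]))
        # Inefficiency cycles: every rejection that is followed by a reopen.
        reopen_cycles += sum(
            1 for a, b in zip(statuses, statuses[1:]) if a == "rejected" and b == "open"
        )

    print(f"Average open-to-close time: {sum(open_to_close) / len(open_to_close):.1f} days")
    print(f"Open-reject-open cycles:    {reopen_cycles}")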

Conclusions

The above covers only a small portion of the subjects raised in the roundtable discussions. The major areas of discussion were data quality, test automation and the tools and techniques surrounding dashboarding, metrics programmes and analytics. Most organisations have implemented basic quality metrics for descriptive analytics; only a few have evolved to predictive and prescriptive analytical capabilities.

Edited for web by Jordan Platt
