When I was just starting my career, a common career path for junior IT employees was to start in quality assurance (QA), progress to development, and then perhaps move to operations or the business side of IT. Times have changed, but perceptions of QA and testing teams have not kept up.
With the industry’s focus now on delivering great software fast to address dynamic, mission-critical business challenges, and with innovations such as model-based testing, real-time test data generation and service virtualization, the work of software testing teams is even more vital and interesting. Mandates for two-week sprints from agile teams, faster release cycles, and end-to-end DevOps orchestration present great opportunities for organizations to hear, and carefully consider, the voices of testing teams to accelerate value delivery.
Software testing teams are seldom the most vocal or prominent groups along the software delivery pipeline. By the time they express their needs, actionable insights or work experiences, many of the other teams—development, product owners, and business stakeholders—have shifted to the next backlog or other priorities.
This is an unfortunate circumstance, for a number of reasons. Since testing is one of the last stops before software is released to production, software assets needing all forms of QA and test-data-proofing are fully loaded with value. Like a beautiful piece of handmade furniture awaiting final finishing, software at this stage contains substantial investments and latent value. The focus now is to bring the product over the finish line and to market in top condition with confidence.
That puts software testing teams in a unique position. Working at the intersection of development and operations and under the brutal scrutiny of business stakeholders (if software quality is not achieved), testing teams see the results of technical requirements converted to code. They also often hear from operations teams about production issues long before development or requirement owners.
One inherent challenge in learning from testing teams is that it requires us to listen differently—in other words, more empathetically and analytically—since they often raise symptoms of deeper issues.
Below is a small sample of ‘voices of the testing team’ and what we can learn from those voices. As a services architect for the DevOps, Continuous Testing and Continuous Delivery practices for Enterprise Studio by HCL, I hear these types of comments frequently, happily and with enormous respect for the responsibility these teams shoulder.
Testing of all nuanced cases
Request from testers to operations: We need a full copy of the production database to ensure that all possible nuances and application behaviors are tested. We will mask the data, bring it into test and get to work!
This request reveals that the testing team does not know all the combinations of data criteria that would let it reuse existing test data assets, recreate them, or request a targeted subset of production data. Instead, the team requests production data, presuming that most, if not all, conditions are covered in a large volume of data. Such requests put unnecessary demands on production teams, delay testing, and do not ensure adequate test coverage.
What is needed:
Model-based testing can help testers avoid manually listing all data scenarios. It also supports behavior-driven development by combining business requirements with functional test cases, and it enables the following:
- Better communication (earlier and in greater detail) during requirements definition, with input from production teams and development. This helps testing teams clarify and narrow the use cases needing scrutiny. It also relieves pressure and workload on operations, the database team, and testing at this critical point of the SDLC, resulting in high-quality test data and test results.
- Automatic generation of test cases from the application model, along with the data scenarios that support those test cases. Having the model generate the scenarios avoids over-testing and the limited or incorrect data scenarios caused by human bias or error.
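As an illustration of the idea, a model-based generator can expand a small model of equivalence classes and business rules into concrete test cases, rather than having testers enumerate them by hand. The model, parameter names, and the new-customer rule below are hypothetical; this is a minimal Python sketch of the technique, not any particular tool:

```python
from itertools import product

# Hypothetical model of an order-processing flow: each parameter lists the
# equivalence classes the model cares about.
MODEL = {
    "customer_type": ["new", "returning"],
    "payment": ["card", "invoice"],
    "order_size": ["small", "bulk"],
}

def is_valid(case: dict) -> bool:
    # Invented business rule captured in the model:
    # new customers may not pay by invoice.
    return not (case["customer_type"] == "new" and case["payment"] == "invoice")

def generate_cases(model: dict) -> list:
    """Expand the model into concrete test cases, keeping only valid paths."""
    keys = list(model)
    cases = [dict(zip(keys, values)) for values in product(*model.values())]
    return [case for case in cases if is_valid(case)]

cases = generate_cases(MODEL)
# 2*2*2 = 8 raw combinations, minus the 2 invalid new+invoice paths = 6 cases
```

Each generated case then drives both the functional test and the data scenario it needs, so coverage comes from the model rather than from whichever combinations a tester happened to remember.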
Ad hoc test data creation
Comment from testers: In order to have data to test, I have to go into the application and manually create a new customer and some transactions with the right parameters. Often, half of the steps in my test case involve data creation.
This comment reveals that the testers don't fully understand the business logic and/or data model, which adds time, effort, and uncertainty for development and testing teams. Data models are inherently complex, which means that the developers’ knowledge of the model is often limited to the specific components they are developing. To make things more complicated, the data model may span multiple applications, obscuring linkage at the database level. This can result in poor-quality code being released into production and teams struggling to find the source of problems.
What is needed:
- Data profiling: Discovery of data relationships within a single application database and across all related applications in the environment. Move from cataloging the data structure in static documents to techniques that allow applications to automatically track table relationships and table structures. This understanding serves as a data dictionary that benefits all SDLC participants.
- Use the data model for test data generation so that all relevant data elements are created and support the various test cases. Since the table relationships are automatically tracked in the solution, test data engineers need not worry about writing complex insert statements or making sure the sequence of inserts is correct.
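One way to sketch the "correct sequence of inserts" point: once profiling has discovered the table relationships, a topological sort over the foreign-key graph yields an insert order in which every parent row exists before its children. The tables and relationships below are hypothetical, and the sketch uses Python's standard-library `graphlib`:

```python
from graphlib import TopologicalSorter

# Hypothetical table relationships discovered by profiling: each table maps
# to the parent tables its foreign keys reference.
FOREIGN_KEYS = {
    "customers": set(),
    "accounts": {"customers"},
    "transactions": {"accounts", "customers"},
}

def insert_order(fks: dict) -> list:
    """Return a table order where every parent is inserted before its children."""
    return list(TopologicalSorter(fks).static_order())

order = insert_order(FOREIGN_KEYS)
# "customers" comes before "accounts", which comes before "transactions"
```

With the relationships tracked automatically, a test data engineer asks for a scenario and the tooling derives the insert sequence, rather than the engineer hand-writing ordered insert statements.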
On-demand test data search
Request to data management team: Please send me a list of IDs that meet the criteria specified in this email/spreadsheet/online form. I need these production records to begin testing.
The request reveals a number of things:
- Production teams still retrieve test data through direct ad hoc database queries.
- Data criteria may be defined in multiple databases.
- No mechanism exists for easily linking all data properties together.
The lack of reliable, consistent processes produces unreliable, inconsistent results, increased stress, and discord among teams.
What is needed:
- Data profiling: Discovery of data relationships between related applications in the environment. This allows search criteria to be defined on cross-database elements (e.g., testers can search for users from the customer profile application database based on criteria in transactional data in the transactions application database or the CRM application database).
- Enabling of test data self-service: To find, reserve, and analyze test records based on any required data entity, testers can use online forms mapped to data models to query complex data structures that span multiple systems. This eliminates the need for writing complex cross-database queries, which require precisely correct JOIN logic. It also allows connections across different database technologies.
The work required to set up self-service increases communication across database management teams, application developers, testing and operations, thus improving efficiency and holistic understanding of complex systems.
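A minimal sketch of what such a self-service search might hide from the tester: criteria defined against two separate application databases, with the join logic resolved behind a simple function. The in-memory "databases", field names, and sample records are hypothetical stand-ins for real cross-database systems:

```python
# Hypothetical in-memory stand-ins for two separate application databases.
profile_db = [
    {"customer_id": 1, "region": "EU"},
    {"customer_id": 2, "region": "US"},
]
transactions_db = [
    {"customer_id": 1, "amount": 50},
    {"customer_id": 2, "amount": 5000},
]

def find_customers(region=None, min_amount=None):
    """Resolve search criteria that span both databases.

    The tester supplies criteria through a form; the cross-database
    join logic stays hidden behind this function.
    """
    matching = {
        t["customer_id"]
        for t in transactions_db
        if min_amount is None or t["amount"] >= min_amount
    }
    return [
        p["customer_id"]
        for p in profile_db
        if p["customer_id"] in matching
        and (region is None or p["region"] == region)
    ]

# e.g. "customers with a transaction of at least 1000":
high_value = find_customers(min_amount=1000)
```

In a real deployment the two lists would be different database technologies entirely; the point of the self-service layer is that the tester never writes the cross-database query.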
Production application troubleshooting
Request to DBA team: I need to troubleshoot a customer issue in real time, so I need direct read access to the production/DR copy of the application database.
This reveals the lack of an easy mechanism for extracting a data subset and all the related entries into test environments while also retaining data integrity and securing sensitive information. Testing becomes a bottleneck. Organizations that are inefficient in addressing customer issues work harder, struggle to focus on longer-term value, and suffer lower customer satisfaction.
What is needed: a robust, repeatable sub-setting and in-flight masking process.
- A good sub-setting process automatically defines all related tables from which to retrieve data and ensures relational integrity of data. It also allows logical relationships to be added easily and the generation of required assets so the process can be executed as required.
- To ensure data security, a masking process needs to be attached to the sub-setting process that masks the data in flight while copying the data from the production environment to test environments. This way, sensitive data is not moved outside secure production environments.
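To make the two points concrete, here is a hypothetical sketch that subsets production rows by customer and masks the sensitive field in flight, before anything leaves the "production" side. Deterministic masking is used so masked values still join consistently across tables; the tables, fields, and sample data are invented for illustration:

```python
import hashlib

# Hypothetical production rows; "email" is sensitive and must never leave
# production unmasked.
PRODUCTION = {
    "customers": [{"id": 1, "email": "alice@example.com"}],
    "orders": [{"id": 10, "customer_id": 1}, {"id": 11, "customer_id": 2}],
}

def mask(value: str) -> str:
    """Deterministic masking: the same input yields the same token,
    so masked values still match up across related tables."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"user_{digest}@test.invalid"

def subset_with_masking(customer_ids: set) -> dict:
    """Copy only the requested customers plus their related orders,
    masking sensitive fields while the data is in flight."""
    customers = [
        {**c, "email": mask(c["email"])}
        for c in PRODUCTION["customers"]
        if c["id"] in customer_ids
    ]
    orders = [
        dict(o) for o in PRODUCTION["orders"] if o["customer_id"] in customer_ids
    ]
    return {"customers": customers, "orders": orders}
```

The key property is that the test environment receives a relationally intact subset in which no sensitive value ever arrives unmasked, so the tester can troubleshoot without direct production access.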
Dependent services required for testing
Comment to business: I am unable to begin testing service integrations to internal APIs until code reaches the systems integration testing environment. Also, we can’t test external services until user acceptance testing.
The organization lacks sufficient testing environments or system availability. Obtaining and maintaining the necessary testing infrastructure may be constrained by budget, the time needed to develop adequate systems, or limited access to third-party systems. These weaknesses force organizations to work sequentially rather than in parallel, impeding advance preparation.
Use service virtualization to stand in for unavailable systems. Virtual services simulate the behavior of components or full systems, including processing time, message headers, and security. This allows for the shift-left of integration and performance testing, so that testing and development begin at the same time. This lowers the cost of quality, improves the work of development, and increases the speed of value creation throughout the SDLC.
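A virtual service can be as simple as a stub that mimics the real system's response shape, headers, and processing time. The sketch below, using only Python's standard library, stands in for a hypothetical payments API; the endpoint, header name, response fields, and latency value are assumptions for illustration, not any specific product's behavior:

```python
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class VirtualPaymentsAPI(BaseHTTPRequestHandler):
    """Hypothetical virtual service standing in for an unavailable payments API."""

    SIMULATED_LATENCY = 0.05  # seconds, tuned to mimic the real system

    def do_GET(self):
        time.sleep(self.SIMULATED_LATENCY)  # simulate processing time
        body = json.dumps({"status": "approved", "txn_id": "virt-0001"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("X-Virtual-Service", "payments-v1")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep test output quiet

def start_virtual_service(port=0):
    """Start the stub on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), VirtualPaymentsAPI)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Pointing the application under test at this stub instead of the real (unavailable) dependency lets integration testing begin in earlier environments, well before systems integration or user acceptance testing.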
Finally, here are suggestions for improving an organization’s ability to hear, gather, and leverage the views and insights of testing teams to unlock value along the full SDLC:
- Find or develop ways to include testing teams during requirement, planning, and prioritization.
- Inform the testing team of the business context of the underlying code, so that the insights they share are more likely to be presented in business terms.
- Adopt processes that automate feedback and feed-forward communication among teams, including IT operations and testing.
- Move away from the mindset that testing is a labor-intensive effort that can be done by more junior, lower-cost employees.
- Change the work culture by moving toward test automation tools, model-based testing, broader adoption of advanced test data management, and service virtualization.
By listening to the experiences and actionable insights voiced by testing teams, organizations can improve work efficiencies and cross-silo collaboration. In turn, organizations can apply more resources to addressing systemic SDLC issues and bring high-quality software to market quickly and confidently.
About Enterprise Studio
Enterprise Studio by HCL Technologies helps organizations make the connections between IT and business that optimize time and multiply value for realizing the full potential of their digital business plans. Our seasoned technologists, coaches, and educators can help you unlock value from existing IT investments to become a stronger, more adaptive organization – in part by leveraging a BizOps approach so that IT outputs are strongly linked to business outcomes.
Whether you’re an established Global 500 company or a new disruptive force in your industry, we can help you navigate complexities that come with competing in an interconnected digital era. We are a global solution provider and Tier 1 global value-added reseller of Broadcom CA Technologies and Symantec enterprise software.
Many of our experts at Enterprise Studio are from the former professional services units of CA Technologies and Symantec. For decades, our teams have supported and led organizations to innovation with powerful enterprise software solutions and cutting-edge methodologies – from business and agile management to security, DevOps, AIOps, and automation.