Software testing has evolved considerably over the years. Today, the market leaders (HP, IBM, etc.) each offer a well-refined suite of quality management tools covering different types of testing, from functional and regression testing to performance testing. These products continue to evolve in response to changing economic conditions and current demands such as Agile development and improved time-to-market. But while these quality management suites cover traditional testing well, specialized areas such as SOA, BPM and the cloud have not yet been fully addressed.
These modern technologies are adopted largely in applications serving mid-sized and large enterprises. One of their key advantages, being "available on-demand," is also a challenge when it comes to testing, because such applications generally integrate several third-party systems. For example, an e-commerce application may need to talk to a mainframe for clearance. Testing these applications is difficult due to the following issues:
- Unavailable or inaccessible systems/components
- Legacy systems
- Dependent systems/components that are yet to be developed
- Third-party services (mainframe, SaaS, cloud) over which we have limited control
- Inability to execute activities in parallel, and limited availability of test lab environments
  - due to high costs
  - due to lengthy setup/configuration
- Long test environment setup times (the longer the setup, the longer the time to deliver)
- Very high operational costs for hardware, software, specialized IT resources and third-party services
- Extensive test data management time and cost to set up and tear down tests
- Lack of control over usage of third-party services
These challenges, in turn, give rise to the following issues:
i. Delays in test cycles that jeopardize timely releases
ii. Defect rates significantly higher than projected, discovered late in testing or production
iii. Poor software quality due to skipped or incomplete testing
iv. Performance bottlenecks later in the lifecycle
Tools are available for testing specialized areas such as web services; TestMaker, soapUI and WebInject are a few examples from the open source world. However, these tools do not address the challenges above.
"Service virtualization" is an area gaining traction rapidly in the market because it addresses all of the challenges noted above. Service virtualization is the practice of capturing and simulating the behavior, data and performance characteristics of unavailable or incomplete systems for unconstrained use throughout the development and testing lifecycle. Service virtualization offers the following advantages:
- Delivers realistic simulated environments at a fraction of the cost
- Provides a stand-in where the real system is not available
- Reduces regression testing time and shortens release cycles by enabling parallel development and testing
- Allows multiple virtual test environments to be created rapidly, customized for each testing team
- Eliminates critical testing constraints by virtualizing IT resources
- Reduces defects
- Provides on-demand infrastructure
- Supports any deployment model, on-premise or on-cloud
- Improves time-to-market
- Improves business agility
- Caters to any type of testing, including functional and performance testing
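To make the idea concrete, here is a minimal sketch of a virtual service: a stub HTTP endpoint that returns canned responses in place of an unavailable third-party backend (the endpoint paths and response fields are hypothetical, not from any vendor's product).

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned responses standing in for an unavailable third-party
# clearance service (paths and fields are illustrative only).
CANNED_RESPONSES = {
    "/clearance/12345": {"status": "APPROVED"},
    "/clearance/99999": {"status": "DECLINED"},
}

class VirtualServiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Silence per-request logging to keep test output clean.
        pass

def start_virtual_service(port=0):
    """Start the stub on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), VirtualServiceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_virtual_service()
    print("virtual service listening on port", server.server_address[1])
```

A test suite can point the application under test at this stub's port instead of the real backend, so testing proceeds even when the backend is offline, rate-limited or billed per call.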
Currently, a handful of vendors specialize in this market space with their own suites of products, among them:
i. iTKO LISA (now owned by CA)
ii. Green Hat (now owned by IBM)
Both of these suites support a "capture, model and simulate" workflow, along with scripting for altering or creating a model. Green Hat claims that simulation models can also be built directly, without capturing live traffic, and it provides ready-made models for common components such as databases.
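The "capture, model and simulate" workflow described above can be sketched in a few lines. This is a toy illustration, not any vendor's API: a proxy records responses from a live backend once, then replays them from the recorded model when the backend is unavailable, with hooks standing in for the scripting that alters a model.

```python
class CaptureReplayProxy:
    """Toy capture/model/simulate proxy (illustrative names only)."""

    def __init__(self, live_backend=None):
        self.live_backend = live_backend  # callable: request -> response
        self.model = {}                   # captured request -> response map
        self.hooks = []                   # scripts that rewrite responses

    def capture(self, request):
        """Forward to the live backend and record its response."""
        response = self.live_backend(request)
        self.model[request] = response
        return response

    def add_hook(self, hook):
        """Register a script that alters simulated responses."""
        self.hooks.append(hook)

    def simulate(self, request):
        """Serve from the captured model, applying any scripted changes."""
        response = self.model[request]
        for hook in self.hooks:
            response = hook(request, response)
        return response
```

Usage might look like: capture `"order-1"` while the backend is up, take the backend away, and `simulate("order-1")` still answers from the model; an added hook can then reshape that answer, mimicking how scripted models let testers vary behavior without the real system.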
The applicability of service virtualization is widespread: most systems developed today depend on many external services, and these tools form an essential part of the testing portfolio.