
Automation: Meeting Expectations with Video Testing
October 28, 2020

Co-author: Banumathi Palanichamy

In today’s world, we tend to automate everyday activities for better accuracy, and technology keeps growing at a relentless pace. Automation testing emerged to relieve the burden of manual testing, and video technology is a strong component of it. Quality is broadly achieved by testing the product for the correctness, completeness, and quality of the developed software, provided the user’s experience is not affected. Automation testing and quality deliverables are essential to the success of every project, yet complexity arises under the following circumstances:

  • When a new video technology emerges and is adopted rapidly
  • When video sequences must be played in random order
  • When the experts needed to validate these changes are expensive and not always available


Video quality testing can be categorized as subjective or objective. In subjective testing, actual users validate and certify quality by playing multiple video clips and rating each one on a defined scale. Though accurate, this approach requires tedious manual effort and is not a highly repeatable process.
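The rating scale used in subjective testing is typically the Mean Opinion Score (MOS): each viewer rates a clip from 1 (bad) to 5 (excellent), and the ratings are averaged. A minimal sketch in Python (the function name is our own, not from any tool):

```python
def mean_opinion_score(ratings):
    """Average of viewer ratings on the 1-5 MOS scale used in
    subjective video quality testing."""
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must be on the 1-5 scale")
    return sum(ratings) / len(ratings)

print(mean_opinion_score([4, 5, 3, 4]))  # → 4.0
```

The averaging itself is trivial; the cost of subjective testing lies in recruiting viewers and replaying clips, which is exactly what makes the process hard to repeat.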

Alternatively, objective testing considers image attributes such as luminance, chrominance, edge sharpness, and temporal changes. By comparing these attributes, an algorithm assesses the video sequences and reports the quality of the video as data. In this way, alignment errors are not counted against the video quality of the source. This type of testing is repeatable and produces a video score.
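One common objective metric of this kind is the peak signal-to-noise ratio (PSNR), computed per frame from the mean squared error between reference and captured pixels. A minimal sketch in plain Python, assuming frames are lists of rows of 8-bit grayscale values:

```python
import math

def mse(frame_a, frame_b):
    """Mean squared error between two equally sized grayscale frames."""
    total, count = 0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for pa, pb in zip(row_a, row_b):
            total += (pa - pb) ** 2
            count += 1
    return total / count

def psnr(frame_a, frame_b, max_value=255):
    """Peak signal-to-noise ratio in dB; higher means closer frames."""
    error = mse(frame_a, frame_b)
    if error == 0:
        return float("inf")  # identical frames
    return 10 * math.log10((max_value ** 2) / error)

reference = [[100, 100], [100, 100]]
distorted = [[100, 102], [98, 100]]
print(round(psnr(reference, distorted), 2))  # → 45.12
```

In practice such metrics run over NumPy arrays or dedicated tools, and full-reference metrics such as SSIM or VMAF weigh luminance and structure more carefully; this sketch only illustrates the frames-in, score-out principle.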

Increasing Views

Video integration in websites has become a crucial element in the mix. However, video production is expensive. Here are three significant areas companies should focus on:

  • Building a robust technology infrastructure
  • Leveraging data analytics
  • Generating a substantial number of videos

The picture below depicts a general overview of how a video source feeds into streaming:

[Image: Video Source]

The features of video testing include logo identification, text verification, video file comparison, and report generation. In logo identification, alignment verification locates a known object to determine its position and orientation; inspection detects simple flaws such as missing parts or unreadable printing; and gauging measures lengths, diameters, angles, and other critical dimensions. In text verification, EPG (electronic program guide) and settings screens are verified. In video file comparison, pixelated frames, distorted frames, and black-screen frames are detected.
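Logo identification of the kind described above usually relies on template matching: slide the known logo image over each frame and report where it matches, which gives the object's position for the alignment check. A toy exact-match version in plain Python (real tools use normalized cross-correlation with a tolerance; `locate_template` is our own name):

```python
def locate_template(image, template):
    """Return (row, col) of the top-left corner where `template`
    exactly matches a region of `image`, or None if it is absent.
    Both arguments are lists of rows of grayscale pixel values."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            if all(image[r + dr][c + dc] == template[dr][dc]
                   for dr in range(th) for dc in range(tw)):
                return (r, c)
    return None

frame = [
    [0, 0, 0, 0],
    [0, 7, 7, 0],
    [0, 7, 7, 0],
    [0, 0, 0, 0],
]
logo = [[7, 7], [7, 7]]
print(locate_template(frame, logo))  # → (1, 1)
```

A `None` result means the logo is missing, which is the "missing parts" case that inspection flags.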

We expect advanced video comparison features such as:

  • Support for different fps, resolution, and file-type settings in both the actual and captured video files
  • Testing even when the actual video is recorded on one hardware setup and the captured video on another
  • A solution for spatial alignment
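Supporting different fps settings largely comes down to resampling one timeline onto the other before comparing frames. A sketch of the idea, assuming constant frame rates (`resample_indices` is a hypothetical helper, not from any library):

```python
def resample_indices(frame_count, src_fps, dst_fps):
    """Map each frame of a dst_fps timeline onto the nearest source
    frame index, so two captures at different rates can be compared
    frame by frame."""
    duration = frame_count / src_fps        # length of the clip in seconds
    dst_count = int(duration * dst_fps)     # frames in the target timeline
    return [min(frame_count - 1, round(i * src_fps / dst_fps))
            for i in range(dst_count)]

# A 2-second, 60-frame clip at 30 fps, compared against a 15 fps capture:
print(resample_indices(60, 30, 15))  # → [0, 2, 4, ..., 58]
```

Comparing a 30 fps reference against a 15 fps capture then means comparing every second reference frame; variable-frame-rate files would need per-frame timestamps instead.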

The Way Ahead

This blog post focuses on how video testing can be done.

The video can be recorded from the device under test (DUT) using the following sources:

  • Web camera
  • Video capture card

[Image: Video Source]

Audio and video synchronization testing can be added here. The captured video is split into image frames and audio waves. A chosen image is taken as the reference for image pattern comparison, and a chosen wave pattern is taken as the reference for audio. The image pattern match and sound compare features find the timestamps of the video and audio events, and the respective timestamps are recorded in an Excel sheet for verification.
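The timestamp lookup described above can be sketched as follows, assuming decoded frames and audio samples as plain lists (`find_offset` and its arguments are illustrative names, not a real library API):

```python
def find_offset(frames, ref_frame, samples, ref_wave, fps, sample_rate):
    """Locate the reference image in the frame list and the reference
    wave in the audio samples, then return both timestamps and the
    A/V offset in seconds (raises if either reference is absent)."""
    video_idx = frames.index(ref_frame)
    # Naive subsequence search for the wave pattern in the samples.
    audio_idx = next(i for i in range(len(samples) - len(ref_wave) + 1)
                     if samples[i:i + len(ref_wave)] == ref_wave)
    video_ts = video_idx / fps
    audio_ts = audio_idx / sample_rate
    return video_ts, audio_ts, video_ts - audio_ts

# Toy clip: a "slate" frame at 0.5 s (10 fps) and a click at 0.48 s (100 Hz).
frames = ["blank"] * 5 + ["slate"] + ["blank"] * 4
samples = [0] * 48 + [1, -1, 1] + [0] * 9
video_ts, audio_ts, offset = find_offset(
    frames, "slate", samples, [1, -1, 1], fps=10, sample_rate=100)
print(round(offset, 3))  # → 0.02
```

Real pipelines replace the exact matches with the image pattern match and sound compare features mentioned above, but the offset arithmetic is the same.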

In general, the recorded videos can be compared with the reference video frame by frame. The image pixel compare feature performs the comparison between the two video frames, and a PDF report is generated as output showing the matched and unmatched frames. But video testing is not limited to the comparison feature alone; it also covers configurable features such as full-reference comparison, no-reference comparison, frame syncing, a threshold for the number of mismatched frames, image matching algorithms, and frame matching algorithms.
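A minimal full-reference comparison with a mismatch threshold could look like this (the names are ours; a real tool would apply a pixel-level tolerance rather than exact equality, and would emit the PDF report):

```python
def compare_videos(actual_frames, reference_frames, max_mismatches=0):
    """Full-reference, frame-by-frame comparison.

    Returns (passed, mismatched_indices): the run passes as long as
    the number of mismatched frames stays within the threshold."""
    mismatched = [i for i, (actual, reference)
                  in enumerate(zip(actual_frames, reference_frames))
                  if actual != reference]
    return len(mismatched) <= max_mismatches, mismatched

reference = ["A", "B", "C", "D"]
capture = ["A", "X", "C", "D"]
print(compare_videos(capture, reference, max_mismatches=1))  # → (True, [1])
```

Setting `max_mismatches=0` turns the same function into a strict comparison, which is how the configurable threshold mentioned above fits in.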

So, it’s time to grasp and realize the potential of video testing from various perspectives.