Video Testing for Broadcasters

This paper explores the challenges broadcasters face when assessing video quality. The combination of video processing, compression, transmission, and decompression determines the system’s performance. While video quality relates most directly to video processing and compression efficiency, broadcasters should not dismiss transmission errors or set-top box output. Video quality should be evaluated throughout the development, quality assurance, and deployment phases.

Most users expect that a single video test and measurement system can produce an exact metric of video quality. Unfortunately, no simple metric (or test device) exists that can comprehensively measure and categorize video quality. The ITU has formed a group, the Video Quality Experts Group (VQEG), to study objective measurement tools. Its conclusions thus far are captured in the “VQEG Report on the Validation of Objective Models of Video Quality Assessment.”

Further reading includes “Can Objective Metrics Replace the Human Eye?” and “Can Video Quality Testing be Scripted?”

Factors Involved in Video Quality

Video processing and compression algorithms change the characteristics of the original program in the quest to reduce the bandwidth needed to send the programming to the home. The art is to do this without letting the audience perceive a change in video quality. Successful video processing and compression algorithms perform the desired modifications while presenting a result that, subjectively, looks natural and realistic to the viewer. This sounds difficult, but it is necessary when transmitting many channels of high-quality programming.

Broadly speaking, five factors affect the video quality:

  • Video Pre-Processing
  • Constant Bit Rate (CBR) versus Variable Bit Rate (VBR)
  • Inherent video compression efficiency
  • Transmission Quality/Error Recovery
  • Inherent video decompression efficiency

Video Testing Challenges

Each broadcaster – traditional or webcaster – must deal with a rapidly changing variety of programming, new video processing algorithms, and new compression algorithms. Video processing and compression companies continuously invent sophisticated ways to reduce huge bandwidth requirements to manageable levels. How can broadcasters know whether a new algorithm is better than their current choice?

Broadcasters invite the various video processing, compression, transmission, and decompression companies into their R&D facilities and perform side-by-side tests, also known as a “bake-off.” Each vendor starts with the same source material and does its best to reduce the bandwidth while keeping the video quality high.

While an objective measurement would create a repeatable quantitative number, VQEG is not prepared to propose one model over another; in other words, objective metrics are still evolving. In the meantime, subjective side-by-side comparisons are still needed.

The broadcaster shows the results to a group of experts and asks them which is best. This is termed subjective video analysis, and it measures the overall perceived video quality. The most commonly used evaluation method is the Mean Opinion Score (MOS), recommended by the ITU. It consists of having several experts view known distorted video sequences and rate their quality according to a predefined quality scale. In doing so, the expert viewers build a mapping between the quality scale and a set of processed video sequences. After this “training” is complete, the subjects are asked to rate the new video processing algorithms.
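To make the scoring step concrete, the tally might look like the following minimal Python sketch (standard library only); the clip names and ratings are hypothetical:

    # Minimal sketch: tally a Mean Opinion Score (MOS) from viewer ratings
    # on the ITU-style 5-point quality scale (1 = bad ... 5 = excellent).
    # Clip names and ratings are hypothetical.
    import statistics

    ratings = {
        "codec_A_4mbps": [4, 5, 4, 3, 4, 4],
        "codec_B_4mbps": [3, 3, 4, 2, 3, 3],
    }

    for clip, scores in ratings.items():
        mos = statistics.mean(scores)
        spread = statistics.stdev(scores)
        print(f"{clip}: MOS = {mos:.2f} (stdev {spread:.2f}, n = {len(scores)})")

A real study would also screen outlier viewers and report confidence intervals, following ITU practice.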

Simply stated, the test setup is:

  • Start with a known video sequence.
  • The new video processing system alters the video sequence.
  • Display the original and processed video sequences.
  • Bring in experts to subjectively vote.

Complexity arises because:

  • New video processing systems may need new equipment to play back the video sequences.
  • The original and processed video sequences should be displayed in random order (a shuffling sketch follows this list).
  • Expert viewers are expensive and do not produce repeatable results.
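The randomized-ordering point above can be sketched in a few lines of Python; the clip names are hypothetical, and a real harness would drive the video server rather than print:

    # Minimal sketch: shuffle the presentation order of original and
    # processed clips for each session so viewers cannot anticipate
    # which variant is playing. Clip names are hypothetical.
    import random

    clips = [
        ("sequence_1", "original"), ("sequence_1", "processed"),
        ("sequence_2", "original"), ("sequence_2", "processed"),
    ]

    session = clips.copy()
    random.shuffle(session)
    for name, variant in session:
        print(f"play {name} ({variant})")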

Video Quality Testing Methodologies

Video quality testing can be done in one of two ways: subjectively or objectively. More information is available in the companion discussions of Subjective Testing and Objective Testing methodologies.

In the subjective case, experts view multiple test clips and vote based on a quality scale (usually 1-5). The test equipment must play the video sequences in a predefined order and allow the expert time to vote. This is a tedious exercise and is not highly repeatable, but because it is based on actual viewers, it is accurate.

In the objective case, an algorithm “watches” the video sequence and measures luminance, chrominance, blockiness, edge sharpness, and temporal changes. This data is then correlated with the source video sequence, and an assessment is made about quality. To do this, care must be taken to line up the data spatially and temporally so that alignment errors do not affect the video score.
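As an illustration of the full-reference idea, the sketch below computes PSNR between two aligned 8-bit luma frames using Python/NumPy. PSNR is used here only because it is simple; production tools favor more perceptually weighted metrics, and the frame data is synthetic:

    # Hedged sketch of a full-reference comparison, assuming the source and
    # processed sequences are already spatially and temporally aligned and
    # loaded as 8-bit luma frames (NumPy arrays).
    import numpy as np

    def psnr(reference: np.ndarray, processed: np.ndarray) -> float:
        # Peak signal-to-noise ratio between two 8-bit frames, in dB.
        mse = np.mean((reference.astype(np.float64) - processed.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical frames
        return 10.0 * np.log10(255.0 ** 2 / mse)

    # Synthetic data: a flat gray frame and a slightly noisy copy.
    ref = np.full((480, 720), 128, dtype=np.uint8)
    rng = np.random.default_rng(0)
    noisy = np.clip(ref.astype(int) + rng.integers(-3, 4, ref.shape), 0, 255).astype(np.uint8)
    print(f"PSNR = {psnr(ref, noisy):.2f} dB")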

Regardless of the methodology, the video test setup must be repeatable. Ideally, the video scoring is also repeatable.

Video Quality Testing Equipment

To streamline the process, video quality testing equipment should be able to capture, play, and analyze any two video sequences. Further, as new input/output modules are continuously under development, the test equipment should use an open-architecture approach to ease upgrades.

The following are the key attributes of a robust video quality testing tool:

  • Allow a way to import video sequences regardless of their file type – e.g., AVI, QuickTime, Raw, Video Editor, MPEG, etc.
  • Convert all video sequences to a user-selectable resolution, bit depth, and color format so that they can be displayed in multiple viewing modes on the same display.
  • Serve video sequences to the encoder and/or video processing unit using SDI, Component, DVB-ASI or DVI.
  • Capture the output of the encoder or the encoder/decoder pair.
  • Align the captured and played-out video sequences both spatially and temporally (a temporal-alignment sketch follows this list).
  • Allow multiple playing modes such as play, shuttle, jog, pause, zoom and pan.
  • Apply full-reference and no-reference objective metrics to the video sequences to score the video.
  • Log/graph the objective scores for easy analysis.
  • Export pieces of video sequences for further off-line analysis.
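The alignment item above deserves a sketch. The hypothetical Python/NumPy routine below estimates only the temporal offset by correlating per-frame mean-luma signatures; a real aligner would also solve for spatial shift, scaling, and gain:

    # Hedged sketch: estimate the frame offset between the played-out and
    # captured sequences by cross-correlating per-frame mean-luma signatures.
    # Both inputs are lists of NumPy luma frames.
    import numpy as np

    def temporal_offset(reference_frames, captured_frames, max_lag=30):
        ref_sig = np.array([f.mean() for f in reference_frames], dtype=np.float64)
        cap_sig = np.array([f.mean() for f in captured_frames], dtype=np.float64)
        ref_sig -= ref_sig.mean()
        cap_sig -= cap_sig.mean()
        best_lag, best_corr = 0, -np.inf
        for lag in range(-max_lag, max_lag + 1):
            n = min(len(ref_sig), len(cap_sig)) - abs(lag)
            if n <= 1:
                continue
            a = ref_sig[max(0, -lag):max(0, -lag) + n]
            b = cap_sig[max(0, lag):max(0, lag) + n]
            corr = float(np.dot(a, b))
            if corr > best_corr:
                best_lag, best_corr = lag, corr
        return best_lag  # lag < 0: capture starts |lag| frames into the reference

    # Synthetic check: dropping the first 5 reference frames should report -5.
    rng = np.random.default_rng(1)
    frames = [np.full((8, 8), v) for v in rng.normal(size=60)]
    print(temporal_offset(frames, frames[5:]))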

Setting up Consistent Tests

Simplicity is the key to any test. Aim to change only one variable at a time; a test-matrix sketch after the list below illustrates this.

  • Select a known test sequence, either one of the VQEG sequences or something indicative of your broadcast material.
  • Use serial digital video as the input to the video processor and/or video compression unit.
  • Build out a network into which you can inject known errors.
  • Use a common set-top box or video decompression unit. For ultimate video quality, select one with SDI output.
  • Capture the video decompression output to compare it with the input.
  • Compare the source and resultant video sequences on either one monitor with pan/scan or on two monitors.
  • If two monitors are used, calibrate both to the same black level, contrast, etc. using a known test pattern.
  • Store the results of the objective metrics, and the subjective MOS score.
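A test matrix honoring the one-variable-at-a-time rule might be generated as below; the sequence name, bit rates, and error rates are all hypothetical:

    # Hedged sketch: sweep one parameter at a time around a fixed baseline.
    # All parameter names and values are hypothetical.
    baseline = {"sequence": "vqeg_src01", "bitrate_mbps": 8.0, "packet_loss_pct": 0.0}

    test_cases = []
    for rate in (2.0, 4.0, 6.0, 8.0):            # sweep bit rate only
        test_cases.append({**baseline, "bitrate_mbps": rate})
    for loss in (0.1, 0.5, 1.0):                 # sweep injected errors only
        test_cases.append({**baseline, "packet_loss_pct": loss})

    for case in test_cases:
        print(case)  # a real harness would configure the encoder/network and capture here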

Video compression systems inevitably introduce artifacts as the bit rate drops. Running a system side-by-side with a video reference is the most useful way to categorize and relate the severity and frequency of the artifacts. While looking for artifacts, the video test equipment should highlight edge noise, blocking and tiling, dark pictures, and tearing.
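As one example of such highlighting, a crude blockiness score can compare luma differences across 8x8 block boundaries (the grid used by common DCT codecs) with differences elsewhere; a ratio well above 1.0 hints at visible blocking or tiling. This Python/NumPy sketch is illustrative only:

    # Hedged sketch: ratio of mean luma differences at 8x8 block boundaries
    # to mean differences elsewhere (horizontal direction only).
    import numpy as np

    def blockiness(frame: np.ndarray, block: int = 8) -> float:
        f = frame.astype(np.float64)
        col_diff = np.abs(np.diff(f, axis=1))              # neighbor differences
        boundary = col_diff[:, block - 1::block].mean()    # across block edges
        interior = np.delete(col_diff, np.s_[block - 1::block], axis=1).mean()
        return boundary / (interior + 1e-9)

    # Synthetic check: a tiled frame scores far higher than smooth noise.
    rng = np.random.default_rng(0)
    smooth = rng.normal(128, 2, (64, 64))
    tiled = np.kron(rng.integers(0, 255, (8, 8)), np.ones((8, 8))) + rng.normal(0, 0.5, (64, 64))
    print(f"smooth {blockiness(smooth):.2f}, tiled {blockiness(tiled):.2f}")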

Please remember that when testing different manufacturers’ equipment, tuning controls may need to change; the testing is not complete until each unit is optimally calibrated. Further, some manufacturers’ equipment will look better in some instances, while other equipment looks better in others.

[Figure: Subjective Analysis Display Modes (Vertical Split)]

To simplify the workflow, any video sequence can be played while another is captured, combining the video server, capture device, viewer, and video analyzer into one unit. By doing this, ClearView controls the test environment, which allows automated, repeatable, quantitative video quality measurements.
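A pass/fail gate over such measurements might look like the sketch below; the 35 dB PSNR threshold and the per-frame scores are hypothetical:

    # Hedged sketch: flag frames whose objective score drops below a limit.
    # Threshold and scores are hypothetical; real plans set limits per metric.
    import numpy as np

    THRESHOLD_DB = 35.0
    frame_scores = np.array([38.1, 37.5, 36.9, 33.2, 36.4])  # e.g. per-frame PSNR

    failures = np.flatnonzero(frame_scores < THRESHOLD_DB)
    if failures.size:
        print(f"FAIL: frames {failures.tolist()} below {THRESHOLD_DB} dB")
    else:
        print("PASS: all frames meet the threshold")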

[Figure: Quantitative Picture Quality Evaluation]

[Figure: Automated Pass/Fail Testing]

Conclusion

The following ingredients provide the basis for useful video quality testing:

  • Good preparation
  • Known test sequences
  • Flexible video test equipment
  • Methodical test plan
  • Experienced judges

For more information about Video Clarity, please visit http://www.videoclarity.com.

 
