S3 Group's StormTest

Today's DTV offerings require increasingly complex content-delivery platforms to keep subscribers satisfied. Content is now delivered across multiple networks, both broadcast and broadband, and consumed on many more devices, inside and outside the home. This creates a need for service providers to complete guaranteed validation of each platform. Some of these networks and device platforms are likely to lie outside the immediate control of providers, which raises additional challenges, such as how to guarantee QoS levels on third-party networks and devices.

Software subsystems

To address this evolving challenge, S3 Group has developed a range of products that target specific test and validation problems at different stages of the platform lifecycle. These are built on the core StormTest platform, which has been deployed by more than 30 customers worldwide, the majority of them DTV operators. The core software architecture of the StormTest platform is outlined in Figure 1.

There are three major software subsystems to the platform:

  • The Client Subsystem software allows tests to be authored and dispatched for execution, and provides detailed analysis of log results;
  • The Configuration and Scheduling Subsystem stores all the information required to dispatch tests at appropriate times to the devices under test and to gather results thereafter;
  • The Devices-Under-Test (DUT) Server Subsystem controls access to the devices under test. It is paired with daemon versions of the client software so that tests uploaded by clients can be run locally to the devices under test, however remote the originating client.

Each of these subsystems can be sited independently of the others; the only requirement is a reliable IP connection of reasonable bandwidth between the sites.
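As a rough, purely illustrative sketch of this topology, the fragment below describes one possible distributed deployment in Python. The site names, hostnames and structure are assumptions invented for the example; they do not reflect StormTest's actual configuration format.

    # Hypothetical sketch of a distributed StormTest deployment.
    # Hostnames, site names and the schema itself are invented for
    # illustration only.
    DEPLOYMENT = {
        "clients": [
            {"site": "Dublin", "host": "client-dub-01.example.com"},
            {"site": "Bangalore", "host": "client-blr-01.example.com"},  # offshore authoring
        ],
        "config_and_scheduling": {
            "site": "HQ data center",
            "host": "stormtest-central.example.com",  # backed up daily by IT
        },
        "dut_servers": [
            # Racked alongside the DUTs, within reach of the live DTV signal.
            {"site": "headend lab", "host": "dut-server-01.example.com", "duts": 16},
        ],
    }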

The Client software is typically located wherever test scripts need to be developed and executed. In some deployments, these clients have been moved offshore to take advantage of lower costs. In other cases, clients are deployed at multiple geographic sites with shared access to central test servers, giving a previously unavailable, global view of platform testing activity.

The Configuration and Scheduling subsystem is typically located close to the main IT infrastructure of the customer. As the central storage point for all test scripts, test schedules and results, it is often hosted on a dedicated, high-reliability server and integrated into the IT department's daily backup schedule.

The DUT Server subsystem is usually installed on servers in racks that also contain the DUTs. It is normally located wherever an appropriate test signal is available to feed the devices under test. For some test scenarios, such as new device or application development, test streams may suffice as input, which removes any physical limitation on where the subsystem is based. More typically, however, access to the live signal delivered by the DTV operator is crucial to the testing operation, so the subsystem is installed within physical reach of the operator's network.

In addition to these major subsystems, many other subcomponents and integration points should be considered in any real deployment. Integration with requirements-management tools such as HP Quality Center, issue- and defect-tracking tools such as Bugzilla, and automated build systems such as Hudson is common.
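As a hedged illustration of one such integration, the sketch below files a Bugzilla defect automatically when a scheduled test fails. The endpoint and field names follow Bugzilla's public 5.x REST API, but the server URL, the product and component values, and the surrounding workflow are assumptions made for this example.

    import requests

    BUGZILLA_URL = "https://bugzilla.example.com/rest/bug"  # hypothetical server
    API_KEY = "..."  # issued by the Bugzilla administrator

    def file_defect(test_name, log_excerpt):
        # Create a new bug via Bugzilla's 5.x REST API; the product and
        # component values below are placeholders for this sketch.
        payload = {
            "product": "STB-Platform",   # hypothetical product
            "component": "Middleware",   # hypothetical component
            "version": "unspecified",
            "summary": "Automated test failure: " + test_name,
            "description": log_excerpt,
        }
        resp = requests.post(BUGZILLA_URL, params={"api_key": API_KEY},
                             json=payload, timeout=30)
        resp.raise_for_status()
        return resp.json()["id"]  # Bugzilla returns the new bug's id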

GUI-based tests

StormTest is a programmable system: test scripts tell it which tests to run. Scripts can be written using the high-level, drag-and-drop, GUI-based Test Creator or the lower-level Python scripting interface. Users often start with the GUI-based tool to become familiar with the system before moving to the Python level to program more complex test cases. Work done in the GUI tool can be exported to Python, giving users a head start on more advanced scripting.
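To give a flavour of what a scripted test can look like, here is a minimal sketch in Python. The stormtest module and every call on it (connect, press_key, wait_for_screen_match) are hypothetical stand-ins invented for this article; they are not S3 Group's published API.

    import stormtest  # hypothetical module, not the real StormTest API

    def test_channel_change():
        # Connect to one device under test and drive it via simulated
        # remote-control key presses.
        dut = stormtest.connect(dut_id="STB-01")
        dut.press_key("GUIDE")
        # Compare the captured video frame against a stored reference image.
        assert dut.wait_for_screen_match("epg_home.png", timeout=10)
        dut.press_key("CHANNEL_UP")
        assert dut.wait_for_screen_match("channel_banner.png", timeout=5)
        dut.disconnect()

    if __name__ == "__main__":
        test_channel_change()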

Once the test scripts have been written, they can either be run remotely from the devices under test by executing them directly on the client machine, or uploaded to the central Configuration Server for execution close to the devices under test. Each approach has pros and cons. Running scripts on the user's client PC allows rapid iteration during script development and lets users see feedback from the script as it executes. Uploading to the central system allows tests to run automatically on pre-set schedules without the client machine being connected at execution time. This is useful for running tests repetitively or at antisocial hours, and it also ensures that any Ethernet network impairments between the client and the server don't degrade the quality of the tests the server performs on the DUTs.
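The same trade-off can be sketched in code. Below, the script from the previous example is first run interactively from the client PC and then uploaded and attached to a nightly schedule; as before, every call shown is a hypothetical stand-in rather than the product's documented interface.

    import stormtest  # hypothetical module, as above

    server = stormtest.ConfigurationServer("stormtest-central.example.com")

    # Option 1: run immediately from this client PC, with live feedback;
    # convenient while iterating on the script itself.
    server.run_now("tests/test_channel_change.py", dut_id="STB-01")

    # Option 2: upload and schedule for unattended nightly execution close
    # to the DUTs, so client-side network impairments cannot affect results.
    server.upload("tests/test_channel_change.py")
    server.schedule("test_channel_change", dut_id="STB-01", cron="0 2 * * *")  # 02:00 daily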

End-to-end maintenance

In addition to CPE-focused testing, StormTest is increasingly being used by clients during the development, automated test and ongoing maintenance of end-to-end DTV content-delivery platforms. In such scenarios, StormTest is still used to interact automatically with the CPE, but the test scripts also use network interfaces to provide test stimuli at various other points across the content-delivery platform and measure their impact on the service delivered to the CPE. S3 Group demonstrated this type of use last year in a collaborative project with a number of other participants in the emerging EBIF interactive-TV market in the U.S. In this configuration, StormTest test scripts drove STBs to launch and interact with EBIF applications. (See Figure 2.) During this interaction, options were selected that caused the cable headend to execute transactions with external application servers, e.g. completing a simulated purchase of movies. Once the script finished interacting with the application on the STB, it connected directly to the external application server to validate that the transaction had been registered successfully. This completed the full, end-to-end validation of the EBIF platform and verified the correct operation of the many individual system components that had to interoperate for the test to pass.
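A condensed sketch of that end-to-end flow might look like the following. The STB-driving calls are the same hypothetical stand-ins used earlier, and the application server's query URL and response shape are likewise invented for the example.

    import requests
    import stormtest  # hypothetical module, as above

    def test_ebif_purchase_end_to_end():
        # Drive the STB through a simulated movie purchase in the EBIF app.
        dut = stormtest.connect(dut_id="STB-01")
        dut.press_key("SELECT")  # launch the EBIF application
        assert dut.wait_for_screen_match("ebif_offer.png", timeout=15)
        dut.press_key("OK")      # confirm the simulated purchase
        assert dut.wait_for_screen_match("purchase_confirmed.png", timeout=15)
        dut.disconnect()

        # Then query the external application server directly to confirm
        # the transaction was registered end to end. The URL and response
        # shape are invented for this sketch.
        resp = requests.get(
            "https://appserver.example.com/transactions",
            params={"device": "STB-01", "type": "movie_purchase"},
            timeout=30,
        )
        resp.raise_for_status()
        assert resp.json()["count"] >= 1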

As today's DTV operators evolve ever-more complex content delivery platforms, the development and ongoing maintenance of these platforms will require an ever-increasing degree of testing and monitoring. Automation can play a vital role in delivering this while controlling costs.

John Maguire is the director of product strategy, S3 Group.