System on Modules (SOM) and End-to-End Verification Using Test Automation Framework

December 20, 2021


Photo Courtesy of Softnautics

This article provides a high-level overview of how any SOM, or the carrier card we call a development kit, must go through different verification and validation stages before a new solution is handed off to the end user per their product requirements, and how these stages contribute to the success of an automated testing process.

A SOM is an entire CPU subsystem built into a package the size of a credit card. It is a board-level circuit that integrates a system function and provides the core components of an embedded processing system (processor cores, communication interfaces, and memory blocks) on a single module. Designing a product around a SOM is a much faster process than designing the entire system from the ground up.

There are multiple System on Module manufacturers worldwide, and an equally large number of open-source automated testing frameworks. If you plan to use a System on Module (SOM) in your product, the first step is to identify a test automation framework from those available, and then check for a module that suits your needs.

A System on Module (SOM) reduces development and design risk for any application. A SOM is a reusable module that absorbs most of the hardware/processor complexity, leaving less work on the carrier/mainboard and thus accelerating time to market.

It reduces design complexity and time to market, both of which are critical to a product's success. These System on Modules run an OS and are mainly used in applications that require Ethernet, file systems, high-resolution display, USB, Internet connectivity, and the like, where high computing power is needed with less development effort. If you are building a product with a volume below 20-25K units, it is practical to use a ready-made SOM for product development.

Test Automation Frameworks for SOM

A test automation framework is a set of guidelines used for developing test cases. A framework is an amalgamation of tools and practices designed to support more efficient testing. The guidelines cover coding standards, methodologies for handling test data, object repositories, processes for storing test results, and ways of accessing external resources.

Testing frameworks are an essential part of any successful product release that undergoes test automation. Using a framework for automated testing enhances a team's testing efficiency and accuracy while reducing time and risk.

There are different types of automated testing frameworks, and selecting the right one is crucial for your SOM application testing.

Below are a few commonly used examples:

  • Linear Automation Framework
  • Modular Based Testing Framework
  • Library Architecture Testing Framework
  • Data-Driven Framework
  • Keyword-Driven Framework
  • Hybrid Testing Framework

Of these, the modular and hybrid testing frameworks are the best suited for SOM and development kit verification. The ultimate goal of testing is to ensure that the software works per its specifications and in line with user expectations.
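The modular approach mentioned above can be sketched in a few lines: each board feature is verified by its own independent module, and a runner combines them. The class and feature names below are hypothetical illustrations, not a real board API.

```python
# Minimal sketch of a modular test framework for a SOM development kit.
# Each hardware feature gets its own independent test module; a runner
# executes them and collects results. All names here are illustrative.

class TestModule:
    """Base class: one module per board feature."""
    name = "base"

    def run(self) -> bool:
        raise NotImplementedError

class EthernetTest(TestModule):
    name = "ethernet"

    def run(self) -> bool:
        # In practice this would, e.g., ping the SOM over the dev-kit port.
        return True  # placeholder result

class UsbTest(TestModule):
    name = "usb"

    def run(self) -> bool:
        # In practice this would enumerate USB devices on the carrier board.
        return True  # placeholder result

def run_suite(modules):
    """Run each module independently and collect pass/fail results."""
    return {m.name: m.run() for m in modules}

results = run_suite([EthernetTest(), UsbTest()])
print(results)
```

Because each module is self-contained, a hybrid framework can layer data-driven inputs or keyword tables on top of the same runner without touching the individual modules.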

The entire process involves quite a few testing types, which are preferred or prioritized over others depending on the nature of the application and the organization. Let us look at some of the basic testing types involved in the end-to-end testing process.

Unit Testing 

A full software stack is made of many small components. Instead of testing the full software stack directly, one should first cover testing at the individual module level. Unit testing provides this module/method-level input/output test coverage.

Unit testing offers a base for complex integrated software, helps ensure quality application code, and speeds up continuous integration and the development process. Unit tests are often executed by developers through test automation.
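As a simple sketch, here is what a module-level unit test might look like using Python's standard unittest library. The helper under test, and its bit layout, are hypothetical examples of a small function in a SOM's host software, not a real board API.

```python
# Unit-test sketch with Python's built-in unittest framework.
# temperature_ok() is a hypothetical helper: we assume bit 3 of a
# board status word flags an over-temperature fault.
import unittest

def temperature_ok(status_word: int) -> bool:
    """Return True when the (assumed) over-temperature bit is clear."""
    return (status_word & 0b1000) == 0

class TemperatureUnitTest(unittest.TestCase):
    def test_normal_status_passes(self):
        self.assertTrue(temperature_ok(0b0000))

    def test_fault_bit_detected(self):
        self.assertFalse(temperature_ok(0b1000))

# Run the tests programmatically so the result can be inspected.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TemperatureUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Tests like these run in milliseconds on the development host, which is what makes them cheap to execute on every commit in a continuous integration pipeline.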

Smoke Testing

Smoke testing verifies whether a deployed software build is stable. Whether further testing proceeds depends on the smoke test results. It is also referred to as build verification testing, as it checks whether the core functionality meets its objective. If the SOM does not pass the smoke test, more development work is still required.
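A smoke test is essentially a short, ordered checklist run right after a new build is flashed, where the first failure stops further testing. The check names below are illustrative assumptions, not a real board interface.

```python
# Sketch of a smoke test for a freshly flashed SOM build: a short,
# ordered set of critical checks; the first failure halts the run.
# The checks are placeholders for real probes (serial console, ping).

def check_boot():    return True  # e.g., bootloader prompt seen on serial
def check_kernel():  return True  # e.g., login prompt reached
def check_network(): return True  # e.g., ping to the host succeeds

def smoke_test(checks):
    """Run checks in order; return (passed, name of first failure or None)."""
    for name, check in checks:
        if not check():
            return False, name
    return True, None

passed, failed_at = smoke_test([
    ("boot", check_boot),
    ("kernel", check_kernel),
    ("network", check_network),
])
```

If `passed` is False, the build goes back to development and the deeper test suites are never started, which is exactly the gatekeeping role smoke testing plays.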

Sanity Testing 

Sanity testing verifies that a change or proposed functionality works as expected. Suppose we fix an issue in the boot flow of an embedded product; the fix should then go to the validation team for sanity testing. Once this test passes, we know the change does not impact other basic functions. Sanity testing is unscripted and specifically targets the area that has undergone a code change.
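One way to keep sanity testing focused is to map each component to the checks that touch it, so a boot-flow fix only triggers boot-related checks. The mapping and check names below are assumptions for illustration.

```python
# Sketch: select only the sanity checks relevant to the changed area.
# The component-to-checks mapping is hypothetical.

SANITY_CHECKS = {
    "boot":    ["bootloader_prompt", "kernel_load", "rootfs_mount"],
    "display": ["hdmi_detect", "framebuffer_write"],
}

def select_sanity_checks(changed_component):
    """Return the targeted checks for the component that was modified."""
    return SANITY_CHECKS.get(changed_component, [])

checks = select_sanity_checks("boot")
```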

Regression Testing 

Every time the program is revised or modified, it should be retested to ensure that the modifications did not unintentionally "break" some unrelated behavior. This is called regression testing. These tests are usually automated through test scripts, and each run of the program/design should reproduce the expected results.
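In practice, regression testing means rerunning the stored suite after a change and comparing against the results from the last known-good build. The test names and results below are illustrative assumptions.

```python
# Sketch of regression detection: compare the current run against a
# baseline from the last known-good build and report tests that
# passed before but fail now. Names and results are placeholders.

baseline = {"ethernet": True, "usb": True, "hdmi": True}

def run_all():
    # A real run would execute each automated script on the board;
    # here the "hdmi" result is hard-coded as a simulated regression.
    return {"ethernet": True, "usb": True, "hdmi": False}

def find_regressions(baseline, current):
    """Tests that passed in the baseline but fail in the current run."""
    return sorted(t for t, ok in baseline.items()
                  if ok and not current.get(t, False))

regressions = find_regressions(baseline, run_all())
print(regressions)  # ['hdmi']
```

Automating this comparison is what makes it practical to retest the whole board on every software revision rather than only the area that changed.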

Functional Testing 

Functional testing specifies what the system does. It is also known as black-box testing because the test cases for functional tests are developed without reference to the actual code, i.e., without looking “inside the box.”
All embedded systems have inputs and outputs. Black-box testing is about which inputs should be acceptable and how they should relate to the outputs.

The tester is unaware of the internal structure of the module or source code. Black-box tests include stress testing, boundary value testing, and performance testing.
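Boundary value testing, one of the black-box techniques listed above, can be sketched as exercising an input at, just below, and just above its limits while treating the unit purely through its input/output contract. The duty-cycle setter below is a hypothetical example, not a real SOM API.

```python
# Sketch of black-box boundary value testing. set_duty_cycle() is a
# hypothetical function whose contract accepts 0-100 inclusive; the
# tester probes the boundaries without looking at the implementation.

def set_duty_cycle(percent: int) -> bool:
    """Accept a PWM duty cycle of 0-100 percent; reject anything else."""
    return 0 <= percent <= 100

# Values at and just beyond each boundary of the valid range.
boundary_inputs = [-1, 0, 1, 99, 100, 101]
results = [set_duty_cycle(v) for v in boundary_inputs]
```

Off-by-one errors cluster at exactly these edges, which is why boundary values give high defect yield for few test cases.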

Image- and video-intensive industries face difficulty designing and developing customized hardware solutions for specific applications within reduced time and cost. This is compounded by rapidly evolving processors of increasing complexity, which require product companies to constantly introduce upgraded variants within short spans.

Over the past years, Softnautics has developed complex software around various processor families from Lattice, Xilinx, Intel, Qualcomm, TI, etc., and has successfully tested the boards for applications like vision processing, AI/ML, multimedia, industrial IoT, and more. 

Softnautics has a market-proven process for developing verification and validation automation suites with no compromise on feature or performance coverage, executing test automation with its in-house STAF as well as open-source frameworks. Softnautics also provides testing support for future product/solution releases, release management, and product sustenance/maintenance.

Read our success stories related to test automation to learn more about our quality engineering services for semiconductors.

Contact us at [email protected] for any queries related to your solution or for consultancy.

Narayan is associated with Softnautics as a Staff Engineer. At Softnautics, he works on multimedia verification, validation, and automation testing services for clients. He has 7+ years of experience with DSP and ARM platform-based product verification as well as machine learning and deep learning application development. In his free time, he likes to go trekking, play table tennis, and listen to music.

More from Narayan