A brief guide to testing#

To ensure quality, we test our builds regularly and intensively. Different levels of testing are used depending on the area under test.

More details can be found at Test reporting tool.

Types of tests#

build time#

At build time, checks from the Yocto toolkit and custom checks from .bbclass files are used.
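
As a sketch, such a custom check could look like the following. This assumes a hypothetical simplecore-checks.bbclass; the task name and the check itself are illustrative.

python do_simplecore_check() {
    # Fail the build if the recipe does not set a LICENSE.
    if not d.getVar('LICENSE'):
        bb.fatal('%s: no LICENSE set' % d.getVar('PN'))
}
addtask simplecore_check after do_configure before do_compile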

runtime (hardware agnostic)#

For this we use the Automated runtime testing feature from Yocto.

Those tests can be run on virtualized hardware and can therefore be performed on every build change.
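
As a sketch of the standard Yocto flow (the image name is a placeholder, and this assumes the testimage class is inherited, e.g. via IMAGE_CLASSES in local.conf):

$ bitbake my-image -c testimage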

Opt-out for hardware dependent tests

Instead of specifying tests for specific hardware, all tests are loaded but skipped if the expected hardware is not present.

This is done using decorators such as @skipIfNotQemu() or @skipIfMachine(...).
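
A minimal sketch using the stock OEQA decorators (the class, test, and machine names are illustrative):

from oeqa.core.decorator.data import skipIfNotQemu, skipIfMachine
from oeqa.runtime.case import OERuntimeTestCase

class testClass(OERuntimeTestCase):

    @skipIfNotQemu()
    def testQemuOnly(self):
        # Skipped on real hardware, runs on virtualized targets.
        self.assertEqual(self.target.run('true')[0], 0)

    @skipIfMachine('my-machine', 'Not supported on my-machine')
    def testNotOnMyMachine(self):
        # Skipped when MACHINE is my-machine.
        self.assertEqual(self.target.run('true')[0], 0)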

Cases need to be added manually

To add a new test suite to a run, the suite needs to be added to the TEST_SUITES variable of the image under test. See also the Yocto project's variable documentation.
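
For example, a hypothetical suite could be appended in the image recipe or in local.conf (the suite name is a placeholder):

TEST_SUITES:append = " mytestsuite"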

runtime (hardware dependent)#

For hardware dependent tests we either use Test exports (https://docs.yoctoproject.org/5.0.9/test-manual/runtime-testing.html#exporting-tests) or, when run from a build environment, our

$ devtool simplecore-test-target

integration.
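
For the export path, a sketch of the standard Yocto flow (assuming the image inherits the testexport class; the image name is a placeholder):

$ bitbake my-image -c testexport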

release testing#

For releases and release candidates we use scotty-test (see Run tests on real hardware for more information).

The tests run are based on the runtime (hardware dependent) test exports.

SDK testing#

SDKs are tested using the SDK testing functionality (https://docs.yoctoproject.org/5.0.9/test-manual/intro.html#testsdk) from Yocto.

See also How can I test my changes to the SDK test cases?.

Cases are picked automatically

All valid test case implementations placed under lib/oeqa/sdk/cases are picked up automatically. There is currently no way to exclude tests from a run.
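
A minimal sketch of such a test case (the class and test names are illustrative; _run executes a command inside the configured SDK environment):

from oeqa.sdk.case import OESDKTestCase

class testSDKClass(OESDKTestCase):

    def testToolchain(self):
        # Verify that the cross-compiler is present in the SDK.
        self._run('$CC --version')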

Testing extensions#

If the build is based on the simplecore distribution, we offer a variety of helpful extensions.

For instance, you can use the Python class decorators described below under Run tests on specific hardware. From within a test case you can use the following functions by importing the extension module:

from oeqa.extensions.helper import SimpleCoreTestExtensions
from oeqa.runtime.case import OERuntimeTestCase

class testClass(OERuntimeTestCase):

    def testCase(self):
        # e.g. to access the board extensions
        SimpleCoreTestExtensions().board.reboot(self)

The following extensions are available:

Run tests on specific hardware#

To run, or not run, tests on specific hardware, you can use:

from oeqa.extensions.decorator.runson import runsOn, runsNotOn, DeviceInfo
from oeqa.extensions.helper import SimpleCoreTestExtensions
from oeqa.runtime.case import OERuntimeTestCase

class testClass(OERuntimeTestCase):

    @runsOn([DeviceInfo(machine='my-machine'), DeviceInfo(machine='my-other-machine')])
    def testCase(self):
        # e.g. to access the board extensions
        SimpleCoreTestExtensions().board.reboot(self)

to make the test run only on my-machine and my-other-machine (it is skipped on any other machine), and

from oeqa.extensions.decorator.runson import runsOn, runsNotOn, DeviceInfo
from oeqa.extensions.helper import SimpleCoreTestExtensions
from oeqa.runtime.case import OERuntimeTestCase

class testClass(OERuntimeTestCase):

    @runsNotOn([DeviceInfo(machine='my-machine')])
    def testCase(self):
        # e.g. to access the board extensions
        SimpleCoreTestExtensions().board.reboot(self)

to have it run on all machines but my-machine.

These decorators determine data from the running device under test. The following parameters can be set in the DeviceInfo constructor:

parameter      description
machine        MACHINE
feature        SOM feature level
overlays       configured overlays
hasAudio       SOM has audio
hasBluetooth   SOM has Bluetooth
hasEMMC        SOM has eMMC
hasHDMI        SOM has active HDMI display
hasLVDS        SOM has active LVDS display
hasWifi        SOM has Wifi
numCAN         Number of CAN busses
numETH         Number of ETH interfaces
numGPIO        Number of GPIOs
numLVDS        Number of LVDS interfaces
numSER         Number of UARTs
numUSB         Number of USB ports
numSPI         Number of SPI ports
baseboard      Detected base board (e.g. EP5-001)
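
The parameters can be combined in a single DeviceInfo. A sketch, assuming they are passed as keyword arguments and matched exactly against the detected device data (all values are illustrative):

from oeqa.extensions.decorator.runson import runsOn, DeviceInfo
from oeqa.extensions.helper import SimpleCoreTestExtensions
from oeqa.runtime.case import OERuntimeTestCase

class testClass(OERuntimeTestCase):

    # Run only on my-machine variants with Wifi and two CAN busses.
    @runsOn([DeviceInfo(machine='my-machine', hasWifi=True, numCAN=2)])
    def testCase(self):
        SimpleCoreTestExtensions().board.reboot(self)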

Variant data#

When gathering the DeviceInfo data, the implementation looks in every layer for a file lib/oeqa/files/machine-data-<MACHINE>.json.

This file contains a list of variant data using the same parameters as described above, plus the following:

parameter         description
_ep1_marker       overlay name for EP1
_ep5_001_marker   overlay name for EP5-001
_ep5_002_marker   overlay name for EP5-002
_ep5_003_marker   overlay name for EP5-003
_ep5_004_marker   overlay name for EP5-004
_ep5_012_marker   overlay name for EP5-012

The entries are merged in order of appearance.
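
A minimal sketch of such a file (all values are illustrative); following the merge rule above, the second entry is merged on top of the first:

[
    { "machine": "my-machine", "hasWifi": true, "numCAN": 2 },
    { "machine": "my-machine", "_ep5_001_marker": "my-overlay-ep5-001" }
]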

Reporting#

Reporting on a pull request#

When a pull request build runs, the test results from the runtime (hardware agnostic) and SDK testing test cases are posted as a comment to your pull request.

The full build results can be downloaded from the actual run in simplecore-tools.

Limited storage time of artifacts

The full test results are only stored for up to 24 hours; after that they are deleted automatically by GitHub.

Reporting in simplecore-tools#

For all push builds, the runtime (hardware agnostic) and SDK testing test cases are added to the permanent test report storage in simplecore-tools.

Those can be supplemented with runtime (hardware dependent) results.

Our Test Report is generated from those reports.