RDM Responder Testing

As part of the Open Lighting Project, a suite of tests for RDM responders has been developed. This enables manufacturers to check how well an RDM device conforms to the E1.20 specification. The test cases are written in Python and use OLA (the Open Lighting Architecture) to communicate with devices.

- Responder Testing FAQ
- Running the tests
- Writing RDM Responder Tests
Set Up the Test Rig
The following controller devices are supported:
Connect the device under test to the controller device and start olad. Patch the output port on the controller device to a universe (UNIVERSE_NUMBER). Then run ola_rdm_discover; you should see the responder's UID appear:
$ ola_rdm_discover -u UNIVERSE_NUMBER
00a1:00010003
7a70:ffffff00
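The patching step can also be done from the command line. The following is a minimal sketch; the device and port numbers (1 and 0 here) are assumptions, so check the output of ola_dev_info for the actual values on your system:

$ ola_dev_info                                               # list devices and their ports
$ ola_patch --device 1 --port 0 --universe UNIVERSE_NUMBER   # patch the output port to the universe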
Running the Tests
The tests are written in Python and run using ola_rdm_test.py. Below is the output from a typical test run:
$ ./ola_rdm_test.py --universe 1 --pid_file ../../python/pids.config 00a1:00010003
Starting tests, universe 3, UID 00a1:00010003
SetManufacturerLabel: Passed
SetSoftwareVersionLabel: Passed
GetManufacturerLabel: Passed
GetSoftwareVersionLabelWithData: Failed
...
------------- Warnings --------------
------------ By Category ------------
 Product Information:   7 /  7  100%
 RDM Information:       1 /  1  100%
 Core Functionality:    2 /  2  100%
 Error Conditions:     10 / 16   62%
 DMX512 Setup:          3 /  3  100%
-------------------------------------
29 / 30 tests run, 23 passed, 6 failed, 0 broken
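When a test fails, it can help to issue the underlying RDM request by hand and inspect the response. A minimal sketch using ola_rdm_get follows; the parameter name manufacturer_label is illustrative and the exact spelling must match an entry in your pids.config:

$ ola_rdm_get --universe 1 --uid 00a1:00010003 manufacturer_label   # query a single PID directly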
Useful Options
ola_rdm_test.py has some options which can assist in debugging failures; an example invocation follows the list below. For a full list of options, run with -h.
- -d, --debug
- Show all debugging output, including actual & expected responses.
- -l, --log
- Log the output of the tests to a file. The UID and a timestamp are appended to the filename.
- -t Test1,Test2, --tests=Test1,Test2
- Run only a subset of the tests; only those listed (and their dependencies) will be run.
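For example, the failing test from the run above could be re-run on its own with debugging and logging enabled. This is a sketch; the log filename is illustrative:

$ ./ola_rdm_test.py --universe 1 --pid_file ../../python/pids.config \
    --debug --log /tmp/rdm_test --tests GetSoftwareVersionLabelWithData 00a1:00010003

Because the dependencies of the listed tests are also run, you may see more than one test execute.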
Information on Tests
Some tests have dependencies, which are other tests that need to be completed before the test can be run. Dependencies can be used to check for supported parameters and other conditions that may affect responder behavior.
There are four result states for a test:
- Passed
- The responder replied with the expected result
- Failed
- The responder failed to reply, or replied with an unexpected result
- Not Run
- This test wasn't run because the responder doesn't support the required functionality
- Broken
- An internal error occurred; this indicates a programming error or a problem with the test rig.