Fixing TC-WEBRTCR-2.X Test Case Failures: GStreamer Installation Guide
Introduction
Hey guys! We've hit a bug while running the WebRTCRequestor cluster test cases in the Test Harness (TH) UI, version v2.14-beta1.1+winter2025. The TC-WEBRTCR-2.X tests fail right at the beginning, during the setup phase, before any test step executes. This article shares the details, the steps to reproduce, the expected behavior, and the workaround we found, so if you're seeing similar failures with your WebRTCRequestor tests, read on to get them running smoothly again.
The Bug: Timeout Error During Setup
The error we're seeing is a TimeoutError: "Expected output ''Server initialization complete'' not found within 30 seconds." In other words, the test harness launches the camera application and waits for it to confirm initialization, but never receives that confirmation within the 30-second window. The failure hits TC-WEBRTCR-2.1 directly and potentially the other tests in the WebRTCRequestor cluster, since they share the same setup. Understanding the root cause of this error is key to keeping your testing environment reliable and stable.
Here’s the detailed error message:
[MatterTest] 07-24 07:24:20.795 INFO ==========> TC_WebRTCRequestor_2_1 <==========
[MatterTest] 07-24 07:24:20.806 INFO Temporary storage directory: /tmp/TC_WebRTCRequestor_2_16jijc4td
[MatterTest] 07-24 07:24:20.810 INFO RUN: /root/chip-camera-app --KVS /tmp/TC_WebRTCRequestor_2_16jijc4td/kvs-app-sndidqdh --secured-device-port 5540 --discriminator 1234 --passcode 20202021
[MatterTest] 07-24 07:24:21.494 INFO Avahi group established
[MatterTest] 07-24 07:24:21.553 INFO Avahi group established
[MatterTest] 07-24 07:24:50.817 ERROR Error in TC_WebRTCRequestor_2_1#setup_class.
Traceback (most recent call last):
File "/usr/local/lib/python3.12/dist-packages/mobly/base_test.py", line 411, in _setup_class
self.setup_class()
File "/root/python_testing/scripts/sdk/TC_WEBRTCR_2_1.py", line 75, in setup_class
self.th_server.start(
File "/usr/local/lib/python3.12/dist-packages/chip/testing/tasks.py", line 132, in start
raise TimeoutError("Expected output '%r' not found within %s seconds" % (expected_output, timeout))
TimeoutError: Expected output ''Server initialization complete'' not found within 30 seconds
[MatterTest] 07-24 07:24:50.836 ERROR
******************************************************************
*
* Test setup_class failed for the following reason:
* Expected output ''Server initialization complete'' not found within 30 seconds
*
* File "/usr/local/lib/python3.12/dist-packages/chip/testing/tasks.py", line 132, in start
* TimeoutError: Expected output ''Server initialization complete'' not found within 30 seconds
*
* Test step:
* UNKNOWN - no test steps provided in test script
*
* Endpoint: None
*
******************************************************************
[MatterTest] 07-24 07:24:50.868 INFO Summary for test class TC_WebRTCRequestor_2_1: Error 1, Executed 0, Failed 0, Passed 0, Requested 1, Skipped 1
[MatterTest] 07-24 07:24:50.873 INFO Summary for test run MatterTest@07-24-2025_07-24-20-594:
Total time elapsed 30.21814139399976s
Artifacts are saved in "/tmp/matter_testing/logs/MatterTest/07-24-2025_07-24-20-594"
Test summary saved in "/tmp/matter_testing/logs/MatterTest/07-24-2025_07-24-20-594/test_summary.yaml"
Test results: Error 1, Executed 0, Failed 0, Passed 0, Requested 1, Skipped 1
Key Takeaways from the Error
- The error occurs during the setup_class phase of the test, which points to a problem with the test environment initialization: the setup routines that run before any actual test logic is invoked.
- The timeout suggests the camera application isn't starting correctly or isn't emitting the expected output to the test harness. Possible causes include missing dependencies, incorrect configuration, or a fault in the application itself, which narrows the troubleshooting to those areas. The sketch after this list shows the wait-for-output pattern behind the error.
- The absence of specific test steps in the error message ("UNKNOWN - no test steps provided in test script") further confirms that the problem is the foundational setup rather than any particular test case step.
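To make that concrete, here is a minimal sketch of the wait-for-output pattern that produces this error. It is illustrative only (the real implementation lives in chip/testing/tasks.py and differs in detail), but it shows how a camera app that never prints 'Server initialization complete' turns into a TimeoutError thirty seconds later:

import queue
import subprocess
import threading
import time

def start_and_wait(cmd, expected_output, timeout=30.0):
    # Launch the app; merge stderr into stdout so no output is missed.
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT, text=True)
    lines = queue.Queue()
    # A background thread mirrors stdout into a queue so the main
    # thread can enforce the deadline even if the app goes silent.
    threading.Thread(
        target=lambda: [lines.put(l) for l in proc.stdout],
        daemon=True,
    ).start()
    deadline = time.monotonic() + timeout
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            proc.kill()
            raise TimeoutError(
                "Expected output '%r' not found within %s seconds"
                % (expected_output, timeout))
        try:
            line = lines.get(timeout=remaining)
        except queue.Empty:
            continue  # deadline is re-checked at the top of the loop
        if expected_output in line:
            return proc  # app reported readiness; setup can proceed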
Steps to Reproduce the Behavior
To reproduce this bug, follow these steps:
- Open the UI: Launch the Test Harness user interface.
- Upload PICS file: Upload the relevant PICS (Protocol Implementation Conformance Statement) file for the WebRTC Requestor cluster. This file defines the capabilities and configuration of the device under test, so the harness knows what behavior to expect.
- Update UI configuration: Configure the UI settings as the test case requires, including the device's endpoint and any other necessary parameters. Accurate configuration is essential for reliable results.
- Select TC-WEBRTCR-2.1: Make sure the TC-WEBRTCR-2.1 test case is selected for execution; this is the specific case that exhibited the error.
- Put DUT in commissionable state: Run ./chip-camera-controller to put the Device Under Test (DUT) into a commissionable state, i.e., ready to be added to a network and have its basic settings configured. The test harness cannot interact with the device otherwise.
- Click Start: Click the Start button in the Test Harness UI. This triggers the harness to initialize the test environment and run the selected test case.
Following these steps should replicate the TimeoutError and confirm the bug's presence in your environment, which is the critical first step before trying any fix. If you'd rather reproduce the failure outside the UI entirely, the sketch below drives the camera app directly.
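For that UI-free reproduction, you can launch the camera app by hand with the same arguments the harness used (copied from the RUN line in the log above) and watch for the readiness message yourself. A minimal sketch in Python; the binary path and arguments come from our log, so adjust them to your container:

import subprocess

# Arguments copied from the harness's RUN line in the log above.
# The KVS path is just a scratch location for this manual check.
cmd = [
    "/root/chip-camera-app",
    "--KVS", "/tmp/manual-check-kvs",
    "--secured-device-port", "5540",
    "--discriminator", "1234",
    "--passcode", "20202021",
]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, text=True)
try:
    for line in proc.stdout:
        print(line, end="")
        if "Server initialization complete" in line:
            print(">>> Camera app initialized; GStreamer deps look OK")
            break
    else:
        # stdout closed without the readiness line: the app died early.
        print(">>> App exited without initializing; suspect missing deps")
finally:
    proc.kill()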
Expected Behavior
Ideally, the WebRTC Requestor test cases should run in the TH (UI) without errors: the harness initializes the camera application, establishes communication with the DUT, and executes the steps defined in the test plan, ending with a Pass status that confirms the device meets the required specifications. Knowing this expected flow makes deviations easier to spot and troubleshoot.
The Root Cause and Our Solution
After digging into the issue, we found the root cause: GStreamer packages were missing inside the Docker container used by the Test Harness. GStreamer is the multimedia framework the camera application relies on for its WebRTC functionality; without it, the application could not finish initializing, and the harness timed out waiting. It's a reminder that a single missing library inside the test container can surface as an opaque setup failure, so pinning down these dependencies is key to a stable, reliable test environment.
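One way to confirm that diagnosis before (and after) applying the fix is to probe for the GStreamer runtime from Python. This assumes the container has PyGObject (python3-gi) and the GStreamer GI bindings available, which is an assumption on our part; neither is installed by the apt commands below:

# Assumes PyGObject (python3-gi) and the GStreamer GI bindings are
# present; neither is installed by the apt commands in this article.
try:
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
except (ImportError, ValueError) as err:
    print("GStreamer bindings unavailable:", err)
else:
    Gst.init(None)  # initialize the runtime, as the camera app would
    print("GStreamer runtime found:", Gst.version_string())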
To fix this, we installed the required GStreamer packages inside the Docker container and re-executed the test cases. Guess what? The tests passed without any errors! 🎉
Here’s what we did:
1. Install Required GStreamer Packages
Run these commands inside the Docker container:
apt update
apt install -y \
libgstreamer1.0-dev \
libgstreamer-plugins-base1.0-dev \
gstreamer1.0-plugins-base \
gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad
These commands update the package list and install the necessary GStreamer libraries and plugins; the -y flag confirms the installation automatically. With these packages in place, the camera application has the multimedia processing capabilities it needs, which resolves the timeout error. This step is crucial for any test environment that involves WebRTC or multimedia functionality.
2. Verify That the Library Is Installed
Check for the presence of libgstapp-1.0.so.0:
ls /usr/lib/*/libgstapp-1.0.so.0
You should see a valid path like:
/usr/lib/x86_64-linux-gnu/libgstapp-1.0.so.0
This verifies that the core GStreamer application library is installed correctly. The ls command searches the multiarch library directories for the file; a valid path confirms the installation succeeded and that the library is where the loader expects it. With the on-disk check done, the snippet below goes one step further.
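The file being on disk is necessary but not sufficient; you can also ask the dynamic linker to actually load the library, which is what the camera app does at startup. A quick check using only the Python standard library (the soname matches the file the ls above found):

import ctypes

# dlopen the GStreamer app library by soname, just as the camera
# app's loader will. An OSError means the app would fail the same way.
try:
    ctypes.CDLL("libgstapp-1.0.so.0")
    print("libgstapp-1.0.so.0 loads fine")
except OSError as err:
    print("GStreamer app library still unavailable:", err)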
Log Files
For more detailed information, you can check out these log files:
- TH Logs:
- Python Inside Docker logs:
These log files give a comprehensive view of the test execution, including errors, warnings, and informational messages. The TH logs capture the test harness's perspective, while the Python-inside-Docker logs detail the camera application's behavior; examining both is the quickest way to pin down the root cause of a failure and verify that a fix worked.
PICS File
You can find the PICS file here: WebRTC Transport Requestor Cluster Test Plan.zip
The PICS file outlines the capabilities and configuration of the device under test, which lets the test harness accurately assess the device's conformance to the specification and tailor the run to the DUT's actual feature set.
Screenshot
Here’s a screenshot of the error:
The screenshot shows the TimeoutError and the exact point at which the test failed, making the issue easy to grasp at a glance and to share with teammates for collaborative troubleshooting.
Environment
- TH Version: v2.14-beta1.1+winter2025
- Sha: 7232ff8
These details identify the exact test harness build in use, which matters for reproducing the bug, making sure the fix lands in the right version, and keeping everyone troubleshooting from the same context.
Conclusion
So, if you're running into this annoying timeout error with the WebRTCRequestor test cases, make sure you've got those GStreamer packages installed! This simple fix saved us a lot of headache, and we hope it helps you too. In our case the missing GStreamer packages were the sole culprit, and installing them resolved the failures immediately. More generally, verifying that every dependency is present inside the test container is a cheap, proactive step that prevents this whole class of setup failures.
By sharing our experience, we hope to save you the same debugging effort. A well-configured environment is the foundation of accurate and efficient testing. If you run into similar issues or have further insights, share them in the comments below, and let's keep building a more robust and reliable testing ecosystem for WebRTC and related technologies!