EerieLeap is an open-source sensor monitoring system built with Zephyr RTOS. It provides a robust platform for reading, processing, and managing sensor data, supporting digital, analog, and CAN Bus data inputs.
Core features of the system include:
- Real-time sensor data collection and processing
- CAN Bus data collection and streaming
- Data logging
- Custom expressions for sensor value calculation
- Lua scripts for custom sensor-value and CAN-message logic
EerieLeap is a highly configurable system: it not only collects and streams data, but also supports custom logic for sensor values and CAN messages. Only imagination limits the possibilities.
EerieLeap implements a layered architecture designed for real-time sensor monitoring with clear separation of concerns:
The system follows a pipeline architecture where sensor data flows from hardware inputs through processing stages to independent outputs:
```
Hardware Input  → SensorsProcessingService → [LogWriterService | Controllers/Views]
Hardware Output ← CanbusSchedulerService
```
Hardware Abstraction Layer
- Zephyr device drivers provide standardized interfaces to hardware peripherals
- Device tree configuration defines available sensors, buses, and interfaces
- Subsystem adapters (app/src/subsys) wrap third-party APIs with project-specific abstractions
- Hardware-agnostic design allows the same codebase to run across different platforms
Domain Layer (app/src/domain)
- SensorsProcessingService: Collects and processes sensor data from all sources (digital inputs, analog inputs, CAN Bus messages). Each sensor operates on an independent schedule with configurable sample rates.
- CanbusSchedulerService: Streams processed data to external devices over CAN Bus. Each outgoing message operates on an independent schedule.
- LogWriterService: Handles persistent storage of sensor data in ASAM MDF format, operating independently from other outputs
- Domain services coordinate data flow and implement core business logic
Controller Layer (app/src/controllers)
- Provides interfaces for external control and data access
- Exposes device functionality to network services and user interfaces
- Includes optional display UI (app/src/views) for devices with screen support
The system employs a dual-format configuration architecture:
- JSON: Human-readable format for user-supplied device configuration and persistence
- CBOR: Compact binary format for efficient runtime configuration management
- Configuration Managers bridge user settings with domain service initialization
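For illustration only, a user-supplied JSON device configuration might look like the following. All field names here are hypothetical and do not reflect the project's actual schema:

```json
{
  "sensors": [
    {
      "id": "coolant_temp",
      "type": "analog",
      "sample_rate_ms": 100,
      "expression": "raw * 0.1 - 40.0"
    }
  ]
}
```

At runtime, an equivalent CBOR document would carry the same structure in a more compact binary encoding.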
- Data Collection: SensorsProcessingService coordinates sensor readings from all sources (digital, analog, CAN Bus). Each sensor operates on an independent schedule based on its configured sample rate.
- Processing: Raw data is collected from hardware through Zephyr device APIs, then processed and validated by domain services
- Parallel Outputs:
- LogWriterService writes processed data to persistent storage independently
- Controllers provide real-time access to current sensor values
- Optional display and web interfaces enable monitoring capabilities
- Data Streaming: CanbusSchedulerService transmits selected sensor data to external devices over CAN Bus, with each outgoing message operating on its own independent schedule
The system leverages Zephyr RTOS threading primitives:
- Scheduler services run in separate threads
- Data handoff between collection and processing stages is synchronized through semaphores/locks
- Hardware access is synchronized through Zephyr's device API locking mechanisms
- app/src - main application source code
- app/src/configuration - JSON and CBOR device configuration related services and schemas
- app/src/controllers - interfaces for device control and data access
- app/src/domain - business logic layer
- app/src/subsys - independent subsystems and adapters
- app/src/utilities - utility functions and classes
- app/src/views - UI for optionally supported display
- app/boards - board-specific configuration
- app/libs - external libraries
- boards - custom board definitions
- tests - functional and unit tests
- modules - external Zephyr modules
The configuration system employs two separate subsystems: JSON and CBOR. JSON is used for user-supplied device configuration. CBOR is used for internal configuration management, as the binary format is more compact and faster to manipulate.
Boost.JSON is used for JSON parsing and serialization.
zcbor is used for CBOR parsing and serialization. The zcbor library provides helper methods for setting up serializers and deserializers, and the project's main CMake file contains helper scripts to generate them. However, since the generated code doesn't support the C++ features this project requires, the current implementation uses the generated helper methods only as a reference for hand-written, C++-feature-rich versions.
With few exceptions, the core codebase does not rely on third-party library APIs directly. Instead, it provides its own abstractions and interfaces, typically defined in the app/src/subsys directory.
The domain layer is responsible for business logic and data processing. It is located in the app/src/domain directory. Its two main components are SensorsProcessingService and CanbusSchedulerService.
SensorsProcessingService is responsible for the collection and processing of sensor data. It is located in the app/src/domain/sensor_domain/services directory.
CanbusSchedulerService is responsible for scheduling and streaming outgoing CAN Bus messages. It is located in the app/src/domain/canbus_domain/services directory.
Another key domain component is LogWriterService, located in app/src/domain/logging_domain/services directory. It is responsible for logging sensor data to a file. The main log file format currently used is ASAM MDF version 4. Logging format-specific implementations are located in app/src/domain/logging_domain/loggers directory.
The domain layer includes Configuration Managers, components responsible for loading, saving, and configuring the corresponding domain components. Configuration manager implementations are located in app/src/domain/domain_component/configuration directory.
The project uses submodules for external dependencies. To initialize submodules, run:
```
git submodule update --init --recursive
```

The development environment is based on Docker. Use example.docker-compose.yml as an example. Rename it to docker-compose.yml to use it.
By default, the container will run with privileged mode enabled and pid mode set to host to allow access to devices connected to the host. Devices must be connected to the host before running the container.
If using Docker for Windows with Docker using the WSL2 engine, make sure to attach the device to the WSL container. This can be done with the help of the usbipd-win tool. Example:
```
usbipd attach --wsl --busid <busid>
```

where <busid> is the bus ID of the device, which can be found with the usbipd list command.
The connected device should be visible in the container as /dev/ttyACM0. Test the presence of the device in the container with the ls /dev/ttyACM0 command.
For VS Code build setup examples, refer to .vscode_example/tasks.json. Alternatively, you can build the application using the following command:
```
west build -b $BOARD app
```

where $BOARD is the target board.
Currently, the following boards are supported:
- native_sim
- qemu_cortex_m3
- esp32s3_devkitc_procpu
Running and debugging in a simulator can be more convenient and time-efficient. To run the compiled application in a simulator, use:
```
west build -t run
```

For VS Code debugging setup examples, refer to .vscode_example/launch.json.
Once the application is built, flash it to the target board with:
```
west flash
```

ESP32S3 DevKitC v1.3 / EerieLeap PM v0.2 (Reference Docs)
Build with Bootloader:
```
west build -p auto -b esp32s3_devkitc/esp32s3/procpu --sysbuild ./app
```

Simple Build:

```
west build -p auto -b esp32s3_devkitc/esp32s3/procpu ./app
```

Serial Monitor:

```
west espressif monitor
```

Debugging
Debugging works to some extent. The Cortex-Debug extension for VS Code can be used for this purpose. A config example set up for a Docker container can be found in .vscode_example/launch.json.
For manual GDB, either run west debug, or run OpenOCD in one terminal:

```
/home/ubuntu/zephyr/workspace/utilities/openocd-esp32/bin/openocd \
  -f /home/ubuntu/zephyr/workspace/utilities/openocd-esp32/share/openocd/scripts/board/esp32s3-builtin.cfg \
  -c "set ESP32_ONLYCPU 1; set ESP_FLASH_SIZE 0; set ESP_RTOS Zephyr" \
  -c "init; halt; esp appimage_offset 0" \
  -c "esp32s3.cpu0 configure -rtos Zephyr" \
  -c "init" \
  -c "reset init"
```

and GDB in another:

```
/home/ubuntu/zephyr-sdk-0.17.4/xtensa-espressif_esp32s3_zephyr-elf/bin/xtensa-espressif_esp32s3_zephyr-elf-gdb \
  -ex 'target extended-remote :3333' \
  -ex 'symbol-file build/zephyr/zephyr.elf' \
  -ex 'mon reset halt' \
  -ex 'maintenance flush register-cache' \
  -ex 'break main' \
  -ex 'continue'
```

MCUDev DevEBox STM32H7XX_M (Reference Docs)
Required Tools
STM32CubeProgrammer needs to be installed on the host machine. If using Docker, you can take advantage of the build script, which expects to find the STM32CubeCLT installer at tools/st-stm32cubeclt_1.20.0.sh. Download the installer from the ST website and place it at tools/st-stm32cubeclt_1.20.0.sh; the Dockerfile will take care of installing it during the build process.
Build:
```
west build -p auto -b mcudev_devebox_stm32h743_hw_20 ./app
```

Debug:
The Cortex-Debug extension for VS Code can be used for debugging. An example configuration valid for a Docker container in combination with ST-Link connected over SWD can be found in .vscode_example/launch.json.
The project uses zcbor to generate CBOR helpers from schemas. To install zcbor:
```
pip install zcbor
```

The project uses html-minifier-terser to minify HTML files. To install html-minifier-terser:
```
npm install -g html-minifier-terser
```

Use the CBOR serializer helpers in the core CMakeLists.txt file to generate CBOR helpers for the configuration schemas. Uncomment the corresponding line to generate the base set of serialization classes. Files will appear in the src/configuration/cbor/generated directory. Finalized files are expected to be placed in the src/configuration/cbor/<base_name> directory.
To access the web interface, connect to the WiFi network with SSID EerieLeap and use the address 192.168.4.1:8080 in your browser.
To access the web interface while running in the simulator, follow the instructions in the Networking with native_sim board documentation. Run the ./net-setup.sh up script first, then start the simulator. In a Docker container, net-tools can be found in the ~/zephyr/net-tools directory.
To gain access to the web interface, the port needs to be forwarded from the loopback interface to the TAP IP. The socat tool can be used for this purpose. Run the following command:
```
socat TCP-LISTEN:8080,fork TCP:192.0.2.1:8080
```

where 192.0.2.1 is the TAP IP address.
Once the port is forwarded, the web interface can be accessed at http://localhost:8080.