This project involved developing a low-cost, real-time system to analyze human postural control by correlating physical joint movement with underlying muscle activation.
The system integrates two key data streams: a webcam feed, processed with OpenCV, to track joint motion, and a surface electromyography (EMG) sensor to read muscle activity. As the project lead for vision and software, I was responsible for developing the PyQt5 application that serves as the central dashboard. This GUI handles camera calibration, real-time ArUco marker detection, 3D pose estimation, and data synchronization. The final application plots both the joint motion data and the live EMG signal on synchronized Matplotlib charts for direct, real-time analysis.
The system is divided into three layers:
Data Acquisition Layer
Processing Layer
Data Visualization Layer
The chart below shows the three layers and the connections between them.
The acquisition layer consists of two parallel systems. For visual data, a standard webcam captures a live video feed while ArUco markers are placed on the subject's ankle, knee, and hip. For biological data, a 3-electrode EMG sensor is placed on a target muscle (e.g., the tibialis anterior) following the SENIAM guidelines for accurate signal acquisition. The raw EMG signal is processed and filtered by an Arduino Due (12-bit ADC), which then transmits the clean data to the main PC over a USB serial connection.
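As a rough illustration of that serial link, the sketch below reads one filtered EMG sample per line from the Arduino over Pyserial. The port name, baud rate, and line-based framing are assumptions for illustration, not the project's exact protocol.

```python
# Minimal sketch of the EMG serial read loop; port, baud rate, and
# line framing ("<value>\n" per sample) are illustrative assumptions.
import serial

PORT = "/dev/ttyACM0"   # hypothetical port name; e.g. "COM3" on Windows
BAUD = 115200           # assumed baud rate

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    ser.reset_input_buffer()                  # discard any stale bytes
    while True:
        raw = ser.readline().decode(errors="ignore").strip()
        if not raw:
            continue                          # read timed out
        try:
            sample = int(raw)                 # 12-bit ADC value, 0-4095
        except ValueError:
            continue                          # skip partial/garbled lines
        print(sample)                         # in the real app this feeds the GUI
```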
I led the development of the main control software in Python. The custom PyQt5 dashboard I built manages the entire workflow. It guides the user through camera calibration (using an 8x6 chessboard) to remove lens distortion. Using OpenCV, the application detects ArUco markers in real-time and calculates their 3D position and orientation using the Perspective-n-Point (PnP) algorithm.
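The sketch below outlines that vision pipeline under some stated assumptions: a one-time chessboard calibration, then per-frame ArUco detection and PnP pose recovery. It uses the classic cv2.aruco module API (opencv-contrib-python before 4.7); the marker size, dictionary, and square size are illustrative values, not the project's exact parameters.

```python
# Hedged sketch of the calibration + marker-pose pipeline.
import cv2
import numpy as np

MARKER_SIZE = 0.05  # assumed marker edge length in metres

def calibrate(frames, board=(8, 6), square=0.025):
    """Estimate camera matrix and distortion from chessboard views."""
    objp = np.zeros((board[0] * board[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square
    obj_pts, img_pts = [], []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, board)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                           gray.shape[::-1], None, None)
    return K, dist

def marker_poses(frame, K, dist):
    """Detect ArUco markers and solve PnP for each detection."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    half = MARKER_SIZE / 2
    # 3D model of a square marker centred at its own origin
    obj = np.array([[-half, half, 0], [half, half, 0],
                    [half, -half, 0], [-half, -half, 0]], np.float32)
    poses = []
    for c in corners or []:
        ok, rvec, tvec = cv2.solvePnP(obj, c.reshape(4, 2), K, dist)
        if ok:
            poses.append((rvec, tvec))  # rotation and translation per marker
    return ids, poses
```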
A key challenge was data synchronization. The application uses a "handshake" protocol and multithreading to manage the high-frequency data from both Pyserial (EMG) and OpenCV (video) simultaneously. When the user clicks "Start," the GUI aligns both data streams and visualizes them on seven live Matplotlib plots, providing an immediate, correlated view of muscle firing and joint translation and rotation.
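A minimal sketch of that synchronization idea follows, assuming one QThread per source and a buffer that only releases data for plotting once both streams have delivered their first sample (the "handshake"). All class, signal, and stream names here are hypothetical.

```python
# Illustrative sketch: worker threads emit timestamped samples; the GUI
# thread buffers them and starts plotting once both streams are live.
import time
from PyQt5.QtCore import QThread, pyqtSignal

class StreamWorker(QThread):
    sample = pyqtSignal(float, float)      # (timestamp, value)

    def __init__(self, read_fn, parent=None):
        super().__init__(parent)
        self.read_fn = read_fn             # blocking read: EMG sample or joint angle
        self.running = True

    def run(self):
        while self.running:
            value = self.read_fn()
            self.sample.emit(time.monotonic(), value)

class Synchronizer:
    """Hold both buffers until each stream has produced data."""
    def __init__(self):
        self.buffers = {"emg": [], "cv": []}
        self.started = False

    def push(self, stream, t, value):
        self.buffers[stream].append((t, value))
        if not self.started and all(self.buffers.values()):
            self.started = True            # handshake complete: begin plotting
```

Keeping the blocking reads inside worker threads and touching the GUI only through Qt signals prevents either stream from freezing the dashboard.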
8x6 chessboard used for camera calibration
GUI Concept Drawing
This academic project was managed using a formal, structured workflow. We used Probant (a project management platform) for task management and milestone tracking, creating a Gantt chart to keep the team on schedule. All project resources, including code, research papers, and technical notes, were centralized on Microsoft Teams and SharePoint. SharePoint also hosted a collaborative wiki, which served as our primary source of technical documentation and the final operation manual. For our scientific research, we used Zotero for bibliography and citation management.
Bibliography Management using Zotero
Project Gantt Chart
This section lists the skills I developed or improved while working on this project.
GUI Development: Building a complex, multi-threaded dashboard from scratch using Python and PyQt5.
Computer Vision (CV): Implementing OpenCV for camera calibration, real-time ArUco marker tracking, and pose estimation.
3D Pose Estimation: Applying the Perspective-n-Point (PnP) algorithm to extract 3D translation and rotation from 2D video.
Data Synchronization: Designing and implementing a software-based "handshake" to align two separate, real-time data streams (EMG and CV).
Data Visualization: Integrating dynamic Matplotlib plots into a live PyQt5 application (a minimal embedding sketch follows this list).
Hardware/Software Integration: Interfacing with a microcontroller (Arduino Due) over Pyserial.
Biomedical Sensing: Processing and visualizing live EMG signals.
Project Management: Using Probant for task delegation and Gantt chart milestone tracking.
Technical Documentation: Authoring a comprehensive operation manual and creating a SharePoint project wiki.
Research & Collaboration: Managing citations with Zotero and using MS Teams for file management and team coordination.
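As noted in the Data Visualization item above, here is a minimal sketch of embedding a live Matplotlib canvas in a PyQt5 window, redrawn on a QTimer. The random data is a stand-in for a real EMG or pose stream, and all names are illustrative.

```python
# Minimal live Matplotlib plot embedded in PyQt5; the data source is simulated.
import sys
import random
from collections import deque
from PyQt5.QtWidgets import QApplication, QMainWindow
from PyQt5.QtCore import QTimer
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.figure import Figure

class LivePlot(QMainWindow):
    def __init__(self):
        super().__init__()
        fig = Figure()
        self.ax = fig.add_subplot(111)
        self.canvas = FigureCanvas(fig)
        self.setCentralWidget(self.canvas)
        self.data = deque(maxlen=200)          # rolling window of samples
        self.timer = QTimer(self)
        self.timer.timeout.connect(self.refresh)
        self.timer.start(50)                   # redraw ~20 times per second

    def refresh(self):
        self.data.append(random.random())      # stand-in for a live sample
        self.ax.clear()
        self.ax.plot(self.data)
        self.canvas.draw_idle()

if __name__ == "__main__":
    app = QApplication(sys.argv)
    win = LivePlot()
    win.show()
    sys.exit(app.exec_())
```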