OpenCap Monocular has been validated on the following activities:
- Walking
- Squatting
- Sit-to-stand

It may work on other activities, but these have not yet been validated. Jumping is currently not supported and does not work reliably.
OpenCap Monocular estimates 3D human kinematics and musculoskeletal dynamics from a single video recorded with a stationary smartphone. It has been validated against marker-based motion capture for walking, squatting, and sit-to-stand activities, and it is free and open-source.
Key outputs include joint kinematics and kinetics, and example tasks are documented alongside videos and figures on this page.
Quantifying human movement (kinematics) and musculoskeletal forces (kinetics) at scale, such as estimating quadriceps force during a sit-to-stand movement, could transform the prediction, treatment, and monitoring of mobility-related conditions. However, traditional motion analysis requires costly and time-intensive laboratory systems, which limit clinical translation. Scalable, accurate tools for biomechanical assessment are critically needed. We introduce OpenCap Monocular, an algorithm that estimates 3D kinematics and kinetics from a single static smartphone video. The method refines 3D human pose estimates from a monocular pose estimation model from computer vision (WHAM) via optimization, computes the kinematics of a biomechanically constrained skeletal model, and estimates kinetics via physics-based simulation and machine learning. We validated OpenCap Monocular against marker-based motion capture and force plate data for walking, squatting, and sit-to-stand tasks. OpenCap Monocular achieved low kinematic error (4.8° rotational mean absolute error [MAE]; 3.4 cm translational MAE), outperforming a regression-only computer vision baseline (9.3° rotational MAE; 11.0 cm translational MAE). It also estimated ground reaction forces during walking with accuracy comparable to, or better than, that of our prior two-camera OpenCap system. We demonstrate clinically meaningful accuracy in applications related to frailty and knee osteoarthritis, including estimating the knee extension moment during sit-to-stand transitions and the knee adduction moment during walking. OpenCap Monocular is deployed via a smartphone app and secure cloud computing, enabling free, accessible single-smartphone biomechanical assessments. Such accessibility enables large-scale remote studies and, ultimately, routine evaluations of mobility and function in the clinic or at home. Our code is available at github.com/utahmobl/opencap-monocular.
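The validation numbers above are mean absolute errors (MAE) computed between estimated and marker-based joint coordinates. As a minimal illustration of the metric itself (the joint-angle values below are made up for the example, not data from the paper):

```python
import numpy as np

# Hypothetical knee-flexion time series in degrees:
# OpenCap Monocular estimate vs. marker-based motion capture reference.
est = np.array([10.0, 12.5, 15.0, 13.0])
ref = np.array([11.0, 12.0, 16.5, 12.0])

# MAE: average of the absolute per-frame differences.
rotational_mae = np.mean(np.abs(est - ref))
print(f"rotational MAE: {rotational_mae:.2f} deg")  # prints "rotational MAE: 1.00 deg"
```

In the paper, this average is taken across all rotational coordinates (reported in degrees) and, separately, across translational coordinates (reported in centimeters).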
We developed an interactive web-based visualizer to explore the 3D kinematics computed by OpenCap Monocular alongside the original video and ground truth data.
You can interact with sample results, compare different methods, and view the movements from any angle.
Ready to measure human movement with a single smartphone? Start using OpenCap Monocular today!
OpenCap Monocular is freely available and requires no specialized hardware beyond an iPhone or iPad.