Measuring Non-Human Kinematics & Biomechanics for Unlabeled Subjects, 10-R8975

Principal Investigators
David Anthony
Brian Swenson
Inclusive Dates 
09/27/19 - Current

BACKGROUND

The biomechanics industry has long used motion capture systems to track the movement and forces of people. Recently, SwRI developed a 3D markerless motion capture system for human biomechanical assessment, marketed primarily to athletes, that provides measurement accuracy comparable to marker-based 3D motion capture systems. The objective of this project is to adapt that technology to track the health, activity levels, and habits of primates, expanding our potential technical applications, customer base, and funding sources to zoos and other animal research organizations. For this internal research project, we partnered with Texas Biomedical Research Institute (TBRI) to monitor baboons using SwRI’s markerless biomechanics technology. If the approach proves feasible, the resulting data will characterize primate health and behavior, making the system a valuable tool for primate studies.

APPROACH

The primary goal of this research is to adapt our existing markerless motion capture technology to baboons, culminating in a demonstration system that takes TBRI-captured video as input and outputs a baboon kinematic model. In Phase 1 of this project, we developed a methodology to set up the camera system and infrastructure needed to capture visual data of the baboon enclosures at TBRI, to establish a method of transferring and storing the image data on a computer indoors at TBRI, and to modify our markerless motion capture system to work on non-human primates. In Phase 2, we captured additional data at TBRI to train the neural network and improve system performance, developed an annotation tool to help label specific joints and features on the baboons, and began developing a pipeline that combines baboon video with a baboon kinematic model. We will use the captured video data to derive quantitative biomarkers for health.
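For readers who want a concrete picture of the intended data flow, the short sketch below shows how per-frame 2D keypoint detections from TBRI video would feed a downstream kinematic-model fit. The class and function names are illustrative placeholders rather than SwRI code, and the joint count of 26 (including tail points) is an assumption for the example only.

```python
# Minimal sketch of the video-to-kinematics data flow; all names and sizes are illustrative.
from dataclasses import dataclass
from typing import List
import numpy as np

@dataclass
class Keypoints2D:
    frame_idx: int
    camera_id: int
    xy: np.ndarray          # shape (n_joints, 2), pixel coordinates
    confidence: np.ndarray  # shape (n_joints,), detector confidence in [0, 1]

def detect_keypoints(frame: np.ndarray, camera_id: int, frame_idx: int) -> Keypoints2D:
    """Placeholder for the trained 2D pose network; 26 joints is an assumed count."""
    n_joints = 26
    return Keypoints2D(frame_idx, camera_id,
                       xy=np.zeros((n_joints, 2)),
                       confidence=np.zeros(n_joints))

def fit_kinematic_model(tracks: List[Keypoints2D]) -> dict:
    """Placeholder for fitting the baboon kinematic model to the per-frame detections."""
    return {"joint_angles": np.zeros((len(tracks), 26))}

if __name__ == "__main__":
    # Two dummy frames from one camera, just to show the end-to-end flow.
    frames = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(2)]
    tracks = [detect_keypoints(f, camera_id=0, frame_idx=i) for i, f in enumerate(frames)]
    print(fit_kinematic_model(tracks)["joint_angles"].shape)
```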

ACCOMPLISHMENTS

To date, we have developed an annotation tool to label specific joints and features on the baboon, including additional points on the baboon’s tail, and have expanded the associated neural network to track these additional points. We also implemented camera calibration algorithms in the annotation tool to calculate camera intrinsic and extrinsic parameters from recorded video. Using this calibration, multiple cameras can be deployed to a primate habitat, allowing us to estimate 3D positions of primate joints, improve gait analysis, perform more detailed analysis of primate behavior and motion, and make data collection more robust to occlusions from objects in the cage.
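As a hedged illustration of these calibration and triangulation steps, the sketch below uses standard OpenCV routines (cv2.findChessboardCorners, cv2.calibrateCamera, cv2.triangulatePoints). The checkerboard target, its dimensions, and the assumption that per-camera extrinsics (R, t) come from a separate extrinsic calibration step are ours; the project derives its parameters from recorded habitat video, which may use a different procedure.

```python
# Sketch of single-camera intrinsic calibration and two-camera joint triangulation.
# Board size, square size, and the extrinsics interface are assumptions for this example.
import cv2
import numpy as np

def calibrate_intrinsics(frames, board_size=(9, 6), square_mm=25.0):
    """Estimate one camera's intrinsic matrix and distortion from checkerboard frames."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm
    obj_pts, img_pts, image_size = [], [], None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]  # (width, height)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
    return K, dist

def triangulate_joint(K1, R1, t1, K2, R2, t2, xy1, xy2):
    """Triangulate one joint seen by two calibrated cameras.
    xy1, xy2 are (2,) pixel coordinates; R, t are world-to-camera extrinsics."""
    P1 = K1 @ np.hstack([R1, t1.reshape(3, 1)])
    P2 = K2 @ np.hstack([R2, t2.reshape(3, 1)])
    pts4d = cv2.triangulatePoints(P1, P2,
                                  np.asarray(xy1, float).reshape(2, 1),
                                  np.asarray(xy2, float).reshape(2, 1))
    return (pts4d[:3] / pts4d[3]).ravel()  # dehomogenize to a 3D point in world units
```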

We are currently working to improve neural network performance. While the system performs well on unobstructed views of the baboons, it struggles when obstructions such as chain-link fences partially occlude the primates. We are modifying the neural network training procedure to produce better predictions in these situations. We have identified two approaches that show promising initial results, a denoising autoencoder and a modified spatial dropout function, and are experimenting with both to determine which performs best.
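For concreteness, the sketch below shows minimal PyTorch versions of the two ideas: a spatial-dropout head that zeroes whole feature channels so the network cannot rely on any single local cue a fence might hide, and a small denoising autoencoder trained to reconstruct clean keypoint heatmaps from heatmaps corrupted by synthetic occlusion masks. Layer sizes, the joint count, and the fence-style corruption are illustrative assumptions, not the project’s actual architecture.

```python
# Illustrative PyTorch sketches of the two occlusion-robustness approaches under study.
import torch
import torch.nn as nn

class KeypointHeadWithSpatialDropout(nn.Module):
    """Heatmap head with spatial dropout: entire feature channels are dropped during
    training, discouraging reliance on any single local cue that an occluder may hide."""
    def __init__(self, in_ch=256, n_joints=26, p=0.2):
        super().__init__()
        self.drop = nn.Dropout2d(p)                       # drops whole channels, not pixels
        self.head = nn.Conv2d(in_ch, n_joints, kernel_size=1)

    def forward(self, feats):
        return self.head(self.drop(feats))                # per-joint heatmaps

class HeatmapDenoisingAutoencoder(nn.Module):
    """Small autoencoder trained to recover clean keypoint heatmaps from heatmaps
    corrupted by synthetic occlusion masks (e.g., chain-link-like patterns)."""
    def __init__(self, n_joints=26):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(n_joints, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, n_joints, 4, stride=2, padding=1))

    def forward(self, noisy_heatmaps):
        return self.dec(self.enc(noisy_heatmaps))

if __name__ == "__main__":
    # Shape check on dummy data: 26 assumed joints over a 64x64 heatmap grid.
    heatmaps = torch.rand(1, 26, 64, 64)
    print(HeatmapDenoisingAutoencoder()(heatmaps).shape)
```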