MocapNET

FORTH-ModelBasedTracker

A real-time method that estimates the 3D human pose directly in the popular Bio Vision Hierarchy (BVH) format, given estimations of the 2D body joints originating from monocular color images. Our contributions include: (a) A novel and compact 2D pose NSRM representation. (b) A human body orientation classifier and an ensemble of orientation-tuned …
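For readers unfamiliar with the output format: BVH (Bio Vision Hierarchy) is a plain-text motion-capture format that stores a skeleton hierarchy followed by per-frame joint channel values. The minimal illustrative file below follows the standard BVH conventions; the skeleton and values are made up for illustration and are not MocapNET's actual output.

```
HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 10.0 0.0
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.0 90.0 0.0 0.0 0.0 0.0 0.0 5.0 0.0
0.0 90.0 0.0 0.0 0.0 0.0 0.0 10.0 0.0
```

Each MOTION line holds one frame: 6 channel values for the root (translation plus rotation) followed by 3 rotation values per joint, in the order the channels are declared. Files in this format load directly into animation tools such as Blender.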

Stars: 930
Forks: 143
Watchers: 930
Language: C++
License: Other
SrcLog Score: 100
Estimated Cost to Build: $3.27M
Estimated Market Value: $16.41M

Growth over time: 19 data points (stars, forks, watchers), 2021-05-01 → 2026-04-01.

How to clone MocapNET

Clone via HTTPS

git clone https://github.com/FORTH-ModelBasedTracker/MocapNET.git

Clone via SSH

git clone git@github.com:FORTH-ModelBasedTracker/MocapNET.git

Download ZIP

Download master.zip: https://github.com/FORTH-ModelBasedTracker/MocapNET/archive/refs/heads/master.zip

Found an issue?

Report bugs or request features on the MocapNET issue tracker:

Open GitHub Issues: https://github.com/FORTH-ModelBasedTracker/MocapNET/issues