To enable advanced research into artificial humanoid control, Microsoft's robotics team has released a library of pre-trained simulated humanoid control policies, along with rollout data for training new policies.


Simulated humanoids are an intriguing platform for studying motor intelligence, since they can mimic the full spectrum of human movement. The acquisition and application of motor skills is an important area of study in machine learning, and physically simulating human abilities poses significant control challenges: a controller must manage a high-dimensional, unstable, and discontinuous system, requiring precise timing and coordination to achieve the desired movement.

Current learning methods all struggle to learn complex humanoid behaviors from a tabula-rasa start. Motion capture (MoCap) data has therefore become an integral part of humanoid control research. MoCap trajectories are sequences of configurations and poses that the human body assumes over the course of a movement, and as such they carry kinematic information about that movement. MoCap demonstrations can help a simulated humanoid acquire basic motor skills, making it easier to learn complex control strategies.

Unfortunately, using MoCap data in a physics simulator requires recovering the actions (e.g., joint torques) that produce the sequence of kinematic poses in a given MoCap trajectory (i.e., tracking the clip). Finding an action sequence that makes a humanoid follow a MoCap sequence is not straightforward. Reinforcement learning and adversarial learning are two approaches that have been used to solve this problem. Training agents to recreate hours of MoCap data is also computationally intensive, and the computational load of finding these actions grows with the amount of MoCap data. So even though MoCap datasets are widely available, only a handful of research organizations with significant computational resources have been able to use them for learning-based humanoid control.
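Tracking is typically cast as a reinforcement learning problem whose per-step reward measures how closely the simulated pose matches the MoCap reference. As a minimal sketch of that idea (the exponentiated squared-distance form and the `scale` weight are illustrative assumptions, not the paper's exact reward):

```python
import numpy as np

def tracking_reward(sim_pose: np.ndarray, mocap_pose: np.ndarray,
                    scale: float = 10.0) -> float:
    """Illustrative per-step tracking reward: exponentiated negative
    squared distance between the simulated and reference joint poses.
    A perfect match yields 1.0; large deviations decay toward 0.0."""
    error = float(np.sum((sim_pose - mocap_pose) ** 2))
    return float(np.exp(-scale * error))

# A perfect match gives the maximum reward of 1.0.
pose = np.array([0.1, -0.3, 0.25])
print(tracking_reward(pose, pose))  # → 1.0
```

An RL agent trained against such a reward must discover, step by step, the torques that keep the simulated body close to the reference clip, which is exactly the expensive computation MoCapAct amortizes by shipping the resulting expert policies.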

A recent Microsoft study presented MoCapAct, a dataset of high-quality MoCap tracking policies for a MuJoCo-based simulated humanoid, along with a collection of rollouts from these expert policies.

Aiming to remove these barriers and broaden the use of MoCap data in humanoid control research, MoCapAct is designed to be compatible with the hugely popular dm_control humanoid simulation environment. Its policies track 3.5 hours of the CMU MoCap dataset, one of the largest publicly available MoCap collections.

The researchers demonstrate the use of MoCapAct for learning varied movements by studying its expert policies and using the expert rollouts to train a single hierarchical policy that can track all of the considered MoCap clips. The low-level component of this policy can then be reused for efficient learning of downstream RL tasks.
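The split described above can be sketched as follows: a task-specific high-level policy emits a latent "skill" command, and a shared low-level module turns that command plus the current observation into joint actions. The dimensions, random weights, and tanh layers below are purely illustrative stand-ins for trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)
OBS_DIM, SKILL_DIM, ACT_DIM = 12, 4, 6  # illustrative sizes

# Hypothetical weight matrices standing in for trained networks.
W_high = rng.standard_normal((SKILL_DIM, OBS_DIM)) * 0.1
W_low = rng.standard_normal((ACT_DIM, OBS_DIM + SKILL_DIM)) * 0.1

def high_level(obs: np.ndarray) -> np.ndarray:
    """Task-specific policy: maps an observation to a latent skill command."""
    return np.tanh(W_high @ obs)

def low_level(obs: np.ndarray, skill: np.ndarray) -> np.ndarray:
    """Reusable motor module: maps (observation, skill) to joint actions.
    Once trained on the expert rollouts, it can be frozen and recycled
    when learning a new high-level policy for a new RL task."""
    return np.tanh(W_low @ np.concatenate([obs, skill]))

obs = rng.standard_normal(OBS_DIM)
action = low_level(obs, high_level(obs))
print(action.shape)  # (6,)
```

Because only the small high-level policy must be retrained per task, this structure is what makes downstream RL learning efficient.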

The team also used the dataset for generative motion completion, training a GPT network that, given a motion prompt, generates a continuation of the motion in the MuJoCo simulator.
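Motion completion is autoregressive: the model is conditioned on a short prompt of observed poses and repeatedly predicts the next step, feeding each prediction back in. The skeleton below illustrates that loop only; the linear `predict_next` is a hypothetical placeholder for the trained GPT, and the dimensions are made up:

```python
import numpy as np

POSE_DIM, HORIZON = 8, 5  # illustrative sizes
rng = np.random.default_rng(1)
W = rng.standard_normal((POSE_DIM, POSE_DIM)) * 0.1  # stand-in for the GPT

def predict_next(history: list) -> np.ndarray:
    """Placeholder for the trained model: here, a linear map of the last pose."""
    return np.tanh(W @ history[-1])

def complete_motion(prompt: np.ndarray, horizon: int = HORIZON) -> np.ndarray:
    """Autoregressively extend a motion prompt by `horizon` steps,
    appending each predicted pose to the history before the next step."""
    history = list(prompt)
    for _ in range(horizon):
        history.append(predict_next(history))
    return np.stack(history)

prompt = rng.standard_normal((3, POSE_DIM))  # 3-step motion prompt
motion = complete_motion(prompt)
print(motion.shape)  # (8, 8): 3 prompt steps + 5 generated steps
```

In the actual system the predicted steps drive the humanoid in MuJoCo, so the completion is a physically simulated motion rather than a pure pose sequence.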

This dataset lets research groups skip the time- and compute-intensive process of learning low-level motor skills from MoCap data. That greatly lowers the barrier to entry for simulated humanoid control, opening rich possibilities for exploring multi-task learning and motor intelligence. The team believes the approach can also be used to train alternative policy architectures such as decision transformers, or in setups such as offline reinforcement learning.

This article is written as a research summary by Marktechpost Staff based on the research paper 'MoCapAct: A Multi-Task Dataset for Simulated Humanoid Control'. All credit for this research goes to the researchers on this project. Check out the paper, GitHub, project page and reference article.


Tanushree Shenwai is an intern consultant at MarktechPost. She is currently pursuing her B.Tech from Indian Institute of Technology (IIT), Bhubaneswar. She is a data science enthusiast and has a keen interest in the scope of application of artificial intelligence in various fields. She is passionate about exploring new technological advancements and applying them to real life.

