Bandai Namco Publishes a Repository with Motion Datasets

The datasets cover a range of content, such as daily activities, fighting, and dancing, performed in styles such as active, tired, and happy.

Bandai Namco Research has shared a repository with motion datasets collected by the team. The datasets cover a diverse range of content, including daily activities, fighting, and dancing, performed in styles such as active, tired, and happy. They can be used as training data for Motion Style Transfer (MST) models.

The researchers are interested in producing diverse stylized motions for games and films that pursue realistic and expressive character animation. This led the team to MST, "which aims to convert the motion in a clip with a given content into another motion in a different style while keeping the same content."

"A motion is composed of a content and style, where content is the base of the motion and style comprises the attributes such as mood and personality of the character tied to the motion."

Currently, two datasets are available in this repository:

  • The first contains 17 types of wide-ranging content, including daily activities, fighting, and dancing; 15 styles with expressive variety; and a total of 36,673 frames.
  • The second has 10 types of content, mainly locomotion and hand actions; 7 styles with a single, uniform expression; and a total of 384,931 frames.

Each dataset is based on the motion of three professional actors, captured at Bandai Namco's motion capture studio. The team applied noise removal, proportion alignment, and clipping, and saved the data in BVH format.
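BVH is a plain-text format with a skeleton `HIERARCHY` section followed by a `MOTION` section of per-frame channel values, so a clip's joint names and frame count can be read with a few lines of code. Below is a minimal, hedged sketch of such a reader; the embedded sample BVH is illustrative only and is not taken from the Bandai Namco datasets.

```python
# Minimal sketch: extract joint names, frame count, and frame data from BVH text.
# The SAMPLE_BVH below is a tiny hypothetical clip, not from the actual datasets.

SAMPLE_BVH = """HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 10.0 0.0
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.0333333
0 90 0 0 0 0 0 0 0
0 90 0 0 5 0 0 5 0
"""

def parse_bvh(text):
    """Return (joint_names, frame_count, frame_time, frames) from BVH text."""
    joints = []          # joint names in declaration order
    frame_count = 0
    frame_time = 0.0
    frames = []          # one list of floats per motion frame
    in_motion = False
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] in ("ROOT", "JOINT"):
            joints.append(parts[1])
        elif parts[0] == "MOTION":
            in_motion = True
        elif in_motion and parts[0] == "Frames:":
            frame_count = int(parts[1])
        elif in_motion and parts[0] == "Frame" and parts[1] == "Time:":
            frame_time = float(parts[2])
        elif in_motion and parts[0][0] in "-0123456789.":
            frames.append([float(v) for v in parts])
    return joints, frame_count, frame_time, frames

joints, n_frames, dt, frames = parse_bvh(SAMPLE_BVH)
print(joints, n_frames, round(1 / dt))  # joint list, frame count, approx. FPS
```

A reader like this is enough to inspect clips or count frames; full skeleton reconstruction would also need the `OFFSET` and `CHANNELS` lines, which the sketch skips.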

You can find the repository on GitHub.
