What is Project Rigel?

Project Rigel is an "All in One" Full Body Motion Capture solution that provides body, finger and facial tracking, built using Unreal Engine 4.

The body is driven by IK using Vive Trackers, which ensures more accurate results than IMU based motion capture suits.

Rigel Demo Download

Here you can download the executable demo:
Rigel Demo

Since the introduction of the SteamVR Input plugin, there have been some changes to the Vive Tracker setup, so we made a video explaining what needs to be done to get everything working during Rigel's calibration.
Rigel Demo Guidelines

User feedback is very important at this stage, so once you've tested Rigel, please spend a couple of minutes on this survey.
Rigel Demo Survey

How does it work?

Unreal Engine 4 is used to build a runtime Full Body IK rig.

SteamVR sends all the data from the Motion Controllers and Vive Trackers into Unreal Engine 4 in realtime.

A dedicated Blueprint collects all the motion data and assigns it to the Animation Blueprint.
Additional external hardware, such as VR gloves and iPhone facial data, is already integrated into the Animation Blueprint, and it is up to the user to choose which devices they would like to use.

How does it compare to Ikinema Orion, and what additional features does it offer?

We've been using Ikinema Orion for many different projects in the past, but since Ikinema is no longer around, we had to find an alternative, so we decided to build our own solution.

We're able to provide a robust and efficient Full Body Tracking solution, and on top of it we built additional features, such as:

Hands/Elbows Clipping Removal

Often, while recording motion capture animations, an animator has to adjust the hand/elbow positions because they clip/intersect with the CG character they're working on. This feature prevents those body parts from clipping, so the resulting motion capture animation is much cleaner, reducing the need for manual adjustment and saving precious time during the cleanup process.
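A minimal sketch of the idea, assuming the body is approximated by a sphere for the intersection test (the function names, the sphere approximation, and the push-out strategy are illustrative assumptions, not Rigel's actual geometry handling):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// If a hand/elbow IK target falls inside a sphere approximating the
// character's body, push it out to the sphere's surface so it never
// intersects the mesh. Rigel presumably tests against real geometry;
// this only illustrates the clipping-removal concept.
Vec3 RemoveClipping(const Vec3& target, const Vec3& bodyCenter, double bodyRadius) {
    double dx = target.x - bodyCenter.x;
    double dy = target.y - bodyCenter.y;
    double dz = target.z - bodyCenter.z;
    double dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist >= bodyRadius || dist == 0.0) return target;  // no clipping
    double scale = bodyRadius / dist;                      // push out to surface
    return { bodyCenter.x + dx * scale,
             bodyCenter.y + dy * scale,
             bodyCenter.z + dz * scale };
}
```

Running this per frame on the IK targets keeps the recorded curves clean instead of leaving intersections for manual cleanup.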

Peak Removal

It's well known that the Vive/Vive Pro/Valve Index comes with an HMD, controllers and two Lighthouse base stations, which is perfectly fine for playing in VR. When using multiple Vive Trackers, however, some occlusion may occur: both Lighthouse base stations lose tracking on one or more Vive Trackers, which causes the Tracker to jump very quickly to another location and then snap back to its tracked position.

This is known as an "animation peak", and it usually lasts a couple of frames.

Most 3D software packages have built-in filters that remove those peaks, or the animator removes them manually.

So instead of filtering the animation in an additional piece of software, we built a runtime Peak Removal that works as follows:

Character1 receives all the tracking data from the Vive Trackers, while in the background there is an additional character, let's call it Character2, that runs with a 60-frame delay.

A dedicated setup detects whether a Peak occurs on Character1; when one does, this information is sent to Character2, and thanks to the 60-frame delay we smooth out the Peak, so that the recorded animation won't have those issues.

Since Peaks may occur often ( depending on your SteamVR and Lighthouse setup ), this can save lots of animation takes, since a Peak at the wrong time can ruin parts of an animation.
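The two-character scheme above can be sketched in plain C++ for a single tracked value (the class name, the jump-threshold test, and the replace-with-last-good-value repair are illustrative assumptions; only the delay-buffer idea comes from the text):

```cpp
#include <cmath>
#include <cstddef>
#include <deque>

class DelayedPeakRemover {
public:
    DelayedPeakRemover(std::size_t delayFrames, double jumpThreshold)
        : delay(delayFrames), threshold(jumpThreshold) {}

    // Push one live sample (Character1's stream); returns true and writes
    // `out` once `delay` frames of look-ahead exist, i.e. Character2's
    // delayed, repaired output.
    bool Push(double sample, double& out) {
        // Peak detection: a jump larger than `threshold` against the
        // previous frame is treated as a tracking peak.
        if (!buffer.empty() && std::fabs(sample - buffer.back()) > threshold) {
            // Repair the spike before it ever leaves the delay buffer;
            // a real implementation would interpolate across the gap.
            sample = buffer.back();
        }
        buffer.push_back(sample);
        if (buffer.size() <= delay) return false;
        out = buffer.front();
        buffer.pop_front();
        return true;
    }

private:
    std::size_t delay;
    double threshold;
    std::deque<double> buffer;
};
```

The delay gives the filter look-ahead: the spike is already fixed by the time its frame is emitted, so the recorded take never contains it.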

Adaptable IK Retargeting

Project Rigel's setup can drive any humanoid character, no matter its size or proportions, thanks to its fully retargetable IK system.
This means you can record Motion Capture animations directly with the character you want, rather than recording with a predefined character ( i.e. the UE4 Mannequin ), removing the need for retargeting after recording.
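One simple way to picture proportion-independent retargeting is scaling the performer's tracker offsets by the ratio of character height to performer height; this is an illustrative assumption, not necessarily how Rigel's IK system works internally:

```cpp
struct Vec3 { double x, y, z; };

// Scale a tracker offset (e.g. relative to the pelvis) so that a performer
// of one height can drive a character of a different height. A full solver
// would scale per limb; this shows only the basic proportional mapping.
Vec3 RetargetOffset(const Vec3& trackerOffset,
                    double performerHeight, double characterHeight) {
    double s = characterHeight / performerHeight;
    return { trackerOffset.x * s, trackerOffset.y * s, trackerOffset.z * s };
}
```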

Realtime Data Smoothing

Since the Vive Trackers can shake quite a bit during fast movements, realtime smoothing can drastically reduce this issue while you're recording Mocap Animations, rather than fixing it during the cleanup process.
A slider allows the user to control the amount of smoothing during the recording.
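Slider-controlled smoothing is commonly implemented as an exponential moving average; whether Rigel uses exactly this filter is an assumption, but it shows how a single 0-to-1 parameter trades responsiveness against stability:

```cpp
// Exponential moving average over one tracked value, applied per frame.
// smoothing = 0 passes raw data through; values near 1 heavily damp
// tracker shake at the cost of responsiveness.
struct TrackerSmoother {
    double smoothing = 0.5;   // the user-facing slider value, in [0, 1]
    double state = 0.0;
    bool initialized = false;

    double Update(double raw) {
        if (!initialized) { state = raw; initialized = true; }
        state = smoothing * state + (1.0 - smoothing) * raw;
        return state;
    }
};
```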

Auto Toe Bending

This small addition allows realtime bending of the often-ignored toe joint, which comes in very handy when the animation is polished, since the animation on that joint is already generated automatically inside the Animation Blueprint in Unreal.
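As an illustration of the kind of automatic rule involved (the pitch-to-bend mapping and the clamp limit are assumptions; Rigel's actual Animation Blueprint logic is not documented here):

```cpp
#include <algorithm>

// Map the foot's pitch to a toe bend angle: as the heel lifts off the
// ground (positive pitch), bend the toe joint back toward the floor,
// clamped to a plausible maximum.
double AutoToeBend(double footPitchDeg, double maxToeBendDeg = 60.0) {
    // No bend while the foot is flat or pitching downward.
    return std::clamp(footPitchDeg, 0.0, maxToeBendDeg);
}
```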

What is the supported hardware?

Here is a list of the current and future supported hardware:

Fingers Tracking

Noitom Hi5 VR Gloves
Valve Knuckles
VRGluv ( coming soon )
StretchSense ( coming soon )

Facial Tracking

Face Cap ( streaming via the OSC plugin )
Faceware ( coming soon )
Dynamxyz ( coming soon )

Our goal is to add support for all available hardware, so that users have complete freedom to choose the setup they like.

Is it just for Motion Capture?

You can use it for whatever you need: for example, actor previz during Virtual Production, strapping on a VR Backpack to create a Location Based Entertainment solution, or building a VR Job Training solution to help companies evaluate employee safety.

Flexibility and freedom of choice are key for this solution.

Who's this for?

Whether you're a big studio or an independent developer, this is the most affordable solution on the market for Optical Full Body Motion Capture.

Currently there is a huge gap between IMU based Motion Capture Suits ( Perception Neuron, Rokoko, Xsens and so on ) and Optical based Motion Capture solutions ( OptiTrack, Vicon ), and Project Rigel's goal is to fill that gap by providing a top-quality, affordable Motion Capture Solution.