One of the main issues of using an IMU-based motion capture suit, like the Perception Neuron, is that the positional data you get out of it is not absolutely precise.
For example, if you mark your initial position on the ground, walk 5-10 meters forward, then turn around and go back to the marked position, you will notice that although you are physically in the same location, in Axis (the software that gathers the mocap data from the Perception Neuron suit) you are near the starting location, but not exactly there.
This is a well-known limitation of IMU-based motion capture suits, which do not use any kind of optical tracking: small sensor errors accumulate over time, so the estimated position drifts away from the real one.
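To make the drift concrete, here is a minimal Python sketch (with a hypothetical sensor bias, not measured Perception Neuron data) of why dead-reckoning from an IMU drifts: position comes from double-integrating acceleration, so even a tiny constant bias grows quadratically over time.

```python
def drift_from_bias(seconds=10.0, dt=0.01, accel_bias=0.02):
    """Double-integrate a (biased) acceleration reading into position.

    accel_bias is a hypothetical constant sensor error in m/s^2; the
    subject is standing still, so the true acceleration is zero.
    """
    velocity = 0.0
    position = 0.0
    steps = int(seconds / dt)
    for _ in range(steps):
        measured_accel = 0.0 + accel_bias  # sensor reports only its bias
        velocity += measured_accel * dt    # integrate once -> velocity
        position += velocity * dt          # integrate twice -> position
    return position

# A 0.02 m/s^2 bias alone yields roughly 1 m of position error
# after 10 seconds, even though the subject never moved.
print(drift_from_bias())
```

The suit's internal filtering of course does far better than raw double integration, but the principle is the same: without an external absolute reference, small errors never get corrected.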
In order to solve this issue I experimented a bit with integrating a Vive Tracker into the setup, using Unreal Engine 4 to blend the realtime animation stream from Axis with the Vive Tracker's world position.
After some tests I ended up creating an IK setup driven by the realtime data from Axis.
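Conceptually, the blend works like this (a simplified Python sketch of the idea, not the actual UE4 Blueprint/IK graph; the bone names and the calibration offset are placeholders): keep the full-body pose from Axis, but translate the skeleton so the pelvis lands where the Vive Tracker says it is.

```python
def snap_pose_to_tracker(mocap_pose, tracker_position, tracker_offset=(0.0, 0.0, 0.0)):
    """Return a copy of the mocap pose with the pelvis moved to the
    Vive Tracker's absolute position.

    mocap_pose: dict of bone name -> world-space (x, y, z) position.
    tracker_offset: calibrated offset between the physical tracker
    mount and the pelvis bone (hypothetical values here).
    """
    target = tuple(t + o for t, o in zip(tracker_position, tracker_offset))
    delta = tuple(n - c for n, c in zip(target, mocap_pose["pelvis"]))
    # Shift every bone by the same correction so the limbs keep their
    # pose relative to the pelvis instead of stretching.
    return {bone: tuple(p + d for p, d in zip(pos, delta))
            for bone, pos in mocap_pose.items()}
```

Because the correction moves the feet together with the pelvis, the feet no longer stay planted where the mocap data put them, which leads to the sliding issue discussed later in the post.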
The first video below shows how the IK setup works when using a Vive Tracker:
On the left: Realtime Mocap data from Axis
On the right: Realtime Mocap data from Axis with the Pelvis driven by the Vive Tracker
The second video shows the IK setup driving the entire body (test done while wearing the suit), and highlights the difference between the absolute positioning gained by adding the Vive Tracker and the positioning from the Perception Neuron suit itself:
On the left: Driven by the IK setup, with the Vive Tracker on the hips, while everything else is driven by mocap data from Axis
On the right: Realtime Mocap data from Axis
As you can see in the second video, there are noticeable issues, especially feet sliding (which I was expecting), caused by the fact that the Pelvis is driven by the Vive Tracker and therefore isn't relative to the legs' position/orientation.
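One common way to handle this (my assumption about a possible fix, not what this setup currently uses) is a simple foot-lock: while a foot is near the ground, pin it to the point where it touched down, and feed that pinned position to the leg IK instead of the raw mocap position. A minimal per-frame Python sketch, with a hypothetical height threshold:

```python
def lock_foot(foot_position, locked_at, ground_threshold=0.05):
    """Return (position_to_use, locked_at) for one foot each frame.

    foot_position: current world-space (x, y, z) of the foot, z up.
    locked_at: position where the foot was pinned, or None if airborne.
    ground_threshold: height in meters below which the foot counts as
    planted (hypothetical value).
    """
    if foot_position[2] <= ground_threshold:
        if locked_at is None:
            locked_at = foot_position  # foot just touched down: pin it here
        return locked_at, locked_at    # hold the pinned position
    return foot_position, None         # foot in the air: follow the mocap
```

The leg IK then solves toward the returned position, so the pelvis can follow the tracker while a planted foot stays put.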
In addition, there is some odd torso popping caused by the IK setup, which I'm about to replace with an alternative method.
As soon as the feet sliding issue is solved, I'll post an update and a playable scene for all Perception Neuron users to test.
I've also done some tests integrating the Vive Controllers, which adds another level of precision to the entire setup, and the results are quite good.
Up next I will integrate the Vive HMD with the Perception Neuron suit, and customize the experience a bit so that the controller's functionality is available through hand gestures, similar to what I already did with the Noitom Hi5 VR Gloves.