Noitom Perception Neuron Review

The Perception Neuron motion capture suit is possibly one of the most affordable motion capture solutions around: an “all in one” package for recording motion capture animations, since you get both hardware and software together, for a total price of $1,799.


What’s in the box?

The suit comes in a nice bag with the Noitom logo, and all the parts of the suit fit perfectly inside, so you’ll have a good time unboxing all the components; however, you need to be very tidy when putting everything back into the bag, because there is no unused space.

Inside you’ll find four smaller bags: one with the torso straps, one with the leg straps, one with the arm and head straps, and a last one with the two sets of gloves.

At the bottom of the bag you’ll find two antimagnetic metal boxes, where all the sensors are stored, along with a pair of spare fabric gloves.

There is a pocket on the top lid of the bag, where you’ll find the Hub, cables for sensor calibration and a red cable for Hub connectivity.




Technical specifications

The core of the suit is its sensors: the Perception Neuron uses Inertial Measurement Unit (IMU) sensors, each of which provides rotation data, and by combining the data from all the sensors, the software produces full-body animation data.

A very simple way to understand how they work is to imagine three IMU sensors, one placed on the hips and one on each leg. The hips are considered the “center” of the body, so if we stand with our legs still, the rotation value for both the hips and the legs will be, say, 0; but if we move our left leg forward, both leg sensors will report values different from 0 (for example ±20°), and those values, compared to the hips value, are translated into a position in space, as sketched below.
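
To make the idea concrete, here is a minimal 2D forward-kinematics sketch in Python. The segment lengths and angles are made-up illustrative numbers, not Noitom’s actual solver; it only shows how chaining two rotations yields a position in space.

```python
import math

# Minimal 2D forward-kinematics sketch with assumed segment lengths:
# given hip and knee rotations from two IMU sensors, estimate where the
# foot ends up relative to the hips.

THIGH_LEN = 0.45  # meters (assumed)
SHIN_LEN = 0.42   # meters (assumed)

def foot_position(hip_deg, knee_deg):
    """Chain the two rotations; the knee angle is relative to the thigh."""
    hip = math.radians(hip_deg)
    knee = hip + math.radians(knee_deg)  # absolute shin direction
    knee_x = THIGH_LEN * math.sin(hip)
    knee_y = -THIGH_LEN * math.cos(hip)
    return (knee_x + SHIN_LEN * math.sin(knee),
            knee_y - SHIN_LEN * math.cos(knee))

print(foot_position(0, 0))     # standing: foot directly below the hips
print(foot_position(20, -10))  # leg swung forward: the foot moves ahead
```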

All the data is captured by a Hub and sent to a workstation via either USB or WiFi, or it can be recorded onto an SD card and later transferred to the PC manually.

The mocap “suit” is not really a suit that you wear, but rather a series of Velcro straps that you place on specific parts of your body, each with a socket for an IMU sensor, all connected together by cables.

This can be very useful, because it fits almost any physique; on the other hand, it doesn’t look as cool as the mocap suits in the behind-the-scenes videos you see on YouTube, but honestly I’m not too concerned about that.


Hardware Setup

Wearing the suit is quite easy, and since I’ve been using it for quite some time, the first thing I do is place the sensors into their sockets.

Once the sensors are in place, I put on the suit starting from the torso, using the locking system on the front, then the upper-arm straps on each arm, connecting the cables between the straps as well.

I usually also connect the head strap at this point, so that I don’t have to reach for the cable once I’m wearing the torso strap, and I can just grab the head strap and place it on my head.

After that I assemble the three straps for each leg and wear them from the thigh down to the foot, connecting them to the torso strap with the cables.

Then I get the Hub and connect it to the torso strap; since there is a handy clip on the back, I usually attach the Hub to my pocket.

How you proceed next depends on how you’re planning to get the data from the suit to the PC: if you’re going with the wireless setup, you’ll need a power bank (which is not included with the suit, mind you) in order to power the suit and the Hub.

Last come the gloves: since they’re made of fabric, they tend to stick to the Velcro quite easily, so they’re the last part of the suit you want to put on. Again I take care of the cables, this time coming from the upper arms, and the “suit” is complete.

Now the suit can be turned on with a USB cable from the Hub, connected either directly to the PC or to the power bank, which I usually keep in my pocket.

Be careful, because there are two specific ports: one for data transfer only, used when you connect via the USB cable, and another one (marked 5V) for both data transfer and powering the suit, used with the power bank.

Either way, as soon as the suit powers up, the Hub will make a sound and the sensors will blink with a “breathing” light.

Bonus advice: if some of the sensors are not “breathing”, it probably means they are not mounted properly, so remove the sensor, clean it up (blow on it), and mount it again.

If a sensor does not light up at all, one of the cables is probably not connected properly, so unplug and replug the cables of the sensors that aren’t turning on.

Bonus advice #2: Noitom’s advice, once you’re done recording and need to put the mocap suit back in the bag, is to also remove all the sensors and store them in the metal boxes, in order to avoid any kind of magnetic interference.

Since I got the suit I’ve never done that: I always leave the sensors in their sockets and put the suit in the bag with almost all the cables connected, and I’ve had magnetization issues maybe twice. So my advice is to keep the suit far away from magnetic sources, but removing the sensors is a step you can skip: if the mocap results are still fine with the sensors left in place, just keep them mounted and put the suit in the bag.


Recording and Streaming

In order to receive data from the suit, Noitom kindly provides two applications, Axis Neuron and Axis Neuron Pro, which manage suit calibration, recording, tweaking and so on.

Previously the Axis Neuron Pro version was given only to registered members, but now you can download it directly from the website, so get Axis Neuron Pro, since it has more features.

After the installation is done, a pop-up window will appear as soon as you launch it, and if you’re connected via either USB or WiFi, you should see your Hub show up; you can then choose to either Connect or run the Setup.

Bonus advice: regarding WiFi, I’ve read that many users have issues with the WiFi connection, especially because the type of router you’re using is a critical factor, and I also had trouble at first. Then I found the perfect solution: instead of using my home WiFi router, I use my smartphone’s “hotspot” feature, so that it acts as a router, and I connect both the Hub and the PC to this wireless network; so far it has worked flawlessly every time.


What I suggest is to first connect the Hub to the PC with the USB cable and click Setup in the pop-up window; you’ll then be asked to choose a WiFi network to connect to, so pick the proper wireless network and type the password.

Once you’re done, remember to use the Connect button, which pulls the data from the Hub over USB; I also noticed that if you don’t do this during the WiFi setup, the Hub won’t show up in the pop-up window.

As soon as the Hub is connected to Axis, you’ll see a humanoid figure appear on the screen, along with the status of the sensors on the right side of the screen; if some of them are grey, it means they’re not connected properly.

Now disconnect the USB from the PC and use the other port on the Hub to connect it to the power bank; as soon as the suit turns on, you should see the Hub show up in the pop-up window.

Choose Connect and now you’re connected via WiFi.

Within Axis Neuron there are multiple options to choose from, but since we’re now connected via WiFi, we’re going to first choose a body preset based on our height, then start the calibration.

Once you’re ready, click the Calibrate button and follow the instructions you see on the screen.

You’re asked to do four main poses, one after the other, so that the suit can calibrate the sensors accordingly, and once you’re done you can begin recording motion capture.

There is some tweaking you can do that will affect the quality of the mocap while you’re recording, but I won’t cover that in this review.


So, after the setup is completed, you have multiple choices for what to do next.

First of all, you can use Axis to record the motion capture animations, and all you need to do is simply hit the record button.

You’ll hear a sound indicating that the recording has begun, and you’ll be asked to choose a name for the recording itself; after that you can start performing your motions, and hit the record button once again when you’re done.

You’ll notice that your motion capture animation is now listed in the working directory at the bottom left of Axis; from there you can load the animation, do some tweaking directly in Axis, and export it in FBX format, so that it can be used in other software for mocap cleanup or applied to characters directly.

The alternative to recording inside Axis is to stream the animation in realtime to other DCCs or game engines, and Noitom provides different ways to do that.

On their website you can download plugins for MotionBuilder, Unreal Engine and Unity, as well as a data reader that gives you realtime data from the sensors.
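
If you’d rather roll your own reader, here is a minimal Python sketch that listens for the binary BVH frames Axis Neuron broadcasts over UDP (port 7002, matching the Broadcasting settings shown later in the UE4 section). It doesn’t parse the packet layout; it only confirms the motion stream is arriving.

```python
import socket

# Minimal do-it-yourself data reader, assuming Axis Neuron broadcasts
# binary BVH frames over UDP. The port must match the ClientPort
# configured in Axis Neuron's Broadcasting settings.

BVH_PORT = 7002

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", BVH_PORT))  # listen on all interfaces

while True:
    packet, sender = sock.recvfrom(65535)  # one datagram per broadcast frame
    print(len(packet), "bytes of motion data from", sender[0])
```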

These plugins are very “plug & play”, requiring very little effort to get everything working properly.

There is also existing software that has integrated the streaming functionality directly: FaceRig, iClone and BoB Software did exactly that, so on the website you’ll also find dedicated solutions for those.


UE4 Streaming

Starting with UE4, I’ll walk through a very basic tutorial, so that everyone can understand how the plugin works, in this case using UE4.19.2.

Create an empty project, and as soon as the UE4 project opens, just close it.

Using a file browser, navigate into the project’s main folder, and create a folder called “Plugins”.

Download the plugin for UE4.19.2 (or whatever version you’d like to use) from Noitom’s website, unzip it, and copy the folder called “perceptioneuron” into the newly created “Plugins” folder.

Start your UE4 project again and open the Plugins window; scroll to the very bottom and under “Project” > “Animation Data Source” you’ll find the “Perception Neuron” plugin, which should already be enabled.

What we need to do now is download the Noitom avatar from their website, so let’s do that, and import the FBX called “NeuronRobot_SingleMesh” into our project.

We’re also going to use the UE4 Mannequin, just to show how the bind pose of the character matters, and why the character you’re going to use in UE4 must be in T-pose.

In order to import the UE4 Mannequin, choose “Add New” > “Add Feature or Content Pack” > Third Person.

What we need to do right now is create an Animation Blueprint for the NeuronRobot; we do so by right-clicking the NeuronRobot skeletal mesh and choosing Create > Animation Blueprint.

In the AnimGraph, right click and search for a node called NewPoseCalc: this is the node that retargets the joints from the Axis skeleton onto our NeuronRobot character.

So what you need to do is match the corresponding joints with each other, and once they’re all set, compile and close the Animation Blueprint.

Now create a Pawn or Character, and inside the Blueprint add a Skeletal Mesh component, choose the NeuronRobot character and assign the Animation Blueprint we previously created.

In order for the animation data to be sent from Axis to UE4, we need to add another component, so look for PerceptionNeuron and add it; in the Details tab, the only thing we need to change is the Avatar name, so change it from None to Char00.

Once that is done, close the Pawn Blueprint and drop it into the scene, preferably at the world origin.

Now we need to drop the Perception Neuron Manager into the scene, which can be found in the Modes panel.

Here the only thing we need to do is modify the Port from 7001 to 7002.

Now open the Axis Neuron software and go to File > Settings, where we need to take care of the following:

  • Under Output Format, make sure “Displacement” is not selected

  • Under Broadcasting, make sure you’re using UDP and that both UDP IPs are 255.255.255.255; if you only see one IP, add another one

  • In the submenu, under BVH, make sure it is enabled, choose Binary as the Format, and also make sure the ClientPort is 7002

Close the Settings and look at the Sensors Map tab on the right: you’ll see a blue row with the name of the avatar and the body size, so make sure the avatar name is Char00.


Once this setup is done, if you’re already wearing the suit and have completed the calibration, go back to UE4 and hit Play or Simulate: you’ll see your character moving on screen!

If we try to do the same thing with the UE4 Mannequin, whose bind pose is an A-pose, once we create the Animation Blueprint and add the NewPoseCalc node (which, by the way, can be copied from one Blueprint to another), as soon as we hit Play we’ll see that the orientation of the arms is completely off, because of the A- versus T-pose difference between the characters.

What I usually do is create my characters directly in T-pose, but you can also adjust the arm orientation directly inside UE4 to solve this issue, using a couple of Transform (Modify) Bone nodes to add a fixed offset to the arm joints, as sketched below.
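
Conceptually, the fix is just a constant roll offset per upper arm, mirrored between sides. Here is a tiny Python sketch of the idea; the 45° value and the joint names are illustrative guesses, and the real offset depends on your rig.

```python
# Hypothetical sketch of the A-pose fix: add a fixed roll offset to the
# upper-arm joints so an A-pose character lines up with T-pose mocap data.
# Conceptually this is what the Transform (Modify) Bone nodes do.

ARM_OFFSET_DEG = 45.0  # illustrative guess; tune it for your rig

def fix_arm_roll(joint_rolls):
    """Offset the left/right upper-arm roll, mirroring the sign per side."""
    fixed = dict(joint_rolls)
    fixed["upperarm_l"] += ARM_OFFSET_DEG  # rotate the left arm up
    fixed["upperarm_r"] -= ARM_OFFSET_DEG  # rotate the right arm up
    return fixed

print(fix_arm_roll({"upperarm_l": 0.0, "upperarm_r": 0.0, "spine": 0.0}))
```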


Bonus tip: in this case we used two different characters, the Mannequin with the UE4 standard skeletal hierarchy, and the NeuronRobot with its own skeletal hierarchy.

This means that we can drive any character we want in UE4 by simply creating the Animation Blueprint setup explained before, and matching the joints accordingly.

Building on that, something else you can do is record the motion capture animation directly applied to your character inside UE4, rather than getting the mocap from Axis and applying it to your character in a separate DCC.

In order to do that the steps to follow are really simple:

  • Considering that all the steps for realtime animation inside UE4 are already done, go to Window > Sequence Recorder

  • Here choose Add and select the new sequence you just created

  • There are a number of options you can set here:

  • Set the sequence length, which is 60 seconds by default; if you need to record a long take, be sure to raise this value to something like 500-1000

  • Set the Recording Delay, which is 4 seconds by default

  • Scroll down to Actor Recording and pick the actor you want to record, so choose your character here. Mind you, only skeletal meshes can be recorded!

  • Scroll to the bottom, expand the Animation Settings, and change the Sample Rate to 60, which corresponds to the FPS the animation is recorded at


Once you’re done with the settings, press the Record button and a countdown will begin.

You’ll also notice a timer on screen, indicating how much time has passed, so when you’re done with your mocap animation, just press the Stop All button.

By default the animations are saved under Cinematics > Sequence > Animations.

In order to export an animation, simply right click it in the Content Browser and choose Asset Actions > Export, which will save an FBX of your animation.


Motion Builder Streaming

Now switching to MotionBuilder: first download the plugin from Noitom’s website; there are various versions of the plugin available, so select the one matching your MotionBuilder version, then run the installer.

Once the installation process is done, open Axis Neuron and make sure that under File > Settings > Broadcasting you enable Advanced BVH and disable BVH, with the ServerPort set to 7005.


Now open MotionBuilder and go to the bottom right window, under the Asset Browser, to find the Motion Robot plugin, and drag it into the viewport.

In the Navigator, at the bottom left of the screen, you’ll see that a new item has appeared, called Devices; click on it and make sure everything is as follows:

  • Enable Online, and you’ll see the light turn yellow

  • Enable Live

  • Under “Model Binding” click the drop-down menu and select “Create”; if you’re already wearing the suit, you’ll see a skeletal mesh moving on screen accordingly, otherwise you’ll see it standing in T-pose

  • Click Characterize

  • Import your skeletal mesh, either via an import/merge or by dragging it from the Asset Browser into the viewport, then do the characterization for this character or select its preset characterization template

  • Once the characterization is done, go to Character Controls > Source and switch to NoitomRobot, and you’ll see the animation assigned to our imported character!


So now that we have the animation retargeted onto our character, in order to record it we need to do the following:

  • Go back to the Device and click on it

  • Enable Recording

  • Go to the Timeline and hit Record, choose whether or not you want to overwrite the current take, and then hit Play


When you’re done with the mocap recording, press either Record or Stop, and the animation will be saved as a Take.

If you want to check your brand new mocap recording, be sure to disable “Live”, otherwise you won’t see the animation play at all, since the device is still receiving animation data from Axis Neuron.


You can test the Unity plugin as well, but since I’m not too familiar with that game engine, you can search YouTube for a tutorial on how to install and use it.


Motion Capture use in Virtual Reality

One of the most interesting things about the suit is that it can be used in many different ways, and what I came up with is giving users a full-body experience in Virtual Reality, instead of just two floating hands.

Because the suit works with IMU sensors and can be used wirelessly, the user can freely walk around a huge environment; as long as the WiFi signal from the Hub to the PC holds up, you have a huge degree of freedom in what you can do, especially if you need to record a long take for a cinematic, or if you want to create a full-body VR experience in a huge environment.


A simple way to integrate the suit in VR is to use a 3-Degrees-Of-Freedom (3DOF) HMD, meaning that you’ll be able to look all around yourself, but moving your head won’t move you in VR, so locomotion is handled by the controller.

Using UE4, we can reuse the Character setup previously done, and add some minor tweaks.

An easy way is to add a Scene Component with a Camera as its child, then make the Scene Component a child of the skeletal mesh, attached to the head socket.

We also need to determine the orientation of our character relative to the HMD, so we create a “look at” setup, where we interact with a UMG button in order to start a countdown that reorients the character’s body accordingly.

Once you wear the suit and put on the HMD (in my case I used an Oculus Go), if you look down you’ll see the body, which is not properly oriented; look around and you’ll find the interactive button, and once you stare at it for 2 seconds, it takes the character orientation and makes it relative to where you’re currently looking.

After the calibration is done, you’re free to walk around and experience true Full Body VR.
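
For the curious, the core of that recentering logic boils down to capturing a yaw offset and applying it every frame. Here is a minimal Python sketch of the idea; the names and numbers are hypothetical, and the real version lives in Blueprints.

```python
# When the look-at countdown ends, capture the offset between the HMD yaw
# and the body yaw reported by the suit, then apply it from then on.

def wrap_deg(angle):
    """Wrap an angle into the [-180, 180) range."""
    return (angle + 180.0) % 360.0 - 180.0

class BodyRecenter:
    def __init__(self):
        self.yaw_offset = 0.0

    def calibrate(self, hmd_yaw, body_yaw):
        # Called once, after the 2-second look-at countdown completes.
        self.yaw_offset = wrap_deg(hmd_yaw - body_yaw)

    def corrected_body_yaw(self, body_yaw):
        # Applied every frame so the body faces where the user is looking.
        return wrap_deg(body_yaw + self.yaw_offset)

recenter = BodyRecenter()
recenter.calibrate(hmd_yaw=90.0, body_yaw=30.0)  # user stares at the button
print(recenter.corrected_body_yaw(30.0))         # -> 90.0, aligned with HMD
```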


I also created a more advanced version of this setup, using the Oculus DK2 instead of the Oculus Go; rather than being tethered to a PC, I used a VR backpack PC, the MSI VR One, which can handle high quality scenes just like a desktop PC, being equipped with a GTX 1070 and 16GB of RAM. The result was much higher visual quality, a higher framerate and, of course, the ability to walk around in full-body VR, which was an amazing experience.


Something else I developed is an enhancement of the tracking data, in order to completely get rid of the drifting issues that may happen while recording.

What I mean by that is the following:

Because of the way the suit works, if you set up your mocap stage with floor markers as a reference for the “zero point” where your animations should start, then walk around for a bit and come back to that starting point, you’ll notice in Axis that you’re not in the same position where you initially started.

In order to avoid that, I used a Vive Tracker to drive the position of the hips: I take the difference between the Vive Tracker position and the hips position coming from Axis, and use that delta value to drive the rest of the body, resulting in a complete removal of the drifting; a sketch of the idea follows.
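
Here is a minimal Python sketch of that correction; the data layout is hypothetical, but the logic is exactly the delta described above.

```python
# The Vive Tracker provides a drift-free hips position, so the offset
# between it and the hips position reported by Axis is applied to every
# joint, shifting the whole skeleton back into place.

def remove_drift(axis_joints, tracker_hips):
    """Shift the whole Axis skeleton so its hips match the Vive Tracker."""
    hx, hy, hz = axis_joints["Hips"]
    tx, ty, tz = tracker_hips
    dx, dy, dz = tx - hx, ty - hy, tz - hz  # the delta value
    return {name: (x + dx, y + dy, z + dz)
            for name, (x, y, z) in axis_joints.items()}

# Example: the Axis hips drifted 0.3 m on X; the tracker says x = 0.
frame = {"Hips": (0.3, 0.9, 0.0), "Head": (0.3, 1.7, 0.0)}
print(remove_drift(frame, tracker_hips=(0.0, 0.9, 0.0)))
# every joint is shifted back by -0.3 on X, removing the drift
```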

I also developed an integration of the suit with current HMDs: I was able to combine the drift-removal setup with the Vive Pro, also overriding the arm data with the Vive controllers, in order to still have a full-body experience while being 100% precise in what you’re doing in VR.


Final Thoughts

Overall, since its release in September 2015, the Perception Neuron mocap suit has made motion capture affordable for everyone, allowing users to employ it in many different and original ways, especially considering that this kind of technology has always been extremely expensive.

Sure, there are better alternatives, and in recent years other companies have tried to create better suits with more features, but as of today this “all in one” solution is worth the price, and if you’re committed to getting the best out of it, you won’t be disappointed.


Side note: Noitom has also developed an improved version of the Perception Neuron suit, called Perception Neuron Pro, which removes the connection cables between the different parts of the suit, making the sensors wireless, and greatly improves the suit straps for a better grip on the body, especially around the torso.

Noticeable improvements have also been made against drifting by reducing magnetic interference, and in general the data is cleaner and the setup less time consuming.

Unfortunately, as of now the Pro version comes without any finger tracking, although Noitom is planning to add it soon; in the meantime you can add finger tracking by using the Noitom Hi5 VR Gloves and integrating them into the mocap setup.



Pros

  • You get both hardware and software in a single package

  • Very affordable, even compared to competitors

  • Plugins for realtime use in different game engines and DCCs

  • No tracking cameras: the user can freely walk around without any limitations


Cons

  • Lack of precision for fine movements

  • Initial setup can be very time consuming, and technical issues can be frustrating

  • Recorded animations often require extensive cleanup



Pricing

Perception Neuron V2 = €1,530 / $1,699 (plus customs)

Perception Neuron Pro = €4,050 / $4,499 (plus customs)

Noitom Hi5 VR Gloves = €999 / $999 (optional, to add on top of the Perception Neuron Pro)

iPhone X or later = €1,200 / $1,300 (optional)

Total = between €1,530 / $1,699 and €6,250 / $6,960