Full Body Virtual Reality Solutions for:

- Job Training

- Engineering

- Automotive

- Assembly Machine Simulations

- Full Body "Interactive Movie Experience"

- Interactive Architectural Visualization

- Shared VR environment for multi-user reviews


Latest Updates


Introducing Project Rigel - All-in-One Full Body Motion Capture Solution for Body, Fingers and Face

Project Rigel has been developed to fill the gap in the market left by IKinema Orion: as of now there are no viable optical full body capture solutions on the market that use Vive Trackers.

To make life easier for developers, on top of the body tracking I integrated both finger and facial tracking, using external hardware such as the Noitom Hi5 VR Gloves, Face Cap, FaceAR, Faceware, Dynamixyz, and so on.
As of now Project Rigel is still in development, but I'm planning to release it very soon.

Project Rigel Tech Demo

Project Rigel Page


Realtime Facial Animation from Face Cap to Unreal Engine 4 using OSC Plugin

I recently found out about the OSC Plugin for Unreal Engine 4, and since I wanted to experiment with something different, I used the Face Cap iOS App in order to stream realtime facial animation data inside UE4.
Here is the link to the blog article:
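For context, Face Cap streams its data as plain OSC messages over UDP, which is what the OSC Plugin receives in UE4. Here is a minimal sketch of the wire format (the `/jawOpen` address used in the test is just an illustrative placeholder, not Face Cap's actual address layout): an OSC message is a null-terminated address string padded to 4 bytes, a type tag string, and big-endian 32-bit arguments.

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    next_offset = offset + ((end - offset) // 4 + 1) * 4
    return s, next_offset

def decode_osc_message(data):
    """Decode a basic OSC message: address, type tags, float/int args."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            (value,) = struct.unpack_from(">f", data, offset)
        elif tag == "i":
            (value,) = struct.unpack_from(">i", data, offset)
        else:
            raise ValueError("unsupported type tag: " + tag)
        args.append(value)
        offset += 4
    return address, args

def encode_osc_message(address, floats):
    """Build an OSC message carrying float32 arguments."""
    def pad(b):
        # at least one null terminator, total length a multiple of 4
        return b + b"\x00" * (4 - len(b) % 4)
    out = pad(address.encode("ascii"))
    out += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        out += struct.pack(">f", f)
    return out
```

A round trip through these two functions is enough to sanity-check a receiver before pointing Face Cap at it.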



Unreal Engine 4 - Shared Character Rigging Workflow in Maya - Body and Facial Rig

Character rigging, in both videogames and CGI, has always been a critical part of any project, and is often a tedious and time-consuming process.
Since the number of characters to rig can vary depending on the project, this process can be automated with some clever use of Maya's tools, creating a workflow very similar to what Mixamo and Polywink are currently offering, but targeted at Unreal Engine 4.

The skeletal hierarchy of a character can be constrained to the shape of the character, in order for the joints to move and rotate in place, matching a new character.
A Blend Shape facial rig can be shared across characters with different topology adapting a Blend Shape Head to match a new character head.
A Joint Based Facial Rig can be shared across characters using the same technique used for the Body Shape matching.
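The joint-matching idea above can be roughly illustrated like this (a plain Python sketch, not the actual MEL tool): each joint is bound to its nearest vertex on the base mesh, and on a new character with the same topology the joint is recomputed from the corresponding vertex, so the whole hierarchy moves into place.

```python
def dist2(a, b):
    """Squared distance between two (x, y, z) points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def bind_joints_to_mesh(joint_positions, base_vertices):
    """For each joint, record the index of its nearest base-mesh vertex
    and the offset from that vertex to the joint."""
    bindings = {}
    for name, jp in joint_positions.items():
        idx = min(range(len(base_vertices)),
                  key=lambda i: dist2(base_vertices[i], jp))
        v = base_vertices[idx]
        bindings[name] = (idx, tuple(j - c for j, c in zip(jp, v)))
    return bindings

def retarget_joints(bindings, new_vertices):
    """Recompute joint positions on a new character sharing the topology."""
    return {name: tuple(c + o for c, o in zip(new_vertices[idx], offset))
            for name, (idx, offset) in bindings.items()}
```

In production you would constrain to the surface rather than a single vertex, but the binding/retargeting split is the same.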

All the tools have been developed in Maya using MEL script. All the characters share the same Unreal Engine 4 skeletal hierarchy, and the entire process is also integrated with the ART Toolkit plugin for Maya developed by Epic Games.

I'm offering this as a service with very affordable pricing, so if you're interested feel free to contact me using the "Contacts" section on the website. 


Demos available for download

Since we had lots of requests about what we've been developing for Full Body VR experiences and the use of the Hi5 VR Gloves, we decided to add a "download" section to our website, where you can experience them by yourself.
Please follow the instructions about the requirements, and within the project itself you'll find all the necessary settings in order to run the demos.


Reusable Skinning Technique for Characters

Character skinning is usually a very tedious and repetitive process, and since I often find myself doing character animation tests with different models, I created a setup in Maya where I can reuse the skinning information from a Base Mesh and transfer it onto a brand new character.

Using a MEL script, I morph the Base Mesh and also move the joints accordingly, so that the entire skeletal hierarchy is already set in place.

The morphed mesh inherits the skinning information via UVs, while the brand new character gets its skinning information using Closest Point on Surface.
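The Closest Point on Surface transfer can be sketched as follows (a simplified nearest-vertex version in plain Python; the actual tool relies on Maya's Closest Point on Surface node, which samples the surface rather than individual vertices):

```python
def transfer_skin_weights(src_vertices, src_weights, dst_vertices):
    """Copy per-vertex skin weights (joint -> weight maps) from a source
    mesh to a target mesh by nearest source vertex."""
    dst_weights = []
    for p in dst_vertices:
        idx = min(range(len(src_vertices)),
                  key=lambda i: sum((a - b) ** 2
                                    for a, b in zip(src_vertices[i], p)))
        dst_weights.append(dict(src_weights[idx]))  # copy the weight map
    return dst_weights
```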


School in Motion Festival - Rome, 15-05-2019

On the 15th of May I was invited as a guest speaker to the School in Motion Festival, focused on the use of animation in school education, and I did a live demonstration of the Full Body Motion Capture setup I use.
The main purpose was to introduce young generations of students to the potential of this new technology and how it can be used in schools.
It was a really nice experience, and people were really surprised by how affordable consumer-level hardware lets you enter the digital animation world and be creative and imaginative without boundaries.
You can see me at 0:52 ( wearing a prototype of the helmet for the iPhone X and the Hi5 VR Gloves ) and at 1:48 wearing the Vive Pro and the Hi5 VR Gloves.


Full Body Motion Capture: Valve Knuckles integration with IKinema Orion

Valve was kind enough to send me two pairs of Knuckles, so the first thing I did was integrate them into the Full Body Motion Capture setup I created a while ago ( using the Noitom Hi5 VR Gloves ), to showcase how the Knuckles can be used beyond their role as controllers.

Finger tracking ( using capacitive touch technology ) isn't as accurate as having an IMU sensor on each finger, but I don't mind that much, since you can easily get accurate hand shapes, then export the results to Maya/MotionBuilder and tweak them accordingly.

Because of the sensor placement, your hand reads as completely closed when the fingers rest on each sensor, meaning that when you're doing the "grabbing" gesture, your index and thumb aren't really bent all the way.

When you're doing mocap this isn't really an issue, but if you're planning to use hand gestures and the buttons on the Knuckles for different purposes, you'll probably end up with a confusing gameplay setup, since it's very natural to press the buttons below the index and thumb when you're trying to grab an object in VR, so be aware of that.


Facial Mocap Pipeline - Unreal Engine 4 to Maya

I recently added Facial Motion Capture on top of the Full Body Mocap pipeline, using the iPhoneX and the FaceAR Sample provided by Epic Games on their Marketplace.
The facial rig is built using Blend Shapes, and as of now there's no way to export those from UE4 to any DCC, since Blend Shape export is not supported.
To solve this issue, I created a proxy rig within the character's skeletal hierarchy, adding 52 joints, each one named after a blend shape.
In UE4 I read each blend shape value ( 0 to 1 ) and use it to drive the corresponding proxy rig joint on the Z axis.
When the skeletal hierarchy is recorded using Sequence Recorder and exported to Maya, I use Set Driven Keys to link the proxy rig joint values back to each blend shape, in order to drive them in realtime.
The result is a Full Body Motion Capture performance recorded inside UE4 and exported to Maya, including the facial animation as well.
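The proxy-rig trick can be sketched as a simple encode/decode round trip (plain Python rather than the actual Blueprint/Maya setup; the ARKit-style shape names and `proxy_` prefix are illustrative, and the Maya side actually does the decoding with Set Driven Keys):

```python
# Illustrative subset of the 52 blend shape names
BLEND_SHAPES = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight"]

def weights_to_proxy_joints(weights):
    """Bake each blend shape weight (0..1) into the Z translation of a
    proxy joint named after the shape, so it survives an FBX export."""
    return {"proxy_" + name: (0.0, 0.0, weights.get(name, 0.0))
            for name in BLEND_SHAPES}

def proxy_joints_to_weights(joints):
    """Recover the blend shape weights from the recorded joint Z values."""
    return {name: joints["proxy_" + name][2] for name in BLEND_SHAPES}
```

Because joint translations are plain animated transforms, the standard FBX pipeline carries the facial performance for free.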


Rome Meetup Video

In November I gave a talk during the second Rome Meetup, where I shared my journey through Full Body in VR, detailing all the steps and experiments along the way.

Enjoy the video!


Perception Neuron Vive Tracker Integration - Update 1

This is an update for the Perception Neuron Vive Tracker Integration I developed a while ago.
Since the previously used IK setup was causing some issues with the orientation given by SteamVR during the room-scale setup, I decided to just use FK data from the suit itself, with the Pelvis as the only joint driven by the Vive Tracker.
The setup was created using the Unreal Engine 4.19.2.

The video also shows a UMG menu, whose buttons allow you to tweak/reorient the Pelvis forward vector and to rotate the character by +/-90°.
Additionally, I added the possibility to use the Vive Controllers to drive the arms using IK, although the UpVector is currently not set due to some orientation issues; this will be fixed shortly.
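The +/-90° reorientation boils down to a yaw rotation of the pelvis forward vector. A minimal sketch (UE4 is Z-up, so yaw is a rotation around the Z axis; axis conventions here are illustrative):

```python
import math

def rotate_yaw(vector, degrees):
    """Rotate a (x, y, z) vector around the vertical Z axis by `degrees`."""
    x, y, z = vector
    r = math.radians(degrees)
    return (x * math.cos(r) - y * math.sin(r),
            x * math.sin(r) + y * math.cos(r),
            z)
```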

The last part of the video also shows the VR setup while wearing the suit; as of now the setup is very raw and needs improvement, but it's very cool to see the full body in VR!

Here is the video showing the setup in action:

I also included a playable demo, so if you want to test it you need:
- Perception Neuron Motion Capture Suit ( V1 or V2 )
- Vive / Vive Pro ( with controllers )
- One Vive Tracker

Note: If you try to use the HMD while wearing the suit, the body will probably have the wrong orientation, and as of now the UMG menu can't be used while in VR, but it'll be available soon.

Perception Neuron Vive Tracker Integration Demo


Full Body VR Interactivity

A blend of the Noitom Hi5 VR Gloves and the IKinema Orion full body tracking setup.

The purpose is to give the user the ability to fully control a body in VR and interact with the environment, with the possibility to change each hand's role between moving and interacting.


Full Body Motion Capture for Unity using IKinema Orion + Noitom Hi5 VR Gloves

It turned out that recreating in Unity the same Full Body Motion Capture setup I'm currently using with Unreal Engine 4 is slightly more complicated, so I ended up using the mocap recorded inside UE4 and relying on MotionBuilder to retarget the animation onto Unity characters, and it works quite nicely!


WIP - Car Configurator for Audi R8 running on iPad Pro

I'm doing some tests in order to see how well the iPad Pro is capable of managing a huge amount of polygons and Draw Calls.

The entire scene is not optimized and there is a lot of work to do, but overall I'm quite impressed by the results!

Considering that mobile development is all about optimization, the rendering capabilities are not that far from desktop ( except for deferred rendering ), and shader quality and technical setups can be ported 1:1 from the UE4 scene to the iPad.

iPad Pro R8 Demo


WIP - VR Car Configurator for Audi R8

Inspired by the McLaren Car Configurator developed by Epic Games using the Unreal Engine 4, I decided to develop something similar but in Virtual Reality.

I was able to push 11 million triangles ( 3 Audi R8s seen at once ) with 3500 draw calls, which was quite surprising, since usually performance decreases rapidly at around 2 million triangles.
This is running in Editor, but the packaged version ( soon available for download ) runs at a steady 90FPS.

UMG causes big frame drops, so I'll probably use an alternative which doesn't cause any issues.

Since this is a WIP some things need to be taken care of, such as:
- Shaders setup for the different car paint colors
- Cross sections need to be worked on
- Lighting needs some improvement ( also exterior setup is planned )
- Noitom Hi5 VR Gloves need to be integrated

Tested on MSI VR One, i7-7820 @2.9GHz, 16 GB of RAM, GTX 1070 8 GB


Manual Assembly Workstation and Virtual Welding Station using Noitom Hi5 VR Gloves

RnD done to show the use of VR within working environments, using the Noitom Hi5 VR Gloves to give the user advanced interaction in VR.


Full Body Motion Capture using IKinema Orion + Noitom Hi5 VR Gloves

First test, quite satisfied with the results!

8 Vive Trackers used; animation data streamed and recorded inside Unreal Engine 4 at 60FPS.


Noitom VR Gloves Prototype with Vive Trackers

Although we had already developed a custom solution for VR gloves ( using the Perception Neuron mocap suit ), Noitom was kind enough to send us a pair of Hi5 VR Gloves.

First impressions are very positive:

- Comfortable to wear, the Vive Tracker is roughly at 45° on the outside, so it's "out of the way" if your hands are close together

- Pairing is done using a dongle, which can be plugged into the Vive spare USB port on the HMD itself, and the pairing itself is very quick

- Calibration is done with two poses and takes roughly 15 seconds

- IMU sensors on fingers and wrist give no noticeable lag

- Unity and Unreal Engine 4 plugins are already developed by Noitom, with a free scene available for download

- Accuracy is very good, but you may notice some drifting when you move your hands together; that is due to hand size, so it's not drift from the IMU sensors but depends on how big your hands are in real life vs. the VR rigged hands

We're currently integrating them into our VR projects, especially into the Manual Assembly Workstation which currently uses the Vive Controllers, so it would be really neat to use fingers to pick up objects!



RnD for a continuous rotary micropump assembly machine, and a conveyor where a camera separates the products based on their color.
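As a toy sketch of the color-separation logic (illustrative only, not the actual vision setup of the simulation): each product can be assigned to the bin whose reference color is nearest in RGB space.

```python
def classify_color(rgb, references):
    """Return the name of the reference color nearest to `rgb`
    by squared distance in RGB space."""
    return min(references,
               key=lambda name: sum((a - b) ** 2
                                    for a, b in zip(references[name], rgb)))
```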