Latest Updates



05-08-2020
Rigel Demo Release

We're releasing a demo for Rigel, our "All in One" Full Body Motion Capture Solution for body, fingers and face, so that potential customers can test the solution and evaluate it for themselves.

As of now Rigel supports:

- 4 to 8 Vive Trackers
- Noitom Hi5 and Valve Knuckles for Finger Tracking
- FaceARSample, FaceLink and Face Cap for Facial Tracking ( the recently released Live Link Face is also supported, but is not included since this demo is built using UE4.22 )


Here you can download the executable demo:



Since the introduction of the SteamVR Input plugin, there have been some changes to the Vive Tracker setup, so we made a video explaining what needs to be done to make everything work during Rigel's calibration.



User feedback is very important at this stage, so once you've tested Rigel, we ask you to spend a couple of minutes on this survey.



Note: Since we recently improved hand accuracy by introducing a 3-pose calibration setup, the elbow/hand clipping and Peak Removal features are not included in this demo, as they're currently being updated.


Rigel's release as an Unreal Engine 4 plugin is set for the end of August; pricing will be as follows:
550€ ( one-time payment ), with a 50€/year fee for support/maintenance.


We encourage everyone to send videos of their demo experience, especially if you encounter technical issues or would like to give additional advice regarding the Rigel setup.

We've been asked to provide some Motion Capture Animation samples, so we uploaded some of them here:

16-07-2020
Perception Neuron to Maya Scripts - Free Download

Four years ago, after I bought my Perception Neuron mocap suit, I wanted to speed up animation retargeting in Maya. Instead of using HumanIK, I created a script that retargets from the Perception Neuron skeletal hierarchy to the Unreal Engine 4 standard skeletal hierarchy ( i.e. the Mannequin and Paragon characters ) in just one click.

There are two scripts: one does a 1:1 retargeting, while the other ignores the XY movement, so that the character is "locked in place".
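
To give an idea of the general approach, here's a simplified Python ( maya.cmds ) sketch; the joint map is partial and illustrative, while the actual scripts cover the full hierarchy, fingers included:

```python
import maya.cmds as cmds

# Perception Neuron joint -> UE4 Mannequin joint ( partial, illustrative map )
JOINT_MAP = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "Spine1": "spine_02",
    "Spine2": "spine_03",
    "Neck": "neck_01",
    "Head": "head",
    "RightShoulder": "clavicle_r",
    "RightArm": "upperarm_r",
    "RightForeArm": "lowerarm_r",
    "RightHand": "hand_r",
}

def retarget(lock_in_place=False):
    """Constrain the Mannequin joints to the Perception Neuron skeleton."""
    for src, dst in JOINT_MAP.items():
        if cmds.objExists(src) and cmds.objExists(dst):
            # Rotations are transferred on every mapped joint...
            cmds.orientConstraint(src, dst, maintainOffset=True)
    if lock_in_place:
        # ...while in the "locked in place" variant the pelvis keeps only
        # its height: ground-plane travel ( X/Z in Maya's Y-up world,
        # i.e. UE4's XY plane ) is skipped.
        cmds.pointConstraint("Hips", "pelvis", maintainOffset=True, skip=["x", "z"])
    else:
        cmds.pointConstraint("Hips", "pelvis", maintainOffset=True)
```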


29-06-2020

Rigel - Feature Highlight - Realtime Motion Capture Data Smoothing


In this video you can see the realtime motion capture data smoothing, which lets the user choose the degree of smoothing while recording motion capture. This feature is intended to smooth out data while recording fast movements ( like fighting and quick gestures ), so that the animation curves require less cleanup after the mocap has been recorded.
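
To give an idea of how this kind of filter behaves, here's a minimal Python sketch of an exponential moving average, one common realtime smoothing approach ( Rigel's actual filter may differ ); the smoothing parameter plays the role of the user-selectable degree of smoothing:

```python
class EmaSmoother:
    """Smooths a stream of per-frame values ( e.g. a bone position )."""

    def __init__(self, smoothing=0.5):
        # smoothing = 0.0 -> raw data; closer to 1.0 -> heavier filtering
        self.smoothing = smoothing
        self._state = None

    def filter(self, sample):
        if self._state is None:
            self._state = list(sample)
        else:
            a = self.smoothing
            self._state = [a * s + (1.0 - a) * x
                           for s, x in zip(self._state, sample)]
        return self._state

smoother = EmaSmoother(smoothing=0.7)
for frame in ([0.0, 0.0, 0.0], [10.0, 2.0, -1.0], [9.5, 2.2, -0.8]):
    print(smoother.filter(frame))  # fast spikes are damped frame by frame
```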

For more information about our Full Body Motion Capture Solution, visit the link below.

Rigel Page


01-06-2020

Rigel - Feature Highlight - One Click Calibration and Adaptable IK Realtime Retargeting


In this video we're showing how fast and easy the calibration process is, and how Rigel is able to retarget the animation data from the Vive Trackers in realtime to characters with different body sizes.
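
For the curious, a common way to adapt mocap to different body sizes ( not necessarily Rigel's exact method ) is to copy rotations 1:1 and scale the root translation by the ratio of the two characters' hip heights, so that feet still plant correctly. A minimal Python sketch:

```python
def retarget_root(src_hip_pos, src_hip_height, dst_hip_height):
    """Scale the performer's hip translation to the target character."""
    scale = dst_hip_height / src_hip_height
    return [c * scale for c in src_hip_pos]

# Example: a 1.00 m hip-height performer driving a 0.80 m hip-height character.
print(retarget_root([0.35, 0.98, -0.12], 1.00, 0.80))
```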


For more information about our Full Body Motion Capture Solution, visit the link below.

Rigel Page

10-04-2020

Introducing Project Rigel - All in One Full Body Motion Capture Solution for Body, Fingers and Face


Project Rigel has been developed to fill the gap in the market left by IKinema Orion, since as of now there are no other valid optical Full Body Capture solutions using Vive Trackers on the market.

In order to make life easier for developers, on top of the body tracking I integrated both finger and facial tracking, using external hardware such as the Noitom Hi5 VR Gloves, Face Cap, FaceAR, Faceware, Dynamixyz, and so on.
As of now Project Rigel is still in development, but I'm planning to release it very soon.

Project Rigel Tech Demo

Project Rigel Page

18-11-2019

Realtime Facial Animation from Face Cap to Unreal Engine 4 using OSC Plugin


I recently found out about the OSC Plugin for Unreal Engine 4, and since I wanted to experiment with something different, I used the Face Cap iOS app to stream realtime facial animation data into UE4.
Here is the link to the blog article:

Link
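
As a side note, the OSC stream itself is easy to inspect outside UE4 as well. Below is a minimal Python sketch using the python-osc package; the "/blendshape" address pattern is illustrative ( check the Face Cap documentation for the actual message layout ), and in my setup the receiving end is UE4's OSC Plugin rather than Python:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_blendshape(address, *args):
    # e.g. args could be ( shape index, weight in the 0..1 range )
    print(address, args)

dispatcher = Dispatcher()
dispatcher.map("/blendshape", on_blendshape)  # illustrative address pattern

# Listen on the port Face Cap is configured to stream to.
server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()
```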


05-11-2019

Unreal Engine 4 - Shared Character Rigging Workflow in Maya - Body and Facial Rig


Character rigging, in both videogames and CGI, has always been a critical part of any project, and is often a tedious and time-consuming process.
Since the number of characters to rig can vary depending on the project, the process can be automated through some clever use of Maya's tools, creating a workflow very similar to what Mixamo and Polywink currently offer, but targeted at Unreal Engine 4.


- The skeletal hierarchy of a character can be constrained to the shape of the character, so that the joints move and rotate in place to match a new character.
- A Blend Shape facial rig can be shared across characters with different topology by adapting a Blend Shape head to match a new character's head.
- A joint-based facial rig can be shared across characters using the same technique as the body shape matching.

All the tools have been developed in Maya using MEL script; all the characters share the same Unreal Engine 4 skeletal hierarchy, and the entire process is also integrated with the ART Toolkit plugin for Maya developed by Epic Games.
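
As an example of the kind of automation involved, here's a simplified Python sketch of the classic wrap-deformer transfer used to share Blend Shapes across different topologies ( the actual tools are written in MEL and handle more edge cases; node names here are illustrative ):

```python
import maya.cmds as cmds
import maya.mel as mel

def transfer_blendshapes(base_head, base_blendshape_node, new_head):
    """Bake the base head's blend shape targets onto a new head mesh."""
    targets = cmds.listAttr(base_blendshape_node + ".weight", multi=True)
    # Wrap the new head to the base head ( deformed object first, then influence ).
    cmds.select(new_head, base_head)
    mel.eval("CreateWrap;")
    new_targets = []
    for target in targets:
        cmds.setAttr("{}.{}".format(base_blendshape_node, target), 1.0)
        # The wrap drags the new head along; duplicating bakes the deformed pose.
        new_targets.append(cmds.duplicate(new_head, name=target)[0])
        cmds.setAttr("{}.{}".format(base_blendshape_node, target), 0.0)
    cmds.delete(new_head, constructionHistory=True)  # remove the wrap when done
    return new_targets  # feed these to a new blendShape node on the new head
```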

I'm offering this as a service with very affordable pricing, so if you're interested feel free to contact me using the "Contacts" section on the website. 



09-09-2019

Demos available for download


Since we had lots of requests about what we've been developing for Full Body VR experiences and the use of the Hi5 VR Gloves, we decided to add a "download" section to our website, where you can try them for yourself.
Please follow the instructions about the requirements; within the project itself you'll find all the settings needed to run the demos.


08-07-2019

Reusable Skinning Technique for Characters


Character skinning is usually a very tedious and repetitive process, and since I often find myself doing character animation tests with different models, I created a setup in Maya where I can reuse the skinning information from a Base Mesh and transfer it onto a brand-new character.

Using a MEL script, I morph the Base Mesh and also move the joints accordingly, so that the entire skeletal hierarchy is already set in place.

The morph mesh inherits the skinning information via UVs, while the brand-new character gets the skinning information using Closest Point on Surface, as sketched below.
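
Both transfers map directly onto Maya's built-in copySkinWeights command; here's a minimal Python sketch ( skin cluster and UV set names are illustrative ):

```python
import maya.cmds as cmds

# UV-based transfer: Base Mesh -> morph mesh ( shared UV layout )
cmds.copySkinWeights(
    sourceSkin="baseMesh_skinCluster",
    destinationSkin="morphMesh_skinCluster",
    noMirror=True,
    surfaceAssociation="closestPoint",
    uvSpace=("map1", "map1"),
    influenceAssociation="oneToOne",
)

# Closest-point transfer: morph mesh -> new character ( different topology )
cmds.copySkinWeights(
    sourceSkin="morphMesh_skinCluster",
    destinationSkin="newCharacter_skinCluster",
    noMirror=True,
    surfaceAssociation="closestPoint",
    influenceAssociation="closestJoint",
)
```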


14/06/2019

School in Motion Festival - Rome, 15-05-2019

On the 15th of May I was invited as a guest speaker to the School in Motion Festival, focused on the use of animation in school education, and I gave a live demonstration of the Full Body Motion Capture setup I use.
The main purpose was to introduce the younger generation of students to the potential of this new technology and how it can be used in schools.
It was a really nice experience, and people were really surprised by how affordable consumer-level hardware allows you to enter the digital animation world and be creative and imaginative without boundaries.
You can see me @0:52 ( wearing a prototype of the helmet for the iPhoneX and the Hi5 VR Gloves ) and @1:48 wearing the Vive Pro and the Hi5 VR Gloves.


27/05/2019

Full Body Motion Capture: Valve Knuckles integration with IKinema Orion

Valve was kind enough to send me two pairs of Knuckles, so the first thing I did was integrate them into the Full Body Motion Capture setup I created a while ago ( using the Noitom Hi5 VR Gloves ), in order to showcase how the Knuckles can be used as an alternative, beyond their role as controllers.

Finger tracking ( using capacitive touch sensor technology ) isn't as accurate as having an IMU sensor on each finger, but I don't mind that much, since you can easily get accurate hand shapes, then export the results to Maya/MotionBuilder and tweak them accordingly.

Because of the sensor placement, your hand reads as completely closed when your fingers rest on each sensor, meaning that when you're doing the "grabbing" gesture, your index finger and thumb aren't really bent all the way.

When you're doing mocap this isn't really an issue, but if you're planning to use hand gestures and the buttons on the Knuckles for different purposes, you'll probably end up with a confusing gameplay setup, since it's very natural to press the buttons below the index finger and thumb when you're trying to grab an object in VR, so be aware of that.



19/03/2019

Facial Mocap Pipeline - Unreal Engine 4 to Maya

I recently added Facial Motion Capture on top of the Full Body Mocap pipeline, using the iPhoneX and the FaceAR Sample provided by Epic Games on their Marketplace.
The facial rig is built using Blend Shapes, and as of now there's no way to export those from UE4 to any DCC, since Blend Shape export is not supported.
To solve this issue, I created a proxy rig within the skeletal hierarchy of the character, adding 52 joints, each one named after a blend shape.
In UE4 I take each blend shape value ( 0 to 1 ) and use it to drive the corresponding proxy rig joint on the Z axis.
When the skeletal hierarchy is recorded using Sequence Recorder and exported to Maya, I then use SDKs ( Set Driven Keys ) to link the proxy rig joint values back to each blend shape, driving them in realtime.
The result is a Full Body Motion Capture performance recorded inside UE4 and exported to Maya, including the facial animation as well.
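
Here's a simplified Python ( maya.cmds ) sketch of the Maya side of this setup; node and joint names are illustrative:

```python
import maya.cmds as cmds

BLEND_SHAPE_NODE = "face_blendShapes"  # blendShape node on the head mesh
# All 52 ARKit shape names in the real setup; three shown here:
SHAPES = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight"]

for shape in SHAPES:
    driver = "proxy_{}.translateZ".format(shape)      # proxy joint named after the shape
    driven = "{}.{}".format(BLEND_SHAPE_NODE, shape)  # matching blend shape weight
    # Two keys are enough for a linear 0..1 -> 0..1 mapping.
    cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=0.0, value=0.0)
    cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=1.0, value=1.0)
```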






13/02/2019

Rome Meetup Video

In November I gave a talk at the second Rome Meetup, where I shared my journey through Full Body in VR, detailing all the steps and experiments along the way.

Enjoy the video!


29/10/2018

Perception Neuron Vive Tracker Integration - Update 1

This is an update for the Perception Neuron Vive Tracker Integration I developed a while ago.
Since the previously used IK setup was causing some issues with the orientation ( given by SteamVR during the Room Scale setup ), I decided to just use the FK data from the suit itself, with the Pelvis as the only joint driven by the Vive Tracker.
The setup was created using Unreal Engine 4.19.2.

The video also shows a UMG menu, whose buttons let you tweak/reorient the Pelvis forward vector, as well as rotate the character by +/-90°.
Additionally, I added the possibility to use the Vive Controllers to drive the arms using IK, although the UpVector is currently not set due to some orientation issues; this will be solved shortly.
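
Conceptually, the hybrid setup boils down to something like this ( a plain Python sketch, not the actual Blueprint implementation; names and data layout are illustrative ):

```python
def resolve_pelvis(tracker_pos, suit_rotations, yaw_offset_deg=0.0):
    """tracker_pos: ( x, y, z ) world position from the Vive Tracker.
    suit_rotations: { joint_name: ( pitch, yaw, roll ) } FK data from the suit.
    yaw_offset_deg: the +/-90 deg reorientation from the UMG menu."""
    pitch, yaw, roll = suit_rotations["Pelvis"]
    return {
        "position": tracker_pos,                          # tracker wins for position
        "rotation": (pitch, yaw + yaw_offset_deg, roll),  # suit FK yaw, reoriented
    }

# Example: suit pelvis facing 10 deg, room-scale calibration off by 90 deg.
print(resolve_pelvis((120.0, 35.0, 92.0), {"Pelvis": (0.0, 10.0, 0.0)}, 90.0))
```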

The last part of the video also shows the VR setup while wearing the suit; as of now it's very raw and needs improvement, but it's very cool to see the full body in VR!

Here is the video showing the setup in action:


I also included a playable demo, so if you want to test it you'll need:
- Perception Neuron Motion Capture Suit ( V1 or V2 )
- Vive / Vive Pro ( with controllers )
- One Vive Tracker

Note: If you try to use the HMD while wearing the suit, the body will probably have the wrong orientation, and as of now the UMG menu can't be used while in VR, but it'll be available soon.


Perception Neuron Vive Tracker Integration Demo



10/08/2018

Full Body VR Interactivity

A blend of the Noitom Hi5 VR Gloves and the IKinema Orion full body tracking setup.

The purpose is to give the user the ability to fully control a body in VR and interact with the environment, with the possibility to change each hand's role, either to move or to interact.




17/06/2018

Full Body Motion Capture for Unity using IKinema Orion + Noitom Hi5 VR Gloves

It turned out that recreating in Unity the same Full Body Motion Capture setup I'm currently using with Unreal Engine 4 is slightly more complicated, so I ended up using the mocap recorded inside UE4 and relying on MotionBuilder for the animation retargeting to Unity characters, and it works quite nicely!




13/03/2018

WIP - Car Configurator for Audi R8 running on iPad Pro

I'm doing some tests to see how well the iPad Pro manages a huge number of polygons and draw calls.

The entire scene is not optimized and there is a lot of work to do, but overall I'm quite impressed by the results!

Considering that mobile development is all about optimization, the rendering capabilities are not that far from desktop ( except for deferred rendering ), and shader quality and technical setups can be ported 1:1 from the UE4 scene to the iPad.


iPad Pro R8 Demo


05/02/2018

WIP - VR Car Configurator for Audi R8


Inspired by the McLaren Car Configurator developed by Epic Games using the Unreal Engine 4, I decided to develop something similar but in Virtual Reality.

I was able to push 11 million triangles ( 3 Audi R8s seen at once ) with 3500 draw calls, which was quite surprising, since performance usually starts to decrease rapidly at around 2 million triangles.
This is running in the Editor, but the packaged version ( soon available for download ) runs at a steady 90FPS.

UMG causes big frame drops, so I'll probably use an alternative which doesn't cause any issues.

Since this is a WIP, some things still need to be taken care of, such as:
- Shader setup for the different car paint colors
- The cross section needs more work
- Lighting needs some improvement ( an exterior setup is also planned )
- The Noitom Hi5 VR Gloves need to be integrated

Tested on an MSI VR One: i7-7820 @2.9GHz, 16 GB of RAM, GTX 1070 8 GB





19/01/2018

Manual Assembly Workstation and Virtual Welding Station using Noitom Hi5 VR Gloves

R&D done to show the use of VR within working environments, using the Noitom Hi5 VR Gloves to give the user advanced interaction in VR.



18/01/2018

Full Body Motion Capture using IKinema Orion + Noitom Hi5 VR Gloves


First test, quite satisfied with the results!

8 Vive Trackers used; animation data streamed and recorded inside Unreal Engine 4 @60FPS




06/11/2017

Noitom VR Gloves Prototype with Vive Trackers


Although we had already developed a custom solution for VR gloves ( using the Perception Neuron mocap suit ), Noitom was kind enough to send us a pair of Hi5 VR Gloves.

First impressions are very positive:

- Comfortable to wear; the Vive Tracker sits roughly at 45° on the outside, so it's "out of the way" when your hands are close together

- Pairing is done using a dongle, which can be plugged into the Vive's spare USB port on the HMD itself, and the pairing itself is very quick

- Calibration is done with two poses and takes roughly 15 seconds

- The IMU sensors on the fingers and wrist give no noticeable lag

- Unity and Unreal Engine 4 plugins already developed by Noitom, with free scene available for download

- Accuracy is very good; you may notice some drift when you bring your hands together, but that is due to hand size: it's not drift from the IMU sensors, it depends on how big your hands are in real life versus the rigged VR hands


We're currently integrating them into our VR projects, especially into the Manual Assembly Workstation, which currently uses the Vive Controllers; it would be really neat to use fingers to pick up objects!



06/08/2017

UE4 CAD VR Demo

R&D for a continuous rotary micropump assembly machine and a conveyor where a camera separates the products based on their color.

