IKinema Orion Review

IKinema Orion is a compelling answer to super expensive optical motion tracking technology, since it gives studios and developers access to optical motion capture without spending a huge amount of money.

Orion is a standalone software solution that relies on external hardware to work, so we're going to take a look at how the solution works and how to set up the hardware itself.

Technical specifications

In order for IKinema Orion to work, you need to own an HTC Vive or Vive Pro, and in addition you also need to buy Vive Trackers: devices that work the same way as a Vive controller, but that can be attached to any object, ranging from a prop gun to a ball or a bat.

The smart thing about Orion is that the Trackers are mounted on specific parts of the body, namely the Head, Hips, Feet, Hands and Elbows, and using Inverse Kinematics ( or IK ) the software is able to move the body and limbs around in a very natural and precise way. The recorded animation is very high quality and requires less manual cleanup compared to IMU-based mocap solutions.
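To give a rough idea of what an IK solver does, here is a minimal two-bone IK sketch in Python: an analytic 2D solver that, given a target position for the end effector ( e.g. a hand ), computes the joint angles that reach it. This is purely illustrative; Orion's full-body solver is far more sophisticated.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in 2D: given limb segment lengths l1 and l2
    and a target (tx, ty) relative to the root joint, return the root
    angle and the elbow bend (radians) that place the end effector on
    the target, clamped to the reachable range."""
    dist = math.hypot(tx, ty)
    dist = max(abs(l1 - l2), min(l1 + l2, dist))  # clamp to reachable ring
    # Law of cosines: interior angle of the triangle at the elbow
    cos_elbow = (l1 * l1 + l2 * l2 - dist * dist) / (2 * l1 * l2)
    elbow_interior = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Root angle: direction to target minus the triangle's offset angle
    cos_offset = (l1 * l1 + dist * dist - l2 * l2) / (2 * l1 * dist)
    root = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_offset)))
    return root, math.pi - elbow_interior  # bend measured from a straight limb

if __name__ == "__main__":
    # Equal-length arm reaching halfway to full extension: 90-degree elbow bend
    root, bend = two_bone_ik(1.0, 1.0, 1.0, 1.0)
    print(round(math.degrees(root)), round(math.degrees(bend)))  # 0 90
```

The same law-of-cosines idea generalizes to 3D limbs; a full-body solver like Orion's additionally balances many such constraints ( head, hips, feet, hands ) at once.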

Hardware Setup

The Vive Trackers can be bought on the Vive website for 119€/120$, and they come with a USB cable, a Bluetooth dongle and a USB hub.

The setup in SteamVR is very easy, and it works as follows:

  • Plug the USB dongle into your PC

  • Using the USB cable, connect the Vive Tracker's USB port to the PC

  • The Vive Tracker will turn on, and in SteamVR you'll see the Vive Tracker icon turn green

  • Disconnect the USB cable from the Vive Tracker, then turn it on again: in SteamVR the Vive Tracker icon is now grey, and the Tracker LED is blue

  • Press the central button on the Vive Tracker until the blue light starts blinking, then in SteamVR right-click the grey Vive Tracker icon and choose Pair Tracker

  • SteamVR will now pair the dongle with that particular Tracker

  • Do the same for all the other Trackers

In order to keep everything well organised, I used adhesive tape to label each Tracker with the part of the body it will be mounted on.

To attach the Trackers to the body, I used a combination of GoPro accessories, some ¼” adapters and Velcro straps, and since the entire setup is adjustable, it can be adapted to people of different body sizes.

Recording and Streaming

IKinema Orion's installation is very straightforward: during setup you're asked where to install the software and which plugins you want to install.

Once the installation is complete and you start Orion for the first time, you'll notice that the interface is intuitive, with few but essential functions. All the basics are right in the startup window: you can choose which Vive Tracker setup to use, quickly start the motion capture process by pressing the Play button, and decide whether to record the mocap you're performing.

The presets in the lower part of the window affect the placement of the Vive Trackers on the body, whether the mocap will be recorded using the Vive HMD, and in general whether Orion will be used directly in VR ( more on that later ).

As for Vive Tracker placement, scrolling through the presets clearly shows where the Trackers go on the body. For a better understanding of exactly how they're mounted and how they need to be oriented, it's best to read the documentation on the official website, which describes tracker placement and orientation in detail, so be sure to follow it before starting the mocap process.

At the bottom of the window there are additional buttons. One of these is "Viewport", which opens a 3D view of the mocap scene where a character is shown in realtime, so you can watch the animation as it's being performed.

The "Help" button sends you to the documentation page, while the "Options" button pulls up a series of tabs that let you customize the mocap setup you're going to use.

First we have the "Input" tab, where you can choose to stream the animation in "Live" mode, which takes the data coming in in realtime. It also lets you record the take itself and give it a name: Orion saves the animation in an internal file format, and you can later review or play back the take using the "Playback Vive Take" drop-down menu.

By default you'll see a couple of mocap animations that you can play, but every time you record a take, it also appears in the drop-down menu under the name you chose when recording.

Switching to the next tab we have "Output", which lets you control the settings of the streaming port and the subject name.

Unless you need a specific port number or subject name, the defaults ( "Skeleton" as subject name and port 3883 ) are perfectly fine. What matters is which Output Avatar you're going to use, since the joint names are what drives realtime skeleton retargeting within Unity and Unreal Engine, so be sure to use the proper character for your software: the UE4 Mannequin for Unreal Engine and Unity-chan for Unity.
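Name-based retargeting can be pictured with a tiny sketch. The data layout below is hypothetical ( not Orion's actual wire format ): the point is simply that the engine matches streamed joint names against the target skeleton's bone names, which is why the Output Avatar has to match the character used in-engine.

```python
# Toy illustration of name-based retargeting (NOT Orion's actual format):
# the streamed pose is a dict of joint rotations keyed by name, and each
# target bone picks up the rotation whose name matches; bones with no
# match keep their rest pose (None here).

def retarget_pose(streamed_pose, target_bones):
    """Map streamed rotations onto target bones by joint name."""
    return {bone: streamed_pose.get(bone) for bone in target_bones}

if __name__ == "__main__":
    streamed = {"pelvis": (0, 0, 10), "spine_01": (0, 5, 0)}
    mannequin_bones = ["pelvis", "spine_01", "head"]
    print(retarget_pose(streamed, mannequin_bones))
    # pelvis and spine_01 receive rotations; head has no match
```

If the avatar's joint names don't match the in-engine skeleton, lookups fail exactly like the unmatched "head" bone above, which is why picking the wrong Output Avatar breaks the stream.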

The Output Avatar is also the character you see in the viewport while you're performing the animation, so from there too you can see which character is being used for realtime streaming.

You can also drive a male or female IKinema character, or use the Genesis character, which has the same skeleton hierarchy as the characters used in Daz3D.

If you instead select the "Actor" preset, a character with custom skeletal joint proportions is used, as shown in the next tab, also called "Actor".

Here you can input custom dimensions for the Arm, Spine, Neck and Legs, measured on the person who is going to perform the motion capture.

Remember that you can always save the Actor skeleton settings using the Save button below.

If we switch to the "Misc" tab, the first thing you can set is the calibration delay: by default it's set to zero, meaning calibration is triggered the moment the user presses the trigger of the Vive controller, while if you set it to a value higher than 0 there will be a countdown, at the end of which the calibration starts.

You can also choose to record the mocap animations in different formats, choosing between FBX and BVH, which are the most common formats when dealing with motion capture animations.

The next tab is the license tab, where you enter the license key provided by IKinema, and last we have the "About" tab, which shows the current Orion version you're using.

If you plan to use Orion itself as your main software to record motion capture, there are a few settings you need to configure, so do the following:

  • In the first window, select what kind of rig you're going to use

  • In the Input tab, make sure "Live" is selected, and optionally enable "Record Vive Take", just to have a backup file

  • In the Output tab, select which character you want the mocap to be created with

  • In the Misc tab, set the delay: 0 if you're using the Vive controllers for the hands, or 3-5 seconds if you're using Vive Trackers

  • Close the Options, enable "Record FBX" in the first window, then hit the Play button

So as you can see, this is a very fast and easy way to record your motion capture sessions, and the resulting FBX file can then be applied to your character for cleanup, or used for placeholder animations.

A notable restriction while using Orion is the choice of characters you can use while recording your mocap animations: you're limited to the already mentioned characters in the Avatar drop-down menu. If you want to record mocap animations using your own custom character, you have two options:

  • Pay 200£ to IKinema to receive a custom script tailor-made for your character's skeletal joints and proportions, so you can see your character in realtime inside the Orion Viewport

  • Buy IKinema Live Action for UE4 for 2.500£, which allows you to characterize your character in UE4 and see it at runtime inside Unreal Engine 4, driven by Orion's IK

Bonus Tip: every time you launch IKinema Orion, the license is checked online, so be sure you're connected via either cable or WiFi. If an updated version of Orion is available, the software will display an "UPDATE" notice in the initial window, so you're notified instantly and can choose to install the upgrade and stay on the latest version.

UE4 Streaming

Now that we've covered all the different functions within IKinema Orion, we're ready to stream realtime animation inside Unreal Engine 4.

First we do the setup in UE4: create an empty project, add the Plugins folder to the newly created project and restart UE4. As soon as it's open, go to Edit > Plugins, look for the "Animation" section, enable the Orion Streamer, and you'll be prompted to restart the editor.

As soon as the UE4 project has restarted, you need to do the following:

  • From the Content Browser, go to Add > Third Person Template

  • Navigate through the folders until you find the UE4 Mannequin Skeletal Mesh

  • Right-click on it, choose Create > Animation Blueprint, then give it a recognizable name

  • Open the Animation Blueprint, and in the AnimGraph right-click and type "orion"

  • A single node, called Orion Stream, will appear; choose it and you'll see it needs some information

  • In "Server name" just put "localhost", which is used when Orion is installed on the same PC that streams to UE4, rather than on a network host

  • In the Port number, if you're using the default, put 3883; if you chose something different, double-check it in the Output tab in Orion

  • As Subject Name use "Skeleton"

  • Set Reconnect to enabled, then compile and save

  • In the Content Browser, right-click and choose "Blueprint Class", then choose Pawn

  • In the Components tab, click the arrow and search for the Skeletal Mesh component

  • Go back to the Content Browser, select the UE4 Mannequin Skeletal Mesh, then assign it as the Skeletal Mesh in the PawnBP

  • Again in the Content Browser, look for the Animation Blueprint you previously created, set it as the AnimBP for the UE4 Mannequin, then compile and save
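As a quick sanity check before pressing Play, you can verify that something is listening on the streaming port configured in Orion's Output tab. This is a generic diagnostic of my own, not part of Orion or the UE4 plugin, and it assumes the stream is carried over TCP; if your setup streams over UDP, the check won't apply.

```python
import socket

def port_is_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Default Orion streaming endpoint when everything runs on one PC
    host, port = "localhost", 3883
    status = "reachable" if port_is_open(host, port) else "unreachable"
    print(f"Stream port {host}:{port} is {status}")
```

If the port shows as unreachable while Orion is running, double-check the port number in the Output tab and any firewall rules on the machine.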

Side note: as of today, the plugin for UE4.20 doesn't work, so be aware of that when you start developing your project; plugins for 4.19 and 4.21 are available.

In Orion, to start streaming the animation you need to do the following:

  • If you're going to stream live, make sure the "Live" option is selected in the Input tab; if you're going to stream a previously recorded animation instead, select Playback Vive Take and pick the take from the drop-down menu

  • If you're using a controller for the hands, make sure the delay in the Misc tab is set to zero; if you're using Vive Trackers on the hands, set the delay to something like 3-5 seconds

  • After that, close the Options menu using the button below and, based on your realtime setup, choose the rig from the drop-down menu

  • Press Play and either start the calibration with the trigger on the Vive controllers, or wait for the countdown; when you see a character appear on screen, the calibration is done

  • Go back to UE4, drag & drop the UE4 Mannequin Pawn into the scene, set its location to 0,0,0, then hit the Play or Simulate button

Now, in both the Orion Viewport and UE4, the Mannequin moves in realtime based on your movements, or alternatively the animation playback is streamed from Orion to UE4.

If you look through the official documentation, you'll also find a UE4 sample scene to download, with pretty much everything I described above already set up for you to use and test.

Bonus Tip: because Orion gets the Vive Tracker data externally, you'll notice that even when UE4 isn't in Play or Simulate, the Mannequin in the PawnBP keeps moving: the data is being sent constantly in realtime.

You may have some issues during calibration: sometimes, after you press the trigger or wait for the countdown, some text appears in Orion's first window telling you which tracker failed to calibrate properly.

If you keep having these issues and the calibration takes longer than usual to succeed, here is some useful advice:

  • If you have IR cameras, or your WiFi router is very close to the Tracker dongles, try turning the camera/router off or moving them away from the dongles

  • Give each dongle its own USB cable on the main USB hub ( also using the Vive Tracker dongle cradle ), in order to keep all the dongles at least 20cm from each other, since they can interfere with each other's signal

  • Turn Orion off and on again; sometimes that's all that's needed

Bonus Tip 2: you can also use UE4 to record the motion capture animation while it's being performed; here is what you need to do:

  • Assuming all the steps for realtime animation inside UE4 are already done, in UE4 go to Window > Sequence Recorder

  • Here choose Add and select the new sequence you just created

  • There are a number of options to set here, so we're going to:

  • Set the sequence length, which defaults to 60 seconds; if you need to record a long take, raise this value to something like 500-1000

  • Set the Recording Delay, by default 4 seconds

  • Scroll down to Actor Recording, where you can pick the actor to be recorded, so choose your character here. Mind you, only skeletal meshes can be recorded!

  • Scroll to the bottom, expand the Animation Settings and change the Sample Rate to 60, which corresponds to the FPS the animation is recorded at
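The relationship between sequence length and sample rate is simple arithmetic, which this tiny helper ( my own, not part of UE4 ) makes explicit: a take of a given length at 60 fps produces length × 60 frames, so make sure the sequence length actually covers your session.

```python
def frames_for_take(length_s, sample_rate=60):
    """Number of animation frames captured for a take of length_s
    seconds at the given sample rate (frames per second)."""
    return int(length_s * sample_rate)

if __name__ == "__main__":
    # Default 60 s sequence vs. a long 500 s mocap session, both at 60 fps
    print(frames_for_take(60))   # 3600 frames
    print(frames_for_take(500))  # 30000 frames
```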

Motion Capture use in Virtual Reality

Orion shines when it comes to Virtual Reality, since it just works with the same setup you use while performing motion capture animations, meaning the rig can be used for both without any additional tweaking.

So if we use the same setup in Unreal Engine 4, put on the Vive HMD, set the PawnBP's Auto Possess Player to Player 0 in the Details tab, and switch from Play to Play in VR, you can now see yourself in VR with your full body!

Yes, it's that simple, and it makes for a very comfortable experience in VR, since you can actually see your full body moving as in real life, rather than having two hands floating in space.

Since Orion takes care of the main joints of the body but doesn't track fingers at all, we can set some pre-recorded animations to play on trigger press, like the hand closing and opening when you press the Vive controller triggers, similar to what you see in the UE4 VR Template, so you can reuse those animations together with Orion.

Because I always try to push the available technology, I also used the Noitom Hi5 VR Gloves to add finger animation on top of the realtime body animation, and I built the entire setup in Unreal Engine 4 to combine both animations at the same time, as you can see on screen right now.

Also, Valve was kind enough to send me two pairs of Valve Knuckles ( or, as they're named nowadays, Valve Index Controllers ), and I quickly integrated them into the motion capture setup I'm using with IKinema Orion, adding another method of finger tracking while doing motion capture.

The result is a solution that very few studios are able to offer: finger animation is usually added after the main body mocap has been recorded, as a task assigned to the animator, mostly using hand poses and traditional keyframe animation. This setup drastically reduces the need to do those animations manually, saving time and letting the actor performing the mocap deliver 100% of their performance.

In addition to that, since Augmented Reality has become more mainstream and recent phones support the technology, I was also able to add facial motion capture, using ARKit and the iPhone X with the FaceAR sample scene provided by Epic Games as a base; on top of that I also used the Oculus Lipsync plugin, driven by realtime audio, blending the results of both in realtime.

The current result is a fully featured, head-to-toe motion capture solution that gives solo developers and studios top-quality motion capture animations while keeping hardware and software costs relatively low, compared to big names like OptiTrack and Vicon.

I experimented a lot with Full Body in VR. Since I had previously developed a custom solution for the Hi5 gloves, creating hand poses and assigning actions to them, like grab, teleport and move forward, I applied that template to Full Body VR, so that both hands can be used with predefined roles. As you can see on screen, they can be used for actions like grabbing and releasing objects in the world, or for moving and teleporting around, while still keeping the full body in VR. The entire setup can easily be expanded for job training simulations, line operator behaviour studies, posture and movement studies, and many more, so the possibilities are endless.

Final Thoughts

IKinema Orion has made optical tracking technology available to anyone, by providing a solution that is very cost effective and ensures top-quality animations out of the box.

Although it's a software-only solution and all the hardware needs to be bought separately, it's a must-own for whoever wants to experiment with top-quality motion capture and Full Body VR.

The solvers and IK setup are really top notch, and the recorded animations can easily be used as temporary placeholders before they're cleaned up; overall, the money you're going to spend is really worth it.


  • Very easy to use

  • Top quality Motion Capture Animations

  • Can be easily used for VR

  • Lots of uses for non-gaming experiences

  • Realtime streaming towards Unity and Unreal Engine available via plugins


  • Hardware + software pricing can be very high for some ( up to 5k€ )

  • Limited realtime character selection, which can be solved by either paying 200£ for a custom character, or buying IKinema Live Action for 2.500£

  • Limited tracking area ( 5x5m for the Vive, 6x6m for the Vive Pro ), which can be expanded by buying additional Lighthouse 2.0 base stations, only for the Vive Pro

  • Realtime streaming towards Motion Builder/Maya missing


8 Vive Trackers = 119€ x 8 = 952€

HTC Vive / Vive Pro = 599€ - 1.119€ / 1.399€

IKinema Orion License = 450€ / year

Noitom Hi5 VR Gloves = 999€ / 999$ ( Optional )

Valve Knuckles = 299€ / 279$ ( Optional )

iPhoneX or later = 1.200€ / 1.300$ ( Optional )

= Between 2.000€ / 2.230$ and 5.000€ / 5.560$
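The totals above are easy to verify. This small sketch ( prices in euros, as listed at the time of the review; they will drift over time ) recomputes the minimum and maximum configurations, which line up with the ~2.000€-5.000€ range:

```python
# Budget calculator for the setup described in this review (EUR).
PRICES = {
    "vive_trackers_x8": 119 * 8,   # 952
    "htc_vive": 599,
    "vive_pro": 1399,
    "orion_license_year": 450,
    "hi5_gloves": 999,             # optional finger tracking
    "iphone_x": 1200,              # optional facial capture
}

# Minimum: base Vive, trackers and one year of Orion
minimal = (PRICES["vive_trackers_x8"] + PRICES["htc_vive"]
           + PRICES["orion_license_year"])

# Maximum: Vive Pro plus the optional gloves and iPhone
maximal = (PRICES["vive_trackers_x8"] + PRICES["vive_pro"]
           + PRICES["orion_license_year"] + PRICES["hi5_gloves"]
           + PRICES["iphone_x"])

print(minimal, maximal)  # 2001 5000
```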