ARTICLE UPDATE:
The ManipulationHandler will soon be deprecated, so use the ObjectManipulator instead. See the follow-up article here: https://codeholo.com/2020/09/24/anchoring-objects-with-local-anchors-and-persisting-with-hololens-2/
Hello everyone! I’m back with another tutorial for the HoloLens 2. This time it is about manipulating 3D models, i.e. moving, rotating and scaling objects, using the HoloLens 2 and the latest stable MRTK V2.3.
As a bonus, we will also explore how to turn on and off the hand mesh visualization and the diagnostics in this tutorial. The hand mesh is especially useful if you want to make use of the new hand tracking and fingertip visualization feature from MRTK.
Microsoft has covered this very well in their tutorials here and here.
You may have seen my tutorial for the HoloLens 1 with moving, rotating, scaling objects with the older HoloToolkit here.
So, yes, Microsoft has upped its game with the new toolkit and the new headset. So many things have gotten better. I have given an introduction to getting started with the HoloLens 2 and MRTK V2.0 in my previous article.
Before we start, a few changes:
1. I used Unity version 2019.3.1f1. This is because I am now switching completely to IL2CPP as the scripting backend. It is a lot more compatible if you are writing projects that also combine with other platforms, such as iOS with ARKit. Moreover, Unity 2019.x versions only offer IL2CPP as the scripting backend for UWP, as the .NET backend has been deprecated.
2. I used the latest stable release of MRTK V2.3.
Ok, let’s get started. By the way, if you did not already know, you can use the Unity Hub to download different versions of Unity and open projects with different versions. I found it really helpful while switching between projects with different versions.
1. First, create a new project in Unity with version 2019.3.x
2. Now you need to import the MRTK V2.3 packages. The mandatory one is called Foundation. The optional ones are Examples, Extensions and Tools. I will not go into detail about Extensions and Tools, but you could also import Examples, because it has a good collection of prefabs and example scenes which you may need for your projects.
3. Once the import finishes, you’ll see an Apply Default Settings popup. Click on Apply. This makes sure the MRTK default settings are applied to your project.
4. Click on Mixed Reality Toolkit in the top menu and click on Add to Scene and Configure. This adds the Mixed Reality essentials to the scene, such as the Playspace and the Toolkit.
The Playspace has the Main Camera and the Toolkit has the essential things you’d need for your project. Feel free to expand the two and check out what’s in there.
5. Now let’s have a look at the profile. As I was saying, everything has changed. Some essentials like the Input Manager, Cursor etc. are now part of what is called a profile. Each type of behaviour is controlled by a different kind of profile: for example, the Input System profile controls the input behaviour, the Spatial Awareness profile controls the spatial mapping system, and so on. For the HoloLens 2, we will choose the DefaultHoloLens2ConfigurationProfile. To do this, click on the MixedRealityToolkit and, in the dropdown for profiles, choose the DefaultHoloLens2ConfigurationProfile.
6. You’ll notice that none of the behaviours like camera, input, diagnostics etc. are editable. But we’d like to change some things for this tutorial, for example turning off the annoying diagnostics (I find it interferes with the view). For this, there are two ways: you can either Clone the profile or click on Copy & Customize. I haven’t noticed any difference between the two; both bring up the same popup. If I do find out the difference, I will update it here. So let’s click on Copy & Customize, which brings up a popup.
In this popup, we want several profiles to be editable. Therefore, we will expand the Advanced Options and clone the Input System Profile and the Diagnostics System Profile by clicking on Clone Existing for each of them. Then click on Clone.
Now you’ll see that your Input and Diagnostics profiles are editable. The reason why we want the Input profile is to turn on/off the hand mesh visualization. But we will come to this later.
Let’s now get a 3D object which we would like to manipulate:
The Smithsonian recently opened up its awesome collection, where .obj files are available to download 🙂 These are available both in full and low resolution and they are uber cool. So, I went ahead and downloaded one of them. Heads up: it took a while for me to download the full-res version. Also, make sure you take the ones marked with CC and credit them. You only get .obj files, so you have to convert them to .fbx. Use Blender or a similar tool. You could even use a simple 3D object in your scene, but this one makes it cooler 🙂
I took the Triceratops, which is a really cool dinosaur fossil. I converted it to .fbx and then created a material with the JPGs provided with it.
Let’s do a quick check now. Let’s build and deploy and see what we can see. Make sure the object is visible in front of the camera. I also rotated it around the Y axis by 90° so it faces the camera. You also MUST NEVER change the MixedRealityToolkit and MixedRealityPlayspace positions. They must always be at 0,0,0. If you need to adjust anything, always adjust the objects.
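If you’d rather position the model from a script instead of the Inspector (for example when spawning it at runtime), here is a minimal sketch. The distance value and the 90° Y rotation are just the values I used above, and the class name is hypothetical:

using UnityEngine;

// Hypothetical helper: places the model roughly in front of the user at startup.
// Attach it to the Triceratops object; the values mirror the Inspector setup above.
public class PlaceInFront : MonoBehaviour
{
    [SerializeField] private float distance = 2f;   // metres in front of the camera

    private void Start()
    {
        var cam = Camera.main.transform;
        // Put the object in front of the camera and rotate it 90° around Y to face the user.
        transform.position = cam.position + cam.forward * distance;
        transform.rotation = Quaternion.Euler(0f, 90f, 0f);
        // Note: never move the MixedRealityToolkit or MixedRealityPlayspace objects themselves.
    }
}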
Make sure you also go to Player Settings and check Virtual Reality Supported. The list will be empty, so add Windows Mixed Reality.
Go to Build Settings, choose the build settings shown in the screenshot and then click on Switch Platform. Choose a folder to build into, just like for the HoloLens 1. The first build will take some time because it is an IL2CPP build; it takes about 10 minutes depending on how big your project is.
If all is well, then you should see a tiny dino in the view and the small Diagnostics window floating around.
Tip: To reduce build times, build into the same folder next time – it takes about 2 minutes or less.
Ok, now let’s make the dinosaur big. I increased the scale to 10,10,10.
Let’s also disable the Diagnostics tab:
Note that the diagnostics view is especially useful when you have a lot of performance-affecting modules in your project, as it shows you exactly at which point the performance is hit. Since we have a low-res model and don’t do anything complex, we can turn it off.
Click on the MixedRealityToolkit and on the Diagnostics tab. Then uncheck the Enable Diagnostics System.
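If you would rather keep the diagnostics system enabled in the profile and only hide the overlay at runtime (handy for demos), MRTK exposes this through its diagnostics service. A minimal sketch, assuming the CoreServices API that has been available since MRTK 2.1:

using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Hides or shows the MRTK diagnostics overlay at runtime instead of
// disabling the whole Diagnostics System in the profile.
public class DiagnosticsToggle : MonoBehaviour
{
    public void SetDiagnostics(bool visible)
    {
        if (CoreServices.DiagnosticsSystem != null)
        {
            CoreServices.DiagnosticsSystem.ShowDiagnostics = visible;
        }
    }
}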
Let’s go to the next step of adding the manipulation:
Firstly, we need a collider. Since the object is a dino, I will add a simple capsule collider to it. I don’t want to overcomplicate the collider setup in this tutorial.
Now click on the Triceratops object and Add Component and type Capsule Collider.
The collider will be quite big, so make sure you adjust it to fit the dino correctly.
My dino collider values are shown in the screenshot.
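If you want the collider set up from code instead of the Inspector (for example when you spawn the model at runtime), here is a minimal sketch; the centre, radius and height values are placeholders, so use whatever fits your model:

using UnityEngine;

// Adds and roughly fits a capsule collider to the model at runtime.
// The values below are placeholders; adjust them to hug your own mesh.
public class SetupCollider : MonoBehaviour
{
    private void Awake()
    {
        var capsule = gameObject.GetComponent<CapsuleCollider>();
        if (capsule == null)
        {
            capsule = gameObject.AddComponent<CapsuleCollider>();
        }
        capsule.direction = 2;            // 0 = X, 1 = Y, 2 = Z (along the dino's body)
        capsule.center = Vector3.zero;    // placeholder, match your Inspector values
        capsule.radius = 0.5f;            // placeholder
        capsule.height = 2f;              // placeholder
    }
}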
Now we need the Manipulation Handler. On the Triceratops, click on Add Component and add the Manipulation Handler from the MRTK.
Let’s look at the Inspector properties on the Manipulation Handler:
In the Host Transform drag and drop the object you want to manipulate. In this case, it is the Triceratops.
Let’s keep the Manipulation Type as One and Two Handed, so you can manipulate it with one or both hands. (I prefer two-handed personally, but it’s your choice.)
We want to manipulate the position, rotation and scale of the dino, so keep the Two Handed Manipulation Type set to Move Rotate Scale.
For the Manipulation Events, let’s add a sound clip when we start and end a movement, so that we also have some audio feedback. For that we need to add an Audio Source to the model: click on Add Component and add Audio Source.
On Manipulation Started, add Runtime Only, look for AudioSource.PlayOneShot and add MRTK_Move_Start.
For On Manipulation Ended, add MRTK_Move_End.
Don’t forget to drag the model (with the Audio Source) onto the GameObject field.
You can keep the rest of the Default Settings as the same.
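If you prefer wiring this up in code, here is a minimal sketch of the same configuration. The property and enum names (HostTransform, ManipulationType, TwoHandedManipulationType, OnManipulationStarted/Ended) are as I recall them from MRTK 2.3, so double-check them against your version; the two audio clip fields are meant to hold MRTK_Move_Start and MRTK_Move_End.

using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Configures the ManipulationHandler and hooks up audio feedback from code.
// Attach to the Triceratops; assign MRTK_Move_Start / MRTK_Move_End in the Inspector.
[RequireComponent(typeof(AudioSource))]
[RequireComponent(typeof(ManipulationHandler))]
public class ManipulationSetup : MonoBehaviour
{
    [SerializeField] private AudioClip moveStartClip;   // e.g. MRTK_Move_Start
    [SerializeField] private AudioClip moveEndClip;     // e.g. MRTK_Move_End

    private void Awake()
    {
        var audioSource = GetComponent<AudioSource>();
        var handler = GetComponent<ManipulationHandler>();

        handler.HostTransform = transform;  // the object that actually gets moved/rotated/scaled
        // Enum and property names as I recall them in MRTK 2.3 - verify against your version.
        handler.ManipulationType = ManipulationHandler.HandMovementType.OneAndTwoHanded;
        handler.TwoHandedManipulationType = ManipulationHandler.TwoHandedManipulation.MoveRotateScale;

        // Audio feedback when a manipulation starts and ends.
        handler.OnManipulationStarted.AddListener(_ => audioSource.PlayOneShot(moveStartClip));
        handler.OnManipulationEnded.AddListener(_ => audioSource.PlayOneShot(moveEndClip));
    }
}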
Now we also add the Bounding Box:
This is now nicely stylized with the new HoloLens 2.
Click on the Triceratops and Add Component and add the Bounding Box
We now need to add a bunch of assets from the MRTK into the Inspector values of the Bounding Box.
For the Bounding Box settings, add the Triceratops as the Target Object.
In the Box Material add BoundingBox
In the Box Grabbed Material add BoundingBoxGrabbed
Add BoundingBoxHandleWhite to Handle Material
BoundingBoxHandleBlueGrabbed to Handle Grabbed Material
MRTK_BoundingBox_ScaleHandle to Scale Handle Prefab
MRTK_BoundingBox_ScaleHandle_Slate to Scale Handle Slate Prefab
MRTK_BoundingBox_RotateHandle to Rotation Handle Prefab
If the Proximity Effect is Active, the handles are shown and hidden with an animation based on the distance of your hands. This is explained in more detail here: https://microsoft.github.io/MixedRealityToolkit-Unity/version/releases/2.3.0/Documentation/README_BoundingBox.html
We keep the recommended settings.
We will also add one last thing to the Manipulation Handler, so that the Bounding Box edges behave the same way while moving the object using the Manipulation Handler’s far interaction.
For this, go to your Manipulation Events under the Manipulation Handler and add BoundingBox.HighlightWires to On Manipulation Started and BoundingBox.UnhighlightWires to On Manipulation Ended.
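The same wiring can also be done from a script. A minimal sketch, assuming the BoundingBox methods are named HighlightWires and UnhighlightWires as they appear in the event dropdown:

using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Highlights the BoundingBox wires during (far) manipulation, mirroring the
// Inspector event wiring described above.
[RequireComponent(typeof(BoundingBox))]
[RequireComponent(typeof(ManipulationHandler))]
public class WireHighlighter : MonoBehaviour
{
    private void Awake()
    {
        var box = GetComponent<BoundingBox>();
        var handler = GetComponent<ManipulationHandler>();

        handler.OnManipulationStarted.AddListener(_ => box.HighlightWires());
        handler.OnManipulationEnded.AddListener(_ => box.UnhighlightWires());
    }
}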
Ok, now time for another intermediate build and deploy to test whether what we added works so far. Build it to the same folder so that your build is faster.
Before you do that, you can quickly check for errors by trying the simulation in the Editor: click Play. You should be able to see the dinosaur with a blue cube around it. This is the BoundingBox. You can also simulate the hands by following the hand simulation tips by Microsoft here.
However, I find it best to try it on the headset directly.
If everything goes well, then you should see the dino in front of you when you start, and you can put your hands near it, pinch it and drag it wherever you want. Since one and two handed manipulation is enabled, you should also be able to use both hands to scale, rotate and move the dino. It should also play your audio clips when you start and end a manipulation.

What I realized is that the default handle sizes on the object are too small for me to pinch. So let’s fix that. I like to have bigger handles so that I know where I am grabbing. The handles turn blue once grabbed, which is good feedback, unlike the HoloLens 1 handle style.
Go to the Bounding Box’s Scale Handle Size and make it bigger. I set it to 0.1. Do the same for the Rotation Handle Size.
Another annoying thing I like to remove is the blue box around the dino when the app starts. I want it to appear only when I start the manipulation. So uncheck Show Wireframe.
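The same tweaks from code, as a sketch; the property names (ScaleHandleSize, RotationHandleSize, ShowWireFrame) are my best recollection of the MRTK 2.3 BoundingBox API, so verify them against your version:

using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Enlarges the BoundingBox handles and hides the wireframe until the object is grabbed.
[RequireComponent(typeof(BoundingBox))]
public class BoundingBoxTweaks : MonoBehaviour
{
    private void Awake()
    {
        var box = GetComponent<BoundingBox>();
        box.ScaleHandleSize = 0.1f;     // bigger scale handles, easier to pinch
        box.RotationHandleSize = 0.1f;  // bigger rotation handles
        box.ShowWireFrame = false;      // hide the blue box until manipulation starts
    }
}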
Ok, build and deploy once more. Don’t forget to use the same folder, unless your app folder gets corrupted for some reason and you have to create a new one and build and deploy into that.
Now you’ll see the handles only when you go near the dino or gaze at its edges, because the Proximity Effect is active, and the whole BoundingBox animates and highlights its wires (remember the HighlightWires method we added to the ManipulationHandler?).

Since far manipulation is also enabled, you can also gaze at the dino and do the pinch gesture from a distance with your hands (one or two handed, depending on what you have enabled). You will see the hand rays “grabbing” the dino, and then you can turn it or scale it. So, you do not have to be close to an object to manipulate it.

Ok, now let’s move on to the next part, where we see our hands covered by a mesh-like overlay:
This is useful if you want to see a pseudo version of your hands in the scene, and it is especially useful for understanding when you interact with an object. It makes use of the new HoloLens 2 hand tracking feature, and the mesh is called hand mesh visualization. This is turned off by default because it takes a hit on performance. But again, we are not doing anything complex and our model is low res, so we can turn it on.
For this, go to the MixedRealityToolkit and, in the profile, click on the Input tab. Here you will see that the Hand Tracking section is not editable. To change it, we have to clone the Hand Tracking profile.
Once you clone it, the settings for Hand Tracking become editable. Make sure the Hand Mesh Prefab is set to ArticulatedHandMesh and Hand Mesh Visualization Modes is set to Everything. The Hand Joint Visualization Modes setting shows the tracked joints as well, but we don’t need this.
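If you want to flip this setting at runtime (for example from a debug button) rather than editing the profile asset, MRTK’s hand tracking documentation shows a pattern along these lines; a sketch, assuming the EnableHandMeshVisualization property on the hand tracking profile:

using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Toggles hand mesh visualization at runtime instead of editing the profile asset.
public class HandMeshToggle : MonoBehaviour
{
    public void ToggleHandMesh()
    {
        MixedRealityHandTrackingProfile handTrackingProfile = null;

        if (CoreServices.InputSystem?.InputSystemProfile != null)
        {
            handTrackingProfile = CoreServices.InputSystem.InputSystemProfile.HandTrackingProfile;
        }

        if (handTrackingProfile != null)
        {
            // This is the runtime switch behind the Hand Mesh Visualization profile setting.
            handTrackingProfile.EnableHandMeshVisualization = !handTrackingProfile.EnableHandMeshVisualization;
        }
    }
}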
Cool, so now build and deploy one last time.
Now you should be able to see a mesh on your hands when you start your app and also while manipulating. I found this really cool because it lets me know exactly when I have pinched a handle on the object to rotate or move it. It is also useful when you want to touch buttons or other interactable objects and know exactly where you touched.

Your app output should look something like the video below:
Errors and Solutions:
Error: Object does not get manipulated (move, rotate or scale)
Solution: Check if there is a collider attached and if it is properly fitted onto the object
Error: Build in Unity project gets stuck with some font asset error from MRTK
Solution: I have looked around for a solution to this, but in vain. The only thing that helped me was removing the MRTK Foundation package, cleaning the project and importing it again. It is a pain; I hope they fix this in future versions…
Error: No handles shown or no handles shown when I go near
Solution(s): Check if the Bounding Box script is attached to the object and if the Proximity Effect is Active. Also check that the handle sizes are greater than 0.
That’s it! Do tell me if this tutorial helped you and stay tuned for more HoloLens 2 and MRTK V2 explorations.
The project itself is in the Git Repository here.
Credits: The Triceratops 3D model was downloaded from the Smithsonian under the Creative Commons License.