Many of you wrote to me that QR code tracking from the previous article was not working. In most cases you were also using the newer versions of Unity (2020.2.x+), which support only OpenXR and not Windows MR, so I kept replying with a link to my other article that showed a fix for this.
So I thought I would make it easier by writing one article that combines something new with a clearer fix for the old. This article explains how to align a 3D model over a real-world object using QR code detection (also an article request) and then read some data from a JSON file to eventually do something with it. The QR code will be on the real-world object, i.e. a shoebox in my case. I will be using Unity 2022.1.12 and MRTK 2.8.2, so both are close to the latest versions.
First step: set up project and get the relevant plugins
- Create a new 3D project with Unity 2022.1.12f1 or any version you choose, but note that this code works for versions above 2020.2.x, i.e. the ones using OpenXR
- Get MRTK v2.8.2. You can either import it directly into your project or get it via the Mixed Reality Feature Tool. I showed how to use this tool in my previous article, so I will not describe it here again. Once you import it, go back to the Unity project; it will refresh and offer to install the Unity OpenXR plugin. Go ahead and install it. It will show a popup asking to refresh the APIs – click Yes and the project will refresh
- Now you need to set the project up for OpenXR. In case you are migrating from older versions, follow this article. I will show it here for a new project. Use the Mixed Reality Feature Tool to get the Mixed Reality OpenXR plugin: select the project path and click on Discover Features.
- Once the APIs reload, install the Mixed Reality OpenXR Plugin the same way
- When you go to Unity now, it will show you a popup to enable XR Plugin Management with OpenXR settings. Click on Show XR Plugin Management settings.
- Click on UWP and select OpenXR. It will show a yellow warning icon; click on it to see what needs to be done. If it asks for interaction profiles, click on Edit
Add Eye Gaze and Hand Interaction Profiles. Make sure you click on yellow icons to fix all.
This is an important step to ensure everything is correctly set up for OpenXR settings.
Now restart your project so that everything is refreshed and loaded correctly.
- After I restarted, Unity showed me this popup: you need to install the Microsoft OpenXR Plugin if you are targeting HoloLens 2 or HP Reverb G2. So go ahead and click on Install Microsoft OpenXR plugin. This happens if you skipped installing the Mixed Reality OpenXR plugin. So apparently, I forgot to click Import in Step 4. 🙂 My bad! That was a good prompt from Unity.
Click on Show instructions.
Ok, so I went back to Step 4 and clicked the Import button for the Mixed Reality OpenXR plugin, which I had previously forgotten. So if you have done Step 4 properly, you can skip Step 7. The popups for XR Configuration Settings will ask you to apply the MRTK settings; keep clicking Next or Done wherever it prompts you.
- Go to XR Plugin Management, check the OpenXR settings, and select the Microsoft HoloLens feature group under UWP. If a yellow icon appears next to OpenXR, click on it and choose Fix All. If no yellow icons or prompt popups appear anymore, everything has been set up correctly.
Set up scene and MRTK profile
This project shows the right profile settings for eye tracking and hand tracking to work. I will not explain each of them, as that has been done in previous articles. You can also access the GitHub project and check the profile settings in detail.
- Select Mixed Reality – Toolkit – Add to Scene and Configure.
- For the profiles, the main changes are in the Input profile
Below are the right settings for eye tracking to work with OpenXR:
- Make sure under XR Plugin Management you have selected Eye Gaze Interaction profile as we set up in the previous section
- Make sure the Input Data Providers include the OpenXR XRSDK Device Manager and the Eye Gaze Provider
- In the Input Simulation Service under Eye Gaze Simulation Mode make sure you select Mouse
- Under Pointers and Gaze Settings, check Use Eye Tracking Data and uncheck Use Head Gaze Override.
You can play around with these settings to choose different options. But I found this to be the best for it to work both in Editor and Hololens 2.
The rest of the Profile settings can be referred to in the project itself.
QR code tracking with new Open XR APIs
My reference for this was here: https://github.com/yl-msft/QRTracking/tree/main/SampleQRCodes/Assets/Scripts But I modified it slightly to display a 3D model where the QR code is scanned. For the 3D model, I used a simple cube which I scaled into a cuboid that fits my shoebox; I will use this to demonstrate alignment. My shoebox measures 31.5 cm × 23 cm × 12 cm, so I scaled the cuboid to match and gave it a nice red color. Move it to a random position so that later it moves to the scanned QR code position according to the code.
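You can of course do all of this in the Editor; just as a sketch, creating a matching cuboid in code would look roughly like this (remember Unity units are meters, so the centimeter dimensions become 0.315, 0.12 and 0.23):

```csharp
using UnityEngine;

// Hypothetical helper, not part of the project: creates a red cuboid that
// matches a 31.5 cm × 23 cm × 12 cm shoebox lying flat.
public class ShoeboxModel : MonoBehaviour
{
    void Start()
    {
        var cuboid = GameObject.CreatePrimitive(PrimitiveType.Cube);
        // width (x), height (y), depth (z) in meters
        cuboid.transform.localScale = new Vector3(0.315f, 0.12f, 0.23f);
        cuboid.GetComponent<Renderer>().material.color = Color.red;
    }
}
```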
Note on my shoebox: Reebok is not the sponsor for this blog article :))
Get all the QRcode related scripts and prefabs into the project
First, go ahead and get the Microsoft.MixedReality.QR package via NuGet for Unity. I also explained how to do this in the previous article. Make sure you restart your Unity project so that the NuGet menu shows up, and check that both packages appear under Packages.
While installing from Nuget, if you get the error which I got, “Insecure connections not allowed“, then refer to the solution below under Errors and Solutions.
Next, I have a folder with all the related scripts and the QR code prefab. It's similar to the previous article, so I will not go into details here. Get the QRCodeStuff folder into your project. For OpenXR, the only difference is on the QRCode prefab: attach the QRCode script and the SpatialGraphNodeTracker script, which is the OpenXR replacement for SpatialGraphCoordinateSystem. In QRCodesVisualizer, you then reference SpatialGraphNodeTracker instead of SpatialGraphCoordinateSystem.
Get the following in your scene: QRCodePanel, QRCodesManager
Set up the references in your QRCodePanel. On the StartScanButton's OnClick(), call StartScan() on the QRCodesVisualizer:
and on the StopScanButton, OnClick(), call the StopScan() function:
What we want to achieve is: we scan a QR code and show the 3D model at that pose, i.e. it should align to the shoebox. Then we show some arrows on it that point you to where to look and what to do there. This “do something” will be read from a JSON file.
Scanning and aligning 3D model to real-world object
In your QRCodesVisualizer.cs, add a reference to the 3D model. We will also make a simple panel called StepHandlerPanel, which will show up along with the 3D model. For now, do not worry about what goes into the panel; it will display the data read from the JSON file. I have also included the StepHandlerPanel prefab in the project. The rest of the fields are the same as in the previous article for Status Text and Latest QR Code Details.
Add the StepHandlerPanel to the scene and hide it in the Hierarchy for now. It will be unhidden once we find our QR code and align the object. Add these as references in the Inspector of the QRCodesManager.
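The extra serialized fields look roughly like this (a sketch – the field names are my own choice, match them to the actual script in the project):

```csharp
using UnityEngine;

public class QRCodesVisualizer : MonoBehaviour
{
    [SerializeField] private GameObject qrCodePrefab;      // visual shown on the scanned code
    [SerializeField] private GameObject model;             // the red cuboid (later: its parent)
    [SerializeField] private GameObject stepHandlerPanel;  // panel fed from the JSON file
}
```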
If you look at HandleEvents() in QRCodesVisualizer.cs, I have added a few more things to make it clear where we found the QR code, and to update its position while we keep tracking it – in case we move the QR code while tracking, or in this case, if we move the shoebox with the QR code on it.
Next, look at the StopScan() function in the script. I have modified it to show the model as soon as we stop scanning a QR code (user triggered). We then translate the model pose to the QR code pose and show the StepHandlerPanel in front of the user.
Note that I compare the scanned QR code to a code I specifically set up for alignment. I created a QR code with the text “Alignment code” – CASE SENSITIVE!! – and check whether this is what was found. This filters out any random QR codes you may encounter while scanning and aligns the model only when the right QR code is scanned.
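As a rough sketch of what StopScan() does (names like `lastTrackedCode` and `lastTrackedCodePose` are my assumptions, not the exact project code – check QRCodesVisualizer.cs in the repo for the real version):

```csharp
// The payload text of the dedicated alignment QR code – case sensitive!
private const string AlignmentCodeText = "Alignment code";

public void StopScan()
{
    QRCodesManager.Instance.StopQRTracking();

    // Only align when the last tracked code carries the expected payload.
    if (lastTrackedCode != null && lastTrackedCode.Data == AlignmentCodeText)
    {
        // Move the model to the pose of the scanned QR code.
        model.transform.SetPositionAndRotation(lastTrackedCodePose.position,
                                               lastTrackedCodePose.rotation);
        model.SetActive(true);

        // Show the step panel roughly 1.5 m in front of the user.
        var cam = Camera.main.transform;
        stepHandlerPanel.transform.position = cam.position + cam.forward * 1.5f;
        stepHandlerPanel.SetActive(true);
    }
}
```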
Capabilities to set
Choose the following in your Player – Publishing Settings: Internet Client, WebCam, Microphone(if you want speech triggered commands to start and stop QR code scanning), Spatial Perception and Gaze Input.
Build and try
Right. So let’s give it a build and try what we have so far. Place a QR code on your object; I placed it in the middle of my shoebox so that the model aligns exactly on the box. Otherwise, manually add offsets to adjust your 3D model. I added some offsets on the z axis so that my model sits perfectly on the shoebox after scanning.
Accept the capability prompts in the app, then press Scan on the QRCodePanel. I have set the panel to AutoTracking mode, so it keeps scanning until you explicitly press the Stop button. When you scan and see the QR code, it should immediately display the QR code prefab on the code. Then press Stop on the QR code panel, and this should immediately place the red 3D model at the same pose as the QR code.
When you do this, you will notice that the 3D model takes the exact pose of the QR code, which places my 3D model vertically on the shoebox – but it needs to be placed horizontally. So let’s correct this. First, add an empty parent GameObject to the model; I call it ModelParent, with rotation 0,0,0. Now change the Model reference on the QRCodesManager’s QRCodesVisualizer.cs to ModelParent. This ensures that when ModelParent takes the pose of the scanned QR code, the child GameObject – our actual 3D model – keeps its horizontal placement, so the model is no longer rotated unnecessarily. All the offsets I previously added to the child object now go on the parent object.
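In code terms, the trick is simply that only the parent is moved (a sketch; `modelParent` and `qrPose` are assumed names):

```csharp
// Only ModelParent takes the QR code pose; the child cuboid keeps its own
// localRotation (set once in the Editor), which is what keeps it lying flat
// on the box regardless of how the QR code itself is oriented.
modelParent.transform.SetPositionAndRotation(qrPose.position, qrPose.rotation);
```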
Setting up points of interest on the real-world object to look at
Imagine you have to inspect something on the shoebox. To exemplify that, I have set up 5 arrows on different parts of the shoebox. You do this by attaching the arrow GameObjects to the shoebox 3D model at the positions you want. This way, when the model is aligned, you know exactly where to look on your shoebox. I have also put the Arrow prefab in the QRCodeStuff folder.
The JSON part
The JSON file which we have to read
Now that you have arrows set up at the points of interest, we need something that brings these points of interest to the user’s attention one at a time (or adjust this to your use case). We do this by reading these “steps” from a JSON file. The JSON file simply contains 5 steps, each with an ID and a description of what to do in that step. Feel free to adjust the JSON file according to your use case; I simply write “Do something Step 1”, “Do something Step 2” and so on. The JSON file is also in the project. For the HoloLens 2 to read it, put it in your StreamingAssets folder; then we write a script which reads the contents from there. If you don’t already have that folder, just go ahead and create it. I called my JSON file StepData.json.
Make sure your JSON is rightly formatted.
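For illustration, a minimal StepData.json could look like the snippet below. This assumes the file holds a plain array (which the wrapper shown later expects) and that the field names match the StepData class fields exactly – Unity’s JsonUtility is strict about that. The names `Id` and `Description` are my assumption; use whatever your StepData class declares.

```json
[
  { "Id": 1, "Description": "Do something Step 1" },
  { "Id": 2, "Description": "Do something Step 2" },
  { "Id": 3, "Description": "Do something Step 3" },
  { "Id": 4, "Description": "Do something Step 4" },
  { "Id": 5, "Description": "Do something Step 5" }
]
```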
For the JSON to be read, you need to create a StepData class which has an ID and a description, and it should be Serializable. So create this in your project; I called it StepData.cs.
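A minimal sketch of such a class (field names are my assumption and must match the keys in your JSON file):

```csharp
using System;

[Serializable]
public class StepData
{
    public int Id;              // step number
    public string Description;  // text shown on the StepHandlerPanel
}
```

Note that JsonUtility only deserializes public fields (or ones marked [SerializeField]), so keep them fields rather than properties.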
The Step Handler Panel
You already added this panel when we were setting up the QRCodesManager. The panel is simply a Near Menu with Next and Back buttons and a text field to show the step description. As you press the Next or Back buttons, the corresponding arrows for the 5 previously configured steps are shown or hidden. The panel is only shown when the QR code scanning has succeeded and your alignment code has been found.
Also go ahead and disable the arrows. Each one gets enabled only when the corresponding step is navigated to.
The Step Handling Logic
Now I wrote a simple script called StepHandler.cs which holds references to those 5 arrow prefabs, the Next and Back buttons, the number of steps and the description. Add an empty GameObject to the scene, attach StepHandler.cs to it, and add those references in the Inspector. I called my GameObject StepHandler.
Before we try it, let’s have a look at our StepHandler.cs.
What we do is: as soon as we start, we read the file. Remember, we put the file in StreamingAssets, so to locate it, use Application.streamingAssetsPath. Then we read the text and load it into a string.
Since our JSON has more than one StepData item, we use a wrapper to deserialize the array of StepData items into individual objects. The code for that wrapper is not mine; I adapted it from this StackOverflow answer: https://stackoverflow.com/questions/36239705/serialize-and-deserialize-json-and-json-array-in-unity
We then start the step handling at index 0, so that the first step is loaded as soon as we show the panel.
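Roughly, the reading and the wrapper look like this (a sketch under the assumption that the file holds a plain JSON array; the wrapper is adapted from the linked StackOverflow answer, since JsonUtility cannot parse a top-level array directly):

```csharp
using System.IO;
using UnityEngine;

public class StepHandler : MonoBehaviour
{
    private StepData[] steps;

    void Start()
    {
        // StreamingAssets is a real folder on device, so File IO works here.
        string path = Path.Combine(Application.streamingAssetsPath, "StepData.json");
        string json = File.ReadAllText(path);          // load the file into a string
        steps = JsonHelper.FromJson<StepData>(json);   // array -> StepData[]
    }
}

// Adapted from the StackOverflow answer: wrap the bare array in an object
// with an Items field so JsonUtility can deserialize it.
public static class JsonHelper
{
    public static T[] FromJson<T>(string json)
    {
        string wrapped = "{ \"Items\": " + json + "}";
        return JsonUtility.FromJson<Wrapper<T>>(wrapped).Items;
    }

    [System.Serializable]
    private class Wrapper<T>
    {
        public T[] Items;
    }
}
```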
I also added a simple counter to navigate the next and back buttons and to set the appropriate arrows and description on the panel. This is a very simple and fast script for this demo project.
Now go back to the Inspector and, on your Next and Back buttons, call the Navigate() function of StepHandler.cs. I use a simple bool flag: if it is checked, the Next button was pressed, otherwise the Back button.
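The navigation itself can be sketched like this (field names such as `arrows`, `currentIndex` and `descriptionText` are my assumptions; Unity’s OnClick lets you pass the bool argument straight from the Inspector):

```csharp
// true = Next button, false = Back button (set per-button in the Inspector)
public void Navigate(bool forward)
{
    // Hide the arrow for the step we are leaving.
    arrows[currentIndex].SetActive(false);

    // Move the counter and clamp it to the valid step range.
    currentIndex = Mathf.Clamp(currentIndex + (forward ? 1 : -1), 0, steps.Length - 1);

    // Show the new step's arrow and description.
    arrows[currentIndex].SetActive(true);
    descriptionText.text = steps[currentIndex].Description;
}
```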
Alrighty, everything is correctly set up, so we can now build and try again. I did not pay much attention to the solvers on the panels, so go ahead and tweak them to your liking. Once you scan the right QR code, you should see the model on your object with arrows pointing to the different points of interest. Close the QR code panel by pressing Close. You should also see a panel where the first step is already shown, with the Next button. Then navigate back and forth and watch the different arrows appear.
Below is the output video:
So that’s it- hope this was helpful. Let me know! The git project is here: https://github.com/NSudharsan/HoloLensExamples/tree/master/JSONInputAndQRCodeAlignment
Errors and solutions
Error: Insecure connection not allowed – while installing Microsoft.MixedReality.QR package into the project
Solution: temporarily change Player Settings – Configuration – Allow downloads over HTTP to “Always allowed”. This lets the NuGet package download. But change it back to “Not allowed” afterwards, as that is more secure.
Error: the model is aligned unevenly
Solution: when you scan the QR code, make sure you look at it straight on. The detected pose tends to be skewed when you don’t look at the code at eye level, e.g. when it lies on a table. Look at it from different angles until you are satisfied that the QR code prefab is aligned properly on the code.
2 thoughts on “Reading JSON input with Hololens2 (Bonus: QR code alignment on a real-world object with Open XR and MRTK V2.8)”
Hi Nischita, thank you so much for the detailed guidelines!
I am currently using Photon for a multiplayer game using hololens.
I didn’t have luck making the Azure Spatial Anchors work, so I am looking for a different method to anchor the shared experience in place, and I am following this to have the QR code as an alternative solution. 🙂
I was wondering how i should approach this tutorial if I want to assign a specific qr code for spawning an object like the red cube.
For instance, Vuforia uses a prefab that has the image marker and the game object together, so for me it was easy to understand which QR code is linked to what. In this case, I wasn’t sure where I can assign the QR code in the project.
Could you please help me understand this? Thank you!
Hi, if I understood you right, you could probably use the data on the QR code to match your GameObject. For example, if you create a QR code with the data (i.e. text) “Red Cube”, then you spawn that red cube when this particular QR code is scanned. There is no way of pre-assigning QR codes to assets, but after you scan a QR code, you can map the data on the code to your assets, and that should work.
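A rough sketch of that mapping (all names here are hypothetical, not from the project):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical spawner: maps QR code payload text to prefabs.
public class CodeSpawner : MonoBehaviour
{
    [SerializeField] private GameObject redCubePrefab;
    private Dictionary<string, GameObject> codeToPrefab;

    void Awake()
    {
        codeToPrefab = new Dictionary<string, GameObject>
        {
            { "Red Cube", redCubePrefab },
            // add more payload -> prefab pairs here
        };
    }

    // Call this from your QR tracking callback with the code's Data and pose.
    public void OnCodeScanned(string payload, Pose pose)
    {
        if (codeToPrefab.TryGetValue(payload, out var prefab))
            Instantiate(prefab, pose.position, pose.rotation);
    }
}
```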