
I thought I would share my insights on AR development so far. If you are used to VR programming with Unity3D, there are several pitfalls you can run into:
Lighting/Shadows
The first things you usually set up in a virtual scene are the main camera and the main light source. While this is the default setup of a newly created Unity3D scene for VR, it is not the case for AR. There are already several forum discussions about lighting in AR, so I will keep this short:
- Activate Receive Shadows on the game object of choice (of course, you will also need a shadow-casting object)
- Allow shadows in the project quality settings (the default setting for HoloLens is Fastest, which does not include shadows); a minimal script to check both points is sketched after this list
- Use a custom shader
- Do not forget that black is rendered as transparent, so you will need to set your shadow color to something different
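Since the first two points are easy to overlook, here is a minimal C# sketch (the component name and warning text are my own) that enables shadow receiving on an object and warns if the active quality level has shadows disabled:

```csharp
using UnityEngine;

// Minimal sketch: make sure an object can receive shadows and that the active
// quality level allows shadows at all (the HoloLens default "Fastest" does not).
public class ShadowSetup : MonoBehaviour
{
    void Start()
    {
        // Enable shadow receiving on this object's renderer.
        var meshRenderer = GetComponent<MeshRenderer>();
        if (meshRenderer != null)
        {
            meshRenderer.receiveShadows = true;
        }

        // Warn if shadows are disabled in the current quality settings.
        if (QualitySettings.shadows == ShadowQuality.Disable)
        {
            Debug.LogWarning("Shadows are disabled in the current quality level.");
        }
    }
}
```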
Colors/Visibility
While playing around with different colors to visualize cursors and pointers, I realized that some colors just do not work for AR. Depending on the background, i.e. your physical world, some colors will simply blend into the environment until the object is moved (common fate). Though I am a fan of colors, white turned out to be the only color that really stands out against various backgrounds and offers the best readability for text. You can improve this even further by using a darkened background. Again, black will not work, so you need a shade of gray. This results in a semitransparent backdrop that is reminiscent of looking through sunglasses. If you do use colors, choose a bright hue, e.g. light green as in the picture.
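As a concrete example, here is a small sketch of that setup, assuming a standard Unity UI Image as the backdrop and a Text label (the exact gray and alpha values are just my starting points to tune):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: white text on a semitransparent dark-gray backdrop,
// which stays readable against most physical backgrounds.
public class ReadablePanel : MonoBehaviour
{
    [SerializeField] private Image background; // backdrop behind the text
    [SerializeField] private Text label;       // the text itself

    void Start()
    {
        // Black would render as fully transparent, so use a dark gray instead.
        background.color = new Color(0.2f, 0.2f, 0.2f, 0.6f);
        label.color = Color.white;
    }
}
```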
Field-of-View (FOV)
Do not trust the advertising. The FOV is actually disappointingly small, so larger UI elements will keep disappearing from the user’s view. With the camera at the origin [0,0,0] and the UI at a distance of 90 cm in front of the user, the maximum height you can set for your interface is around 50 cm. Any larger UI will simply be cropped, which is very annoying for search tasks. Also consider the height at which you place your UI. Setting it too high can lead to neck strain, so you will need to find a neutral position. Be careful with this setting, though: in my experience, the actual height varies from user to user and depends on the HMD’s height during application startup. A trick to fix an uncomfortable position is to restart the application while shifting the HMD position in the opposite direction, e.g. ask your user to sit very straight when the UI was too low, and then shift back.
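A minimal sketch of this placement, assuming a world-fixed UI root that is positioned once at startup (the 90 cm distance comes from the discussion above; the script name and fields are mine):

```csharp
using UnityEngine;

// Minimal sketch: place a world-fixed UI 90 cm in front of the camera at startup,
// at head height, so it starts out inside the narrow FOV.
public class PlaceUIInFront : MonoBehaviour
{
    [SerializeField] private Transform uiRoot;       // root of the world-fixed UI
    [SerializeField] private float distance = 0.9f;  // 90 cm in front of the user

    void Start()
    {
        Transform cam = Camera.main.transform;

        // Position the UI in front of the user and turn it to face them.
        uiRoot.position = cam.position + cam.forward * distance;
        uiRoot.rotation = Quaternion.LookRotation(cam.forward, Vector3.up);
    }
}
```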
Distance for Interaction
In case you plan to include hand tracking to manipulate your UI, be aware of the following limitations: arm length and clipping plane distance. It might sound hilarious, but I encountered cases in which the user’s arm was too short to reach the UI from the starting position. As UIs are world-fixed, i.e. they do not move with the user, the user can simply step closer. This is where your near clipping plane can kick in and clip away your entire UI. Therefore, I do not entirely agree with the suggested near clipping plane distance of 50 cm. In fact, even a distance of 30 cm can sometimes be too large. Be aware, though, that coming too close to a UI can destroy the stereoscopic effect, resulting in you seeing two overlapping UIs instead of one.
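Adjusting this is essentially a one-liner; here is a sketch that lowers the near clipping plane at startup (the 30 cm value is taken from the discussion above and may still be too large for your setup):

```csharp
using UnityEngine;

// Minimal sketch: lower the near clipping plane so a world-fixed UI is not
// clipped away when the user steps closer to reach it.
public class NearClipSetup : MonoBehaviour
{
    [SerializeField] private float nearClip = 0.3f; // 30 cm; tune for your UI

    void Start()
    {
        Camera.main.nearClipPlane = nearClip;
    }
}
```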
Deployment/Performance
It should be obvious, but it never hurts to be on the safe side: do not deploy in debugging mode when running calculation- or network-heavy code. While having a debugging console can be really handy, you would not believe the impact it can actually have on performance.
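One way to keep the convenience of logging without paying for it in release builds is conditional compilation; a sketch of that idea (the wrapper class is my own, DEVELOPMENT_BUILD and UNITY_EDITOR are Unity's standard defines):

```csharp
using System.Diagnostics;

// Minimal sketch: verbose logging that is stripped entirely from release builds,
// so it cannot slow down calculation- or network-heavy code in the field.
public static class DebugLog
{
    // Calls to this method are removed by the compiler unless one of the
    // listed symbols is defined (i.e. in development builds or the editor).
    [Conditional("DEVELOPMENT_BUILD"), Conditional("UNITY_EDITOR")]
    public static void Verbose(string message)
    {
        UnityEngine.Debug.Log(message);
    }
}
```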
Should I come across any more insights, I will keep this post updated. In the meantime, feel free to add your own experiences and suggestions in the comments.