HoloLens: The First Few Weeks


I have to admit, the first time I tried HoloLens at //Build in 2015, I was underwhelmed. The demo was scripted, and the product was clearly not complete. There was literally a man behind the curtain debugging the device as we went through the motions.

The second time I got to try one was at //Build 2016, where I was given the chance to participate in the "HoloLens Experience" demo. It was a one-on-one session with the new dev kits in a room set up to look like your average living room. Holograms were pre-placed in the room and the guide started me on my journey. The guide was nice enough to let me just do my thing, so I was able to explore the holographic world a little bit and interact with the device as a user would. This experience was clearly much more engaging than the first, and I walked away feeling like Microsoft may have a decent product on their hands, yet I remained skeptical.

It wasn't until two dev kits showed up in the office that I really got a taste for the capabilities of this device. My first order of business was to walk through the library of demo apps that Microsoft has in the Store. I was blown away by how well it actually worked in the real world. The curated experiences really show off the potential, and the flaws of this early dev kit melt away.


But if we put our engineering hats on, it's easy to see the potential. And the limitations. For starters, there is a low-level Direct3D HolographicSpace exposed to give you almost infinite fine-grained control over your application, and yet large pieces of .NET API functionality simply do not exist. In many cases the underlying APIs are there but the shell isn't finished.
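
To make that concrete, here is a minimal C++/WinRT sketch of bootstrapping a HolographicSpace. Direct3D device setup and actual rendering are omitted for brevity; a real app must attach a device (via SetDirect3D11Device) before frames can present content.

    // Minimal app view: create a HolographicSpace from the core window
    // and pump frames. D3D device setup and rendering are omitted.
    #include <winrt/Windows.ApplicationModel.Core.h>
    #include <winrt/Windows.Graphics.Holographic.h>
    #include <winrt/Windows.UI.Core.h>

    using namespace winrt;
    using namespace Windows::ApplicationModel::Core;
    using namespace Windows::Graphics::Holographic;
    using namespace Windows::UI::Core;

    struct AppView : implements<AppView, IFrameworkView>
    {
        HolographicSpace m_space{ nullptr };

        void Initialize(CoreApplicationView const&) {}
        void Load(hstring const&) {}
        void Uninitialize() {}

        void SetWindow(CoreWindow const& window)
        {
            // One HolographicSpace per core window; this object is the
            // gateway to per-frame head-pose predictions and cameras.
            m_space = HolographicSpace::CreateForCoreWindow(window);
        }

        void Run()
        {
            CoreWindow window = CoreWindow::GetForCurrentThread();
            window.Activate();
            while (true)
            {
                window.Dispatcher().ProcessEvents(
                    CoreProcessEventsOption::ProcessAllIfPresent);
                // Each frame carries the prediction used to position
                // holograms for the instant they will be displayed.
                HolographicFrame frame = m_space.CreateNextFrame();
                frame.PresentUsingCurrentPrediction();
            }
        }
    };

    struct AppViewSource : implements<AppViewSource, IFrameworkViewSource>
    {
        IFrameworkView CreateView() { return make<AppView>(); }
    };

    int __stdcall wWinMain(void*, void*, wchar_t*, int)
    {
        init_apartment();
        CoreApplication::Run(make<AppViewSource>());
        return 0;
    }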

So here are a few things to consider when developing for the HoloLens:

2D and 3D experiences don’t mix well.

I was amazed how easy it was to get a run-of-the-mill UWP app running, but transitioning from the 2D space to a 3D space is still a bit jarring for users, and you can't yet view one inside the other. I suspect this will change dramatically in future versions, but for now it limits the types of app experiences that can be built. You are essentially required to choose one or the other, but check out our blog post on hybrid experiences.

Treat it like a phone.

The HoloLens has limited RAM, and its memory profile seems closer to a phone's than to a full-featured desktop's. It is impressive what it can do, but you'll start seeing memory warnings and accelerated garbage collection around 128 MB.
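
If you want to watch this yourself, UWP exposes the app's usage and the platform-enforced limit through Windows.System.MemoryManager. A minimal C++/WinRT sketch follows; note the 128 MB figure above is my own observation, not a documented constant.

    // Watch app memory against the platform-reported limit so caches
    // can be trimmed before the OS steps in.
    #include <winrt/Windows.System.h>

    using namespace winrt;
    using namespace Windows::System;

    void MonitorMemory()
    {
        // Current numbers, in bytes.
        uint64_t usage = MemoryManager::AppMemoryUsage();
        uint64_t limit = MemoryManager::AppMemoryUsageLimit();
        (void)usage; (void)limit; // log or display as needed

        MemoryManager::AppMemoryUsageIncreased([](auto&&, auto&&)
        {
            // High (or OverLimit) is the cue to release textures,
            // meshes, and other large buffers.
            if (MemoryManager::AppMemoryUsageLevel() >= AppMemoryUsageLevel::High)
            {
                // TrimCaches(); // hypothetical app-side helper
            }
        });
    }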

HoloLens is designed for a contextual audio experience.

In some cases, the fact that the shell isn't finished doesn't really matter. Typing on the holographic keyboard is borderline impossible, and you can't guarantee your users will have a physical keyboard. Microsoft has an app that lets you use your phone or PC as a remote keyboard, but if you can rely on an audio experience instead, I highly recommend it. Some considerations still exist (see the sketch after this list), like:

  • educating your users how to interact with your app using audio
  • accounting for distortion in a loud space - both with inbound commands and spatial audio
  • accessibility issues
  • the maturity of the speech APIs and their ability to understand your users
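
On that last point, constraining the recognizer to a short command list goes a long way toward reliable recognition in a noisy room. Here's a minimal C++/WinRT sketch using the UWP Windows.Media.SpeechRecognition API; the command words are placeholder examples.

    // Constrained speech recognition: a short command list is far more
    // robust in a loud space than open-ended dictation.
    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.Media.SpeechRecognition.h>

    using namespace winrt;
    using namespace Windows::Media::SpeechRecognition;

    fire_and_forget ListenForCommandAsync()
    {
        SpeechRecognizer recognizer;
        recognizer.Constraints().Append(
            SpeechRecognitionListConstraint({ L"place", L"rotate", L"remove" }));
        co_await recognizer.CompileConstraintsAsync();

        SpeechRecognitionResult result = co_await recognizer.RecognizeAsync();
        if (result.Confidence() == SpeechRecognitionConfidence::High ||
            result.Confidence() == SpeechRecognitionConfidence::Medium)
        {
            // result.Text() holds the matched command.
        }
        // Low/Rejected results should prompt the user to try again.
    }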

A WiFi connection is basically required.

The 3D spatial mapping tracks your rooms based on the WiFi connection. In fact, if you open the dev portal, you can see that spatial maps are stored per WiFi network.
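
Holograms you persist are saved as spatial anchors in the device's anchor store, which lives alongside those maps. As a rough C++/WinRT sketch (the "chair" anchor name is a hypothetical example):

    // Persisted holograms ride on spatial anchors saved to the
    // device's anchor store and come back via GetAllSavedAnchors().
    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.Perception.Spatial.h>

    using namespace winrt;
    using namespace Windows::Perception::Spatial;

    fire_and_forget PersistAnchorAsync(SpatialAnchor anchor)
    {
        SpatialAnchorStore store =
            co_await SpatialAnchorManager::RequestStoreAsync();

        // TrySave fails if the key already exists or the anchor
        // can't be persisted right now.
        bool saved = store.TrySave(L"chair", anchor);
        (void)saved;
    }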

It is an indoor device.

The spatial mapping is absolutely excellent indoors. It uses the same depth-sensing technology as the Kinect to define your space, composing it from a combination of different camera types. The sensors simply cannot handle looking at the sun, and even if they could, the device still relies on finite-distance calculations to position objects in real space. Not to mention no one wants their brand-new $3,000 toy to get wet.
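
For a sense of how an app consumes that mapping, here is a minimal C++/WinRT sketch that observes surface patches around a coordinate system; the 5-meter cube and mesh density are arbitrary example values.

    // Observe the spatial-mapping mesh around a coordinate system.
    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Foundation.Collections.h>
    #include <winrt/Windows.Foundation.Numerics.h>
    #include <winrt/Windows.Perception.Spatial.h>
    #include <winrt/Windows.Perception.Spatial.Surfaces.h>

    using namespace winrt;
    using namespace Windows::Perception::Spatial;
    using namespace Windows::Perception::Spatial::Surfaces;

    fire_and_forget ObserveSurfacesAsync(SpatialCoordinateSystem coords)
    {
        // Spatial perception is a user-consented capability.
        auto access = co_await SpatialSurfaceObserver::RequestAccessAsync();
        if (access != SpatialPerceptionAccessStatus::Allowed)
            co_return;

        SpatialSurfaceObserver observer;
        // Watch a 5 m cube around the origin of the coordinate system.
        SpatialBoundingBox box{ { 0.f, 0.f, 0.f }, { 5.f, 5.f, 5.f } };
        observer.SetBoundingVolume(
            SpatialBoundingVolume::FromBox(coords, box));

        observer.ObservedSurfacesChanged(
            [](SpatialSurfaceObserver const& sender, auto&&)
        {
            // Each observed surface is a mesh patch; ask for its latest
            // triangles at a chosen density (triangles per cubic meter).
            for (auto const& pair : sender.GetObservedSurfaces())
            {
                SpatialSurfaceInfo info = pair.Value();
                info.TryComputeLatestMeshAsync(1000.0);
            }
        });
    }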

The limited field of view is not as significant as you've heard.

The number one complaint people have when they first use the HoloLens is the limited field of view. Because the HoloLens displays are waveguides, there is a physical limit on the amount of light a given material can bend into your pupil. As Oliver Kreylos points out on his public blog here:

Concretely, the maximum balanced field of view for a wave guide made of material with index of refraction n=1.7 is 36.1°, and given that typical optical glass (crown or flint) has somewhere from n=1.5 to n=1.6, this matches my FoV estimate of 30° rather well. High-end materials with n=1.85 (which is close to the physical limit for glass) would have a balanced field of view of 47°. Diamond, which has n=2.4, would probably yield a comfortably large FoV, but given the required size of the waveguide, I imagine only Bill Gates could afford that.

Materials science is an interesting space, and I suspect the field of view will expand in the future. I've found that when you are in a compelling experience and can move around it at will, the lack of peripheral vision does not detract from the experience. In fact, I've heard from people who struggle with the uncanny valley and motion sickness on true VR devices that they have a much more comfortable experience here, since it is not fully immersive.

But for some users the interpupillary distance matters.

When you first fire up the device, it walks you through a setup experience to calibrate the IPD. In practice, I've found that getting this calibration right really affects the quality of the experience, especially for first-timers.

The device is heavy, expensive and fragile.

While you personally may not find the HoloLens uncomfortable, it is easy for users to get fatigued, so experiences should be limited in scope to low-impact activity for now. That being said, I can see a wide range of accessibility-based applications being developed.


I think Microsoft has done a great job of giving you initial tools to get started, but the development model feels a bit like building a house where your only tools are a screwdriver and a hammer. But hey - we are in undiscovered country here, and I have no doubt the well-funded enthusiasm will produce compelling app and programming models soon. Stay tuned to learn more about what we find at InfernoRed.