Microsoft HoloLens 2 – experiencing the future today

I have been very enthusiastic about HoloLens 2 devices ever since I first saw them demonstrated in 2019. I have stayed up to date on HoloLens throughout these two years – especially thinking and learning about use cases in different industries and about bringing Line of Business applications to the device. The latest big news was the announcement of Microsoft Mesh at Ignite 2021 Spring, along with the Mesh preview application for HoloLens 2. With some luck I managed to get a HoloLens 2 to test out for a short period of time.

As a sci-fi fan, actually using HoloLens 2 felt very futuristic. Compared to the first version, this one recognizes far more of your hand and finger movements. Seeing the Windows logo appear on your wrist and pressing it to open the main menu (or using a one-hand gesture to open and close the menu) is a good example of that.

Moving screens and apps around, pinching and grabbing objects, turning them around – it feels like magic because it is really smooth. And you only wear the headset, which holds all the sensors (like Kinect) that make this possible. The field of view has its limits, but it felt rather good compared to the previous version (from memory). I did of course take the device to my backyard to test it outside while the sun was rising behind the house. The display was very clear and crisp – just like using it inside.

That is of course all about the awesome tech it has. And yes, I am still very hyped about it.

HoloLens 2 is easy to wear and I didn’t think it was too heavy. The closest comparison would be a bicycle helmet – without the chin strap. And compared to something like the PlayStation VR headset, there is no comparison: HoloLens 2 is much, much more comfortable, and it is very easy to put on and take off.

Mesh Preview app

Of course the first thing I wanted to try out was the Mesh preview application. While it was fun to experiment on my own, it doesn’t really open up and show what you can do until you get someone else to test it out with you – from a different country. Asking around my network, I found Joost, who was willing to set aside some time so we could try it out together.

It was surely fun to test out the collaboration in Mesh. Of course it is just a demo app, but I think it showed several great possibilities of what HoloLens 2 and Mesh offer together: holoprojected avatars with facial and hand expressions and, especially, eye contact.

Since HoloLens can track eye movements, it was a very sci-fi moment (again) to be able to talk to Joost while we faced each other. You can see whether people are looking at you – so you can make eye contact. That, together with mouth movements while speaking, made it really immersive. You forget very easily that the other person does not see what you are seeing. Instead, they see their own room and physical space, and I see what is at my end. However, we both occupy the same virtual space and can interact with all the virtual objects in it, including models the other person uploads to the space.

Joost strikes a cool pose in the picture. It also shows how well the Mesh app conveys people’s expressions and hand motions – HoloLens tracks individual fingers easily. It is very, very awesome. Did I already mention the sci-fi feeling?

The background in these pictures of Joost’s avatar is a carefully selected part of my home.

This is another great expression. All of this makes the interaction very different from what you get in regular online meetings. The audio is also spatial, so it follows the person, and you can hear that through the HoloLens built-in speakers.

I kept my typical Finnish expressions in the photos we took in the app. As you can see, this is the room Joost was in – I didn’t see any of his background during the meeting, only afterwards when he sent me the photos he took.

Nope, no selfies in the Mesh Preview app.

For a moment I don’t look so serious there! I see I chose a bit more beard for my avatar than I thought, but what I like about this tech is how it shows where I am looking – straight at the camera in this case.

Joost shared with me a link to a video about the Mixed Reality application that Velicus is building: CPR training. That is a really nice example of how Mixed Reality can help people learn, and of its possibilities in general.

More Mesh app testing

So what can you do in the Mesh application? You can upload 3D models (like a model of the Earth – though for business use these could be models of your products or other artifacts), upload photos, and of course highlight areas with inking. Since the ink exists in three-dimensional space, it can be used to create effects you can walk through. And that is the cool thing: those annotations are overlaid (augmented) onto the physical world (augmented reality). You see them through the HoloLens and can interact with the elements you added via the app. It makes it possible – and fun – to utilize the space you are in (together with the other participant) to move objects around, group things together and check out those 3D elements. And yes, there is a virtual table that fits your room – sometimes it can be a bit oddly located, but moving it is easy and quick.

Just fooling around with the Mesh app outside, finding a good spot for the table while playing around with the tools.

Testing calls to Microsoft Teams

Since I am passionate about modern (and futuristic) work, I of course had to try Remote Assist with a trial license. Calling someone on Teams from the device and being in the call with them was quite a nice experience.

Remote Assist has been around since HoloLens 1, but I recall it has taken a big leap forward between my tests back then and this one. The experience was really smooth and fun too.

Being able to see what I see gives the Teams participant the ability to guide or evaluate what I do. Adding pictures (instructions or schematics) and placing annotations on physical objects is a very, very fun and cool way to work.

There are lots of great applications that can be used to test HoloLens 2’s capabilities and limits. Sadly, the Ignite Pokémon demo isn’t available.

A bit of other testing

There are several free apps in the Microsoft Store that can be used to test what HoloLens 2 is capable of – like exploring human anatomy.

What is not available in real life (well, not this cleanly anyway) is the ability to “go inside” the model and examine details. Imagine this being augmented with a person’s medical information and findings, modeled into the app.

And it comes down to the details: how well HoloLens 2 can track your eyes (look at an object and then act on it) and finger movements (moving sliders), combined with voice commands.

Conclusion and “the future is here”

Using HoloLens 2 really makes one understand how it could benefit several industries and business scenarios. Manufacturing, maintenance and training are the first and most common examples, but there is also remote control and overlaying designs onto a physical place (remember that the augmented world can be spatially attached to a location). Walking around a hall and reviewing what the end result will be before construction has even started allows you to spot potential design errors in advance. How about warehouse workers? Doctors and nurses, and so on.

The big shockwave is yet to come – sooner rather than later. Since Microsoft Mesh was announced at Ignite 2021 Spring, people have been waiting to see how those super-cool features will work: meetings with holoportation and projection, everyone sitting around the same table with realistic holograms representing people in different physical locations. Extended Reality, the future, is here sooner than we think. While HoloLens 2 is out of reach for most people, there are already options like AltspaceVR, which is available for Quest (and other VR headsets) and also as a PC application. And Microsoft Mesh is coming to AltspaceVR (now in private preview).