When you build applications for HoloLens, in most cases you will need to implement some form of interaction with the holographic objects. Normally you use hand gestures in combination with the gaze. The gaze is the direction in which you, or actually the HoloLens, is pointing. The idea is that you move the object with the gaze and use hand gestures to select or deselect it.

For example, moving a holographic object would go something like this: target it with the gaze by looking at it, select it with a hand gesture to enter the move state, move it with the gaze (by moving your head), and select it again with the hand gesture to place it at the current position.

The point is that moving an object with your head (using the gaze) does not feel natural. The gaze is better suited for other actions, like aiming at a target or highlighting part of a group of objects. Moving something around feels more natural when you do it by hand. Using the gaze also means you have to walk around to place a holographic object at a certain location. Does that sound natural?
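To make that gaze-and-gesture flow concrete, a minimal sketch could look like the code below. This is an assumption on my side, not the script we build later in this post: it uses the GestureRecognizer from the HoloLens-era UnityEngine.VR.WSA.Input namespace (later Unity versions moved it to UnityEngine.XR.WSA.Input with different event signatures), a hypothetical GazeMoveExample class, and an arbitrary placement distance of two meters.

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class GazeMoveExample : MonoBehaviour
    {
        private GestureRecognizer recognizer;
        private GameObject selectedObject = null;

        void Start()
        {
            // Recognize the air-tap gesture and use it to toggle the "move" state
            // of whatever object the gaze is currently hitting.
            recognizer = new GestureRecognizer();
            recognizer.SetRecognizableGestures(GestureSettings.Tap);
            recognizer.TappedEvent += (source, tapCount, headRay) =>
            {
                if (selectedObject != null)
                {
                    // Second tap: drop the object at its current position.
                    selectedObject = null;
                }
                else
                {
                    // First tap: select the object the gaze is pointing at.
                    RaycastHit hit;
                    if (Physics.Raycast(Camera.main.transform.position,
                                        Camera.main.transform.forward, out hit))
                    {
                        selectedObject = hit.collider.gameObject;
                    }
                }
            };
            recognizer.StartCapturingGestures();
        }

        void Update()
        {
            // While selected, the object follows the gaze at a fixed distance,
            // so you move it by turning your head and walking around.
            if (selectedObject != null)
            {
                selectedObject.transform.position =
                    Camera.main.transform.position + Camera.main.transform.forward * 2.0f;
            }
        }
    }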

Start building the HoloLens app

In this example I will show you how to move a holographic object around with your hand. I have left out the select and deselect functionality for simplicity. For the holographic object we will use a primitive sphere, accompanied by a TextMesh that displays the position and velocity next to it.


First of all we need a new HoloLens app project built with Unity. The project needs to be configured with the default settings that make the build suitable for HoloLens. The scene contains nothing more than a main camera, a directional light and an empty GameObject renamed to “Managers”.


Secondly we add a new C# script called “FollowHand.cs” to the assets. This script is added as a component to the empty GameObject “Managers”.


Getting more detailed information

UnityEngine has a class called InteractionManager which allows you to get more detailed information about the input mechanism, its position and its velocity. The InteractionManager class has a number of static events which can be used to determine whether a specific source has been detected, lost, pressed, released or updated. A source can be a hand, voice, a controller or something else. More information can be found in the Unity scripting documentation for InteractionManager.
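To give an impression of what that looks like in code, the sketch below simply logs the source kind for each of the five static events. I am assuming the InteractionManager from the HoloLens-era UnityEngine.VR.WSA.Input namespace here (newer Unity versions moved it to UnityEngine.XR.WSA.Input with different event arguments), and the class name SourceEventLogger is just a placeholder.

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;

    public class SourceEventLogger : MonoBehaviour
    {
        void Start()
        {
            // Each static event hands over an InteractionSourceState;
            // state.source.kind tells you whether the source is a Hand,
            // Voice, Controller or Other.
            InteractionManager.SourceDetected += state => Debug.Log("Detected: " + state.source.kind);
            InteractionManager.SourceLost     += state => Debug.Log("Lost: " + state.source.kind);
            InteractionManager.SourcePressed  += state => Debug.Log("Pressed: " + state.source.kind);
            InteractionManager.SourceReleased += state => Debug.Log("Released: " + state.source.kind);
            InteractionManager.SourceUpdated  += state => Debug.Log("Updated: " + state.source.kind);
        }
    }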

We will be using the SourceLost, SourceDetected and SourceUpdated events. The first two events are used to show or hide the holographic objects. The third event is used to update the location of the holographic objects to the location of your hand.

Implement code for holographic objects

Before we can implement the different events, we need some methods for creating the primitive sphere and the TextMesh object. Add the following code to the “FollowHand” script file.

    private GameObject indicator = null;
    private TextMesh textMesh = null;

    private void CreateIndicator()
    {
        if (indicator == null)
        {
            indicator = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            indicator.transform.localScale = new Vector3(0.04f, 0.04f, 0.04f);
        }
    }

    private void UpdateIndicator(Vector3 position)
    {
        if (indicator != null)
        {
            indicator.transform.position = position;
        }
    }

    private void CreateText()
    {
        GameObject text = new GameObject();
        textMesh = text.AddComponent<TextMesh>();
        text.transform.localScale = new Vector3(0.01f, 0.01f, 0.01f);
    }

    private void UpdateText(Vector3 position, Vector3 velocity)
    {
        if (textMesh != null)
        {
            position = new Vector3(position.x, position.y + 0.1f, position.z);

            textMesh.gameObject.transform.position = position;
            var gazeDirection = Camera.main.transform.forward;
            textMesh.gameObject.transform.rotation = Quaternion.LookRotation(gazeDirection);
            textMesh.text = string.Format("Position:{0:0.00},{1:0.00},{2:0.00}\n Velocity: {3:0.00},{4:0.00},{5:0.00}", position.x, position.y, position.z, velocity.x, velocity.y, velocity.z);
        }
    }

    public void ShowObjects(bool show)
    {
        if (indicator != null && textMesh != null)
        {
            indicator.SetActive(show);
            textMesh.gameObject.SetActive(show);
        }
    }

The CreateIndicator() method creates the holographic object that we want to move around by hand. For now we use a primitive sphere, scaled down to make it smaller. The position of this primitive is set by calling the UpdateIndicator() method.

The CreateText() method creates a GameObject with a TextMesh component, which is also scaled down. The TextMesh is used by the UpdateText() method to display the position and velocity of the hand next to the primitive. Because the text needs to stay readable for the user of the HoloLens, it is rotated with Quaternion.LookRotation() based on the direction of the gaze.
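If the text turns out hard to read on the device, a few extra TextMesh properties can be set in CreateText(). The values below are only a suggestion of mine, not part of the original script; a large font size combined with a small character size usually keeps the rendered text crisp.

    // Optional tuning inside CreateText(), after the AddComponent call.
    textMesh.fontSize = 48;            // render the font at a high resolution
    textMesh.characterSize = 0.1f;     // scale it back down in world space
    textMesh.anchor = TextAnchor.MiddleCenter;
    textMesh.color = Color.white;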

The last method, ShowObjects(), shows or hides the primitive and the TextMesh.

To make sure that both the primitive and the TextMesh are created, call the create methods in the Start() method of the script.

    void Start ()
    {
        CreateIndicator();
        CreateText();
    }

Implement the InteractionManager events

We need to add the subscriptions to the different events to the same Start() method. As said, we only use the SourceLost, SourceDetected and SourceUpdated events. In this example we never remove the subscriptions again. In your own projects, make sure to unsubscribe from these events when you are finished with them; a sketch of that is shown after the subscription code below.

    void Start ()
    {
        ...

        InteractionManager.SourceLost += InteractionManager_SourceLost;
        InteractionManager.SourceDetected += InteractionManager_SourceDetected;
        InteractionManager.SourceUpdated += InteractionManager_SourceUpdated;
    }
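A minimal sketch of what that unsubscribing could look like, assuming you do it when the “Managers” object is destroyed:

    void OnDestroy()
    {
        // Unsubscribe from the static InteractionManager events so this
        // instance is not kept alive after the Managers object is destroyed.
        InteractionManager.SourceLost -= InteractionManager_SourceLost;
        InteractionManager.SourceDetected -= InteractionManager_SourceDetected;
        InteractionManager.SourceUpdated -= InteractionManager_SourceUpdated;
    }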

The implementations of the SourceDetected and SourceLost events are really simple. We check whether the source is a hand, and if so we call ShowObjects() to show or hide the holographic objects.

    private void InteractionManager_SourceDetected(InteractionSourceState state)
    {
        if (state.source.kind == InteractionSourceKind.Hand)
        {
            ShowObjects(true);
        }
    }

    private void InteractionManager_SourceLost(InteractionSourceState state)
    {
        if (state.source.kind == InteractionSourceKind.Hand)
        {
            ShowObjects(false);
        }
    }

The SourceUpdated event handler contains the actual code to retrieve the position and velocity of the hand. This method is called whenever one of those two values changes.

    private void InteractionManager_SourceUpdated(InteractionSourceState state)
    {
        if (state.source.kind == InteractionSourceKind.Hand)
        {
            Vector3 handPosition;
            Vector3 handVelocity;

            state.properties.location.TryGetPosition(out handPosition);
            state.properties.location.TryGetVelocity(out handVelocity);

            UpdateText(handPosition, handVelocity);
            UpdateIndicator(handPosition);
        }
    }

Through the InteractionSourceState, its InteractionSourceProperties and the InteractionSourceLocation we call the methods TryGetPosition() and TryGetVelocity(). The Vector3 values returned are handed over to UpdateText() and UpdateIndicator(). This makes the primitive appear around the palm of the hand and updates the values shown in the TextMesh.
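Both Try methods return a bool that indicates whether the value was actually available for this update. If you want to be defensive, you could only move the holograms when both calls succeed; a variation of the handler along these lines (my own sketch, not the original code):

    private void InteractionManager_SourceUpdated(InteractionSourceState state)
    {
        if (state.source.kind == InteractionSourceKind.Hand)
        {
            Vector3 handPosition;
            Vector3 handVelocity;

            // Only update the holograms when both values could be retrieved.
            if (state.properties.location.TryGetPosition(out handPosition) &&
                state.properties.location.TryGetVelocity(out handVelocity))
            {
                UpdateText(handPosition, handVelocity);
                UpdateIndicator(handPosition);
            }
        }
    }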

Finish up and build the app

Now build and run the application for HoloLens. As soon as you move your hand into the view of the camera and use a gesture, the hand is recognized and tracked. The primitive will follow the palm of your hand while the TextMesh shows you the real-world coordinates of the hand and its velocity along the x, y and z axes. If you want to position the object more in front of the hand or the fingers, just change the position and rotation based on the hand position you get from TryGetPosition(), as in the sketch below.
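For example, a hypothetical variation of UpdateIndicator() could offset the sphere a little in front of the hand along the gaze direction; the 10 cm offset below is just an assumption to illustrate the idea.

    private void UpdateIndicator(Vector3 handPosition)
    {
        if (indicator != null)
        {
            // Push the sphere 10 cm away from the palm, along the gaze direction,
            // and rotate it to face the same way the user is looking.
            Vector3 offset = Camera.main.transform.forward * 0.1f;
            indicator.transform.position = handPosition + offset;
            indicator.transform.rotation = Quaternion.LookRotation(Camera.main.transform.forward);
        }
    }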

[su_youtube url="https://youtu.be/C2gLhwblQwQ"]

8 COMMENTS

  1. Hi! Good work.
    A question: is it possible to get the hand rotation? For example: state.properties.location.TryGetRotation(out handRotation); or something similar.
    Thanks
    Luca

    • Not at the moment. Hopefully it will be available with the new gestures in the new version of the HoloLens software

  2. Hi Alexander,
    I read your post and tried it and it's really cool, but one problem occurred. When I rotate my head, e.g. 90°, the hand cursor is not displayed correctly anymore.
    Do you know how to solve this issue?
    Greetz,
    Chris

    • When you rotate your head, you also rotate the HoloLens, and HoloLens will not recognize the tap or bloom gesture anymore. In the new pre-release software update for HoloLens your hands are always recognized. You are then able to implement your own gestures even when your head is turned.

  3. Hello, this is 오현덕, learning HoloLens in Korea!

    Your post helped me a lot.

    Thank you!

  4. Hi Alex, I just tried your script. It's pretty cool, but I wonder: can we track both hands using this script?
    Also, it seems like lines 24 and 25 are causing errors because AddComponent() requires a parameter. What's the best fit there? I tried TextMesh and Vector3 but neither worked.

    Thank you!

    • It seems that some code was missing. I have changed the code in the CreateText method, which will resolve your AddComponent error. With regards to tracking both hands, that is somewhat tricky. The problem is that you do not get back which hand you are tracking. A solution could be to check the location within the view: if it is to the left of the middle, it is probably the left hand, and the same for the right hand on the right side. Hopefully that helps you further; a sketch of the idea follows below.
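    A rough sketch of that idea (a hypothetical helper of mine, not part of the original script): transform the hand position into camera space and look at the sign of the x coordinate.

        // Hypothetical helper: guess which hand a source belongs to by
        // checking on which side of the gaze it sits. Positions left of the
        // camera's centre line are most likely the left hand.
        private bool IsProbablyLeftHand(Vector3 handPosition)
        {
            Vector3 local = Camera.main.transform.InverseTransformPoint(handPosition);
            return local.x < 0f;
        }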
