Using Microsoft® Kinect® (unofficial)

NOTE: IntuiLab no longer provides out-of-the-box support for Microsoft Kinect. No additional features or bug fixes will be made for Kinect v1, and no support will be offered for Kinect v2. We have open sourced our legacy support and posted it to GitHub. Feel free to contribute! This page is kept solely for those creating their own Kinect support based on the code posted to GitHub.

    Introduction

    The Microsoft Kinect Interface Assets enable you to control an IntuiFace experience using the Microsoft Kinect sensor. IntuiFace provides a set of gestures and postures as well as presence detection to trigger any action in your experience. You can also manipulate assets directly using hand pointing.

    NOTE:

    • IntuiFace currently supports Kinect v1. Kinect v2 support is currently being developed but no delivery date has been planned.
    • Either Composer Pro or Enterprise is required to create and save experiences using Microsoft Kinect. However, to preview Kinect support you can use any edition, including Composer Free. Composer Free users won't be able to save their work but they will be able to experiment with Kinect in pre-built experiences such as the Kinect sample experience found on the Marketplace tab of Composer's Experiences panel. Pro and Enterprise users can make and save any changes they wish.
    • Only Kinect for Windows is supported. Kinect for Xbox 360 is not supported due to usage restrictions imposed by Microsoft; only Kinect for Windows can be used for commercial purposes.
    • Any trigger/action pair you create for a Kinect Interface Asset is globally accessible across all scenes within an experience. This means that if you want a given gesture or posture to result in different actions depending on the scene, you must
      1. Add to your experience multiple instances of the Kinect IA containing the gesture(s) or posture(s) you wish to use in multiple ways.
      2. Configure each instance of this IA to represent a different mapping of triggers to actions - e.g. one IA instance maps Swipe left to the Next scene navigation action while another IA instance maps Swipe left to the Next page action for a book.
      3. In Composer's Edit Mode, disable all Kinect IA instances by deselecting the various Enable properties in the lower Properties panel.
      4. Use the Entering trigger of a scene to enable the appropriate Kinect IA instance. Use the Leaving trigger of a scene to disable that same instance.
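    The four steps above amount to a per-scene dispatch table: several instances of the same IA, each with its own trigger-to-action mapping, only one of which is enabled at a time. A minimal sketch of that idea in Python (the class and function names are hypothetical, not IntuiFace API):

```python
# Hypothetical sketch of the per-scene mapping described in steps 1-4.
# Gesture names mirror the document; nothing here is IntuiFace API.

class KinectIAInstance:
    """One IA instance: a gesture-to-action mapping plus an Enable flag."""
    def __init__(self, mapping):
        self.mapping = mapping   # e.g. {"Swipe left": next_scene_action}
        self.enabled = False     # step 3: disabled in Edit Mode

    def on_gesture(self, gesture):
        if self.enabled and gesture in self.mapping:
            self.mapping[gesture]()

# Step 2: two instances map the same gesture to different actions.
log = []
scene1_ia = KinectIAInstance({"Swipe left": lambda: log.append("Next scene")})
scene2_ia = KinectIAInstance({"Swipe left": lambda: log.append("Next page")})
instances = [scene1_ia, scene2_ia]

# Step 4: a scene's Entering trigger enables its instance and the
# Leaving trigger of the previous scene disables the others.
def enter_scene(active_ia, all_ias):
    for ia in all_ias:
        ia.enabled = (ia is active_ia)

enter_scene(scene1_ia, instances)
scene1_ia.on_gesture("Swipe left")   # runs the "Next scene" action
scene2_ia.on_gesture("Swipe left")   # ignored: this instance is disabled
enter_scene(scene2_ia, instances)
scene2_ia.on_gesture("Swipe left")   # runs the "Next page" action
```

    The same Swipe left gesture thus produces a different action in each scene, while disabled instances stay silent.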

    To use the various Kinect interfaces you must first install Kinect for Windows Runtime v1.8 or later, available on the Microsoft support website. If Composer is running during installation of the Kinect runtime, you must stop and restart it for the changes to take effect.


    There are six available Kinect Interface Assets, accessible in the Select an interface panel.

    Each Interface Asset is equipped with a Design Accelerator enabling you to verify that your Kinect sensor is properly detected. Insert a Kinect interface into your scene, double-click the newly created Interface Asset, then enter Play Mode.

    Here is some advice from Microsoft for optimal use of the Kinect sensor:

    • The user should be standing between 80cm and 4m (30 inches to 13 feet) from the Kinect sensor
    • Limit light sources directed toward the Kinect sensor. Since the sensor uses infrared, too much light could prevent it from working properly
    • The Kinect sensor must see at least the user's head and knees to detect him/her properly.

    Kinect Settings

    Use this Interface Asset only if you need to change default settings of the Kinect sensor.

    Properties:

    • Stream --> select the stream to display onscreen. You can choose between the following:
      • Color --> the standard camera image.
      • Depth --> the depth image. The grey level indicates distance from the sensor.
      • None --> use this if you only want to display the skeleton. Assumes the Skeleton on stream property (see next) is toggled on.
    • Skeleton on stream --> displays the user skeleton on top of the stream selected for the Stream property above.
    • Minimal distance to lock Kinect --> Sets the minimum distance allowed between the closest user and the Kinect sensor before Kinect locks detection. To unlock detection, the user has to step away from the sensor. The value of this property should be between 0.5 and 4 meters.
    • Kinect sensor elevation --> Adjusts the tilt angle of the Kinect sensor. This value should be between -27 and 27 degrees. The default value of 10 degrees is suitable when the Kinect sensor is placed on a table approximately 70cm / 2.5 feet above the ground.
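    If you are rebuilding this IA from the GitHub code, the two numeric properties above are simple range-limited values. A sketch of the validation implied by the documented ranges (the function names are illustrative):

```python
# Illustrative validation of the two numeric Kinect Settings properties.
# The ranges come from the documentation above; the function names are
# hypothetical, not taken from the GitHub code.

def clamp(value, lo, hi):
    """Constrain value to the inclusive range [lo, hi]."""
    return max(lo, min(hi, value))

def set_min_lock_distance(meters):
    # "Minimal distance to lock Kinect": 0.5 m to 4 m
    return clamp(meters, 0.5, 4.0)

def set_sensor_elevation(degrees):
    # "Kinect sensor elevation": -27 to 27 degrees (default is 10)
    return clamp(degrees, -27, 27)

print(set_min_lock_distance(0.2))  # 0.5 (too close, clamped up)
print(set_sensor_elevation(45))    # 27 (beyond the tilt range, clamped down)
```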

    People Detection

    This Interface Asset enables you to visualize the live stream of the sensor and includes triggers for when people enter or leave the field of view.

    streams.jpg

    Properties

    • Stream --> select the stream to display onscreen. You can choose between the following:
      • Color --> the standard camera image.
      • Depth --> the depth image. The grey level indicates distance from the sensor.
      • None --> use this if you only want to display the skeleton. Assumes the Skeleton on stream property (see next) is toggled on.
    • Skeleton on stream --> displays the user skeleton on top of the stream selected for the Stream property above.

    Triggers

    • New person detected --> Raised when a person enters Kinect’s field of view.
    • No person detected --> Raised when the last person leaves Kinect’s field of view.
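    Both triggers are edge events on the number of tracked users: New person detected fires on the transition from zero to at least one user, and No person detected fires when the count drops back to zero. A minimal sketch of that logic, assuming the tracked-user count is supplied by the skeleton stream:

```python
# Sketch of the two presence triggers. The tracked-user count would come
# from the Kinect skeleton stream; here it is an integer fed in per frame.

class PresenceDetector:
    def __init__(self):
        self.count = 0
        self.events = []

    def update(self, tracked_users):
        if self.count == 0 and tracked_users > 0:
            self.events.append("New person detected")   # 0 -> some
        elif self.count > 0 and tracked_users == 0:
            self.events.append("No person detected")    # last person left
        self.count = tracked_users

d = PresenceDetector()
for n in [0, 1, 2, 1, 0]:   # one person arrives, a second joins, both leave
    d.update(n)
print(d.events)  # ['New person detected', 'No person detected']
```

    Note that a second person entering the field of view raises nothing; only the empty/non-empty transitions matter.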

    Gestures

    Detects swipe left, swipe right, push and wave. Use left/right swipes to control scene-to-scene navigation or to move through a collection. Use the push gesture to, for example, open the collection item in focus. Use the wave to reset a scene.

    gstures.jpg

    Properties:

    • Enable Gesture Swipe Left
    • Enable Gesture Swipe Right
    • Enable Gesture Push
    • Enable Gesture Wave

    Triggers

    • Push detected --> Raised when a Push gesture is detected.
    • Swipe Left detected --> Raised when a Swipe Left gesture is detected.
    • Swipe Right detected --> Raised when a Swipe Right gesture is detected.
    • Wave detected --> Raised when a Wave gesture is detected.
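    For those reimplementing the Gestures IA, a swipe is essentially sustained horizontal hand travel in one direction. A deliberately simplified detector over a short window of hand x positions (the travel threshold is an assumption, not a value from the GitHub code):

```python
# Minimal swipe-left/right detector over hand x positions in meters,
# one sample per frame, assuming x grows to the right from the sensor's
# point of view. The 0.25 m threshold is illustrative only.

def detect_swipe(xs, min_travel=0.25):
    """Return the trigger name for a window of x samples, or None."""
    if len(xs) < 2:
        return None
    travel = xs[-1] - xs[0]
    steps = [b - a for a, b in zip(xs, xs[1:])]
    # Require monotonic motion so a wave is not misread as a swipe.
    if travel >= min_travel and all(s >= 0 for s in steps):
        return "Swipe Right detected"
    if travel <= -min_travel and all(s <= 0 for s in steps):
        return "Swipe Left detected"
    return None

print(detect_swipe([0.0, 0.1, 0.2, 0.3]))  # Swipe Right detected
print(detect_swipe([0.3, 0.2, 0.1, 0.0]))  # Swipe Left detected
print(detect_swipe([0.0, 0.2, 0.0]))       # None (back-and-forth motion)
```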

    Letter Postures

    This Interface Asset detects when the user in front of the sensor is in a posture with the shape of the letters A, T, U or V. From these triggers you can call any IntuiFace action.

    postures.jpg

    Properties

    • Enable Posture A
    • Enable Posture T
    • Enable Posture U
    • Enable Posture V

    Triggers

    Each "letter" posture (A, T, U, V) has the following 3 triggers:

    • Posture detected --> Raised when the posture is detected.
    • Posture detection in progress --> Raised when posture detection is in progress. You can bind its parameter to a Linear Gauge to see the progress.
    • Posture lost --> Raised when the posture is lost.
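    The three triggers suggest a hold-to-confirm pattern: the posture must be held for some time before Posture detected fires, with the in-progress parameter (a value you can bind to a Linear Gauge) reporting how close the hold is to completion. A sketch under that assumption; the one-second hold duration is hypothetical:

```python
# Sketch of the posture trigger lifecycle: "in progress" while the
# posture is held, "detected" once the hold completes, "lost" when it
# breaks. The hold duration is an assumption, not from the GitHub code.

class PostureTracker:
    def __init__(self, hold_seconds=1.0):
        self.hold = hold_seconds
        self.elapsed = 0.0
        self.detected = False
        self.events = []

    def update(self, posture_matched, dt):
        """Call once per frame; dt is the frame time in seconds."""
        if posture_matched:
            self.elapsed = min(self.hold, self.elapsed + dt)
            if self.detected:
                return
            progress = self.elapsed / self.hold   # 0..1, bindable to a gauge
            if progress >= 1.0:
                self.detected = True
                self.events.append("Posture detected")
            else:
                self.events.append(f"In progress: {progress:.2f}")
        else:
            if self.elapsed > 0:
                self.events.append("Posture lost")
            self.elapsed = 0.0
            self.detected = False

t = PostureTracker(hold_seconds=1.0)
for match in [True, True, True, True, False]:  # held 4 frames, then broken
    t.update(match, dt=0.25)
print(t.events[-2:])  # ['Posture detected', 'Posture lost']
```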

    Command Postures

    This Interface Asset detects when the user in front of the sensor is in a posture with one of three shapes, referred to as Home, Wait, and Stay (see picture below). From these triggers you can call any IntuiFace action.

    command_postures.jpg

    Properties

    • Enable Posture Home
    • Enable Posture Stay
    • Enable Posture Wait

    Triggers

    Each "command" posture (Home, Stay, Wait) has the following 3 triggers:

    • Posture detected --> Raised when the posture is detected.
    • Posture detection in progress --> Raised when posture detection is in progress. You can bind its parameter to a Linear Gauge to see the progress.
    • Posture lost --> Raised when the posture is lost.

    Hand Pointing

    This Interface Asset recognizes up to four hands, belonging to up to two people, and identifies when any of those hands enters or leaves a Grip state. Think of making a fist as equivalent to a touch on the actual display. For example, to move an image, raise your hand, align it over the image, make a fist and then move your hand. The image will move along with it.

    NOTE: Use of this interface asset requires Port 3333 - the default TUIO port - to be opened on the PC hosting Player and Microsoft Kinect. If IntuiFace Composer or Player was installed with Administrative rights then this port was opened automatically. If Composer or Player were installed under a non-administrative account, the IntuiFace Configuration Tool must be run with administrative privileges in order to open Port 3333. See this article for more information about the IntuiFace Configuration Tool.

    Hand_Pointing.png

    Properties

    • Enable Pointing Mode

    Triggers

    You can trigger any IntuiFace action to occur when any detected hand changes state. Currently, there is no way to differentiate which hand or which person was involved in the state change. Here are the states:

    • Hand grip is released
    • Hand is gripped
    • Hand is lost
    • Hand is moved: This trigger has three properties whose values are available for binding with action parameters
      • Hand is gripped: a yes/no value
      • Hand X Position: the x coordinate of the hand that was moved
      • Hand Y Position: the y coordinate of the hand that was moved
    • New hand is active: This trigger has two properties whose values are available for binding with action parameters:
      • Hand X Position: the x coordinate of the hand that was moved
      • Hand Y Position: the y coordinate of the hand that was moved
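    Taken together, these triggers map the grip state onto a touch lifecycle: gripping is a touch down, moving while gripped is a drag, and releasing is a touch up (in the legacy implementation these are forwarded as TUIO cursor messages on port 3333). A sketch of that mapping, assuming the hand positions have already been normalized to screen coordinates:

```python
# Sketch of the grip-to-touch mapping described above. Coordinates are
# assumed to be normalized screen positions; the real implementation
# forwards them as TUIO cursor messages on port 3333.

def hand_events(samples):
    """samples: list of (gripped, x, y) per frame -> list of touch events."""
    events = []
    was_gripped = False
    for gripped, x, y in samples:
        if gripped and not was_gripped:
            events.append(("down", x, y))   # fist closed: touch down
        elif gripped and was_gripped:
            events.append(("move", x, y))   # fist held while moving: drag
        elif was_gripped and not gripped:
            events.append(("up", x, y))     # fist opened: touch up
        was_gripped = gripped
    return events

frames = [(False, 0.5, 0.5), (True, 0.5, 0.5),
          (True, 0.6, 0.5), (False, 0.6, 0.5)]
print(hand_events(frames))
# [('down', 0.5, 0.5), ('move', 0.6, 0.5), ('up', 0.6, 0.5)]
```

    This is why a gripped hand can drag an asset exactly as a finger on a touchscreen would: the drag follows the Hand X/Y Position parameters for as long as the Hand is gripped value stays yes.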