Interaction overview

The Immerse SDK provides a set of components for creating rich, immersive interactions in multi-user VR environments. This section explains why physics is used and why Immerse SDK components require a Rigidbody and other Unity physics components. Tracking of VR controllers is also covered.

Why use physics?

Creating a good immersive VR experience is about consistency - a user should be able to put on a headset and immediately feel they understand the world around them. The experience doesn't have to exactly match real life, but interactions should be intuitive and realistic, so that the laws of physics can be used to solve problems and interact with the environment. Unity provides a number of options when it comes to moving and tracking objects.

The simplest and most common method is to parent the object to the controller. Alternatively, a GameObject's position and rotation could be updated directly via its Transform component. With either approach, the object would feel as if it had no mass, giving a very rigid feeling in the hand. Imagine picking up a virtual tennis racket this way - it would have no real presence, acting almost as a direct extension of the arm. Even if a Rigidbody were added, the object wouldn't collide with anything and would travel through walls.
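For illustration, here is a minimal sketch of that Transform-based approach in plain Unity (generic code, not the Immerse SDK's; the controller field is simply a Transform for the tracked controller):

```csharp
using UnityEngine;

// A naive grab: the object copies the controller's pose every frame.
// Because the Transform is written directly, the physics engine never
// sees any velocity - the object feels weightless and passes through
// scene geometry.
public class NaiveTransformGrab : MonoBehaviour
{
    [SerializeField] private Transform controller; // the hand/controller anchor

    private void Update()
    {
        // Teleports the object to the controller each frame.
        transform.SetPositionAndRotation(controller.position, controller.rotation);
    }
}
```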

If physics isn't used, even adding a collider to the object doesn't help: when the object is moved quickly, other objects and colliders can be bypassed entirely. This is because the physics engine steps at a fixed rate, different from the rate at which the Transform is updated, so a fast-moving object can pass through a collider between two physics steps.
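As an aside, for objects that are moved through the physics engine, Unity's continuous collision detection reduces this tunneling. This is a generic Unity setting, not an Immerse SDK requirement, and it does not help when the Transform is written directly:

```csharp
using UnityEngine;

// Continuous collision detection sweeps the collider between physics
// steps, so a fast-moving Rigidbody is far less likely to tunnel
// through thin colliders. It only applies to physics-driven movement.
[RequireComponent(typeof(Rigidbody))]
public class FastMovingObjectSetup : MonoBehaviour
{
    private void Awake()
    {
        GetComponent<Rigidbody>().collisionDetectionMode =
            CollisionDetectionMode.ContinuousDynamic;
    }
}
```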

Unity's inbuilt physics engine makes an experience look and feel considerably better. Suddenly objects can be thrown, balls can bounce and objects can collide with each other. Hinges move realistically and objects experience friction and momentum. It is not perfect, though. For instance, when an object is held and the user moves their hand through a wall, what should happen? A VR user in an open space cannot be physically prevented from doing this.

How physics is used in the Immerse SDK

Controllers and Hands

The Immerse SDK represents VR controllers as hands. When the trigger is pressed, the hand animates, forming a fist when the trigger is fully pulled in; the animation is synchronised with how far the trigger is pulled. The hands track the controller every frame and move exactly as the controllers move. Physics is not required for this part - at this stage the hands do not have physics enabled and will pass freely through objects and walls in a scene.
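A rough sketch of what this frame-by-frame tracking might look like, using Unity's XR input API. The Animator parameter name "Grip" is a placeholder, and none of this is the Immerse SDK's actual implementation:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: a hand that follows its controller every frame (no physics)
// and blends a fist animation by how far the trigger is pulled.
// "Grip" is a placeholder Animator parameter, not an Immerse SDK name.
public class HandPresenceSketch : MonoBehaviour
{
    [SerializeField] private Animator handAnimator;
    [SerializeField] private XRNode node = XRNode.RightHand;

    private void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(node);

        // Track the controller pose exactly - the hand will pass
        // through walls and objects, since physics is not involved.
        if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 pos) &&
            device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rot))
        {
            transform.localPosition = pos;
            transform.localRotation = rot;
        }

        // Synchronise the fist animation with the trigger pull amount.
        if (device.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
        {
            handAnimator.SetFloat("Grip", trigger);
        }
    }
}
```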

What happens when an object is grabbed?

When a user moves their hand near an interactable object, an outline is drawn around that object. When the trigger is pressed, the user's hand disappears - the object is now being held and physics is enabled. As the controller moves with the trigger held, the movement vector in world space is calculated and used to set the velocity of the object directly; the object is 'nudged' towards the location of the controller. The quicker the controller moves, the larger the force becomes; this is recalculated and reapplied every frame, until the object and the controller are in the same location. At that point, velocity is set to 0. Rotation is handled the same way through angular velocity.
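The following is a minimal sketch of this kind of velocity-based tracking, assuming a plain Rigidbody and a controller Transform to follow. It illustrates the technique described above as it is commonly implemented, not the SDK's actual source:

```csharp
using UnityEngine;

// Each physics step, velocity is set so the object would close the gap
// to the controller within one step. Collisions still apply because the
// object is moved by the physics engine, not by writing its Transform.
[RequireComponent(typeof(Rigidbody))]
public class VelocityTrackedGrab : MonoBehaviour
{
    [SerializeField] private Transform controller; // target pose to track
    private Rigidbody rb;

    private void Awake() => rb = GetComponent<Rigidbody>();

    private void FixedUpdate()
    {
        float dt = Time.fixedDeltaTime;

        // Linear velocity proportional to the remaining offset; it falls
        // to zero once the object and controller coincide.
        rb.velocity = (controller.position - rb.position) / dt;

        // Angular velocity derived from the remaining rotation, via angle-axis.
        Quaternion delta = controller.rotation * Quaternion.Inverse(rb.rotation);
        delta.ToAngleAxis(out float angleDeg, out Vector3 axis);
        if (angleDeg > 180f) angleDeg -= 360f; // take the shortest path

        rb.angularVelocity =
            (Mathf.Abs(angleDeg) > Mathf.Epsilon && !float.IsInfinity(axis.x))
                ? axis.normalized * (angleDeg * Mathf.Deg2Rad / dt)
                : Vector3.zero;
    }
}
```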

This means that the object tracks the controller very accurately, but still adheres to collisions in the scene. If a Rigidbody-enabled ball were hit with the racket mentioned above, it would fly away just like a real ball.

What happens when a user lets go of an object?

When the trigger is released, tracking stops but both velocities are left as they are. The Unity physics engine takes over and calculates where the object should be every frame, just like any other physics-enabled system. For instance, if a ball is picked up and thrown, it will behave just like a real ball, flying through the air and bouncing each time it hits the floor (note: this must be set up with standard Unity physics materials, using bounce and friction).
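For example, the bounce could be set up with a standard Unity PhysicMaterial along these lines (a generic sketch; the values are illustrative, not SDK defaults):

```csharp
using UnityEngine;

// Releasing simply stops the tracking above and leaves the last
// velocities on the Rigidbody, so the physics engine carries the throw.
// A PhysicMaterial gives a dropped ball its bounce and friction.
[RequireComponent(typeof(Collider))]
public class BouncyBallSetup : MonoBehaviour
{
    private void Awake()
    {
        var bouncy = new PhysicMaterial("Bouncy")
        {
            bounciness = 0.8f,
            dynamicFriction = 0.4f,
            staticFriction = 0.4f,
            bounceCombine = PhysicMaterialCombine.Maximum
        };
        GetComponent<Collider>().material = bouncy;
    }
}
```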

Is any special setup needed?

The Immerse SDK uses the Unity physics engine (see the Unity documentation).

To enable interaction with VR controllers, a set of components is provided to handle common interactions. For instance, Pickup allows objects to be picked up and thrown. Others, such as Dial, Door and Lever, also use a hinge component; a sketch of a typical hinge setup follows. Read the specific pages to understand which Unity physics components are needed for each of these interactable objects.
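As an illustration of the kind of Unity physics setup a hinged interactable typically needs (a generic sketch; the actual components each Immerse SDK interactable requires are listed on its page):

```csharp
using UnityEngine;

// A typical hinged setup, e.g. for a door: a Rigidbody plus a
// HingeJoint with limits so the door swings within a fixed arc.
[RequireComponent(typeof(Rigidbody))]
public class DoorHingeSetup : MonoBehaviour
{
    private void Awake()
    {
        var hinge = gameObject.AddComponent<HingeJoint>();
        hinge.axis = Vector3.up;   // swing around the vertical axis
        hinge.useLimits = true;
        hinge.limits = new JointLimits { min = 0f, max = 110f }; // degrees
    }
}
```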

Syncing with other players

When a user interacts with an object, Unity physics is applied to that object locally. Observers in the scene (in a headset or in WebGL) receive position (X, Y, Z) and rotation data for that object - physics is disabled for them.

Network latency can affect synchronisation, but because everything is synchronised relative to time, users are unlikely to notice unless latency is poor. For efficiency, messages are sent 10 times a second, and movement data is interpolated between keyframes so that objects move smoothly when viewed by somebody else.
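A minimal sketch of this keyframe interpolation on the observer's side. ApplyKeyframe is a hypothetical entry point for the receiving network code, not an Immerse SDK API:

```csharp
using UnityEngine;

// Smooths a remote object between keyframes arriving ~10 times a second.
public class RemoteTransformSmoother : MonoBehaviour
{
    private const float SendInterval = 0.1f; // 10 messages per second

    private Vector3 fromPos, toPos;
    private Quaternion fromRot, toRot;
    private float t = 1f;

    private void Awake()
    {
        // Observers only receive poses - local physics is disabled.
        if (TryGetComponent(out Rigidbody rb)) rb.isKinematic = true;

        fromPos = toPos = transform.position;
        fromRot = toRot = transform.rotation;
    }

    // Called whenever a new position/rotation keyframe arrives.
    public void ApplyKeyframe(Vector3 position, Quaternion rotation)
    {
        fromPos = transform.position;
        fromRot = transform.rotation;
        toPos = position;
        toRot = rotation;
        t = 0f;
    }

    private void Update()
    {
        // Interpolate towards the latest keyframe over one send interval.
        t = Mathf.Min(1f, t + Time.deltaTime / SendInterval);
        transform.position = Vector3.Lerp(fromPos, toPos, t);
        transform.rotation = Quaternion.Slerp(fromRot, toRot, t);
    }
}
```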

Each type of interaction requires a TransformSync component to enable synchronisation.
