A synchronised interactable button that uses physics to push


PushButton is an interactable button that uses physics to push and is synchronised between different users.

For an overview of how our SDK deals with interactions, please see Interaction Overview.

How It Works

PushButton uses physics (a Rigidbody and a Collider) to enable interaction. When a VR user's hand comes close to the button, the hand automatically changes into a pointing pose to make pressing easier. The tip of the index finger contains a collider, which physically pushes the button on collision. Once the button is pressed, a message is sent to all users in the scene.

PushButton also supports reactions, which allow logic to be triggered when the button is pressed.

How To Implement

  1. Create a new GameObject.
  2. Reset the Position, Rotation and Scale of the object. Ensure Position is set to <0, 0, 0>.
  3. Add a Rigidbody component.
  4. Add a TransformSync component.
  5. Set Sync Values to Position.
  6. Add a PushButton component.
  7. Add a child object, then add the Mesh that will move down as the button is pressed to the new child.
  8. Add one or more colliders (a BoxCollider works best).
  9. Set the Sensitivity (see below).
  10. Set the correct Axis (see below).

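The steps above can be sketched in code. This is a minimal, untested illustration that assumes the ImmerseSDK components (TransformSync, PushButton) live in the namespace mentioned in the class library reference and can be added via AddComponent like any other Unity component; properties such as Sync Values, Sensitivity and Axis are still configured in the Inspector, so treat this as a hierarchy outline rather than working setup code.

```csharp
using UnityEngine;
// Assumed namespace, based on the ImmerseSDK.Interaction.PushButton
// class mentioned in the class library reference; verify against your SDK version.
using ImmerseSDK.Interaction;

public static class PushButtonSetupExample
{
    // Builds the PushButton hierarchy described in the steps above.
    public static GameObject CreatePushButton(Mesh buttonMesh, Material material)
    {
        // Root object: reset transform, then physics + sync + button logic.
        var root = new GameObject("PushButton");
        root.transform.position = Vector3.zero;
        root.transform.rotation = Quaternion.identity;
        root.transform.localScale = Vector3.one;

        root.AddComponent<Rigidbody>();
        root.AddComponent<TransformSync>(); // set Sync Values to Position in the Inspector
        root.AddComponent<PushButton>();    // set Sensitivity and Axis in the Inspector

        // Child object: carries the visible mesh that moves down when pressed.
        var visual = new GameObject("ButtonMesh");
        visual.transform.SetParent(root.transform, worldPositionStays: false);
        visual.AddComponent<MeshFilter>().sharedMesh = buttonMesh;
        visual.AddComponent<MeshRenderer>().sharedMaterial = material;

        // A collider is required so the fingertip can push the button;
        // a BoxCollider works best.
        visual.AddComponent<BoxCollider>();

        return root;
    }
}
```

In practice you would normally build this in the Editor as described above; the script only mirrors the resulting hierarchy.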

Common Issues

If the button does not move, or moves oddly:

  • Is the object's resting Position and Rotation reset to zero?
  • Ensure that the correct Axis is being used.
  • Do you have a collider on your GameObject? This is necessary for the SDK to detect engagement and for the Rigidbody to work.
    Note: When a collider is missing, you will not see the hand disappear when the trigger is pulled.
  • Do you have duplicate scene indices? You can check this on the Scene Object.
  • Is there another collider nearby that might prevent movement or rotation?
  • If any Interaction Requirements are set up, check that they are all met (they should all show green in the Editor Inspector while in play mode).
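Parts of this checklist can be automated. The sketch below is a hypothetical Editor helper (not part of the SDK; the checks simply mirror the bullet points above) that logs which required pieces are missing from a button's GameObject:

```csharp
using UnityEngine;

public static class PushButtonValidator
{
    // Logs a warning for each common setup problem listed above.
    public static void Validate(GameObject button)
    {
        // Physics movement requires a Rigidbody on the button object.
        if (button.GetComponent<Rigidbody>() == null)
            Debug.LogWarning($"{button.name}: missing Rigidbody component.");

        // A collider is required for the SDK to detect engagement
        // and for the Rigidbody to work.
        if (button.GetComponentInChildren<Collider>() == null)
            Debug.LogWarning($"{button.name}: no collider found on the object or its children.");

        // The resting position and rotation should be reset to zero.
        if (button.transform.localPosition != Vector3.zero ||
            button.transform.localRotation != Quaternion.identity)
            Debug.LogWarning($"{button.name}: resting position/rotation is not reset to zero.");
    }
}
```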

An example of how the hierarchy is set up

Unity Editor Component


The PushButton component provides several properties that control its behaviour:

Verbose Logging Enabled
When enabled, this component instance will log verbosely to the console (only in the Editor). This is useful for debugging, but can have a detrimental effect on the framerate, as verbose logging sometimes logs every frame.

Index
The object's unique index. This will be automatically set by the SDK and should not be edited.

Allow Interaction only by Local Avatar
Only allows interaction if the local user or their hand is touching the object.

Is Interactable only when parent engaged
This setting is only used when the button is a child of a Pickup. In this case, the button can only be pressed while the Pickup is held in the other hand.

Interaction Requirements
A set of requirements that must be met to allow interaction; can be used to disable the object until a particular state is reached. See Interaction Requirements for more details.

Sensitivity
The maximum number of units the mesh can move downwards.

Axis
The valid direction of movement when the mesh is pressed downwards.

Valid choices are:
PosX : The X axis (red)
PosY : The Y axis (green)
PosZ : The Z axis (blue)
NegX : The negative X axis (red)
NegY : The negative Y axis (green)
NegZ : The negative Z axis (blue)

In the picture above, the button's green (Y) axis points upwards, so the Axis setting should be NegY.
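To illustrate how the axis choice relates to world directions, the snippet below maps each option to a Unity direction vector. The enum here is hypothetical (the SDK's actual Axis type and names may differ); the mapping is an assumption based on the descriptions above.

```csharp
using UnityEngine;

// Hypothetical enum mirroring the Axis options listed above.
public enum ButtonAxis { PosX, PosY, PosZ, NegX, NegY, NegZ }

public static class ButtonAxisDirections
{
    // Direction the mesh travels when the button is pressed.
    public static Vector3 ToDirection(ButtonAxis axis)
    {
        switch (axis)
        {
            case ButtonAxis.PosX: return Vector3.right;
            case ButtonAxis.PosY: return Vector3.up;
            case ButtonAxis.PosZ: return Vector3.forward;
            case ButtonAxis.NegX: return Vector3.left;
            case ButtonAxis.NegY: return Vector3.down; // green axis up => mesh presses down
            case ButtonAxis.NegZ: return Vector3.back;
            default: return Vector3.zero;
        }
    }
}
```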


Immerse SDK Class Library

More detail in the ImmerseSDK.Interaction.PushButton class.

Try out this component in the Examples project:

Examples (menu) > Interactions > Load Button Example

Learn more about the Examples project


