Bryan for Taikonauten


Part 5: Input/Controllers & Ray Casting (WebXR with Babylon.js)

👀 Stumbled here by accident? Start with the first part!


Welcome back to the fifth installment of this WebXR/Babylon.js series. This part covers input (specifically the Meta Quest motion controllers) and the concept of ray casting.

ℹ️ Remember - you can always run the code associated with this article and follow along using

npm start --part=5


Interacting with a mesh by changing its color

Ray Casting

What is a Ray Cast?

Ray casting is used to determine line of sight, to select or interact with virtual objects, or to navigate through a virtual environment.

How does it work?

Ray casting involves projecting a line (ray) from a specific point into the virtual environment and determining what objects in the virtual world it intersects with. For example, in a VR game, you might use ray casting to select an item or to aim a weapon.
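To make this concrete, here is a minimal sketch (not part of this article's project code) that casts a ray from the scene's active camera and checks what it hits. The function name pickStraightAhead is made up for illustration; it assumes scene is an initialized Babylon.js Scene with an active camera.

import { Ray, Scene } from '@babylonjs/core';

// Cast a ray from the active camera along its viewing direction and log what it hits.
function pickStraightAhead(scene: Scene): void {
    const camera = scene.activeCamera;
    if (camera === null) {
        return;
    }

    const origin = camera.position;
    const direction = camera.getForwardRay().direction; // unit vector the camera looks along
    const ray = new Ray(origin, direction, 100);        // limit the ray to 100 units

    const pickInfo = scene.pickWithRay(ray);
    if (pickInfo && pickInfo.hit && pickInfo.pickedMesh) {
        console.log(`Hit ${pickInfo.pickedMesh.name} at distance ${pickInfo.distance}`);
    }
}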


Hit Testing vs. Ray Casting

❗️ Hit testing and ray casting might be perceived as similar because both project rays into a 3D environment to determine intersections. This shared core concept of using rays to find points of interaction can make them seem alike, especially when both methods are used for spatial analysis or object interaction in virtual or augmented reality environments.

✅ Hit testing in augmented and virtual reality determines where a ray intersects the real-world environment (for example, to place a virtual object on a real surface), while ray casting projects a line from a point in space to detect intersections primarily within the virtual environment.
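For comparison, hit testing in Babylon.js is enabled as a WebXR feature rather than performed with a Ray object, and typically requires an immersive-ar session. The following is only a hedged sketch; xr (the default XR experience) and marker (a mesh to place on detected surfaces) are assumptions for the example, not part of this article's code.

import { Mesh, Quaternion, WebXRDefaultExperience, WebXRFeatureName, WebXRHitTest } from '@babylonjs/core';

// Enable the hit-test feature and move a marker mesh to the real-world
// surface point the device reports along the viewer's ray.
function enableHitTest(xr: WebXRDefaultExperience, marker: Mesh): void {
    const hitTest = xr.baseExperience.featuresManager.enableFeature(
        WebXRFeatureName.HIT_TEST,
        'latest'
    ) as WebXRHitTest;

    marker.rotationQuaternion = new Quaternion();

    hitTest.onHitTestResultObservable.add((results) => {
        if (results.length > 0) {
            // Apply the pose of the detected real-world point to the marker.
            results[0].transformationMatrix.decompose(
                marker.scaling,
                marker.rotationQuaternion!,
                marker.position
            );
        }
    });
}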


Performing a Ray Cast from a Controller Input Source

createRayFromController(controller: WebXRInputSource): Ray {
    // Start the ray at the controller's pointer pose and aim it in the
    // direction the controller is pointing.
    const origin = controller.pointer.position;
    const direction = controller.pointer.forward;
    // The third argument limits the ray's length to 100 units.
    return new Ray(origin, direction, 100);
}

The createRayFromController method generates a Ray object originating from a WebXR controller. It takes a WebXRInputSource object (representing the controller) as an argument, sets the ray's origin to the controller's pointer position and its direction to the pointer's forward direction, and gives the ray a length of 100 units. This is typically used in virtual environments to create a directional line for purposes like aiming at or selecting objects in the direction the controller is pointing.
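While developing, it can help to actually see the ray in the scene. Babylon.js ships a RayHelper for this; the small sketch below is optional debugging code and not part of the article's project (the function name showDebugRay is made up for illustration).

import { Color3, Ray, RayHelper, Scene } from '@babylonjs/core';

// Draw the ray in the scene so you can see where the controller is pointing.
// Keep the returned helper and dispose() it once it is no longer needed.
function showDebugRay(ray: Ray, scene: Scene): RayHelper {
    return RayHelper.CreateAndShow(ray, scene, Color3.Red());
}

Inside the trigger handler shown in the next section, this could be called as showDebugRay(this.createRayFromController(motionControllerAdded), this._scene), and the returned helper disposed once the visualization is no longer needed.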


Controller Detection and Interaction

📚 Babylon.js supports a wide range of motion controllers. Each controller's buttons, thumbsticks, and touchpads are updated every frame, providing real-time input data.

handleControllerSelection() {

    if (this._xr === null) {
        return;
    }

    // Fires whenever a controller (input source) is added to the XR session.
    this._xr.input.onControllerAddedObservable.add((motionControllerAdded) => {
        // Fires once the controller's motion controller (its button/axis layout) is initialized.
        motionControllerAdded.onMotionControllerInitObservable.add((motionControllerInit) => {
            const motionControllerComponentIds = motionControllerInit.getComponentIds();
            // The first component id corresponds to the standard trigger button.
            const triggerComponent = motionControllerInit.getComponent(motionControllerComponentIds[0]);

            triggerComponent.onButtonStateChangedObservable.add((component) => {
                // Only react when the trigger is pressed at least 80% of the way.
                if (component.pressed && component.value > 0.8) {
                    const resultRay = this.createRayFromController(motionControllerAdded);
                    const raycastHit = this._scene.pickWithRay(resultRay);
                    if (this._debug) console.log(raycastHit);

                    // If the ray hits our box, assign it a random diffuse color.
                    if (raycastHit && raycastHit.hit && raycastHit.pickedMesh) {
                        if (raycastHit.pickedMesh === this._box) {
                            const mat = this._box!.material as StandardMaterial;
                            mat.diffuseColor = Color3.Random();
                            this._box!.material = mat;
                        }
                    }
                }
            });
        });
    });
}

The function handleControllerSelection() illustrates how to respond to inputs from a motion controller in an XR scene.

Initialization: First, it checks that the WebXR experience (this._xr) has been initialized.

Controller Events: When a controller is added, the onControllerAddedObservable observable fires. From there, we listen to the controller's onMotionControllerInitObservable to know when its motion controller has been initialized.

Trigger Component: Once the motion controller is initialized, its trigger component (the trigger button) is identified. We then listen to that component's onButtonStateChangedObservable to respond to press events.

motionControllerComponentIds[0] corresponds to the standard trigger button on an XR controller such as a Meta Quest 3 controller.
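If you would rather not rely on the order of the component ids, Babylon.js also lets you look a component up by its type. The helper below is an alternative sketch only (the name getTriggerComponent is made up for illustration), not part of the article's code.

import { WebXRAbstractMotionController, WebXRControllerComponent } from '@babylonjs/core';

// Look the trigger up by its component type instead of by index,
// falling back to the controller's main component if none is reported.
function getTriggerComponent(motionController: WebXRAbstractMotionController): WebXRControllerComponent {
    return (
        motionController.getComponentOfType(WebXRControllerComponent.TRIGGER_TYPE) ??
        motionController.getMainComponent()
    );
}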

The observer reacts to the trigger button being pressed on the controller. Since the trigger can be pressed to varying degrees, we only react once it is pressed at least 80% of the way (component.value > 0.8). If both conditions are met, a ray is cast from the controller.

If the ray hits the box (that is, if we are aiming at it while pulling the trigger), the box changes to a new random color.


More about inputs and controllers can be found in the Babylon.js WebXR documentation.


Adding the Controller Selection to the scene

async createScene(): Promise<Scene> {
  ...
  this.handleControllerSelection();

  return this._scene;
}

Finally, we add the call to handleControllerSelection to our createScene method.



Conclusion

Babylon.js provides robust support for various input sources and controllers in WebXR, effectively managing them through classes like WebXRInput and WebXRInputSource. This flexibility allows for seamless integration and real-time interaction with different types of controllers, enhancing the user experience in virtual environments.

This article demystifies the concepts and practical applications of ray casting and hit testing in virtual environments, illustrating their similarities and differences. It emphasizes the use of ray casting for interactions within virtual spaces, such as selecting or aiming at objects, and demonstrates how to implement ray casting in a practical scenario using WebXR controllers. This highlights the importance and versatility of these techniques in developing immersive and interactive virtual reality experiences.

In the sixth part we dive into Animations.
