XR Keyboard VI: Direct Interaction with Touch Controllers
Using a controller, as expected, increased the speed and accuracy of typing. There will likely come a singularity moment when hand tracking becomes indistinguishable in accuracy from dedicated tracking devices like touch controllers. Still, right now, the difference is enormous, and if the speed and precision of your actions matter, it's better to avoid hand tracking. For context, I used a Quest 2 to run the prototypes.
There are a few improvements in this iteration:
1) Haptic feedback. I used very simple vibrations to signify the OnPointerHover and OnPointerClick events, but having haptics while you push a key down would make the experience even better (a rough haptics sketch follows this list).
2) You can cancel typing a character. In this prototype, you add a character by physically pushing a key down, so if you realize mid-press that it's the wrong key, you can release it before completing the press and nothing is typed (see the key sketch after this list).
3) Easier to type the same character several times in a row. Since you have to physically push the key for each character, there is a natural signifier separating repeated "pushes" of the same key: each time, you need to release the key before pressing it again.
4) Controllers do not hide other keys the way hands do. This is essential because visual feedback is the primary way to navigate the experience: you cannot rely on muscle memory, so you look for a particular key before pressing it.
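
For point 1, here is a minimal sketch of how short vibration pulses could be wired up with the OVRInput API from the Oculus Integration package. The KeyHaptics class, the event hook-up, and the pulse values are my illustrative assumptions, not the prototype's actual code.

using System.Collections;
using UnityEngine;

// Hypothetical helper: fires short vibration pulses on a Touch controller.
// Assumes the Oculus Integration package (OVRInput) is in the project;
// the frequency/amplitude/duration values are illustrative.
public class KeyHaptics : MonoBehaviour
{
    [SerializeField] OVRInput.Controller controller = OVRInput.Controller.RTouch;

    // Light tick when the pointer hovers over a key.
    public void OnPointerHover() => StartCoroutine(Pulse(0.1f, 0.2f, 0.02f));

    // Stronger pulse when the key press is actually committed.
    public void OnPointerClick() => StartCoroutine(Pulse(0.5f, 0.8f, 0.05f));

    IEnumerator Pulse(float frequency, float amplitude, float duration)
    {
        OVRInput.SetControllerVibration(frequency, amplitude, controller);
        yield return new WaitForSeconds(duration);
        OVRInput.SetControllerVibration(0f, 0f, controller); // stop the motor
    }
}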
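
And for points 2 and 3, a sketch of the push-to-commit logic, assuming the character is added only once the keycap travels past a commit depth and the key has to come back up before it can fire again. PushableKey, the threshold values, and the push axis are all assumptions for illustration, not the prototype's implementation.

using UnityEngine;
using UnityEngine.Events;

// Hypothetical pushable key: the character is committed only when the keycap
// travels past a commit threshold. Releasing before that point cancels the
// input, and the key must rise back up (hysteresis) before it can fire again,
// which is what separates repeated presses of the same key.
public class PushableKey : MonoBehaviour
{
    [SerializeField] Transform keyCap;             // the part the controller pushes
    [SerializeField] float commitDepth = 0.008f;   // metres of travel that commits the character
    [SerializeField] float releaseDepth = 0.002f;  // keycap must rise above this to re-arm
    public UnityEvent onCommit;                    // e.g. append the character, trigger haptics

    Vector3 restPosition;
    bool armed = true;

    void Awake() => restPosition = keyCap.localPosition;

    void Update()
    {
        // Travel along the key's local push axis (assumed to be -Z here).
        float depth = restPosition.z - keyCap.localPosition.z;

        if (armed && depth >= commitDepth)
        {
            armed = false;       // won't fire again until the key is released
            onCommit.Invoke();   // the character is added only at this point
        }
        else if (!armed && depth <= releaseDepth)
        {
            armed = true;        // key has come back up far enough: ready for the next press
        }
    }
}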
----------
Tools:
— Figma (Visual Design)
— Unity3D (Prototyping)