Google’s Project Glass promotional video has not only stirred up augmented reality enthusiasts with a range of features that look as if they were taken from a sci-fi movie, but has also raised many questions about the user interface (UI) of these AR glasses, especially its controls.
A UI smoothly responsive to head-tilt controls for scrolling and clicking, plus voice commands, might seem a little too far from reality. Not to mention how Google Glass wearers would look in a social setting: nobody wants to be seen mumbling to themselves or moving their head as if they had a nervous tic.
This is why Google has come up with another patent targeting Project Glass’s possible UI issues, and this time it involves hand gestures. Using a reflective infrared identifier placed on the user’s hand, the “wearable marker for passive interaction” tracks and identifies the user’s hand movements.
The identifier would be invisible to the human eye and could be placed on a ring, bracelet, artificial fingernail or even a glove. The head-mounted display (HMD) – in this case the glasses – would be equipped with an IR camera capable of detecting radiation reflected by the identifier. The HMD would recognise known patterns of motion and thus be controlled by hand gestures. Opening a document, taking a photo or launching Google Maps could potentially be done with the flick of a finger.
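The core of such a system – comparing a tracked marker’s motion against “known patterns of motion” – can be sketched as a simple template matcher. This is purely a hedged illustration, not anything from the patent: the function names, the gesture set, and the matching method (resample, normalise, then nearest-template by point-wise distance, loosely in the spirit of unistroke recognisers) are all assumptions.

```python
import math

def resample(points, n=16):
    """Resample a 2D trajectory to n roughly evenly spaced points."""
    dists = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    total = sum(dists)
    if total == 0:
        return [points[0]] * n
    step = total / (n - 1)
    pts = list(points)
    out = [pts[0]]
    acc = 0.0
    i = 0
    while len(out) < n and i < len(pts) - 1:
        d = math.dist(pts[i], pts[i + 1])
        if d > 0 and acc + d >= step:
            # interpolate a new point at distance `step` along the path
            t = (step - acc) / d
            q = (pts[i][0] + t * (pts[i + 1][0] - pts[i][0]),
                 pts[i][1] + t * (pts[i + 1][1] - pts[i][1]))
            out.append(q)
            pts[i] = q  # continue measuring from the interpolated point
            acc = 0.0
        else:
            acc += d
            i += 1
    while len(out) < n:          # pad if rounding left us short
        out.append(pts[-1])
    return out

def normalize(points):
    """Translate to the centroid and scale to a unit bounding box."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def classify(trajectory, templates):
    """Return the name of the gesture template closest to the trajectory."""
    probe = normalize(resample(trajectory))
    best, best_d = None, float("inf")
    for name, tmpl in templates.items():
        ref = normalize(resample(tmpl))
        d = sum(math.dist(a, b) for a, b in zip(probe, ref)) / len(probe)
        if d < best_d:
            best, best_d = name, d
    return best

# Hypothetical gesture vocabulary mapping marker paths to commands.
GESTURES = {
    "swipe_right": [(0, 0), (4, 0)],   # e.g. "next photo"
    "swipe_up":    [(0, 0), (0, 4)],   # e.g. "open maps"
}
```

A noisy horizontal flick such as `classify([(0, 0.1), (1, 0.0), (2, 0.1), (3, -0.1), (4, 0.0)], GESTURES)` would be matched to `"swipe_right"`. A real HMD pipeline would of course work from the IR camera’s marker detections, segment strokes over time, and reject low-confidence matches – none of which is shown here.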
This looks like a great alternative, or even a complement, to the voice-controlled and head-tilting navigation system. But how comfortable will wearers feel looking like they are conducting an orchestra while walking down the street?