I witnessed the future of smart glasses at CES. And it’s all about gestures



In one corner of the busy showroom floor at CES 2025, I felt like the conductor of an orchestra. As I subtly waved my hand from side to side, notes played from the cello displayed on the giant screen in front of me. The faster I moved my hand, the faster the bow slid across the strings. I even earned a standing ovation from the colleagues at the booth after a particularly fast performance.

That’s what it feels like to use the Mudra Link bracelet, which lets you control devices with gestures. Motion controls aren’t new; I remember trying touchless controls back in 2014 with devices like the Myo armband. What’s different now is that gadgets like these have a bigger reason to exist thanks to the arrival of smart glasses, which seemed to be everywhere at CES 2025.

Startups and big tech companies have been trying to make smart glasses for more than a decade. But the arrival of AI models that can process speech and visual input at the same time has made them feel more relevant than ever. After all, digital assistants could be far more useful if they could see what you see and answer questions in real time, similar to the idea behind Google’s Project Astra prototype glasses. Smart glasses shipments were expected to grow 73.1% in 2024, according to a September IDC report, further evidence that tech-equipped eyewear is starting to take off.

Read more: Nvidia CEO explains how new AI models could work with future smart glasses

Watch this: These new smart glasses want to be your next AI companion

Last fall, Meta showed off its own prototype pair of AR glasses, called Orion, which are controlled with gestures and a neural input wristband. At last year’s Augmented World Expo AR conference, other startups showed off similar experiments.

At CES, it became clear that companies are thinking a lot about how we’ll control these devices in the future. In addition to the Mudra Link bracelet, I came across a few other wearables designed to work with glasses.

Take the Afference Ring, for example, which applies neural haptics to your finger to provide tactile feedback when using gesture controls. It’s intended for devices like smart glasses and headsets, but I got to try a prototype paired with a tablet just to get a feel for how the technology works.

In one demo, I played a simple mini-golf game that required me to pull my arm back to wind up the shot and then release it to send the ball rolling. The farther I pulled back, the stronger the sensation on my finger. Toggling the brightness and volume sliders felt similar; as I turned up the brightness, the sensation on my finger grew more prominent.


The Afference Ring provides haptic feedback to your finger.

Nick Henry/CNET

It was a simple demo, but one that helped me understand the kind of approach companies could take to add haptic feedback to mixed reality menus and apps. Afference didn’t name the specific partners it’s working with, but it’s worth noting that Samsung Next participated in Afference’s seed funding round. Samsung released its first smart health-tracking ring last year and announced in December that it’s building the first headset for the newly announced Android XR platform.

The Mudra Link bracelet works with the newly announced TCL RayNeo X3 Pro glasses, which launch later this year. I briefly tried the bracelet to scroll through the app menu on the RayNeo glasses, but the software isn’t finalized yet.

I spent most of my time using the bracelet to manipulate graphics on a giant screen used for demonstration purposes at the conference. The cello example was the most fascinating demonstration, but I was also able to grab and stretch a cartoon character’s face and move it around the screen just by waving and snapping my fingers.

Halliday’s smart glasses, which were also unveiled at CES, work with a companion navigation ring. While I didn’t get to try the ring on, I did use the glasses briefly to translate speech in real time, with text translations instantly appearing in my field of view even on a noisy showroom floor.


The Halliday smart glasses place a small screen in your field of vision, and you can navigate the device with a companion ring.

James Martin/CNET

Without gestures, there are usually two main ways to interact with smart glasses: touch controls located on the device and voice commands. The former is ideal for quick interactions, such as swiping through a menu, launching an app or rejecting a call, while the latter is useful for summoning and commanding virtual assistants.

Gesture control can make it easier to navigate interfaces without having to raise your hand to your face, speak aloud, or hold an external controller. However, there’s still a degree of awkwardness that comes with using gestures to control a screen that’s invisible to all but the person wearing the glasses. I can’t imagine waving my arms in public without any context.

Meta is already targeting gesture-controlled glasses, and its CTO, Andrew Bosworth, told CNET recently that gestures will likely be required for any future pair of display-enabled glasses.

If CES is any indication, 2025 is shaping up to be a big year for smart glasses – and gesture control will undoubtedly play a role in how we navigate these new spatial interfaces in the future.
