Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

I’ve been here before, wearing smart glasses that deliver information almost directly to my retina, but this is also somehow different. With the Meta Ray-Ban Display Glasses, Meta has solved so many of the issues that bedeviled Google[1] Glass before it.

From the design to the display technology, and from the intelligence to the gestures, these are a different beast, and not just compared to the woebegone Google Glass. Meta Ray-Ban Display Glasses also stand apart from the Orion AR smart glasses[2] I test drove last year, and even from the Ray-Ban Meta Smart Glasses Gen 2 unveiled this week at Meta Connect in Menlo Park, California.

The goal with these smart glasses is to deliver not just notifications and information to your eye, but video, photos, calls, directions, and more. And in contrast to what I recall from my Google Glass days, the Meta Ray-Ban Display Glasses are far more successful in these efforts.

But let’s start with the looks.

At a glance, Meta Ray-Ban Display Glasses, available in black or sand finishes, look very much like the Ray-Ban Meta Glasses[3]. They were, after all, co-designed by Ray-Ban maker EssilorLuxottica.

Meta’s tight partnership with the eyewear manufacturer results in frames that look like stylish, if somewhat beefy, eyeglasses. They’re more attractive than, say, Snap’s current AR Spectacles[4], but if you like your frames thin and wispy, these are not for you.

In many ways, Meta Ray-Ban Display Glasses are engineered for subtlety. Aside from the somewhat thicker frames and stems, there’s no indication that a world of information is being delivered to your face. Importantly, the glasses are paired with a neural band that reads your hand gestures and relays your intent to the connected glasses. Gestures can be performed with your hand resting at your side or in your lap.

The waveguide display embedded in the right lens is undetectable from the outside, even when displaying full-color images.

I tell you all this to explain that this experience was both familiar and yet unlike any smart glasses demo I’ve ever had before. It’s a combination of numerous advancements and some smart design decisions that resulted in more than a few “Oh, wow” moments.

The experience

Meta Ray-Ban Display Glasses, which will cost $799 and ship on September 30, come with Transitions lenses and support prescription options. For my test drive, the Meta team collected my prescription and then fitted a black pair with prescription inserts.

Separately, they took my wrist measurement – I’m a three, if you’re curious – to ensure I got the best-fitting neural band. While it’s quite similar to the neural band I wore for the Orion glasses, Meta told me this one is improved, boasting 18 hours of battery life and new textiles modeled after the Mars Rover landing pad.

They slipped the band onto my wrist, just below the wristbone, tight enough that the sensors could effectively pick up the electrical signals generated in my muscles by my hand gestures.

Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

Next came the glasses, which, despite their slightly beefy 69 grams, were perfectly comfortable on my face. I could see clearly through the lenses, and my eyes darted around looking for the telltale embedded screen. I’ll be honest, I was prepared for disappointment. I needn’t have worried.

Meta executives explained the control metaphor, which combines hand gestures and voice commands. There are just a handful (get it?) of gestures to learn and perform somewhat emphatically, including single and double taps of your index finger and thumb or your middle finger and thumb, a coin-toss-style thumb flick, and swiping your thumb left or right across your closed fist.

There are others, but these are the primary ones, which, I must admit, I sometimes struggled to remember.

When the band was seated correctly on my wrist, these gestures worked. Over the course of my demo, the band moved a bit, and we lost some control accuracy. However, as soon as I pushed it back in place, the system caught every gesture.

Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

Perhaps the most remarkable part of Meta Ray-Ban Display Glasses is the integrated waveguide display. This is so fundamentally different from the approach Google took more than a decade ago, with a prism perched above your eyeline, that it’s no longer worth the comparison. The image is generated in the stem and projected across the lens via imperceptible mirrors.

It’s more than that, though. The small screen, which I’d say appears as a virtual 13-inch display in front of your right eye, is perfectly positioned and stunningly sharp, clear, and colorful.

More importantly, when you summon it, it appears near the center of your gaze, which meant I wasn’t looking up or off to the left or right to see the screen.

Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

What does happen, though, is that your focus shifts, so you go from the world in front of you to the interface sitting just millimeters from your eye. To an outside observer, I assume it might look like you’re zoning out.

As I mentioned, the imagery is crisp, which Meta told me is due to the 42 pixels per degree resolution and the max 5,000 nits brightness, which is automatically adjusted. I did put the latter to the test in direct sunlight and, much to my surprise, I could still clearly make out all the interface elements.

Phone, who needs a phone?

Meta let me walk through a wide variety of experiences, which included everything from phone calls and texting to viewing full Instagram Reels. It’s quite something to see a whole social video floating in front of you (complete with stereo sound) without it either pulling you entirely out of your real world or forcing you to wear a special headset for the experience.

Like the Ray-Ban Meta smart glasses, Meta Ray-Ban Display Glasses come with a 12MP camera. I took photos using gestures and could see them instantly in front of me. More interesting, though, was the pinch-and-twist gesture I used to zoom in on my floating viewfinder before I took a photo.

The display system is also useful for walking back through all the photos you’ve taken on the device. I opened the gallery and used my thumb gesture to flip through all the photos taken by previous demo guests.

There’s an ultra-clear home screen where I found, among other options, Instagram, WhatsApp[5], and music.

I opened the music app and used gestures to swipe through songs, and then a nifty pinch-and-twist gesture to raise and lower the volume. The music sounded quite good.

Meta AI is, naturally, deeply integrated here. At one point, I looked at an Andy Warhol painting (a Campbell’s Soup Can) and asked Meta AI to identify the artist and his work. That’s more or less table stakes for smart glasses, but then I asked to see more of Warhol’s work, which was quickly presented on the waveguide display in full color.

There are other fun tricks like redesigning the wall color of a room or, as I did, transforming one of the Meta reps into a pop-art image.

Translation, directions, and beyond

More useful and compelling is closed captioning, which can translate in real time (and on-device), showing you text in your own language as the speaker talks in another.

For our demo purposes, though, I used the closed captions to display captioning for whoever I was facing. It was pretty fast, mostly accurate, and, if I turned away from one speaker, it would switch to display captioning for my new interlocutor. We did this in a small room where everyone was talking, and I was impressed with how the Meta Ray-Ban Display Glasses used their on-board mics to maintain audio focus.

When I wasn’t using the display, I double-tapped my middle finger and thumb to put the display to sleep (it also automatically sleeps during periods of inactivity). When a text message came in, the screen woke up, and a partial text appeared at the bottom edge of the display. I tapped again to open it in full and then used my voice to respond.

Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

There will, eventually, be another way to respond. I saw but did not try a handwriting gesture that’s set to arrive later this year. I watched as one of the Meta execs used his finger to spell out a text on his knee. The first attempt resulted in a typo, but he got it right on the second try. Imagine all the surreptitious texts Meta Ray-Ban Display Glasses wearers will be sending.

I asked Meta AI for restaurants in my area and instantly saw a card list of options I could swipe through, with a map above. Each time I swiped, the map zoomed out further. I then selected one and could see a larger map with my position and a blue line to the destination. My arrow moved as I turned and moved.

In my test phone call, I could see video of the guy on the other end, and when I used gestures to share my view, I could see exactly what I was sharing right next to his live feed. Again, I was shocked at the clarity.

Throughout all this activity, no one could see what was going on behind my Meta Ray-Ban Display Glasses, although Meta had the glasses hooked up to a separate laptop so the team could follow along.

Meta Ray-Ban Display

(Image credit: Lance Ulanoff / Future)

This is because Meta has kept light leakage to a bare minimum – a claimed 2%. Naturally, though, if you are recording or taking a picture, there is an LED indicator on the front.

Meta told me the glasses get six hours of battery life, which is slightly anemic compared to the eight hours on the new Ray-Ban Meta Gen 2 glasses. The case, which folds flat when not in use, adds an additional 24 hours of battery life.

Look, I’m not saying Meta has solved the smart glasses question, but based on my experience, nothing comes closer to effortlessly delivering information at a glance, and I’m starting to wonder if this is a glimpse of what will someday replace smartphones.

References

  1. ^ Google (www.techradar.com)
  2. ^ Orion AR smart glasses (www.techradar.com)
  3. ^ Ray-Ban Meta Glasses (www.techradar.com)
  4. ^ Snap’s current AR Spectacles (www.techradar.com)
  5. ^ WhatsApp (www.techradar.com)
