I Wore Ray-Ban Meta Glasses For A Week – Here’s What Happened! 👓
Nanobits Product Spotlight

EDITOR’S NOTE
Dear Future-Proof Humans,
Last week, I attended Meta's WhatsApp Conversations event in Miami, expecting the usual partner announcements about WhatsApp business calling features. What I didn't expect was walking out with a pair of Ray-Ban Meta AI glasses as a thank-you gift. 👓
You know that feeling when you get home from a conference and your swag bag sits unopened for weeks? Not this time. These glasses had me curious enough to actually tear open the packaging the moment I got back to my hotel room.
Setting them up was surprisingly straightforward – a quick app download, some basic pairing, and suddenly I had a computer sitting on my nose. The concept still feels a bit surreal: asking questions to thin air and getting answers whispered directly into my ears, all while looking like I'm wearing regular glasses.
This week, I want to share my real-world experiments with the Ray-Ban Meta glasses and show you what these AI-powered frames can actually do in daily situations.
META’S VOCAL COMPANION
The Ray-Ban Meta Wayfarer Black AI Glasses represent Meta's sophisticated approach to everyday wearable technology, seamlessly blending the iconic Wayfarer design with advanced AI capabilities.
These smart glasses feature a 12MP ultra-wide camera, a five-mic system for high-quality audio capture, and a built-in Meta AI assistant that responds to "Hey Meta" voice commands. Users can take photos and videos, make calls, listen to music through open-ear speakers, and even translate languages in real time. The glasses seamlessly connect with your smartphone contacts for calls and messages, while offering deep integration with Meta's ecosystem, including WhatsApp, Facebook, Instagram, and Messenger for content sharing and communication.
The glasses support third-party applications, including Spotify, Amazon Music, Apple Music, and Shazam, for hands-free music streaming. They also offer 32 GB of flash storage, roughly enough for 500+ photos and 100+ 30-second videos. They provide up to 4 hours of battery life on a single charge, with an additional 32 hours from the charging case, and maintain IPX4 water resistance for everyday durability.
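Taken together, those battery figures imply the charging case holds roughly eight full recharges. A quick sanity check on the quoted numbers (no measured data, just the spec-sheet values):

```python
# Back-of-envelope check of the quoted battery specs.
GLASSES_HOURS = 4        # quoted battery life per charge
CASE_EXTRA_HOURS = 32    # quoted additional hours from the case

# How many full top-ups the case effectively holds
case_recharges = CASE_EXTRA_HOURS / GLASSES_HOURS

# Total wear time before you need wall power
total_hours = GLASSES_HOURS + CASE_EXTRA_HOURS

print(case_recharges)  # 8.0
print(total_hours)     # 36
```

In other words, a full day out is realistic only if you are willing to pocket the glasses for recharging stints throughout the day.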
This expanding ecosystem of partnerships signals Meta's serious commitment to the smart glasses category, recently reinforced by Meta's $3.5 billion investment in Ray-Ban maker EssilorLuxottica, giving the company a 3% stake in the world's largest eyewear manufacturer.
The companies have sold 2 million pairs since late 2023 and aim to increase production to 10 million units annually by 2026, positioning these glasses as a mainstream wearable technology rather than a niche gadget.
When compared to competing smart eyewear, the Ray-Ban Meta glasses occupy a unique position in the market landscape. Unlike Apple's Vision Pro, which serves as a premium indoor mixed-reality headset designed for immersive computing and professional applications at a significantly higher price point, the Ray-Ban Meta glasses target everyday outdoor use with lightweight, stylish frames that don't appear overtly technological.
While Google Glass pioneered the smart glasses category with heads-up display technology, it failed to achieve mainstream adoption due to privacy concerns and a bulky design. In contrast, the Ray-Ban Meta glasses focus on discreet AI assistance and content capture, avoiding obvious visual overlays. The Meta collaboration has successfully captured mainstream appeal by prioritizing style and practical functionality over advanced AR features, making smart glasses accessible to consumers who want enhanced daily experiences rather than immersive virtual environments.
MY EXPERIMENTS WITH META’S AI GLASSES
Let’s take these glasses for a spin…
Scene Analysis
Standing on my balcony, I decided to test the glasses' visual recognition capabilities. I asked what they could see in front of me. The AI responded by describing a city skyline dotted with green trees, mentioning it could spot the Space Needle in the distance. The description was accurate and detailed, capturing the key elements of my Seattle waterfront view.
What caught my attention was how the AI processed this information. It seemed to analyze the scene in real time, likely capturing a temporary image for processing, but this photo never appeared in my saved images within the Meta AI app. The glasses essentially took a mental snapshot, processed it, described what they saw, and then discarded the visual data. This could be a powerful feature for visually impaired users, who could get instant descriptions of unfamiliar spaces and quickly understand their layout.
Instant Social Media Sharing
For my second test, I asked the glasses to share the scene directly to my Instagram story. It captured a photo of my balcony view and, since I had previously connected my Instagram account through the Meta AI app, it automatically prepared the image for posting. Before publishing anything, it asked for my final approval.
The photo quality exceeded my expectations. Colors appeared accurate and vibrant, the Space Needle stood out clearly against the skyline, and the overall composition looked professional. The glasses captured exactly what I was seeing without any noticeable distortion or color shifting.
However, I noticed it skipped typical social media editing options. There was no prompt to add text overlays, filters, or captions to the image before posting. The glasses treated it as a straightforward photo share rather than a full social media creation tool.

View from the balcony, captured by the Meta AI glasses
Video Recording
I asked the glasses to record what I was seeing from my balcony, and the results impressed me. The video quality came through crystal clear, capturing the waterfront view with sharp detail and accurate colors. The glasses can record up to 3 minutes of footage at a time, which I feel is the right balance between capturing meaningful moments and keeping file sizes manageable.
WhatsApp Audio Calling and Message Reading
After connecting my WhatsApp to the Meta AI app, making calls became very simple. I just said someone's name exactly as it appears in my contact list, and the glasses would confirm the contact name and ask for permission before placing the call, which adds a helpful safety layer against accidental calls.
The glasses use Bluetooth to route WhatsApp calls, freeing me from being tethered to my phone. I could walk around the house during long conversations or sit at my laptop without needing to plug in earbuds or headphones.
What impressed me most was the message handling capability. The glasses would read incoming WhatsApp messages aloud, provided I hadn't muted notifications for that contact or group. When I wanted to respond, I could simply dictate my reply, and the AI would send it back through WhatsApp. This created a completely hands-free messaging experience that felt seamless and intuitive.
WhatsApp Video Calling
The glasses offer video calling functionality that lets you share your exact view with the person you're video calling, i.e., the other person can see what you're seeing through the glasses' camera. If you want to switch back to your phone's front or rear camera, you can double-tap the capture button on the glasses frame.
I spent considerable time trying to get this feature working, running through the troubleshooting steps, but despite multiple attempts I couldn't activate the video-sharing function.
Capturing Image and Object Identification
I tested object identification by asking the glasses what they could see while I looked at my kitchen counter. They correctly identified strawberries sitting in a plastic container. When I pushed further and asked about their freshness and whether they'd last another week, the response turned generic. The AI suggested standard storage advice about refrigeration and avoiding heat sources, rather than analyzing the actual condition of the fruit in front of it.
I feel the photo capture function will be extremely useful in situations where pulling out a phone feels awkward or impractical, like while playing with kids or pets, to capture those spontaneous moments.
One limitation I noticed was a slight delay (roughly a second) between giving the command or tapping the frame and the actual photo capture. This delay could lead to missing the perfect moment, which somewhat defeats the purpose of hands-free photography.

What do you think about these strawberries?
Listening to Music
I couldn't get my usual music apps, such as Amazon Music or Spotify, to connect with the glasses, but YouTube worked perfectly for testing audio playback. I streamed both podcasts and music videos, and the sound quality impressed me.
What caught my attention was how the glasses handle audio privacy. I asked people sitting nearby if they could hear anything from my end. At low to moderate volume levels, the audio remained entirely private for me. When I increased the volume, others could pick up sounds if they actively listened for them, but the glasses weren't broadcasting music to the entire room.
THE BIGGER PICTURE: META’S AR STRATEGY
Meta views smart glasses as the on-ramp to AR glasses, with the Ray-Ban collaboration serving as a stepping stone toward their true metaverse vision.
The company unveiled Orion last year, which it believes is the most advanced pair of AR glasses ever made, though it remains a prototype rather than a consumer product.
Furthermore, Meta's Reality Labs division has reported significant losses, roughly $16 billion in 2023 and another $8.3 billion in the first half of 2024 alone, reflecting the massive investment required to crack this technology.
Engineers had to reduce power consumption by a factor of 100 before Meta Orion glasses would work, illustrating the scale of technical challenges. The Orion glasses weigh 98 grams compared to 30 grams for classic Ray-Ban Aviators, and current prototypes cost about $10,000 per unit to manufacture.
Manufacturing at scale, achieving acceptable battery life, and developing cost-effective display technology remain the primary obstacles before AR glasses can become mainstream consumer products.
THE GOOD, THE BAD, AND THE UGLY
The glasses are lightweight and comfortable for all-day wear, and the design closely resembles regular Wayfarers, which makes them less conspicuous than other smart glasses. They are prescription-compatible, accommodating lenses from -6.00 to +4.00 total power, though some optical shops face logistical challenges fitting the frames because of the built-in electronics and battery. I also love their ability to capture photos and videos from a first-person perspective, so you can effortlessly capture more moments during walks, travel, or social events. However, they lack a heads-up display or augmented-reality features, which limits their usefulness for navigation or contextual information.
Many users complain that the battery lasts only 3-4 hours per charge, which seems problematic at first glance. However, this limitation affects different users differently. Regular glasses wearers tend to notice the short battery life more because they're accustomed to wearing eyewear all day. In contrast, people who don't usually wear glasses often find 3-4 hours sufficient since extended wear beyond that timeframe causes ear discomfort anyway.
Some users have found the AI features to be limited or "nerfed," as they don’t offer the full range of capabilities one might expect from a phone-based assistant. And, of course, there are some occasional bugs.
The glasses struggle with basic audio recognition reliability. Wake-word detection fails frequently, and error messages appear without a clear explanation. Voice commands often go unrecognized, which I found very annoying when trying to perform simple tasks. The device can also get stuck in phantom streaming modes, where no actual streaming or call is happening but the glasses believe one is active. Closing and reopening the app didn't resolve the issue; I had to perform a complete device reset to restore normal functionality, which disrupts the seamless experience these smart glasses promise to deliver.
However, the biggest challenge with this kind of device is privacy. The always-on camera and microphone raise privacy questions for both users and bystanders. Some worry about being recorded or tracked in public spaces.
This discourse brings me to the next big question: Do we have space in our lives for a fourth standalone hardware device (laptop, phone, and smartwatch being the other three)?
END NOTE
Last week, I spoke with a colleague who recently welcomed a baby girl and picked up the Meta AI glasses. The hands-free functionality lets him capture precious moments with his daughter without missing a beat, keeping his hands available for what matters most.
I also spoke with another close friend who strongly opposes adding another listening or tracking device to his life. He explained that these technologies overstimulate him and create cognitive overload that sends his mind into overdrive. This doesn't even address how uncomfortable people feel around someone wearing AI devices like rings, pendants, or glasses.
Hey, I get it! I love devices that make my life easier, and I'm all for incorporating them into my daily life. I want to feel more present in the moment, able to capture memories hands-free, and enjoy the convenience of quick audio access and AI assistance. Imagine doing all your daily tasks just by using your voice. But I also understand that the feeling of being "watched" or "recorded" can be off-putting. Additionally, since smart glasses are relatively new, social norms surrounding their use are still evolving. Users may inadvertently breach etiquette, such as recording in sensitive environments, which can lead to awkwardness or conflict.
These glasses feel like magic when they work, frustrating when they don’t, and somewhere between fascinating and intrusive depending on who's asking. Whether these glasses become mainstream will depend on how well Meta balances innovation with the very human need for privacy and social comfort.
In the meantime, I want you to think about:
If your phone could see everything you see and hear everything you hear (for real; we all know that they do), would you carry it? How is wearing smart glasses fundamentally different from the device already in your pocket?
Would you feel more comfortable around someone wearing smart glasses if there was a clear visual indicator when they're recording, or does the mere possibility of being recorded change how you behave around them?
In a world where we're already documenting our lives through social media, do smart glasses represent the natural next step in human connection and memory-keeping, or do they risk making us observers of our own experiences rather than participants?
💡 As of May 2025, the Ray-Ban Meta Wayfarer AI Glasses have been officially launched in India. You can now purchase them through authorized channels, without needing to import from abroad. Prices for these glasses in India start at ₹29,900 for the base styles and increase depending on the frame and lens combinations.
I've recently become fascinated by AI wearables like rings, pendants, and glasses. And, we're working to bring founders from this space directly to you, so they can share insights about their inventions and how these devices are reshaping our personal and professional lives. Stay tuned for exciting interviews and updates ahead.
Share the love ❤️ Tell your friends!
If you liked our newsletter, share this link with your friends and request them to subscribe too.
Check out our website to get the latest updates in AI