
What Smart Glasses Reveal (and Conceal)

The unseen privacy risks of wearable AI and the future of being in public.

Key points

  • Smart glasses discreetly record and analyze environments with AI superpowers.
  • Facial recognition tech may soon let strangers know your name instantly.
  • Bystander consent is all but impossible; privacy boundaries are quietly dissolving.

Last year, while traveling with a group, I visited an ancient religious shrine—a place revered for its beautiful silence and deep spiritual history. We were explicitly told: no phones, no cameras, and absolutely no pictures. Small groups were ushered in, each person allowed just a few minutes to take in the power of the scene. It was only after we left that I learned a fellow group member had discreetly filmed the interior of this shrine using smart glasses. Evidently, his desire to film outweighed his respect for this ancient space.

A New Privacy Minefield

Source: AI-generated image / Shutterstock

Meta’s latest Ray-Ban-branded smart glasses bring science fiction into daily life: maps and texts hover in your lens, AI translates conversations in real time, and cameras and microphones quietly capture your world. Most consumers don’t realize these glasses accumulate troves of visual and audio data, not just about the wearer, but about everyone nearby. Unlike a phone, smart glasses sit in plain sight, always ready to record, even when no one else consents.

Privacy disclosures add another layer: photos and audio snippets routinely upload to Meta’s servers for AI analysis, and some features activate by default, even though consumers barely understand what’s being collected or for how long. Those wearing the glasses, in effect, become puppets for tech, gathering the all-important data on which the algorithms and AI continue to learn, adapt, and evolve.

Surveillance Is Already Everywhere, But This Is Different

On one hand, our every step in public is already being recorded: Cameras line the entrances of stores, banks, apartment buildings, airports—sometimes, it feels as though video is simply the cost of moving through city life. But until now, those cameras have mostly belonged to institutions, not individuals. Now, imagine walking outside and realizing that everyone is observing the world through video cameras. Not a handful of people with phones in hand, but a society where every pair of eyes might be relaying what they witness straight to the cloud. At a concert, on the subway, at your favorite café—everyone is a camera operator. Would that unsettle you?

Recent studies show that about half the public is already worried about wearable privacy. But adoption is growing fastest among the young, and the cultural unease hasn’t translated into meaningful resistance or regulation. Of course, it’s not in the best interests of companies like Meta to highlight these risks. If consumers became truly aware, recognizing every stranger as a potential live streamer, there might be greater resistance, or at least pushback against the normalization of these devices in public. Instead, subtle design, minimal indicators, and glossy marketing keep privacy anxieties in the background… at least for a while.

From Masks to Glasses: New Tensions on the Street

It wasn’t that long ago that mask mandates in public settings led to heated confrontations. People shouted at strangers in grocery aisles and on airplanes about others’ mask choices. I imagine this is what’s next for smart glasses: new confrontations erupting, with some people demanding that wearers remove their camera-equipped glasses, or shaming them for donning the devices in sensitive spaces. The seeds of a new social battleground are already here, and it’s easy to imagine viral videos of these disputes echoing across social media.

Taking Public Shaming to a Whole New Level

Smart glasses don’t just passively collect data. They invite new abuses. Imagine someone filming a hook-up without a partner’s knowledge, or a parent yelling at their child in a grocery store. Moments of vulnerability, embarrassment, or pain that once faded from memory can now be immortalized from the perspective of anyone in the room and shared globally in seconds. This technology takes the possibility for public shaming, ridicule, and exploitation to a whole new level. The damage won’t be limited to celebrities or the powerful. We are all exposed to this potential embarrassment.

Facial Recognition: Closer Than You Think

Some expect Meta to introduce smart glasses with facial recognition as early as next year, given leaked product roadmaps and industry reporting. When that happens, the implications are chilling. For example, I live down the street from a dance studio. After class, young ballerinas line the street waiting for their rides home. Imagine a stranger using glasses to scan the group and say, “Hi Beth, your mom is stuck at work, so she sent me to pick you up.” Or perhaps while walking down the street, someone stops you and says, “Leslie, it’s been ages! How are you?” Or simply, the person sitting next to you at the stadium gleaning your name, political affiliation, and even an estimate of your income while you watch the game. These are not hypothetical science fiction scenarios; they could be part of daily life once facial recognition is in the hands of anyone wearing a pair of smart glasses. Indeed, current models have already been hacked with third-party apps to demonstrate this future, sparking widespread concern.

The Problem of Recording Without Consent

The Ray-Ban Meta Display glasses include a small LED light to indicate when they are recording, but many experts argue it’s too subtle for bystanders to notice in real-world conditions. There’s no easy way for others to opt out or to verify whether they’re being filmed. Some settings or technical hacks may even let users disable the indicator entirely.

It’s worth noting that while privacy settings offer some protections for wearers, non-users are rarely accounted for in any meaningful way. Once a moment is captured and uploaded, bystanders typically have zero visibility and no means to delete, control, or contest their personal data.

What Can You Actually Do?

If you’re concerned about being filmed or analyzed by smart glasses:

  • Look for the small LED light near the camera (usually on the frame's corner). It signals active recording, but is easy to miss, especially outdoors or in certain lighting.
  • Learn to spot the devices. Smart glasses generally resemble classic frames but are slightly bulkier in the arms (which house the electronics, battery, and cameras), and the lens edges may be thicker than on standard glasses.
  • If someone in your space appears to be filming you, politely ask the wearer to stop.
  • Keep up with advocacy groups pushing for stronger regulations, clearer indicators, and design accountability for public-facing tech.

The Cultural Cost: When Intimacy and Trust Erode

Perhaps the deepest risk isn’t technical; it’s cultural. When constant, low-grade surveillance becomes “normal,” people learn to censor themselves, assuming that anything they say or do could end up on a server, analyzed and archived forever. The foundational trust that underpins authentic connection, especially with strangers or in moments of vulnerability, starts to erode.

As a therapist, I can’t help noticing that the incidence of social anxiety has been rising sharply, fueled by uncertainty around visibility, judgment, and exposure. If people must worry not only about how they’re perceived in person but also about how their images and words may be captured and distributed, the urge to avoid public spaces will intensify for many. This tech offers yet another reason for already anxious or self-conscious people to stay indoors, potentially amplifying loneliness among those who crave connection but fear scrutiny. When intimacy feels threatened in every casual encounter, isolation becomes an appealing (if costly) refuge.

Innovation Supersedes Empathy

A culture that shrugs off the privacy rights of others signals a profound moral shift. What does it say about values when “innovation” is prioritized over consent, protection, and empathy? These are the questions Meta and its competitors force us to face, not in tech labs, but in every shared space, every casual glance, and every attempt to build trust in public.

References

Koetsier, J. (2024, October 3). Meta's Ray-Ban Smart Glasses Used To Dox Strangers In Public Thanks To AI And Facial Recognition. Forbes.

McDonald, K. C. (2025, May 18). Not a Good Look, AI: What Happens to Privacy When Glasses Get Smart? Cybersecurity Advisors Network.

More from Marianne Brandon Ph.D.