v1: smart glasses. why?

here's your tl;dr:

looking back on 10 years of smart glasses, I had one question.

does anyone¹ actually want these?

A/N: I’m currently testing the latest Meta Wayfarers, so I won’t be talking about the latest updates here, but will cover that in a future issue!

In January 2024, I stood outside a Blank Street coffee shop in Back Bay, Boston, trying to work up the nerve to walk in. Partially because it was pretty crowded and I’d forgotten to order through the app, but mostly because on the top right corner of my glasses, a little red light had just blinked to life.

The Ray-Ban Wayfarers are one of the latest iterations of a technological dream that has captivated Silicon Valley for over a decade.

On that day, I was wearing that dream on my face.

It felt pretty uncomfortable.

one of many unintentional selfies captured during testing (Jan 2024)

I’d already captured 30-second outtakes from the lives of the people passing by me as I walked through the city, but going into a crowded coffee shop felt much more invasive. The two cameras would be inches away from dozens of people who never consented to being recorded, and who likely wouldn’t know they were being recorded in the first place unless (1) they stopped to take a real look at the frames, and (2) were familiar enough with smart glasses to know what that red light meant.

Would they care? If they did, would they approach me? I’d happily turn them off - in fact, I had no real plans to use that footage in the video I was making anyway.

And what would happen if they did approach me? I felt silly for worrying about my personal safety, but I’d also seen a few too many news articles about people being physically attacked for recording anything from an interaction with a homophobic customer in a Target during Pride, to interactions with the police, to misunderstandings about whether the person was filming someone else at all.

Anyway, I did end up going to the Blank Street, but I only captured about five seconds of footage - the recording time limit on the glasses is 60 seconds, and I’d spent 55 of those seconds debating whether to go into the store in the first place. It was the last bit of footage I wanted to get before I returned the glasses.

It was also the last push I needed to fall down a rabbit hole (and why I’m only actually filming that video now) on a question that had bothered me since the first time I’d encountered smart glasses.

q: why do companies keep making smart glasses?

The OG: Google Glass

Let’s rewind to roughly 2012.

As I’m sure you all remember from seeing all those headlines, I was in 7th grade and had just gotten my first phone, a Sidekick. I was pretty excited about it, because it meant I didn’t have to sneak the family laptop into my room to read Twilight fanfiction for roughly three hours every night.

Less importantly, the tech world was abuzz with Google's latest moonshot: Google Glass. The journey of Glass began in mid-2011 with internal prototyping, followed by a 2012 announcement that set the tech world ablaze. By 2013, developers could get their hands on a pair for a cool $1,500³. Google Glass promised a future where information floated before our eyes, where we could capture life's moments without reaching for a camera. The device boasted a heads-up display on the upper right-hand side, a 5MP camera that could record 720p video, and a touchpad to control the device, all packaged into a futuristic titanium frame. It was meant to be the next step in ubiquitous computing, a world where our digital and physical realities seamlessly merged.

Instead, it gave us "Glassholes" – early adopters whose constant recording made everyone around them uncomfortable. The backlash was swift and merciless. Bars banned the devices, citing privacy concerns. Legislators scrambled to address the implications of always-on cameras in public spaces. The term "glasshole" entered the lexicon, encapsulating the public's unease with this new technology. By 2015, Google had pulled Glass from the consumer market, relegating it to enterprise applications. But the dream didn't die. It just went back to the drawing board.

In the interim between Glass's consumer failure and the next major smart glasses launch, an interesting development occurred largely under the radar. Google, or more specifically Alphabet's life sciences division Verily, began exploring medical applications for the technology. Between 2013 and 2017, Glass found new life in hospitals, where it was tested for live-streaming patient visits, assisting with remote surgeries, and even capturing images of patients' retinas for diagnostic purposes. These applications leveraged Glass's unique capabilities – hands-free operation, point-of-view recording, and real-time information display – in an environment where such features could make a tangible difference. It was a glimpse of a future where smart glasses solved real problems rather than creating new ones.

Enter Stage Right: Snapchat

Fast forward to 2016, and Snap Inc. (then Snapchat) decided it was their turn to take a swing at smart glasses.

As I was taking the PSAT, Snap was putting the finishing touches on Spectacles, their take on wearable tech. The Spectacles were everything Google Glass wasn't: fun, fashionable, and focused on a single feature – capturing short videos for Snapchat. The genesis of Spectacles can be traced back to 2014 when Snap acquired Vergence Labs, the company behind Epiphany Eyewear. This acquisition laid the groundwork for Snap's foray into hardware.

Initially, the hype was real. Snap's marketing strategy was nothing short of brilliant. They distributed Spectacles through SnapBot, a pop-up vending machine that would appear in different locations, creating a scavenger hunt-like excitement and lines that rivaled a Toys R Us when the Nintendo Wii was released (shout-out to my dad who waited in that line to get me a Wii for Christmas!). Spectacles were priced at a much more palatable $130–$150 and featured a camera capable of recording short video segments that could be synced directly to a smartphone and uploaded to Snapchat. It was a far cry from the ambitious scope of Google Glass, but it seemed to be exactly what Snap's young, social media-savvy user base wanted.

I have absolutely zero recollection of said excitement, in spite of the fact that Snapchat was the main mode of communication within my friend group at the time⁵.

But once the novelty wore off, most Spectacles ended up gathering dust in drawers. The problem wasn't the product itself, but rather its limited use case. Spectacles were essentially a $130 accessory for a single app, not a standalone device people could integrate into their lives. Moreover, the timing was off. In 2016, we hadn't yet hit the Gen Z TikTok level of social media saturation where people would be genuinely interested in constantly capturing their perspective. Snap, however, didn't give up. They released second and third generations in 2018 and 2019, even introducing a $380 model with dual cameras for 3D effects. As of 2023, they're on their fifth generation, now incorporating more premium augmented reality features. But in a world now dominated by TikTok, one has to wonder if they've missed their moment.

A Brief Aside: Bose

In 2019, a surprising entrant stepped into the ring: Bose. Known for their audio prowess, Bose released the Frames – sunglasses with built-in speakers. It was a clever sidestep of the video recording controversy, focusing instead on personal audio. The Bose Frames featured open-ear audio technology that produced surprisingly good sound without the need for in-ear buds. They were designed for music listening, audio AR experiences, and taking calls, all while maintaining situational awareness.

I’m genuinely unsure of whether I’d ever heard of the Bose Frames before researching this topic. It's a telling sign of the uphill battle these devices have faced in capturing not just our eyes, but our attention.

Mark Zuckerberg is Really Into The Metaverse

Which brings us to Meta's latest attempt at cracking the smart glasses code. The Ray-Ban Stories, updated in 2023 and rebranded as the Meta Wayfarer, represent a more subtle approach. At a glance, they're indistinguishable from regular Wayfarers. But packed into the frames are two 12MP cameras, open-ear speakers, and a microphone array. The collaboration between Meta and EssilorLuxottica, Ray-Ban's parent company, is clearly a strategic one. By partnering with an established eyewear brand, Meta hoped to overcome one of the biggest hurdles facing smart glasses: style. Additionally, unlike the sunglasses-only Ray-Ban Stories and Snap Spectacles, the Wayfarer can be purchased with clear prescription lenses.

As I walk through Boston, I snap a very Instagram-able photo of the Charles River without breaking stride. I call my dad, who can barely hear me over the wind, but his voice comes through clearly via open-ear audio, allowing me to stay aware of my surroundings. The video quality is solid, and the audio quality for music playback is surprisingly good. I’m pleasantly surprised.

At the same time, I'm reminded of the fundamental question that has dogged smart glasses since their inception: Why? Why do we need computers on our faces when we have perfectly capable smartphones in our pockets? The tech industry seems convinced that putting a computer on our faces is the next logical step after smartphones. But they haven't made a compelling case for why we need that computer there instead of in our pockets.

A quick aside: Speaking of navigating the gap between tech hype and reality...

If you're reading this, you're probably like me – fascinated by AI but often scratching your head at the latest "revolutionary" gadget. (Smart glasses, anyone?)

But here’s the thing: AI isn't one-size-fits-all. Your story – your needs, your work, your life – is unique. So why settle for tech that doesn't fit?

I offer personalized AI consulting to help find tech that actually makes sense for your life and values. 

Together, we'll:

1. Explore your daily challenges (no augmented reality required)

2. Identify AI tools that solve real problems in your world

3. Find solutions that match your needs, budget, and tech comfort level

Interested? Reply to this email to learn more or book a discovery call.

Ready to dive in? Book your first consulting session here.

Let's find the AI tools that truly enhance your story, not complicate it.

Now, back to our regularly scheduled smart glasses skepticism...

(What are) The (Use) Case(s) for Smart Glasses

Interestingly, it's in specialized fields where smart glasses could have found their footing. The medical applications that Google Glass explored continue to show promise. Imagine a rural doctor getting real-time guidance from a specialist during a complex procedure, or a surgeon accessing patient vitals without taking their eyes off the operating field. In the construction industry, AR glasses could provide workers with precise measurements and visualizations, improving accuracy and safety.

Perhaps the most promising recent development is in accessibility. Meta's latest software update for the Wayfarers includes features for the visually impaired, essentially turning the glasses into a real-life screen reader. This application leverages the unique advantages of smart glasses – hands-free operation and immediate access to information – in a context where these features are crucial, not just convenient.

So, Why?

As I spent more time with the Meta Wayfarers, I was struck by how often I forgot I was wearing "smart" glasses at all. Not because the features seamlessly integrated into my life, but because I never used them — they didn’t feel essential to my day-to-day life. The touchpad controls were clunky, and the requirement to use Meta's ecosystem for features like live streaming felt limiting. As a content creator, I can see the appeal, but I can't help but think a GoPro would serve the same purpose with higher quality and fewer restrictions.

In my opinion, this disconnect between Silicon Valley's vision and the actual needs of the average person is at the heart of both why smart glasses persist and why they struggle for relevance. Each iteration has tried to answer the "why" differently. Google Glass bet on ambient information and seamless recording. Spectacles focused on social media content creation. Meta is attempting a jack-of-all-trades approach, hoping that by doing a little bit of everything, they'll stumble onto the killer app (and continue trying to quietly expand their reach into our lives). This iterative process does not appear to include asking actual people "why" and building a tool around the answers.

Looking back at my trip to Blank Street, I was acutely aware of the ethical minefield I was navigating. Sure, the glasses now sport a red recording light, addressing a major criticism of earlier models. But in a state with two-party consent laws for recording, am I obligated to announce to everyone in my vicinity that they might end up on my Instagram Live? Meta's privacy policy for the Wayfarers essentially boils down to "check your local laws and maybe consult a lawyer," which feels like a cop-out for a trillion-dollar company⁷ shipping a product with such significant privacy implications. Of course, there are significant privacy implications for the person wearing the glasses, too. We recently saw Meta update their policies on scraping Instagram for training data, but the data from the Wayfarers was fair game long before that.

Once I got home, I called Ray-Ban's customer service line⁸ (so many people return these glasses that they've intentionally made it harder to do so), then packed up the glasses once they sent a return label. Two weeks of Facebook seeping further into my life was quite enough, thank you very much.

And as I dropped them off at the local UPS Store, I was left with… somewhat mixed feelings.

To pull back the curtain a bit - I often worry that I come off as a pessimist in this space, when I actually consider myself to be cautiously optimistic about the potential ways that emerging technologies could substantially improve our lives. However, I struggle with the feeling that we're chasing a future that looks good in sci-fi movies but doesn't align with how we actually want to live our lives. Especially considering that it’s not really “us” chasing that future, but a very small group of predominantly wealthy white men with outsized (to say the least) influence on the direction these technologies take.

Perhaps the future of smart glasses isn't in trying to be all things to all people, but in embracing their niche applications. A world where surgeons don AR glasses in the operating room, construction workers use them for precision measurements, and visually impaired individuals navigate city streets with AI-powered assistance feels a lot more tangible – and valuable – than one where we're all walking around with computers on our faces, desperately trying to avoid being "Glassholes."

In the meantime, I'll stick with my regular, not-smart glasses (or contact lenses, which I’m sure some company will try to make “smart” in the near future).

Things I Consumed This Week

News

Other Fun Stuff

The Last Loop

1  no, content creators don’t count.

2  Yes, I return pretty much everything I test unless I really really like it - ex. the Eight Sleep mattress pad, which I originally bought for a video in 2021.

3  This would be just over $2,000 in 2024. Google Glass walked so the Apple Vision Pro could run, at a cool $3,500.

4  I was going to say that there’s no connection to “MassHoles,” but I don’t actually know whether that’s true. 😂 

5  Remember YikYak? Good times. Not at all the same use case. Anyway, back to the newsletter.

6  again, is anyone using the Apple Vision Pro? Don’t worry, we’ll talk about it in the next issue.

7  In fairness, they are far from the only company in this space offloading liability onto their customers in this way.

8  There was a real person on the other end of the phone! 🙂 But they did try to sell me on keeping the glasses. ☹️