Julien Avezou

I Tried Coding With AI Glasses. Here’s What Actually Happened.

I was recently gifted a pair of AI glasses. This is not the kind of purchase I would make for myself. But since I now own a pair, I saw this as an interesting opportunity to assess whether these glasses could improve my daily life as a developer.


Initial experiments

The first thing I tried was asking the integrated AI to explain code on my screen. The results were… disappointing. I tried several scenarios:

  • Explaining a simple HTML file
  • Debugging a specific line of code
  • Providing context-aware explanations

It couldn’t reliably identify what I was looking at. It lacked precise understanding of the visual context.

I started thinking about a better pipeline:

  1. Capture frame from glasses camera
  2. Apply OCR to extract visible text
  3. Feed image + extracted context into a capable vision model
  4. Send result to an LLM
  5. Return explanation as audio through the glasses

In theory, this could turn the glasses into a real-time coding assistant. In practice, it felt like too much work: complex, expensive, and less reliable than simply prompting an assistant directly inside the editor.
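To make the five steps concrete, here is a minimal sketch of that pipeline with every stage stubbed out. All function names here are hypothetical placeholders, not a real API; in a real build each stub would call a camera SDK, an OCR library, a vision model, an LLM, and a TTS engine.

```python
from dataclasses import dataclass

@dataclass
class Explanation:
    text: str
    audio: bytes  # synthesized speech to stream to the glasses' speakers

def capture_frame() -> bytes:
    """Stub for step 1: grab a frame from the glasses camera."""
    return b"<jpeg bytes>"

def run_ocr(frame: bytes) -> str:
    """Stub for step 2: extract the text visible in the frame."""
    return "def add(a, b): return a + b"

def ask_vision_model(frame: bytes, ocr_text: str) -> str:
    """Stub for step 3: combine image + OCR text into context."""
    return f"The screen shows code: {ocr_text}"

def ask_llm(context: str) -> str:
    """Stub for step 4: ask a language model for an explanation."""
    return f"Explanation of [{context}]"

def to_audio(text: str) -> bytes:
    """Stub for step 5: synthesize speech for playback."""
    return text.encode()

def explain_screen() -> Explanation:
    """Run the full capture -> OCR -> vision -> LLM -> audio chain."""
    frame = capture_frame()
    ocr_text = run_ocr(frame)
    context = ask_vision_model(frame, ocr_text)
    text = ask_llm(context)
    return Explanation(text=text, audio=to_audio(text))
```

Even as a skeleton, this makes the cost visible: five stages, each with its own latency and failure mode, which is exactly why an in-editor assistant wins.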

A different idea: Life commits

That’s when I realized something.

Maybe AI glasses aren’t useful because they help you code better. Maybe they’re useful because they help you understand yourself better.

So I tried something different.

What if I used my glasses to capture my life? I call them life commits. Just like Git commits capture the evolution of code, life commits capture the evolution of your day.

Taking photos with the glasses is seamless: you capture moments on the fly, unlike a phone, which you have to pull out of your pocket, creating friction.

The idea is simple:

Every hour, the app prompts you to capture a moment.

The captured image is automatically classified using Apple's Vision framework:

VNClassifyImageRequest()

This allows categorization of environments like:

  • workspace
  • outdoors
  • social
  • exercise
  • indoors
  • screen time
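The Vision framework returns raw classification labels, so a thin mapping layer can bucket them into the categories above. Here is an illustrative Python sketch (the raw label names are guesses, and in the real app this logic would live in Swift next to `VNClassifyImageRequest`):

```python
# Hypothetical raw classifier labels -> the app's environment categories
CATEGORY_MAP = {
    "desk": "workspace", "keyboard": "workspace",
    "monitor": "screen time", "laptop": "screen time",
    "tree": "outdoors", "sky": "outdoors",
    "people": "social", "restaurant": "social",
    "gym": "exercise", "bicycle": "exercise",
}

def categorize(labels: list[str]) -> str:
    """Return the category of the first recognized label, else a fallback."""
    for label in labels:
        if label in CATEGORY_MAP:
            return CATEGORY_MAP[label]
    return "indoors"  # generic fallback bucket
```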

Then, I associate an emotional score with each moment.

This can be done in two ways:

Hand gestures, detected using:

VNDetectHumanHandPoseRequest()

Or manually, through the app.
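Once a hand pose has been detected (via `VNDetectHumanHandPoseRequest` on the Swift side), mapping it to a score is straightforward. A hypothetical sketch, assuming gestures arrive as string labels and manual input takes priority:

```python
from typing import Optional

# Hypothetical gesture labels -> emotional score on a 1..5 scale
GESTURE_SCORES = {
    "thumbs_up": 5,
    "open_palm": 3,
    "thumbs_down": 1,
}

def emotional_score(gesture: Optional[str], manual: Optional[int] = None) -> int:
    """Prefer a manual entry from the app; fall back to the detected gesture."""
    if manual is not None:
        return manual
    return GESTURE_SCORES.get(gesture, 3)  # neutral default for unknown gestures
```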

Over time, this creates a timeline of:

  • where you spend your time
  • what environments you inhabit
  • how those environments correlate with your emotional state

Reviewing these patterns over time could surface insights for improving both your daily routine and your emotional state.
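Once moments accumulate, the environment/mood correlation is a small aggregation. A sketch over made-up sample data (the scores and environments are illustrative, not real measurements):

```python
from collections import defaultdict

# (environment category, emotional score 1..5) -- sample data only
moments = [
    ("workspace", 3), ("workspace", 2), ("outdoors", 5),
    ("outdoors", 4), ("social", 4), ("screen time", 2),
]

def average_mood_by_environment(moments):
    """Average the emotional score recorded in each environment."""
    scores = defaultdict(list)
    for env, score in moments:
        scores[env].append(score)
    return {env: sum(s) / len(s) for env, s in scores.items()}
```

With this toy data, outdoors averages higher than workspace, which is the kind of pattern the timeline is meant to surface.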

Here is the working prototype to test this idea: https://github.com/JulienAvezou/ai-glasses

Would I recommend buying AI glasses?

Surprisingly, I enjoy listening to music through my glasses. It doesn't replace a comfy headset, but it's nice to hear music clearly without anything in or over your ears.

Prompting the AI for simple explanations while coding is nice too; it's convenient to have a conversation on the go, whether at the desk or while moving around.

The ease of taking photos or videos on the fly is nice too.

But right now, AI glasses still feel like early hardware.

Not useless. But not essential.

They don’t fundamentally change how I code.

But they do change how I capture and reflect on my life.

That might be their real potential.


So what do you think? Gadget or useful?

Curious to know if someone out there is actively using AI glasses.

How do you incorporate them into your daily routine? (exercising, coding, …)

Top comments (20)

leob

Cool experiment - but no, I wouldn't buy them :-)

(I remember when "VR" was cool, so these would have been "VR glasses" - but now it's got to be "AI" in/on everything, whether we like it or not)

Julien Avezou

True! AI replaced VR glasses ... haha
I am curious to know which AI wearable will break through, if any will. There are talks about the AI Pin.

leob

The "AI Pin" - I wouldn't hold my breath:

"It received poor reviews for being unreliable and slow. "
"The Worst Product I've Ever Reviewed"
"The Humane AI Pin is no more"
"Humane's AI Pin is no more and owners are left with nothing"

Apparently it was an epic failure - it was a bold and innovative idea, but probably ahead of its time (read: the execution wasn't as great as the concept) ...

Julien Avezou

Oh wow, not great.
Interesting, probably ahead of its time as you say. Apple is reportedly working on its own version of the Pin, with a potential release in 2027.

Aryan Choudhary

Intriguing post Julien!
The uses you've described for these glasses are something I think I would enjoy.
Like you, I wouldn't have bought them myself, but since they were a gift you made them useful in your own way. I wouldn't mind going down that road.
(⁠/⁠^⁠-⁠^⁠(⁠^⁠ ⁠^⁠*⁠)⁠/

Julien Avezou

Thanks Aryan!
Indeed, I would definitely not have bought them myself but I want to make use of them now haha

Mykola Kondratiuk

honestly the thing i'd worry about is dependency creep. the friction of switching to a tab to ask AI is just enough to make you think "can i solve this myself first". kill that friction and you might get faster, but you're also outsourcing more of the actual problem-solving. did you notice yourself reaching for it on things you'd normally just push through?

Julien Avezou

That is a really good point. It's important to avoid outsourcing too much of your learning. I haven't tested this enough yet to reach any concrete observations, but I will keep experimenting and let you know how it goes.

Web Developer Hyper

Wow! AI glasses look so fun. But I’m already worried everyone will use them to cheat on tests in the future. 🤓

Julien Avezou

hehe they definitely should not be allowed in exam rooms IMO

Vic Chen

Love the pivot from "AI coding assistant" to "life commits" — that's a genuinely creative reframe.

The friction reduction point is key. I've been thinking about similar patterns in developer tooling: the best tools aren't the ones that do the most, they're the ones that remove friction from things you already want to do.

The emotional tracking + environment correlation idea is fascinating. Would be cool to see this evolve into something that maps productivity patterns — like discovering that your best coding sessions happen after outdoor time, backed by actual data.

Great writeup and cool prototype!

Julien Avezou

Thanks Vic! I agree. Productivity patterns and insights backed by real data could make this project worth pursuing further.

Vic Chen

For sure! I think the real unlock would be correlating coding patterns with cognitive load data — like eye tracking + keystroke dynamics. If the glasses could detect when you're in flow state vs. struggling, that alone would be a goldmine for developer tooling research. Would love to see where you take this.

Julien Avezou

Great idea! Will dive deeper into this. Thanks for the extra inspiration!

Vic Chen

Appreciate the kind words! Would love to see what you build with the glasses + biometric angle. The intersection of wearable context and dev tooling feels like it's barely been explored — most AI coding tools still treat developers as disembodied typing machines. If you end up tracking any patterns, definitely share the data. Always curious to see what emerges from real experiments vs theory.

Matthew Hou

The ergonomics problem you hit is real and I don't think it gets talked about enough.

The friction of switching context — look at screen, look at code, think, look at screen again — adds up in a way that's hard to measure but definitely affects flow. It's the same reason external monitors improve productivity for most people. The question is whether the AI glasses reduce that friction or add a different kind of it.

What was the latency like between asking a question and getting a useful response? That's probably the bottleneck that determines whether the form factor actually works.

Julien Avezou

The latency is similar to a phone voice assistant: in my case, around 200 ms on average for basic responses.

Zack Grooms

I would use the glasses to analyze and summarize research papers and documentation for a day or two. Thank you for the interesting read.

Julien Avezou

That would be a good use case, leveraging the seamless image capture of long-format papers while reading.