I was recently gifted a pair of AI glasses. This is not the kind of purchase I would make for myself. But since I now own a pair, I saw this as an ...
Cool experiment - but no, I wouldn't buy them :-)
(I remember when "VR" was cool, so these would have been "VR glasses" - but now it's got to be "AI" in/on everything, whether we like it or not)
True! AI replaced VR glasses ... haha
I am curious to know which AI wearable will break through, if any will. There are talks about the AI Pin.
The "AI Pin" - I wouldn't hold my breath:
"It received poor reviews for being unreliable and slow. "
"The Worst Product I've Ever Reviewed"
"The Humane AI Pin is no more"
"Humane's AI Pin is no more and owners are left with nothing"
Apparently it was an epic failure - it was a bold and innovative idea, but probably ahead of its time (read: the execution wasn't as great as the concept) ...
Oh wow, not great.
Interesting, probably ahead of its time as you say. Apple is reportedly working on its own version of the Pin, with a potential release in 2027.
Intriguing post Julien!
The uses you've described for these glasses are something I think I would enjoy.
Like you, I wouldn't have bought them myself, but since someone gifted them to you, you made them useful in your own way. I wouldn't mind going down that road.
(/^-^(^ ^*)/
Thanks Aryan!
Indeed, I would definitely not have bought them myself but I want to make use of them now haha
honestly the thing i'd worry about is dependency creep. the friction of switching to a tab to ask AI is just enough to make you think "can i solve this myself first". kill that friction and you might get faster, but you're also outsourcing more of the actual problem-solving. did you notice yourself reaching for it on things you'd normally just push through?
That is a really good point. It's important to avoid outsourcing your learning too much. I haven't tested this enough to reach any concrete observations yet. I will keep experimenting and let you know how it goes.
Wow! AI glasses look so fun. But I’m already worried everyone will use them to cheat on tests in the future. 🤓
hehe they definitely should not be allowed in exam rooms IMO
Love the pivot from "AI coding assistant" to "life commits" — that's a genuinely creative reframe.
The friction reduction point is key. I've been thinking about similar patterns in developer tooling: the best tools aren't the ones that do the most, they're the ones that remove friction from things you already want to do.
The emotional tracking + environment correlation idea is fascinating. Would be cool to see this evolve into something that maps productivity patterns — like discovering that your best coding sessions happen after outdoor time, backed by actual data.
Great writeup and cool prototype!
Thanks Vic! I agree. Uncovering productivity patterns and insights backed by real data would be an interesting direction for taking this project further.
For sure! I think the real unlock would be correlating coding patterns with cognitive load data — like eye tracking + keystroke dynamics. If the glasses could detect when you're in flow state vs. struggling, that alone would be a goldmine for developer tooling research. Would love to see where you take this.
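For flavor, here's a toy sketch of the keystroke-dynamics side of that idea: a crude "flow" heuristic that scores steady, fast typing higher than erratic typing with long pauses. This is an invented heuristic for illustration, not a validated cognitive-load model, and the function name is my own.

```python
from statistics import mean, pstdev

def flow_score(key_timestamps: list[float]) -> float:
    """Toy flow heuristic: steady typing (low inter-key interval
    variance relative to the mean) scores near 1.0; erratic typing
    with long pauses scores lower. Purely illustrative."""
    intervals = [b - a for a, b in zip(key_timestamps, key_timestamps[1:])]
    if not intervals:
        return 0.0
    # Coefficient of variation of inter-key intervals as "burstiness".
    burstiness = pstdev(intervals) / (mean(intervals) or 1.0)
    return 1.0 / (1.0 + burstiness)

# Steady typing should score higher than erratic typing with pauses.
steady = [i * 0.2 for i in range(20)]
erratic = [0.0, 0.2, 0.3, 3.5, 3.6, 8.0, 8.1, 8.3, 15.0]
print(flow_score(steady), flow_score(erratic))
```

Real research would combine something like this with eye tracking and validate it against self-reported state, but even a heuristic this simple could flag candidate flow windows to correlate with other signals.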
Great idea! Will dive deeper into this. Thanks for the extra inspiration!
Appreciate the kind words! Would love to see what you build with the glasses + biometric angle. The intersection of wearable context and dev tooling feels like it's barely been explored — most AI coding tools still treat developers as disembodied typing machines. If you end up tracking any patterns, definitely share the data. Always curious to see what emerges from real experiments vs theory.
The ergonomics problem you hit is real and I don't think it gets talked about enough.
The friction of switching context — look at screen, look at code, think, look at screen again — adds up in a way that's hard to measure but definitely affects flow. It's the same reason external monitors improve productivity for most people. The question is whether the AI glasses reduce that friction or add a different kind of it.
What was the latency like between asking a question and getting a useful response? That's probably the bottleneck that determines whether the form factor actually works.
The latency is similar to a phone voice assistant's. In my case it averaged around 200 ms for basic responses.
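For anyone wanting to check this on their own device, a round trip like that can be timed with a few lines. The `ask` callable here is a hypothetical stand-in for whatever function talks to the assistant.

```python
import time

def measure_latency_ms(ask, prompt: str) -> float:
    """Time a single question -> response round trip in milliseconds.
    `ask` is a placeholder for the assistant call (hypothetical)."""
    start = time.perf_counter()
    ask(prompt)
    return (time.perf_counter() - start) * 1000

# Demo with a stand-in assistant that answers instantly:
latency = measure_latency_ms(lambda p: f"echo: {p}", "what time is it?")
print(f"{latency:.1f} ms")
```

Averaging many such measurements (and separating network time from on-device processing) would give a fairer picture than a single run.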
I would use the glasses to analyze and summarize research papers and documentation for a day or two. Thank you for the interesting read.
That would be a good use case: leveraging the seamless image capture to digest long-format papers while you read.