Smart Glasses Just Hit 20 Million Units. The Real Story Is the Wristband, Not the Display.
Everyone is staring at the screen on the lens. I keep looking at the band on the wrist.
Smart glasses are officially mainstream in 2026. According to Smart Analytics Global, sales are projected to jump from 6 million units in 2025 to 20 million units in 2026, with revenue climbing from $1.2 billion to $5.6 billion over the same period. IDC forecasts global XR device shipments will grow 33.5 percent in 2026, with most of that growth coming from glasses, not headsets. Meta sold out its $799 Ray-Ban Display launch so completely that it had to postpone the international rollout across the UK, France, Italy, and Canada. Apple and Samsung are preparing to enter the category. Every analyst on the planet is calling it the iPhone moment for wearables. Here is what they are mostly missing. The new product is not the screen on the lens. The new product is the Neural Band wristband that ships in the same box.
"Meta Neural Band and its EMG technology could be the best way to control any device."
, Alex Himel, Meta VP of Wearables, January 2026
What actually shipped on September 30, 2025
Mark Zuckerberg unveiled the Meta Ray-Ban Display and the Meta Neural Band at Connect 2025 on September 17. The product hit shelves at Best Buy, LensCrafters, Sunglass Hut, and Ray-Ban stores on September 30. The headline spec is a full-color, semi-transparent display in one lens that turns on when you need it and disappears when you do not. The headline price is $799. The headline battery life is 6 hours of mixed use, with 30 hours total if you carry the collapsible charging case.
The on-stage demo did not go cleanly. The tech glitched a few times during Zuckerberg's presentation. He kept going. By December, the supply situation got so tight that Meta paused the global rollout. CNBC reported on January 6, 2026 that the company explicitly cited inventory limits and unprecedented US demand. Wait times stretched well into 2026 for anyone trying to buy a pair.
I have been around enough product launches to recognize the difference between a manufactured shortage and a real one. This one looks real. Meta's partnership with EssilorLuxottica gave it 72.2 percent of the XR market in 2025. Meta finally cracked the form factor. People are wearing these in public without feeling like they are cosplaying as a Black Mirror extra. That is the part that kept failing for the last 13 years.
Why the wristband matters more than the lens
The display is incremental. We have had small displays embedded in optics for years. What is actually new is the Neural Band. It is a wrist-worn surface electromyography (sEMG) sensor that reads the tiny electrical signals your muscles produce when you intend to move your fingers, even when those movements are sub-perceptible to anyone watching. You can scroll, click, and in the near future write full sentences without anyone seeing your hand move. Meta says the underlying technology is the product of years of research with nearly 200,000 participants.
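Meta has not published how the Neural Band's decoder actually works, so treat the following as a sketch of the generic sEMG recipe rather than Meta's pipeline: sample a handful of electrode channels, window the signal, extract amplitude features, and classify. The channel count, threshold, and decision rule below are all invented for illustration.

```typescript
// Generic sEMG gesture decoding, sketched in TypeScript. All names, channel
// counts, and thresholds here are hypothetical; Meta has not published the
// Neural Band's actual pipeline or API.

type Gesture = "pinch" | "scroll" | "rest";

// One window of raw samples: channels x samples (assume 8 electrodes).
type EmgWindow = number[][];

// Root-mean-square amplitude per channel, the classic sEMG feature.
function rmsFeatures(samples: EmgWindow): number[] {
  return samples.map((channel) => {
    const sumSq = channel.reduce((acc, v) => acc + v * v, 0);
    return Math.sqrt(sumSq / channel.length);
  });
}

// Stand-in for a trained model. A real decoder would be a network trained on
// something like the ~200,000-participant corpus Meta describes.
function classify(features: number[]): Gesture {
  const energy = features.reduce((a, b) => a + b, 0);
  if (energy < 0.05) return "rest"; // below noise floor: no intent detected
  return features[0] > features[4] ? "pinch" : "scroll"; // toy decision rule
}

// The design-relevant point: intent is decoded from muscle activation, so a
// gesture can register before any visibly perceptible finger movement.
function decode(samples: EmgWindow): Gesture {
  return classify(rmsFeatures(samples));
}
```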
I have worked on enough enterprise SaaS products to know that the input device is the part that locks in the next decade of interaction patterns. The mouse defined desktop UX. The touchscreen defined mobile UX. Voice has been trying and largely failing to define ambient UX since Alexa launched in 2014. The Neural Band is the first input device in 15 years that crosses three thresholds at once.
It is silent. It is invisible. It is always present on your body.
If you are designing products in 2026 and you are not thinking about how a user could control them with sub-perceptible finger movements while their hand is at their side, you are designing for the past.
What the Neural Band changes for product design
I spent the last few weeks thinking about this with my team at Tkxel. Our enterprise SaaS clients are already asking what AI native interaction looks like in 2027. Glasses plus EMG is one of the answers. Here is the practitioner take, not the press release version.
- Invisible interaction: Gestures the wearer makes are not visible to bystanders. This is huge for accessibility, for professionals taking notes in meetings, and for anyone who hates the social tax of pulling out a phone. It is also a problem, because you cannot see what someone is doing.
- Always available input: The wristband stays on. There is no putting it down. Compare that to a phone, which spends most of its day in a pocket. The available surface for ambient computing just expanded.
- Eyes free: You do not have to look at a screen to interact. Combine that with on lens overlays and you are designing for a user whose attention is on the world, not on the device.
- Cross device potential: Alex Himel told the press in January that the Neural Band could become the universal input layer for any device. If that lands, your software has to assume input can come from a wrist 3 feet away from any screen.
- Discoverability collapse: This is the hard one. Gestures have no visible affordances. There is no button to point to. Onboarding becomes the entire product. If users do not learn the gestures in the first 5 minutes, they never use them. A sketch of what a gesture layer with built-in hints could look like follows this list.
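Here is that sketch. Everything in it is a hypothetical design, not a Meta API. The point it makes: a wrist-borne input layer forces focus to become an explicit cross-device concept, and discoverability hints have to live in the input layer itself because there is no visible UI to stumble onto.

```typescript
// Hypothetical device-agnostic gesture layer. The event shape, router, and
// hint behavior are assumptions, not anything Meta has shipped.

type BandEvent = { kind: "click" } | { kind: "scroll"; delta: number };

interface GestureTarget {
  id: string;
  handle(event: BandEvent): void;
}

class GestureRouter {
  private targets = new Map<string, GestureTarget>();
  private focused: string | null = null;
  private seen = new Set<string>();

  register(target: GestureTarget): void {
    this.targets.set(target.id, target);
  }

  // Input comes from the wrist, not from any one screen, so focus is an
  // explicit, cross-device concept rather than "whichever window I touched".
  focus(id: string): void {
    this.focused = id;
  }

  dispatch(event: BandEvent): void {
    // Discoverability hook: the first time a gesture fires, surface a
    // one-line on-lens hint, because there is no affordance to point to.
    if (!this.seen.has(event.kind)) {
      this.seen.add(event.kind);
      console.log(`hint: "${event.kind}" controls whatever has focus`);
    }
    if (this.focused !== null) this.targets.get(this.focused)?.handle(event);
  }
}

// Usage: the same wrist drives whichever surface currently has focus.
const router = new GestureRouter();
router.register({ id: "glasses", handle: (e) => console.log("glasses:", e.kind) });
router.register({ id: "tv", handle: (e) => console.log("tv:", e.kind) });
router.focus("tv");
router.dispatch({ kind: "click" }); // hint fires once, then the TV clicks
```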
The bystander problem nobody is solving
Privacy is the side of this story that the launch reviews softened. The Electronic Frontier Foundation published a piece in March 2026 titled "Think Twice Before Buying or Using Meta's Ray-Bans." Their case is simple. Smart glasses collapse the distinction between social interaction and data capture. The recording indicator is a small white LED that may or may not be visible in low light. A BBC investigation found that men were already using earlier Ray-Ban Meta models to secretly film women for content on TikTok and Instagram.
Through a design lens, this is a category problem, not a Meta problem. Any glasses with a camera on the front have the same issue. The traditional UX answer is consent friction. Make recording obvious. Add audible cues. Add light patterns that bystanders can read across a room. None of this exists in a satisfying form. Apple is reportedly working on a directional light pattern for its own glasses to address this, according to Digital Trends, but nothing has shipped.
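Nothing like this ships today, but the escalation logic is easy to sketch, which makes its absence more frustrating. The lux threshold and cue names below are invented for illustration; the point is that bystander cues should escalate when the environment makes the default cue unreadable.

```typescript
// Hypothetical consent-friction policy for camera capture. No current
// glasses implement this; it is a sketch of "make recording obvious".

type CaptureState = "idle" | "recording";

interface BystanderCue {
  led: "off" | "solid" | "pulsing"; // readable across a room, not a pinhole
  chime: boolean;                   // audible start/stop cue for bystanders
  onLensBanner: boolean;            // remind the wearer they are capturing
}

function cuesFor(state: CaptureState, ambientLux: number): BystanderCue {
  if (state === "idle") {
    return { led: "off", chime: false, onLensBanner: false };
  }
  return {
    // In low light a small white LED disappears, so escalate to pulsing.
    led: ambientLux < 50 ? "pulsing" : "solid",
    chime: true,
    onLensBanner: true,
  };
}
```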
There is a second problem. The Neural Band reads your intent. Not your gestures. Your intent to gesture. That is a richer signal than anything we have ever shipped to consumers. Where does that data go? Who trains models on it? What does it mean when a company knows what you almost typed? I do not have answers. I have not seen anyone in the industry publish answers either.
The European story is a design story
Meta's European launch is stuck on a regulatory problem that is also a design problem. The EU is requiring replaceable batteries in consumer electronics. The Ray-Ban Display has a non-removable lithium-ion battery that bricks the entire device when it fails. This is a major e-waste issue. Bloomberg noted that competing products like Alibaba's Quark AI S1 and Inmo Go 3 already use replaceable battery designs.
The reason Meta has not redesigned for Europe is simple. Replaceable batteries add weight. Weight kills wearables. The product team is stuck choosing between regulatory compliance and the design lightness that made these glasses sell in the first place. This is exactly the kind of constraint that traditional consumer electronics companies hit, and exactly where they tend to fail. A truly modular wearable, where the optical engine, the battery, and the frame are independent, is a hard industrial design problem that Meta has not committed to.
What designers should actually do about this
I write a lot about how product teams should prepare for AI native shifts. I published a piece on Medium in April 2026 called "How to Build AI-Native Experiences: 14 Mindset Shifts for Product Teams." Most of that thinking applies here, but glasses plus EMG add three new requirements on top.
First, start designing for ambient screens. The display in the Ray-Ban is glanceable, not readable. You cannot put a 12-row data table in front of someone's eye. You can put a one-line answer there. If your SaaS product surfaces information, ask which one line each user role wants in their field of vision while doing other work. That is a real design exercise, not a thought experiment.
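Here is what the exercise looks like in code, for a hypothetical SaaS dashboard. The roles, metrics, and the 40-character budget are my assumptions, not anything Meta specifies; the useful part is the constraint, which forces a per-role editorial decision.

```typescript
// One glanceable line per role, for a hypothetical SaaS product. The budget
// and the role-to-line mapping are invented for illustration.

const GLANCE_BUDGET = 40; // rough character budget for an on-lens line

interface Metrics {
  openIncidents: number;
  mrr: number;
  trialSignupsToday: number;
}

type Role = "support" | "founder" | "growth";

function glanceLine(role: Role, m: Metrics): string {
  const line =
    role === "support" ? `${m.openIncidents} open incidents` :
    role === "founder" ? `MRR $${m.mrr.toLocaleString()}` :
    `${m.trialSignupsToday} trial signups today`;
  // Enforce glanceability: if it does not fit the budget, it is not a glance.
  return line.length <= GLANCE_BUDGET ? line : line.slice(0, GLANCE_BUDGET - 1) + "…";
}

console.log(glanceLine("support", { openIncidents: 3, mrr: 84200, trialSignupsToday: 12 }));
// "3 open incidents"
```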
Second, treat input as bandwidth-limited. The Neural Band can do clicks, scrolls, and pinches reliably. Handwriting is coming, per Meta's CES 2026 announcement. Voice still works. None of these is a full keyboard. If your product requires typing 3 paragraphs, it is not glasses-ready. If your product requires confirming a one-word selection, it might be.
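One way to make that test concrete is to audit every flow against the gesture vocabulary that exists today. The audit function below is a sketch; the readiness table just encodes the claims in the paragraph above.

```typescript
// Auditing flows against today's input vocabulary. The readiness table
// encodes the article's claims; the function names are invented.

type InputNeed = "select" | "scroll" | "confirm" | "shortText" | "longText";

const GLASSES_READY: Record<InputNeed, boolean> = {
  select: true,    // pinch
  scroll: true,    // band scroll gesture
  confirm: true,   // one-word choice via click
  shortText: true, // voice today, EMG handwriting once it ships
  longText: false, // no full keyboard: redesign or keep it on phone/desktop
};

function auditFlow(name: string, needs: InputNeed[]): string {
  const blockers = needs.filter((n) => !GLASSES_READY[n]);
  return blockers.length === 0
    ? `${name}: glasses-ready`
    : `${name}: blocked by ${blockers.join(", ")}`;
}

console.log(auditFlow("approve expense", ["select", "confirm"])); // glasses-ready
console.log(auditFlow("write weekly report", ["longText"]));      // blocked
```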
Third, build for invisible authentication. Phones used face and fingerprint. Glasses cannot do either reliably. The Neural Band's EMG signature might end up as the next biometric. Design auth flows that can read silent muscle micro-patterns instead of asking users to look at a sensor. We covered some of the early thinking on this at reloadux.com/blog in our piece on AI readiness frameworks.
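No one ships EMG-signature authentication today, so this is the most speculative sketch in the piece. The shape I would expect: enroll an embedding of the wearer's signal, compare live signatures against it, and fall back to an explicit unlock when the match is weak. Every number and name here, and the embedding idea itself, are assumptions.

```typescript
// Speculative EMG-as-biometric sketch. Thresholds, embeddings, and the
// enrollment flow are all invented; nothing like this is shipping.

type Embedding = number[]; // output of a hypothetical EMG encoder model

function cosineSimilarity(a: Embedding, b: Embedding): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const MATCH_THRESHOLD = 0.92; // invented; a real system would tune FAR/FRR

// The wearer never looks at a sensor: the band compares a live signature to
// the enrolled one and falls back to an explicit gesture PIN when unsure.
function authenticate(enrolled: Embedding, live: Embedding): "ok" | "fallback" {
  return cosineSimilarity(enrolled, live) >= MATCH_THRESHOLD ? "ok" : "fallback";
}
```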
The honest take from someone who has shipped 42 products
I am not going to tell you smart glasses are about to replace your laptop. They are not. I am also not going to tell you they are a fad. Anyone who says either has not actually used a Ray-Ban Display for a full week.
The truth is that this is the first wearable category that crossed the social embarrassment line. People are wearing them. People are using them. The 20-million-unit forecast for 2026 is conservative if Apple and Samsung ship credible competitors, which both will. The Neural Band is the part that, in 5 years, will look like the obvious historical inflection point. We will look back and say the touchscreen era ended in 2026, even though we kept using touchscreens for another decade.
If you are a designer, start prototyping for glanceable displays and silent gestures now. Most teams are going to wake up to this in 2027 and discover their product cannot fit on the new surface. The teams that started thinking about it in 2025 are already a year ahead.
What do you think? Are you actually using smart glasses, or just reading about them? Drop your thoughts in the comments. I would love to hear how you are seeing this play out in your own work.
Sources:
1. Meta Newsroom, "Meta Ray-Ban Display: AI Glasses With an EMG Wristband," September 2025, https://about.fb.com/news/2025/09/meta-ray-ban-display-ai-glasses-emg-wristband/
2. CNBC, "Meta delays Ray-Ban Display glasses global rollout," January 6, 2026, https://www.cnbc.com/2026/01/06/meta-ray-ban-display-ai-glasses-pause.html
3. Smart Analytics Global via BusinessWire, "AI Smart Glasses to Quadruple Revenue in 2026," January 2026, https://www.businesswire.com/news/home/20260113778367/en/
4. IDC, "XR Market Expands 44.4% in 2025 as Smart Glasses Take Center Stage," 2026, https://www.idc.com/promo/arvr/
5. Electronic Frontier Foundation, "Think Twice Before Buying or Using Meta's Ray-Bans," March 2026, https://www.eff.org/deeplinks/2026/03/think-twice-buying-or-using-metas-ray-bans
6. Engadget, "Meta's EMG wristband is moving beyond its AR glasses," 2026, https://www.engadget.com/wearables/metas-emg-wristband-is-moving-beyond-its-ar-glasses-120000503.html
7. Meta Quest Blog, "CES 2026: Meta Ray-Ban Display Teleprompter, Neural Handwriting," https://www.meta.com/blog/ces-2026-meta-ray-ban-display-teleprompter-emg-handwriting-garmin-unified-cabin-university-of-utah-tetraski/
8. The Economy, "Practicality vs. Regulation: Meta-Ray-Ban Smart Glasses Face Design Overhaul," March 2026, https://economy.ac/news/2026/03/202603288760