Apple Is Testing Four Smart Glasses Designs — AI Wearables Race Heats Up in 2026
Apple is reportedly testing four distinct designs for its upcoming smart glasses, TechCrunch reported on April 12, 2026. The project marks a strategic step back from Apple's original mixed-reality ambitions, with AI-powered glasses aimed directly at the mass-market success of Meta's Ray-Ban line. Below is a full breakdown of what is known, the competitive landscape, and what AI wearables mean for how people interact with AI agents.
TL;DR
- Apple is testing four smart glasses designs — a step back from ambitious mixed-reality hardware plans
- The target is Meta Ray-Ban's mass-market success: lightweight AI glasses for everyday wear
- Expected features: Siri 2 (Gemini-powered), visual AI search, live translation, health monitoring
- Launch expected at WWDC 2026 or fall 2026 event — retail availability likely 2027
- AI wearables shift how people interact with agents: from screen-first to ambient and always-on
Apple's Pivot to Smart Glasses: What We Know
Apple is testing four distinct designs for a smart glasses product, TechCrunch's Anthony Ha reported on April 12, 2026. The project represents a deliberate strategic simplification: Apple once planned an overlapping family of mixed-reality and augmented-reality devices. The smart glasses initiative is a focused bet on AI-powered eyewear as the mass-market wearable category to win.
Apple Vision Pro — launched in February 2024 at $3,499 — proved the company can build extraordinary spatial computing hardware. It did not prove that consumers want to strap ski goggles to their face for daily use. Sales have been significant in enterprise and creative markets but have not broken into mass adoption.
Smart glasses solve the mass-market problem: they look like regular eyewear, weigh under 50 grams, and do not isolate the wearer from the physical world. They are AI-first devices, not display-first devices.
Why Apple Needs to Win the Glasses Category
Meta Ray-Ban smart glasses have sold over 7 million units since 2024. The product, which TechCrunch's “Apple's accidental moat” piece examined earlier this month, is simple: glasses with a camera, a speaker, and Meta AI. No display. No spatial computing. Just AI you can access by saying “Hey Meta.”
The success of Meta Ray-Ban has reframed the competitive landscape. The question is no longer “which company builds the best mixed-reality headset?” It is “which AI assistant do you wear?”
Apple cannot afford to cede that category to Meta. The glasses frame — literally — where AI enters daily life. Whoever owns the form factor that people wear all day owns the most ambient, highest-frequency AI interaction surface.
Apple's answer is Siri 2: a fully rebuilt AI assistant powered in part by Google Gemini (Project Campos), announced at WWDC 2025 and rolling out across iOS 26. The smart glasses bring Siri 2 off the phone screen and onto your face.
Use AI Now — Before the Glasses Arrive
Smart glasses are coming. In the meantime, Happycapy gives you agentic AI that handles tasks, research, and workflows across all your devices today. Free to start.
Try Happycapy Free →
AI Smart Glasses: Apple vs. Meta vs. Others (2026)
| Feature | Apple Smart Glasses (rumored) | Meta Ray-Ban (2025 gen) | Nothing Smart Glasses (2027) |
|---|---|---|---|
| AI assistant | Siri 2 (Gemini-powered) | Meta AI (Llama) | Nothing AI |
| Display | Likely none (audio-first) | None | Micro-display (rumored) |
| Camera | Yes — visual AI search | Yes — 12MP | Yes |
| Live translation | Yes (iOS 26 feature) | Yes | Unconfirmed |
| Ecosystem | iPhone, iPad, Mac | Facebook, Instagram, WhatsApp | Android-first |
| Expected price | $299–$399 (estimated) | $299–$379 | TBD |
| Availability | 2027 (projected) | Available now | 2027 |
All Apple specs are based on industry analyst reports and Apple patent filings as of April 2026. Official specs not yet confirmed.
What AI Features Will Apple's Glasses Have?
Apple has not officially disclosed feature details. But between its patent activity, Siri 2 capabilities, and the competitive requirements of taking on Meta Ray-Ban, the feature set is largely predictable:
- Visual AI search — point the camera at anything: a restaurant menu, a landmark, a product label, a person's face (with permission) — and Siri 2 identifies and explains it
- Live translation — real-time in-ear translation of conversations in 40+ languages, powered by the Google Translate integration Apple launched in iOS 26.4
- Contextual reminders — the glasses know where you are, what you are looking at, and who you are with, enabling hyper-contextual AI assistance
- Health monitoring — heart rate, body temperature, and posture tracking built into the frame
- Hands-free AI workflows — voice-triggered Shortcuts, HomeKit control, calls, messages, and calendar management without touching your phone
The key architectural advantage Apple has over Meta is the iPhone. Apple's glasses do not need to be computationally powerful on their own — they offload heavy AI inference to the iPhone in your pocket, whose dedicated Neural Engine can run Gemini Nano on-device. The result is a lighter, cheaper glasses form factor that still delivers full Siri 2 capability.
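To make the offload idea concrete, here is a minimal, entirely hypothetical sketch (in Python, for readability) of how a glasses query might be routed between on-frame handling and phone offload. The fields, thresholds, and target names are invented for illustration and are not based on any Apple API.

```python
from dataclasses import dataclass

@dataclass
class Query:
    text: str
    needs_vision: bool = False      # camera-frame analysis is compute-heavy
    needs_live_audio: bool = False  # e.g. continuous translation streams

def route(query: Query) -> str:
    """Pick an execution target for a glasses query (illustrative only).

    The idea from the article: lightweight command handling can stay on
    the low-power glasses, while anything needing a larger model is
    offloaded to the paired phone's Neural Engine.
    """
    if query.needs_vision:
        return "phone"   # visual AI search needs the phone-side model
    if query.needs_live_audio:
        return "phone"   # live translation streams to the phone
    if len(query.text.split()) <= 4:
        return "glasses" # short voice commands handled locally
    return "phone"       # long-form queries offloaded by default

# e.g. route(Query("set a timer")) -> "glasses"
```

The design choice this sketches is the one the paragraph describes: the glasses act as a thin sensor-and-speaker endpoint, and the routing boundary (here a crude word count) is where battery weight gets traded against latency.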
What This Means for How People Interact With AI Agents
Smart glasses are not just a new hardware form factor. They represent a fundamental shift in how AI assistants enter human life — from intentional, screen-initiated interactions to ambient, always-available presence.
Today, using an AI assistant means unlocking your phone, opening an app, typing a prompt, and reading a response. That interaction model puts AI in the same mental bucket as email or social media — something you check. Smart glasses put AI in the same mental bucket as a conversation — something that is always there.
The shift has significant implications for AI usage patterns:
- AI query volume will rise sharply as friction drops — people will ask their AI agent things they currently would not bother unlocking a phone to ask
- Context richness increases — your AI assistant knows what you are seeing and hearing, not just what you type
- Workflow integration deepens — tasks you currently do on a computer (research, writing, scheduling) become conversational and hands-free
- The platform that powers your glasses AI becomes your most important AI relationship — which is why Apple, Meta, and Google are all racing to own this layer
For AI power users today, the practical implication is to invest in the AI infrastructure and habits that will transfer naturally to glasses: voice-first prompting, multi-step agentic workflows, persistent context and memory, and AI tools that operate across all your devices.
Frequently Asked Questions
What are Apple's smart glasses?
Apple is developing AI-powered smart glasses — lightweight eyewear with cameras, microphones, speakers, and on-device AI. Unlike Apple Vision Pro, the glasses are designed for everyday use and position Apple to compete with Meta Ray-Ban in the mass-market AI wearables category.
How many Apple smart glasses designs are being tested?
Four designs are being tested as of April 2026, per TechCrunch. Exploring multiple designs suggests Apple is still finalizing the form factor; a single design will be selected for mass production, likely announced at WWDC 2026 or a fall 2026 hardware event.
When will Apple smart glasses be available?
No official date is confirmed. Based on Apple's product cadence and the four-design testing phase, analysts expect announcement in mid-to-late 2026 and retail availability in 2027. Pre-orders could open as early as late 2026.
How do Apple's glasses compare to Meta Ray-Ban?
Meta Ray-Ban is available now at $299–$379. Apple's glasses are still in development but will integrate Siri 2, the iPhone ecosystem, and Apple Health. The core distinction: Meta's glasses are Meta AI-native and tied to Meta's apps, while Apple's will be iPhone- and Siri 2-native. Both are audio-first, camera-equipped AI wearables without a display.
Sources
- TechCrunch — Anthony Ha, “Apple reportedly testing four designs for upcoming smart glasses” (April 12, 2026)
- TechCrunch — “Apple's accidental moat: How the 'AI Loser' may end up winning” (April 2026)
- Hacker News — “Apple's accidental moat” front-page discussion, 325 points (April 13, 2026)
- Meta — Ray-Ban smart glasses sales figures and Meta AI integration (2025–2026)
- Apple — iOS 26.4 Google Translate live integration announcement (April 2026)
Build Your AI Stack Before the Glasses Arrive
The users who win with AI wearables will be the ones with strong AI habits today. Happycapy gives you agentic AI across all your devices now — at $17/mo Pro, less than ChatGPT Plus.
Start Free with Happycapy →