Forget the Vision Pro. Apple’s next power move isn’t about strapping a screen to your face — it’s about putting AI eyes on everything else you’re already wearing.

A wave of reports led by Bloomberg’s Mark Gurman has confirmed what the rumor mill has been grinding toward for months: Apple is aggressively developing three AI-powered wearable devices simultaneously. Smart glasses. An AI pendant. Camera-equipped AirPods. And over the weekend, Tim Cook signaled that Visual Intelligence — the ability for AI to see and interpret the real world around you — is the thread that stitches all three together.

This isn’t a side project. It’s Apple declaring war on Meta, which has quietly turned its Ray-Ban smart glasses into one of the most unexpectedly successful product launches in recent memory.

Three Devices, One Idea: Give Siri Eyes

Every device in Apple’s wearable trifecta serves the same purpose — making AI ambient instead of something you have to pull out of your pocket.

Smart Glasses (Codename N50)

The headliner. Apple’s glasses are the most ambitious of the three:

  • Dual cameras — one high-res for photos and video, one feeding environmental context to Siri
  • No display in the lens — Apple wants these to look like actual glasses, not a prop from a sci-fi B-movie
  • Voice-first interface — ask about what you’re seeing, get directions, translate signs on the fly
  • In-house design — premium materials, multiple sizes and colors, no eyewear brand partnership

Working prototypes already exist. Production could start as early as December 2026, with a public launch targeted for 2027.

The AI Pendant

This one’s stranger. Think AirTag-sized, but with a camera and microphone. Clips to your shirt or hangs from a necklace. Apple employees internally call it the “eyes and ears” of the iPhone.

If that sounds like the Humane AI Pin that crashed and burned — you’re right, it does. The difference is Apple has an actual ecosystem to plug it into. Whether that’s enough to make the form factor work remains a very open question. This one’s still in early stages and could get axed.

Camera AirPods

The most mature project of the three, and potentially the first to ship — late 2026 is the current target. Low-resolution cameras designed for AI context, not photography. The pitch: Siri can see what’s around you without you lifting a finger or wearing anything new.

Why Visual Intelligence Changes the Calculus

All three devices orbit the same star: Apple’s Visual Intelligence feature.

If you’ve used an iPhone 16 Pro, you’ve seen the early version. Point your camera at a restaurant, get reviews. Point it at text in another language, get a translation. Point it at an event poster, add it to your calendar. It works — but it requires you to physically hold up your phone like a tourist photographing a menu.

Wearables kill that friction. Smart glasses make Visual Intelligence ambient. You look at something and ask. The pendant captures context from your chest. AirPods do it from your ears. In Apple’s vision, your AI assistant should always be able to see what you see, passively, without you doing anything deliberate.

Cook’s weekend comments confirm this is the strategy. Not better chatbots. Not faster language models. AI that can see.

Apple Is Playing Catch-Up — And Knows It

Here’s the uncomfortable part for Cupertino: Meta got there first.

Ray-Ban Meta smart glasses have sold more than 2 million units since their September 2023 debut, and EssilorLuxottica says revenue from the line more than tripled year over year in the first half of 2025. The company is scaling production to 10 million units per year by the end of 2026. Zuckerberg has gone full evangelist, claiming people without AI glasses will face “a pretty significant cognitive disadvantage.”

Snap is spinning its AR Specs into a standalone company. The field is getting crowded fast.

Apple’s counter-bet is classic Apple: ship later, ship better. The dual-camera system with environmental sensing goes beyond Meta’s current offering. Deep integration with a revamped Siri — reportedly powered by Google-developed AI models for chatbot functionality in iOS 27 — could create experiences Meta can’t match.

But “could” is doing a lot of heavy lifting there.

The Privacy Problem Apple Can’t Hand-Wave Away

This is where Apple’s brand identity runs headfirst into its product roadmap.

“What happens on your iPhone stays on your iPhone” was the line. Now Apple is building three devices with cameras designed to be worn all day, continuously feeding visual data to AI systems. The pendant’s camera is described as always-on. The glasses have environmental sensing. That’s not just your data anymore — it’s everyone in the coffee shop, every person on the sidewalk, every document on your colleague’s desk.

Apple will lean on on-device processing. It will emphasize that the cameras are low-resolution, designed for context, not capture. But the gap between “camera for AI context” and “camera that records everything around you” is paper-thin, and regulators are going to have questions.

New legislation governing camera-equipped wearables in public spaces isn’t a matter of if, but when. Apple’s privacy reputation might buy them time. It won’t buy them immunity.

What This Actually Means

If Apple ships even one of these successfully, it normalizes ambient AI in a way Meta hasn’t quite managed. An installed base of more than two billion active Apple devices creates a distribution channel that can turn a niche product into a mass-market one overnight.

For developers, the shift from screen-first to voice-first, camera-first interfaces means rethinking everything. Apps designed for glass screens don’t translate to smart glasses without fundamental redesign.

As for the AI wearable graveyard — Google Glass, Humane AI Pin, Rabbit R1 — Apple’s entry doesn’t guarantee the category finally works. But it raises the stakes dramatically. If Apple can’t make AI wearables mainstream, probably nobody can.

The Clock Is Ticking

Three devices. Three form factors. Three price points. One bet: that the future of AI isn’t typing into a chat box, but living your life while an AI watches and helps.

Camera AirPods potentially by late 2026. Glasses production starting December 2026. Public launch in 2027.

The question isn’t whether Apple can build these things — they clearly can. The question is whether the rest of us are ready to live in a world where everyone’s accessories have cameras pointed at everything, all the time.

And whether Siri — Siri — can actually be good enough to justify it.