Forget the chatbot wars. The biggest AI story developing right now isn’t about benchmark scores — it’s about where AI is going to live.
Within five days, we learned that Apple is fast-tracking three AI wearables simultaneously, OpenAI has over 200 people building hardware with Jony Ive, and Meta is expanding its already-successful Ray-Ban smart glasses globally. Snap is even spinning its AR glasses into a standalone company.
This isn’t coincidence. This is the starting gun of the AI hardware race.
Apple: Glasses, Pendants, and Camera AirPods
Bloomberg’s Mark Gurman reported on February 17 that Apple is “accelerating” development of three AI wearable devices at once — unusual for a company famous for doing one thing at a time.
The lineup:
- Smart glasses (codenamed N50): High-res camera, production potentially starting December 2026, public release targeting 2027. Described as “more upscale and feature-rich” than the other two.
- An AI pendant: An AirTag-sized device with cameras that pins to your shirt or hangs from your neck. Apple’s answer to the Humane AI Pin — except, presumably, one that works.
- Camera-equipped AirPods: Expected late 2026, making them potentially the first to ship.
The thread connecting all three? Visual Intelligence — the feature Tim Cook is personally championing as Apple’s defining AI capability. All three devices connect to the iPhone and lean on the long-awaited Siri overhaul.
The strategy is clear: Apple doesn’t want you pulling out your phone to interact with AI. It wants AI to see what you see, hear what you hear, and respond through devices you’re already wearing.
OpenAI and Jony Ive’s $200 Smart Speaker
OpenAI isn’t content to remain just a software company. According to The Information, the company has more than 200 employees building AI hardware with Jony Ive — the designer behind the iPhone, iMac, and iPad.
First product: a smart speaker with a built-in camera, priced between $200 and $300, targeting early 2027. This isn’t your parents’ Alexa. The device will reportedly:
- Use its camera to understand its surroundings — identifying objects and room context
- Feature Face ID-like authentication for purchases
- Listen to and interpret nearby conversations (yes, that’s as privacy-concerning as it sounds)
- Run on ChatGPT’s conversational AI
Beyond the speaker, OpenAI is exploring smart glasses and a smart lamp. Some hardware could debut as early as 2026.
The irony writes itself: Jony Ive left Apple, and now he’s building competing AI hardware for Apple’s biggest AI rival.
Meta: The Quiet Winner (So Far)
While Apple’s and OpenAI’s devices are still in development, Meta actually has AI wearables in people’s hands. The Ray-Ban Meta smart glasses are arguably the most successful AI hardware product to date.
This week brought two new developments: Meta announced plans to add facial recognition to Ray-Ban glasses later in 2026, and reports emerged of a revived smartwatch project codenamed “Malibu 2” with neural interface capabilities.
Meta’s glasses are expanding to Canada, France, Italy, and the UK in early 2026. The $799 price point positions them as premium but attainable.
Meta understood something the failed AI hardware startups didn’t: people already wear glasses. They don’t want a new device category. They want their existing devices to be smarter.
The AI Hardware Graveyard
The graveyard should haunt every company entering this space:
- Humane AI Pin: Raised $230 million, launched to enormous hype, and quickly became a punchline. Poor battery life, unreliable AI, and the fundamental problem of being yet another device.
- Rabbit R1: Cute design, terrible execution. A repackaged Android app in a $199 orange box.
- Friend: An AI companion pendant criticized from every angle.
The lesson was supposedly clear: “AI doesn’t need a new gadget.” And yet Apple is literally building an AI pendant. So what changed?
Two things. First, the AI itself has gotten dramatically better. ChatGPT, Claude, and Gemini can hold real conversations, understand visual context, and perform useful tasks — the software finally deserves dedicated hardware. Second, these new devices aren’t trying to replace your phone. They’re companions to it, handling the quick-glance, ambient-awareness tasks where pulling out a phone interrupts the moment.
The Privacy Elephant
Every single one of these devices involves cameras pointed at the world, microphones listening to conversations, and AI processing what it sees and hears.
OpenAI’s speaker will monitor nearby conversations. Apple’s pendant pins cameras to your chest. Meta’s glasses have facial recognition.
We’re sleepwalking into ubiquitous AI surveillance, and these companies are betting convenience outweighs the creep factor. They might be right — AirPods normalized body-mounted microphones, smartphones normalized constant location tracking.
But there’s a meaningful difference between a device you actively use and one that passively observes. When your glasses identify strangers, your speaker eavesdrops on dinner, and your pendant records everything you see, the privacy implications aren’t theoretical. They’re architectural.
What This Means
The hardware race signals where the industry thinks it’s heading: ambient AI. Not AI you go to, but AI that’s always there — watching, listening, ready.
The implications are massive:
- Developers: Building for cameras, microphones, and constrained compute — not just text boxes
- Consumers: Expect AI subscriptions bundled with hardware, like cellular plans
- Regulators: The EU’s AI Act needs to grapple with always-on AI devices in public spaces
- Investors: The real winner might not be obvious yet — the iPod wasn’t the first MP3 player
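The developer constraint is easy to quantify with a back-of-envelope sketch. The figures below are illustrative assumptions, not specs of any announced device; the point is simply that weight memory alone pushes wearables toward small quantized models and phone tethering:

```python
# Back-of-envelope memory check for running a language model on a
# wearable. All numbers here are illustrative assumptions, not specs
# of any device mentioned in this article.

def model_memory_mb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight memory in MB for a quantized model."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1e6

# A 1B-parameter model at 4-bit quantization needs roughly 500 MB for
# weights alone -- workable on a phone, tight on a pendant or earbud.
for params in (1.0, 3.0, 7.0):
    print(f"{params}B @ 4-bit: ~{model_memory_mb(params, 4):.0f} MB")
```

This is one reason Apple’s three devices all tether to the iPhone: the heavy inference happens on the phone (or in the cloud), while the wearable handles capture and playback.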
The next 18 months will determine whether AI wearables are the next smartphone revolution or the next Google Glass. Smart money says somewhere in between — these won’t replace phones anytime soon, but the devices that nail “quick glance, instant answer” have a real shot at going mainstream.
The AI chatbot war was Chapter 1. The AI hardware race is Chapter 2. And it’s going to be a lot more interesting.