Apple plans glasses for 2026 as part of AI push, nixes watch with camera

Apple is aiming to release smart glasses at the end of next year as part of a push into AI-enhanced gadgets, but it has shelved plans for a smartwatch that can analyze its surroundings with a built-in camera.
Company engineers are ramping up work on the glasses — a rival to Meta Platforms Inc.’s popular Ray-Bans — in a bid to meet the year-end 2026 target, according to people with knowledge of the matter. Apple will start producing large quantities of prototypes at the end of this year with overseas suppliers, said the people, who asked not to be identified because the products haven’t been announced.
The iPhone maker is looking to join the emerging trend of AI-powered devices — an area where it faces fresh competition. OpenAI said Wednesday that it was teaming up with former Apple chief design officer Jony Ive to introduce hardware products starting next year. The artificial intelligence pioneer is acquiring Ive’s secretive io startup, with the goal of releasing a family of AI devices.
Apple’s glasses would have cameras, microphones and speakers, allowing them to analyze the external world and take requests via the Siri voice assistant. They could also handle tasks such as phone calls, music playback, live translations and turn-by-turn directions. The approach would be similar to that of Meta’s current glasses and upcoming devices running Alphabet Inc.’s Android XR operating system.
Apple shares were up less than 1% to $202.46 as of 3:13 p.m. in New York on Thursday after spending most of the session in negative territory. The stock fell 19% this year through Wednesday’s close.
The glasses were originally dubbed N50 internally but now carry the umbrella descriptor N401 — the name of a broader project exploring the category. The company’s plans could always change, though, and it has canceled previous projects.
Apple’s ultimate goal is to release a pair of spectacles with augmented reality, which uses displays and other technology to superimpose digital content on views of the real world. But those remain years away.
A spokesperson for Cupertino, California-based Apple declined to comment.
Bloomberg News reported earlier this month that Apple is working on a dedicated chip for smart glasses, with a plan to begin mass-producing the component as early as next year.
One person with knowledge of the glasses said they will be similar to the Meta product but better made. The Meta device has become increasingly popular with consumers, and the company is planning a higher-end product for later this year. Those glasses will include a display, letting wearers see notifications, pictures and other relatively simple visuals. The social networking giant is planning its first pair with true AR in 2027, Bloomberg News has reported.
Apple has made other attempts to find a breakthrough AI product. Those attempts have included the idea of equipping its smartwatches and AirPods earbuds with cameras, allowing them to absorb more information about the world around them.
The company had actively been working to release a camera-equipped Apple Watch and Apple Watch Ultra by 2027, but that work was shut down this week, according to the people familiar with the situation. The company continues to work on AirPods with cameras.
Perfecting an AI device promises to be a high-stakes competition. Apple has struggled to add compelling AI features to its iPhones, iPads and Macs, and now it risks missing out on entire new product categories.
The Apple Intelligence platform, released last year, has lagged behind competitors’ technology. But the company is racing to enhance its capabilities. That includes a plan to open up Apple’s large language models — a key foundation of generative AI — to outside developers. That could bring a wave of AI-enhanced third-party software to the company’s App Store.
Apple also intends to introduce its first foldable phone in late 2026, joining a category that most of its smartphone rivals have already entered. And it’s planning more new designs for 2027.
People working on Apple’s smart glasses remain concerned that the company’s AI shortcomings may undermine the new product. The Meta Ray-Bans and upcoming glasses running the Android operating system benefit from the strength of Meta’s Llama and Google’s Gemini artificial intelligence platforms.
Today, Apple uses Google Lens and OpenAI technology to analyze the real world via the iPhone’s Visual Intelligence feature. The company will likely want to rely on its own technology for that capability in the upcoming hardware.
Much of Apple’s glasses work is being done by the Vision Products Group, which developed the Vision Pro headset. The team is working on new versions of that device as well, Bloomberg News has reported. That includes a cheaper and lighter model, as well as one that tethers to Macs for applications that require low latency, or less lag.
Still, the company’s road map for wearable devices continues to evolve. It previously aimed to develop AR glasses that would require a connection to the Mac — a plan it ultimately scrapped earlier this year.