Samsung Jumps Into AI Smart Glasses, Hot on Apple’s Heels

Samsung and Apple are reportedly gearing up to challenge Meta in the fast-evolving world of AI-enabled smart glasses, with staggered launches that could shape the wearable tech market through 2027.

Rumors suggest Samsung has two distinct pairs of smart glasses on its roadmap. The first, said to be codenamed SM-O200P, is targeting a 2026 release. It’s described as a voice-first wearable without a dedicated AR display, designed to handle AI-driven tasks through spoken commands. Lenses that auto-dim in bright sunlight are reportedly part of the package, pointing to an all-day, outdoors-friendly design. A second model is said to arrive in 2027 with a true AR display, bringing camera, video recording, music, and calling features into a more immersive, visuals-forward experience.

Apple is also believed to be refocusing its efforts on lightweight AI smart glasses, with a 2026 launch window in sight. These glasses are said to include built-in cameras, microphones, and speakers to enable hands-free use, backed by an enhanced version of Siri for real-time assistance. Expected capabilities include proactive notifications, on-the-fly translations, and voice-guided help. Like Samsung’s 2026 model, Apple’s first iteration reportedly won’t include a dedicated AR display, emphasizing a sleek, everyday wearable with powerful AI instead of full augmented reality visuals.

All of this unfolds as Meta pushes ahead with its own smart eyewear lineup. Its current Ray-Ban smart glasses emphasize everyday utility with up to eight hours of mixed use, around two hours of continuous live AI features, ultra HD 3K video recording, and an upcoming “conversation focus” noise-cancellation upgrade aimed at clearer voice capture. On the AR front, the newer Ray-Ban Display model introduces an integrated screen large enough for reading text, viewing short videos, checking directions, and seeing live translations. The display is rated at 42 pixels per degree for sharpness, and its custom light engine and waveguide can reach up to 5,000 nits for visibility indoors and outdoors. Control extends beyond touch and voice thanks to the Meta Neural Band, which uses electromyography to interpret subtle hand signals for gesture-based navigation. The Ray-Ban Display is listed at $799.

If these timelines hold, Apple’s and Samsung’s first display-free glasses could arrive in 2026, with AR-display contenders following in 2027. That gives Meta roughly a two-year head start to refine its AR experience and expand its ecosystem.

Why this matters for buyers and the industry:
– 2026 looks like the year AI-first, display-free glasses go mainstream, prioritizing voice, notifications, and real-time assistance.
– 2027 could be the inflection point for everyday AR displays, with Samsung and Apple poised to bring visual overlays to a wider audience.
– Competition is set to intensify around battery life, camera quality, display clarity, brightness, and intuitive controls like voice and EMG-based gestures.
– Pricing, comfort, and app ecosystems will likely determine which brand’s smart glasses become the default daily wearable.

Bottom line: Expect a two-stage battle. The first wave focuses on lightweight, AI-driven productivity and hands-free convenience. The second wave brings true AR visuals to the masses. With Samsung and Apple reportedly lining up their launches against Meta’s momentum, the next two years could define the future of AI smart glasses.