Apple's AR glasses launch year predicted by research firm

Research Firm Forecasts Apple’s AR Glasses Debut, Highlighting a Next-Gen High-Resolution Display

Apple is quietly reshaping its wearable strategy, and the message is clear: the future fight isn’t just about bulky mixed-reality headsets. It’s about lightweight smart glasses that blend AI features today with true augmented reality tomorrow. A new industry report from Omdia adds timely detail on how fast the AR smart glasses market is filling up, who’s launching what, and when the biggest names could finally collide.

Several brands are preparing to enter the next wave of AR eyewear using OLEDoS displays (also known as Micro-OLED). ROG and RayNeo are expected to introduce OLEDoS-based AR smart glasses this year, signaling that the competitive landscape is already expanding before Apple and Meta even bring their “full” AR visions to consumers.

Meta, meanwhile, is said to be targeting 2027 for its first proper AR smart glasses. The expected recipe includes dual OLEDoS panels and a waveguide system designed to project imagery directly into the wearer’s eyes. This waveguide approach typically relies on micro-projectors and semi-transparent optical elements, an architecture that can deliver sharper visuals in brighter environments—an important requirement if AR glasses are going to work reliably outdoors rather than only in controlled indoor lighting.

If you’re wondering why OLEDoS keeps coming up in conversations about next-generation AR glasses, it’s because the technology is built for extreme compactness and clarity. Instead of putting OLED pixels on glass or plastic substrates like most phone and TV screens, OLEDoS places OLED elements directly on a silicon wafer. Because it borrows techniques from semiconductor manufacturing, it can achieve ultra-high pixel density in a tiny display size, while also improving power efficiency by integrating circuitry into the silicon backplane using CMOS processes. In practical terms, that’s a strong match for AR eyewear where space, weight, heat, and battery life are all major constraints.
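To make the density advantage concrete, here is a back-of-the-envelope sketch comparing a micro-display to a phone screen. The resolutions and sizes used below are illustrative assumptions, not confirmed specs for any OLEDoS product:

```python
import math

def pixels_per_inch(h_px: int, v_px: int, diagonal_in: float) -> float:
    # Pixel density = diagonal pixel count divided by diagonal size in inches
    return math.hypot(h_px, v_px) / diagonal_in

# Hypothetical 0.6-inch micro-display at 1920x1920 (assumed values)
micro = pixels_per_inch(1920, 1920, 0.6)

# Typical 6.1-inch phone panel at 2556x1179, for comparison (assumed values)
phone = pixels_per_inch(2556, 1179, 6.1)

print(f"micro-display: {micro:.0f} PPI, phone: {phone:.0f} PPI")
```

Even with modest assumed resolutions, a sub-inch silicon-backed panel lands in the thousands of pixels per inch, roughly an order of magnitude denser than a flagship phone screen, which is why semiconductor-style fabrication matters for eyewear optics.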

As for Apple’s AR smart glasses, Omdia expects them to arrive in 2028—potentially months after Meta’s competitive AR model lands. The report suggests Apple’s AR glasses could use 0.6-inch dual OLEDoS displays, which fits the direction the broader industry is taking for premium AR optics.

But Apple’s next move may happen sooner—just not with a full AR display. The company is reportedly aiming for a 2026 launch of AI-enabled smart glasses equipped with cameras, microphones, and speakers. The idea is to make them deeply voice- and context-driven, paired with an upgraded Siri experience. These AI smart glasses are expected to focus on everyday utility features such as hands-free notifications, real-time AI assistance, and AI-powered translation, while skipping built-in AR visuals entirely.

That puts Apple on a collision course with products that already define the “AI smart glasses” category today. Meta’s widely recognized Ray-Ban smart glasses have established a baseline for what consumers can expect: up to eight hours of mixed use, around two hours of continuous live AI support, 3K ultra HD video recording, and a noise-handling feature designed to make conversations clearer.

Meta has also shown off a more advanced concept with its Ray-Ban Display smart glasses, which add an integrated display capable of showing text, directions, small video content, and live translations. The display is described as reaching 42 pixels per degree, positioning it as higher resolution than the displays used in several of the company’s consumer VR devices. With a custom light engine and waveguide optics delivering up to 5,000 nits of brightness, Meta claims the display remains usable both indoors and outdoors—an area where many early smart display glasses have struggled.
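Pixels per degree (PPD) measures angular resolution: how many pixels span one degree of your visual field. A rough sketch of what 42 PPD implies for per-eye pixel count follows; the 20-degree horizontal field of view used here is an illustrative assumption, not a published spec:

```python
def horizontal_pixels(ppd: float, fov_deg: float) -> float:
    # Approximate per-eye horizontal pixel count implied by an
    # angular resolution (pixels per degree) and a field of view.
    # This linear estimate is a simplification; real optics vary
    # PPD across the field of view.
    return ppd * fov_deg

# 42 PPD is the figure cited for the Ray-Ban Display glasses;
# the 20-degree FOV is an assumed value for illustration only.
print(horizontal_pixels(42, 20))  # → 840.0
```

The takeaway: a small, sharp window of content (text, directions, translations) needs far fewer total pixels than a wide-FOV VR headset, which is how a glasses-sized light engine can hit high angular resolution.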

Pricing for the Ray-Ban Display setup has been presented at $799, with an additional wearable control accessory called the Meta Neural Band. This band uses electromyography (EMG) signals to detect subtle muscle activity, enabling gesture-based navigation without needing a handheld controller—one more sign that smart glasses are evolving into a broader wearable computing platform, not just a camera on your face.

Taken together, the timeline emerging from these developments is hard to ignore. The AI smart glasses era is already underway, AR glasses are next, and the race is accelerating. With multiple manufacturers launching OLEDoS-based products now, Meta aiming for 2027, and Apple potentially following with AR hardware in 2028—plus a separate AI-first model rumored for 2026—the smart glasses market is shaping up to be one of the most competitive consumer tech battlegrounds of the next few years.