Apple's AR glasses launch year predicted by research firm

Apple’s Screenless Smart Glasses Could Arrive by Early 2027—And They May Come in Four Distinct Styles

Apple is preparing to take on Meta’s Ray-Ban smart glasses with its own display-less smart glasses, aiming for a more premium feel and more style choices. The latest details suggest Apple wants to compete not just on features, but also on materials, comfort, and how seamlessly the glasses work with the iPhone.

According to recent reporting, Apple’s first smart glasses without a built-in display are currently expected to arrive by early 2027. The focus is practicality: integrated cameras, microphones, and speakers designed to make the glasses useful throughout the day, with a more capable Siri at the center of the experience. The idea is simple—put key iPhone-friendly actions in front of you without forcing you to pull your phone out every time.

What Apple’s smart glasses are expected to do

If the plans hold, Apple’s camera-equipped smart glasses would cover many of the everyday features people already use on phone-based wearables, but in a more hands-free form:

They could capture photos and record video, then sync with an iPhone for editing and sharing.

They could handle phone calls through the built-in microphones and speakers.

They could relay notifications by audio—since there's no display—so you can keep up without constantly checking your phone.

They could play music and other audio.

They could support hands-free control through Siri, using voice interactions to get things done on the move.

A big part of the pitch is context. With cameras and onboard sensors, the glasses could use computer vision to interpret what’s around you and feed that awareness into Siri and Apple Intelligence. That could unlock features such as smarter turn-by-turn navigation cues and visual reminder-style prompts tied to places and objects—as long as Apple can make Siri reliable and fast in real-world conditions.

Part of a broader wave of Apple AI wearables

The smart glasses reportedly aren’t a one-off. They’re described as one piece of a trio of new AI-focused devices Apple is exploring, alongside camera-equipped AirPods Pro and an AI pendant. The shared concept is “see what you see” computing—gadgets that can understand your surroundings and provide more relevant assistance, rather than acting like simple voice-command tools.

How Apple plans to stand apart from Meta

Meta may have an early lead in the category, but Apple appears to be betting on what it does best: tight integration and polish. Instead of building glasses that feel like a separate gadget ecosystem, Apple is expected to make the iPhone the hub—leaning heavily on smooth syncing, straightforward sharing, and utility-driven features built around Apple’s existing apps and services.

There’s also the design strategy. Apple’s smart glasses are said to target a more premium look and feel, including the use of acetate frames (a material often associated with higher-end eyewear), plus multiple frame shapes and color options. The rumored lineup includes:

A large rectangular frame similar in vibe to classic Wayfarer-style glasses.

A slimmer rectangular design resembling eyewear often associated with Apple CEO Tim Cook.

A larger oval or circular frame.

A smaller, more refined oval or circular option.

On the hardware side, the camera system may use vertically oriented oval lenses. Color options mentioned include black, ocean blue, and light brown—suggesting Apple wants these to feel like everyday personal accessories, not just tech products.

The broader takeaway from the report is that Apple’s advantages—brand power, custom chips, massive retail reach, and deep iPhone integration—could make its smart glasses a serious competitor if it delivers a truly functional, modern Siri experience. In other words, Apple doesn’t have to be first to win; it has to be the most useful and the most seamless.

AR glasses could follow later, with Micro-OLED displays

Beyond the initial display-less model, market watchers expect Apple’s true AR smart glasses to arrive later, potentially around 2028, and they could feature 0.6-inch dual OLEDoS (Micro-OLED) displays.

OLEDoS, also called Micro-OLED, places OLED elements directly on a silicon wafer rather than the glass or plastic substrate used in traditional OLED panels. Because it uses semiconductor-style manufacturing and integrates circuitry into the silicon backplane with CMOS processes, it can deliver extremely high pixel density while staying compact and power-efficient—key requirements for lightweight AR glasses that can be worn comfortably.
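To make “extremely high pixel density” concrete, here’s a back-of-the-envelope comparison. Apple hasn’t disclosed a resolution, so the 1920×1920 panel below is a purely hypothetical figure for illustration; the phone numbers approximate a typical 6.1-inch flagship display.

```python
import math

# Hypothetical: a 0.6-inch square Micro-OLED panel at an ASSUMED 1920 x 1920
# resolution (not a reported Apple spec). PPI = diagonal pixels / diagonal inches.
micro_oled_ppi = math.hypot(1920, 1920) / 0.6
print(f"Micro-OLED (assumed 1920x1920 @ 0.6\"): {micro_oled_ppi:.0f} PPI")

# For comparison: a typical 6.1-inch phone panel at 1179 x 2556.
phone_ppi = math.hypot(1179, 2556) / 6.1
print(f"6.1\" phone (1179x2556): {phone_ppi:.0f} PPI")
```

Even with these rough assumed numbers, the micro-display comes out roughly ten times denser than a phone screen—which is why silicon-backplane panels are the go-to choice when a display sits an inch from the eye behind magnifying optics.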

For now, the near-term story is Apple’s push into camera-enabled smart glasses built for daily life: photography, communication, notifications, audio, and AI assistance—wrapped in a more premium, style-forward design, and anchored by iPhone integration. If Apple can make Siri genuinely helpful in the real world, these glasses could become one of the company’s most important new product categories in the years ahead.