
IKKIO: Building the Future of Accessible AI Assistants for Smart Glasses

How IKKIO is integrating with leading smart-glasses SDKs to design accessibility-first AI, starting with Meta Ray-Ban.


Smart glasses are moving from experimental hardware to everyday wearable technology.

From capturing moments hands-free to accessing AI in real time, the category is expanding quickly, and so is the opportunity to make it truly inclusive.

At IKKIO, we are building an AI assistant for smart glasses designed with and for blind and visually impaired users, while ultimately supporting anyone who benefits from contextual, voice-first computing.

Today, we’re excited to share more about our work integrating with leading smart-glasses platforms, including Meta Ray-Ban, Brilliant Labs Frame, and emerging devices like Mentra Glasses, and why we’re prioritizing Meta’s ecosystem as we move toward early testing and our upcoming waitlist launch.


Why we’re prioritizing Meta Ray-Ban


As a startup, focus matters. While we are building a hardware-agnostic AI assistant, we’re currently prioritizing integration with Meta Ray-Ban smart glasses given their market adoption. Meta Ray-Ban represents one of the most widely distributed AI-enabled smart-glasses devices today. Millions of people already own them, making them an ideal platform for real-world accessibility testing. This allows us to:


  • Reach users faster

  • Gather real behavioral feedback

  • Test AI assistance in everyday environments

  • Iterate on real accessibility needs


We’ve already conducted early internal and exploratory tests, and we’ll continue expanding these trials as we onboard beta users. A public waitlist for early testers is launching soon.


The advantage of widely available smart glasses


Accessibility innovation often struggles when hardware is rare, expensive, or difficult to obtain. Devices like Meta Ray-Ban change that equation. Because they are:

  • Commercially available

  • Fashion-forward and socially normalized

  • Easy to purchase in retail and online

  • Already used for camera and audio AI features

they create a foundation where accessibility software can scale faster. Instead of building for a niche prototype, we can build for devices people already own.

This dramatically accelerates feedback loops, especially from blind and low-vision users who want solutions that fit into mainstream products.


Most smart glasses today were not designed accessibility-first, and that’s the opportunity we’re tapping into. Glasses were built for content capture, social sharing, audio streaming, and AI queries for general users. By opening its SDK, Meta has enabled developers to create experiences tailored to specific communities and needs, including accessibility. This means developers can design:

  • Vision-to-voice assistance

  • Scene interpretation

  • Object recognition

  • Navigation cues

  • Contextual memory support

and other relevant features.

In other words, accessibility can be built as a software layer on top of mainstream hardware. It’s a powerful shift, and a very strategic move by smart-glasses makers.
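
To make the “software layer” idea concrete, here is a minimal sketch of a vision-to-voice loop of the kind described above. All names (`capture_frame`, `describe_scene`, `speak`) are illustrative stand-ins, not a real smart-glasses SDK; a real layer would bind them to the device’s camera, vision models, and text-to-speech.

```python
def capture_frame() -> dict:
    """Stand-in for a frame grabbed from the glasses' camera."""
    return {"objects": ["door", "exit sign"], "text": "EXIT"}

def describe_scene(frame: dict) -> str:
    """Stand-in for computer-vision + OCR analysis of the frame."""
    objects = ", ".join(frame["objects"])
    description = f"I can see: {objects}."
    if frame.get("text"):
        description += f' There is text reading "{frame["text"]}".'
    return description

def speak(message: str) -> str:
    """Stand-in for the glasses' audio channel (would call text-to-speech)."""
    return message

def vision_to_voice() -> str:
    """The accessibility layer: camera in, spoken description out."""
    return speak(describe_scene(capture_frame()))

print(vision_to_voice())
```

The point of the sketch is that none of these steps require custom hardware: the entire loop sits in software on top of the camera and audio features mainstream glasses already expose.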


Image description: Banner with the text “IKKIO: building the future of accessible AI assistants for smart glasses” alongside our co-founder and CPO Aleksei in a room testing the Meta Display.

A broader industry trend: Open SDK ecosystems


Meta isn’t alone in this direction. We’re seeing a broader pattern across the wearable ecosystem: hardware companies opening their SDKs earlier, sometimes even before full device launches, to encourage developer ecosystems. This approach:

  • Accelerates innovation

  • Expands use cases beyond internal roadmaps

  • Enables specialized solutions (like accessibility AI)

  • Reduces time-to-market for new experiences

  • Elevates the experience for users


Google, for example, opened its developer pathways much earlier in the product lifecycle. This “SDK-first” strategy is becoming standard for spatial computing devices, as with the Brilliant Labs Frame and Halo, and Mentra Glasses with its smart-glasses OS.


Our multi-device approach


While Meta Ray-Ban is our current priority, IKKIO is being built as a hardware-agnostic AI layer.

We are actively working across multiple platforms, including:

  • Brilliant Labs Frame (open-source AI glasses)

  • Meta Ray-Ban

  • Emerging devices such as Mentra Glasses

  • Future Android XR and spatial computing wearables

This ensures our assistant can operate across ecosystems, not locked to a single manufacturer. For users, that means flexibility. For accessibility, it means scale.


How our AI assistant works


IKKIO combines multiple AI layers to create real-time assistance through smart glasses.

Our architecture includes:


Edge AI 


  • Fast response times

  • Lower latency for critical feedback

  • Privacy-preserving processing when possible


Cloud AI + LLM reasoning


  • Deeper contextual understanding

  • Conversational Q&A

  • Complex scene interpretation


Vision + multimodal processing


  • Image capture from glasses

  • Computer vision analysis

  • OCR and environmental understanding
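
The layered architecture above can be sketched as a simple router: latency-critical feedback stays on-device (edge), while open-ended questions go to cloud LLM reasoning. The request kinds, threshold logic, and names below are illustrative assumptions, not IKKIO’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Request:
    kind: str            # e.g. "obstacle_alert", "scene_question"
    needs_reasoning: bool

# Assumed latency-critical tasks that the edge layer can handle alone.
EDGE_KINDS = {"obstacle_alert", "ocr_snippet"}

def route(request: Request) -> str:
    """Pick a processing layer for an incoming request."""
    if request.kind in EDGE_KINDS and not request.needs_reasoning:
        return "edge"        # fast, privacy-preserving, low latency
    return "cloud_llm"       # deeper contextual understanding

print(route(Request("obstacle_alert", needs_reasoning=False)))  # edge
print(route(Request("scene_question", needs_reasoning=True)))   # cloud_llm
```

The design choice this illustrates: the vision layer produces the inputs, and routing decides per request whether speed (edge) or depth (cloud) matters more, so safety-relevant cues are never queued behind a slow round trip.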


The result is an assistant that can describe, interpret, and respond, hands-free.

Accessibility is the starting point. We design with blind and visually impaired users first, but the broader vision is bigger: when you design for the edges, you improve the experience for everyone.


Early testing and what’s next


We’ve completed early exploratory testing across our supported devices, focusing on image-capture pipelines, AI response timing, user interaction flows on both mobile and glasses, and feedback formats. As we expand compatibility with Meta Ray-Ban, we’re preparing to onboard more external testers.

Our next milestone:

Launching our public early-access waitlist.

This will allow blind and low-vision users, especially those who already own Meta Ray-Ban, to test the assistant in real-world scenarios and help shape its evolution.


Join the waitlist


If you’re interested in testing IKKIO early:

  • Join our upcoming waitlist

  • Get early access to the beta

  • Share feedback that shapes development

  • Receive tester perks

  • Be eligible for smart-glasses giveaways

(Waitlist link coming soon.)


Final words


Smart glasses are becoming everyday computing interfaces, but their true potential will only be realized when they are inclusive by design. We're working toward a future where wearable AI supports everyone, everywhere, in real time by building an accessibility-first AI assistant that integrates across devices. And we're just getting started!


As we open this next phase, we’re inviting the community to build this with us. Everyone who joins our waitlist in the early access window will have the opportunity to test the application, whether through Meta Ray-Ban or mobile-only mode (iOS required for this phase). Active testers who explore the app in real-life scenarios and share feedback through our short form will also be eligible to enter our Meta Ray-Ban raffle. It’s a small way for us to say thank you and a meaningful way to ensure what we’re building is shaped by the people it’s designed to support.



Iara Dias

CEO and Co-founder


