Android XR Glasses for the Visually Impaired
A concept


Problem Statement
How might we design an AI-powered, haptic-audio interaction layer for Android XR glasses to enhance spatial awareness and independence for visually impaired users?
Concept
Imagine a haptic- and audio-based interaction layer for Android XR glasses, enhanced by Gemini AI, that empowers visually impaired users to:
🗺️ Navigate safely through unfamiliar spaces
🧠 Understand their surroundings through real-time object and text recognition, powered by Gemini Live (Project Astra)
👂 Interact hands-free using voice and gestures
📳 Receive directional guidance through subtle haptic feedback, maybe via a wristband or ring (a rough sketch of this loop follows just below)
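To make the loop behind these bullets a bit more concrete, here is a minimal Kotlin sketch of how scene understanding could fan out into an audio cue and a haptic cue. Everything here is hypothetical: the types, thresholds, and channels are illustrative stand-ins, not real Gemini or Android XR APIs.

```kotlin
// Hypothetical sketch of the interaction loop; none of these types are real Gemini/Android XR APIs.

// One thing the scene-understanding model noticed in the camera feed.
data class Detection(
    val label: String,        // e.g. "stairs", "exit sign"
    val bearingDeg: Double,   // direction relative to where the user is facing, -180..180
    val distanceM: Double     // estimated distance in metres
)

// The two output channels the concept relies on.
interface AudioCueChannel { fun speak(text: String) }
interface HapticCueChannel { fun pulse(bearingDeg: Double, strength: Int) }

// Stub implementations so the sketch runs anywhere.
class ConsoleAudio : AudioCueChannel {
    override fun speak(text: String) = println("AUDIO: \"$text\"")
}
class ConsoleHaptics : HapticCueChannel {
    override fun pulse(bearingDeg: Double, strength: Int) =
        println("HAPTIC: bearing=${"%.0f".format(bearingDeg)} deg, strength=$strength/255")
}

// One pass of the loop: describe the most urgent detection and nudge the wearable toward it.
fun cueUser(detections: List<Detection>, audio: AudioCueChannel, haptics: HapticCueChannel) {
    val nearest = detections.minByOrNull { it.distanceM } ?: return
    audio.speak("${nearest.label}, about ${"%.0f".format(nearest.distanceM)} metres ahead")
    val strength = (255 * (1.0 - (nearest.distanceM / 10.0).coerceIn(0.0, 1.0))).toInt()
    haptics.pulse(nearest.bearingDeg, strength)
}

fun main() {
    // A fake frame of detections standing in for Gemini-powered scene understanding.
    val frame = listOf(
        Detection("stairs", bearingDeg = 15.0, distanceM = 3.0),
        Detection("exit sign", bearingDeg = -40.0, distanceM = 8.0)
    )
    cueUser(frame, ConsoleAudio(), ConsoleHaptics())
}
```

In a real build the Detection list would come from the glasses' camera pipeline and the channels would wrap the earbuds and the paired wearable; the point is only that the two output channels stay decoupled from perception.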
💯 Key Features Could Include
1. Smart Ear Cueing – Contextual audio alerts via bone conduction or earbuds, e.g. “Approaching stairs”
2. Echo Feedback – A haptic ring or bracelet vibrates to indicate proximity or direction (see the sketch after this list)
3. Tap-to-Query – Tap the frame or say “Hey Gemini…” to have the glasses describe the surroundings
4. Public Transport Mode – Recognizes signs and guides the user with turn-by-turn cues
5. Caregiver Dashboard App – Tracks navigation, alerts, and emergency flags, giving loved ones or guardians peace of mind
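For Echo Feedback in particular (item 2 above), the interesting design question is how a detected obstacle's direction and distance become something you can feel. Below is a small, self-contained Kotlin sketch of one possible mapping: bearing decides which side vibrates, distance decides how strong and how frequent the pulses are. The HapticPattern type, the 15-degree dead zone, and the 8 m range are assumptions for illustration, not a real wearable API.

```kotlin
// Hypothetical Echo Feedback mapping; the HapticPattern type and all thresholds are illustrative only.

enum class Side { LEFT, RIGHT, BOTH }

data class HapticPattern(
    val side: Side,           // which side of the ring/bracelet vibrates
    val amplitude: Int,       // 0..255, stronger when the obstacle is closer
    val pulseIntervalMs: Long // shorter interval = more urgent
)

// Map an obstacle's bearing (degrees, negative = left of the user) and distance (metres)
// to a haptic pattern. Returns null when the obstacle is far enough to ignore.
fun echoFeedback(bearingDeg: Double, distanceM: Double, maxRangeM: Double = 8.0): HapticPattern? {
    if (distanceM > maxRangeM) return null
    val side = when {
        bearingDeg < -15 -> Side.LEFT
        bearingDeg > 15 -> Side.RIGHT
        else -> Side.BOTH                            // roughly straight ahead
    }
    val closeness = 1.0 - (distanceM / maxRangeM)    // 0.0 (far) .. 1.0 (touching)
    val amplitude = (55 + 200 * closeness).toInt()   // never fully silent inside range
    val interval = (800 - 650 * closeness).toLong()  // 800 ms far away, ~150 ms up close
    return HapticPattern(side, amplitude, interval)
}

fun main() {
    // A kerb 1.5 m away, slightly to the left: strong, fast pulses on the left side.
    println(echoFeedback(bearingDeg = -30.0, distanceM = 1.5))
    // A doorway 6 m ahead: gentle, slow pulses on both sides.
    println(echoFeedback(bearingDeg = 5.0, distanceM = 6.0))
    // Something 12 m away: no cue.
    println(echoFeedback(bearingDeg = 40.0, distanceM = 12.0))
}
```

On actual hardware a pattern like this would drive something like Android's VibrationEffect on the paired ring or bracelet, but the mapping itself stays device-agnostic and easy to tune with users.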


Using Google Stitch (Beta), I quickly mocked up a first-draft UI for the Caregiver App. Stitch gave me a near-perfect interface within seconds, and it even lets you copy the design straight into Figma.
I also generated a few 3D mockups with ChatGPT and couldn’t resist creating a cute Airbnb-style 3D icon version just for fun XD


This is still an early exploration, but I strongly believe that with just a few thoughtful adaptations, the Android XR glasses could become a game-changing accessibility tool.
If you're working in accessibility, assistive tech, or interaction design and are equally curious about this space, I’d love to talk. There’s so much potential here to make independence more tangible and technology more humane :))