AI backpack “sees” for visually impaired people

It warns them about any potential obstacles in their path.

As an AI engineer, Jagadish Mahendran has spent a lot of time trying to help robots “see” the world around them.

Now, he’s doing the same for visually impaired people.

Using technology developed by Intel, Mahendran has created a voice-activated AI backpack that guides people with visual impairments in outdoor environments — and the system costs just $800, compared to thousands for some smart glasses.

A Better Assistive Technology

An estimated 285 million people are visually impaired, meaning they have vision problems that can’t be corrected with glasses. Of that group, 39 million are blind.

For blind people, navigating an outdoor environment can be both difficult and potentially dangerous — they may have trouble safely crossing the street or knowing when they need to step up onto a curb.

Guide dogs can help in these situations, but they can be expensive (and some people are allergic). White canes, meanwhile, won’t help them avoid overhead hazards, such as hanging tree branches.

There are other assistive technology devices to help with navigation, but they aren’t always ideal.

Voice-assisted smartphone apps can give visually impaired people turn-by-turn directions, but they can’t help them avoid obstacles.

Smart glasses usually cost thousands of dollars, while smart canes require a person to dedicate one hand to the tech — not great if they’re, say, trying to carry groceries home from the store.

Mahendran’s AI backpack, MIRA, aims to be the ideal alternative.

“When I met my visually impaired friend, Breean Cox, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help,” he said. “This motivated me to build the visual assistance system.”

Building an AI Backpack

Before jumping into development on the AI backpack, Mahendran and his collaborators interviewed several people with visual impairments to ensure the device would address the challenges they faced.

Armed with those insights, they developed a system consisting of a small backpack, a vest, and a fanny pack.

A $300 Luxonis OAK-D spatial AI camera contains the Intel computer vision tech that serves as MIRA’s “brains.”

To train the camera’s AI to identify curbs, crosswalks, and other objects, the researchers fed it images from existing databases, as well as some they took and labeled themselves.

After training, they mounted the camera in the vest and connected it to a computing device inside the backpack — this could be anything from a laptop to a Raspberry Pi.

A GPS mounted on top of the backpack also connects to the computer, and the battery powering the whole system goes in the fanny pack.

A Bluetooth-enabled earpiece lets the wearer communicate with the AI backpack.

They can give it commands, such as “Describe,” which prompts the AI to list nearby objects along with their clock positions (e.g., “Stop sign at 2 o’clock”).
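The article doesn’t publish MIRA’s code, but the clock-position callout it describes is a simple mapping from a detected object’s horizontal bearing to a clock hour. A minimal sketch, assuming angles are measured in degrees with 0° straight ahead (12 o’clock) and positive values to the wearer’s right:

```python
def clock_position(angle_deg: float) -> str:
    """Map a horizontal bearing to a spoken clock position.

    angle_deg: 0 = straight ahead, positive = right, negative = left.
    Each clock hour spans 30 degrees, so we round to the nearest hour.
    """
    hour = round(angle_deg / 30) % 12
    return f"{12 if hour == 0 else hour} o'clock"
```

With this convention, an object 60° to the right would be announced at “2 o’clock”, matching the article’s “Stop sign at 2 o’clock” example.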

They can also hear the AI’s automatic warnings about potential dangers — to let them know a hanging branch is straight ahead, for example, the AI will say “Top, front.”
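Because the OAK-D is a depth camera, warnings like “Top, front” can be derived from an obstacle’s 3D position relative to the wearer. The coordinate convention and the 0.5 m thresholds below are illustrative assumptions, not MIRA’s actual values:

```python
def hazard_zone(x_m: float, y_m: float, z_m: float) -> str:
    """Map an obstacle's position (meters, relative to the wearer)
    to a spoken warning zone.

    x_m: lateral offset (positive = right)
    y_m: vertical offset (positive = up, e.g. a hanging branch)
    z_m: forward distance (unused here beyond being in range)
    """
    vertical = "top" if y_m > 0.5 else "bottom" if y_m < -0.5 else "center"
    lateral = "left" if x_m < -0.5 else "right" if x_m > 0.5 else "front"
    return f"{vertical.capitalize()}, {lateral}"
```

Under these assumptions, a branch hanging a meter overhead and straight ahead maps to “Top, front”, the warning quoted in the article.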

MIRA can run for eight hours on a single charge and is designed to blend in.

The Future of MIRA

Mahendran told Freethink that the AI backpack prototype cost about $800 — that’s already thousands less than most smart glasses, but he and his team are working to get the cost down even further.

They plan to publish a research paper on the system in the near future and will make everything they develop for the project — the code, datasets, etc. — open source.


Right now, they’re raising funds for testing and looking for more volunteers to help them reach their ultimate goal of providing visually impaired people with an open-source, AI-based assistance system for free.

“It’s incredible to see a developer take Intel’s AI technology for the edge and quickly build a solution to make their friend’s life easier,” Hema Chamraj, Intel’s director of technology advocacy and AI4Good, said.

“The technology exists; we are only limited by the imagination of the developer community.”

