Health

These apps enable safe indoor navigation for blind people
Two new smartphone apps are set to help blind individuals navigate indoor spaces using spoken directions, offering a safe alternative to traditional GPS, which doesn’t function inside buildings.

Roberto Manduchi, a professor of Computer Science and Engineering at UC Santa Cruz, has dedicated much of his career to developing accessible technology for the blind and visually impaired. Through his extensive work with these communities, he identified a particular need for tools to assist with navigating unfamiliar indoor environments.

“Moving independently in an unknown place is especially challenging without visual cues — it’s easy to get disoriented. The goal here is to make that process easier and safer,” Manduchi explained.

In a recent paper published in *ACM Transactions on Accessible Computing*, Manduchi’s team presents two smartphone apps designed for indoor wayfinding. These apps help users navigate to specific locations and retrace their steps, all while providing audio cues. Importantly, they don’t require users to hold their phones in front of them, making the system more convenient and discreet.

Scalable, safer technology

Smartphones are an ideal platform for accessible technology thanks to their affordability, built-in sensors, and widespread connectivity. However, many existing wayfinding apps require users to hold their phones out in front of them, which is often impractical: a blind person frequently needs one hand free for a cane or guide dog, and a visible phone poses risks such as theft, to which people with disabilities are disproportionately vulnerable.

While companies like Apple and Google offer indoor navigation for select locations such as airports, their systems rely on expensive infrastructure, limiting their scalability.

Leveraging built-in smartphone sensors

Manduchi’s app uses a building’s internal map to generate a route, then tracks movement using the phone’s built-in inertial sensors, such as accelerometers and gyroscopes. These sensors, standard in smartphones, let the app monitor the user’s progress along a path even in the absence of GPS.
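The article doesn’t publish the team’s code, but inertial tracking of this kind is commonly done with pedestrian dead reckoning: detect steps from accelerometer peaks and advance the position estimate along a gyroscope-derived heading. A minimal sketch, with hypothetical stride length and step threshold:

```python
import math

# Minimal pedestrian dead-reckoning sketch (illustrative, not the paper's code).
# Steps are detected as rising edges in accelerometer magnitude; each step
# advances the position estimate along the current heading by a fixed stride.

STRIDE_M = 0.7          # assumed average stride length in meters
STEP_THRESHOLD = 11.0   # accel magnitude (m/s^2) that counts as a step peak

def dead_reckon(accel_mags, headings_rad, start=(0.0, 0.0)):
    """accel_mags: accelerometer magnitude per sample;
    headings_rad: gyroscope-derived heading per sample (radians)."""
    x, y = start
    above = False
    path = [(x, y)]
    for a, h in zip(accel_mags, headings_rad):
        if a > STEP_THRESHOLD and not above:  # rising edge = one step
            x += STRIDE_M * math.cos(h)
            y += STRIDE_M * math.sin(h)
            path.append((x, y))
        above = a > STEP_THRESHOLD
    return path

# Synthetic walk: three step peaks while heading east (heading = 0)
mags = [9.8, 12.0, 9.8, 12.0, 9.8, 12.0, 9.8]
headings = [0.0] * len(mags)
print(dead_reckon(mags, headings)[-1])  # about 2.1 m east of the start
```

Real implementations estimate stride length and heading drift adaptively, which is precisely why the correction step described next matters.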

The app further refines location tracking using a technique called particle filtering, which helps ensure the system doesn’t mistakenly assume a person is walking through walls.

The second app allows users to retrace their steps, a valuable feature for navigating out of a room independently. It uses not only inertial sensors but also the phone’s magnetometer to recognize magnetic anomalies, such as large appliances, as landmarks.
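One plausible way to use magnetic anomalies as landmarks, sketched below with assumed field values (the article doesn’t specify thresholds): record where the field magnitude deviates strongly from the ambient level on the way in, then watch for similar deviations on the way back.

```python
# Hypothetical landmark detection from magnetometer readings (microtesla).
# Large metal objects such as appliances distort the local geomagnetic field,
# producing repeatable "anomalies" that can anchor a retraced route.

AMBIENT_UT = 50.0   # typical geomagnetic field magnitude
ANOMALY_UT = 15.0   # assumed deviation large enough to count as a landmark

def find_landmarks(mag_magnitudes):
    """Return sample indices where the field deviates enough to be a landmark."""
    return [i for i, m in enumerate(mag_magnitudes)
            if abs(m - AMBIENT_UT) > ANOMALY_UT]

# Outbound trace: flat field except near a large appliance around sample 3
outbound = [50.1, 49.8, 52.0, 71.5, 50.3]
print(find_landmarks(outbound))  # [3]
```

Matching such anomalies against the outbound recording lets the backtracking app correct the drift that pure inertial dead reckoning accumulates.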

User-friendly navigation

Both apps communicate directions via audio and can pair with a smartwatch for vibration-based cues. The design emphasizes minimal input, allowing users to focus on their surroundings and safety.

To account for any tracking errors, users are given instructions slightly ahead of their next turn, with prompts like “turn left at the upcoming junction.” The system encourages users to rely on their own judgment, similar to driving where one checks for a turn rather than following GPS blindly.
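The early-announcement behavior can be sketched in a few lines; the lookahead distance here is an illustrative assumption, not a value from the paper:

```python
# Announce the turn once the estimated distance to the next junction drops
# below a lookahead margin, so tracking error doesn't make the cue arrive late.

LOOKAHEAD_M = 3.0  # assumed announcement distance before the turn

def maybe_prompt(distance_to_turn_m, direction):
    if distance_to_turn_m <= LOOKAHEAD_M:
        return f"turn {direction} at the upcoming junction"
    return None

print(maybe_prompt(5.0, "left"))  # None -- still too far to announce
print(maybe_prompt(2.4, "left"))  # "turn left at the upcoming junction"
```

Phrasing the prompt as “at the upcoming junction,” rather than “now,” is what shifts the final decision to the user’s own perception.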

“We believe in sharing responsibility between technology and the user,” said Manduchi. “Like driving, you don’t just turn because GPS tells you to—you look for the actual junction.”

After testing their system in the Baskin Engineering building at UC Santa Cruz, the team found the apps effective for navigating complex indoor spaces. Moving forward, they plan to refine the apps by incorporating AI features, such as scene recognition via photos, and expanding access to building maps, potentially through open-source software.

  • Press release – University of California, Santa Cruz – EurekAlert