Pranav Prasun recently completed the research master's programme Design for Interaction at Delft University of Technology. With his project Ally SenseScape, he designed a smart aid that helps blind and visually impaired people navigate places they visit frequently, such as their workplace or university. For this project, he received the KHMW Jong Talent Graduation Award 2025 for Health Care Innovations for Underserved Communities.

Your graduation project, Ally SenseScape, focuses on indoor navigation for people who are blind or have low vision. How did you first become interested in this topic?
It was actually a gradual process. I’ve always wanted to work on projects with a positive social impact. During my studies in Industrial Design at TU Delft, I was part of a student committee that organised case days with companies in the Netherlands. During one of those days, I had the opportunity to work on a project at the Koninklijke Bibliotheek (KB), focused on making images in books accessible for people who are blind or have low vision (PBLV). That’s when I realised how little I really knew about their experiences, and I was amazed by how skilfully they find their way through the world, despite what we often consider a limitation. That really moved me.
Later, for my master’s thesis, I came into contact with Envision Technologies, a company developing AI-based personal assistance tools for accessibility. They had recently started exploring “accessible AI” (called Ally), and that became the starting point for my graduation project. I began by identifying the challenges faced by PBLV, and indoor navigation emerged as an important one to address.

Can you explain in simple terms how Ally SenseScape works, and what makes it different from existing navigation tools?
Many existing tools are technologically advanced but have a steep learning curve or don’t align well with what already works for users—such as a guide dog or a cane. My idea was not to replace those aids, but to enhance them.
For example, a guide dog cannot read signs, but AI can. Ally SenseScape builds on what users already do naturally. The system uses AI to read floor maps and understand what the environment looks and feels like at each turn. It guides users through multi-sensory cues—such as changes in floor texture, sounds, or smells—and provides audio guidance at the user’s own pace. It might say, for instance: “In about fifteen steps, the floor will change from wood to carpet (which can naturally be detected with a cane); your next right turn is after that.”
So the system complements the user’s sensory perception rather than replacing it.
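To make the idea of landmark-anchored guidance a little more concrete, here is a minimal, hypothetical sketch of how a route step annotated with a sensory cue could be phrased as the kind of spoken instruction Prasun describes. The data structure, field names, and the describe_step function are illustrative assumptions, not Ally SenseScape's or Envision's actual implementation.

```python
from dataclasses import dataclass


# Hypothetical structure: one step of a pre-mapped indoor route, annotated
# with a sensory landmark the traveller can verify with a cane, by ear, or
# by smell. Field names are illustrative only.
@dataclass
class RouteStep:
    steps_ahead: int    # rough distance in walking steps
    landmark: str       # e.g. "the floor will change from wood to carpet"
    action_after: str   # e.g. "your next right turn is"


def describe_step(step: RouteStep) -> str:
    """Phrase a route step as audio guidance anchored to a sensory cue."""
    return (
        f"In about {step.steps_ahead} steps, {step.landmark}; "
        f"{step.action_after} after that."
    )


if __name__ == "__main__":
    step = RouteStep(
        steps_ahead=15,
        landmark="the floor will change from wood to carpet",
        action_after="your next right turn is",
    )
    print(describe_step(step))
    # In about 15 steps, the floor will change from wood to carpet;
    # your next right turn is after that.
```

The point of the sketch is the design choice it illustrates: instructions are anchored to cues the user can confirm with their own senses, rather than to absolute distances or coordinates, echoing the idea of complementing existing perception rather than replacing it.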

For your research, you worked with people who are blind or have low vision themselves. What was that like?
I organised sessions in which participants could test prototypes and share their experiences. Those were the most valuable moments of my project. Simply listening and observing taught me a great deal.
One participant told me that his guide dog doesn’t understand “the third right turn”—it just understands “right” and tries to go right each time there’s a turn. When it’s the third one, he knows that’s the correct turn to take. Another said she avoids using the touch panel of her Envision glasses in winter because she has to take off her gloves to operate the controls. Sometimes during the sessions, participants also had both hands occupied—holding a cane, a coffee, or something else.
Observations like that made me realise that true accessibility often depends on small, practical design choices—such as using voice control instead of touch interaction.

You combined AI, wearable technology, and sensory cues such as sound and texture. What were the biggest challenges in bringing those together?
The biggest challenge was to make the system truly usable. I wanted it to be smart, but not complicated. Technical obstacles kept coming up: what happens if the internet connection drops, or if someone mentions a room that isn’t on the same floor? It was a constant balance between technological ambition and practical feasibility.
Because the project lasted only a hundred days, there wasn’t much time for extensive testing. That’s why I focused on developing a concept that Envision could continue to build on after my graduation. 

Your thesis was praised for its social impact and inclusivity. What drives you personally to design for vulnerable groups?
I believe accessibility should never be an afterthought, but a natural part of every design. In one of the workshops I attended, a Paralympic athlete said that Olympic stadiums are often made accessible after the Games—when doors are widened and ramps are added to make them ready for the Paralympics. That’s symbolic of how we design: for the majority first, and only later for everyone else.
Designing for accessibility doesn’t just help people with disabilities. It also benefits someone who is temporarily injured or facing situational limitations. Inclusive design simply makes the world more liveable for everyone. 

Which personal qualities helped you most during this project?
I’m naturally calm and observant—I tend to listen more than I speak. That helps me notice details that others might overlook. I’ve also been meditating for years. It helps me see situations from a broader perspective and connect more deeply with the people I’m designing for. I try to put myself in their position—not to mimic their experience, but to design with genuine empathy.

You come from India. How did you end up at TU Delft?
In India, I designed shoes and clothing—new collections every few months. It was creative work, but I started wondering what kind of impact it really had. I wanted to design products that genuinely improve people's lives. The Design for Interaction master's programme in Delft appealed to me because it doesn't prescribe what you should design—whether physical or digital—but lets the research determine the outcome.

What’s next for Ally SenseScape – and for you personally?
Envision Technologies has indicated that they want to develop it further, which makes me very happy. Personally, I want to continue working on projects at the intersection of inclusive design and new technology. That’s where I see both the greatest challenge and the greatest opportunity to make a difference.