Having close, personal friends affected by blindness, our team created Sonar as a way to let the visually impaired community “see” with technology. Our product is inspired by echolocation, a traditional method used by visually impaired individuals. Many people who use echolocation carry a small device that can be clicked to make sounds; by hearing the sound bounce off nearby objects, they gain a rough sense of an object’s distance and composition. Our product is similar in concept but incorporates modern machine learning technology. In the Sonar app, users can click to sense an object with their phone and have the detected object read aloud to them. This lets the user know exactly what is in front of them, without needing to infer it from sounds.

What it does

Sonar is a product that uses cloud-based machine learning to enable the blind and visually impaired to “see”. With Sonar, users navigate their environment with their phone, simply pressing a button to have Sonar say what is in front of them in real time.

How I built it

Our app’s backend is built in Elixir using the Phoenix web framework, with HTML, CSS, and JavaScript on the frontend. The app is deployed on Oracle Cloud Infrastructure and uses Google Cloud’s machine learning services for object recognition and speech synthesis.
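The recognize-then-speak flow described above can be sketched as follows. The real backend is Elixir/Phoenix, so this Python version is purely illustrative: the `google.cloud` client calls are the real Vision and Text-to-Speech Python APIs but require GCP credentials to run, and `describe_top_label` is a hypothetical helper we made up to show how a label becomes a spoken sentence.

```python
# Illustrative sketch of Sonar's pipeline (the actual app is Elixir/Phoenix).
# describe_top_label() is a hypothetical helper; the google.cloud calls
# need credentials, so they are kept behind the __main__ guard.

def describe_top_label(labels):
    """Turn Vision API label descriptions into a spoken sentence."""
    if not labels:
        return "I could not identify anything."
    return f"I see a {labels[0].lower()}."

if __name__ == "__main__":
    from google.cloud import vision, texttospeech  # requires GCP credentials

    image_bytes = open("snapshot.jpg", "rb").read()

    # 1. Object/label recognition on the uploaded photo.
    vision_client = vision.ImageAnnotatorClient()
    response = vision_client.label_detection(
        image=vision.Image(content=image_bytes)
    )
    labels = [l.description for l in response.label_annotations]

    # 2. Synthesize the description as speech for playback in the app.
    tts_client = texttospeech.TextToSpeechClient()
    audio = tts_client.synthesize_speech(
        input=texttospeech.SynthesisInput(text=describe_top_label(labels)),
        voice=texttospeech.VoiceSelectionParams(language_code="en-US"),
        audio_config=texttospeech.AudioConfig(
            audio_encoding=texttospeech.AudioEncoding.MP3
        ),
    )
    open("description.mp3", "wb").write(audio.audio_content)
```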

Challenges I ran into

Both the Elixir language and the Phoenix web framework were new to us; this was the first project for which we learned a new language and a new framework at the same time. Most of the time spent on the backend went into basic setup and online tutorials, which made complexities like image uploading, audio file creation, and sending and receiving these objects in different formats especially difficult.

Accomplishments that I'm proud of

This project was our first exposure to several technologies: Elixir, Phoenix, and Oracle Cloud Infrastructure. We are proud of our commitment to learning new technologies and getting outside our comfort zone. While it is not everything we imagined, the project is complete and usable!

What I learned

Elixir is a highly scalable language built for real-time updates, and it is fairly easy to get started with. We also learned about audio and video file types and how they interact with JavaScript and JSON.
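One of those interactions is worth spelling out: JSON cannot carry raw bytes, so a common way to ship synthesized audio from a backend to a browser is to base64-encode it inside the JSON payload and decode it on the frontend into a data URL or Blob for playback. The sketch below (in Python, for illustration; not necessarily the exact scheme Sonar uses) shows both directions of that round trip.

```python
import base64
import json

def audio_to_json(audio_bytes, mime="audio/mpeg"):
    """Wrap raw audio bytes in a JSON-safe payload.

    base64 turns arbitrary bytes into ASCII text that JSON can carry;
    the "mime" field tells the frontend how to play the decoded data.
    """
    return json.dumps({
        "mime": mime,
        "data": base64.b64encode(audio_bytes).decode("ascii"),
    })

def json_to_audio(payload):
    """Inverse: recover the original raw bytes on the receiving side."""
    obj = json.loads(payload)
    return base64.b64decode(obj["data"])
```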

What's next for Sonar - Helping the Visually-Impaired See

We would love to continue working toward complete awareness of the environment: facial recognition to detect your friends, and emotion recognition to help users better connect with the people around them.

Try it out



css, elixir, google-cloud, html, javascript, oracle-cloud, phoenix
