Smart_Eye: A Navigation and Obstacle Detection for Visually Impaired People through Smart App


  • Bhasha Pydala, KLEF (Deemed to be University), Guntur
  • T. Pavan Kumar, KLEF (Deemed to be University), Guntur
  • K. Khaja Baseer, Mohan Babu University



Keywords: Visually impaired, Objects, Detection, Recognition, Artificial Intelligence, Sensors.


Vision plays a vital role in daily life, and its loss is a serious problem for anyone affected. According to World Health Organization (WHO) statistics published in December 2021, more than 283 million people worldwide suffer from sight problems, including 39 million who are blind and 228 million with low vision. Navigating unfamiliar environments is a significant challenge for partially sighted and visually impaired people (VIPs); improving the information available about the location and nature of surrounding objects can make such navigation easier. Over the decades, many devices have been developed to assist VIPs and to enhance their quality of life and independence. Although numerous navigation aids already exist, in practice they are rarely adopted and deployed: many of these gadgets are too heavy or too expensive for universal use. It is therefore necessary, while weighing the strengths and limitations of related work, to produce a low-cost assistive device for people with visual disabilities. The proposed model offers an efficient solution that lets VIPs move from place to place on their own through a smart application combining AI and sensor technology. The application captures and classifies images, obstacles are detected with ultrasonic sensors, and the user is informed of obstacles in the path through voice feedback. The proposed model performs well for VIPs on both qualitative and quantitative measures, enabling a ranking of the evaluated systems according to their potential influence on visually impaired people's lives.
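The obstacle-detection step described above can be illustrated with a minimal sketch. It assumes a generic HC-SR04-style ultrasonic sensor that reports a round-trip echo time, and a hypothetical warning threshold of 100 cm; the function names and threshold are illustrative, not taken from the paper. The resulting message string is what the app would hand to a text-to-speech engine as the voice prompt.

```python
# Minimal sketch of the ultrasonic-distance-to-voice-prompt pipeline.
# Assumptions (not from the paper): an HC-SR04-style sensor giving a
# round-trip echo time, and a 100 cm warning threshold.

SPEED_OF_SOUND_CM_S = 34300  # speed of sound in air at roughly 20 C


def echo_to_distance_cm(echo_seconds: float) -> float:
    """Convert a round-trip echo time to a one-way distance in centimetres."""
    # The pulse travels to the obstacle and back, so halve the path length.
    return echo_seconds * SPEED_OF_SOUND_CM_S / 2


def obstacle_message(distance_cm: float, threshold_cm: float = 100.0) -> str:
    """Map a measured distance to the prompt a TTS engine would speak."""
    if distance_cm < threshold_cm:
        return f"Obstacle ahead at about {distance_cm:.0f} centimetres"
    return "Path clear"


# Example: a 10 ms echo corresponds to an obstacle ~171.5 cm away.
print(obstacle_message(echo_to_distance_cm(0.01)))
```

On a real device the echo time would come from sensor GPIO timing and the string would be passed to the phone's text-to-speech service rather than printed.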










How to Cite

Pydala, B., Kumar, T. P., & Baseer, K. K. (2023). Smart_Eye: A Navigation and Obstacle Detection for Visually Impaired People through Smart App. Journal of Applied Engineering and Technological Science (JAETS), 4(2), 992–1011.