An AI-Powered Autonomous System for Real-Time Blind Smart Shoe

Authors

  • S. Ananth
  • L. S. Kavitha
  • S. Kesavprabu
  • S. Prasanna
  • S. Veerajayashuriya
  • M. Vignesh Kumar

Keywords

Blind Navigation System, AI-powered navigation, YOLO object detection, ESP32-CAM, autonomous system, mobility enhancement, AI-Powered Smart Shoe

Abstract

This project introduces an AI-Powered Autonomous System for a Real-Time Blind Smart Shoe, aimed at enhancing mobility and independence for visually impaired individuals. The system integrates artificial intelligence, the Internet of Things (IoT), and real-time computer vision to identify obstacles and navigate safely through varied environments.

At the core of the system is an ESP32-CAM module, which continuously captures the surroundings in real time. The captured frames are transmitted over Wi-Fi to a dedicated Android application, which processes them using pre-trained deep learning models to detect and classify objects or obstacles in the user's path. Once an object is identified, the app immediately converts the detection result into voice output using Text-to-Speech (TTS) technology.

To ensure that the voice feedback reaches the user clearly, a USB Bluetooth audio module is connected to a small speaker or headset worn by the user. The system can therefore deliver spoken alerts such as "Obstacle ahead", "Person", or "Vehicle", helping the blind user make informed decisions while walking.

Additionally, the system integrates Google Maps within the app to support GPS-based route guidance. The user can input or select a destination, and the app provides real-time navigation commands such as "Turn left in 20 meters" or "You have reached your destination", ensuring smooth travel along predefined or dynamically generated routes.

By combining real-time image recognition, auditory feedback, and navigation assistance, the Blind Smart Shoe system serves as a smart wearable solution that improves both environmental awareness and route navigation for blind and visually impaired individuals, ultimately promoting safety, autonomy, and confidence in daily mobility.
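The abstract describes a capture, detect, and announce loop that runs inside the Android application. The sketch below is a minimal desktop prototype of that same loop, written in Python only for illustration; it is not the authors' implementation. The stream URL, the MobileNet-SSD model files, the trimmed COCO label map, and the pyttsx3 TTS engine are all assumptions introduced for this sketch.

# Minimal prototype of the capture -> detect -> announce pipeline.
# Assumptions (not from the paper): the ESP32-CAM exposes an MJPEG stream at
# http://<camera-ip>:81/stream, a MobileNet-SSD model exported for OpenCV's
# dnn module is available locally, and pyttsx3 provides offline TTS.
import time
import cv2          # pip install opencv-python
import pyttsx3      # pip install pyttsx3

STREAM_URL = "http://192.168.4.1:81/stream"   # hypothetical ESP32-CAM address
CONF_THRESHOLD = 0.5
ANNOUNCE_COOLDOWN_S = 3.0                     # avoid repeating the same alert

# Hypothetical model files: a frozen MobileNet-SSD trained on COCO.
net = cv2.dnn.readNetFromTensorflow("frozen_inference_graph.pb",
                                    "ssd_mobilenet_v2_coco.pbtxt")
COCO_LABELS = {1: "person", 3: "car", 18: "dog"}   # trimmed label map

tts = pyttsx3.init()
last_spoken = {}

def announce(label):
    """Speak a label at most once per cooldown window."""
    now = time.time()
    if now - last_spoken.get(label, 0.0) < ANNOUNCE_COOLDOWN_S:
        return
    last_spoken[label] = now
    tts.say(f"{label} ahead")
    tts.runAndWait()

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    blob = cv2.dnn.blobFromImage(frame, size=(300, 300), swapRB=True)
    net.setInput(blob)
    detections = net.forward()          # shape: (1, 1, N, 7)
    for det in detections[0, 0]:
        score, class_id = float(det[2]), int(det[1])
        if score >= CONF_THRESHOLD:
            announce(COCO_LABELS.get(class_id, "obstacle"))
cap.release()

In the system described above, this loop is realized by the Android application: detection is performed by the app's pre-trained model on the incoming Wi-Fi frames, and the spoken alerts are produced by the platform's TTS engine and routed to the Bluetooth speaker or headset worn by the user.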

Downloads

Download data is not yet available.


Published

2025-07-17

How to Cite

Ananth S, Kavitha LS, Kesavprabu S, Prasanna S, Veerajayashuriya S, Kumar MV. An AI-Powered Autonomous System for Real-Time Blind Smart Shoe. J Neonatal Surg [Internet]. 2025 Jul. 17 [cited 2025 Sep. 18];14(32S):5517-31. Available from: https://www.jneonatalsurg.com/index.php/jns/article/view/8330
