Reading and understanding house numbers for delivery robots using the “SVHN Dataset”

Free to read from

2024-06-19

Authors

Pradhan, Omkar N.
Tang, Gilbert
Makris, Christos
Gudipati, Radhika

ISSN

2641-0184

Citation

Pradhan O, Tang G, Makris C, Gudipati R. (2024) Reading and understanding house numbers for delivery robots using the “SVHN Dataset”. In: 2024 IEEE International Conference on Industrial Technology (ICIT), 25-27 March 2024, Bristol, UK

Abstract

Detecting street house numbers in complex environments is a challenging robotics and computer vision task that could be valuable in improving the localisation accuracy of delivery robots. The development of this technology also has positive implications for address parsing and postal services. This project focuses on building a robust and efficient system that deals with the complexities of detecting house numbers in street scenes. The models in this system are trained on Stanford University's SVHN (Street View House Numbers) dataset. Fine-tuning the YOLO (You Only Look Once) nano model produced an effective detection range of 1.02 metres to 4.5 metres, with an optimum tilt-angle allowance of ±15°. The best-performing inference resolution was 2160 × 1620 pixels, with an inference delay of 35 milliseconds.
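
As a rough illustration of the workflow described in the abstract, the sketch below fine-tunes a YOLO nano model on an SVHN-style dataset and times a single inference. The paper does not state the YOLO version, training configuration, or implementation used; the Ultralytics API, the checkpoint name "yolov8n.pt", the dataset file "svhn.yaml", and the test image "house_number.jpg" are assumptions made purely for illustration.

    # Minimal sketch (not the authors' code): fine-tune a YOLO nano model on an
    # SVHN-style dataset converted to YOLO format, then time one inference at
    # the resolution reported in the abstract. File names and hyperparameters
    # here are illustrative assumptions.
    import time

    from ultralytics import YOLO

    # Start from a pretrained nano checkpoint and fine-tune on the 10 digit classes.
    model = YOLO("yolov8n.pt")
    model.train(data="svhn.yaml", epochs=100, imgsz=640)

    # Rough inference-latency check on a single street-scene image, resized to
    # roughly the 2160 x 1620 resolution mentioned in the abstract (height, width).
    start = time.perf_counter()
    results = model.predict("house_number.jpg", imgsz=[1620, 2160])
    print(f"Inference delay: {(time.perf_counter() - start) * 1000:.1f} ms")
    print(results[0].boxes)  # detected digit bounding boxes, classes, confidences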

Keywords

Artificial Intelligence, Character Recognition, Computer Vision, Object Detection, YOLO, SVHN

Rights

Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)
