A Survey on Object Detection Performance with Different Data Distributions

Title:
A Survey on Object Detection Performance with Different Data Distributions
Journal Title:
Lecture Notes in Computer Science
Publication Date:
02 November 2021
Citation:
Pahwa, R. S., Chang, R., Jie, W., Satini, S., Viswanathan, C., Yiming, D., … Wah, W. K. (2021). A Survey on Object Detection Performance with Different Data Distributions. Lecture Notes in Computer Science, 553–563. doi:10.1007/978-3-030-90525-5_48
Abstract:
Detecting objects in a dynamic scene is a critical step for robotic navigation. A mobile robot may need to slow down in the presence of children, the elderly, or dense crowds. A robot's movement needs to be precise and socially adjustable, especially in a hospital setting. Identifying key objects in a scene can provide important contextual awareness to a robot. Traditional approaches used handcrafted features along with object proposals to detect objects in images. Object detection has made tremendous progress over the past few years thanks to deep learning and convolutional neural networks. Networks such as SSD, YOLO, and Faster R-CNN have made significant improvements over traditional techniques while maintaining real-time inference speed. However, existing datasets used for benchmarking these models tend to contain mainly outdoor images captured with a high-quality camera setup, which differs from a robotic vision setting where a robot moving through a dynamic environment introduces sensor noise, motion blur, and changes in data distribution. In this work, we introduce our custom dataset collected in a realistic hospital environment, consisting of distinct objects such as hospital beds, tables, and wheelchairs. We also use state-of-the-art object detectors to showcase the current performance and gaps in a robotic vision setting using our custom CHART dataset and other public datasets.
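
For readers unfamiliar with the benchmarking workflow the abstract refers to, the minimal sketch below shows how an off-the-shelf, COCO-pretrained Faster R-CNN from torchvision can be run on a single image to inspect detections. This is an illustrative assumption, not code from the paper: the image path, score threshold, and choice of torchvision are placeholders for whatever detectors and data pipeline the authors actually use on the CHART dataset.

    # Sketch only (not the paper's code): run a pretrained Faster R-CNN on one image.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor
    from PIL import Image

    # Load a COCO-pretrained detector (torchvision >= 0.13 uses the `weights` API).
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    # Hypothetical image path; replace with a frame from the target environment.
    image = Image.open("hospital_scene.jpg").convert("RGB")

    with torch.no_grad():
        # The model takes a list of CHW float tensors and returns one dict per image.
        predictions = model([to_tensor(image)])[0]

    # Keep detections above an arbitrary confidence threshold.
    score_threshold = 0.5
    for box, label, score in zip(predictions["boxes"],
                                 predictions["labels"],
                                 predictions["scores"]):
        if score >= score_threshold:
            print(f"label={label.item():3d}  score={score:.2f}  box={box.tolist()}")
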
License type:
Publisher Copyright
Funding Info:
This research/project is supported by the A*STAR National Robotics Programme (Grant Reference No.: 1922500049).
Description:
This is a post-peer-review, pre-copyedit version of an article published in Social Robotics. The final authenticated version is available online at: https://doi.org/10.1007/978-3-030-90525-5_48
ISSN:
1611-3349
0302-9743
Files uploaded:

icsr-revised.pdf (9.52 MB, PDF)