An accurate low-light object detection method based on pyramid networks
10 October 2020
Conference Poster
Abstract
Low-light object detection is a challenging problem in computer vision and multimedia: most existing object detection methods lose accuracy in low-light conditions. A common remedy is to add an image-enhancement preprocessing module in front of the detection network. However, traditional image-enhancement algorithms may cause color loss, and recent deep-learning methods tend to consume too much compute, so neither is well suited to low-light object detection. We propose an accurate low-light object detection method based on pyramid networks. A low-resolution pyramid light-enhancement network is adopted to reduce computation and memory consumption, and an attention-based super-resolution network is placed before EfficientDet to improve detection accuracy. Experiments on a 10K RAW-RGB low-light image dataset show the effectiveness of the proposed method.
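The abstract's stated motivation for the pyramid design is that enhancing illumination at low resolution cuts compute and memory. A toy sketch of that idea follows: it is not the authors' learned network; the fixed gamma curve and nearest-neighbour gain-map upsampling are stand-ins for the trained pyramid enhancement module, used only to illustrate "enhance small, apply at full size".

```python
import numpy as np

def enhance_lowres_pyramid(img, levels=2, gamma=0.5):
    """Illustrative only: brighten a coarse pyramid level, then upsample
    the per-pixel gain map back to full resolution. Operating on the
    downsampled image is (4**levels)x cheaper than enhancing full-res."""
    small = img
    for _ in range(levels):
        small = small[::2, ::2]                  # naive 2x downsample per level
    enhanced_small = small ** gamma              # brighten dark regions (img in [0, 1])
    ratio = enhanced_small / np.maximum(small, 1e-6)   # low-res gain map
    # nearest-neighbour upsample of the gain map to full resolution
    gain = np.repeat(np.repeat(ratio, 2 ** levels, axis=0), 2 ** levels, axis=1)
    gain = gain[: img.shape[0], : img.shape[1]]
    return np.clip(img * gain, 0.0, 1.0)

dark = np.full((8, 8), 0.04)                     # uniformly underexposed frame
bright = enhance_lowres_pyramid(dark)
print(bright.mean())                             # → 0.2 (0.04 ** 0.5)
```

In the paper's pipeline this enhanced output would then feed an attention-based super-resolution network and EfficientDet; here only the cost-saving structure of the enhancement stage is shown.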
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Qingyang Tao, Kun Ren, Bo Feng, and Xuejin Gao, "An accurate low-light object detection method based on pyramid networks", Proc. SPIE 11550, Optoelectronic Imaging and Multimedia Technology VII, 1155015 (10 October 2020); https://doi.org/10.1117/12.2573925
KEYWORDS
Image enhancement
Computer vision technology
Detection and tracking algorithms
Machine vision
Multimedia