Night Time Image Enhancement
Mrs. B. Rajeswari, Assistant Professor, Department of IT,
KKR & KSR Institute of Technology and Sciences, Guntur Dt., Andhra Pradesh.
Basireddy Swathi, Biyyam Mani Sandhya, Chandolu Supraja, Animisetty Harika
UG Students, Department of IT,
KKR & KSR Institute of Technology and Sciences, Guntur Dt., Andhra Pradesh.
basireddyswathi1508@gmail.com, luckylakshmi8653@gmail.com, supraja7207@gmail.com,
harikaanimisetty555@gmail.com
Abstract
Nighttime image enhancement plays a crucial role in applications such as surveillance, autonomous driving, and photography. However, capturing high-quality images in low-light conditions remains challenging due to limited visibility and increased noise levels. In this project, we propose an approach for enhancing nighttime images using MIRNet, a state-of-the-art deep learning architecture designed specifically for low-light image enhancement. We collect a dataset of low-light images paired with their corresponding well-exposed counterparts and train the MIRNet model to learn the mapping between the two domains.
The MIRNet architecture incorporates convolutional layers with residual connections to effectively capture low-light image features and generate visually pleasing enhancements. We evaluate our approach on a diverse range of nighttime scenes and compare the results against existing methods. Our experiments demonstrate that MIRNet produces superior results on nighttime images, significantly improving visibility, reducing noise, and preserving image detail. The proposed approach holds promise for real-world applications where high-quality nighttime imagery is essential for decision-making and visual analysis.
Keywords: Nighttime image enhancement, MIRNet, Deep learning, Low-light imaging, Image processing, Convolutional neural networks (CNNs), Residual connections, Supervised learning, Dataset preparation, Image quality improvement, Noise reduction, Visibility enhancement, Surveillance, Autonomous driving, Photography.