One paper has been accepted by TIP

Our paper entitled "See Degraded Objects: A Physics-guided Approach for Object Detection in Adverse Environments" has been accepted by IEEE Transactions on Image Processing (TIP).


See Degraded Objects: A Physics-guided Approach for Object Detection in Adverse Environments

Weifeng Liu, Jian Pang, Bingfeng Zhang, Jin Wang, Baodi Liu, Dapeng Tao

In adverse environments, detectors often fail to detect degraded objects because they are nearly invisible and their features are weakened by the environment. Common approaches apply image enhancement to support detection, but enhancement inevitably introduces human-invisible noise that harms the detector. In this work, we propose a physics-guided approach for object detection in adverse environments: a straightforward solution that injects physical priors into the detector, enabling it to detect poorly visible objects. The physical priors, derived from the imaging mechanism and image properties, include an environment prior and a frequency prior. The environment prior is generated from a physical model, e.g., the atmospheric model, and reflects the density of environmental noise. The frequency prior is built on the observation that the amplitude spectrum can highlight object regions against the background. The two priors are complementary in principle. Furthermore, we present a physics-guided loss that incorporates a novel weight term, estimated by applying a membership function to the physical priors, which captures the extent of degradation. By backpropagating the physics-guided loss, physics knowledge is injected into the detector to aid in locating degraded objects. We conduct experiments in a synthetic foggy environment, a real foggy environment, and a real underwater scenario. The results demonstrate that our method is effective and achieves state-of-the-art performance. The code is available at https://github.com/PangJian123/See-Degraded-Objects.
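
To make the idea concrete, below is a minimal NumPy sketch of how two such priors and a weighted loss could fit together. It is only an illustration inferred from the abstract: the dark-channel transmission estimate, the spectral-residual-style saliency, the linear membership ramp, and all function names are assumptions, not the authors' implementation (see the GitHub repository for the actual code).

import numpy as np
from scipy.ndimage import minimum_filter, uniform_filter


def environment_prior(img, omega=0.95, patch=15):
    # Haze-density map from the atmospheric scattering model I = J*t + A*(1 - t),
    # using a dark-channel-style estimate of the transmission t (an assumption here).
    airlight = img.reshape(-1, 3).max(axis=0)                      # crude atmospheric light
    t = 1.0 - omega * minimum_filter((img / airlight).min(axis=2), size=patch)
    return 1.0 - np.clip(t, 0.0, 1.0)                              # high value = dense haze


def frequency_prior(img):
    # Spectral-residual-style saliency: use the amplitude spectrum to suppress the
    # repetitive background and highlight object regions (stand-in for the paper's prior).
    gray = img.mean(axis=2)
    spec = np.fft.fft2(gray)
    log_amp, phase = np.log(np.abs(spec) + 1e-8), np.angle(spec)
    residual = log_amp - uniform_filter(log_amp, size=3)           # remove the smooth trend
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = uniform_filter(sal, size=9)                              # smooth the saliency map
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-8)      # normalize to [0, 1]


def membership_weight(env, freq, w_max=2.0):
    # Ramp membership function: heavier degradation (dense haze, weak frequency
    # response) is mapped to a larger loss weight in [1, w_max].
    degradation = np.clip(0.5 * env + 0.5 * (1.0 - freq), 0.0, 1.0)
    return 1.0 + (w_max - 1.0) * degradation


def physics_guided_bce(pred, target, weight):
    # Per-pixel binary cross-entropy scaled by the physics-derived weight map.
    pred = np.clip(pred, 1e-6, 1.0 - 1e-6)
    bce = -(target * np.log(pred) + (1.0 - target) * np.log(1.0 - pred))
    return float((weight * bce).mean())


if __name__ == "__main__":
    img = np.random.rand(128, 128, 3).astype(np.float32)           # stand-in hazy image
    weight = membership_weight(environment_prior(img), frequency_prior(img))
    pred = np.random.rand(128, 128)
    target = (np.random.rand(128, 128) > 0.5).astype(np.float32)
    print("physics-guided loss:", physics_guided_bce(pred, target, weight))

In a real detector, one would typically pool such a weight map over each ground-truth region and scale the corresponding classification and regression loss terms, so that heavily degraded objects contribute larger gradients during backpropagation.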
