A methodology for evaluating illumination artifact removal for corresponding images


by Vaudrey, T., Wedel, A. and Klette, R.
Abstract:
Robust stereo and optical flow disparity matching is essential for computer vision applications with varying illumination conditions. Most robust disparity matching algorithms rely on computationally expensive normalized variants of the brightness constancy assumption to compute the matching criterion. In this paper, we reinvestigate the removal of global and large area illumination artifacts, such as vignetting, camera gain, and shading reflections, by directly modifying the input images. We show that this significantly reduces violations of the brightness constancy assumption, while maintaining the information content in the images. In particular, we define metrics and perform a methodical evaluation to identify the loss of information in the images. Next we determine the reduction of brightness constancy violations. Finally, we experimentally validate that modifying the input images yields robustness against illumination artifacts for optical flow disparity matching.
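
The core idea in the abstract is to pre-process the input images so that smooth, large-area illumination effects (vignetting, camera gain, shading) no longer violate the brightness constancy assumption that matching relies on. The Python sketch below illustrates that idea only in outline: the Gaussian high-pass step, the sigma value, the synthetic image pair, and all function names are illustrative assumptions and are not taken from the paper.

# Minimal sketch (not the authors' exact method) of removing large-area
# illumination components by subtracting a heavily smoothed (low-frequency)
# copy of each image before matching, and measuring how this reduces
# brightness constancy violations, i.e. deviations from
# I(x+u, y+v, t+1) ~= I(x, y, t) for corresponding pixels.
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_large_scale_illumination(image, sigma=20.0):
    """Subtract a strongly smoothed copy to suppress vignetting/gain/shading."""
    image = image.astype(np.float64)
    low_frequency = gaussian_filter(image, sigma=sigma)
    return image - low_frequency

def brightness_constancy_violation(img_a, img_b):
    """Mean absolute intensity difference between corresponding images."""
    return float(np.mean(np.abs(img_a.astype(np.float64) - img_b.astype(np.float64))))

# Two synthetic "corresponding" images that differ only by a smooth
# illumination gradient (a stand-in for vignetting or camera gain).
h, w = 128, 128
rng = np.random.default_rng(0)
texture = rng.random((h, w))
gradient = np.linspace(0.0, 0.5, w)[None, :]   # smooth illumination change
img_a, img_b = texture, texture + gradient

print(brightness_constancy_violation(img_a, img_b))   # large: dominated by the gradient
print(brightness_constancy_violation(remove_large_scale_illumination(img_a),
                                     remove_large_scale_illumination(img_b)))  # much smaller

Subtracting a heavily smoothed copy keeps the fine texture (the information content used for matching) while discarding the low-frequency illumination component; the paper's metrics evaluate exactly this trade-off between information loss and the reduction of brightness constancy violations.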
Reference:
A methodology for evaluating illumination artifact removal for corresponding images (Vaudrey, T., Wedel, A. and Klette, R.), In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), volume 5702, pages 1113-1121, 2009.
Bibtex Entry:
@inproceedings{vaudrey2009aimages,
author = "Vaudrey, T and Wedel, A and Klette, R",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "1113--1121",
title = "A methodology for evaluating illumination artifact removal for corresponding images",
volume = "5702 LNCS",
year = "2009",
abstract = "Robust stereo and optical flow disparity matching is essential for computer vision applications with varying illumination conditions. Most robust disparity matching algorithms rely on computationally expensive normalized variants of the brightness constancy assumption to compute the matching criterion. In this paper, we reinvestigate the removal of global and large area illumination artifacts, such as vignetting, camera gain, and shading reflections, by directly modifying the input images. We show that this significantly reduces violations of the brightness constancy assumption, while maintaining the information content in the images. In particular, we define metrics and perform a methodical evaluation to identify the loss of information in the images. Next we determine the reduction of brightness constancy violations. Finally, we experimentally validate that modifying the input images yields robustness against illumination artifacts for optical flow disparity matching.",
doi = "10.1007/978-3-642-03767-2_135",
isbn = "9783642037665",
issn = "0302-9743",
eissn = "1611-3349",
language = "eng",
}