Motion dependent spatiotemporal smoothing for noise reduction in very dim light image sequences

Authors

Summary, in English

A new method for noise reduction using spatiotemporal smoothing is presented in this paper. The method is developed especially for reducing the noise that arises when acquiring video sequences with a camera under very dim light conditions. The work is inspired by research on the vision of nocturnal animals and the adaptive spatial and temporal summation that is prevalent in the visual systems of these animals. Based on analysis using the so-called structure tensor in three-dimensional spatiotemporal space, together with motion segmentation and global ego-motion estimation, Gaussian-shaped smoothing kernels are oriented mainly along the direction of motion and in spatially homogeneous directions. In static areas, smoothing along the temporal dimension is favoured for maximum preservation of structure. The technique has been applied to various dim-light image sequences, and results of these experiments are presented here.
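The abstract describes the pipeline only in words. As a rough illustration of its central step, the following is a minimal sketch, assuming a grayscale video volume with axes (t, y, x): it computes the 3D structure tensor per voxel and then Gaussian-averages each voxel along the eigenvector with the smallest eigenvalue, i.e. the direction of least intensity change (along motion, or purely temporal in static regions). The function names, parameter values, and the nearest-neighbour sampling are illustrative assumptions; the paper's full method additionally uses motion segmentation and global ego-motion estimation, which are omitted here.

```python
# Minimal sketch of structure-tensor-oriented spatiotemporal smoothing.
# Parameter values are hypothetical, not the authors' settings.
import numpy as np
from scipy import ndimage

def structure_tensor_3d(volume, sigma_grad=1.0, sigma_window=2.0):
    """Per-voxel 3x3 structure tensor of a (t, y, x) grayscale volume."""
    # Gaussian-derivative gradients along t, y and x.
    grads = [ndimage.gaussian_filter1d(volume, sigma_grad, axis=a, order=1)
             for a in range(3)]
    J = np.empty(volume.shape + (3, 3))
    for i in range(3):
        for j in range(i, 3):
            # Local averaging of gradient products gives the tensor entries.
            Jij = ndimage.gaussian_filter(grads[i] * grads[j], sigma_window)
            J[..., i, j] = Jij
            J[..., j, i] = Jij
    return J

def smooth_least_variation(volume, sigma=2.0, radius=4):
    """Gaussian-average each voxel along the eigenvector of its structure
    tensor with the smallest eigenvalue: the direction of least intensity
    change (along motion, or purely temporal in static regions)."""
    J = structure_tensor_3d(volume)
    _, eigvecs = np.linalg.eigh(J)          # eigenvalues in ascending order
    direction = eigvecs[..., 0]             # smallest-eigenvalue eigenvector
    offsets = np.arange(-radius, radius + 1)
    weights = np.exp(-0.5 * (offsets / sigma) ** 2)
    weights /= weights.sum()
    T, Y, X = volume.shape
    tt, yy, xx = np.meshgrid(np.arange(T), np.arange(Y), np.arange(X),
                             indexing='ij')
    out = np.zeros(volume.shape)
    for k, w in zip(offsets, weights):
        # Step k voxels along the local direction (nearest-neighbour sampling).
        ts = np.clip(np.rint(tt + k * direction[..., 0]), 0, T - 1).astype(int)
        ys = np.clip(np.rint(yy + k * direction[..., 1]), 0, Y - 1).astype(int)
        xs = np.clip(np.rint(xx + k * direction[..., 2]), 0, X - 1).astype(int)
        out += w * volume[ts, ys, xs]
    return out

# Usage on synthetic noisy data (illustrative only):
noisy = np.random.default_rng(0).normal(0.0, 0.1, size=(16, 64, 64))
denoised = smooth_least_variation(noisy)
```

Smoothing along the smallest-eigenvalue direction is what preserves structure: edges and moving features show large intensity variation across them but little along them, so averaging along that axis suppresses noise without blurring the feature.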

Publication year

2006

Language

English

Pages

954–959

Publication/Journal/Series

Proceedings - International Conference on Pattern Recognition

Volume

3

Document type

Conference paper

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.

Subject

  • Zoology

Keywords

  • Visual systems
  • Gaussian-shaped smoothing kernels
  • Video sequences
  • Image sequences

Conference name

18th International Conference on Pattern Recognition, ICPR 2006

Conference date

2006-08-20 - 2006-08-24

Conference place

Hong Kong, China

Status

Published

Research group

  • Lund Vision Group

ISBN/ISSN/Other

  • ISSN: 1051-4651