Euclidean and Affine Structure/Motion for Uncalibrated Cameras from Affine Shape and Subsidiary Information

Authors

Summary, in English

The paper deals with the structure-motion problem for uncalibrated cameras, in the case that subsidiary information is available, consisting, for example, of known coplanarities or parallelisms among points in the scene, or known positions of some focal points (hand-eye calibration). Despite unknown camera calibrations, it is shown that in many instances the subsidiary information makes affine or even Euclidean reconstruction possible. A parametrization by affine shape and depth is used, providing a simple framework for the incorporation of a priori knowledge and enabling the development of iterative, rapidly converging algorithms. Any number of points in any number of images are used in a uniform way, with equal priority, and independently of coordinate representations. Moreover, occlusions are allowed.
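The abstract does not spell out the shape-and-depth parametrization or the iteration itself, so the following is only a minimal, hedged sketch of the general idea of iterating over a depth-scaled reconstruction (alternating a rank-4 factorization of the measurement matrix with re-estimation of projective depths); it is not the authors' algorithm, and all function names, the initialisation, and the fixed iteration count are assumptions made for illustration.

    # Illustrative sketch only (not the paper's method): alternate between a
    # rank-4 factorization of the depth-scaled measurement matrix and
    # re-estimation of the projective depths.  Names and initialisation are
    # assumptions for illustration.
    import numpy as np

    def factorize(W):
        """Rank-4 factorization of the depth-scaled measurement matrix W (3m x n)."""
        U, s, Vt = np.linalg.svd(W, full_matrices=False)
        P = U[:, :4] * s[:4]   # stacked camera matrices, 3m x 4
        X = Vt[:4, :]          # homogeneous scene points, 4 x n
        return P, X

    def iterate_structure_motion(x, n_iter=50):
        """x: (m, 3, n) array of homogeneous image points (m views, n points).
        Returns stacked cameras P (3m x 4) and points X (4 x n)."""
        m, _, n = x.shape
        depths = np.ones((m, n))  # initial projective depths (assumed)
        for _ in range(n_iter):
            # build the depth-scaled measurement matrix, 3m x n
            W = np.vstack([depths[i] * x[i] for i in range(m)])
            W /= np.linalg.norm(W)  # keep the overall scale bounded
            P, X = factorize(W)
            # re-estimate each depth as the scale relating the reprojection
            # to the measured image point
            for i in range(m):
                proj = P[3 * i:3 * i + 3] @ X  # reprojected points, 3 x n
                depths[i] = (np.einsum('ij,ij->j', proj, x[i])
                             / np.einsum('ij,ij->j', x[i], x[i]))
        return P, X

In this sketch, the paper's use of subsidiary information (coplanarities, parallelisms, known focal points) and its handling of occlusions are deliberately left out, since their exact incorporation is specific to the authors' parametrization.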

Department/s

Publishing year

1998

Language

English

Pages

187-207

Publication/Journal/Series

3D Structure from Multiple Images of Large-Scale Environments. European Workshop, SMILE'98. Proceedings

Document type

Conference paper

Publisher

Springer

Subject

  • Mathematics

Keywords

  • image reconstruction
  • motion estimation
  • uncalibrated cameras
  • affine shape
  • structure-motion problem
  • Euclidean reconstruction
  • occlusions
  • computer vision
  • affine shape and depth

Conference name

3D Structure from Multiple Images of Large-Scale Environments. European Workshop, SMILE'98. Proceedings

Conference date

1998-06-06 - 1998-06-07

Conference place

Freiburg, Germany

Status

Published

ISBN/ISSN/Other

  • ISBN: 3-540-65310-4