Distributed Visual Sensor Network Calibration Based on Joint Object Detections

Jennifer Simonjan, Bernhard Rinner

Publication: Conference proceedings / Contribution in book or report › Conference article › Peer-reviewed

Abstract

In this paper we present a distributed, autonomous network calibration algorithm that enables visual sensor networks to gather knowledge about the network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate the relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information that needs to be exchanged between nodes. The process works iteratively, first calibrating camera neighbors in a pairwise manner and then spreading the calibration information through the network. Further, each node operates within its local coordinate system, avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we rely solely on geometric constraints.
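The pairwise step described in the abstract, relating two cameras through distance and angle measurements to jointly detected objects, can be sketched as a planar relative-pose estimate. The sketch below is an illustrative assumption, not the paper's exact method: the function names are hypothetical, and a 2D Procrustes-style solver is used to recover the rotation and translation of a neighbor camera B in camera A's local frame from two or more joint detections.

```python
import math

def polar_to_xy(measurements):
    # Convert (distance, bearing angle) pairs, as exchanged between
    # nodes, into Cartesian points in the camera's local frame.
    return [(d * math.cos(a), d * math.sin(a)) for d, a in measurements]

def relative_pose(meas_a, meas_b):
    """Estimate (tx, ty, theta): position and orientation of camera B
    in camera A's local coordinate system, from (distance, bearing)
    measurements of the same jointly detected objects.
    Requires at least two non-coincident joint detections."""
    pa = polar_to_xy(meas_a)
    pb = polar_to_xy(meas_b)
    n = len(pa)
    ca = (sum(x for x, _ in pa) / n, sum(y for _, y in pa) / n)
    cb = (sum(x for x, _ in pb) / n, sum(y for _, y in pb) / n)
    # 2D Procrustes: least-squares rotation angle from the
    # centered point correspondences.
    num = den = 0.0
    for (xa, ya), (xb, yb) in zip(pa, pb):
        qax, qay = xa - ca[0], ya - ca[1]
        qbx, qby = xb - cb[0], yb - cb[1]
        num += qbx * qay - qby * qax   # cross terms
        den += qbx * qax + qby * qay   # dot terms
    theta = math.atan2(num, den)       # rotation taking B's frame to A's
    c, s = math.cos(theta), math.sin(theta)
    # Translation: where B's origin lands in A's frame.
    tx = ca[0] - (c * cb[0] - s * cb[1])
    ty = ca[1] - (s * cb[0] + c * cb[1])
    return tx, ty, theta
```

Each node can run this against every neighbor with an overlapping field of view, yielding the pairwise relative poses that are then chained and spread through the network, with every node keeping its own local coordinate system.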
Original language: German (Austria)
Title: 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)
Pages: 109-116
Number of pages: 8
DOIs
Publication status: Published – 7 June 2017
Published externally: Yes
Event: 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) – Ottawa, ON, Canada
Duration: 5 June 2017 – 7 June 2017

Conference

Conference: 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)
Period: 5/06/17 – 7/06/17

Keywords

  • Cameras
  • Calibration
  • Estimation
  • Object detection
  • Three-dimensional displays
  • Visualization
  • Network topology
