Distributed Visual Sensor Network Calibration Based on Joint Object Detections

Jennifer Simonjan, Bernhard Rinner

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

In this paper, we present a distributed, autonomous network calibration algorithm that enables visual sensor networks to gather knowledge about their network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate the relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information nodes need to exchange. The process works iteratively: camera neighbors are first calibrated in a pairwise manner, and the calibration information then spreads through the network. Further, each node operates within its local coordinate system, avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we rely solely on geometric constraints.
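The pairwise calibration step can be illustrated with a minimal 2D geometric sketch. The code below is an assumption-laden illustration, not the paper's exact formulation: it assumes each camera reports a (distance, bearing) pair per jointly detected object in its own local frame, and that exactly two joint detections are available; the function name relative_pose_from_joint_detections and the two-detection minimal solver are illustrative.

```python
import math

def relative_pose_from_joint_detections(obs_a, obs_b):
    """Estimate camera B's position and orientation in camera A's local
    frame from two jointly detected objects.

    obs_a, obs_b: two (distance, bearing) pairs each, measured by camera A
    and camera B for the same two objects (bearings in radians, taken in
    each camera's own local coordinate system).
    """
    # Turn each (distance, bearing) measurement into a 2D point in the
    # measuring camera's local frame.
    pts_a = [(d * math.cos(b), d * math.sin(b)) for d, b in obs_a]
    pts_b = [(d * math.cos(b), d * math.sin(b)) for d, b in obs_b]

    # The displacement between the two objects is one physical vector seen
    # from two frames; the angle between its two versions is camera B's
    # orientation relative to camera A.
    vax, vay = pts_a[1][0] - pts_a[0][0], pts_a[1][1] - pts_a[0][1]
    vbx, vby = pts_b[1][0] - pts_b[0][0], pts_b[1][1] - pts_b[0][1]
    theta = math.atan2(vay, vax) - math.atan2(vby, vbx)

    # Rotate B's view of the first object into A's frame; the leftover
    # offset is B's position in A's coordinate system.
    c, s = math.cos(theta), math.sin(theta)
    rx = c * pts_b[0][0] - s * pts_b[0][1]
    ry = s * pts_b[0][0] + c * pts_b[0][1]
    return (pts_a[0][0] - rx, pts_a[0][1] - ry), theta

# Example: B actually sits at (4, 0) in A's frame, rotated by pi.
obs_a = [(math.hypot(2, 1), math.atan2(1, 2)),
         (math.hypot(3, -1), math.atan2(-1, 3))]
obs_b = [(math.hypot(2, -1), math.atan2(-1, 2)),
         (math.hypot(1, 1), math.atan2(1, 1))]
position, theta = relative_pose_from_joint_detections(obs_a, obs_b)
# position ~ (4.0, 0.0), theta ~ -pi (equivalent to pi modulo 2*pi)
```

With more than two joint detections, the same relation can be solved in a least-squares sense, which would be the natural extension for noisy measurements.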
Original language: English
Title of host publication: 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)
Pages: 109-116
Number of pages: 8
DOIs
Publication status: Published - 7 Jun 2017
Externally published: Yes
Event: 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) - Ottawa, ON, Canada
Duration: 5 Jun 2017 - 7 Jun 2017

Conference

Conference: 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS)
Period: 5/06/17 - 7/06/17
