Abstract
In this paper we present a distributed, autonomous network calibration algorithm that enables visual sensor networks to gather knowledge about the network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate the relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information that nodes need to exchange. The process works iteratively, first calibrating camera neighbors in a pairwise manner and then spreading the calibration information through the network. Furthermore, each node operates within its own local coordinate system, avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we rely solely on geometric constraints.
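The pairwise calibration step described above can be illustrated with a small sketch. Assuming a 2-D setting in which each camera measures only the distance and bearing angle to jointly detected objects in its own local frame, the relative pose of one camera in the other's coordinate system follows from a rigid alignment (2-D Procrustes) of the two point sets. All function names below are illustrative, not taken from the paper:

```python
import math

def polar_to_xy(measurements):
    # Convert (distance, bearing) pairs into local Cartesian coordinates.
    return [(d * math.cos(a), d * math.sin(a)) for d, a in measurements]

def relative_pose(obs_a, obs_b):
    """Estimate the rotation angle and translation of camera B's frame
    expressed in camera A's frame, from >= 2 jointly detected objects.

    obs_a, obs_b: lists of (distance, bearing) measurements of the SAME
    objects, in the same order, taken by cameras A and B respectively.
    Solves a_i = R * b_i + t via 2-D Procrustes alignment.
    """
    pa = polar_to_xy(obs_a)
    pb = polar_to_xy(obs_b)
    n = len(pa)
    ca = (sum(x for x, _ in pa) / n, sum(y for _, y in pa) / n)
    cb = (sum(x for x, _ in pb) / n, sum(y for _, y in pb) / n)
    # Accumulate cross-covariance terms of the centered point sets;
    # for a 2-D rotation these reduce to a cosine and a sine component.
    sxx = sxy = 0.0
    for (ax, ay), (bx, by) in zip(pa, pb):
        qax, qay = ax - ca[0], ay - ca[1]
        qbx, qby = bx - cb[0], by - cb[1]
        sxx += qbx * qax + qby * qay   # proportional to cos(theta)
        sxy += qbx * qay - qby * qax   # proportional to sin(theta)
    theta = math.atan2(sxy, sxx)       # orientation of B relative to A
    c, s = math.cos(theta), math.sin(theta)
    # Translation: position of B's origin in A's coordinate system.
    tx = ca[0] - (c * cb[0] - s * cb[1])
    ty = ca[1] - (s * cb[0] + c * cb[1])
    return theta, (tx, ty)
```

Once two neighbors have exchanged their measurement lists and computed this transform, each can express the other's pose in its own local coordinate system; chaining such transforms is what spreads calibration through the network without any global frame.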
| Original language | German (Austria) |
| --- | --- |
| Title of host publication | 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) |
| Pages | 109-116 |
| Number of pages | 8 |
| DOIs | |
| Publication status | Published - 7 Jun 2017 |
| Externally published | Yes |
| Event | 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) - Ottawa, ON, Canada. Duration: 5 Jun 2017 → 7 Jun 2017 |
Conference

| Conference | 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) |
| --- | --- |
| Period | 5/06/17 → 7/06/17 |