Abstract
In this paper we present a distributed, autonomous network calibration algorithm that enables visual sensor networks to gather knowledge about the network topology. A calibrated sensor network provides the basis for more robust applications, since nodes are aware of their spatial neighbors. In our approach, sensor nodes estimate the relative positions and orientations of nodes with overlapping fields of view based on jointly detected objects and geometric relations. Distance and angle measurements are the only information exchanged between nodes. The process works iteratively, first calibrating camera neighbors in a pairwise manner and then spreading the calibration information through the network. Furthermore, each node operates within its local coordinate system, avoiding the need for any global coordinates. While existing methods mostly exploit computer vision algorithms to relate nodes to each other based on their images, we rely solely on geometric constraints.
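The abstract describes pairwise calibration from jointly detected objects using only distance and angle measurements. The paper's own formulation is not reproduced here; as a hedged 2-D illustration of that geometric idea, the sketch below (function names and the two-object setup are our own assumptions, not the authors') estimates a neighboring camera's relative position and orientation from two objects observed by both cameras:

```python
import math

def polar_to_xy(dist, angle):
    # Convert one (distance, bearing angle) measurement into local x/y coordinates.
    return (dist * math.cos(angle), dist * math.sin(angle))

def pairwise_calibrate(obs_a, obs_b):
    """Estimate camera B's pose (x, y, theta) in camera A's local frame.

    obs_a, obs_b: two (distance, angle) tuples each, one per jointly
    detected object, measured independently by cameras A and B.
    Returns B's position and orientation expressed in A's coordinates.
    """
    p1, p2 = (polar_to_xy(*o) for o in obs_a)   # objects in A's frame
    q1, q2 = (polar_to_xy(*o) for o in obs_b)   # same objects in B's frame
    # Rotation: the angle that aligns the object baseline as seen by B
    # with the same baseline as seen by A.
    theta = (math.atan2(p2[1] - p1[1], p2[0] - p1[0])
             - math.atan2(q2[1] - q1[1], q2[0] - q1[0]))
    c, s = math.cos(theta), math.sin(theta)
    # Translation: rotate B's view of object 1 into A's frame and
    # take the remaining offset.
    tx = p1[0] - (c * q1[0] - s * q1[1])
    ty = p1[1] - (s * q1[0] + c * q1[1])
    return tx, ty, theta
```

With two shared objects the 2-D relative pose (two translation components plus one rotation) is fully determined; in practice more joint detections would be averaged to suppress measurement noise, and the resulting pairwise poses can then be chained to spread calibration through the network.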
| Original language | German (Austria) |
|---|---|
| Title | 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) |
| Pages | 109-116 |
| Number of pages | 8 |
| DOIs | |
| Publication status | Published - 7 June 2017 |
| Published externally | Yes |
| Event | 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) - Ottawa, ON, Canada; Duration: 5 June 2017 → 7 June 2017 |
Conference

| Conference | 2017 13th International Conference on Distributed Computing in Sensor Systems (DCOSS) |
|---|---|
| Period | 5/06/17 → 7/06/17 |
Keywords
- Cameras
- Calibration
- Estimation
- Object detection
- Three-dimensional displays
- Visualization
- Network topology