Sensor Fusion Using Proprioceptive and Exteroceptive Sensors

Thomas Schön, Linköping University

Abstract: This talk provides an overview of some of the speaker's research on sensor fusion using both proprioceptive sensors (measuring quantities internal to the robot) and exteroceptive sensors (providing information about the robot's environment). Sensor fusion refers to the problem of computing state estimates using measurements from several different, often complementary, sensors. The strategy is explained and, perhaps more importantly, illustrated using four industrial/research applications, briefly introduced below.

1. Real-time pose estimation and autonomous landing of a helicopter (using inertial sensors and a camera).
2. Pose estimation of a helicopter using an already existing map (a processed aerial photograph of the operational area), inertial sensors and a camera.
3. Vehicle motion estimation (using inertial sensors, steering wheel sensors and an infrared camera).
4. Indoor pose estimation of a human body (using inertial sensors and ultra-wideband).

For more information, see <>
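To make the notion of computing state estimates from complementary sensors concrete, here is a minimal sketch, not taken from the talk, of fusion via a one-dimensional Kalman filter: a proprioceptive sensor (an accelerometer-style rate input) drives the prediction step, while an exteroceptive sensor (a camera position fix) drives the correction step. All signals and noise parameters below are illustrative assumptions.

```python
def predict(x, P, u, dt, q):
    """Propagate the state x using a proprioceptive rate input u."""
    x = x + u * dt          # simple dead-reckoning propagation
    P = P + q               # process noise inflates the uncertainty
    return x, P

def update(x, P, z, r):
    """Correct the prediction with an exteroceptive measurement z."""
    K = P / (P + r)         # Kalman gain: relative trust in the two sources
    x = x + K * (z - x)     # blend prediction and measurement
    P = (1.0 - K) * P       # fused estimate is more certain than either alone
    return x, P

# Illustrative run: true position advances by 1.0 each step;
# camera fixes (second element) are noisy observations of it.
x, P = 0.0, 1.0
for u, z in [(1.0, 1.1), (1.0, 2.05), (1.0, 2.9)]:
    x, P = predict(x, P, u, dt=1.0, q=0.1)
    x, P = update(x, P, z, r=0.2)
```

After three steps the fused estimate tracks the true position (close to 3.0) with an uncertainty well below that of either sensor on its own; the same predict/correct pattern underlies the multi-sensor applications listed above, only with richer state vectors and measurement models.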