Cooperative multi-sensor data fusion to geo-localize ground moving targets using an aerial sensor and a human as an additional sensor

Date

2014

Authors

Motaghi, Azima

Abstract

The ability to track targets using Unmanned Aerial Vehicles (UAVs) has a wide range of civilian and military applications. For military personnel, it is critical to track and locate a variety of objects, including moving enemy vehicles. In civilian applications, UAVs routinely perform land surveys, weather forecasting, search-and-rescue missions, and farm crop monitoring. This study presents a novel method for determining the locations of moving ground-based targets using UAVs and human operators. In previous research, Sharma et al. [1] developed a vision-based target tracking algorithm that used a Kalman filter to estimate a target's position and velocity and an information filter to control the sensor. Targets were geo-localized from their pixel locations in an image: measurements of the UAV's position, altitude, and camera pose angles, together with the information embedded in the image, provide the input an estimator needs to geo-locate ground targets. Because humans have highly sophisticated skills for sensing their environment, we are interested in integrating human operators into the sensor network. The main contribution of this thesis is a cyber-physical system that reduces the localization errors of targets observed cooperatively by UAVs and humans. In developing the system, we created (1) an Extended Kalman Filter (EKF) based algorithm to estimate the positions of multiple targets, (2) a human sensor model based on neural networks, and (3) a weighted filter to fuse local target estimates from multiple UAVs. Human sensor inputs are used to improve the geo-localization accuracy of the target position estimates. The technique requires each operator to carry an Android device, which provides easy access to Google Maps and the Global Positioning System (GPS) so that the operator can mark a target's position on the map. Each sensor, whether a UAV or a human operator, exchanges data over a Wi-Fi sensor network. A central station collects the observations of the independent sensors and fuses them to generate estimates more accurate than any single UAV or human operator could provide. The capability of the system was demonstrated through simulations and on Android hardware.
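
The abstract gives no implementation details, but the EKF-based estimator for target positions can be illustrated with a minimal sketch. In the Python fragment below, the constant-velocity target model, the sample period, and all noise matrices are illustrative assumptions, not values from the thesis; the measurement z is the geo-localized target position obtained from the UAV's pose and the target's pixel location. With this linear model the EKF reduces to the standard Kalman filter, with F and H serving as the Jacobians. For multiple targets, one such filter would be run per target, with measurements associated to tracks beforehand.

import numpy as np

dt = 0.1                          # sample period in seconds (assumed)
F = np.array([[1, 0, dt, 0],      # constant-velocity state transition;
              [0, 1, 0, dt],      # state x = [px, py, vx, vy]
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],       # only the position is observed
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)              # process noise covariance (assumed)
R = 4.0 * np.eye(2)               # geo-location noise covariance (assumed)

def ekf_step(x, P, z):
    """One predict/update cycle for a single target."""
    x = F @ x                          # predict state
    P = F @ P @ F.T + Q                # predict covariance
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y                      # corrected state
    P = (np.eye(4) - K @ H) @ P        # corrected covariance
    return x, P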
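
The weighted filter that fuses local estimates from multiple UAVs and the human operator could likewise take several forms; one common choice, sketched below, is inverse-covariance weighting of independent local estimates. The function name and this particular weighting are assumptions for illustration, not necessarily the thesis's exact scheme.

def fuse_estimates(estimates):
    """Fuse independent local (x_i, P_i) estimates of one target's position.

    Each x_i is a local position estimate (from a UAV's EKF, or from a
    human operator's map tap) and P_i its covariance; sensors with
    smaller covariance receive proportionally larger weight.
    """
    infos = [np.linalg.inv(P) for _, P in estimates]
    P_fused = np.linalg.inv(sum(infos))          # fused covariance
    x_fused = P_fused @ sum(I @ x for (x, _), I in zip(estimates, infos))
    return x_fused, P_fused

In such a scheme, a human operator's map tap would enter the fusion as a position estimate whose covariance is supplied by the neural-network human sensor model mentioned in the abstract.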

Description

This item is available only to currently enrolled UTSA students, faculty, or staff.

Keywords

Artificial Neural Network, Data Fusion, Human Sensor Models, Sensor Network, Unmanned Aerial Vehicles (UAVs)

Department

Electrical and Computer Engineering