Cooperative Control of a Deployable Aerial Sensor Platform, 10-R8462
Richard D. Garcia
Inclusive Dates: 04/01/14 – 10/01/15
Background — While unmanned aircraft have proven their viability in both the commercial and military markets, unmanned aircraft technology continues to suffer from limited payload capacity, limited perception, and dependency on the global positioning system (GPS) as a primary localization method. These limitations are significantly compounded on small aircraft (<10 kg), which cannot sacrifice payload or power to carry large sensor suites or high-performance computing equipment. These limitations quickly become operational failures in obstacle-rich environments or in areas where GPS is unavailable or corrupted.
The primary objective of this research effort is to develop algorithms, equations, and techniques that enable an autonomous unmanned ground vehicle (UGV) to safely and effectively deploy, recover, and navigate an aerial platform. The proposed work will enable the UGV to use its own high-fidelity sensors to localize and control the aerial platform, off-loading obstacle detection and avoidance to the ground vehicle. By offloading many of the sensing and computational requirements from the aerial platform to the ground vehicle, the aerial platform can be lighter and simpler in construction, with an emphasis on mission payload. This technique will enable safe and effective flight during extremely low-altitude operation (including under foliage), as well as flight in GPS-denied environments.
Approach — To accomplish the above objectives, we are developing algorithms that allow the UGV to identify and localize the aerial platform using sensors mounted on the UGV. Extensive experience leads us to believe that a combination of ranging data (provided by a multi-planar laser range finder or radar) and imagery, possibly assisted by lightweight fiducials, will provide accurate localization at close range (~20 m); the specific sensor configuration, however, will be designed using metrics collected from several manual flights representative of the aircraft's operational area. Once the appropriate configuration of sensors has been selected, manual flight data can be used to tune the basic localization calculations, which will be verified for accuracy against a VICON motion capture system.
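The core geometric step in this approach is converting a fused range-and-bearing measurement of the aerial platform, taken in the UGV's sensor frame, into a relative Cartesian position. The sketch below illustrates that conversion; the function name and the simple spherical-to-Cartesian model are illustrative assumptions, not the project's actual implementation.

```python
import math

def localize_from_range_bearing(range_m, azimuth_rad, elevation_rad):
    """Convert a range/bearing observation of the aerial platform into a
    Cartesian position relative to the UGV sensor frame.

    range_m       -- measured distance to the platform (e.g., from laser or radar)
    azimuth_rad   -- horizontal bearing, measured from the sensor's forward axis
    elevation_rad -- vertical angle above the sensor's horizontal plane
    Returns (x, y, z): forward, left, and up offsets in meters.
    """
    horizontal = range_m * math.cos(elevation_rad)  # projection onto ground plane
    x = horizontal * math.cos(azimuth_rad)          # forward component
    y = horizontal * math.sin(azimuth_rad)          # lateral component
    z = range_m * math.sin(elevation_rad)           # height above the sensor
    return (x, y, z)
```

In practice, an estimate like this would be fused with image-based detections (and any fiducial observations) in a filter before being compared against VICON ground truth.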
Accomplishments — During the course of the project, the team successfully implemented an active tether that allows extremely long flight durations and provides a 100 Mb/s connection between vehicles. Flight duration and positional control of the unmanned aircraft system (UAS) were successfully tested in the VICON motion capture system, allowing waypoint-based control of the UAS using the Robot Operating System (ROS). Outdoor autonomous flights were performed using the tether to provide sustained-duration flight over a ground vehicle. During these flights, the UAS relayed overhead imagery to the ground vehicle, which calculated the relative position of the UAS using opportunistic feature detection and tracking.
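The relative-position calculation from overhead imagery can be sketched with a pinhole-camera model: a feature tracked on the ground vehicle appears offset from the image center, and that pixel offset scales to a metric ground-plane offset by the ratio of altitude to focal length. This is a minimal illustration assuming a nadir-pointing camera at a known altitude; the function name and parameters are hypothetical, and the project's actual feature detection and tracking pipeline is not reproduced here.

```python
def pixel_offset_to_ground_offset(du_px, dv_px, altitude_m, focal_px):
    """Project a tracked feature's pixel offset into a ground-plane offset.

    du_px, dv_px -- feature offset from the image center, in pixels
    altitude_m   -- UAS height above the ground vehicle, in meters
    focal_px     -- camera focal length expressed in pixels
    Returns (dx, dy): metric offset of the feature relative to the point
    directly beneath the camera, assuming a nadir-pointing camera.
    """
    scale = altitude_m / focal_px  # meters of ground per pixel at this altitude
    return (du_px * scale, dv_px * scale)
```

For example, at a 10 m altitude with a 1000 px focal length, a 100 px offset corresponds to a 1 m displacement; negating this offset gives the UAS position relative to the tracked feature.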