Advanced Situational Awareness Model and Visualization Environment, 10-R8107

Principal Investigators
Michael S. Moore
Cyril F. Meyer

Inclusive Dates:  10/01/09 – 09/30/10

Background - Current situational awareness (SA) capabilities deployed by the U.S. Army do not adequately support the safety and lethality of the warfighter. Existing systems are not integrated and present warfighters with large amounts of data on independent data streams. The process of correlating those data and applying a priori knowledge and experience to determine the local operating picture is performed mostly manually, even though it is often crucial to the successful completion of the primary mission.

Approach - This project applied automatic data fusion and reasoning algorithms to automate SA tasks that are currently performed manually by the warfighter, and applied augmented reality techniques to create a visualization environment that combines information from many sources in a simple-to-understand interface overlaid on video.

Accomplishments - The goal of this research was to investigate technologies that provide the warfighter with more timely, trustworthy, and useful information about the state of the world in which the soldier is deployed. The result is a demonstration system that shows how data fusion, automated reasoning, and augmented reality can accomplish this goal. This system includes the following innovations:

  • Situational Awareness Reasoning Engine (SARE): A software component that executes a reasoning algorithm to maintain a local operational picture from first-hand observations (sensors on the vehicle) and second-hand observations (reports relayed from other vehicles). A sketch of this style of observation fusion appears after this list.

  • Augmented Reality Situational Awareness Visualization System: Software components that perform coordinate transformation and graphics algorithms to fuse the SA data with video streams and present the fused data to the user on a graphical user interface. The software overlays graphical cues at the appropriate locations in the video streams to highlight objects for the user. The highlights track the identified positions in the video as the vehicle moves or as the object locations are updated; the world-to-pixel projection behind this behavior is sketched after this list.

  • An "on the move" simulation/emulation environment: The project developed a system that is installed on multiple moving vehicles, which interacts with simulated sensors using a wireless network. The simulation/emulation environment combines simulated inputs with moving vehicles with actual sensors to create a valuable environment for evaluating sensors, operational techniques, and new situational awareness algorithms.

  • A research-level mapping application: The project developed a mapping application based on Google Maps® that demonstrated that modern technologies can significantly improve the performance of the mapping functionality commonly used by the U.S. Army.
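
The report does not detail SARE's reasoning algorithm, so the following is only a minimal sketch of how first-hand and second-hand observations might be merged into a single local operational picture. It assumes each observation carries an object identifier, a position, a timestamp, and a confidence, and that second-hand reports are discounted before comparison; all names and the discount factor are hypothetical.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Observation:
        """Hypothetical observation record; the actual SARE data model
        is not described in the report."""
        object_id: str      # identifier of the detected object/track
        position: tuple     # (latitude, longitude) in degrees
        timestamp: float    # seconds since epoch
        confidence: float   # 0.0 .. 1.0
        first_hand: bool    # True if sensed by this vehicle's own sensors

    RELAY_DISCOUNT = 0.8    # assumed penalty applied to second-hand reports

    def fuse_observations(observations):
        """Collapse a stream of observations into one entry per object,
        preferring fresher reports and breaking ties on discounted
        confidence."""
        picture = {}
        for obs in observations:
            score = obs.confidence * (1.0 if obs.first_hand else RELAY_DISCOUNT)
            best = picture.get(obs.object_id)
            if best is None or (obs.timestamp, score) > (best[0], best[1]):
                picture[obs.object_id] = (obs.timestamp, score, obs)
        return {oid: entry[2] for oid, entry in picture.items()}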
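
Placing an overlay reduces to projecting each object's world position into pixel coordinates from the vehicle's pose and the camera's intrinsic parameters. The report does not give the actual transform chain; the sketch below assumes a flat local East-North-Up frame, a forward-looking camera at the vehicle origin, and a pinhole camera model, with all parameter names illustrative.

    import math

    def project_to_pixel(obj_enu, vehicle_enu, heading_rad, fx, fy, cx, cy):
        """Project a world point (local East-North-Up, metres) into pixel
        coordinates for a forward-looking pinhole camera on the vehicle.
        Returns None when the point is behind the image plane."""
        # Offset from the vehicle to the object in the world frame.
        de = obj_enu[0] - vehicle_enu[0]
        dn = obj_enu[1] - vehicle_enu[1]
        du = obj_enu[2] - vehicle_enu[2]
        # Rotate by heading (0 = north, clockwise positive) into a camera
        # frame with x right, y down, z forward along the vehicle axis.
        x = de * math.cos(heading_rad) - dn * math.sin(heading_rad)
        z = de * math.sin(heading_rad) + dn * math.cos(heading_rad)
        y = -du
        if z <= 0.0:
            return None  # object is behind the camera
        # Pinhole projection: scale by depth, offset by the principal point.
        return (fx * x / z + cx, fy * y / z + cy)

Re-running this projection every frame as the vehicle pose and object tracks update is what makes a highlight follow its object in the video.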
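
One way to mix simulated and live inputs, as the emulation environment does, is to hide the origin of each data stream behind a common sensor interface so the SA software consumes both identically. The structure below is hypothetical; the report does not specify the project's actual interfaces.

    from abc import ABC, abstractmethod
    import random
    import time

    class SensorSource(ABC):
        """Common interface for live and simulated sensor feeds (a
        hypothetical structure)."""
        @abstractmethod
        def read(self):
            """Return the next (timestamp, detection) pair."""

    class SimulatedSensor(SensorSource):
        """Stands in for detections that, in the real environment,
        arrive from the simulation over the wireless network."""
        def read(self):
            # Arbitrary illustrative position; a real feed would carry
            # scripted scenario data.
            detection = {"object_id": "sim-1",
                         "position": (random.random(), random.random())}
            return (time.time(), detection)

Because every source satisfies SensorSource, the reasoning and visualization components described above need no changes when a simulated feed is swapped for a live one.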
