Booz Allen Convenes DoD, Simulation, Gaming Panelists to Talk VR/AR
Virtual reality (VR) and augmented reality (AR) use data science, machine intelligence, and creativity to enable training and planning in a safe environment—without consequences. How is this playing out in battlefield simulations, education, and beyond, and what can we expect in terms of technology—and ethics—in the future?
On October 4 at The Atlantic Festival, Booz Allen Vice President Munjeet Singh moderated a discussion with Anthony Cerri, retired director of TRADOC Games and Simulations; Amber Osborne, chief marketing officer of Doghead Simulations; and Lucien Parsons, founding director of the Mixed/Augmented/Virtual Reality Innovation Center (MAVRIC) at the University of Maryland.
Summaries of the questions and conversation follow.
What’s the state of VR/AR right now, and how does artificial intelligence (AI) factor in?
The VR of the 1980s, which yielded flight simulators for the Army and Navy, involved $200-million projects. Today, thanks to a few standardized hardware and software platforms and a generation of programmers experienced in video games, simulation, and training, “you can create something meaningful for just a couple of million dollars,” said Parsons.
These projects have been wide-ranging. Osborne and Singh pointed to customizable virtual meeting spaces where participants can interact with 3D models, as well as training software that incorporates AI, such as headsets that track eye movements and adjust simulations accordingly, to make training safer, more efficient, and more cost-effective.
“We’re 25 years from Black Hawk Down,” said Cerri. “Imagine if those pilots had immersive AI displays, with sensor data fed back into algorithms guiding them how to maneuver.”
Where do you see this technology going?
Parsons compared VR to smartphones in 2008: People saw them as a way to get email on the go but didn't yet envision the many other uses that would follow. As businesses and agencies become ready to bring VR into their operations, he envisions augmented applications, such as sonogram overlays that let doctors view the imagery while remaining engaged with the patient.
Osborne described haptic gloves, such as those designed by HaptX, that enable trainees to feel temperature and manipulate virtual objects. Cerri envisioned division commanders wearing similarly equipped suits that feed them sensations from the battlefield.
“The biggest barrier right now is the expense of the products,” said Osborne.
Overall, panelists said they anticipated improved modeling speed and fidelity, as well as broader distribution. Security is a growing focus, too, since many game development tools weren't designed for the secure networks of academia or government agencies.
What ethical issues do you see arising—and how should we deal with them?
For Parsons, one priority is helping people distinguish what’s real and what’s not as the technology evolves.
Cerri cited the division of labor between humans and machines. “The current DoD standard is that there’s never a lethal engagement of a robot without a human in the loop. But a growing number of people are challenging this because a robot can limit collateral damage and act more quickly.”
Osborne discussed the importance of education. “We need to get more people into headsets seeing what this technology can do.”
Watch a recording of this and other sessions here.