Sailors on Aircraft Carriers Could Guide Drones With a Wave of Their Hands
On the busy, noisy flight deck of an aircraft carrier, ground crews talk to pilots with a shared language of hand gestures. Now, as the Navy prepares for drones to share the flight deck, it may teach the machines to understand those same gestures.
The U.S. Navy is holding a competition to select the service’s first operational carrier-based drone. The MQ-25 Stingray will be primarily an aerial tanker, designed to refuel a carrier’s four squadrons of F-35C and F/A-18E/F Super Hornet strike fighters. General Atomics, Lockheed Martin, and Boeing have all prepped designs for the competition, with the winning aircraft set to start hitting the fleet in the 2020s.
Earlier this decade, the Navy ran a series of tests with the X-47B Unmanned Combat Air System drone. In those tests, UCAS was controlled on the ground with a wireless handheld device. The device “was used to control the X-47B’s engine thrust to roll the aircraft forward, brake and stop, and use its nose wheel steering to execute the tight, precision turns required to maneuver the aircraft into a catapult or out of the landing area following a landing.”
The problem with using a handheld device on a carrier flight deck is that the deck is a constantly changing environment, with aircraft and vehicles moving around and posing potential hazards. If deck crew members controlling a drone are looking down at a device, they’re not paying attention to everything going on around them. It also requires them to learn new procedures specific to drones.
Instead, why not make the drone learn the old ways and adapt to the carrier’s environment? According to Aviation Week & Space Technology, that’s exactly what one company is doing. A new gesture-recognition system developed by Systems Technology Inc. (STI) is teaching drones to understand the same kind of gestures used by deck crews for nearly a hundred years.
STI’s system involves putting sensors in the signal wands used by aircraft directors, producing motion data a drone can interpret as commands. The system, known as Deck Intelligent Aircraft Body Language Observer (DIABLO), uses machine learning to interpret those commands properly. In simulations, deck handlers have directed an aircraft from the carrier’s aircraft elevators to a position on the ship’s catapult. Gesture-recognition accuracy is reportedly in the “90-100% range.”
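STI hasn’t published how DIABLO’s recognition works, but the basic idea of matching wand motions to known deck gestures can be sketched with a toy example: compare an incoming motion trace from a wand sensor against stored templates for each gesture and pick the closest match. Everything below (the gesture names, the traces, the single-axis data) is hypothetical and greatly simplified; a real system would use multi-axis sensor streams and a trained model rather than nearest-template matching.

```python
import math

# Hypothetical single-axis accelerometer templates for two standard
# deck gestures. Real wand sensors would produce multi-axis data.
TEMPLATES = {
    "come_ahead": [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0],  # rhythmic beckoning motion
    "stop":       [0.0, 2.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0],    # wands raised and held
}

def distance(a, b):
    """Euclidean distance between two equal-length motion traces."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(trace):
    """Return the label of the template nearest the observed trace."""
    return min(TEMPLATES, key=lambda label: distance(trace, TEMPLATES[label]))

# A noisy observation that should land closest to the beckoning pattern.
observed = [0.1, 0.9, -0.1, -1.1, 0.0, 1.0, 0.1, -0.9]
print(classify(observed))  # prints "come_ahead"
```

In practice, the “90-100%” accuracy the article cites suggests a statistical classifier trained on many recorded gestures, but the template-matching sketch shows why putting sensors in the wands helps: the drone receives clean motion data instead of having to recognize gestures visually in a cluttered deck scene.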