This article was written by Alex Hollings.

The United States Air Force says it intends to pit an artificial intelligence-enabled drone against a manned fighter jet in a dogfight as soon as next year.

Although drones have become an essential part of America’s air power apparatus, these platforms have long had their combat capabilities hampered by both the limitations of existing technology and our own concerns about allowing a computer to make the decision to fire ordnance that will likely result in a loss of life. In theory, a drone equipped with artificial intelligence could significantly alleviate both of those limiting factors without allowing life-or-death decisions to be made by a machine.

As any gamer will tell you, lag can get you killed. In this context, lag refers to the delay created by the time it takes for the machine to relay the situation to a human operator, the time it takes for the operator to make a decision and transmit a command, and the time it takes for the drone to receive that command and translate it into action. Even with the most advanced secure data transmission systems on the planet, lag is an ever-present threat to the survivability of a drone in a fast-paced engagement.


Because of that lag limitation, drones are primarily used for surveillance, reconnaissance, and airstrikes, but have never been used to enforce no-fly zones or to posture in the face of enemy fighters. In 2017, a U.S. Air Force MQ-9 Reaper successfully shot down another, smaller drone using an air-to-air missile. That success was the first of its kind, but even those responsible for it were quick to point out that it was in no way indicative of the Reaper, or any other drone platform, now having real dogfighting capabilities.

“We develop those tactics, techniques, and procedures to make us survivable in those types of environments and, if we do this correctly, we can survive against some serious threats against normal air players out there,” said Col. Julian Cheater, commander of the 432nd Wing at Creech Air Force Base, Nevada, at the time.

Artificial intelligence, however, could very feasibly change this. By using some level of artificial intelligence in a combat drone, operators could give the platform orders, rather than specific step-by-step instructions. In effect, the drone operator wouldn’t need to physically control the drone to dogfight, but could rather command the drone to engage an air asset and allow it to make rapid decisions locally to respond to the evolving threat and properly engage. Put simply, the operator could tell the drone to dogfight, but then allow the drone to somewhat autonomously decide how best to proceed.
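The shift from step-by-step control to goal-level tasking can be sketched in code. Everything below — the class, method names, and threat logic — is hypothetical, invented only to illustrate the change in abstraction, and corresponds to no real drone interface:

```python
# Hypothetical sketch of goal-level tasking versus direct control.
# No names here correspond to any real drone system or API.

class Drone:
    """Toy drone that can either follow direct stick inputs or
    pursue a high-level order using local decision logic."""

    def direct_control(self, roll, pitch, throttle):
        # Step-by-step piloting: every individual input crosses the
        # datalink, so every input pays the full round-trip lag.
        return f"applying roll={roll}, pitch={pitch}, throttle={throttle}"

    def engage(self, target):
        # Goal-level tasking: a single order crosses the link, then
        # the drone chooses maneuvers onboard with no per-maneuver lag.
        threat = self.assess(target)
        maneuver = "break turn" if threat == "high" else "pursue"
        return f"engaging {target}: chose '{maneuver}' onboard"

    def assess(self, target):
        # Placeholder threat assessment; a real system would fuse
        # sensor data here.
        return "high" if "fighter" in target else "low"

drone = Drone()
print(drone.direct_control(roll=5, pitch=2, throttle=0.8))
print(drone.engage("enemy fighter"))  # operator decided what, drone decides how
```

The design point is the interface, not the toy logic: in the first call the operator supplies every control input, while in the second the operator supplies only the objective and the decision loop runs locally.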

The XQ-58A Valkyrie demonstrator, a long-range, high-subsonic unmanned air vehicle, completed its inaugural flight March 5, 2019, at Yuma Proving Grounds, Arizona. (Air Force photo by Senior Airman Joshua Hoskins)

The challenges here are significant, but as experts have pointed out, the implications of such technology would be far-reaching. U.S. military pilots receive more training and flight time than any other nation on the planet, but even so, the most qualified aviators can only call on the breadth of their own experiences in a fight.

Drones enabled with some degree of artificial intelligence aren’t limited to their own experiences, and could rather pull from the collective experiences of millions of flight hours conducted by multiple drone platforms. To give you a (perhaps inappropriately threatening) analogy, you could think of these drones as the Borg from Star Trek. Each drone represents the collected sum of all experiences had by others within its network. This technology could be leveraged not just in drones, but also in manned aircraft to provide a highly capable pilot support or auto-pilot system.

“Our human pilots, the really good ones, have a couple of thousand hours of experience,” explains Steve Rogers, the Team Leader for the Air Force Research Laboratory’s (AFRL) Autonomy Capability Team 3 (ACT3).

“What happens if I can augment their ability with a system that can have literally millions of hours of training time? … How can I make myself a tactical autopilot so in an air-to-air fight, this system could help make decisions on a timeline that humans can’t even begin to think about?”

As Rogers points out, such a system could assess a dangerous situation and respond faster than the reaction time of even highly trained pilots, deploying countermeasures or even redirecting the aircraft out of harm’s way. Of course, even the most capable autopilot would still need the thinking, reasoning, and directing of human beings, either in the cockpit or far away. So, even with this technology in mind, it appears that the days of manned fighters are still far from over. Instead, AI-enabled drones and autopilot systems within jets could both serve as direct support for manned aircraft in the area.

By incorporating multiple developing drone technologies into such an initiative, such as the drone wingman program called Skyborg, drone swarm initiatives aimed at using a large volume of cooperatively operating drones, and low-cost, high-capability drones like the XQ-58A Valkyrie, such a system could fundamentally change the way America engages in warfare.

Ultimately, it may not be this specific drone program that ushers in an era of semi-autonomous dogfighting, but it’s not alone. From the aforementioned Skyborg program to DARPA’s artificial intelligence-driven Air Combat Evolution program, the race is on to expand the role of drones in air combat until they’re seen as nearly comparable to manned platforms.

Of course, that likely won’t happen by next year. The first training engagement between a drone and a human pilot will likely end in the pilot’s favor… but artificial intelligence can learn from its mistakes, and those failures may not be all that long-lived.

“[Steve Rogers] is probably going to have a hard time getting to that flight next year … when the machine beats the human,” Lt. Gen. Jack Shanahan, head of the Pentagon’s Joint Artificial Intelligence Center, said during a June 4 Mitchell Institute for Aerospace Studies event. “If he does it, great.”