The U.S. Department of Defense released a set of guidelines for how artificial intelligence (AI) should be deployed on the battlefield. These guidelines are intended to provide for the ethical use of new technologies that are increasingly defining modern warfare.
Among the principles laid out is a call for personnel to “exercise appropriate levels of judgment and care.” The guidelines also state that AI must have “explicit, well-defined uses.”
Before the new guidelines, the only requirement was that humans be involved in the decision-making process, sometimes referred to as keeping a “human in the loop”; now, members of the military will also have “the ability to disengage or deactivate deployed systems that demonstrate unintended behavior.”
In other words, AI will need to have an off-switch.
However, not all are convinced that the new changes are sufficient or sincere.
Among the concerns is that the definition of “appropriate” is subject to interpretation and could be adapted without a consistent standard.
Another possibility is that the Pentagon is seeking to improve its image in Silicon Valley. For example, Google chose not to renew a contract with the Department of Defense in 2018 following a backlash among employees over a project that used machine learning to distinguish between people and objects in drone footage.
Secretary of Defense Mark Esper has been adamant that AI ought to be and will be a central component in future military developments for the United States.
“The United States, together with our allies and partners, must accelerate the adoption of AI and lead in its national security applications to maintain our strategic position, prevail on future battlefields, and safeguard the rules-based international order,” Esper said in a statement. He acknowledged that “AI technology will change much about the battlefield of the future.”
The weaponization of AI has fueled growing concerns about the pernicious consequences that may arise from its adoption. Figures such as Elon Musk of SpaceX and Demis Hassabis of Google DeepMind have been particularly vocal in their opposition.
Despite this, the integration of AI into the military sphere appears to be continuing unimpeded. In China, numerous start-ups have set up operations in a bid to cash in on technologies that will likely find applications in every major industry and sector of the economy.
In addition, the Marine Forces Special Operations Command (MARSOC) is increasingly exploring the possibility of using AI in its selection process. In particular, it is hoped that machine learning may help identify potential recruits with traits seen as beneficial to the unit.