Tech leaders from Google, Microsoft, and Facebook suggest ethics guidelines for using AI in the military to avoid ‘unintended harm to humans’
- The Defense Innovation Board made 12 recommendations for AI in the military
- The Board warns against unintended adverse consequences of using AI
- Members include tech execs from Google, Microsoft, Facebook, and LinkedIn
This week, the Defense Innovation Board issued a series of recommendations to the Department of Defense on how artificial intelligence should be implemented in future military conflict.
The Defense Innovation Board was first created in 2016 to establish a series of best practices on potential collaborations between the US military and Silicon Valley.
There are sixteen current board members from a broad range of disciplines, including former Google CEO Eric Schmidt, Facebook executive Marne Levine, Microsoft’s Chief Digital Officer Kurt Delbene, astrophysicist Neil deGrasse Tyson, Steve Jobs biographer Walter Isaacson, and LinkedIn co-founder Reid Hoffman.
‘Now is the time, at this early stage of the resurgence of interest in AI, to hold serious discussions about norms of AI development and use in a military context—long before there has been an incident,’ the report says.
The report says that using AI for military actions or decision-making comes with ‘the duty to take feasible precautions to reduce the risk of harm to the civilian population and other protected persons and objects.’
The report outlines five ethical principles that should be at the heart of every major decision related to using AI in the military.
AI in the military should always be: Responsible, Equitable, Traceable, Reliable, and Governable.
Building on these principles, the report makes twelve concrete recommendations for how to move forward with integrating AI into contemporary warfare.
The Board recommends creating a risk management strategy that would formalize a taxonomy of negative outcomes.
The purpose of this taxonomy would be to ‘encourage and incentivize the rapid adoption of mature technologies in low-risk applications, and emphasize and prioritize greater precaution and scrutiny in applications that are less mature and/or could lead to more significant adverse consequences.’
The report recommends the development of a risk management methodology to account for all the potential negative outcomes that could come from deferring a significant amount of work or decision-making to a computer.
The Board also emphasizes the importance of developing specific benchmarks to evaluate the reliability of AI as it compares to human performance in military settings.
In the same spirit, the Board encourages the military to create a rating for how reproducible an AI-driven outcome or action is, so as to minimize the prevalence of unintended consequences.
A number of other recommendations are mainly administrative.
One calls for the creation of an official Department of Defense policy communications channel to make announcements and field questions from different stakeholders.
Another recommendation calls for the creation of an internal ‘AI Steering Committee’ to oversee any current or future AI programs and to organize AI training for the department’s workforce.
Former Google CEO Eric Schmidt (pictured above) is one of the members of the Defense Innovation Board, which was formed in 2016 to help encourage cooperation between Silicon Valley and the US military
The report comes as a potential check on an earlier report from the Pentagon that recommended making AI a major focus of the future, something that would help keep America ahead as Russia and China compete for influence around the world.
This summer the Army announced it was developing a new artillery munition, called Cannon-Delivered Area Effects Munition (C-DAEM), that would use AI for guidance.
In 2018, Google declined to renew the contract for Project Maven, an AI initiative the company ran with the Department of Defense that helped train drones to identify potential military targets.
More than 3,000 Google employees signed a letter protesting the project.