29. The Policy and Law of Lethal Autonomy with Michael Meier and Shawn Steene

The Convergence - An Army Mad Scientist Podcast - A podcast by The Army Mad Scientist Initiative - Thursdays


Michael Meier is the Special Assistant to the Judge Advocate General (JAG) for Law of War Matters at Headquarters, Department of the Army. As such, Mr. Meier serves as the law of war subject matter expert for the U.S. Army JAG Corps, advising on policy issues involving the law of war. Mr. Meier also reviews all proposed new U.S. Army weapons and weapons systems to ensure they are consistent with U.S. international law obligations. Additionally, he is an Adjunct Professor at Georgetown University Law Center, teaching courses on the Law of Armed Conflict. Mr. Meier is a retired JAG officer, having served in the U.S. Army for 23 years.

Shawn Steene is the Senior Force Developer for Emerging Technologies, Office of the Under Secretary of Defense for Policy, where his portfolio includes emerging technologies and S&T, including Autonomous Weapon Systems policy and Directed Energy Weapons policy. Prior to joining OSD Strategy & Force Development, Mr. Steene worked in OSD Space Policy, where his portfolio included space support (launch, satellite control, orbital debris mitigation, and rendezvous and proximity operations), as well as strategic stability and all space-related issuances (Directives, Instructions, DTMs, etc.). He is a proclaimed Mad Scientist, having presented and served as a discussion panelist in our Frameworks (Ethics & Policy) for Autonomy on the Future Battlefield, the final webinar in our Mad Scientist Robotics and Autonomy series of virtual events.

In today's podcast, Messrs. Meier and Steene discuss the ground truth on regulations and directives regarding lethal autonomy and what the future of autonomy might mean in a complex threat environment. The following bullet points highlight key insights from our interview with them:

- Current law and policy do not specifically prohibit or restrict the use of autonomous weapons. However, these systems will need to operate within the law of armed conflict and Department of Defense (DoD) directives. These restrictions mean that autonomous systems will need to be capable of distinguishing between appropriate targets and non-combatants, maintaining proportionality in attacks, and taking feasible precautions to reduce risk to civilians and protected objects.

- Ultimately, operators and human supervisors will be held responsible under the law of armed conflict and U.S. policy. Thus, appropriate safeguards will need to be adopted to ensure adequate human oversight of autonomous systems. DoD directives establish guidelines for this supervision and facilitate case-by-case reviews of systems with autonomous capabilities.

- Artificial intelligence (AI) and autonomy are not interchangeable. While some autonomous systems use AI, this is not always the case.

- The United States is concerned with and making efforts to addr
