Congress will soon consider whether to adopt a vital policy to ensure that artificial intelligence (AI) never plays a role in the decision to use U.S. nuclear weapons. But hold-outs in the Senate are standing in the way of the United States adopting this common-sense principle.
Tell your Senators to support the amendment introduced by Sen. Ed Markey to keep a human “in the loop” on nuclear weapon decisions. Tell your US Representative to support the principle in negotiations with the Senate.
The Senate amendment to the fiscal year 2025 National Defense Authorization Act would set in law the policy that a human must be kept in the loop “for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapons employment.”
The House of Representatives has approved the same language in its defense authorization bill. If the Senate matches the House language, the human “in the loop” policy will likely become law. And if the Senate doesn’t adopt the same language, House negotiators must stand firm and push to include the rule in the final negotiated version of the law.
It is urgent that you act now! Click below to email your Senators and US Representative in support of this critical law; it only takes one minute!
On September 26, 1983, a lieutenant colonel in the Soviet Air Defense Forces was on duty when the early-warning satellite system he was monitoring detected what appeared to be five approaching U.S. nuclear-armed intercontinental ballistic missiles. He was supposed to report the warning to the highest level of Soviet nuclear command, which almost certainly would have ordered a launch, starting an all-out nuclear war.
But he reasoned that the US was highly unlikely to launch such a small number of weapons, so he refrained from passing the warning up the chain of command. It turned out that the false alarm was caused by sunlight reflecting off high-altitude clouds, so he was correct. This is a powerful example of why human participation in nuclear weapon decision-making is so critical. Had AI been making the decision, it almost certainly would have led to global nuclear holocaust.
The enormous ethical and legal consequences of nuclear weapons employment must always rest on the shoulders of human beings. Earlier this summer, UN Secretary-General António Guterres said, “Until these weapons are eliminated, all countries must agree that any decision on nuclear use is made by humans, not machines or algorithms.”
P.S. We can only meet daunting challenges like the one described above if we have the resources to sustain and intensify CFPA's organizing. After you send your emails, you will have the opportunity to contribute to CFPA’s nuclear disarmament work. If you prefer, you can mail a check to the address under my name. I urge you to be as generous as possible.
Sincerely,
The Rev. Robert Moore
Executive Director
Coalition for Peace Action &
Peace Action Education Fund
7 Vandeventer Ave.
Princeton, NJ 08542