For Halloween: some people find it scary to give artificial intelligence the discretion to kill.
"An autonomous missile under development by the Pentagon uses software to choose between targets. An artificially intelligent drone from the British military identifies firing points on its own. Russia showcases tanks that don’t need soldiers inside for combat. A.I. technology has for years led military leaders to ponder a future of warfare that needs little human involvement. But as capabilities have advanced, the idea of autonomous weapons reaching the battlefield is becoming less hypothetical. The possibility of software and algorithms making life-or-death decisions has added new urgency to efforts by a group called the Campaign To Stop Killer Robots that has pulled together arms control advocates, humans rights groups and technologists to urge the United Nations to craft a global treaty that bans weapons without people at the controls. Like cyberspace, where there aren’t clear rules of engagement for online attacks, no red lines have been defined over the use of automated weaponry. Without a nonproliferation agreement, some diplomats fear the world will plunge into an algorithm-driven arms race. In a speech at the start of the United Nations General Assembly in New York on Sept. 25, Secretary General António Guterres listed the technology as a global risk alongside climate change and growing income inequality. 'Let’s call it as it is: The prospect of machines with the discretion and power to take human life is morally repugnant,' Mr. Guterres said. ... "In 2016, the Pentagon highlighted its capabilities during a test in the Mojave Desert. More than 100 drones were dropped from a fighter jet in a disorganized heap, before quickly coming together to race toward and encircle a target. ... The drones were programmed to communicate with each other independently to collectively organize and reach the target. 'They are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature,' William Roper, director of the Pentagon’s strategic capabilities office, said at the time. To those fearful of the advancement of autonomous weapons, the implications were clear. 'You’re delegating the decision to kill to a machine,' said Thomas Hajnoczi, the head of disarmament department for the Austrian government. 'A machine doesn’t have any measure of moral judgment or mercy.'" www.nytimes.com/2018/10/19/technology/artificial-intelligence-weapons.html