The United States Is the Only Country with Ethical Standards for AI Weapons. Should We Be Concerned?
On August 29th, three days after a suicide bomber killed 13 American service members and some 160 civilians at Kabul airport, US military intelligence was tracking a car moving toward the airport carrying “packages” that looked suspiciously like bombs. The plan was to use one of the military’s Reaper drones to follow the car by video and destroy it with a Hellfire missile once no innocent civilians were nearby. The car eventually came to a halt in a quiet area.
General Kenneth F. McKenzie Jr., the head of US Central Command in Tampa, Florida, had given his approval to the tactical commander, who was most likely stationed at Creech Air Force Base in Nevada. Video feeds are often delayed by several seconds because they must be relayed through military networks around the world. According to the US military, that gap may have been just long enough for a few civilians to approach the target vehicle. The explosion killed up to ten Afghan civilians, including seven children, and drew international condemnation. Questions have since been raised about whether the car posed a threat in the first place.
Military planners are searching for a better way to strike from afar and avert future threats from ISIS, al Qaeda, and other groups that could emerge in Taliban-controlled Afghanistan—or any other remote area, for that matter. That quest is headed in an unsettling direction: letting machines decide when, and possibly whom, to kill.
Reapers and other US drones will be equipped with powerful artificial intelligence technology in the coming years. That creates an unsettling scenario: military drones tucked away in small, unmanned bases in or near Afghanistan, ready to take off, scan the terrain, analyze the images they collect in real time, identify and target terrorist activity, ensure the target is clear of civilians, fire a missile, confirm the kill, and return to base—all with little or no human intervention.
The primary objective for equipping Reaper drones with artificial intelligence (AI) is not humanitarian.