From the company that brought you Taser stun guns comes an AI weapon so dangerous it has been rejected by the company’s own artificial intelligence ethics board. But that didn’t stop the CEO from announcing the gun in response to the May 24 elementary school shooting in Uvalde, Texas, like a misguided white knight hoping to keep the nation safe.
According to the BBC, Axon (formerly known, shockingly enough, as Taser International) has announced plans for a lightweight taser that can be mounted on a drone or robot and controlled remotely via “targeting algorithms.” The operator, a human (for now), will “have agreed to accept legal and moral responsibility for every action that takes place.”
So they want to help stop school shootings.
Axon founder and CEO Rick Smith, himself a father of twin boys, told the BBC that the recent elementary school tragedy in Uvalde, Texas, compelled him to share the project with the public because politicians in the US “are not dealing with this effectively.”
His logic follows a precedent set by politicians themselves, particularly Republicans, who insist their efforts to “improve access to mental health and provide tools for crisis intervention officials” are an acceptable alternative to sweeping gun control reform. It doesn’t help that after the Uvalde shooting, Fox News provided a platform for guests who suggested handing out bulletproof “ballistic blankets” to schoolchildren as an effective way to curb gun violence.
Axon’s Taser drone project would apparently include integrated camera networks that share real-time sensor access with local public safety agencies. The company says it’s not sure who would be operating the drones: police departments, federal agencies, or Axon employees themselves.
The company’s artificial intelligence ethics board considered a limited pilot of the project last year and voted against further development. “Reasonable minds may differ on the merits of police-controlled Taser-equipped drones – our own board disagreed internally – but we are unanimously concerned about the process Axon has used in relation to this idea of drones in classrooms,” the board said.
But Smith is undeterred. He told the BBC that, pending regulatory approvals, a “proof of concept” model could be ready within a year and field trials could be possible in two.
In a statement to the Associated Press, a New York University law professor who sits on the Axon AI ethics board called the idea dangerous and fantastical. “This particular idea is crazy,” said Barry Friedman. “Drones can’t fly through closed doors. The physical properties of the universe still apply. So unless you have a drone in every single classroom in America, which seems crazy, the idea just isn’t going to work.”