Welcome to Crime and Justice News

A Call For More Testing, Better Rules Governing New Police Technology

Experts from the Information Technology and Innovation Foundation (ITIF) explored police technology’s advances and risks in a panel discussion based on a new report, reports Governing.

The report, "Police Tech: Exploring the Opportunities and Fact-Checking the Criticisms," details how technologies like AI and robotics can help police prevent and respond to crime. It notes that opponents of police tech have some legitimate concerns, but that outright bans on some technologies are not the solution. Instead, more research, independent testing and governance rules could help mitigate risks.

“There’s plenty of room for technology to transform public safety and law enforcement,” said ITIF's Ashley Johnson.

Experts discussed the changing landscape of public safety technology, exploring advances both in the technology's capabilities and in how the public perceives its use.

Brendan Schulman of Boston Dynamics noted that robots are not new to law enforcement, but what has changed is the athletic capabilities of newer models like Spot, the company’s robot dog.

Schulman said Spot’s navigational capabilities make the device different for public safety officials.

The automation capability does not mean that the robot can act independently with its own intent, but rather that it can be directed to do a complex task like “open a doorknob” without someone operating it remotely to do so.

“So, there’s a significant amount of concern that I think is fictitious,” Schulman said, citing the influence of science fiction portrayals of robots. “But then there’s also, I think, a substantial list of concerns that are real and that the industry and the government should address.”

One of the ways these risks can be addressed is through independent testing and research.

For ShotSpotter, a company whose networks of audio sensors detect and locate gunfire incidents, New York University conducted an independent audit of the technology's privacy risks.

“We adopted the recommendations from NYU,” said ShotSpotter’s Tom Chittum. “And their conclusion was that our technology posed an extremely low risk to individual privacy.”

Schulman noted a lack of government policies regarding advanced robotic technologies, which amplifies public fears.

“I think what would be really useful — either at the department level, or city level or state level, or perhaps federally in terms of guidance to state authorities — is some type of framework,” Schulman said.

He believes this type of framework could address the weaponization of robots, the use of cameras, and warrant requirements for a robot to enter a specific premises.


A daily report co-sponsored by Arizona State University, Criminal Justice Journalists, and the National Criminal Justice Association