questioning the data detectives

The Economist's look at data detectives discusses changes in "the relationship between information and crime"--all because "people generate more searchable information than they used to:"

Smartphones passively track and record where people go, who they talk to and for how long; their apps reveal subtler personal information, such as their political views, what they like to read and watch and how they spend their money. As more appliances and accoutrements become networked, so the amount of information people inadvertently create will continue to grow.

To track a suspect's movements and conversations, police chiefs no longer need to allocate dozens of officers for round-the-clock stakeouts. They just need to seize the suspect's phone and bypass its encryption. If he drives, police cars, streetlights and car parks equipped with automatic number-plate readers (ANPRs, known in America as automatic licence-plate readers or ALPRs) can track all his movements.

As these changes accumulate, "the gap between information technology and policy gapes ever wider:"

Most privacy laws were written for the age of postal services and fixed-line telephones. Courts give citizens protection from governments entering their homes or rifling through their personal papers. The law on people's digital presence is less clear. In most liberal countries, police still must convince a judge to let them eavesdrop on phone calls.

The piece points out that "data can be abused personally as well as constitutionally:"

A policeman in Washington, DC, was convicted of extortion for blackmailing the owners of cars parked near a gay bar. ANPR firms insist what they do is constitutional--in America the First Amendment protects public photography. But not everything constitutional is desirable. Even the International Association of Chiefs of Police has admitted that ANPRs could have an impact on freedom by recording vehicles going to political gatherings, abortion clinics or other sensitive venues.

"The use of algorithms to tackle complex problems such as urban crime, or to try to forecast whether someone is likely to commit another crime," The Economist continues, "is not inherently alarming:"

An algorithm, after all, is just a set of rules designed to produce a result. Criminal justice algorithms organise and sort through reams of data faster and more efficiently than people can. But fears abound: that they remove decisions from humans and hand them to machines; that they function without transparency because their creators will not reveal their precise composition; that they punish people for potential, not actual, crimes; and that they entrench racial bias.
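
The worry becomes clearer with a concrete case. Here is a minimal, entirely hypothetical sketch of what "a set of rules designed to produce a result" looks like as a risk-scoring tool; the factors and weights are invented for illustration, and no real system is this simple:

    # A toy illustration, not any vendor's actual system: the "algorithm"
    # is just rules applied to data, with weights invented for the example.
    def risk_score(prior_arrests: int, age: int, failed_appearances: int) -> int:
        """Return a naive recidivism risk score from 0 (low) to 10 (high)."""
        score = min(prior_arrests, 5)            # each prior arrest adds risk, capped at 5
        score += 2 if age < 25 else 0            # youth counted as a risk factor
        score += min(failed_appearances * 2, 3)  # missed court dates weigh heavily
        return min(score, 10)

    print(risk_score(prior_arrests=2, age=22, failed_appearances=1))  # -> 6

Every fear the article lists is visible even in this toy: the inputs ("prior arrests") inherit whatever bias produced them, the weights are arbitrary, and if they were a trade secret no defendant could contest them.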

The article lists a few of the technological advances in question:

Acoustic sensors trained to recognise the sound of gunfire and send alerts to officers' mobile phones telling them when and where the shots were fired. Glasses that recognise faces and record everything. Drones equipped with high-definition video cameras. GPS readers and ANPRs, allowing for constant surveillance of entire swathes of a city. CCTV systems with embedded facial recognition that lets authorities track people in real time.
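
The ANPR entry on that list is worth dwelling on, because the camera is the least of it; the surveillance comes from the database behind it. A minimal, hypothetical sketch (the plates, camera names, and timestamps are invented):

    # Hypothetical sketch: each ANPR read is a (plate, camera, timestamp)
    # record, and checking it against a hotlist is a trivial set lookup.
    from datetime import datetime

    hotlist = {"ABC1234"}  # plates flagged by police (invented)

    reads = [
        ("XYZ9876", "camera_03", datetime(2018, 6, 6, 9, 15)),
        ("ABC1234", "camera_07", datetime(2018, 6, 6, 9, 41)),
    ]

    for plate, camera, when in reads:
        if plate in hotlist:
            print(f"ALERT: {plate} seen at {camera} at {when:%H:%M}")

    # The civil-liberties question is what happens to the reads that do NOT
    # match: retained in bulk, they become a movement history of every driver.

Retain those non-matching reads for months and you have exactly the record of visits to "political gatherings, abortion clinics or other sensitive venues" that the police chiefs' association worried about above.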

All of these new technological possibilities are upending a wide range of activities and the customs associated with them. Law enforcement is no different. But if citizens do not like how their doctor or hairdresser, or a social-media site, uses their data or tracks their purchases, they can go somewhere else. The state wields a monopoly on punishment through law enforcement. Police can arrest, and even kill, their fellow citizens. Judges have the power to imprison people. That makes transparency and public consent in the justice system essential.

Andrew Ferguson, author of The Rise of Big Data Policing, "suggests five questions that departments should answer before buying new technology:"

Can you identify the risks that the technology addresses? Can you ensure accurate data inputs? How will the technology affect community relations and policing practice? Can it be tested to ensure transparency and accountability? And will police use the technology in a manner that respects the autonomy of the people it will affect?

The old line about "more questions than answers" is as true as ever.
