9/08/2018

Headline September 09, 2018/ '' 'O' PROMISE - O' CRIME - O' PERIL' '' : BIG DATA


'' 'O' PROMISE - O' CRIME - O' PERIL' '' :

BIG DATA




PREDPOL IS JUST ONE of a number of firms offering crime-prediction software to police forces. While the precise components of each firm's algorithms probably differ, the broad idea is the same.

They aim to help police allocate resources efficiently by using large amounts of data to predict, and therefore prevent, crime..........

Eight storeys above downtown Los Angeles, Sean Malinowski, deputy chief of the Los Angeles Police Department [LAPD], focuses intently on a computer map of his old stomping ground.

Nestled between Burbank and Santa Clarita, the Foothill district is a hotch-potch of industrial and residential districts riven by highways. Mr. Malinowski ran its police station before his promotion moved him downtown.

Colourful dots representing reported crimes freckle the map like psychedelic pimples. Adjacent to some of the dots are red squares. Each one represents a 250,000-square-foot [2.3-hectare] area that PredPol, crime-prediction software used by the LAPD and at least 50 other law-enforcement agencies around the world, has flagged as being at risk of future criminal activity.

Mr. Malinowski says that, if he were still in charge of policing in Foothill, he would ask his officers to drive through these areas frequently, ''so we're there randomly - it throws the criminals off.''

The idea is not to nab people red-handed, but to deter them through increased police presence.
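PredPol's actual model is proprietary, so the sketch below is only a rough illustration of the general idea described here: divide the map into grid cells of roughly the size mentioned above, count recent reported crimes in each cell, and flag the highest-scoring cells for extra patrols. The function names, the counting rule and the example coordinates are invented for illustration and are not PredPol's.

```python
from collections import Counter

CELL_FEET = 500  # each grid cell is 500 ft x 500 ft (250,000 sq ft, roughly 2.3 ha)

def flag_hotspots(reported_crimes, top_n=3):
    """Toy hotspot flagging: bucket recent reported crimes into grid cells
    and return the cells with the most incidents.

    reported_crimes: iterable of (x_feet, y_feet) coordinates of recent reports.
    Real systems weight incidents by type and recency; this sketch only counts them.
    """
    counts = Counter(
        (int(x // CELL_FEET), int(y // CELL_FEET)) for x, y in reported_crimes
    )
    return [cell for cell, _ in counts.most_common(top_n)]

# Example: three reports cluster in one cell, so that cell is flagged first.
reports = [(120, 430), (180, 460), (90, 400), (2600, 2700), (5100, 900)]
print(flag_hotspots(reports, top_n=2))  # [(0, 0), (5, 5)]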

PredPol is just one of a number of firms offering crime-prediction software to police forces. While the precise components of each firm's algorithms probably differ, the broad idea is the same. They aim to help police allocate resources efficiently by using large amounts of data to predict and therefore prevent crime.

The use of algorithms to tackle complex problems such as urban crime, or to try to forecast whether someone is likely to commit another crime, is not inherently alarming. An algorithm is, after all, just a set of rules designed to produce a result.

Criminal-justice algorithms organise and sort through reams of data faster and more efficiently than people can.

But fears abound: that they remove decisions from humans and hand them to machines; that they function without transparency because their creators will not reveal their precise composition; that they punish people for potential, not actual, crimes; and that they entrench racial bias.

Defenders of such programmes argue, correctly, that police have always relied on prediction in some form. Officers line parade routes, for instance, because experience has shown that the combination of crowds, alcohol and high spirits creates an increased risk to public safety.

Eliminating prediction from policing would produce an entirely reactive force. All these programmes do, defenders say, is harness more data from more sources to help police make better decisions.

But the algorithms on which police base their decisions are, as far as the public is concerned, black boxes.

The companies that create and market them consider their precise composition trade secrets. ''Algorithms only do what we tell them to do,'' says Phillip Atiba Goff of John Jay College of Criminal Justice in Manhattan.

If their creators feed them biased data, they will produce results infected with bias. And predictive policing is just one way in which the criminal-justice system is using algorithms to help make decisions.

New Jersey uses an algorithm based on past criminal history, age, past failure to appear at trial and the violence of the current offence to determine whether someone is suitable for bail - that is, whether he presents too great a risk of flight or of committing more crimes while awaiting trial.
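The precise weights of New Jersey's tool are not given here, so the sketch below is a hypothetical point-based score built only from the factors named above - criminal history, age, past failure to appear and the violence of the current offence. Every weight, the age cut-off and the release threshold are invented for illustration, not New Jersey's.

```python
def bail_risk_score(prior_convictions, age, failures_to_appear, violent_current_offence):
    """Hypothetical point-based pretrial risk score.

    Uses only the factors named in the article; the weights and the
    threshold below are invented for illustration.
    """
    score = 0
    score += 2 * prior_convictions       # longer criminal history -> higher score
    score += 3 * failures_to_appear      # past no-shows weigh heavily on flight risk
    score += 4 if violent_current_offence else 0
    score += 1 if age < 23 else 0        # younger defendants scored as higher risk
    return score

def recommend_release(score, threshold=6):
    """Flag defendants whose score stays under the threshold as suitable for release."""
    return score < threshold

s = bail_risk_score(prior_convictions=1, age=30, failures_to_appear=0,
                    violent_current_offence=False)
print(s, recommend_release(s))  # 2 True
```

The design choice such tools make is exactly the one critics worry about: once the weights are fixed, the score, not a judge's individual judgment, does most of the sorting - and if the input data are biased, the scores will be too.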

Several states use algorithms to provide sentencing recommendations. At least 13 American cities use them to identify people likely to become perpetrators or victims of gun violence.

The Honor and Serving of the latest Global Operational Research on Predictive Policing continues. The World Students Society thanks authors and researchers at The Economist.

With respectful dedication to the Students, Professors of the world. See Ya all............. ''register'' on www.wssciw.blogspot.com and Twitter !E-WOW! - the Ecosystem 2011:

''' Algorithm Sentencing '''

Good Night and God Bless

SAM Daily Times - the Voice of the Voiceless
