''' O' ALBATROSS : -BEWARE NEARBY- ALGORITHMS '''
IN A STUDY of the first version of the list from 2013, RAND, a think-tank, found that people on it were no more likely to be victims of a shooting than those in a random control group. Police say the current list is far more accurate, but have still refused to reveal the algorithms behind it.
And both Chicago's murder rate and its total number of homicides are higher today than they were when police started using the list in 2013.
MEANWHILE, ALGORITHMS used in sentencing have faced criticism for racial bias. ProPublica, an investigative-journalism NGO, studied risk scores assigned to 7,000 people over two years in Broward County, Florida, and found black defendants twice as likely as whites to be falsely labelled as being at high risk of committing future crimes.
It also found the scores predicted violence poorly: only around 20% of those forecast to commit violent crimes actually did so. Northpointe, the firm behind the algorithm, disputed ProPublica's findings.
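ProPublica's headline number is a comparison of false-positive rates. A minimal sketch of that computation on invented records follows; this is not the real Broward County data, and the field names are ours, but it shows how "twice as likely to be falsely labelled" is measured:

```python
# Minimal sketch of the kind of check ProPublica ran: compare, per group,
# how often non-reoffenders were nevertheless labelled high risk.
# The records below are invented for illustration, NOT the real data.

records = [
    # (group, labelled_high_risk, reoffended_within_two_years)
    ("black", True,  False),
    ("black", True,  True),
    ("black", True,  False),
    ("black", False, False),
    ("white", True,  False),
    ("white", False, False),
    ("white", False, True),
    ("white", False, False),
]

def false_positive_rate(rows):
    """Share of people who did NOT reoffend but were labelled high risk."""
    non_reoffenders = [r for r in rows if not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders) if non_reoffenders else float("nan")

for group in ("black", "white"):
    rows = [r for r in records if r[0] == group]
    print(f"{group}: false-positive rate = {false_positive_rate(rows):.2f}")
```

On these made-up records the black false-positive rate (0.67) is double the white one (0.33), which is the shape of disparity ProPublica reported.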
THE TRUTH IS that the questions on Northpointe's risk-assessment form illustrate how racial bias can affect an algorithm even without any direct questions about race.
It asked how often a defendant, his family members and friends had been arrested. Those numbers will presumably be higher in poor, overpoliced, non-white districts than in rich ones.
It also asked whether friends were in gangs, how often the defendant had ''barely enough money to get by'' and whether it was ''easy to get drugs in your neighbourhood''. These are all questions that ethnic-minority defendants will, on average, answer affirmatively more often than white ones.
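The mechanism is easy to demonstrate. Below is a hedged sketch of a linear score built only from proxy questions like those above; the weights and the two hypothetical defendants are invented, not Northpointe's, but they show how identical conduct can score differently once over-policing inflates the arrest-count inputs:

```python
# Hedged sketch: a linear risk score with no race input. The feature names
# mirror the questions described above; the weights and the two example
# defendants are entirely invented for illustration.

WEIGHTS = {
    "prior_arrests": 1.0,          # defendant's own arrests
    "family_friend_arrests": 0.5,  # arrests among family and friends
    "friends_in_gangs": 2.0,
    "barely_enough_money": 1.5,
    "drugs_easy_to_get": 1.0,
}

def risk_score(answers: dict) -> float:
    """Weighted sum of questionnaire answers; race is never an input."""
    return sum(WEIGHTS[k] * v for k, v in answers.items())

# Two hypothetical defendants with identical behaviour but different
# neighbourhoods: over-policing inflates the arrest counts for one.
overpoliced = {"prior_arrests": 2, "family_friend_arrests": 3,
               "friends_in_gangs": 0, "barely_enough_money": 1,
               "drugs_easy_to_get": 1}
underpoliced = {"prior_arrests": 0, "family_friend_arrests": 0,
                "friends_in_gangs": 0, "barely_enough_money": 0,
                "drugs_easy_to_get": 0}

print(risk_score(overpoliced))   # 6.0: higher score, driven by proxies
print(risk_score(underpoliced))  # 0.0: lower score for the same conduct
```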
More broadly, a proprietary algorithm that recommends a judge punish two people differently based on what they might do offends a traditional sense of justice, which demands punishment for the crime, not the potential crime.
Another analytical system, called Beware, assigns ''threat scores'' in real time to addresses as police respond to calls.
It uses commercial and publicly available data, and it has a feature called Beware Nearby, which generates information about potential threats to police near a specific address, meaning officers can assess the risk posed by nearby homes even when it is a neighbour who calls the emergency services.
This raises privacy concerns, but it could cause other problems, too. For instance, a veteran who has visited a doctor and taken medicine prescribed for PTSD, and who also receives gun catalogues in the post, could be deemed a high risk.
Police might then approach his house with guns drawn, and it is not hard to imagine that kind of encounter ending badly.
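To see how a misfire like that could arise, here is a hedged, purely hypothetical sketch of a rule-based threat scorer over data-broker-style records. The vendor has never published Beware's actual logic, so every field, rule and threshold below is invented:

```python
# Hypothetical sketch of a Beware-style threat scorer. The real algorithm
# is undisclosed; every field, rule and threshold here is invented.

def threat_score(record: dict) -> str:
    score = 0
    if "PTSD" in record.get("pharmacy_keywords", []):
        score += 2   # medical history, scraped from commercial data
    if "gun catalogue" in record.get("mail_subscriptions", []):
        score += 2   # lawful mail-order history read as a risk signal
    if record.get("angry_social_posts", 0) > 5:
        score += 1   # free-expression data folded into the score
    return "RED" if score >= 4 else "YELLOW" if score >= 2 else "GREEN"

veteran = {
    "pharmacy_keywords": ["PTSD"],
    "mail_subscriptions": ["gun catalogue"],
    "angry_social_posts": 0,
}
print(threat_score(veteran))  # "RED": flagged before any contact with police
```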
Such threat scores also risk infection with bad data. If they use social-media postings, they raise free-expression concerns, too. Will police treat people differently because of their political views?
Questions of bias also surround place-based policing. Using arrests or drug convictions as inputs will almost certainly produce racially biased results: arrests reflect police presence more than crime.
Using drug convictions is suspect, too. Black and white Americans use marijuana at roughly similar rates (among 18- to 25-year-olds, the rate is higher for whites than blacks), yet black Americans are far more likely to be arrested for drug offences.
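The arrest-data problem also compounds over time. A toy sketch follows, under the loud assumption of two districts with identical true crime rates, showing how allocating patrols by past arrests turns an initial deployment imbalance into a runaway gap in recorded arrests:

```python
import random

random.seed(0)

# Toy simulation of a predictive-policing feedback loop. Both districts are
# ASSUMED to have the same true crime rate; only the initial patrol
# allocation differs. Patrols are then reassigned in proportion to past
# arrests, so the starting imbalance compounds. All numbers are invented.
TRUE_CRIME_RATE = 0.1
patrols = {"A": 8, "B": 2}   # unequal starting deployment
arrests = {"A": 0, "B": 0}

for _ in range(50):
    for district, n_patrols in patrols.items():
        # Each patrol observes crime at the SAME underlying rate everywhere.
        arrests[district] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(n_patrols)
        )
    total = arrests["A"] + arrests["B"] or 1   # avoid divide-by-zero early on
    # The "predictive" step: send officers where past arrests happened.
    patrols["A"] = round(10 * arrests["A"] / total)
    patrols["B"] = 10 - patrols["A"]

print(arrests)  # district A records far more arrests despite identical crime
```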
The Honor and Serving of the latest Operational Research on Law, Policing and Crime continues.
With respectful dedication to the Leaders, Students, Professors and Teachers of the world. See Ya all ''register'' on The World Students Society - for every subject in the world and..... Twitter - !E-WOW! - the Ecosystem 2011:
''' Students & Ranges '''
Good Night and God Bless
SAM Daily Times - the Voice of the Voiceless