
General Problems with the Usage of Predictive Policing Tools

2.1.1 Ethical problems

Admittedly, it is difficult to talk about race and crime without partiality or bias. What makes views on race and crime so polarized? What are the facts about race and crime? Is "big data" discriminatory? Does PredPol have a hidden weakness? Answering these questions is more complicated than one might think.

First, it is important to understand that society's portrayal of crime can matter more than the actual dynamics of crime. American history illustrates this: crime has predominantly been represented as something committed by Black people [Scheingold 2010]. In the UK, the government has reported that people from ethnic minorities are more likely both to be arrested and to become victims of crime [Bulman 2017]. Assumptions about the race of criminals can become engrained in the public consciousness, so that the two ideas are linked: "talking about crime equals talking about race" [Barlow 1998].

This discrimination also penetrates the criminal justice system, and many people believe that the system is biased against them [Hurwitz & Peffley 1998]. In one study, a survey administered to a random sample of participants revealed a belief that Black people are crime-prone: approximately half of the respondents believed that a relationship exists between race and criminality, and among them, 65% thought that Black people perpetrated more crimes than other racial groups [Ferguson 2017]. Race and crime are also political issues; looking deeper, they are tied to poverty and unemployment, which raise further unanswered, hidden questions.

Against this background, big data policing has emerged, with "PredPol" as a new technology intended to predict and thereby diminish the crime rate, and to be more objective and accurate than traditional policing. But the coin has two sides. One way "big data" can discriminate is that PredPol relies on an algorithm, and in trying to smooth away anomalies in the data, the algorithm may discard exactly the patterns that correspond to minority populations.

Building a model to predict outputs raises important questions. Instead of fixing biases in policing, the algorithm is blamed for creating a new set of problems. Inputs go in and generate outputs based on correlations, so if the algorithm is trained on biased data, it can produce biased output. For instance, if the police arrest people for drug offences mainly in minority districts, even though people of all races use drugs at similar rates, the data will show a correlation between race and drug use. Moreover, researchers who examined how PredPol predicts crime argue that the software creates a "feedback loop": officers are repeatedly sent back to the same neighbourhoods, where the number of racial minorities is high, regardless of the true crime rate in those areas.
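The feedback loop can be illustrated with a minimal simulation. This is a hypothetical sketch of the dynamic described above, not PredPol's actual algorithm: both districts have the same true crime rate, but patrols are sent to whichever district has more previously recorded incidents, and crime can only be recorded where officers are present.

```python
import random

random.seed(0)

# Hypothetical sketch of the "feedback loop", not PredPol's actual algorithm.
# Both districts share the SAME true crime rate; the only difference is a
# small imbalance in the historical records.
TRUE_RATE = 0.1          # identical underlying crime rate in both districts
PATROLS_PER_DAY = 20

recorded = [10, 12]      # slightly more recorded crime in district 1

for day in range(100):
    # every patrol goes to the district with the larger recorded count
    target = 0 if recorded[0] > recorded[1] else 1
    for _ in range(PATROLS_PER_DAY):
        # crime can only enter the data where an officer is present
        if random.random() < TRUE_RATE:
            recorded[target] += 1

print(recorded)
```

Although both districts are equally criminal, the small initial imbalance directs every patrol to district 1, so only district 1 accumulates new records, which in turn justifies sending patrols there again: the biased data reproduces itself.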

Moreover, such correlations do not accurately reflect the actual distribution of criminal activity in society. Why can the input data be biased? Therein lies the danger of these algorithms. The algorithm itself is quantitative, but building a "big data" system requires human judgment, and the data are produced by humans. Human decisions can therefore affect the whole design of the system, so it may reflect the biases in the data and perpetuate discrimination and negative biases against minority groups [Reynolds 2018][Smith 2016].

Researchers used reported crime data from Oakland in 2010 to predict where crimes would occur in 2011 [Figure 14], and used survey data to create a heat map showing where drug use in the city was most prevalent in 2011. As mentioned, the software is caught in a "feedback loop": the algorithm chooses how to distribute resources between locations. If more officers are sent to one location, they make more arrests there, which leads to sending still more officers to that same place, so the area becomes over-policed, especially areas with a high number of racial minorities, regardless of the true crime rate there. That happened in this case as well: police practices in Oakland matched PredPol's suggestions. Oakland police were already doing what PredPol's map suggested, over-policing minority neighbourhoods, even though white people used illicit drugs at higher rates than minorities according to the survey [Smith 2016] [Figure 15]. In this case, PredPol was not capable of predicting and diminishing crime in the right way.
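The Oakland finding can be sketched the same way, with purely hypothetical numbers rather than the study's data: drug use is equally common in two neighbourhoods, but because an arrest requires an officer to be present, the arrest data mirror patrol intensity instead of actual drug use.

```python
import random

random.seed(1)

# Hypothetical numbers, not the Oakland data: identical true drug-use
# rates, but neighbourhood "A" receives nine times the patrols of "B".
USE_RATE = {"A": 0.08, "B": 0.08}
PATROLS  = {"A": 90,   "B": 10}
CHECKS_PER_PATROL = 10       # residents each patrol interacts with

arrests = {}
for hood, patrols in PATROLS.items():
    checked = patrols * CHECKS_PER_PATROL
    # an arrest needs both a drug user AND an officer there to observe it
    arrests[hood] = sum(random.random() < USE_RATE[hood] for _ in range(checked))

print(arrests)  # arrest counts track where the police were, not who used drugs
```

If these arrest counts were then fed back into a predictive model as "crime data", the model would conclude that neighbourhood A is far more crime-prone, which is exactly the biased-input problem described above.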

Figure 14. PredPol case study

Figure 15. Estimated drug users vs. PredPol targets

Lastly, it is hard to predict the future of PredPol and whether it will be able to solve or ease these ethical problems in society. We can at least hope that such a "big data" system will one day predict where crime is most likely to occur without bias, and bring us more objective and accurate policing.