Technology will play a big part in the future of policing

Predictive policing allows officers to be directed to the right area at the right time in order to stop crimes before they are committed. This is a new version of a classic tactic: officers once relied partly on instinct and partly on a paper map to choose where to spend their shift patrolling. Now a sophisticated algorithm, fed with historical data about crimes, such as where they occur, how often, and what type they are, does the same work at a fraction of the labour.
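
To make the idea concrete, here is a minimal sketch of the grid-and-count logic behind such systems: past incidents are snapped onto a coarse map grid and each cell is scored by a recency-weighted tally. The cell size, the half-life, and the incident format are illustrative assumptions, not any vendor's actual model.

```python
from collections import Counter
from datetime import datetime

# Toy illustration of hotspot ranking: snap past incidents onto a coarse
# grid and score each cell by a recency-weighted count. All values and
# field layouts here are assumptions made for the example.
CELL_SIZE = 0.005  # degrees of latitude/longitude, roughly a city block

def cell_of(lat: float, lon: float) -> tuple:
    """Snap a coordinate onto a coarse grid cell."""
    return (round(lat / CELL_SIZE), round(lon / CELL_SIZE))

def rank_hotspots(incidents, now, half_life_days=30.0, top_n=5):
    """Score each cell by summing decayed weights of its past incidents:
    an incident half_life_days old counts half as much as one today."""
    scores = Counter()
    for lat, lon, when in incidents:
        age_days = (now - when).days
        scores[cell_of(lat, lon)] += 0.5 ** (age_days / half_life_days)
    return scores.most_common(top_n)

incidents = [
    (52.1332, -106.6700, datetime(2015, 3, 1)),   # break-in
    (52.1334, -106.6702, datetime(2015, 3, 20)),  # break-in, same block
    (52.1579, -106.6817, datetime(2015, 1, 5)),   # theft, elsewhere
]
for cell, score in rank_hotspots(incidents, now=datetime(2015, 4, 1)):
    print(cell, round(score, 2))
```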

This new model of policing is based on the CompStat model pioneered in New York City in the early 1990s, which was credited with reducing murder rates by 50 percent (CompStat). Not only is the program built on pre-existing data, but also on pre-existing logic, such as: “if there’s an area with lots of home break-ins, send police cars to patrol the area during the time of day when most of the robberies occur.”
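
That rule of thumb translates almost directly into code. The sketch below, using a made-up incident format of (area, crime type, hour), finds the hour of day when a given area sees the most break-ins, which is when this logic would send a patrol car.

```python
from collections import Counter

def peak_patrol_hour(incidents, area, crime_type="break-in"):
    """Return the hour of day with the most matching incidents in an
    area, i.e. when this simple rule would schedule a patrol."""
    hours = Counter(
        hour for a, ctype, hour in incidents
        if a == area and ctype == crime_type
    )
    if not hours:
        return None  # no matching incidents on record
    hour, _count = hours.most_common(1)[0]
    return hour

incidents = [
    ("Riversdale", "break-in", 22),
    ("Riversdale", "break-in", 23),
    ("Riversdale", "break-in", 22),
    ("Downtown", "theft", 14),
]
print(peak_patrol_hour(incidents, "Riversdale"))  # -> 22
```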

Crime can also be predicted from other factors, such as weather and wind patterns. PredPol, a supplier of predictive-policing software, uses the same algorithms that are used by seismologists to predict earthquake aftershocks and by Walmart to predict demand for certain products under specific conditions.
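
PredPol's published research describes a self-exciting point-process model of the kind seismologists use for aftershocks. The sketch below shows the general shape of such a model: a constant background rate plus a contribution from each past event that decays exponentially in time and as a Gaussian in space. The parameter values are invented purely for illustration.

```python
import math

def intensity(t, x, y, past_events, mu=0.1, k=0.5, omega=0.2, sigma=0.005):
    """Conditional intensity of an aftershock-style model: a constant
    background rate mu plus excitation from each earlier event. All
    parameter values here are made up for the example."""
    rate = mu  # background rate of crime at this place and time
    for t_i, x_i, y_i in past_events:
        if t_i >= t:
            continue  # only earlier events can excite later ones
        dt = t - t_i
        d2 = (x - x_i) ** 2 + (y - y_i) ** 2
        temporal = k * omega * math.exp(-omega * dt)
        spatial = math.exp(-d2 / (2 * sigma ** 2)) / (2 * math.pi * sigma ** 2)
        rate += temporal * spatial
    return rate

# Two recent break-ins on the same block raise the predicted rate there.
past = [(0.0, 52.1332, -106.6700), (2.0, 52.1334, -106.6702)]
print(intensity(t=3.0, x=52.1333, y=-106.6701, past_events=past))
```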

The right place at the right time

Predictive policing is being rolled out in numerous cities throughout the world with the goal of catching more criminals without spending more money on personnel. In Saskatoon, it is being pioneered as a joint venture between the provincial government, the University of Saskatchewan, and the Saskatoon Police Service. Because of the high rate of missing-persons reports in the city — 2,700 in 2014, of which 2,000 involved youth — it is being tested on these cases first. “If we’ve got certain predictors that show that a person does ‘x, y, z’ when they go missing,” then feeding that information into a computer makes it easier to figure out where they went, because the system can point investigators toward a likely end-point. In Los Angeles, the software taught police that “burglars tend to be territorial, so once they find a neighbourhood where they get good stuff, they come back again and again.”

Jeff Brantingham is an anthropology professor at the University of California whose research helped lay the foundations for predictive policing. 📸: Business Insider.

In most cases, the use of predictive policing has reduced crime. In Los Angeles, for example, the software predicted twice as much crime as trained analysts could, and its use prevented twice as much. However, in Kent, England, the crime rate did not decrease, which was attributed to the lack of training officers received on how to use the new software. The underlying data is also incomplete: in the United States, only half of the crimes committed are ever reported, according to government estimates. “The result is that systems will miss crimes that don’t fit patterns from the past, and law-enforcement agencies will devote more resources to looking for crimes they would already have found the old-fashioned way and less on crimes that require longer and deeper investigations.”

Predictive policing is also a more economically viable option for precincts facing budget cuts. “Significant budget reductions are requiring police managers and command staff to consider reductions in the retention of sworn personnel,” and using resources more efficiently has become essential for police services under budgetary constraints.

A cause for racial profiling

One fundamental concern with predictive policing is the public’s fear that the new technology will lead to racial profiling. The common belief is that more people from minority groups and those of lower socioeconomic status will be targeted; however, current algorithmic policing methods rely solely on data about the crime itself, not on who committed it, such as their age, name, or ethnicity. Therefore, if a neighbourhood where the majority of residents are people of colour has seen an increase in car theft in the last few months, it will be targeted because of the increase in crime, not because of the race of its residents.

Baltimore gang members stood bravely to protect a white reporter during riots. 📸: Inquisitr.

Alongside discrimination based on race, discrimination based on socioeconomic status is also a real possibility, since this new type of policing has not yet been tested on white-collar crimes. Both forms of policing rely on a ‘lowest-hanging fruit’ model, in which people who are less likely to fight charges are targeted; yet a criminal is a criminal, no matter the crime or their economic status. The difference is that predictive policing is currently applied mainly to petty theft, which may push more officers to focus on these crimes instead of larger ones with the capacity to harm far more people.

Who's held responsible?

Another challenge of using software to predict crimes is the need for all historical crime files to be completely accurate and up to date. Even a single erroneous file could trigger incorrect predictions and undermine the whole concept. To avoid this, precincts would need staff members to comb through each case file as it is added to the system and ensure that no errors are present in the paperwork. The issue of police corruption also arises. Corrupt officers have always existed, but with a shared computer system any officer could see their peers’ assignments and locations, information that, in theory, a corrupt officer could pass to a drug ring to warn it that its neighbourhood is being targeted as a hotspot. And police bias fed into the algorithm could reinforce pre-existing inequalities while appearing impartial, because, after all, computers can’t be racist.
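
A minimal sketch of the kind of sanity check staff might run while digitizing case files appears below; the required fields and the valid ranges are assumptions made for illustration.

```python
REQUIRED_FIELDS = ("case_id", "crime_type", "date", "latitude", "longitude")

def validate_record(record: dict) -> list:
    """Return a list of problems with one digitized case file; an empty
    list means the record is safe to feed to the prediction system."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS
                if f not in record]
    if "latitude" in record and not -90 <= record["latitude"] <= 90:
        problems.append("latitude out of range")
    if "longitude" in record and not -180 <= record["longitude"] <= 180:
        problems.append("longitude out of range")
    return problems

record = {"case_id": "2014-0481", "crime_type": "break-in",
          "date": "2014-10-02", "latitude": 52.13, "longitude": -206.67}
print(validate_record(record))  # -> ['longitude out of range']
```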

Other systems rely on information shared via social media networks to predict how likely someone is to engage in illegal activity, and UCLA is showing how such data can help predict an assortment of events, including crimes. The university has been “working with some of the world’s experts in computer and data science, big data infrastructure, and psychology/behavioural science” to develop “a platform to analyse this social data and spit out real-time predictions about future events and help public health officials prevent disease outbreaks, stop violent crime and reduce poverty.”

October 24, 2014 was the day of the Marysville Pilchuck High School shooting, in which Jaylen Fryberg killed four students before taking his own life. One month earlier, on September 20, he had posted tweets suggesting he was considering harming himself and others. Predictive policing using only historical data would not have been able to prevent this crime, but if the software could scour social media for prediction points, it might have. Then again, “people limit what they say when they know they are being watched, so models that rely on people’s speech have the very real potential to chill free expression.” Instead of using predictive models to find crime, similar technology could be used to address the societal factors that turn ordinary people into criminals. In Los Angeles, a pilot program uses predictive models to identify at-risk children in the child-welfare system and give them the services they need to stay out of the justice system. With adequate funding, programs like this could change the criminal-justice cycle instead of simply throwing people behind bars.

Rights, infringed

In Canada, predictive policing could easily violate the Charter of Rights and Freedoms — specifically section 8, the right against unreasonable search and seizure; section 9, the right not to be arbitrarily detained or imprisoned; and section 15, the right not to be discriminated against — as well as the Fourth Amendment and false-imprisonment laws in the United States. The real issue lies with the concept of ‘reasonable suspicion’, which allows police to stop suspects and detain them for short periods of time; reasonable suspicion can rest on something as minuscule as swerving on the road late on a Saturday night after work.

We are wired to be uncomfortable with the possibility of a computer predicting what we will do when we ourselves do not know. A system that takes in data, concludes that someone will commit a crime, and leads to their arrest on the strength of that prediction denies that person the opportunity to change their path.

Detention under sections 9 and 10 of the Charter of Rights and Freedoms “refers to a suspension of the individual’s liberty interest by a significant physical or psychological restraint.” Psychological detention is defined as a situation in which an individual has a legal obligation to cooperate with a restrictive demand, or where they would conclude that they have no other choice than to comply. In R. v. Grant, it was set out that in order to determine if an individual has been deprived of their liberty of choice by the state, the court may consider: the “circumstances giving rise to the encounter as they would reasonably be perceived by the individual”; the nature of the conduct, including the tone of language used, physical contact, where the encounter occurred, presence of others, and the duration of the conduct; and the characteristics of the individual, when relevant, such as age, minority status, and physical stature.

Some observers, however, caution about the privacy and liberty questions: the tool could be used by police to stop and frisk people in areas flagged by a software program. “No court has yet ruled on the impact of predictive policing,” and it has yet to be determined whether the software should be treated as a guide or as cause for profiling.

Section 15 states that every “individual is equal before and under the law and has the right to the equal protection and equal benefit of the law without discrimination and, in particular, without discrimination based on race, national or ethnic origin, colour, religion, sex, age or mental or physical disability.” The affirmative-action subsection goes on to state that this does not preclude any law or program whose object is to improve the “conditions of disadvantaged individuals or groups including those that are disadvantaged because of race, national or ethnic origin, colour, religion, sex, age or mental or physical disability.” In cities such as Vancouver, where predictive policing is already being utilized, it could be considered a lawful breach of the Charter in the sense that the objective of the program is to improve the conditions of people living in neighbourhoods with higher crime rates.

Predictive policing will allow for a stronger police force in a country riddled with overworked departments facing budget cuts. It has been argued that software-led policing will bring more racial profiling, but that problem was not introduced by predictive policing; it was evident long before the technology arrived. Using software to predict crime simply shines a light on the issues policing departments nationwide already face. One drawback, however, is that someone of colour who has never committed a crime could still find their neighbourhood subjected to disturbances because of the makeup of its residents. Furthermore, higher-profile crimes, such as money laundering, have yet to be tested with this new policing method.

Predictive policing is a promising new approach to fighting crime, using data and predictions to make smarter, quicker decisions. The use of the software in cities such as Los Angeles and Saskatoon shows that it can work with little infringement of the law, but before it can be fully utilized, the issues we currently face in our policing systems — specifically racial profiling and bias — must be addressed so that they are not passed on to predictive policing.