
Critics say a predictive policing system could amplify racial bias in Oakland

on November 7, 2016

Twenty-two-year-old Kamani Holmes has dark skin, stands about six feet tall, and wears a full beard across his face. A Black Power fist adorns the center of his grey hooded sweater with the words “Movement Warriors” across the bottom. His light blue jeans perfectly match the tint of blue in his Jordan sneakers. Then, there’s the bracelet. It’s black so it can accompany any outfit—well, it must. Holmes’ GPS ankle monitor reminds him that he must check in with a parole officer regularly for three years. It reminds him that he’s still in the criminal justice system.

Holmes was first arrested at age 14 on suspicion of robbery. He said he was jailed for a week and released without being charged. By 15, he was arrested for possession of a firearm and burglary, and spent six months in the Orin Allen Youth Rehabilitation Facility, a juvenile hall. By age 19, he was convicted of felony first-degree burglary and possession of a firearm in Pinole, California, serving over a year in San Quentin State Prison. Now he’s on three years’ parole.

The same year Holmes was arrested in Pinole, the neighboring Richmond Police Department (RPD) instituted a new predictive policing program known as PredPol, which experts say could disproportionately target men who look like Holmes. PredPol is a computer-based algorithm that predicts where and when future crimes are likely to occur: officers feed previous crime reports into the software, which then flags the areas most likely to see future criminal activity.
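
PredPol’s actual model is proprietary, but the basic idea the company describes, tallying where reports have recently clustered on a map and flagging those cells for patrol, can be sketched in a few lines of Python. The grid cells and report data below are invented for illustration; this is a toy, not PredPol’s real method:

```python
# Toy "hotspot" predictor, illustrating the general idea only;
# PredPol's real model is proprietary and more sophisticated.
from collections import Counter

# Each entry is the map grid cell (x, y) where a past crime report
# was filed. These reports are made-up data for the example.
recent_reports = [(3, 7), (3, 7), (4, 7), (3, 7), (9, 2), (4, 7), (3, 6)]

def predict_hotspots(reports, top_k=2):
    """Flag the top_k grid cells with the most recent reports for patrol."""
    return [cell for cell, _ in Counter(reports).most_common(top_k)]

print(predict_hotspots(recent_reports))  # [(3, 7), (4, 7)]
```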

Oakland Mayor Libby Schaaf has requested funding to purchase PredPol. According to Brian Hofer, chair of the City of Oakland’s Privacy Advisory Commission, the city has budgeted for the cost of the PredPol software but has not yet acquired it. The commission assesses policing technology and presents its findings to city officials before the city makes a purchase. The 2016-2017 fiscal budget shows a $75,000 allocation for predictive policing software.

Critics say that software like PredPol could bring more problems to Oakland, a city where the relationship between police and community members is already rocky. Hofer said it could influence officers to target people in certain neighborhoods. “Officers will already have this preconceived notion that there are going to be criminals there when they get there. So, they’re going to be suspicious of everyone that’s there,” he said.

Holmes said he is concerned about the idea of police paying even more attention to certain neighborhoods. “It’s no doubt about it that we see the police in the neighborhood. They’re already pulling up and just sitting there because they’re already in the area. It’s not to our benefit at all. Based off my past experience, it enables the police to traumatize the youth a lot more,” he said.

Holmes was born in Oakland. His interactions with police started long before his own arrest, making him anxious around officers even when he wasn’t committing a crime. He started encountering the police as a child during weekend visits to his grandmother’s house in Oakland. Holmes said his grandmother sold drugs, and police officers would raid her house in the wee hours of the night. He remembered, “They kicked the door in, yelling. Turned on all the lights.” Then, he said, they aggressively handcuffed his grandmother and took her to jail, an experience he recalls as humiliating.

Holmes said he was directly targeted by police officers for the first time when he was 11 years old. He was playing outside with his brothers when two officers approached them. “They searched me as if I was distributing drugs,” he recalled. Holmes said his 16-year-old brother, who was with him at the time, felt powerless to protect him. “The fact that his baby brother was being targeted … it was a rough experience,” Holmes said. The officers didn’t find any drugs. “We were clean as a whistle. At that time, I hadn’t even thought about committing a crime,” said Holmes.

Holmes said during his teenage years he saw police officers up close on a regular basis, even when he wasn’t doing anything wrong. Once, while he was visiting a friend in Richmond and playing video games, they heard a knock at the door. Police officers ordered them to come outside, he said. Someone nearby had been robbed, and Holmes said the officers brought the victim to his friend’s house. The victim pointed to Holmes, he remembered, and then officers ordered him to sit in the back of the police car. While in the car, he recalls, he overheard an officer say, “I hope he confirms it’s him. This is it.” But the victim didn’t; Holmes and his friend went back inside and continued playing video games.

Holmes said he doesn’t remember police helping residents when they came to his neighborhood. “I remember they would always be in the area. You didn’t see police getting out and helping old women or doing things that good Samaritans do,” Holmes said. “They were out patrolling, trying to police the neighborhood.”

In October, researchers Kristian Lum and William Isaac of the Human Rights Data Analysis Group (HRDAG) published a study called “To Predict and Serve?” which found that some data-driven policing methods may in fact target some neighborhoods more than others. The study examined an algorithm similar to PredPol.

Their group researches human rights violations, and Lum and Isaac study policing in major U.S. cities. The researchers said they fear that algorithms like PredPol could be ineffective, or worse, result in discriminatory policing if police departments don’t carefully consider the data that goes into them. If police feed the formula crime reports shaped by biased policing practices, such as more heavily patrolling low-income areas and disproportionately reporting crimes in minority neighborhoods, the resulting predictions about future crime will also be biased.

“Predictive policing software is designed to learn and reproduce patterns in data, but if biased data is used to train these predictive models, the models will reproduce and in some cases amplify those same biases,” their study concludes.

Lum and Isaac believe that policing algorithms are a problem for cities that currently use them. The Chicago Police Department, for example, uses an algorithm that assigns a score to individual people who have been arrested, involved in a shooting or affiliated with a gang. It then generates what’s called a “Strategic Subject List.” The HRDAG study mentions a twenty-two-year-old Black Chicagoan named Robert McDaniel, who had been flagged by that algorithm as a risk for potential future involvement in violent crime, although he had no violent criminal record. According to the study, McDaniel didn’t even know he was on the list until a police officer paid him an unannounced visit, warning him to refrain from committing further crimes.

Unlike the Chicago Police Department’s algorithm, PredPol does not target individuals, but geographic areas. For their study, Lum and Isaac applied a simulation of the PredPol algorithm (using a similar algorithm released by the makers of PredPol) to Oakland arrest and police records related to drug crimes. Before applying the algorithm, they discovered there were 200 times more reported drug arrests in West Oakland and near International Boulevard, areas with largely non-white and low-income populations, than in other neighborhoods. But, according to the 2011 National Survey on Drug Use and Health, drug use occurs roughly evenly across the city. This made them concerned that using the PredPol algorithm would drive even more police attention to the neighborhoods where police were already making the most arrests.

Indeed, the researchers found that using a predictive policing algorithm in Oakland would concentrate police attention on Black residents. “Using PredPol in Oakland, Black people would be targeted by predictive policing at roughly twice the rate of whites. Individuals classified as a race other than white or Black would receive targeted policing at a rate 1.5 times that of whites. This is in contrast to the estimated pattern of drug use by race … where drug use is roughly equivalent across racial classifications,” they wrote in the study.

Lum and Isaac also found that low-income households would be disproportionately targeted compared to households at other income levels.

The researchers said they’re concerned that if police departments already target certain people, a predictive policing algorithm would strengthen officers’ biases, allowing them to continue to find crime in historically targeted areas, rather than look for it proportionately throughout an entire city. “This creates a feedback loop where the model becomes increasingly confident that the locations most likely to experience further criminal activity are exactly the locations they had previously believed to be high in crime: selection bias meets confirmation bias,” they wrote in their study.
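
A simple simulation makes that feedback loop concrete. In the sketch below (a toy model, not the researchers’ code), two neighborhoods have identical true crime rates, but patrols are allocated in proportion to past reports and only patrolled crime gets recorded, so the initial disparity in the records never corrects itself:

```python
import random

random.seed(0)
TRUE_CRIME_RATE = 0.5           # identical in both neighborhoods (assumption)
reports = {"A": 10, "B": 5}     # historical records: A was policed more heavily

for day in range(1000):
    total = reports["A"] + reports["B"]
    for hood in ("A", "B"):
        # Patrol with probability proportional to the neighborhood's
        # share of past reports; only patrolled crime is observed.
        patrolled = random.random() < reports[hood] / total
        if patrolled and random.random() < TRUE_CRIME_RATE:
            reports[hood] += 1  # the observed crime feeds back into the data

# A tends to retain roughly its initial two-thirds share of all reports,
# so the absolute gap keeps growing even though the underlying crime
# rates are identical.
print(reports)
```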

But the researchers didn’t conclude that the algorithm itself is biased. Instead, Lum is concerned about what she called “sampling bias,” meaning the data is biased before it ever reaches the algorithm, because crime records show that police have already made significantly more arrests in low-income minority communities than in others.

“If there are two locations with the same amount of crime, but you’re more likely to observe it in location A than location B—any algorithm that doesn’t take into consideration the fact that Location A is more likely to be observed will have these [sampling bias] issues,” she said.
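
Lum’s point can be put in back-of-the-envelope terms (the numbers below are illustrative, not from the study):

```python
# Two locations with identical true crime, but crime in A is twice as
# likely to be observed and recorded (illustrative numbers).
true_crimes  = {"A": 100, "B": 100}
observe_prob = {"A": 0.6, "B": 0.3}

recorded = {loc: true_crimes[loc] * observe_prob[loc] for loc in true_crimes}
print(recorded)  # {'A': 60.0, 'B': 30.0}

# A model trained on `recorded` without correcting for observe_prob will
# conclude that A has twice B's crime and send patrols accordingly.
```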

Isaac said police targeting low-income neighborhoods of color is nothing new. “I think that the novel point is that when you use predictive policing algorithms to try to forecast where future drug crimes are going to occur, because of the bias in the collection process, it just reinforces it,” he said.

The PredPol software was created in 2011, and the company that currently sells it launched in 2012. CEO Brian MacDonald said the algorithm was created by the Los Angeles Police Department (LAPD) and researchers at UCLA to use past crime reports to predict and prevent future crime.

After it was piloted in one Los Angeles area, MacDonald said, that neighborhood showed a 10 percent reduction in crime, while crime in other areas either stayed the same or increased. Santa Cruz was the first city to officially use it, in 2013. After using PredPol for a year, that city’s department saw a 19 percent decrease in burglaries. MacDonald said about 60 police departments throughout the U.S. now use PredPol, including Atlanta’s.

MacDonald said a flaw in Lum and Isaac’s study is that PredPol itself doesn’t use drug arrest records. Instead, he said, it uses crime report data for incidents such as break-ins, robberies and assaults. The difference between using arrest records and crime reports, MacDonald said, is that crime reports are filed after police officers complete an investigation and believe they have identified the person responsible for a crime. But in the case of drug arrests, someone can be arrested and released without being charged or convicted. For this reason, he said, crime reports are more objective data points.

MacDonald said if records show that a particular crime occurs in a particular neighborhood, PredPol will tell officers to monitor that area. For example, he said, “If auto theft occurs predominately in Black neighborhoods in one city, it doesn’t say anything about who’s stealing the cars—that just tells you where the victims of auto theft were. So, people who have had their car stolen are going to want an increased police presence there to reduce auto theft. If it occurs in a predominately white neighborhood, that’s where the police will patrol.”

Isaac wrote via email that he and Lum used both OPD crime reports that resulted in arrest and those that did not, which he called “very similar to the type of data used in PredPol’s systems.” He said they chose to focus on drug data in particular because they were able to compare results from the 2011 National Survey on Drug Use and Health to the OPD’s drug crime reports. This helped them understand where drug crimes occurred versus where those crimes were reported.

“The point of this analysis is to show that police records are statistically unrepresentative of where crimes occur, and reflect a mixture of societal and institutional factors,” he wrote. “Our arguments are not specific to drug crime, nor are they specific to PredPol. Rather, we are making a general point regarding consequences of machine learning models applied to crime report records that do not correct for the non-representative data generation process.”

While the OPD does not currently use PredPol, the neighboring Richmond Police Department used it starting in 2013 before discontinuing it a year later. RPD spokesperson Lieutenant Felix Tan said it was difficult to quantify the algorithm’s success, but he doesn’t think that means it doesn’t work. Ultimately, he said, he believes officers did the job better than the technology could. “Our police officers are versed with their beat and they know when crime occurs, what time and where,” Tan said. “We figured we should re-focus and have our officers do what they know how to do best.”

Tan also said that PredPol is colorblind; he doesn’t think it targets people of any race or ethnicity. “I don’t think [race] mattered at all. All it said is something is going to happen in this area—be there. It didn’t say perpetrator is going to be this group of people,” Tan said. “Crime is crime. Suspects are suspects. Race doesn’t matter to us.”

Hofer, the privacy commission chair, said he is against the idea of Oakland using PredPol. His job is to analyze surveillance programs that the city plans to purchase and then present that data to the city council. When he researched the algorithm’s use by other police departments, he said, he didn’t find clear data proving that it works. He called it “a waste of taxpayer money” and said that “it could be really bad for civil liberties.”

Hofer also said it could perpetuate racial bias in OPD policing, a problem he said the department is already grappling with. He referred to a 2015 report by the Electronic Frontier Foundation, a digital civil liberties group that studies surveillance programs. The report assessed Oakland’s use of automated license plate readers which, like PredPol, rely on collected data to direct policing. It showed that the license plate readers were disproportionately used in communities of color.

Hofer said that Oakland has a crime problem, but that using a predictive algorithm could cause officers to criminalize innocent people. “Elected leaders and police officers have to look like they’re doing something. We have to justify budgets and salaries. So, we’re going to have this computer [program] saying, ‘Hey, go to these certain neighborhoods at this certain time, because that’s where the crime is happening.’”

For example, he said, once the officer using the predictive software arrives in the targeted area, “I also have to justify being there. This computer told me there’s a crime. I need to tell my supervisor that I did something. We probably will find something wrong.” Then that new crime data will be entered into the algorithm and the cycle will continue.

Hofer said the bigger problem the city should fix is the broken trust between residents and police. “The relationships have been broken so badly that we’re not even trying to repair that. We’re not trying to build that community policing, where beat officers knew the community. We’re just relying on technology. Oakland does everything by technology and data, because we don’t have any witness cooperation, because nobody trusts the police department. When police officers do something wrong, they’re not held accountable,” he said.

Hofer said that although the city has been at a standstill on whether or not to purchase the PredPol software, that could change by next year.

Mayor Libby Schaaf’s spokesperson could not be reached via telephone, email or text for comment. An OPD spokesperson also did not respond to a request for comment.

Today, Holmes lives in Pittsburg, California, but travels to Oakland every day to work at Urban Peace Movement, an organization that aims to reduce violence and incarceration in communities of color. This is the last year of his parole. He recently received a grant to publish a magazine that will share information and resources with people who have been recently released from prison. Holmes also facilitated the first community forum held by the City of Oakland to hear what young people want in their next police chief. He said he did it because he hopes to help rebuild the police department in Oakland.

But he continues to feel uneasy when interacting with police. “It’s so unpleasant I don’t even go to Oakland with my son,” Holmes said of his four-year-old. But, he said, sometimes he has to in order to visit relatives. During one rainy car ride, Holmes recalled, “The police pulled me over and I was compliant, gave them my license and registration and insurance. They had another officer standing on the passenger side where my son was.”

“I asked if I could roll the window up, so my son doesn’t get sick or doesn’t get wet, and they said no,” Holmes recalled. “It’s real inconsiderate. It’s hard to deal with. It’s frustrating.”

When it comes to interacting with the police now, Holmes said he tries to avoid them—as part of his parole order to stay out of trouble, and for his own sanity. But it doesn’t have to be that way, he said. “Until a child has a good or bad experience with police, kids say, ‘Police are good. I want to be a police officer when I grow up.’” He said, “Now, I just stay away from them.”


Photo by Basil D Soufi