
Microsoft says its software can tell if you're going back to prison

Microsoft is pitching its software and cloud data storage to law enforcement agencies. Researchers say that using data to find crime patterns can help stop burglaries, but using data analysis to 'predict' violent crimes is highly problematic.

Ann Hermes/The Christian Science Monitor/File
The Atlanta Police Department displays a city map through PredPol, a predictive crime algorithm, at the Operation Shield Video Integration Center in Atlanta, in January. Microsoft has been increasingly marketing its own crime prediction software to police. But experts say that while computers can help solve crimes like burglary, predicting violent crime is highly unlikely.

In a scenario that seems ripped straight from science fiction, Microsoft says its machine learning software can help law enforcement agencies predict whether an inmate is likely to commit another crime by analyzing his or her prison record.

In video tutorials and at policing conferences, such as one held on Oct. 6 at the Massachusetts Institute of Technology, Microsoft has been quietly marketing its software and cloud computing storage to law enforcement agencies.

It says the software could have several uses, such as allowing departments across the country to analyze social media postings and map them in order to develop a profile for a crime suspect. The push also includes partnerships with law enforcement technology companies, including Taser, the stun gun maker, to provide police with cloud storage for body camera footage that is compliant with federal standards for law enforcement data.

But in a more visionary, or possibly dystopian, approach, the company is also expanding into a growing market for what is often called predictive policing: using data to pinpoint the people most at risk of being involved in future crimes.

Predictive or preventative?

These techniques aren't really new. A predictive approach, preventing crime by understanding who is involved and recognizing patterns in how crimes are committed, builds on efforts dating back to the early 1990s, when the New York City police began using maps and statistics to track the areas where crimes occurred most frequently.

"Predictive policing, I think, is kind of a catch-all for using data analysis to estimate what will happen in the future with regard to crime and policing," says Carter Price, a mathematician at the RAND Corporation in Washington who has studied the technology. "There are some people who think it's like the 'Minority Report'" (the film in which an elite police unit predicts crimes and makes arrests before they occur) "but it's not. No amount of data is able to give us that type of detail."

Scholars caution that while data analysis can provide patterns and details about some types of crimes, such as burglary or theft, when it comes to violent crime such approaches can at best indicate who is at high risk of violent victimization, not produce a list of potential offenders.

"Thinking that you do prediction around serious violent crime is empirically inaccurate, and leads to very serious justice issues. But saying, 'This is a high risk place,' lets you focus on offering social services," says David Kennedy, a professor at John Jay College of Criminal Justice. In the 1990s, he pioneered an observation-driven approach that worked with local police in Boston to target violent crime. After identifying small groups of people in particular neighborhoods at high risk of either committing a crime or becoming a victim of violence, the program, Operation Ceasefire, engaged them in meetings with police and community members and presented them with a choice: either accept the social services that were offered or face a harsh police response if they committed further crimes. It eventually resulted in a widespread drop in violent crime often referred to as the "Boston Miracle."

The Operation Ceasefire program, Professor Kennedy says, was never meant to be predictive. But using data to discover patterns of how crimes are committed and map where they occur has been more successful, especially for lower-level crimes.

One approach

A few years ago, Cynthia Rudin, who teaches statistics and focuses on machine learning at the Massachusetts Institute of Technology, received an e-mail from a member of her department asking for help using data analysis to aid the Cambridge Police in catching burglars.

The e-mail originally came from Lt. Daniel Wagner, the department's lead crime analyst. After meeting with Lt. Wagner, she was surprised when he presented her with an academic paper suggesting one method for analyzing the crimes using a computer model, then said he thought it wouldn't work.

"I read the paper, it was a really dense mathematical paper," says Professor Rudin, an associate professor at MIT's Sloan School of Management. When she came up with an alternate suggestion, Lt. Wagner was still skeptical. "I couldn't believe this guy, so he shot my idea down, and I went back to the drawing board, and everybody liked that idea."

Her eventual approach, using a machine learning tool to analyze the department's existing records and analysis to determine whether there were particular patterns in how burglaries were committed, proved successful.

The software, which Rudin and doctoral student Tong Wang called Series Finder, works by identifying patterns based on information such as when and where a crime occurred, how the burglar broke into a home, and whether the victim was at home or on vacation at the time.

"Luckily, the Cambridge Police Department had kept really good records, so the computer was learning from what analysts had already written down," she says. "It's not like the computer was off on its own being artificially intelligent; it was supervised learning. When we ran Series Finder, it was able to correct mistakes in the database and it said, 'That isn't right.'"

Series Finder supplements the work of human analysts, she says, by identifying crime patterns in minutes that might take an analyst hours of poring over records and databases.

"It would be nice if the computer automatically said, there's a pattern here: there's a pickpocket here every Tuesday and Thursday," Rudin says. "How many crime series were not caught because those patterns weren't detected? We'll never know, because humans just aren't that good at sorting through larger sets of data."
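The article does not show how Series Finder works internally. Purely as a rough sketch of the kind of pattern matching it describes, the snippet below scores pairs of burglary records for similarity on features like location, time of day, method of entry, and whether the victim was away; the field names, weights, and threshold are illustrative assumptions, not values from Rudin and Wang's model, which learned such patterns from the analysts' own records.

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Burglary:
    # Hypothetical record fields, loosely echoing the features the article lists
    lat: float
    lon: float
    hour: int          # approximate hour of day of the break-in
    entry: str         # e.g. "rear window", "front door pried"
    victim_away: bool  # was the resident on vacation or otherwise out?

def similarity(a: Burglary, b: Burglary) -> float:
    """Score how alike two burglaries look; the weights are illustrative guesses."""
    score = 0.0
    if abs(a.lat - b.lat) < 0.01 and abs(a.lon - b.lon) < 0.01:
        score += 0.4   # roughly the same neighborhood
    if min(abs(a.hour - b.hour), 24 - abs(a.hour - b.hour)) <= 2:
        score += 0.2   # similar time of day
    if a.entry == b.entry:
        score += 0.3   # same method of entry
    if a.victim_away == b.victim_away:
        score += 0.1   # same victim situation
    return score

def candidate_series(records: list[Burglary], threshold: float = 0.7):
    """Flag pairs of burglaries that look similar enough to show an analyst."""
    return [(i, j) for (i, a), (j, b) in combinations(enumerate(records), 2)
            if similarity(a, b) >= threshold]
```

In the real system, as Rudin describes, that judgment of similarity came from supervised learning on what the department's analysts had already written down, rather than being set by hand.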

A lucrative market

Professor Rudin was able to use grant funding to make her research possible, providing the Cambridge Police with access to Series Finder for free. The goal, she says, wasn't to commercialize the software but to showcase how machine learning can help solve real-life crime problems.

But commercialization has been inevitable, as companies such as PredPol, which predicts crimes using software originally developed to forecast earthquakes, and Microsoft have expanded in a bid to attract more business from law enforcement agencies.

In one video tutorial for law enforcement agencies, Microsoft makes a sweeping claim. Using records pulled from a database of prison inmates and looking at factors such as whether an inmate is in a gang, his or her participation in prison rehabilitation programs, and how long such programs lasted, its software predicts whether an inmate is likely to commit another crime that ends in a prison sentence. Microsoft says its software is accurate 90 percent of the time.

"The software is not explicitly programmed but patterned after nature," Jeff King, Microsoft's principal solutions architect, who focuses on products for state and local governments, says in the video. "The desired outcome is that we're able to predict the future based on that past experience, and if we don't have that past experience, we're able to take those variables and then classify them based on dissimilar attributes."
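Microsoft has not published the model behind that claim. Purely as a hypothetical sketch of how a classifier over the attributes the video names (gang affiliation, participation in rehabilitation programs, program length) might be trained, here is a toy example using scikit-learn's logistic regression; the records, labels, and field order are invented solely to make the example runnable and say nothing about any real system's accuracy.

```python
# Toy sketch only: NOT Microsoft's model, and the data below is made up.
from sklearn.linear_model import LogisticRegression

# Each row: [in_gang (0/1), completed_rehab_program (0/1), program_length_months]
X = [
    [1, 0, 0],
    [0, 1, 12],
    [1, 1, 6],
    [0, 0, 0],
    [1, 0, 3],
    [0, 1, 18],
]
# 1 = returned to prison within some follow-up window, 0 = did not (invented labels)
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Estimated probability that a hypothetical new inmate reoffends
new_inmate = [[1, 1, 9]]
print(model.predict_proba(new_inmate)[0][1])
```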

Rudin has also been working on using machine learning to predict recidivism, but she pitches her research differently.

"They don't just give you probability, they give you reasons. You can use them for various decisions like bail or parole or social services," she says. Ultimately, she adds, the goal isn't profit; it's developing tools that allow police to use data in ways that may help improve outcomes of the justice system.
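As a schematic of what "probability plus reasons" might look like, the sketch below tallies a few factors into a score and returns both an estimated probability and the factors that drove it; the factor names and point values are invented for illustration and are not drawn from Rudin's published models, which learn such scorecards from data.

```python
import math

# Hypothetical point values; an interpretable model would learn these from data.
SCORECARD = {
    "prior_convictions_3_or_more": 2,
    "age_under_25": 1,
    "no_stable_housing": 1,
    "completed_rehab_program": -2,
}

def score_with_reasons(attributes: dict[str, bool]):
    """Return an estimated probability plus the factors that contributed to it."""
    points = sum(w for name, w in SCORECARD.items() if attributes.get(name))
    reasons = [name for name in SCORECARD if attributes.get(name)]
    probability = 1 / (1 + math.exp(-points))  # logistic link on the point total
    return probability, reasons

prob, reasons = score_with_reasons({
    "prior_convictions_3_or_more": True,
    "completed_rehab_program": True,
})
print(round(prob, 2), reasons)
```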

Will predictive analysis help improve policing?

In its marketing, Microsoft has touted the impact of its cloud storage software, particularly in making police body cameras, long controversial for police departments, easier to use.

The Oakland Police Department, for example, has begun using Microsoft's Azure cloud storage service for its body cameras. In Oakland, body cameras have reduced officers' use of force by 60 percent since they were adopted in 2009, the department says.

The police credit the cameras with an 18-month stretch without officer-involved shootings, though there have since been two shootings this summer, for which the department has released its body camera video.

"Predictive policing will never replace traditional law enforcement, but it will certainly enhance the effectiveness of existing police forces, allowing for an informed and practical approach to crime prevention rather than merely making arrests after the fact," wrote Kirk Arthur, Microsoft's director of worldwide public safety and justice, in a blog post in January.

While predictive policing is still in the early stages, some say the data it generates could have a mixed impact. While the information could improve police transparency, it could also lead to other problems.

"If police departments had access to social media accounts, and it turned out that crimes were being committed by people who liked a certain kind of music and a certain sports team, it could lead to certain kinds of racial discrepancies," says Dr. Price, the RAND researcher. "It's a useful tool, but it should always be done with [the idea of] keeping in mind how this will impact populations differently, and just sort of being cognizant of that when policies are put in place."

But Kennedy, the criminology professor, says that for violent crimes, using data that shows crime risks to influence policing actions could have devastating consequences.

"People have been trying to predict violent crimes using risk factors for generations, and it's never worked," he says. "I think the inescapable truth is that, as good as the prediction about people may get, the false positives are going to swamp the actual positives ... and if we're taking criminal action on an overwhelming pool of false positives, we're going to be doing real injustice and real harm to real people."
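Kennedy's warning is, at bottom, a base-rate argument, and a little arithmetic shows why. The numbers below are assumptions chosen only for illustration: a 2 percent prevalence of future violent offending in a screened population, with a 90 percent accuracy figure used as a stand-in.

```python
# Illustrative base-rate arithmetic; every number here is an assumption.
population = 100_000   # people screened
prevalence = 0.02      # share who would actually commit a violent crime
sensitivity = 0.90     # share of true future offenders the tool flags
specificity = 0.90     # share of non-offenders it correctly leaves alone

true_positives = population * prevalence * sensitivity               # 1,800
false_positives = population * (1 - prevalence) * (1 - specificity)  # 9,800

share_false = false_positives / (true_positives + false_positives)
print(f"flagged people who are false positives: {share_false:.0%}")
```

Under those assumptions, roughly 84 percent of the people flagged would be false positives, which is exactly the swamping effect Kennedy describes.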
