This summer, the Edmonton Police Service will turn on its newest crime-fighting tool, the Operations and Intelligence Command Centre (OICC), a high-tech, multimillion-dollar control centre at the Southwest Division station. When the centre is fully operational, four full-time teams of sergeants, constables and intelligence analysts will work 24/7, mining an ever-growing collection of public and private data with software and algorithms and distilling it all across a wall of 24 TV screens. It is, more or less, exactly what you’re thinking — which might mean you’re also thinking of Minority Report, or any other movie where police use technology to infringe the rights of law-abiding citizens.
That’s a reasonable concern, agrees Inspector Warren Dreichel, who recognizes the pressure the EPS faces to get this right. “We don’t want to be the agency responsible for screwing it up, because it’s really easy to set a bad precedent,” he says. Regulations ensure the current, pre-OICC system follows privacy protocols, but Dreichel admits “hard limits (for the OICC) are still up in the air … Down the road, there’s a lot of work to do on privacy.”
At its core, the OICC is just another response unit, like a patrol-car team, but it has the biggest brain. When a call comes in, it will search millions of records from databases that have long been accessible: past police reports, criminal and suspect profiles, and data from health services and the Canadian Police Information Centre. It filters the information faster than officers rushing to the scene ever could. But it will also access closed-circuit security and traffic cameras, and social media feeds. Police must establish agreements with businesses ahead of time to access their security footage, but they are free to access traffic camera content and Twitter feeds, because you are too.
“When lots of people post, it can give us one more piece of the puzzle,” Dreichel says. An example: there’s a robbery at a convenience store, and somebody tweets that a car is speeding from the scene. If that person’s phone’s GPS is turned on, the OICC’s mapping environment can pull up the tweet along with all social media activity in the area, and filter it by keywords. “We would then contact them and ask if they’re interested in talking with us. If they say no, we don’t push it. And we don’t access closed or private accounts,” Dreichel says. With images, police can only run them against their own mugshot database, not social media at large. “We’re not allowed and, responsibly, we shouldn’t,” Dreichel says.
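The filtering Dreichel describes can be sketched as a simple geofenced keyword search. Everything below is a toy illustration with invented data and invented names (`Post`, `nearby_matches`), not the OICC’s actual software:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Post:
    user: str
    text: str
    lat: float
    lon: float  # only available if the poster left GPS tagging on

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def nearby_matches(posts, scene_lat, scene_lon, radius_km, keywords):
    """Public posts near the scene whose text mentions any keyword."""
    return [
        p for p in posts
        if haversine_km(p.lat, p.lon, scene_lat, scene_lon) <= radius_km
        and any(k in p.text.lower() for k in keywords)
    ]

posts = [
    Post("witness1", "car speeding away from the convenience store!", 53.546, -113.494),
    Post("bystander", "lovely sunset over the river valley", 53.546, -113.494),
    Post("faraway", "heard about a robbery on the news", 53.300, -113.900),
]
hits = nearby_matches(posts, 53.545, -113.493, 1.0, ["robbery", "speeding"])
# only "witness1" survives both filters: close to the scene AND keyword-matched
```

Note that the two filters do different work: the distance check narrows millions of posts to a neighbourhood, and the keyword check narrows the neighbourhood to potential witnesses worth contacting.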
The police sliding into your DMs because you geotagged your Slurpee purchase might feel intrusive, but it’s not illegal — it’s essentially the online equivalent of door-to-door canvassing. But as Geoffrey Rockwell, director of the Kule Institute for Advanced Study at the University of Alberta, explains, using the database that way could lead to false positives, and “it won’t be long before people figure out that they’re doing this and stop live tweeting or posting from crime scenes.”
Even if police narrow their use of social media down to actual criminals, problems arise. “By only scraping the Twitter feeds of known offenders, you’re going to get a secondary set of linked people who are all of a sudden being ingested into a large database who show up as ‘known to the police,’ which is sort of an ominous phrase,” Rockwell explains. Add in the real possibility of police security cameras with facial recognition software and you’re closer to the full-blown Minority Report scenario of predictive policing.
Anna Koop is a director of applied machine learning at the Alberta Machine Intelligence Institute (Amii), and when she “first heard we were meeting with (EPS), I thought, crap, they’re going to ask us to do predictive policing and I’m going to tell them no, because often I bring up predictive policing as a counterexample of something that can have horrible consequences if it’s not done carefully.” But she left the meeting impressed with what they knew, like the perils of using a police database to train an algorithm to predict where future crimes will occur, creating a positive feedback loop that inevitably targets people based on race and class.
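The feedback loop Koop warns about can be shown with a toy simulation (all numbers invented, and no real algorithm is this naive, but the dynamic is the same): two neighbourhoods with identical true crime rates, where patrols follow recorded crime and crime is only recorded where patrols go.

```python
# Two neighbourhoods with the SAME underlying crime rate,
# but slightly unequal historical records (numbers are invented).
TRUE_RATE = {"north": 0.5, "south": 0.5}
recorded = {"north": 10.0, "south": 12.0}

for _ in range(20):
    # Naive allocation: send the patrol wherever recorded crime is highest.
    target = max(recorded, key=recorded.get)
    # Crime is only discovered, and recorded, where the patrol goes.
    recorded[target] += TRUE_RATE[target] * 10  # 10 potential discoveries per round

# The small initial gap snowballs: "south" now dominates the database
# even though the true rates were identical from the start.
```

Because the patrols themselves generate the data that directs future patrols, the model keeps “confirming” its first guess, which is exactly how historical over-policing of a neighbourhood gets laundered into an apparently objective prediction.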
Koop clarifies that the problem facing police is the same one facing anyone who wants to apply machine learning to big data — the tool is only as “smart” as you make it — but the consequences of mistakes are drastically different. That also means the solutions aren’t unique to police.
According to Koop, the solutions start with recognizing that no database — and therefore no algorithm — will ever be unbiased, in either the societal or mathematical sense, so the working assumption should be that there is significant bias, and the job is to scour for it. Using non-EPS databases helps, because it reduces the bias of past enforcement history. There are technical solutions, like designing algorithms to account for known biases. And you can filter what algorithms see: with security footage, that means “suspicious activity detection” algorithms have less to look at and don’t consider everything suspicious. And, of course, the humans need constant training, too.
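One simple form of designing around a known bias is to normalize recorded counts by how heavily each area was policed. This is a toy illustration with invented numbers, not a technique attributed to Amii or the EPS:

```python
# Recorded incidents per area, and the relative enforcement effort
# (say, patrol-hours) that produced those records. Numbers are invented.
recorded = {"area_1": 200, "area_2": 50}
patrol_hours = {"area_1": 4.0, "area_2": 1.0}

def adjusted_rate(area):
    """Normalize recorded counts by how heavily the area was policed."""
    return recorded[area] / patrol_hours[area]

rates = {a: adjusted_rate(a) for a in recorded}
# The apparent 4x difference in recorded crime disappears once
# enforcement intensity is accounted for: both areas come out equal.
```

The adjustment only works if enforcement intensity is itself measured honestly, which is one reason Koop stresses scouring for bias rather than assuming it away.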
To its credit, the EPS sought counsel from Amii on these and other solutions, so no one can say it is ignorant of the issues.
“We’re in a big data model, so we need to use it to be more effective,” Dreichel says. “I don’t know what the answers are, or what data sets are most important. But (places like Amii) can help. We have to get out of our own way on that, and listen to experts.”