Even AI doesn’t want to see THAT: Met police want to use artificial intelligence to identify child porn, but it keeps tagging pictures of the desert

Wednesday, January 03, 2018

Part of combating child pornography is poring over images and videos containing such content. Going through footage of child abuse is enough to distress any member of the force, but help may soon be on the way. The Metropolitan Police, Greater London’s own law enforcement agency, is developing an artificial intelligence (AI) system that can take on the task of scouring computers and phones for child pornography. Better still, the system could be fully functional within two to three years.

There’s a catch, however. The system hasn’t yet advanced to the point where it can reliably distinguish human skin from pictures of deserts. “Sometimes it comes up with a desert and it thinks it’s an indecent image or pornography. For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin color,” said Mark Stokes, head of the Metropolitan Police’s digital and electronics forensics department.
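To see why a desert photo can fool such a detector, consider a minimal sketch, purely illustrative and not the Met’s actual system, of a classic rule-of-thumb skin mask. Sand, like skin, consists largely of warm, red-dominant pixels, so a simple pixel-color heuristic scores dunes and flesh much the same:

```python
# Purely illustrative, NOT the Met's system: a simplified version of a
# widely cited RGB skin rule (Kovac et al.) that flags bright,
# red-dominant pixels -- a range desert sand also happens to occupy.
from PIL import Image
import numpy as np

def skin_like_fraction(path: str) -> float:
    """Fraction of pixels falling inside a crude 'skin-like' RGB range."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (r - g > 15)
    return float(mask.mean())

# A desert screensaver can score as high on this heuristic as a photo
# containing exposed skin, producing exactly the false positives
# Stokes describes:
# skin_like_fraction("desert_screensaver.jpg")  # e.g. 0.6+
```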

Although it still requires a great deal of tweaking, the AI system could someday spare police officers that psychological trauma. In 2016 alone, the Metropolitan Police went through 53,000 devices in search of incriminating content. “We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans. You can imagine that doing that year-on-year is very disturbing,” Stokes told the Telegraph.
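As a rough illustration of where machine learning could slot into that grading workflow, here is a hedged sketch, not the Met’s design, in which a classifier’s output is triaged so that only images the model is unsure about ever reach a human grader (the grade labels follow the UK’s A-to-C sentencing categories; the threshold and data are made up):

```python
# Illustrative sketch only, not the Met's system: route each
# machine-graded image either to an auto-grade list or to a
# human-review queue, so officers see only the uncertain cases.
from dataclasses import dataclass

GRADES = ("A", "B", "C")  # UK sentencing categories, most to least severe

@dataclass
class GradedImage:
    path: str
    grade: str         # provisional machine-assigned category
    confidence: float  # model's confidence in that grade

def triage(predictions, review_threshold=0.9):
    """Split (path, grade, confidence) tuples into auto-graded items
    and a queue of paths needing human review."""
    auto, needs_review = [], []
    for path, grade, confidence in predictions:
        if grade in GRADES and confidence >= review_threshold:
            auto.append(GradedImage(path, grade, confidence))
        else:
            needs_review.append(path)
    return auto, needs_review

# Example: only the uncertain middle case goes to a human.
auto, queue = triage([("img1.jpg", "C", 0.97),
                      ("img2.jpg", "B", 0.55),
                      ("img3.jpg", "A", 0.99)])
```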

Another hurdle the Metropolitan Police has to overcome is storage. The force currently relies on a London-based server storage center, but growing image resolutions and the increasing volume of explicit material have strained its capacity. It has considered moving the data to a cloud provider such as Google or Amazon Web Services, though that option carries its own drawbacks, including the risk of the material being hacked and leaked into the public domain, and the absence of legal permission to store criminal images on cloud storage.

In spite of these issues, Stokes says the force is ironing out the kinks with providers: “We have been working on the terms and conditions with cloud providers, and we think we have it covered.” (Related: Latest facial recognition software can identify you even if your face is COVERED, exchanging even more privacy for “safety”.)

The use of AI to monitor and identify explicit content has been on the rise in recent years. Apart from the system the Metropolitan Police is working on, there’s Identifying and Catching Originators in P2P Networks (iCOP), a project created by a team of academics from assorted disciplines. Already in use by law enforcement agencies across Europe, the program is unique in that it can single out new images and videos of child sexual abuse rather than only recognizing known ones.

“The existing tools match the files that are being shared through existing databases, but our program detects new data. If you look at peer-to-peer networks, images are shared at a pace that’s just not feasible for any human to go through [manually],” said Claudia Peersman, iCOP project leader and computational linguist.
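The distinction Peersman draws is easy to see in code. In this hedged sketch (the “database” is a toy stand-in, not a real law-enforcement hash set), hash matching recognizes only byte-for-byte copies of files already on record, so even a trivially altered file, let alone a genuinely new one, slips past it. That is the gap a content-based detector like iCOP aims to close:

```python
# Hedged sketch of the limitation Peersman describes; the hash set
# below is a toy stand-in, not a real law-enforcement database.
import hashlib

# Hashes of files already catalogued by investigators (toy values).
KNOWN_HASHES = {hashlib.sha256(b"previously catalogued file").hexdigest()}

def matches_known_database(data: bytes) -> bool:
    """Hash matching: flags only exact copies of files already on record."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(matches_known_database(b"previously catalogued file"))   # True
print(matches_known_database(b"previously catalogued file."))  # False:
# one changed byte and the file is invisible to hash matching, which is
# why iCOP scores the content itself instead of comparing fingerprints.
```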

iCOP has proven effective: in trials on real-life cases, its false positive rate came to less than eight percent. Though the team is still working to improve its capabilities (it can’t yet trawl the dark net), this AI and the one being developed by the Metropolitan Police are making it all the easier to put the kibosh on child pornography.

Go to VirtualReality.news to read more news and breakthroughs regarding artificial intelligence and robotics.

Sources include:

DailyMail.co.uk

Telegraph.co.uk

Vocativ.com


