EXCLUSIVE: AI bots to pose as kids in online chatrooms and 'snare thousands more paedophiles'

THOUSANDS more paedophiles could be snared by "AI bots" that police chiefs hope will replace undercover detectives posing as children in online chatrooms.

Police hope to use artificial intelligence technology to mimic the work currently done by undercover officers, who pose as children in online chatrooms to detect paedophiles attempting to groom them.

The digital technology could replicate the work of one officer "thousands of times" at the same time, according to Policing Minister Chris Philp, who revealed the plan was one of a number of high-tech proposals being worked on to get police "ahead of the curve" against criminals.

He warned that organised crooks were rapidly adopting AI to develop mass scam emails targeting millions more victims and to make realistic fake child sex abuse images.

He said: "AI can even work for conducting an investigation to identify paedophiles in online chatrooms rather than having a physical officer having to create a legend and pose as a child in an online chatroom. You can actually automate the process and have thousands of bots who are emulating that legend or similar legends rather than having to use police officers for each session.

"We need to be ready to embrace AI enthusiastically to make sure that we are at the forefront, but it is important to make sure the AI is safe."

Paul Taylor, Chief Scientific Advisor to the National Police Chiefs' Council (NPCC), said: "It is not a question of if, but how we can use AI safely and effectively."

He said 64 AI innovations were currently being piloted by different local police forces that could potentially save £352 million and 15 million officer hours a year if rolled out nationally.

He said: "(One) system learns over time how people investigate child exploitation and through that learning presents the evidence back to those individuals to allow them to do a more efficient investigation. Just a few months ago (the system) protected 21 young lives from abuse."

"Are we really going to stand by and not at least explore how we can engage AI and other capabilities to deliver the benefits to the public?

"We can't move too slowly. I appreciate all the anxiety and concerns, but we have to find ways to do this quickly. If we do not, that gap will become too big and we will not recover.

"Criminals do not act in the same way we do, they are using these technologies in ways that not even I can envisage, so we have to move quickly."

However, Alison Lowe, West Yorkshire Deputy Mayor for Policing and Crime, urged counterparts not to race ahead and to ensure the public consented to new techniques.

She said: "I do urge caution and calm because it would be very easy to get lost in the excitement and the opportunity that AI poses, but we do need to get the basics right.

"Firstly, we need to consult the public we serve as we have made an assumption that we think the people are with us. So find out what the communities think about AI and how far they want it to go and understand the limits and scope of the work they want us to do.

"Supermarkets have got more regulation and health and safety than AI - we need AI to work for the communities to keep us safe, but we need to be the masters of it and not be mastered by it."