Police: "Fighting online child sex crime is a team effort"


KEEPING children safe online and fighting internet sex crime is everyone’s responsibility.
South Yorkshire Police has a specialist team dedicated to rooting out and prosecuting offenders who groom children online and seek out child abuse images.
But they insist parents, schools, the tech industry and children themselves can all play a part in stamping out the scourge.
Det Insp Claire Mayfield said: “Parents can do a lot, including having open discussions with their kids about what they’re doing online.
“It is everyone’s responsibility.
“It could be a company with a duty of care to its customers and people using those services, to the government in terms of putting legislation in place, social care and safeguarding and parents as well.
“And as kids get older they have that personal responsibility.”
The South Yorkshire team conducts its own pro-active work to root out offenders - earlier this month a man was sentenced for grooming a "12-year-old boy" on dating app Grindr who was actually an undercover police officer - but also receives reports from the public and referrals from the National Crime Agency, which oversees serious crime investigations nationally.
Det Chief Supt Chris Singleton said: “Law enforcement agencies across the globe are adept at understanding where paedophile activity exists and what sites they visit — these are numerous and evolving.
“They are ever-evolving because the sites are either taken down or become old hat.
“They are constantly looking over their shoulders, so those who enter into this behaviour feel uncomfortable and move on.
“We’re not just talking about three or four providers of this material - there are dozens out there.”
File-sharing networks, where internet users can exchange files without using a central server, are known to be exploited by child abuse image traders, and Mr Singleton said this activity accounted for a “significant” portion of his team’s work.
Sensitive techniques, which the police decline to elaborate on, are used “to allow us to get an indication that there is activity taking place”, said Det Insp Mayfield, after which: “Research is done and that information allows us to identify potential suspects and locations.”
There are other key players in the fight, including non-governmental organisations like the Internet Watch Foundation and major internet companies like Google, Facebook and Microsoft.
The tech giants assist by detecting known illegal imagery on their own networks, issuing deterrent warnings to offenders seeking out this material online — a study by American academic Chad Steel found that searches for explicit child sexual abuse terms fell by two-thirds in the year after these warnings were introduced — and offering technical support to key organisations and smaller companies.
Live-streaming of abuse, which is filmed on webcams and often transmitted around the world, has also been identified as an emerging threat.
And the NSPCC said earlier this year that police nationally had recorded more than 3,100 online grooming offences in just a year — nine every day.
Another issue close to home is what the police refer to as “self-generated imagery”, often produced by children or young people.
What those involved may not understand is that the pictures they send to their boyfriend, girlfriend or anyone may be sent on and distributed worldwide — and that they are committing a criminal offence, even if they’re under 18 themselves.
Rather than criminalising all young people taking and sending these pictures, a more considered approach is often taken.
“An investigation will take place with social care and there will be safeguarding done and educational messages given to both parties,” said Det Insp Mayfield.
While ill-informed youngsters may avoid prosecution, there is no such short shrift for dyed-in-the-wool collectors and traders.
Det Insp Mayfield said: “These offenders have sometimes been doing this for most of their adult lives.
“If they have kept the material all that time we will go back and show the offending going back all those years.
“The technology is improving all the time and helping us to find these people.”
The NSPCC and CEOP both offer tips on keeping children safe online, available on their websites.
The Key Players


National Crime Agency
THE NCA coordinates the fight against online grooming and offenders viewing and trading child abuse images and videos.
As well as leading complex operations itself, the national agency deals with referrals from America’s National Center for Missing & Exploited Children (see below) on behalf of US-based internet companies and from overseas police forces.
Its Click CEOP tool, which is embedded in many websites, also allows the public to report grooming or other offences direct to the NCA.
Last year, the NCA allocated almost 10,000 potential cases to UK police forces.
Internet Watch Foundation
The IWF is the UK’s hotline for reporting child sex abuse imagery and receives reports from internet companies, the police and the public.
Analysts also pro-actively search online for websites and other platforms hosting this material and have reported a surge in “hidden” sites.
Last year, the IWF identified more than 70,000 unique URLs with child sex abuse images.
The IWF maintains a regularly-updated list of these web addresses, which internet companies, from ISPs like BT and Virgin to social media and search giants like Google and Bing, can block, as well as a list of keywords associated with explicit child sex abuse terms to help with further blocking.
The IWF’s Hash List, which has been widely adopted by the internet industry, contains hashes — digital fingerprints each linked to an illegal image — allowing companies to detect when these images are uploaded to or shared on their platforms.
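Conceptually, hash-list matching works like the sketch below. This is a minimal illustration using an ordinary cryptographic hash (SHA-256); real systems such as the IWF Hash List and PhotoDNA use perceptual hashes that survive resizing and re-encoding, which this example does not reproduce, and the byte strings here are placeholders.

```python
import hashlib

# Hypothetical block list: a set of hashes, each the "fingerprint" of a
# known illegal image supplied by a body such as the IWF.
known_hashes = {hashlib.sha256(b"placeholder-known-image-bytes").hexdigest()}

def should_block(upload: bytes) -> bool:
    """Hash the uploaded file and check it against the block list."""
    return hashlib.sha256(upload).hexdigest() in known_hashes

print(should_block(b"placeholder-known-image-bytes"))  # True: on the list
print(should_block(b"some-other-upload"))              # False: not on the list
```

Because only hashes are exchanged, companies can detect matches without ever distributing the images themselves.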
National Center for Missing & Exploited Children
THE NCMEC is a non-profit organization that serves as the United States’ national clearinghouse on issues relating to missing and sexually-exploited children.
NCMEC operates the CyberTipline, which serves as a reporting system for suspected child sexual exploitation, and accepts tips from the public and more than 1,500 tech companies about child sexual abuse imagery found or detected on their networks, as well as from search providers about illegal sites and images found during search indexing.
NCMEC, which also takes tips on other offences including child grooming and sex trafficking, attempts to identify where offenders uploading or transmitting child abuse images are located and dispatches the intelligence gathered to relevant law enforcement agencies.
Microsoft
Microsoft’s main contributions are PhotoDNA and PhotoDNA for Video, which use hashes to detect known child sex abuse imagery, including in the email service Outlook and cloud storage service OneDrive.
The tech giant has donated the PhotoDNA technology to over 150 organisations, including companies, non-profits, and forensic tool developers.
Like many other companies, Microsoft reports to the NCMEC any child sex abuse material reported by its users or uploaded and detected by PhotoDNA.
It also uses its own database of PhotoDNA hashes and those provided by the IWF to prevent known illegal images from appearing in search results and issues deterrent warnings in response to searches for explicit child sex abuse terms.
Microsoft also includes family safety settings in its products, works with non-profit bodies and other companies and takes part in Safer Internet Day projects every year.
Google
Google also uses hashes to detect known child sex abuse images on its systems and products, and uses IWF tools to block known illegal imagery from search results.
Like Microsoft, it shows warnings to users searching for terms associated with child sex abuse, stating that such content is illegal and directing them to helplines.
The search giant, which has assisted the NCMEC by providing cloud computing support and given technical help to the IWF, passes details of any illegal images reported by its users, detected on its platforms or found during search indexing to the NCMEC, which in turn informs law enforcement.
It also responds to formal legal requests from police investigating suspects.
The company’s latest weapon in the fight against child abuse images is an artificial intelligence tool which can help to identify suspected illegal pictures that have not previously been identified.
Google is making this technology available to small companies and non-governmental groups to help them sort flagged content for abuse material.
Facebook
Facebook uses PhotoDNA to scan its network and all uploads for known illegal imagery, and also responds to user reports of this material, referring all cases to the NCMEC and law enforcement agencies.
The company also removes all photos of child nudity found on Facebook, blocks searches for known child exploitative terms and suspends accounts of users posting child sex abuse imagery.
Facebook said it was expanding its online safety team to 20,000.
Oath (Yahoo and AOL)
Oath also partners with the IWF and NCMEC to detect and report illegal images of children on its network.
It also provides families with a variety of safety advice across its brands and sponsors UK Safer Internet Day education packs.