How Does Google Report CSAM?
October 3, 2023 | Don Pumphrey, Jr. | Criminal Defense, Sex Crimes
Google’s search engine is used by people around the world every day. While the platform is intended for educational and informative purposes, some individuals allegedly use it to commit crimes, including the possession of child pornography. A person who uses Google to search for or download content considered child pornography could face criminal prosecution.
This page provides insight into Google’s current data on child sexual abuse material (CSAM) reporting and the methods Google uses to detect such illicit content and report it to the National Center for Missing and Exploited Children (NCMEC). We also provide a recent example case from Florida.
Google’s Data on CSAM Reporting
The following data comes from Google’s Transparency Report for July 2022 through December 2022:
- Total content reported to NCMEC – 6,704,684
- CyberTipline reports to NCMEC – 1,130,042
- Accounts disabled for CSAM violations – 365,428
- URLs reported and de-indexed for CSAM from Google Search – 437,020
- CSAM hashes contributed to NCMEC – 2,298,027
The following are the top 10 countries where accounts disabled for CSAM violations were located:
- Indonesia
- Brazil
- India
- United States
- Mexico
- Thailand
- Russia
- Vietnam
- Colombia
- Philippines
Process for Detecting, Removing, and Reporting CSAM
According to Google’s Safety & Security page, governments and child safety organizations alike expect, and in many cases require, online providers to report and remove CSAM from their systems. Google’s stated goal is to “prevent abuse on our platforms while minimizing the risk of an incorrect suspension.”
When Google finds traces of CSAM on its platforms, it will “remove it, report it, and often take the step to suspend the account.”
Detection
Google detects CSAM on its platforms in two ways: hash matching and artificial intelligence (AI).
Hash matching technology is used to find known CSAM online. It works by assigning each image or video a “hash,” a unique digital signature that can be compared against databases of known signatures. When two hashes match, the content is considered either the same or closely similar. Google obtains hashes from the National Center for Missing and Exploited Children (NCMEC) and other sources, and explains that a match is only a starting point: its team independently reviews each purported CSAM match to confirm accuracy. A simplified sketch of the matching step follows.
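To make the matching step concrete, here is a minimal Python sketch. It uses a cryptographic hash (SHA-256) to illustrate the exact-match idea; real detection systems rely on perceptual hashes, which also match re-encoded or lightly edited copies of the same image. The database contents and function names below are hypothetical.

```python
import hashlib

# Hypothetical database of known hashes, e.g. sourced from NCMEC's
# hash lists. The entry below is a meaningless placeholder value.
KNOWN_HASHES = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",
}

def file_hash(path: str) -> str:
    """Compute a SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """Exact-match lookup against the known-hash database.

    Per Google's description, a hit is only a starting point:
    a trained human reviewer confirms the match before any report.
    """
    return file_hash(path) in KNOWN_HASHES
```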
Artificial intelligence (AI) is used to detect newly created CSAM. It works by flagging content that shares patterns with previously confirmed examples of CSAM. Google explains that its systems are designed to distinguish CSAM from benign imagery, such as children playing in a backyard.
Further, Google gives other companies access to its Child Safety Toolkit, which includes the Content Safety API for prioritizing likely abusive content for human review. A simplified sketch of score-based triage follows.
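The prioritization idea can be sketched as a simple triage queue: each upload receives a risk score from a classifier, and human reviewers work through the queue in descending order of risk. This is a minimal illustration, not the Content Safety API’s actual interface; the scores and identifiers below are hypothetical.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ReviewItem:
    neg_score: float                        # negated so highest risk pops first
    content_id: str = field(compare=False)

def build_review_queue(scored: list[tuple[str, float]]) -> list[ReviewItem]:
    """Order content for human review by descending classifier score."""
    heap = [ReviewItem(-score, cid) for cid, score in scored]
    heapq.heapify(heap)
    return heap

# Hypothetical classifier scores for three uploads.
queue = build_review_queue([("img-001", 0.12), ("img-002", 0.97), ("img-003", 0.55)])
while queue:
    item = heapq.heappop(queue)
    print(item.content_id, -item.neg_score)  # img-002 is reviewed first
```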
In addition to this technology, Google employs specialized content reviewers with expertise in relevant laws, child safety and advocacy, cyber investigations, and what constitutes CSAM.
Duty to Report
Federal law 18 U.S.C. § 2258A sets out the reporting requirements for online providers. Section 2258A(a)(1)(A) states, in substance, the following:
To reduce and prevent the proliferation of online child sexual exploitation, a provider:
- Shall, as soon as reasonably possible after obtaining actual knowledge of any facts or circumstances of an apparent or imminent violation involving child pornography, report the following information to the CyberTipline of NCMEC (sketched as a data structure after this list):
- Information about the involved individual – Identifying information about the person suspected of violating federal law, including an email address, Internet Protocol (IP) address, uniform resource locator (URL), or other identifying information.
- Historical reference – Information on how and when the suspected person or provider uploaded, transmitted, or received the alleged CSAM, including a date, time stamp, and time zone.
- Geographical location – Information on the location of the suspected individual or provider, including the IP address, or at least one form of geographical identifying information.
- Visual depictions of apparent CSAM – Any visual depiction of apparent CSAM, or other content related to the incident the report concerns.
- Complete Communication – The complete communication containing any visual depiction of apparent child pornography or other content, including:
- Any data or information regarding the transmission of the communication; and
- Any visual depictions, data, or other digital files contained in, or attached to, the communication.
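To summarize how these statutory elements might map onto a report, below is a hypothetical data structure grouping the fields the statute lists. NCMEC’s actual CyberTipline schema is not reproduced here; every field name is illustrative.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CyberTipReport:
    """Illustrative grouping of the fields 18 U.S.C. § 2258A describes."""
    # Information about the involved individual
    email_address: str | None = None
    ip_address: str | None = None
    url: str | None = None
    # Historical reference
    upload_time: datetime | None = None
    time_zone: str | None = None
    # Geographical location
    geo_identifier: str | None = None
    # Visual depictions of apparent CSAM
    depiction_file_ids: list[str] = field(default_factory=list)
    # Complete communication (transmission data and attached files)
    communication_metadata: dict[str, str] = field(default_factory=dict)
```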
Once Google completes its review, it reports the alleged CSAM to NCMEC as required by law. After evaluating the report, NCMEC refers the case to the relevant law enforcement agency. That agency can then request that Google disclose user information; such requests are issued in the context of criminal investigations.
To find out more about the forensic resources used in investigations by the NCMEC, refer to our blog post here.
Example Case
Miami-Dade Police arrested a Hialeah man after receiving reports that he possessed multiple files containing child sexual abuse material (CSAM).
According to the report, NCMEC received a CyberTipline tip on July 19, 2022, regarding a user who had uploaded 122 files containing alleged child pornography. Investigators assigned to the case reviewed the reported account and discovered the user had uploaded images and videos considered to be child pornography.
Of the 122 files uploaded, detectives reported that certain images depicted nude girls as young as 6 years old, and the uploaded videos depicted minors ranging from 5 to 12 years old.
On August 24, 2023, police executed a search warrant at the home of 54-year-old Pedro Humberto Castillo-Paez. The report indicated that Castillo-Paez gave a “full confession” to authorities. He has since been charged with eight counts of possession of child pornography depicting the sexual performance of a child.
Contact a Tallahassee Defense Attorney
Internet providers such as Google are required to report apparent instances of CSAM to NCMEC, which refers cases to the relevant law enforcement agencies. The proposed STOP CSAM Act of 2023 would impose stricter rules and regulations on a provider’s reporting of alleged CSAM. The bill has not yet passed, but if it does, it could result in over-reporting, mandatory content filtering, and the suppression of lawful speech. You can read more about the potential changes in the STOP CSAM Act of 2023 in our blog post here.
If you or a loved one is being prosecuted for a crime relating to child pornography, it is imperative that you consider hiring an experienced defense attorney. The penalties for alleged sex crimes against children are steep, including fines, imprisonment, and lifetime registration on the Florida Sex Offender Registry.
The defense attorneys at Pumphrey Law have experience working on these types of cases. Don Pumphrey and his team are knowledgeable regarding Florida’s laws against child pornography. Further, our team is aware of the different ways a person can be wrongfully charged with a child pornography offense. In addition to building your case and a strong defense strategy, we may also advise using expert witness testimony from experts such as Loehrs Forensics.
Our attorneys will provide you with guidance and support throughout your case. You can receive a free consultation when you contact our office at (850) 681-7777 or leave us a message on our website.
Written by Karissa Key