The arrest of Eliot Cutler last week on charges of possessing child exploitation material began with the work of an unusual nonprofit that each year forwards millions of tips to law enforcement that form the basis for thousands of prosecutions in the United States and abroad.

In Cutler’s case, the investigation began with an automated flag on a single image that authorities say they traced back to his computers.


Since 1998, the National Center for Missing and Exploited Children has played a unique role in helping law enforcement track down and prosecute people who access, download or distribute online images and videos depicting child sexual abuse.

By federal law, every technology company that discovers illegal material on its platform has an obligation to report it to the center’s staff – and over the years, advancements in search technology have made the identification of child exploitation material a highly automated process.

Last year, 29.1 million tips flooded into the center from Google, Microsoft, Facebook, Amazon, Snapchat, Kik and Cloudflare, along with scores of other lesser-known technology providers. Those tips made up the vast majority of the 29.3 million tips the center received; roughly 200,000 came from the public or other sources.

The overall number of tips the center receives has been steadily rising, with 16.9 million tips in 2019 and 21.75 million in 2020.


In over 99 percent of those cases, the reported information relates to depictions of child sexual abuse. Last year, about 6.5 percent of the tips the center received, about 1.9 million total, were traced back to internet users in the United States.

Tip in hand, the center’s analysts work to identify the physical location of the internet user who uploaded or downloaded the material. They then forward their findings to one of 61 federally designated police groups that investigate the cases. Maine’s is the computer crimes unit of the Maine State Police.

Located at the Maine Criminal Justice Academy in Vassalboro, the computer crimes unit is a team of about two dozen investigators and analysts.

Less than 10 years ago, cyber tips from the national center were not a leading source of cases; investigators instead focused on monitoring peer-to-peer file-sharing services, said acting Lt. Tom Pickering, the unit’s leader.

“They did very little if any cyber tip investigations then,” Pickering wrote in an email. “Today it’s the complete opposite.”

Now, dozens of cyber tips are sent to the unit each day. On Monday, the unit received 71 reports from the national center, said Shannon Moss, a spokesperson for the agency.


So far this year, the computer crimes unit has received 592 tips – and between 2018 and 2021, the number received annually more than doubled, from 551 to 1,237.

While some child exploitation material is passed around and bought or sold on the dark web, where internet companies and law enforcement have a harder time tracking users’ activity, all of the national center’s tips relate to the open web that is accessible to, and used by, billions every day. While most people use Facebook to sell a car or keep in touch with old friends or rally around a political cause, in millions of cases, the social network is a conduit for offenders to trade and obtain child sex abuse material.


Analysts at the national center are responsible for filling in gaps in the information that some companies do not hand over automatically. While the law requires that companies report the abusive material, federal statute leaves it up to the internet providers to determine what types of information to provide about the individuals they believe are responsible for the content.

“It depends on the company in terms of what they’re comfortable providing, but some companies will provide whatever identifiers that their users are using on their platform, so that could be a screen name, it could be an IP address,” said Lindsey Olson, the executive director for the center’s exploited children division. “There could be some historical information like dates and times.”

In one case that led to a Kentucky man’s prosecution for possession of child exploitation material, the national center passed on his IP address, which revealed the state, city, and the latitude and longitude coordinates of his neighborhood, along with his internet service provider, according to an excerpt of the report law enforcement received in that case that was published last year in the Berkeley Journal of Criminal Law.


In addition to location information for the suspected perpetrators, the center also hands over the offending material itself. Local police then must review the files and determine if they are illegal.

How technology companies locate and flag suspected illegal material is a process that relies on fast computers analyzing every file uploaded to or downloaded from their services and comparing each one against a massive list of known child exploitation material maintained by the national center and distributed to the companies that partner with it.

Because there are millions of known child exploitation files and billions more user files that need to be analyzed, computer scientists have developed shortcuts to speed up the process.


The first step is to take a “digital fingerprint” of every file that passes through a tech company’s servers. This fingerprint, known as a hash, is a string of numbers unique to that digital file. In the case of a photograph, changing even a single pixel would change the digital fingerprint. But feed two identical digital photographs into the same system, and the digital fingerprints generated would be a perfect match.
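The fingerprinting step can be sketched in a few lines of Python using the standard library’s hashlib. SHA-256 stands in here for whichever hash function a given company actually uses, and the short byte strings are hypothetical stand-ins for image data:

```python
import hashlib

# Hypothetical stand-ins for image files: a real photo would be JPEG or
# PNG bytes, but the principle is the same for any digital file.
original = bytes([10, 20, 30, 40, 50])
altered = bytes([10, 20, 31, 40, 50])  # a single byte ("pixel") changed

fp_original = hashlib.sha256(original).hexdigest()
fp_copy = hashlib.sha256(bytes([10, 20, 30, 40, 50])).hexdigest()
fp_altered = hashlib.sha256(altered).hexdigest()

print(fp_original == fp_copy)     # True: identical files, identical fingerprints
print(fp_original == fp_altered)  # False: one changed byte alters the whole hash
```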

The database of known child exploitation material is also reduced to unique hash values. If a file uploaded by an internet user generates the exact same hash as an entry in the national center database, the technology company flags the user and sends information to the center.
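The matching step itself is a simple set-membership test. A minimal sketch, assuming a hypothetical set of known hashes (the real database, supplied by the center, holds millions of entries):

```python
import hashlib

# Hypothetical known-hash database; in practice these values come from
# the national center's list of previously identified material.
known_hashes = {
    hashlib.sha256(b"known-file-1").hexdigest(),
    hashlib.sha256(b"known-file-2").hexdigest(),
}

def check_upload(file_bytes: bytes) -> bool:
    """Return True if the upload's fingerprint matches a known entry."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

print(check_upload(b"known-file-1"))    # True: exact match, the user is flagged
print(check_upload(b"vacation-photo"))  # False: no match, nothing happens
```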


Some algorithms can even detect close matches, in which all or part of a file appears similar enough to warrant a secondary review by a human analyst. Every time someone uploads or sends a file, Google converts it into a hash and compares it against the known database of illegal content. Most internet users will never realize the process is taking place, let alone be confronted by law enforcement.
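Close matching relies on perceptual hashes, which, unlike exact cryptographic fingerprints, change only a little when an image changes a little. A toy sketch of the idea, with invented eight-value “images” standing in for real downscaled photos (production systems such as PhotoDNA are far more sophisticated):

```python
# Toy perceptual "average hash": each bit records whether a pixel is
# brighter than the image's average brightness. Near-duplicate images
# differ in only a few bits, measured by Hamming distance.

def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    return tuple(1 if p > avg else 0 for p in pixels)

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

img_a = [10, 200, 30, 180, 90, 220, 15, 170]  # "original" image
img_b = [12, 198, 33, 181, 88, 221, 15, 169]  # slightly re-encoded copy
img_c = [200, 10, 180, 30, 220, 90, 170, 15]  # unrelated image

ha, hb, hc = average_hash(img_a), average_hash(img_b), average_hash(img_c)
print(hamming(ha, hb))  # 0 bits differ: close enough for human review
print(hamming(ha, hc))  # 8 bits differ: not a match
```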

The system is effective because the hashes, even at dozens of characters long, are far easier to compare than the raw files they represent, so data specialists can rapidly process huge volumes of information.

“One of the easiest things for any computer to do is answer a yes-no question: Does this match, yes or no?” said a spokesperson for D.Eno Forensics PLLC, a private digital forensics laboratory based in the United States, who requested anonymity out of fear of reprisals from opponents of clients. “In the database world, it’s the easiest thing in the world to do.”

It’s also how law enforcement investigators quickly search through someone’s files after they seize a cellphone, computer or hard drive, the spokesperson said. Police use software that copies a mirror image of the data found on a suspect’s device and converts it into the same type of hash values.

In the case of Cutler, although police seized terabytes of information, doing a “first pass” to sort through it using hashed values could take a matter of hours, not days or weeks, the spokesperson said. That could be how investigators quickly identified the 10 images allegedly in Cutler’s possession that generated the four felony counts he faces now.
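Such a first pass amounts to walking the mirrored file tree, hashing each file and checking it against the known list. A simplified sketch, with a hypothetical directory and known-hash set (real forensic tools hash large files in chunks rather than reading them whole):

```python
import hashlib
import os
import tempfile

def first_pass(root, known_hashes):
    """Hash every file under root and return paths matching known values."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:  # real tools hash in chunks
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in known_hashes:
                flagged.append(path)
    return flagged

# Demo on a throwaway directory with one "known" file and one benign file.
root = tempfile.mkdtemp()
with open(os.path.join(root, "a.bin"), "wb") as f:
    f.write(b"known-material")
with open(os.path.join(root, "b.bin"), "wb") as f:
    f.write(b"family-photo")

known = {hashlib.sha256(b"known-material").hexdigest()}
print([os.path.basename(p) for p in first_pass(root, known)])  # ['a.bin']
```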

“It’s low-hanging fruit,” the spokesperson said.



Developing ways to process the information rapidly is essential because the volume of data moving around the internet is astronomical. Last year alone, the national center received 84 million unique files that technology companies flagged as illegal, and the mountain of data it receives has been growing in recent years. Last year, Facebook alone sent the center over 22 million reports flagging potentially illegal material.

“This type of content, the child sex abuse material, it lives out on the internet for years and years and years,” said Olson, the center’s exploited children division executive director. “There have been some images that can circulate for a few decades. It really wears on these children who are now adults, to know that their content is still out there. So we do a lot of work with survivors and really try to get that type of content removed.”

Using hashes to quickly ferret out unwanted information is not unique to child exploitation material. It’s the same type of technology that Twitter, Facebook, Reddit and other popular platforms have used to automatically identify and take down other offensive or disturbing images and videos, including material put out by terrorist groups that depicts extreme violence or death or recruitment videos designed to draw people to their causes.

The national center maintains the database of known illegal material and the corresponding hash values, and offers the information up to tech companies, but there is no law requiring them to use it or deploy the match-search process at all, although there is an incentive for them to keep their platforms clean of illegal information as a good business practice.

In 2009, Microsoft developed an image-matching technology called PhotoDNA and gave it to the national center to use in tracking down child exploitation material. Since then, the technology has evolved to help identify individual children depicted in the photos and videos so law enforcement can determine if they are new victims – and then, in some cases, locate the people who abused them.

“Unfortunately there is new content that is created every day,” Olson said. “Technology allows for child sex abuse material to be created very easily. Think of how easily people have access to phones and storage and tablets and everything like that.”
