REPORT Act revolutionizes child protection: A deep dive into legislative and digital forensic innovations

Discover the pivotal role of the REPORT Act and AI technology in advancing digital forensics, expediting investigations and safeguarding children from online predators

By Police1 Staff

In an era where technology continues to evolve at breakneck speed, the protection of children from online exploitation has become increasingly complex. The introduction of the REPORT Act marks a significant step forward in this fight, mandating more stringent reporting requirements for online platforms and extending the duration they must preserve crucial evidence. In conjunction with this legislative push, companies like Cellebrite and organizations such as the National Center for Missing and Exploited Children (NCMEC) are leveraging advanced technologies, including AI and digital forensics, to aid law enforcement in identifying, tracking and prosecuting offenders.

To shed light on these developments, Police1 sat down with Heather Barnhart, a leading expert in digital forensics and a key player in Cellebrite’s collaboration with NCMEC. In this Q&A, Heather discusses the implications of the REPORT Act, the role of AI in detecting online child exploitation, and how these innovations are transforming the landscape of child protection. Join us as we explore the intersection of technology, legislation and law enforcement in safeguarding the most vulnerable members of our society.

Introduction to the REPORT Act

Police1: Could you start by giving us an overview of the REPORT Act and what changes it introduces to how social platforms and websites handle cases of child exploitation?

Heather Barnhart: The REPORT Act is a big win to help keep our kids safe. Under this bipartisan legislation, online platforms are now required to report internet crimes against children (ICAC), including child sex trafficking and online enticement or grooming, and the companies must hold onto that content for one year — allowing police sufficient time to review a case. Prior to this legislation, reporting was not required, and organizations only had to hold onto the content for 90 days. If online platforms fail to report the content, they face stiff penalties.

The REPORT Act also modernizes how law enforcement submits child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC). Agencies used to have to rely on traditional postal mail — now they can electronically submit the content in a secure environment. This will speed up the identification of victims and improve law enforcement’s ability to catch offenders. Notably, a child victim, their parents/guardians and attorneys can now directly report CSAM to NCMEC — accelerating both investigation time and a victim’s access to support and services.

Role of AI in detecting exploitation

Police1: How is artificial intelligence being used to enhance the detection of online sexual exploitation, and what are its limitations in current applications?

Heather Barnhart: Cellebrite is currently researching AI-generated images so we can help law enforcement more easily determine their origin from a forensic lens. We can already easily identify real images from the metadata within the file; if that expected data is missing, the image could be AI-generated. We are looking for easy ways law enforcement can identify file markers and characteristics that show an image was created by AI, such as file paths and the apps used before the content was created.
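To make the metadata idea concrete, here is a minimal, hypothetical Python sketch (using the Pillow library) of the kind of triage an examiner might script: it checks whether an image carries the capture metadata a smartphone camera would normally write. The file name and the specific tags checked are illustrative assumptions, and this is not a description of Cellebrite's tooling; missing metadata is only one signal, never proof.

```python
# Illustrative sketch only: absence of EXIF capture metadata is one weak signal
# that an image may not have come straight off a camera (it could also have been
# stripped by an app or a screenshot). Presence of metadata proves nothing either.
from PIL import Image
from PIL.ExifTags import TAGS

def exif_tags(path: str) -> dict:
    """Return human-readable EXIF tags found in the image, if any."""
    with Image.open(path) as img:
        exif = img.getexif()
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

def has_capture_metadata(path: str) -> bool:
    """Heuristic: does the file carry tags a phone camera usually writes?"""
    expected = {"Make", "Model", "DateTime"}  # typical capture-time tags
    return bool(expected & exif_tags(path).keys())

if __name__ == "__main__":
    sample = "evidence_image.jpg"  # hypothetical file path
    if has_capture_metadata(sample):
        print("Capture metadata present; consistent with a device-taken photo.")
    else:
        print("No capture metadata; flag for closer review.")
```

In practice an examiner would weigh this alongside the file-path and app-usage artifacts Barnhart describes, since metadata alone can be forged or stripped.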

The more data we feed AI, the better it can assist us. As our AI engine scans more images in investigations and learns variables like skin and age detection, it gets faster and more accurate. A lot of people don’t realize there is a database within their smartphone where they can search for a woman, child, baby, or teenager and find pictures in that age range. Personal devices are tracking this data, which is available in various databases, depending on the device. AI learns those databases and makes our digital forensic tools brilliant. With that said, a best practice is verification and spot-checking: if the AI misses an image, that could be the one image that gives a child their freedom.
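Barnhart's point about spot-checking is essentially a human-in-the-loop workflow. The short, hypothetical Python sketch below shows one common way to express it: classifications below a confidence threshold are queued for manual review rather than dismissed. The labels, scores and threshold are invented for illustration and do not describe any specific vendor's engine.

```python
# Hypothetical triage sketch: nothing below the threshold is auto-dismissed;
# it is routed to a human reviewer instead.
from dataclasses import dataclass

@dataclass
class Detection:
    file: str
    label: str          # e.g. "possible_csam" (illustrative label)
    confidence: float   # 0.0 to 1.0, as reported by some classifier

REVIEW_THRESHOLD = 0.85  # below this, a person looks at it

def triage(detections: list[Detection]) -> tuple[list[Detection], list[Detection]]:
    """Split results into high-confidence hits and items queued for spot-checking."""
    flagged = [d for d in detections if d.confidence >= REVIEW_THRESHOLD]
    needs_review = [d for d in detections if d.confidence < REVIEW_THRESHOLD]
    return flagged, needs_review

if __name__ == "__main__":
    results = [
        Detection("IMG_0001.jpg", "possible_csam", 0.97),
        Detection("IMG_0002.jpg", "possible_csam", 0.52),
    ]
    flagged, needs_review = triage(results)
    print(f"{len(flagged)} flagged, {len(needs_review)} queued for manual review")
```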

Impact of new legislation

Police1: In your view, how will the REPORT Act and the DEFIANCE Act change the landscape of child protection online? What specific provisions do you believe will make the most significant impact?

Heather Barnhart: Society is still coming to grips with the horrible ways AI can be used. It’s encouraging to see states passing legislation that holds offenders accountable for this explicit content — we saw this recently when two Florida teenagers were charged with allegedly generating fake CSAM of their classmates. The Department of Justice also recently charged a Wisconsin man with producing AI-generated CSAM — the first federal case of this crime type. Upcoming federal legislation like the DEFIANCE Act continues the momentum to combat AI-generated explicit deepfakes and makes it clear that a police officer simply saying, ‘I’m sorry this happened to you,’ is not good enough. Legislation ensures there are consequences, and bad actors badly need them. As AI continues to evolve, and with it the malicious use of the technology, we are working to keep pace with how to defend and protect the innocent who are being targeted. All this legislation, taken in totality, will make a difference.

Digital forensics in action

Police1: Can you describe how digital forensic techniques are employed in investigations of online child exploitation? How critical is digital forensics in these cases?

Heather Barnhart: NCMEC received more than 36 million reports of suspected child sexual exploitation to its CyberTipline, according to its 2023 report. Law enforcement agencies that receive these tips are buried in a chilling amount of child sexual abuse material, and they are under-resourced and overwhelmed by the scale of these investigations. Digital forensics has the power to help agencies quickly and efficiently extract and analyze digital evidence to track down both victims and perpetrators.

When I started my career as a computer forensic analyst in 2002, people treated their images like trophies. We called them baseball collections. Predators would boast about how much they had in their “set.” Now that images are primarily digital, it’s so much worse. There are forums predators join to make content requests. Digital content is endless, particularly with everyone always having a camera on them via their smartphone. The cloud makes it much easier to share images compared with mailing a USB thumb drive or a DVD. Without digital forensics, it is nearly impossible to catch and track these criminals, as their digital footprints are what often lead to an arrest.

About the National Center for Missing and Exploited Children (NCMEC) and Cellebrite collaboration

Police1: The REPORT Act strengthens the NCMEC’s CyberTipline. Can you talk about how your collaboration with NCMEC and Cellebrite helps expedite these investigations?

Heather Barnhart: Cellebrite is a longtime partner of NCMEC and recently reinforced this commitment through our “Operation Find Them All” (OFTA) initiative. This groundbreaking public commitment is with NCMEC, The Exodus Road — an international organization that fights human trafficking — and Raven, a U.S.-based non-profit focused on policy work that supports ICAC task forces. OFTA is designed to accelerate investigations of online crimes against children through investment and in-kind use of Cellebrite’s end-to-end digital investigative technology platform. We are focused on four pillars of action: finding the children, finding the perpetrators of the abuse, finding those paying for the abuse, and finding the images and videos and removing them from the internet.

All CyberTips go through NCMEC and are then dispatched to the 61 ICAC task forces or federal investigators, depending on who has jurisdiction. Those law enforcement agencies typically have Cellebrite solutions to help them advance the investigation. Through OFTA, if our technology is needed in a case where NCMEC or The Exodus Road is involved, law enforcement can leverage it to effectively help in cases involving missing, endangered, or exploited children. Our experts are also available any time NCMEC calls to help in critical cases they are working on with law enforcement.

Challenges in digital forensics

Police1: What are the most significant challenges you face in digital forensics when dealing with cases of online child exploitation? How do you overcome these challenges?

Heather Barnhart: One of the biggest hurdles law enforcement teams face when investigating cases of online child exploitation is encryption. When criminals become savvy about hiding evil through technology, it makes our jobs more challenging. Encrypted containers, entire encrypted drives and fake personalities or personas let criminals mask malicious behavior, making it harder for law enforcement to identify both victims and perpetrators. Another challenge agencies face is jurisdiction over cloud repositories. Say the provider is in London, but its cloud repository is in California. Do U.K. laws or U.S. laws apply? Can we even get this data? Where can we work the case? It is a struggle.

Case studies and insights

Police1: Could you share some insights or notable experiences from your work on high-profile cases like the Boston Marathon bombing or the Pulse Nightclub shooting that highlight the importance of digital forensics?

Heather Barnhart: Digital evidence is everywhere — think about the surveillance cameras that were so important in identifying the suspects in the Boston Marathon bombing case. A person’s devices — particularly their smartphone — paint a picture of who they are and how they interact with the outside world. Not only can the digital evidence on the devices help prosecutors build strong, solid cases, but it also helps get at the why — assisting in bringing victims’ families closure. No matter the crime type, criminals’ devices can tell us who they were talking to and help investigators understand their motivation. Law enforcement can see what they were researching, how they were building their bombs, or how they planned the attack. For the victims, it helps investigators understand where they were when the incident happened. And sometimes for families, it can provide a semblance of closure, knowing they were trying to reach out to loved ones.

Educational and preventive measures

Police1: Based on your extensive experience, what educational or preventive measures do you think are most effective in combating online sexual exploitation of children?

Heather Barnhart: Parents never think the unthinkable will happen to their child — until it does. Research shows open and honest conversations go much further than the “scared straight” ones. If your child comes to you and says, “I sent someone a naked photo of myself, they’re going to send it to everyone at school, what should I do?” — punishing the child will not help. Our kids live in a digital world where they will make mistakes, and as a parent, you want them to feel comfortable coming to you for anything. Imagine if they were being extorted for that image and it led them to such a dark place, they consider self-harm — that is a reality too many families are living with.

If your child has shared an image, you should report the image through NCMEC’s resource, Take It Down, a free service that helps remove and stop the sharing of explicit images on the internet. Do not delete anything from the device. If it rises to a situation where a bad actor gets hold of that image, law enforcement will need it for their investigation. Instead, place the device in airplane mode and turn it over to the police.

Future trends in digital forensics

Police1: Looking ahead, how do you see the field of digital forensics evolving, especially with advancements in technology and new legal requirements like the REPORT Act?

Heather Barnhart: Legislation is notoriously behind advances in technology. While lawmakers are moving more quickly than anticipated with legislation like the REPORT Act, they must move faster. AI will soon have a role in everything we do, and people need to be vigilant about fake content. Even now, what we see on social media through filters and touch-ups is not real life. The impact of AI will only grow, and people must be mindful of this as they interact in this modern world.

Personal motivation and advice

Police1: Finally, what drives your commitment to this field, and what advice would you give to upcoming professionals interested in digital forensics and eDiscovery?

Heather Barnhart: It’s important to find your passion and something that makes you feel complete. Whether we like it or not, we give so much of our life to work. If your work doesn’t make a difference, then why are you doing it? That’s how I see it every day. The work I do takes me away from family at times, yet it is contributing to a better future for my children and one day, their children. That’s what keeps me logging in and showing up every day.

About Heather Barnhart

Heather Barnhart is the Senior Director of Digital Intelligence at Cellebrite. She advises on strategic digital intelligence operations and educates DFIR professionals on the latest challenges in the space and how Cellebrite helps address them. For more than 19 years, Heather has worked on high-stress and high-profile cases, investigating everything from child exploitation to Osama bin Laden’s digital media. She has helped law enforcement, eDiscovery firms, and the federal government extract and manually decode artifacts used in solving investigations around the world. Heather is the co-author of Practical Mobile Forensics, currently a best seller from Packt Publishing, and serves as the DFIR Curriculum Lead, Senior Instructor, and author for FOR585: Smartphone Forensic Analysis In-Depth at the SANS Institute.

NEXT: In this episode of Police1’s Policing Matters podcast, host Jim Dudley speaks with Heather Barnhart, Senior Director of Community Engagement at Cellebrite, about the complexities law enforcement faces due to AI advancements.