Introducing Lantern: Protecting Children Online

Protecting children online is one of the most important challenges facing the technology industry today. Meta wants young people to have safe, positive experiences online and has spent a decade developing tools and policies designed to protect them. As a result, the company finds and reports more child sexual abuse material to the National Center for Missing & Exploited Children (NCMEC) than any other service today.

Many in the industry recognize the need to work together to protect children and stop predators. Companies use technology such as Microsoft’s PhotoDNA and Meta’s PDQ to stop the spread of child sexual abuse material (CSAM) on the internet, but additional solutions are needed to stop predators from using different apps and websites to target children.
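
For context on how this family of hash-matching technology works: such tools compute a compact perceptual fingerprint of an image and compare it against fingerprints of known, previously verified CSAM, flagging near-duplicates whose fingerprints differ in only a few bits. The sketch below illustrates that comparison step in Python; the hash values, threshold, and function names are illustrative assumptions, not PhotoDNA or PDQ internals, and the hashes are shortened for readability.

```python
# Minimal sketch of perceptual-hash matching, the general technique behind
# tools like PhotoDNA and PDQ. The hash width, threshold, and sample values
# are illustrative assumptions, not details of either system.

def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count the bits that differ between two fixed-width hashes."""
    return bin(hash_a ^ hash_b).count("1")

def matches_known_hash(candidate: int, known_hashes: set[int],
                       max_distance: int = 31) -> bool:
    """Flag a near-duplicate: the candidate is within `max_distance`
    bits of some hash on the known-violation list."""
    return any(hamming_distance(candidate, known) <= max_distance
               for known in known_hashes)

# Usage with hypothetical hash values: the candidate differs from the
# known hash in a single bit, so it is flagged as a near-duplicate.
known = {0x8F3CA2D4517E09BB}
candidate = 0x8F3CA2D4517E09BF
print(matches_known_hash(candidate, known))  # True
```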

Predators don’t limit their attempts to harm children to individual platforms, Meta says. They use multiple apps and websites, adapting their tactics across them all to avoid detection. When a predator is discovered and removed from a site for breaking its rules, they may simply move to one of the many other apps or websites they use to target children.

As described in the Tech Coalition’s announcement today, Lantern enables technology companies to share a variety of signals about accounts and behaviors that violate their child safety policies. Lantern participants can use this information to conduct investigations on their own platforms and take action.
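
To make the idea concrete, a shared signal can be pictured as a small structured record: an indicator such as a URL or username, plus context about where it came from and which policy it violated. The announcement does not publish Lantern’s actual data model, so the sketch below is purely hypothetical; every field name and value is an assumption for illustration only.

```python
# Hypothetical illustration of a cross-platform safety signal. Lantern's
# real schema is not described in this announcement; all field names and
# values here are assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class SafetySignal:
    signal_type: str      # e.g. "url", "username", "content_hash"
    value: str            # the indicator being shared
    source_platform: str  # the company that contributed the signal
    policy_violated: str  # the child safety policy it breached
    shared_at: datetime   # when the signal entered the program

# A partner shares a URL it removed; other participants can then look
# for related activity on their own platforms.
signal = SafetySignal(
    signal_type="url",
    value="https://example.com/removed-content",
    source_platform="example-partner",
    policy_violated="child_safety",
    shared_at=datetime.now(timezone.utc),
)
```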

Meta was a founding member of Lantern. The company provided the Tech Coalition with the technical infrastructure behind the program and encouraged industry partners to adopt it. Meta manages and oversees the technology with the Tech Coalition, ensuring it is simple to use and gives partners the information they need to track down potential predators on their own platforms.

One example of Lantern’s value is an investigation Meta conducted using information provided by Lantern partner MEGA during the program’s pilot phase. MEGA shared URLs with Lantern that it had previously removed for violating its child safety policies. Meta’s specialist child safety team used this information to conduct a wider investigation into potentially violating behavior connected to those URLs on Meta’s platforms.

The team removed more than 10,000 violating Facebook profiles, pages and Instagram accounts in the course of the investigation and, in accordance with its legal obligations, reported them to NCMEC. Meta also shared details of the investigation back to Lantern, enabling participating companies to use the signals to conduct their own investigations.
