Openprovider x Offlimits: The shared road to a cleaner internet

Author: Brendan Boyle
11/8/2024
Domain Security News
Openprovider is collaborating with Offlimits to help fight illegal content online.

Offlimits is a Netherlands-based organization that strives for a safer online world, with a special focus on preventing and combating online (child) sexual abuse. It contributes to a cleaner internet, helps victims of online transgressive behavior, and works on prevention.

The work of Offlimits is organized around three services: the CSAM (Child Sexual Abuse Material) Hotline, Helpwanted, and Stop it Now.

With 20 years in business, Openprovider is a domain registrar committed to providing a trusted digital identity for every business. The key objectives of our collaboration with Offlimits are:

  • Creating and increasing awareness within the web hosting community about the prevalence of child sexual abuse material (CSAM) online.
  • Outlining the risks around illegal content infiltration facing web hosters today.
  • Preparing professionals in the domain and web hosting spaces for a new era of artificial intelligence-driven child sexual abuse material. 

To kick off this collaboration, we recently sat down with Madeleine van der Bruggen, Deputy Director of Offlimits, to get a clear picture of the current situation around illegal online material.

Can you give us a brief background about Offlimits in terms of its foundation and the work that it does?

In 1995, while the internet was still in the early stages of its development, people working in the hosting space started an initiative to keep the internet clean of child sexual abuse material (CSAM). This initiative saw the establishment of a hotline where people could send a URL with possible CSAM. Hotline employees would then check whether the material was illegal and request that the hosting party remove the image(s).

Almost 30 years later, the main goal of the hotline continues to be freeing the internet of CSAM. We currently have six analysts checking the URLs we receive from members of the public and other hotlines.

Apart from the hotline, Offlimits now has two other teams. We have a helpline for people who are concerned about their (sexual) feelings towards minors: Stop it Now. It is a way to help prevent sexual abuse of children, both in-person and online. The third part of Offlimits is Helpwanted. This is a helpline that people can call when they face any kind of online transgressive behavior, such as sextortion, doxing, online bullying, or sharing material without consent.

Offlimits says it “stands for a secure infrastructure of the Internet, in which technical security must not be weakened.” Can you give us a general overview of the current situation around online CSAM in terms of volumes, where it is most commonly found, and the misconceptions about the world of CSAM among the general public?

In 2023, our hotline processed 255,563 reports and 535,802 URLs, of which 268,912 were illegal. This illegal content was most commonly found on image forums and image hosters.

There are some facts about CSAM you might find interesting or even surprising:

  • CSAM does not only exist on the dark web; it can also be found fairly easily on the clear web (the part of the internet indexed by mainstream search engines like Google).
  • The amount of CSAM that is hosted in a country does not necessarily align with the amount of child abuse that happens in that country. 
  • A lot of CSAM is self-generated, meaning the material is created by the child themselves (for example, via a webcam). Sometimes children are forced to do this; sometimes they are rewarded, for example with money in online games.
  • Some people might think that those who watch CSAM have romantic feelings for children or are “old men with long beards in shady basements.” That is mostly not the case: there is really no common profile of those who view these images. Stop it Now has found that approximately 50% of the people who contact us are men below the age of 26. These are men who have reached out to report risky porn viewing and an escalation in this behavior.

What can web hosters do to reduce the risk of CSAM infiltrating the websites they manage? 

An easy way for web hosters to make sure no known illegal material is uploaded to their sites or servers is to subscribe to the Hash Check Service. The Hash Check Service allows users to scan their data for known CSAM, resulting in both faster detection and removal of this illegal content.
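The interview does not go into the mechanics of the Hash Check Service, but the underlying idea, comparing digests of hosted files against a list of digests of known illegal material, can be sketched in a few lines of Python. The example below is a minimal illustration, not the service's actual implementation: it uses plain SHA-256 exact matching, and the hash-list filename and upload path are placeholders. Production services typically rely on perceptual hashes (such as PhotoDNA) that also catch resized or re-encoded copies, and real hash lists are distributed only to vetted parties.

```python
import hashlib
from pathlib import Path


def load_hash_list(path: str) -> set[str]:
    """Load a newline-separated list of hex digests of known illegal material."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_directory(root: str, known_hashes: set[str]) -> list[Path]:
    """Return every file under `root` whose digest appears in the known set."""
    return [
        p for p in Path(root).rglob("*")
        if p.is_file() and sha256_of_file(p) in known_hashes
    ]


if __name__ == "__main__":
    # Placeholder names: real hash lists come from a vetted service,
    # not a plain text file on disk.
    known = load_hash_list("known_csam_hashes.txt")
    for hit in scan_directory("/var/www/uploads", known):
        print(f"Match: {hit}")  # escalate to your abuse / NTD workflow
```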

When our analysts find illegal material hosted in the Netherlands, for example, our hotline will inform the company that hosts the website. We send a Notice & Takedown request (NTD) for the removal of the illegal images within 24 hours. If the website is hosted in another country, our hotline will pass on the information to a reporting center in that country. This is done via INHOPE, the international organization of reporting hotlines. 

The images are also placed in a database of found child sexual abuse material. This way, all criminal material ends up with the authorities. The police and Interpol use this so-called hash database to identify victims and perpetrators.

Are there specific legal implications for web hosters that fail to carry out a duty of care to ensure that such content is not circulated on their platform?

If the hosting party does not remove the material in time, the URL is sent to the Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material (ATKM). The ATKM can order that the URL be removed or made inaccessible and can enforce compliance by imposing penalties under Dutch legislation. We would encourage all Netherlands-based web hosters to find out more about these obligations.

According to one report from the United States, 77% of small businesses cited either insufficient understanding of AI or uncertainty regarding its benefits as the main reasons for not integrating the technology into their operations. How can professionals in the domain industry harness the power of artificial intelligence for good to counteract the very serious threat of CSAM?

As we outline in our 2023 Offlimits annual report, the rapid development of artificial intelligence is another worrying trend, because bad actors are using it to create new victims from existing footage or to generate even more disturbing imagery of the sexual abuse of minors.

On the other hand, of course, artificial intelligence can be extremely useful for detecting and preventing the dissemination of CSAM. It can assist in moderating uploaded images on platforms and websites. It can give warnings when an uploaded image potentially contains CSAM. 
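Offlimits does not prescribe specific tooling here, but as a rough sketch of how a hosting platform might act on such a warning, the hypothetical Python below wires a classifier into an upload flow. `classify_image` is a stand-in for whatever model or vendor API a host actually uses, and the review threshold is purely illustrative.

```python
from dataclasses import dataclass

# Illustrative threshold above which an upload is held for human review.
REVIEW_THRESHOLD = 0.8


@dataclass
class UploadDecision:
    accepted: bool
    reason: str


def classify_image(image_bytes: bytes) -> float:
    """Placeholder for a real classifier (vendor API or in-house model).

    Should return a risk score between 0.0 and 1.0.
    """
    raise NotImplementedError("plug in a real classification service here")


def handle_upload(image_bytes: bytes) -> UploadDecision:
    """Screen an uploaded image before it is published."""
    score = classify_image(image_bytes)
    if score >= REVIEW_THRESHOLD:
        # Quarantine and alert a human moderator; never auto-publish
        # content the model flags as high risk.
        return UploadDecision(False, f"held for review (score {score:.2f})")
    return UploadDecision(True, "published")
```

The key design choice in a flow like this is that a high score routes the upload to a human moderator rather than deciding its fate automatically: the classifier produces warnings, and people make the final call.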

But despite these very positive uses, we wish to emphasize that AI-generated deepfake porn, and even AI-generated CSAM, causes real problems and is in that way no different from ‘real’ material.


As Madeleine outlines above, this is serious stuff.

Openprovider will be bringing you more insights and updates from Offlimits in the coming weeks – to make sure you do not miss out on any updates, you can sign up to our monthly newsletter.

In the meantime, your business can support a cleaner internet by sharing this article across social media, using the hashtag #WeStandWithOfflimits. 

Together, we can help build a safer internet for younger generations.
