EU names three porn sites subject to its strictest online content rules
Age verification tech could be headed to adult content sites Pornhub, Stripchat and XVideos after the three were added to a list of platforms subject to the strictest level of regulation under the European Union’s Digital Services Act (DSA).
Back in April, the EU announced an initial list of 17 so-called very large online platforms (VLOPs) and two very large online search engines (VLOSEs) designated under the DSA. That first list did not include any adult content sites. The addition of the three platforms designated today changes that.
Per Wikipedia (which, ironically enough, was already named a VLOP in the first wave of Commission designations), XVideos and Pornhub are the most visited and second most visited adult content sites in the world respectively, while Stripchat is an adult webcam platform that livestreams nude performers.
Currently, none of the three services requires visitors to undergo a hard age check (i.e. age verification, not self-declaration) before accessing content, but that could change in the region as a result of the trio being designated VLOPs.
The pan-EU regulation puts a raft of extra requirements on designated (larger) platforms, meaning those with more than 45 million average monthly users in the region, including obligations to protect minors, as the EU notes in a press release today, writing: “VLOPs must design their services, including their interfaces, recommender systems, and terms and conditions, to address and prevent risks to the well-being of children. Mitigating measures to protect the rights of the child, and prevent minors from accessing pornographic content online, including with age verification tools.”
The Commission, which is responsible for overseeing VLOPs’ compliance with the DSA, also reiterated today that creating a safer online environment for children is an enforcement priority.
Other DSA obligations on VLOPs include documenting and analysing any “specific systemic risks” their services may pose with regard to the dissemination of illegal content and content threatening fundamental rights, with an obligation to produce risk assessment reports which, initially, must be shared with the Commission and later have to be made public.
They must also apply mitigation measures to address risks linked to the dissemination of illegal content online, such as child sexual abuse material (CSAM), and content affecting fundamental rights, such as the right to human dignity and private life in the case of non-consensual sharing of intimate material online or deepfake pornography.
“These measures can include adapting their terms and conditions, interfaces, moderation processes, or algorithms, among others,” the Commission notes.
The three adult platforms designated as VLOPs have four months to bring their services into compliance with the additional DSA requirements — meaning they have until late April to make any necessary changes, such as rolling out age verification tech.
“The Commission services will carefully monitor the compliance with the DSA obligations by these platforms, especially concerning the measures to protect minors from harmful content and to address the dissemination of illegal content,” the EU said, adding: “The Commission services are ready to closely engage with the newly designated platforms to ensure these are properly addressed.”
For now, there is a lack of clear guidance for platforms on how to comply with the DSA’s child protection provisions when it comes to age verification. But the EU intends for that to change as a code of conduct focused on age-appropriate design is developed.
“According to the DSA, all providers of online platforms must take appropriate and proportionate measures to ensure that their services ensure a high level of privacy, safety and security for minors,” a Commission spokesperson told us. “The DSA does not detail specific forms of age verification measures. It follows a risk-based approach in relation to age-assurance mechanisms. This means that some online platforms may be required to introduce age-assurance or verification mechanisms, depending on the level of risk these platforms might pose to minors who access them.”
“Designated Very Large Online Platforms, such as the three pornographic platforms now designated, are also required to assess systemic risks to children’s rights as well as risks to young people’s mental and physical wellbeing in annual risk assessment reports. On the basis of the assessed risks, VLOPs have to deploy effective mitigation measures against identified risks,” they added. “These measures need to be targeted to the systemic risks identified and can include, where needed to protect minors sufficiently, age assurance or verification measures and parental control tools.
“Compliance with these rules is subject to an independent audit to be performed yearly, and is monitored and enforced by the Commission according to the system envisaged by the DSA. Digital Services Coordinators and the Commission can step in and impose interim measures and sanctions on service providers that fail to comply with their obligations.”
The Commission flags, as a “key action”, the planned development of a voluntary code of conduct on age-appropriate design that it says is intended to build on the DSA framework — offering guidance for platforms figuring out how to comply. The EU’s executive is in charge of helping to establish the Code, via what the Commission describes as “an ad hoc special group involving industry, civil society and academia”.
“In accordance with the strategy, the Commission will support methods to prove age in a privacy-preserving and secure manner, to be recognised EU-wide,” said the spokesperson. “To that end, the Commission has created a Member State task force to work on age assurance, with a particular focus on age verification. It will also work with relevant stakeholders and European standardisation organisations to strengthen effective age assurance methods, as a priority, and issue a standardisation request for a European standard on online age verification.”
It will also be making an age verification toolkit available, via the Better Internet for Kids platform, to raise awareness of “existing effective and privacy-preserving methods of age verification, which will include an Age Verification self-assessment tool for digital service providers, and a child and/or family-friendly explanation of relevant solutions”.
The Commission also points to another in-train initiative, to develop a European Digital Identity framework, as a possible future tool for proving age online.
Zooming back out, the DSA also contains a set of general obligations which apply more broadly, to smaller digital services as well as to VLOPs. These include ensuring their systems are designed to provide a high level of privacy, safety and child protection, and promptly informing law enforcement authorities if they become aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person has taken place, is taking place or is likely to take place, including in the case of child sexual abuse. The compliance deadline for those requirements kicks in a little earlier, on February 17, 2024.
While the DSA applies across the EU and EEA (European Economic Area), this is a region that, post-Brexit, does not include the U.K. However, the U.K. government passed its own Online Safety Act (OSA) this fall, setting up telecoms regulator Ofcom as the country’s Internet content watchdog and introducing a regime of even tougher penalties for breaches than the EU has (OSA fines can reach up to 10% of global annual turnover vs up to 6% under the EU’s DSA).
The U.K. law also puts a strong emphasis on child protection. And recent Ofcom guidance for porn sites, aimed at helping them comply with a new legal duty to ensure minors do not encounter adult material online, states they must carry out “highly effective” age checks, further stipulating that such checks cannot be age gates that merely ask users to self-declare they are over 18.
Ofcom’s list of U.K.-approved age check tech includes asking porn site users to upload a copy of their passport to verify their age, show their face to their webcam to undergo an AI age assessment, or sign in via Open Banking to prove they’re not a minor, among other methods the regulator deems acceptable.
This report was updated with comment from the Commission regarding age verification tech.