Pornhub pulls out of France over age verification law


Diverging Reports Breakdown

Porn Platforms Pullout: Aylo’s Stand Against French Age Verification

Aylo, the company operating Pornhub, YouPorn, and RedTube, announced its decision to suspend access to these adult content platforms in France. This move comes in response to a French requirement for greater age verification measures on adult sites.


Clara Chappaz, France’s junior minister for artificial intelligence and digital technology, stated that Aylo is free to exit the country if unwilling to comply with the regulations. Arcom, France’s digital and audiovisual communication regulator, has the authority to block sites and impose fines if age verification is deemed insufficient.

Across the European Union, adult sites are facing increased scrutiny. Regulators announced investigations into several sites, including Pornhub, to ensure compliance with child protection rules, as concerns mount over minors’ access to such content.

(With inputs from agencies.)

Source: Devdiscourse.com | View original article

Teens exposed to weapons, violence & suicide content on social media

Kids exposed to social media posts about violence and suicide soon after joining platforms. A BBC experiment created fake profiles for teenagers living in the west of England and found they were shown the “worrying” posts within just minutes of scrolling on TikTok and YouTube. They were also guided to sexually suggestive content on Instagram. Online safety expert David Wright said: “While the findings are concerning they are, unfortunately, not surprising.” TikTok and Instagram said accounts used by children automatically have restrictions in place. YouTube said it recently expanded protections for teens on the site. Help and advice on the topics mentioned is available via the BBC Action Line.

Kids exposed to social media posts about violence and suicide

15 May 2025 | Harriet Robinson & Andy Howard, BBC News, West of England

Getty Images: The experiment created fake profiles for teenagers living in the west of England

A BBC investigation has found young teenagers are being exposed to content about weapons, bullying, murder and suicide soon after joining social media platforms. The project, which saw six fictional profiles set up as 13-15-year-olds, found they were shown the “worrying” posts within just minutes of scrolling on TikTok and YouTube. They were also guided to sexually suggestive content on Instagram. Online safety expert David Wright said: “While the findings are concerning they are, unfortunately, not surprising.” TikTok and Instagram said accounts used by children automatically have restrictions in place, while YouTube said it recently expanded protections for teens.


This article contains descriptions of online content that some might find upsetting. You can get access to help and advice on the topics mentioned via the BBC Action line.

Getty Images: Much of the content we scrolled through was not suitable for under-18s

We [BBC journalists Andy Howard and Harriet Robinson] set up fictional social media profiles: Sophie, Maya and Aisha on Instagram and TikTok, and Harry, Ash and Kai on YouTube and TikTok. We scrolled each profile for 10 minutes per day for a week. Amongst the endless posts about sport, gaming and beauty (some of the topics we initially searched for), there were others, seemingly unrelated to our profiles and never searched for, that felt more sinister. While Instagram proved less concerning overall, it did expose two of the children to sexualised content. You can see our methodology towards the bottom of this article. Here’s what we found:

TikTok: The fake profiles were sometimes exposed to content that referenced suicide

Sophie, 15, from Dursley

Her profile said she is a big fan of Taylor Swift and reads fantasy and romance novels. She loves cute animals and spending time with friends. The big theme with this profile was mental health, including several TikTok posts about young people who had taken their own lives after being bullied, showing their gravestones. In one 10-minute session Sophie was also exposed to two posts on TikTok where the person posting said they were suicidal, with other people making similar comments. There was some talk of self-harm, though it was mainly videos of celebrities speaking to fans with scars and urging them not to do it. Sophie also came across a post about abuse with people commenting underneath that they were also experiencing this, for example: “I have an abusive dad”, “me too”.

Aisha, 13, from Keynsham

This profile was shown the least disturbing content, but on the first day a search for “leopard print” on Instagram, one of Aisha’s interests, immediately led to scantily-clad women in suggestive poses. The For You page was still suggesting more of these types of images several days later, despite the term never being searched for again. While these were not explicit, they appeared likely to lead the user to more explicit content if clicked through. On day two, after just 10 minutes of scrolling altogether on TikTok, Aisha was shown a gaming video acting out a school shooting, with characters hiding in the bathroom and the sound of a gun being fired.

Maya, 15, from Swindon

Spends most of her time on TikTok, loves hauls and Get Ready With Me (GRWM) content. She enjoys gaming, especially Roblox, and is also a fan of artists like SZA and Sabrina Carpenter.

After 30 minutes of scrolling on TikTok, this profile was regularly exposed to upsetting graphic descriptions of real-life murder cases and attacks, with wording like “dragged from her house and beaten to death”. There was a video of a woman talking about the abuse a five-year-old girl suffered at the hands of her father, and another from a tabloid newspaper’s profile, describing a “brutal attack” on an autistic child. Maya was also shown videos about sick children and poor mental health. On one occasion, an animated video said “If it’s a blue, all your friends hate you”, followed by the symbol turning blue.

Following many of the most upsetting videos, TikTok offered clickable searches, like: “Actual raw footage video”, “Her last moment alive” and “Murder caught on camera”. There was also a video of a woman (who you can’t see) screaming and shouting swear words. On Instagram this user was shown some sexually-suggestive content.

TikTok: Knives were shown several times to one user

Ash, 15, from Bristol

Goes to the gym a lot, likes doing weights. Enjoys gaming, plays Destiny 2, Block Blast and Witcher. Loves music and is really into drumming.

Within 20 minutes of scrolling, a video appeared on YouTube reviewing different weapons and how they perform on the human body. “Worst Weapons” used manikins to show the effect of knives and bullets. This profile was by far the most concerning of the three male teenagers we created, being shown videos covering subjects like “how to hide a dead body” on TikTok, hiding drugs from police officers, and what looked like [or was meant to look like] real footage of a man hitting a woman on YouTube. From the off, the language this profile was exposed to was aggressive and full of swear words.

Towards the end of the week, this profile’s feed also seemed to feature more and more women either dancing or posing in suggestive ways on TikTok. They all had a similar appearance in terms of body shape and looks, with one post encouraging the profile to choose whether they preferred “the girl or the car” they were featured alongside.

Kai, 13, from Trowbridge

His favourite thing is gaming – he enjoys Fortnite, Roblox and Call of Duty. He is also into music like Snoop Dogg and Eminem, and writes songs himself. As a fictional gamer, Kai’s profile generated a lot of screengrabs from different well-known video games. In particular, the footage from the first-person shooter games was graphic and aggressive. Close-up stabbings, with blood spurting out of opponents, were a regular occurrence on YouTube. This profile also took us into the realm of mysterious audio stories about shocking subjects, including a mocked-up phone call on TikTok from a father telling his daughter to lock herself in her room because “your mum wants to kill you”. Scrolling this account also surfaced a TikTok tutorial video on “how to steal” the metal statue from the front of an expensive car.

Harry, 15, from Taunton

Of the three boys’ profiles, this account was probably the least concerning in terms of what it was exposed to, but it was still shown content that seemed to come out of the blue. There was another knife advert/review on YouTube, comparing blades of differing prices by how well they cut an orange. There was also an “anger level monitor”, where the user was encouraged to take a test to see how angry they were.

“Children deserve digital spaces where they can learn, connect and explore safely,” said David Wright

Mr Wright, from online safety and security organisation SWGfL, said the experiment echoed the concerns his charity had raised, that “children can be exposed to harmful content, even when no risky search terms are used”.

“It is worrying. The content you describe… presents serious risks to children’s mental health and wellbeing, and we have all too often seen the tragic consequences,” he added. “Exposure to such material can, in some cases, normalise harmful behaviours, lead to emotional distress, and significantly impair children’s ability to navigate the online world safely.”

Karl Hopwood, a member of the UK Council for Internet Safety, said that while demonstrating how sharply a knife can cut an orange is different from the more violent content, “it’s just easy for this to be taken out of context”.

“Adverts for knives would, as far as I know, breach community guidelines and I personally don’t think we should be showing that sort of stuff to the youngest users.

“The stuff around suicide/self-harm/depression isn’t great if you’re already vulnerable and feeling low – but for a lot of young people it may not be an issue at all,” he added.

Ofcom is now enforcing the UK’s Online Safety Act and has finalised a series of child safety rules which will come into force for social media, search and gaming apps and websites on 25 July.

Mr Wright said regulation was a “vital step”, but “it must be matched by transparency, enforcement, and meaningful safety-by-design practices”, including algorithms being subject to scrutiny and support for children, parents and educators to identify and respond to potential risks. “We must move beyond simply reacting to online harms and towards a proactive, systemic approach to child online safety,” he added.

TikTok: Some of the content reflected a negative state of wellbeing

A TikTok spokesperson said its teen accounts “start with the strongest safety and privacy settings by default”. They added: “We remove 98% of harmful content before it’s reported to us and limit access to age-inappropriate material. Parents can also use our Family Pairing tool to manage screen time, content filters, and more than 15 other settings to help keep their teens safe.”

Meta, which owns Instagram, did not provide a specific comment, but told us it also has teen accounts, which offer built-in restrictions and an “age-appropriate experience” for 13-15-year-olds. These restrictions automatically come into effect when the user inputs their date of birth while setting up the app. The company said that while no technology is perfect, the additional safeguards should help ensure teens only see content that is appropriate for them. Meta is bringing Teen Accounts to Facebook and Messenger later this month and adding more features.

A YouTube spokesperson said: “We take our responsibility to younger viewers very seriously, which is why we recently expanded protections for teens on YouTube, including new safeguards on content recommendations. We generally welcome research on our systems, but it’s difficult to draw broad conclusions based on these test accounts, which may not be consistent with the behaviour of real people.”

How the project worked

Each user profile was created on a different sim-free mobile phone, with location services turned off. We created a Gmail account for each of the users on each device, and then created the profiles – Instagram and TikTok for the girls, YouTube and TikTok for the boys. The profiles were different to each other and based on research from the Childwise Playground Buzz report, which gives insight into children’s interests, favourite brands and habits. We did a few basic searches and follows on the first day based on each user’s specific likes, including music, beauty, gaming and sport. From then on we mostly just scrolled and liked. We did not post, comment on or share any content throughout the experiment. We scrolled on each platform for each profile for 10 minutes per day for a week.

Source: Bbc.com | View original article

Two porn sites investigated for suspected age check failings

Two porn sites investigated for suspected age check failings. Ofcom announced in January that, in order to comply with the Online Safety Act, all websites on which pornographic material could be found must introduce “robust” age-checking techniques. Itai Tech Ltd and Score Internet Group LLC did not respond to its request for information or show they had plans to introduce age checks, it added.


9 May 2025 | Liv McMahon, Technology reporter


Ofcom has launched investigations into two pornographic websites it believes may be falling foul of the UK’s newly introduced child safety rules. The regulator said Itai Tech Ltd – which operates a so-called “nudifying” site – and Score Internet Group LLC had failed to detail how they were preventing children from accessing their platforms. Ofcom announced in January that, in order to comply with the Online Safety Act, all websites on which pornographic material could be found must introduce “robust” age-checking techniques from July. It said the two services it was investigating did not appear to have any effective age checking mechanisms. Firms found to be in breach of the Act face huge fines.

The regulator said on Friday that many services publishing their own porn content had, as required, provided details of “highly effective age assurance methods” they were planning to implement. It added that this “reassuringly” included some of the largest services that fall under the rules, and that a small number of services had blocked UK users entirely to prevent children accessing them. Itai Tech Ltd and Score Internet Group LLC, however, did not respond to its request for information or show they had plans to introduce age checks.

The “nudifying” technology featured on one of the companies’ platforms involves the use of artificial intelligence (AI) to create the impression of having removed a person’s clothing in an image or video. The Children’s Commissioner recently called on the government to introduce a total ban on such AI apps that could be used to create sexually explicit images of children.


Source: Bbc.com | View original article

Major porn platforms pull out of France over age check standoff

Pornhub, YouPorn and RedTube suspended in France from Wednesday. Move in protest against requirement for porn sites to verify users are 18 or older. France’s digital and audiovisual communication regulator can request that porn websites are blocked and levy fines if it deems their age verification systems are insufficient.

The firm behind Pornhub, YouPorn and RedTube is suspending access to the adult content platforms in France from Wednesday, in a move that media said was in protest against a requirement for porn sites to verify that their users are 18 or older.

“I can confirm that Aylo has made the difficult decision to suspend access to its user-uploaded platforms (P-rnhub, YouP-rn, RedTube) in France. We will be using our platforms to directly address the French public tomorrow,” a spokesperson for Pornhub said on Tuesday.

Aylo owns the affected platforms, as well as several other adult entertainment brands.

Arcom, France’s digital and audiovisual communication regulator, can request that porn websites are blocked and levy fines if it deems their age verification systems are insufficient.

“If Aylo would rather leave France than apply our laws, they are free to do so,” Clara Chappaz, France’s junior minister for artificial intelligence and digital technology, wrote in a post on social media platform X.

Arcom has said that 2.3 million minors access porn websites every month even though legally they should be blocked from them.

Adult content platforms have come under fire elsewhere in the European Union. EU regulators said last month that several sites, including Pornhub, would be investigated for failing to comply with rules to protect children.

Source: Cyprus-mail.com | View original article

France could pull the plug on Pornhub this summer

Under the law, France’s audiovisual and digital services regulator Arcom has the power to block porn sites if it deems they are not implementing age verification properly and in time. Starting Friday, those porn websites had to offer at least one secure age verification system.

Under the law, France’s audiovisual and digital services regulator Arcom has the power to block porn sites if it deems they are not implementing age verification properly and in time.

“The Minister for Digital Affairs Clara Chappaz has told us that the first site blocks could take place this summer,” a member of parliament familiar with the matter said, granted anonymity to disclose details of confidential conversations.

Pornhub’s owner Aylo is already challenging the law in court.

For the time being, only French and non-European pornographic sites can be blocked. Starting Friday, those porn websites had to offer at least one secure age verification system that checks visitors’ ages through things like a verified identity document or even a video selfie that estimates a user’s age.

That’s led to a rush from age verification software providers to offer their services to porn websites.

There has been “definite and last-minute interest” in verification tools in the run-up to Friday’s deadline in France, said Iain Corby, the president of AVPA, a large association of such software providers. But, he added, the association for now has spotted few actual deployments.

Source: Politico.eu | View original article

Source: https://news.google.com/rss/articles/CBMiWkFVX3lxTE50Zk93UzltT0d5bWtrejJsTDN1S1FLY05wVThWelJWNE02VHZHSWhMTEQ3REgtVF9VRXQyV0VHQlFma2NGeUxTQUZocXE3YnA2SnR6Qnlmc096UdIBX0FVX3lxTE9TMzBVemRFdHZ5V2tSM19GamFXaTNvNFRwMGoxb3laU3lYbkNPX0Z4M05nQmM2T1A0bGNvMUk4dEZ2YVNQXzlDRkVCeGFVTEhITmkyXzVVcDBvY3haQ0tz?oc=5
