The December 2025 CJEU ruling [in Russmedia Digital](https://curia.europa.eu/juris/document/document.jsf;jsessionid=8E8A425EA3E0DFBA53BD0C0147C7A2D4?text=&docid=306764&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=15411117) fundamentally changes how online platforms handle user-generated content containing personal data. If your business operates a marketplace, social network, dating platform, or any service where users can post content, this judgment creates immediate compliance obligations that cannot be ignored.
Russmedia Digital, which operates the Romanian classifieds platform publi24.ro, faced a lawsuit after an anonymous user posted a fake advertisement falsely claiming that a woman offered sexual services. The ad included her real photos and phone number. Russmedia removed the content within an hour of being notified, but by then it had already been copied to other websites.
The first instance court awarded EUR 7,000 in damages. The appellate court reversed this, ruling that Russmedia was merely a hosting provider under the E-Commerce Directive and bore no liability. The case reached the CJEU with a straightforward question: can a platform operator hide behind intermediary liability exemptions when GDPR violations occur?
The Court’s answer was unequivocal. No.
The CJEU established that platform operators are data controllers for personal information in user-generated content, even when they don’t create that content themselves. This isn’t about what you write – it’s about what you enable others to publish through your infrastructure.
What made Russmedia a controller? The platform’s terms of service granted it extensive rights to use, copy, distribute, transmit, publish, modify and transfer user content to partners, without requiring any justification. The Court determined that these powers constitute participation in determining the purposes and means of data processing for commercial benefit, not mere technical neutrality.
This matters because changing your terms of service probably won’t save you. The CJEU made clear that even without such broad contractual rights, a platform operator who knows or should know that users might publish special categories of data has obligations arising at the service design stage under Article 25(1) GDPR – data protection by design.

For advertisements containing special categories of personal data under Article 9(1) GDPR – information about sexual life, health, religious beliefs, political opinions, racial origin – platforms now face three mandatory requirements.
First, implement technical and organizational measures to identify such content before it goes live. This obligation exists from the moment you design your service. Automated detection systems must flag potential special categories of data based on text, images and context. The days of “publish first, moderate after complaints” are over for sensitive data.
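To make the first requirement concrete, here is a minimal sketch of a pre-publication screening step. It assumes a hypothetical keyword-based flagger; a production system would combine trained text and image classifiers, and every pattern and category label below is illustrative only, not a legal standard.

```python
import re

# Illustrative patterns hinting at Article 9(1) GDPR special categories.
# A real system would use trained text and image classifiers; these
# terms and category labels are hypothetical examples only.
SPECIAL_CATEGORY_PATTERNS = {
    "sexual_life": re.compile(r"\bsexual services?\b", re.IGNORECASE),
    "health": re.compile(r"\b(HIV|medical condition|diagnosis)\b", re.IGNORECASE),
    "religious_beliefs": re.compile(r"\b(christian|muslim|jewish)\b", re.IGNORECASE),
    "political_opinions": re.compile(r"\bparty member\b", re.IGNORECASE),
}

def flag_special_categories(ad_text: str) -> set[str]:
    """Return the special-category labels the ad text may touch.

    An empty set lets the ad follow the normal publication path;
    a non-empty set routes it to identity and consent checks
    before anything goes live.
    """
    return {
        label
        for label, pattern in SPECIAL_CATEGORY_PATTERNS.items()
        if pattern.search(ad_text)
    }

# Example: this ad is flagged and held before publication,
# not after a complaint arrives.
flags = flag_special_categories("Offering sexual services, call now")
if flags:
    print(f"Hold for pre-publication checks: {sorted(flags)}")
```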
Second, verify the advertiser’s identity and check whether they match the person whose data is being published. Anonymous posting becomes impossible when special categories of data are involved. You need to collect identity information and establish whether the person posting is the data subject or has explicit consent under Article 9(2)(a) GDPR.
Third, refuse publication where the advertiser isn’t the data subject and cannot demonstrate explicit consent or another Article 9(2) legal basis. You cannot publish when there’s a risk of violating the prohibition on processing special categories of data.
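Taken together, the second and third requirements form a gate that a flagged ad must pass before publication. The sketch below shows only the decision logic, with hypothetical `Advertiser` fields standing in for whatever identity-verification and consent-collection mechanisms a platform actually uses.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    PUBLISH = auto()
    REFUSE = auto()

@dataclass
class Advertiser:
    identity_verified: bool   # step two: no anonymous posting
    is_data_subject: bool     # advertiser is the person in the ad
    explicit_consent: bool    # Article 9(2)(a) consent demonstrated
    other_art9_basis: bool    # another Article 9(2) legal basis shown

def gate_special_category_ad(advertiser: Advertiser) -> Decision:
    """Decide whether an ad flagged as containing special-category
    data may be published. Illustrative logic only."""
    # Step two: identity must be verified for special-category data.
    if not advertiser.identity_verified:
        return Decision.REFUSE
    # Step three: refuse where the advertiser is not the data subject
    # and cannot demonstrate explicit consent or another Art. 9(2) basis.
    if advertiser.is_data_subject:
        return Decision.PUBLISH
    if advertiser.explicit_consent or advertiser.other_art9_basis:
        return Decision.PUBLISH
    return Decision.REFUSE

# Example: an anonymous poster publishing someone else's data is refused.
poster = Advertiser(identity_verified=False, is_data_subject=False,
                    explicit_consent=False, other_art9_basis=False)
assert gate_special_category_ad(poster) is Decision.REFUSE
```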
These requirements apply before content appears on your platform, not after users complain about it.