This article, relating to Irish law, was written by the team in our Dublin office.
Ireland's Coimisiún na Meán (the "Media Commission") has announced its new Online Safety Code (the "Code"). The Code ushers in a new, more intensely regulated era for online platforms. To date, large online platforms have largely acted of their own accord, establishing trust and safety teams that seek to maintain a safe experience for social media users ("Users"). Various European Union regulations now require Ireland, and the online platforms that do business here, to establish and conform to new safety standards.
The main legislative push for the Code is the Audiovisual Media Services Directive ("AVMSD"), a significant EU Directive which overhauled how video and broadcasting services are regulated in the EU. Ireland transposed that Directive through the Online Safety and Media Regulation Act 2022 (the "Act"), the same legislation which established the Media Commission. The Media Commission has quickly established itself and has already begun substantive regulation, including own-initiative regulatory sweeps of online platforms.
The Code
The Code has a clearly defined scope: it applies to video-sharing platform services which are "under the jurisdiction of the State". This is a legal concept which is discussed in the Act, but it can be summarised as covering video-sharing platforms ("Platforms") that have their EU headquarters in Ireland. The Media Commission has designated ten services as Platforms that will be covered by the Code: Facebook, Instagram, YouTube, Udemy, TikTok, LinkedIn, X, Tumblr, Pinterest and Reddit (although the Media Commission has not yet reached a final designation decision in respect of Reddit).
Snapchat is not included in the above list, despite its strong presence online and popularity among young people, because its EU headquarters are not in Ireland. However, the Media Commission will work closely with its regulatory counterparts in other EU Member States to hold platforms like Snapchat accountable for how they plan to keep younger Users safe on their services.
The Code was expected to be a much longer and more detailed document. However, the Media Commission has a large and sophisticated mandate across multiple pieces of legislation, which may prompt further codes and regulatory guidance.
Code implementation
General obligations will apply from next month, including obligations relating to video content that may impair the physical, mental or moral development of minors. There will also be a nine-month implementation period for more detailed provisions, such as those addressing cyberbullying, the promotion of self-harm, suicide, eating disorders or dangerous challenges, as well as access to pornography or extreme gratuitous violence. The Code will be legally binding, and companies could face fines of up to €20 million or 10% of the Platform's annual turnover.
The Code and recommender systems
Recommender systems are also commonly known as "algorithms": they determine what Users see on the relevant Platforms based on their likes, interests and personal profile. That personal profile might include, for example, their search history, age and location. Such practices have been the source of scrutiny from a data protection perspective, and the Media Commission must now also regulate recommender systems from a content perspective. Importantly, the Code will not deal directly with recommender systems. The basis for how the Media Commission will regulate recommender systems is the Digital Services Act ("DSA"), a separate but similar piece of content regulation legislation.
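For readers less familiar with the technology being regulated, the short Python sketch below illustrates, in very general terms, how a content-based recommender of the kind described above might rank videos against a user profile built from likes, search history, age and location. It is a simplified illustration only: the profile fields, weights and example catalogue are assumptions made for explanation, and do not describe any designated Platform's actual system.

```python
# Hypothetical sketch of a content-based recommender: videos are scored
# against a user profile built from signals such as likes, search history,
# age and location. All fields, weights and data here are illustrative.

from dataclasses import dataclass, field


@dataclass
class UserProfile:
    liked_topics: set[str]   # topics inferred from the user's likes
    search_terms: set[str]   # recent search history
    age: int
    country: str


@dataclass
class Video:
    title: str
    topics: set[str]
    min_age: int = 0                                   # age gate, if any
    country_whitelist: set[str] = field(default_factory=set)  # empty = all


def score(video: Video, user: UserProfile) -> float:
    """Score a video for a user by topic overlap with likes and searches."""
    if user.age < video.min_age:
        return 0.0                                     # never surface age-gated content
    if video.country_whitelist and user.country not in video.country_whitelist:
        return 0.0
    like_overlap = len(video.topics & user.liked_topics)
    search_overlap = len(video.topics & user.search_terms)
    return 2.0 * like_overlap + 1.0 * search_overlap   # weights are arbitrary


def recommend(videos: list[Video], user: UserProfile, k: int = 3) -> list[Video]:
    """Return the top-k positively scored videos for this user."""
    ranked = sorted(videos, key=lambda v: score(v, user), reverse=True)
    return [v for v in ranked if score(v, user) > 0][:k]


if __name__ == "__main__":
    user = UserProfile(liked_topics={"football", "cooking"},
                       search_terms={"cooking", "travel"},
                       age=15, country="IE")
    catalogue = [
        Video("Five-minute pasta", {"cooking"}),
        Video("Match highlights", {"football"}),
        Video("Late-night casino tips", {"gambling"}, min_age=18),
    ]
    for video in recommend(catalogue, user):
        print(video.title)
```

Even this toy example shows why regulators focus on the profile signals feeding the ranking: changing what the system knows about a User directly changes what content is pushed to them.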
Concluding thoughts
By establishing clear obligations and emphasising user safety, the Media Commission aims to create a safer online environment, particularly for younger Users. As Platforms adapt to these new standards, the focus on accountability and protection will play a crucial role in shaping the future of digital media in Ireland.