Ireland’s media regulator has opened an investigation into the social media platform X, formerly known as Twitter, over concerns about its content moderation practices and the lack of accessible appeal options for users. This is the regulator’s first investigation under its mandate to enforce the European Union’s Digital Services Act (DSA), which aims to bolster online safety across large platforms.
The probe focuses primarily on whether X is adhering to specific provisions of the DSA, which requires platforms to maintain effective measures for addressing and mitigating the spread of illegal and harmful content. If found in violation of these rules, a platform could face fines of up to 6% of its annual turnover.
The regulator’s inquiry was triggered by feedback from its supervision team, information provided by the non-governmental organization HateAid, and a user complaint. Concerns have been raised about X’s internal complaint-handling systems, which some users have found difficult to navigate, limiting their ability to effectively contest content moderation decisions.
This scrutiny comes as the EU intensifies its regulatory efforts aimed at large technology firms, pushing for greater accountability and user safety online. Similar rules have been introduced in other jurisdictions, including directives in India, where social media platforms have likewise come under scrutiny for their content moderation practices.
The investigation reflects an ongoing debate about the responsibilities of social media companies to maintain transparency and integrity, especially amid growing public concern over harmful online content. The hope is that such investigations and regulations will foster a more responsible digital environment, one that protects users and pushes platforms to prioritize accountability and user rights.
