Ena Bavčić, January 20, 2025
On January 7, Meta’s Mark Zuckerberg announced that the company would stop collaborating with fact-checkers in the US. Not stopping there, Zuckerberg characterised the EU’s efforts to regulate very large platforms as censorship. His announcement caused a wave of reactions across the globe.
Though shocking for many, Zuckerberg’s move was somewhat to be expected.
Of all the large platforms, Meta is arguably the largest. In moves that potentially breached US competition regulations, Zuckerberg gained control over some of the most popular social media platforms and apps: Facebook, Instagram and WhatsApp.
At its inception, Facebook became a synonym for the Internet in many cultures. Without dwelling on the further genesis of the Meta-owned platforms and their role in global revolutions and atrocities, Instagram and Facebook, along with YouTube, TikTok and X (formerly Twitter), currently represent the main sources of information for multiple generations.
Professional media in the EU and its neighbourhood rely strongly on these platforms to promote their content. In an address on 14 January 2025, a Reporters Without Borders representative said this reliance is as high as 70%. In the US, 54% of the audience reports consuming news from social media. The extent to which platforms affect public opinion is widely discussed in media, civil society and governmental circles. Some governments have gone as far as limiting or completely banning platforms like TikTok, believed to be promoting Chinese interests.
At the same time, US-based social media owners started endorsing far-right politicians. These opportunistic moves allow them to gain more influence. Musk taking up government posts and directly influencing the media landscape in the US after Trump’s win is the most blatant example. But all along, the algorithms designed by these platforms have been increasingly pushing far-right content.
Arguably, these companies do not entirely control the algorithms they have developed, since the algorithms evolve based on the information fed into them; even so, the lack of transparency they provide around them is excessive. BIRN’s Digital Rights Violations Annual Report provided a small insight into the issues the region faces when dealing with Meta’s algorithms. These range from amplifying hate speech in an already volatile region to shadowbanning professional media content mentioning the Srebrenica Genocide. These instances confirm what has been known for years: Facebook prioritises engagement over the protection of discriminated-against groups, and far-right content is one of the most reliable sources of engagement.
In September 2024, Serbia’s Ombudsperson announced that Meta had not properly notified citizens of three Western Balkan countries about how it processes their data, which it used for algorithm testing.
In such a context, media promoting the rights of vulnerable groups receive little to no attention, leaving already marginalised communities even more vulnerable. This is why the need to regulate Big Tech was not just an idea that Eurocrats came up with. Efforts to push for democratic, human rights-based accountability of Big Tech, and for mechanisms to enforce its due diligence, have been a long-standing goal of digital rights, human rights and free expression advocates across the globe since the harms of algorithmic systems first became apparent.
Among social media platforms, Meta stood out by at least showing some willingness to moderate content, collaborating with experts and NGOs as fact-checkers and through the Oversight Board.
So such a statement from Zuckerberg left a bad taste for many. Media, human rights organisations, institutions and governments joined in expressing their concerns. They all agree on one thing: the European Union is being tested, and now is the time to respond. But what exactly can we do?
A whole-of-society approach to tackling these threats remains crucial when designing such a response. Such an approach should be employed by European institutions, member states and other local governments alike.
The European Commission needs to ensure:
- The adequate implementation of the EMFA, specifically the provisions enabling professional media to promote their content, and better dialogue between them and social media platforms. This is crucial, as it offers a much-needed counterbalance in promoting the voices of objective and reliable journalism.
- The EU must continue to enforce already adopted regulations and push platforms to comply, ensuring adequate implementation of the Digital Services Act and the Digital Markets Act, as well as other acts. This includes both implementation at Commission level and adequate transposition into local legislation.
- The EU must address the rising issue of algorithms comprehensively. This means ensuring that the GDPR is applied to social media platforms with the same level of scrutiny as to small businesses. Moreover, the implementation of the EU AI Act must be rooted in human rights principles.
- The Commission should ensure that these instruments play an important role in the accession process, providing a basis for governments in accession countries to benefit from these regulations.
- The EU is urged to continue and deepen coordination with media and media organisations, digital rights organisations and other stakeholders involved in the oversight of the implementation of EU regulation.
- The EU must address the issue of corporate lobbying by limiting the space for unfair lobbying practices by Big Tech.
Local governments should prioritise:
- Strengthening the rule of law and applying it when tackling digital harms to free expression, including strengthening procedures concerning elections and hate speech. Independence of the judiciary is a must, as local courts should play a crucial role in deciding on bans and restrictions, ensuring that all decisions are based on human rights and rule-of-law principles.
- Transposing the EMFA and DSA transparently and in open dialogue with media and human rights experts and organisations. Regulatory agencies and other relevant bodies need to involve media and civil society at every step of the process to ensure that the transposition of these instruments is in line with defined standards and local needs.
- Keeping media, civil society and academia at the table when discussing all relevant policies with large platforms, and proactively promoting their expert knowledge and importance.
- Local Press Councils and media organisations can contribute to the work of regulatory agencies by providing information on media outlets and individual journalists coming from discriminated-against groups, as well as on those that follow ethical standards and whose voices need to be amplified.
Both the EU and local governments need to support and enable investigative journalism, as well as media and media organisations that promote the rights of vulnerable groups.
Media literacy campaigns need to move into full swing, and governments and the EU should take the lead in implementing them.
Media should invest more in building journalists’ capacities for fact-checking and outreach. They should also invest in community engagement, asking their communities to amplify their objective and reliable content.
General audiences should engage more with content, reporting harmful material and amplifying beneficial material. To positively affect algorithms, it is important to create community engagement around marginalised, minority issues by showing support: liking, clicking on and sharing minority media and activist content.