The European Union's Digital Services Act (DSA), designed to address algorithmic risks on social media platforms, is under scrutiny following a revealing analysis by Global Witness. The investigation found a significant far-right political bias in algorithmically sorted 'For You' feeds on platforms including TikTok and X, formerly known as Twitter, ahead of Germany's federal elections. Enacted to enhance transparency and enable research into systemic risks on major platforms, the DSA now faces implementation challenges, particularly around Article 40, which has not yet been fully activated because the legislative measures needed to give it effect are still pending.
Global Witness's analysis found an alarming skew in political content recommendations. On X, 64% of recommended political content favored the far-right AfD party, while TikTok showed an even higher bias at 78%. Meta's Instagram also leaned toward right-wing content, though less sharply, at 59%. These findings have prompted the European Commission to open investigations into all three platforms, even as the DSA regime, in force since August 2023, remains only partially implemented.
The DSA aims to create transparency and enable public interest research into democratic risks posed by social media algorithms. One of its key provisions, Article 40, is intended to grant vetted researchers access to non-public platform data. However, the necessary delegated act to implement this aspect has not yet been passed, delaying its effect. The Act also relies on self-reporting from platforms regarding risks, with enforcers tasked with receiving and reviewing these reports.
Ellen Judson from Global Witness voiced concerns over the lack of transparency in how these platforms' recommender systems operate.
“One of our main concerns is that we don’t really know why we were suggested the particular content that we were,” said Ellen Judson.
This opacity raises questions about how platforms such as TikTok and X determine the weighting and assessment of signals that might increase certain risks or biases.
“We found this evidence that suggests bias, but there’s still a lack of transparency from platforms about how their recommender systems work,” Judson added.
The intricate workings of these algorithms remain largely undisclosed, complicating efforts to understand potential unintended consequences.
“We know they use lots of different signals, but exactly how those signals are weighted, and how they are assessed for if they might be increasing certain risks or increasing bias, is not very transparent,” noted Judson.
Judson emphasized the importance of transparency in addressing these issues.
“I think the transparency point is really important,” she stated.
There is speculation about whether public figures such as Elon Musk may influence algorithmic changes, given his engagement with AfD-related content.
“We have seen Musk talking about the AfD and getting lots of engagement on his own posts about the AfD and the livestream [with Weidel] … [But] we don’t know if there’s actually been an algorithmic change that reflects that,” Judson remarked.
In response to Global Witness's findings, some social media companies dismissed the results as unrepresentative due to the limited number of test accounts used in the study.
“They said that it wasn’t representative of regular users because it was only a few test accounts,” Judson explained.
Despite these dismissals, civil society groups are watching closely for when Article 40's provisions will allow vetted researchers access to more comprehensive data.
“Civil society is watching like a hawk for when vetted researcher access becomes available,” Judson said.
Judson speculates that the observed biases may be unintended side effects of algorithms primarily designed to maximize user engagement.
“My best inference is that this is a kind of unintended side effect of algorithms which are based on driving engagement,” she suggested.
As the EU navigates these challenges, the effectiveness of the DSA in promoting transparency and accountability remains under scrutiny. The Commission's ongoing investigations into TikTok, X, and Instagram will be pivotal in determining whether these platforms can align with the DSA's objectives and mitigate algorithmic biases that threaten democratic processes.