Sharing News Left and Right

Yves here. This piece is interesting. Using Twitter as the basis of its investigation, it finds that “conservatives” are more persistent in sharing news when a platform tries to moderate its circulation. If you read the strategies Twitter used to try to prevent the spread of “disinformation,” they rely on the idea of nudges. Definition from Wikipedia:

Nudge theory is a concept in behavioral economics, decision-making, behavioral policy, social science, consumer behavior, and related behavioral sciences that proposes adaptive designs of the decision environment (choice architecture) as ways to influence the behavior and decision-making of groups or individuals. Nudging contrasts with other ways of achieving compliance, such as education, legislation or enforcement.

Nowhere does the article consider that these measures amount to a soft form of censorship. The fight against “false information” is treated as self-evidently proper.

By Daniel Ershov, Assistant Professor at the UCL School of Management, University College London, and Associate Researcher at the University of Toulouse; and Juan S. Morales, Associate Professor of Economics at Wilfrid Laurier University. Originally published at VoxEU

Ahead of the 2020 US presidential election, Twitter changed its user interface for sharing posts, hoping to slow the spread of misinformation. Using extensive data on tweets from US news outlets, this column examines how the change affected the diffusion of news on Twitter. While the policy significantly reduced news sharing overall, the decline varied ideologically: sharing fell more for left-leaning outlets than for right-leaning ones, as conservative users appeared less responsive to the intervention.

Social media provides an important access point to information on a variety of important topics, including politics and health (Aridor et al. 2024). While it reduces consumers’ search costs for information, social media’s power to amplify and propagate content can also contribute to the spread of misinformation and disinformation, hate speech, and out-group hostility (Giaccherini et al. 2024, Vosoughi et al. 2018, Müller and Schwarz 2023, Allcott and Gentzkow 2017); increase political polarization (Levy 2021); and encourage the rise of political extremism (Zhuravskaya et al. 2020). Reducing the spread and impact of harmful content is a key policy concern for governments around the world and an important aspect of platform governance. Since at least the 2016 presidential election, the US government has pressed platforms to reduce the spread of false or misleading information ahead of elections (Ortutay and Klepper 2020).

Top-Down Versus Bottom-Up Regulation

Important questions about how to achieve these goals remain unanswered. Broadly speaking, platforms can take one of two approaches to this problem: (1) they can pursue ‘top-down’ regulation by altering users’ access to, or the visibility of, different types of information; or (2) they can pursue ‘bottom-up’, user-centered regulation by changing user interface features to discourage users from sharing harmful content.

The advantage of the top-down approach is that it gives platforms more control. Before the 2020 election, Meta began adjusting user feeds so that users would see less of certain types of extreme political content (Bell 2020). Ahead of the 2022 US midterm elections, Meta rolled out new default settings for user news feeds that include less political content (Stepanov 2021). Although potentially effective, these approaches raise concerns about the extent to which platforms are empowered to directly manipulate information flows, and about potential bias against users holding particular political views. In addition, top-down interventions that are not transparent risk provoking user backlash and a loss of trust in platforms.

As an alternative, a bottom-up approach to reducing the spread of misinformation involves giving up some of this control and instead encouraging users to change their behavior (Guriev et al. 2023). For example, platforms can provide fact-checking of political posts, or warning labels for sensitive or controversial content (Ortutay 2021). In a series of online experiments, Guriev et al. (2023) show that warning labels and fact-checks on platforms reduce misinformation sharing by users. However, the effectiveness of this approach can be limited, and it requires substantial investment in fact-checking capacity.

Twitter’s User Interface Change in 2020

Another commonly proposed bottom-up approach is for platforms to slow the flow of information, and especially disinformation, by prompting users to consider more carefully the content they share. In October 2020, a few weeks before the US presidential election, Twitter changed the functionality of its ‘retweet’ button (Hatmaker 2020). The button was modified to prompt users toward ‘quote tweets’ when sharing posts. The hope was that this change would encourage users to reflect on the content they were sharing and slow the spread of misinformation.

In a recent paper (Ershov and Morales 2024), we investigate how Twitter’s change to its user interface affected the diffusion of news on the platform. Many news outlets and political organizations use Twitter to promote and discuss their content, so the change had the potential to be very disruptive to news diffusion. We collected Twitter data for popular US news outlets and examined what happened to their retweets shortly after the change was implemented. Our analysis reveals that this simple tweak to the retweet button had significant effects on news diffusion: on average, news outlets’ retweets fell by more than 15% (see Figure 1).

Figure 1 News sharing and Twitter user interface changes
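To make the empirical design concrete, here is a minimal sketch (not the authors’ code) of an interrupted time-series regression of the kind described above: daily retweets per outlet, an indicator for the post-change period, and outlet fixed effects. The file name, column names, and exact cutoff date are assumptions for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel: one row per outlet per day, with a 'retweets' count.
df = pd.read_csv("outlet_daily_tweets.csv", parse_dates=["date"])
df["post"] = (df["date"] >= "2020-10-20").astype(int)  # approximate date of the UI change
df["dow"] = df["date"].dt.dayofweek                    # day-of-week control
df["log_retweets"] = np.log1p(df["retweets"])

# Outlet fixed effects absorb level differences across outlets; the
# coefficient on 'post' is the average within-outlet log change in
# retweets after the switch. Standard errors clustered by outlet.
fit = smf.ols("log_retweets ~ post + C(outlet) + C(dow)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["outlet"]})
print(fit.params["post"])  # a value near -0.15 would mirror the ~15% drop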

Perhaps most interestingly, we then investigate whether the change affected all news outlets to the same extent. In particular, we first examine whether ‘low-factualness’ news sources (as classified by third-party organizations), where misinformation is most common, were especially affected by the change, as Twitter intended. Our analysis reveals that this was not the case: the impact on these outlets was no larger than on outlets with better journalistic standards; if anything, it was smaller. Furthermore, the same comparison reveals that left-leaning news outlets (again, as classified by third parties) were more affected than right-leaning outlets. The average drop in retweets for liberal outlets was around 20%, while the drop for conservative outlets was only 5% (Figure 2). These results suggest that Twitter’s intervention failed twice over: it did not reduce the spread of misinformation relative to factual news, and it reduced the spread of political news from one side of the spectrum relative to the other, which may exacerbate political divisions.

Figure 2 Heterogeneity by outlet factualness and slant
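Continuing the sketch above, under the same hypothetical schema, the heterogeneity comparison can be illustrated by interacting the post-change indicator with a third-party slant rating; the interaction term then measures how much smaller the drop is for conservative outlets. Again, this is illustrative, not the authors’ specification.

# 'slant' is a hypothetical third-party rating, positive = right-leaning.
df["conservative"] = (df["slant"] > 0).astype(int)

# The conservative main effect is absorbed by the outlet fixed effects,
# so only the interaction enters alongside 'post'.
fit = smf.ols("log_retweets ~ post + post:conservative + C(outlet) + C(dow)",
              data=df).fit(cov_type="cluster", cov_kwds={"groups": df["outlet"]})

# 'post' is the average drop for liberal outlets (about -20% in the paper);
# 'post:conservative' is the gap for conservative outlets, which the ~5%
# drop reported above would make positive.
print(fit.params[["post", "post:conservative"]])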

We investigate the mechanisms behind these results and rule out a battery of alternative explanations, including differences in outlet characteristics, outlets’ criticism of ‘big tech’, the differential presence of bots, and differences in tweet content such as sentiment or perceived slant. We conclude that a likely reason for the biased effect of the policy is that conservative users were less responsive to Twitter’s nudge. Using an additional dataset of individual users who share news on Twitter, we observe that after the change, conservative users adjusted their behavior less than liberal users – that is, conservatives appeared more likely to ignore Twitter’s prompt and continue sharing content as before. As further evidence for this mechanism, we show that similar results hold in a non-political context: tweets from NCAA college football teams located in predominantly Republican states were less affected by the user interface change than tweets from teams in predominantly Democratic states.

Finally, using web traffic data, we show that Twitter’s policy also affected visits to news outlets’ websites. After the retweet button change, traffic from Twitter to outlets’ websites fell, and disproportionately so for liberal outlets. These off-platform spillovers confirm the importance of social media for the overall diffusion of information, and highlight the potential consequences of platform policies for news consumption and public opinion.

Conclusion

Policy changes to social media platforms, whether top-down or bottom-up, must take into account the fact that new platform designs may affect different types of users very differently, which can lead to unintended consequences. Social scientists, social media platforms, and policymakers should collaborate to dissect and understand these nuanced effects, with the goal of improving platform design so as to encourage the informed and balanced discussion that a healthy democracy requires.

See original post for references

