<p>Algorithms controlling a social media user's feed, while largely opaque, may not be polarising society in the ways the public tends to think, social scientists say.</p>
<p>They have published studies in the journals <em>Nature</em> and <em>Science</em> examining social media's impact on individuals' political attitudes and behaviours during the 2020 US presidential election.</p>
<p>"The notion that such algorithms create political 'filter bubbles', foster polarisation, exacerbate existing social inequalities, and enable the spread of disinformation has become rooted in the public consciousness," write Andrew M Guess, lead author of one of the newly published studies, and colleagues, referring to the algorithms used by social media companies, which are opaque to users.</p>
<p>The <em>Nature</em> study found that exposing a Facebook user to content from "like-minded" sources, those sharing the user's political persuasion, did not measurably affect the user's political beliefs or attitudes during the 2020 US presidential election.</p>
<p>"These findings do not mean that there is no reason to be concerned about social media in general or Facebook in particular," said Brendan Nyhan, one of the four lead authors of the study.</p>
<p>Nyhan said that while there are many other concerns about the ways social media platforms could contribute to extremism, exposure to content from like-minded sources was likely not one of them.</p>
<p>"We need greater data transparency that enables further research into what's happening on social media platforms and its impacts," said Nyhan. "We hope our evidence serves as the first piece of the puzzle and not the last."</p>
<p>The studies published in <em>Science</em> address these questions: Does social media make us more polarised as a society, or merely reflect the divisions that already exist? Does it help people become better informed about politics, or less? And how does social media affect people's attitudes towards government and democracy?</p>
<p>Examining the effect of algorithmic feed-ranking systems on an individual's politics, Guess and team recruited participants through survey invitations placed at the top of their Facebook and Instagram feeds in August 2020 and divided them into treatment and control groups.</p>
<p>After three months, the researchers found no detectable changes in political attitudes in the treatment group, whose members engaged less with content on the platforms and saw more ideologically diverse content, compared with the control group, whose feeds were left unchanged.</p>
<p>A second study, also led by Guess, found that suppressing reshared content on Facebook substantially decreased the amount of political news users were exposed to but did not affect their political opinions. The researchers compared a control group, whose Facebook feeds were unchanged, with a treatment group whose feeds had reshared content removed.</p>
<p>Removing reshared content, which earlier work had linked to increases in both political polarisation and political knowledge, decreased users' clicks on partisan news links, the proportion of political news they saw, and their exposure to untrustworthy content.
However, the authors could not reliably detect shifts in users' political attitudes or behaviours, apart from reduced news knowledge in the treatment group.</p>
<p>"Though reshares may have been a powerful mechanism for directing users' attention and behaviour on Facebook during the 2020 election campaign," the authors conclude, "they had limited impact on politically relevant attitudes and offline behaviours."</p>
<p>In a third study, Sandra Gonzalez-Bailon and colleagues report that politically conservative users were much more segregated in their news consumption and encountered far more misinformation on the platform.</p>
<p>"Facebook… is substantially segregated ideologically - far more than previous research on internet news consumption based on browsing behaviour has found," write Gonzalez-Bailon and team.</p>
<p>They examined the flow of political content in a sample of 208 million Facebook users during the 2020 election: all the content users could potentially have seen; the content they actually saw in feeds curated by Facebook's algorithms; and the content they engaged with through clicks, reshares, or other reactions.</p>
<p>The authors found that, compared with liberals, politically conservative users were far more siloed in their news sources and were exposed to much more misinformation.</p>
<p>While there is vigorous ongoing debate about the internet's role in the political news people encounter, the news that shapes their beliefs, and thus in "ideological segregation", this study found that both Facebook's algorithms and users' own choices played a part in that segregation.</p>
<p>The segregation surfaced primarily in Facebook's Pages and Groups, areas policymakers could target to combat misinformation, rather than in content posted by friends, the authors said, noting this as an important direction for further research.</p>
<p>The findings are part of a broader research project examining the role of social media in US democracy. Known as the US 2020 Facebook and Instagram Election Study, the project provided social scientists with previously inaccessible social media data.</p>
<p>Seventeen academics from US colleges and universities teamed up with Meta, the parent company of Facebook, to conduct independent research on what people see on social media and how it affects them. To protect against conflicts of interest, the project built in several safeguards, including pre-registering the experiments. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions, a statement from one of the universities involved in the project said.</p>