The study measured how much knowledge test subjects gained from a highlighted issue, as well as how much knowledge they thought they had.
A recent study by researchers at York College shows that users who read only article previews in their Facebook feed believe they know more about a topic than they actually do, compared with those who read the full article.
The investigation was first published on Feb. 4 in the peer-reviewed journal Research & Politics.
"Most social media users only have a passing engagement with posted news, exposure to political information on social media may simply create the illusion of political learning," the authors explain.
For the experiment, the researchers gathered 1,000 subjects and randomly assigned them to one of three groups. The first group read a full news article from The Washington Post about genetically modified (GM) foods; the second was shown a Facebook News Feed that featured only a preview of the same article; the third received no information at all. Afterward, all participants answered six factual questions to measure their knowledge of the topic and, to gauge their confidence, estimated how many questions they had answered correctly.
The results showed that those who read the full article knew the most about GM foods, compared with those who read only the preview or nothing at all. However, the findings also revealed that participants who read only the previews were overconfident in their knowledge: they believed they knew more about the subject than they actually did.
The investigation is especially relevant because about 68 percent of U.S. adults say they get news from social media, according to a 2018 Pew Research Center survey. "We believe that this has important implications for how people learn about politics," the authors added.
As fake news spreads across social media, the information people believe they know may also be distorted. The Cambridge Analytica scandal showed the world how much influence the content users see on Facebook can have on their political decisions.
A whistleblower from the company, Christopher Wylie, told the Observer that they “exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.”
This resulted in an inquiry by the U.K. House of Commons Digital, Culture, Media and Sport Committee. In one of its reports, MPs pointed to "alarming evidence" of interference by Cambridge Analytica in a South American country's 2015 presidential campaign.