False stories travel way faster than the truth
Twitter loves lies. A new study finds that false information on the social media network travels six times faster than the truth and reaches far more people.
And you can't blame bots. It's us, say the authors of the largest study of online misinformation.
Researchers at the Massachusetts Institute of Technology looked at more than 126,000 stories tweeted millions of times between 2006 and the end of 2016. This was before Donald Trump took office but during the combative presidential campaign. They found that "fake news" sped through Twitter "farther, faster, deeper and more broadly than the truth in all categories of information." That's according to the study in Thursday's issue of the journal Science.
"No matter how you slice it, falsity wins out,” said co-author Deb Roy, who runs MIT's Laboratory for Social Machines. She is a former chief media scientist at Twitter.
Twitter funded the study but had no say in the outcome, according to the researchers.
The scientists calculated that the average false story takes about 10 hours to reach 1,500 Twitter users, while the truth takes about 60 hours to reach the same number. On average, false information reaches 35 percent more people than true news.
True news stories almost never got retweeted to 1,000 people, but the top 1 percent of false ones reached as many as 100,000 people.
Researchers looked at how stories cascade — how they link from one person to another like a family tree. False information reached as many as 24 generations of retweets, while true information maxed out at a dozen.
Concern over bogus stories online has escalated in recent months due to evidence the Russians spread disinformation on social media during the 2016 presidential campaign.
Social media companies have experimented with using computer algorithms and human fact-checkers to try to weed out false information and abuse online. Twitter said earlier this month that it is seeking help from outside experts to better deal with the problem, and Facebook this week announced a partnership with The Associated Press to identify and debunk false and misleading stories about the midterm elections.
"We have witnessed abuse, harassment, troll armies, manipulation through bots and human-coordination, misinformation campaigns and increasingly divisive echo chambers,” a tweet by Twitter co-founder and CEO Jack Dorsey read. "We aren't proud of how people have taken advantage of our service, or our inability to address it fast enough."
The MIT study checked the 126,285 stories against six independent fact-checking sites: snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com and urbanlegends.about.com. The researchers used the sites to classify stories as true, false or mixed. Nearly two-thirds were false, just under one-fifth were true and the rest were mixed.
The six fact-checking websites agreed with each other on classification at least 95 percent of the time, and two outside researchers did some independent fact-checking to confirm the results, said co-author Sinan Aral, an MIT management professor.
Lead author Soroush Vosoughi, an MIT data scientist, said three false stories traveled the farthest and fastest: one about a Muslim guard called a hero in the Paris bombings of 2015; one about an Iraq war veteran finishing as runner-up to Caitlyn Jenner for an ESPN courage award; and one about an episode of "The Simpsons" that had a story line in 2000 about a Trump presidency. (It was in 2015.)
University of Pennsylvania communications professor Kathleen Hall Jamieson, a co-founder of factcheck.org, had problems with the way the study looked at true and false stories. The MIT team characterized a story's truth on a 1-to-5 scale, with 1 being completely false. Factcheck.org, Jamieson said, looks more at context and does not label something either true or false.
She also suggested that calling this bogus information "false stories" does not capture how malignant it is. She said it would "better be called viral deception."
The researchers looked at obvious bots — automated accounts — and took them out. While the bots tweeted false information at a higher rate than humans, it wasn't that much of a difference, and even without bots, lies still spread faster and farther, Roy said.
David Lazer, a political and computer scientist at Northeastern University, wasn't part of the study but wrote an accompanying report. He praised the MIT research but said the scientists may have missed many bots and cyborgs — accounts that are part human, part automated. His ongoing, not-yet-published research has found that about 80 percent of false stories come from just one-tenth of 1 percent of users.
The researchers dug deeper to find out what kind of false information travels fastest and farthest. False political stories — researchers didn't separate conservative from liberal — and stories that were surprising or anger-provoking spread faster than other types of lies, Aral said.
"Falsehood was significantly more novel than the truth," Aral said. "It's easy to be novel when you make things up."
That fits perfectly with previous research on the psychology of fake information, said Yale University's Dan Kahan and Dartmouth College's Brendan Nyhan, scientists who study the phenomenon.
"The more strange and more sensational the story sounds, the more likely they are going to retweet," Kahan said.
Nyhan and Lazer said that while more fact-checking and education of people on how to tell fake from real can be helpful, the more effective solution will have to come from the social media platforms themselves.