About 5 weeks ago CNN highlighted a study on how we share misinformation. The study is found here and says exactly what you think it says. Thanks to our internal narratives, the power of echo chambers, and our unprecedented ability to share information, we can turn rumors into beliefs "which, once adopted, are rarely corrected."
As if to drive home the point, CNN's Fareed Zakaria was sucked into this exact process and the result wasn't pretty. It started with a piece of "satire" on thepeoplescube.com:
I'm not sure you'd call an article with the headline "CNN host Fareed Zakaria calls for jihad rape of white women" satire, but I suppose you could argue that the site is so obviously over the top that no one should treat it as the source for a breaking scandal. But the cycle didn't stop there. Sites like conservativepost.com and usanewsflash.com picked up the story and repeated it as news.
(Note the 58,019 shares of this article; yikes!)
As the comments of the article show, people reading the post assumed it was true:
Of course, Twitter had to get in on the fun, too. On Twitter, an individual's Tweet carries more or less the same weight as any institution's. So when Nathan Pole offers up a clever Tweet on the topic, it looks totally legit:
@CNN host Fareed Zakaria calls for #jihad rape of white women in #USA! https://t.co/rM9eF7xvqZ #firehim #racist #evilman
— Nathan pole (@orangeontop) February 3, 2016
Naturally, I'm appalled by all of this. So, how do we fix it? Well, that's where things get hazy for me. Consider where we are.
First, we have sites dedicated to debunking rumors, ranking articles by crowd consensus, and ferreting out facts; people choose to ignore them or, worse yet, discount them as untrustworthy.
Second, there are the echo chambers. The above study captures the dark side of this phenomenon well:
Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.
And while the online world has made these "communities of interest" more robust than ever, it has also made them far more accessible. From Home Schooling to Men's Rights to Children of Deaf Adults, it takes only a few clicks to jump into communities you would normally have no access to. Simply put, stepping out of our echo chambers and into our neighbor's has never been easier.
And finally, the same technology that flattens the hierarchy and lets bad ideas spread with ease can be used to spread good ones. Shira's been following the case of Adnan Syed, a man who may get a new trial on his murder conviction thanks to a podcast. Adnan's advocates leverage Twitter and my new favorite toy, Periscope, to broadcast the latest developments in the case on the spot. The whole arrangement is a master class in using technology to effect change.
So how do we keep fiction from turning into fact, and those "facts" from becoming the basis of our decision making? At the end of the day, I've got no idea. But I know the problem is real, and that any solution could do just as much harm as good.
How would you address this conundrum?