IF YOU READ — and believed — a news item saying that a candidate was involved in child trafficking, would you vote for them?
Yet, this is exactly the kind of thing people are reading and believing on Facebook. I know of two people who believed something they read “on the Internet” about the Clinton Foundation being involved in child trafficking.
It didn’t take much for me to track this “news” to an anonymous comment on an obscure forum. But for a lot of people, Facebook is a trusted brand. They don’t think to question what they read there.
The child-trafficking story is just one example of fake news on Facebook. And before you pooh-pooh the influence of Facebook, have a look at the latest statistics from the Pew Research Center.
Almost 80 per cent of Americans on the Internet have a Facebook account. And about three-quarters of those with accounts check in every day. The figures for Canada and much of the world are likely similar.
That is the power and reach of Facebook, something the company has carefully cultivated over the years because it helps them sell things and make money, worthy goals for any business. But they can't then turn around and blithely ignore the fact that if they are influencing people's ideas about what to buy, they are also influencing their ideas about how to vote.
CEO Mark Zuckerberg is in denial:
Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea. Voters make decisions based on their lived experience.
Emily Bell, writing for the Columbia Journalism Review, has a perfect response:
Facebook founder Mark Zuckerberg posted a picture of himself in quarter-profile, with his toddler daughter Max, as they watched Donald Trump become president of the United States. The smiley emoji on his post says he is “hopeful.” “Holding Max, I thought about all the work ahead of us to create the world we want for our children,” he wrote.
The good news for Zuckerberg is that, unlike most people, he can make the world a better place almost immediately just by taking more responsibility for Facebook’s publishing policies. By acknowledging that Facebook can and should play a more active part in editing — yes, editing — its own platform, and hiring actual people to do so, Zuckerberg will further the civic commons as well as address a growing problem of how people perceive Facebook.
We used to worry about people being uninformed and apathetic. Now we have to worry about them being misinformed and taking action based on misinformation.
I fear we may reach a point where news becomes more disreputable than advertising. At least advertising has to adhere to certain minimum standards of truthfulness. The purveyors of fake news, on the other hand, can crank out all the junk they like and get away with it.
There is hope, though. Facebook did finally back down from allowing race-based advertising. In the end, they had to respond to an outcry from their users, because a tarnished reputation would be bad for business.
If Zuckerberg refuses to do anything about fake news, he risks Facebook eventually suffering the same fate as other platforms that have become abandoned cesspools.
I don’t expect people to leave Facebook in droves any time soon, but in the meantime there are things you can do.
First, understand that everything you read has a source. News articles, even fake ones, are written by someone who works for an organization of some kind. Check out the source. If it’s something you’ve never heard of, and if the news isn’t being carried by an organization you know to be reputable, then chances are it’s not real.
Second, if an item is obviously fake, don’t click on it. A lot of these stories are so outrageous that you can’t help but want to read more — if only for entertainment value. Resist the temptation, because Facebook’s computer algorithm will take your click as a sign you want more stories like that. Don’t encourage them.
Third, if you are concerned about fake news, then say so in a Facebook post. It might not seem like they are listening, but if enough users express a concern, they will have no choice but to respond.
According to Forbes, Facebook is the fifth most valuable brand in the world. That could easily change if enough of us no longer trust them.
Mark Rogers writes about media and technology at his newsonaut blog.