Critical thinking needed as news feeds develop

The endless and ugly political season has me thinking about how people go about getting their news. We’ve just learned, thanks to a report from the Pew Research Center, that more than 40 percent of Americans find news on Facebook. That’s probably not surprising given Facebook’s huge user base, but it draws attention to the “trending news” features the social media giant offers, and reminds us that these continue to be a cause of controversy.

Glance at your Facebook page and you’ll see trending topics in the top right corner. Back in May we learned through a site called Gizmodo that Facebook was using so-called “news curators” – people contracted by Facebook who chose the topics that would appear on the page. Facebook clearly had plans to become, among other things, a digital newspaper, but charges of bias led the company to fire its curators and use computer algorithms to select trending topics.

Unfortunately, that move hasn’t solved the problem. Facebook is getting a reputation for trending stories that, upon closer examination, turn out to be bogus, or in some cases link to “news” in the form of nothing more than press releases. The site personalizes trends for each user, but the appearance of phony stories on everything from the September 11 attacks to the employment status of Fox News anchor Megyn Kelly has tarnished its image.

The size of Facebook’s user base makes it a likely target for disinformation. The entire model of social media is to cluster people into categories, a fact made lamentably obvious when you see like-minded politicos swarming around a topic and echoing each other’s thoughts. These are personalized news feeds but they encourage an “us vs. them” mentality that discourages objectivity. All of this makes Facebook’s trending news stories an important thing to get right.

News algorithms are going to get much better, but right now we’re still learning what they can do, and Facebook is hardly alone in its travails. I use Google Now a lot on my smartphone, letting it learn my interests and send me stories I can use in my writing. I’m continually startled when, having expressed my interest in astrophysics, I occasionally receive lurid tales of alien cities uncovered by our Mars rovers and hostile UFOs buzzing the International Space Station.

Surely we can do better than this. Now I see that Google is going to add a fact-check feature to its Google News service (news.google.com), which pulls in stories from a wide variety of sources. The plan is to offer fact-based context with the help of a product called ClaimReview. Users will theoretically be able to check factual content in major stories as they break, drawing on labels that describe news content and allow Google to identify its sources.

We’ll see how this works, but as I read it, content creators will be the ones to add fact-check tags to their posts, so we may wind up with a two-tier news presentation (at least on Google News) in which posts that have no fact-check capability become less trusted than those that do. This puts the onus on Google to keep an eye on content creators to make sure that tags are not being used inaccurately. We should start seeing the process in action on Google News soon.
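The fact-check tags described above can be pictured as structured metadata attached to a story. ClaimReview is a schema.org markup type, and the sketch below follows that vocabulary, though the publisher, claim, and rating values are invented for illustration; how Google actually ingests and displays these labels is up to Google.

```python
import json

# Hypothetical sketch of a fact-check label like the ones the column describes.
# The field names follow the schema.org ClaimReview vocabulary; the specific
# publisher, claim, and rating below are invented for illustration.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "claimReviewed": "Example claim circulating in a trending story",  # invented
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},  # invented
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,      # low score: claim judged false (scale is publisher-defined)
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",
    },
}

# An aggregator could check for the presence of such a label to decide whether
# a story carries fact-check context at all -- the "two-tier" distinction above.
print(json.dumps(claim_review, indent=2))
```

Publishers typically embed this JSON-LD in a page's HTML inside a `<script type="application/ld+json">` tag, which is what lets a crawler associate the label with its source.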

In my view, the best fix for bogus information isn’t new algorithms but a habit of critical thinking on the part of readers, who must make intelligent choices in what they read and trust. The Internet demands a digital literacy that compels users to weigh news sources and follow up controversial claims with supporting material. This new literacy includes honing our search skills to become fact-checkers rather than true believers whenever dubious claims are presented.

But especially in election season, we live in an environment so fast-paced that bogus stories disappear into the next onslaught of material before they can be vetted. Yes, computer algorithms are going to get better. But the most important factor in getting accurate news is a skeptical reader who keeps asking questions, knowing how dangerous naive credulity can be.

Paul A. Gilster is the author of several books on technology. Reach him at gilster@mindspring.com.