Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly influential section, even though they were organically trending among the site’s users.
Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.

In other words, Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing—but it is in stark contrast to the [company’s claims](https://www.facebook.com/help/737806312958641) that the trending module simply lists “topics that have recently become popular on Facebook.”
These new allegations emerged after Gizmodo last week revealed details about the inner workings of Facebook’s trending news team—a small group of young journalists, primarily educated at Ivy League or private East Coast universities, who curate the “trending” module on the upper-right-hand corner of the site. As we reported last week, curators have access to a ranked list of trending topics surfaced by Facebook’s algorithm, which prioritizes the stories that should be shown to Facebook users in the trending section. The curators write headlines and summaries of each topic, and include links to news sites. The section, which launched in 2014, constitutes some of the most powerful real estate on the internet and helps dictate what news Facebook’s users—167 million in the US alone—are reading at any given moment.

“Depending on who was on shift, things would be blacklisted or trending,” said the former curator. This individual asked to remain anonymous, citing fear of retribution from the company. The former curator is politically conservative, one of a very small handful of curators with such views on the trending team. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”
The former curator was so troubled by the omissions that they kept a running log of them at the time; this individual provided the notes to Gizmodo. Among the deep-sixed or suppressed topics on the list: former IRS official Lois Lerner, who was accused by Republicans of inappropriately scrutinizing conservative groups; Wisconsin Gov. Scott Walker; popular conservative news aggregator the Drudge Report; Chris Kyle, the former Navy SEAL who was murdered in 2013; and former Fox News contributor Steven Crowder. “I believe it had a chilling effect on conservative news,” the former curator said.
Another former curator agreed that the operation had an aversion to right-wing news sources. “It was absolutely bias. We were doing it subjectively. It just depends on who the curator is and what time of day it is,” said the former curator. “Every once in awhile a Red State or conservative news source would have a story. But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”

Stories covered by conservative outlets (like Breitbart, Washington Examiner, and Newsmax) that were trending enough to be picked up by Facebook’s algorithm were excluded unless mainstream sites like the *New York Times*, the BBC, and CNN covered the same stories.
Other former curators interviewed by Gizmodo denied consciously suppressing conservative news, and we were unable to determine if left-wing news topics or sources were similarly suppressed. The conservative curator described the omissions as a function of their colleagues’ judgments; there is no evidence that Facebook management mandated or was even aware of any political bias at work.
Managers on the trending news team did, however, explicitly instruct curators to artificially manipulate the trending module in a different way: When users weren’t reading stories that management viewed as important, several former workers said, curators were told to put them in the trending news feed anyway. Several former curators described using something called an “injection tool” to push topics into the trending module that weren’t organically being shared or discussed enough to warrant inclusion—putting the headlines in front of thousands of readers rather than allowing stories to surface on their own. In some cases, after a topic was injected, it actually became the number one trending news topic on Facebook.
“We were told that if we saw something, a news story that was on the front page of these ten sites, like CNN, the New York Times, and BBC, then we could inject the topic,” said one former curator. “If it looked like it had enough news sites covering the story, we could inject it—even if it wasn’t naturally trending.” Sometimes, breaking news would be injected because it wasn’t attaining critical mass on Facebook quickly enough to be deemed “trending” by the algorithm. Former curators cited the disappearance of Malaysia Airlines flight MH370 and the Charlie Hebdo attacks in Paris as two instances in which non-trending stories were forced into the module. Facebook has struggled to compete with Twitter when it comes to delivering real-time news to users; the injection tool may have been designed to artificially correct for that deficiency in the network. “We would get yelled at if it was all over Twitter and not on Facebook,” one former curator said.

In other instances, curators would inject a story—even if it wasn’t being widely discussed on Facebook—because it was deemed important for making the network look like a place where people talked about hard news. “People stopped caring about Syria,” one former curator said. “[And] if it wasn’t trending on Facebook, it would make Facebook look bad.” That same curator said the Black Lives Matter movement was also injected into Facebook’s trending news module. “Facebook got a lot of pressure about not having a trending topic for Black Lives Matter,” the individual said. “They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one’.” This particular injection is especially noteworthy because the #BlackLivesMatter movement originated on Facebook, and the ensuing media coverage of the movement often noted its powerful social media presence.
(In February, CEO Mark Zuckerberg [expressed his support for the movement](http://gizmodo.com/mark-zuckerberg-asks-racist-facebook-employees-to-stop-1761272768) in an internal memo chastising Facebook employees for defacing Black Lives Matter slogans on the company’s internal “signature wall.”)
When stories about Facebook itself would trend organically on the network, news curators used less discretion—they were told not to include these stories at all. “When it was a story about the company, we were told not to touch it,” said one former curator. “It had to be cleared through several channels, even if it was being shared quite a bit. We were told that we should not be putting it on the trending tool.”

(The curators interviewed for this story worked for Facebook across a timespan ranging from mid-2014 to December 2015.)
“We were always cautious about covering Facebook,” said another former curator. “We would always wait to get second level approval before trending something to Facebook. Usually we had the authority to trend anything on our own [but] if it was something involving Facebook, the copy editor would call their manager, and that manager might even call their manager before approving a topic involving Facebook.”
Gizmodo reached out to Facebook for comment about each of these specific claims via email and phone, but did not receive a response.

Several former curators said that as the trending news algorithm improved, there were fewer instances of stories being injected. They also said that the trending news process was constantly being changed, so there’s no way to know exactly how the module is run now. But the revelations undermine any presumption of Facebook as a [neutral pipeline for news](http://recode.net/2015/08/21/how-facebook-decides-whats-trending/), or the trending news module as an algorithmically driven list of what people are *actually* talking about.
Rather, Facebook’s efforts to play the news game reveal the company to be much like the news outlets it is rapidly driving toward irrelevancy: a select group of professionals with vaguely center-left sensibilities. It just happens to be one that poses as a neutral reflection of the vox populi, has the power to influence what billions of users see, and openly discusses whether it should use that power to influence presidential elections.

“It wasn’t trending news at all,” said the former curator who logged conservative news omissions. “It was an opinion.”
*[Disclosure: Facebook has launched a program that pays publishers, including the New York Times and Buzzfeed, to produce videos for its Facebook Live tool. Gawker Media, Gizmodo’s parent company, recently joined that program.]*
Facebook later responded with the following statement:

> “We take allegations of bias very seriously. Facebook is a platform for people and perspectives from across the political spectrum. Trending Topics shows you the popular topics and hashtags that are being talked about on Facebook. There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.”
**Update May 10, 8:50 a.m. EST:** The following statement was posted late last night by Tom Stocky, Facebook’s Vice President of Search. It was liked by both Mark Zuckerberg and Sheryl Sandberg:

> My team is responsible for Trending Topics, and I want to address today’s reports alleging that Facebook contractors manipulated Trending Topics to suppress stories of interest to conservatives. We take these reports extremely seriously, and have found no evidence that the anonymous allegations are true.
>
> Facebook is a platform for people and perspectives from across the political spectrum. There are rigorous guidelines in place for the review team to ensure consistency and neutrality. These guidelines do not permit the suppression of political perspectives. Nor do they permit the prioritization of one viewpoint over another or one news outlet over another. These guidelines do not prohibit any news outlet from appearing in Trending Topics.
>
> Trending Topics is designed to showcase the current conversation happening on Facebook. Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers.
>
> We are proud that, in 2015, the US election was the most talked-about subject on Facebook, and we want to encourage that robust political discussion from all sides. We have in place strict guidelines for our trending topic reviewers as they audit topics surfaced algorithmically: reviewers are required to accept topics that reflect real world events, and are instructed to disregard junk or duplicate topics, hoaxes, or subjects with insufficient sources. Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we’ve designed our tools to make that technically not feasible. At the same time, our reviewers’ actions are logged and reviewed, and violating our guidelines is a fireable offense.
>
> There have been other anonymous allegations — for instance that we artificially forced [#BlackLivesMatter](https://www.facebook.com/hashtag/blacklivesmatter?source=feed_text&story_id=10100853082337958) to trend. We looked into that charge and found that it is untrue. We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so. Our guidelines do permit reviewers to take steps to make topics more coherent, such as combining related topics into a single event (such as [#starwars](https://www.facebook.com/hashtag/starwars?source=feed_text&story_id=10100853082337958) and [#maythefourthbewithyou](https://www.facebook.com/hashtag/maythefourthbewithyou?source=feed_text&story_id=10100853082337958)), to deliver a more integrated experience.
>
> Our review guidelines for Trending Topics are under constant review, and we will continue to look for improvements. We will also keep looking into any questions about Trending Topics to ensure that people are matched with the stories that are predicted to be the most interesting to them, and to be sure that our methods are as neutral and effective as possible.

Gizmodo’s Michael Nunez is out today with a sensational [story](http://gizmodo.com/former-facebook-workers-we-routinely-suppressed-conser-1775461006) in which former Facebook employees claim they regularly censored the platform’s “trending” news section to eliminate stories about conservative topics that were organically trending, [blacklisted](http://gizmodo.com/want-to-know-what-facebook-really-thinks-of-journalists-1773916117) certain news outlets from appearing, and artificially “injected” stories they felt were important but that the site’s users were not discussing or clicking on.
This comes a month after Nunez [published](http://gizmodo.com/facebook-employees-asked-mark-zuckerberg-if-they-should-1771012990) a leaked internal Facebook poll that asked “What responsibility does Facebook have to help prevent President Trump in 2017?” In short, as the curtain has been lifted on Facebook’s magical trending algorithm, the mythical unbiased algorithm powering what users see on the site is revealed to be less machine and more biased human curator. Given Facebook’s phenomenal reach across the world and the role it increasingly plays as the primary news gateway for more and more people, the notion that it is systematically curating what its users see in an unalgorithmic and partisan way raises alarm bells about the future of how we access and consume information.

Ryan Merkley, CEO of Creative Commons, [wrote](http://www.wired.com/2016/04/stealing-publicly-funded-research-isnt-stealing/) in Wired last month that “If the Web has achieved anything, it’s that it’s eliminated the need for gatekeepers, and allowed creators—all of us—to engage directly without intermediaries, and to be accountable directly to each other.” Yet such a rosily optimistic view of the web’s impact on society seems to ignore the mounting evidence that the web is in fact merely coalescing around a new set of gatekeepers. As Jack Mirkinson [wrote](http://www.salon.com/2016/05/01/googles_new_media_apocalypse_how_the_search_giant_wants_to_accelerate_the_end_of_the_age_of_websites/) for Salon earlier this month, “the internet, that supposed smasher of gates and leveler of playing fields, has coalesced around a mere handful of mega-giants in the space of just a couple of decades. The gates didn’t really come down. The identities of the gatekeepers just changed. Google, Facebook, Apple, Amazon: How many people can really say that some portion of every day of their lives isn’t mediated by at least one of these companies? … It seems that, at least for the moment, we are destined to live in the world that they create—and that includes everyone in the media business.”

Far from democratizing how we access the world’s information, the web has in fact narrowed those information sources. Much as large national chains and globalization have replaced the local mom-and-pop shop with the megastore and local craftsmanship with assembly-line production, the internet is centralizing information access from a myriad of websites, local newspapers, and radio/television shows to single behemoth social platforms that wield universal global control over what we consume.

Indeed, social media platforms appear to increasingly view themselves no longer as neutral publishing platforms but rather as active mediators and curators of what we see. This extends even to new services like messaging. David Marcus, Facebook’s Vice President of Messaging, recently [told](http://www.wired.com/2016/04/facebook-believes-messenger-will-anchor-post-app-internet/) Wired: “Unlike email where there is no one safeguarding the quality and the quantity of the stuff you receive, we’re here in the middle to protect the quality and integrity of your messages and to ensure that you’re not going to get a lot of stuff you don’t want.” In short, Facebook wants to act as an intelligent filter onto what we see of the world. The problem is that any filter, by design, must emphasize some content and views at the expense of others.

In the case of Facebook, the new revelations are most concerning because they go to the very heart of how these new social platforms shape what we understand about the world.
It is one thing for a platform to announce it will delete posts that promote terrorism or that threaten another user with bodily harm, but to silently and systematically filter what users see through a distinct partisan lens, especially with regard to news reporting, adds a frightening dimension to just how much power a handful of Silicon Valley companies now wield over what we see online.

All Original Content Copyright ©2017 hnewswire.com. All Rights Reserved. “hnewswire.com” and the “hnewswire.com” logo are trademarks of hnewswire.com.