Facebook v. Science

Social media have helped us cocoon ourselves into comfortable ignorance of “the other side” — so goes the prevailing notion of the last few years, since Facebook has been king.

A team of researchers at Facebook published an article Thursday detailing how much the site contributes to political echo chambers, or filter bubbles. Published in the journal Science, the report found that Facebook’s black-box newsfeed algorithm weeded out some disagreeable content from readers’ feeds, but not as much as readers’ own behavior did.

A flurry of criticism came from other social scientists, with one, University of Michigan’s Christian Sandvig, calling it Facebook’s “it’s not our fault” study.

Sample frame

Perhaps the most important limitation of the findings is the small, unique subset of users examined. Although the total number was huge (10 million), these were users who voluntarily labeled their political leanings on their profiles and logged on regularly — only about 4 percent of the total Facebook population, and they differ from general users in obvious and subtle ways. Critics have pointed out that this crucial detail is relegated to an appendix.

Despite the sample problem, the authors framed their findings by saying they “conclusively establish [them] on average in the context of Facebook […]” [emphasis added].

As University of North Carolina’s Zeynep Tufekci and University of Maryland’s Nathan Jurgenson pointed out, that’s simply inaccurate. The context the Facebook researchers examined was highly skewed, and cannot be generalized.

While the ideal random sample is not always available, and convenience samples can tell us much about subpopulations of interest, the sample selection here confounded the results. Those who are willing to include their political preferences in their Facebook bio are likely to deal with ideologically challenging information in fundamentally different ways than everyone else does.

In spite of this criticism, though, we now know more about that type of user than we did yesterday.

Algorithm vs. personal choice (what they really found, and didn’t)

Another troubling aspect of the study has to do with the way the main finding is presented. The authors write that Facebook’s newsfeed algorithm reduces exposure to cross-cutting material by 8 percent (1 in 13 of such hard-news stories) for self-identified liberals and 5 percent (1 in 20) for conservatives. The researchers also report that these individuals themselves further reduce diverse content exposure by 6 percent among liberals and 17 percent among conservatives.

The comparison of these — algorithm and personal choice — is what caused Sandvig to call this Facebook’s “it’s not our fault” study.

Tufekci and Jurgenson say the authors failed to mention the two effects are additive and cumulative. That individuals make reading choices that contribute to their personal filter-bubble is pretty much unchallenged. Yesterday’s study confirmed that Facebook’s algorithm adds to that, above the psychological baseline. This was not the emphasis of the comparison they made, nor of many headlines covering the study.


Tufekci and Jurgenson also point out that the authors apparently botched the statement of this main finding by claiming “that on average in the context of Facebook individual choices more than algorithms limit exposure to attitude-challenging content.” The findings they report are actually mixed: Self-identified liberals’ exposure was more strongly suppressed by the algorithm than by personal choice (8 percent v. 6 percent), while for conservatives the reverse was true (5 percent v. 17 percent).

Science is iterative

Amid all the blowback in the academic world, especially over the inflated claims of the conclusion, some called for a more dispassionate appraisal. Dartmouth’s Brendan Nyhan, who regularly contributes to the New York Times’ Upshot, asked social scientists to “show we can hold two (somewhat) opposed ideas in our heads at the same time on the [Facebook] study.” Translated: the study is important, if flawed.

“Science is iterative!” Nyhan tweeted. “Let’s encourage [Facebook] to help us learn more, not attack them every time they publish research. Risk is they just stop.”

But there are rejoinders to that call as well. As University of Maryland law professor James Grimmelmann pointed out, “‘Conclusively’ doesn’t leave a lot of room for iteration.”

Nyhan’s point — that Facebook could stop publishing its findings given enough criticism — also highlights that the study, conducted with the company’s proprietary data, is not replicable, a key ingredient of scientific research.

Journals and journalists

Given the overstated (or misstated) findings, many have called out Science, the journal that published the article. Science is not only peer-reviewed; along with Nature, it is one of the foremost academic journals in the world.

While many of yesterday’s news articles noted the controversy around the publication, others repeated the debated conclusion verbatim. Jurgenson had harsh words for the journal: “Reporters are simply repeating Facebook’s poor work because it was published in Science. [Th]e fault here centrally lies with Science, [which] has decided to trade its own credibility for attention. [K]inda undermines why they exist.”

In the Summer 2014 GJR article, “Should journalists take responsibility for reporting bad science?” I wrote about the responsible parties in such cases. Although social media habits are not as high-stakes as health and medicine, journals, public relations departments and scientists themselves must be more accountable for the information they pass on to journalists and ultimately readers.

Although “post-publication review” is here to stay, the initial gatekeepers should always be the first line of defense against bad science — especially when the journal in question carries the mantle of the entire scientific enterprise.

Former St. Louis Post-Dispatch writer says jail series is inaccurate

Eddie Roth, St. Louis’ Director of Operations and former Post-Dispatch editorial writer, is using his Facebook page to criticize a recent Post-Dispatch series, “Jailed by Mistake.”

Roth maintains the series “is premised on ‘facts’ whose accuracy the reporters admittedly have been unable to verify, and that it distorts statements in ways that create a patently false and deeply unfair impression of official indifference.”

Post-Dispatch editor Gilbert Bailon said in a written statement to GJR that Roth was not a neutral observer and that the important thing was the findings of the investigation.  He wrote:

“The focus should be on the key findings of the Post-Dispatch investigation that found that people were being repeatedly arrested and jailed by mistake in St. Louis despite safeguards that could have prevented it. If new facts or information exists that informs the 100 cases we found, those should be brought forward by public officials, some of whom were aware of details of our research before it was published Oct. 27.”

The Post-Dispatch’s Jennifer Mann and Robert Patrick reported they had “identified 100 people arrested in error over the past seven years. Collectively, they spent more than 2,000 days in jail — an average of about three weeks each.”

Roth maintains in his posts that half of the 100 cases are older than five years and that only about a dozen occurred in the past two years.

Separately, St. Louis Circuit Attorney Jennifer Joyce wrote in an email to GJR “that a thorough look at about 10 percent of the cases in their (Post-Dispatch’s) published database…” showed that the “days are overstated by approximately 550. I assume there are similar errors in the other 90 percent of their database which would bring the days lower still.”

Joyce said, “even one wrongful arrest is too many,” adding, “this is an important topic even though it was poorly handled by the Post. While ethical and legal considerations prevent me from discussing in detail many of the cases, I can tell you that, overall, we found:

  •  “In at least one instance, a person the St. Louis Post-Dispatch claimed was wrongfully jailed was not arrested.
  •  “The majority of those arrested were initially arrested on their own charges.
  •   “The majority of those arrested to which the Post attributed time served on another person’s charges served jail time as the result of their own charges.
  •   “Many instances where the number of days the St. Louis Post-Dispatch claimed a person spent in jail were inflated.”

One example Joyce cited was that of Antonio Arnold.  The Post-Dispatch maintained Arnold was wrongfully arrested on a warrant for his brother Leonard Arnold and spent 211 days in jail. But Joyce found he spent 134 days in jail, all of it on his probation violation.

Joyce wrote she thought the “errors are due to the fact that the Post reporters simply do not have the ability to verify the information they publish as fact.”

Roth highlighted another mistake on his Facebook page.  He wrote that research by Joyce’s office showed that Cortez Cooper had never been wrongfully jailed, as the stories stated.

The Post-Dispatch reported that Cortez Cooper “spent more than a month in jail because his brother, Cecil Cooper, used his name during a drug arrest before being released pending charges… An email alerting officials to a possible mistake was sent out – but not heeded – two months before Cortez was mistakenly charged.”

But the Circuit Attorney found after publication that there was no evidence Cortez Cooper was jailed. Cecil Cooper used Cortez’s name and was confined under it, but investigators concluded the right person was in jail, Roth reported.

Roth makes the point that social media, such as Facebook, enable people to challenge the accuracy of news reports in ways they once could not.  He wrote:

“’We stand by our story.’ With those five words, news editors used to be able to dismiss challenges to accuracy & fairness. End of discussion. Not so much, anymore. Social media make it harder for traditional news organizations to ignore legitimate grievances about a news story, and make it more likely that they will be attentive and responsive to problems and mistakes in their reports and coverage. This strengthens journalism.”

In his latest post on Thursday, Roth criticizes the Post-Dispatch and Bailon for not having a published corrections policy, like the one in The New York Times.  Roth writes that when he asked for the paper’s corrections policy, Bailon wrote back:

“We do not publish a public corrections policy on the website or in print. Concerns from readers about possible corrections or clarifications are brought to the attention to the newsroom editors. The originating desk involved investigates the matters and confers with top management about the issues and whether to publish a correction.”

Bailon said in his statement that Roth’s “railing against our corrections policy is a distraction and irrelevant. Under longstanding policies, we issue corrections based on fact-based reporting and new information.

“We stand ready to acknowledge and correct any factual inaccuracies in our reporting if the Mayor’s Office, Circuit Attorney’s Office or the St. Louis Police Department can document how the records on which our reporting was based were wrong.”

Editor’s note: William H. Freivogel, publisher of GJR, is a former Post-Dispatch reporter and editor and a colleague of Roth’s and Post-Dispatch reporters and editors involved in the series.



Appeals Court likes “likes,” says they’re speech

The Fourth Circuit Court of Appeals has held that “liking” something on Facebook is speech protected by the First Amendment. The ruling reverses a lower court opinion that dismissed a suit brought by former sheriff’s office employees who lost their jobs after they “liked” the Facebook page of their boss’s opponent in his re-election bid.

Last May, District Judge Raymond A. Jackson held that “merely ‘liking’ a Facebook page is insufficient speech to merit constitutional protection,” Bland v. Roberts, 857 F. Supp. 2d 599 (E.D. Va. Apr. 24, 2012), slip op. at 6, and dismissed the fired employees’ claims.

But after reviewing the nature and consequences of “liking” something on Facebook, the appeals court held that “[o]nce one understands the nature of what [one plaintiff] did by liking the Campaign Page, it becomes apparent that his conduct qualifies as speech.” Bland v. Roberts, No. 12-1671 (4th Cir. Sept. 18, 2013), slip op. at 39.

“On the most basic level, clicking on the ‘like’ button literally causes to be published the statement that the User ‘likes’ something, which is itself a substantive statement. In the context of a political campaign’s Facebook page, the meaning that the user approves of the candidacy whose page is being liked is unmistakable. That a user may use a single mouse click to produce that message that he likes the page instead of typing the same message with several individual key strokes is of no constitutional significance.” Slip op. at 39-40.

This makes sense. The courts have held that First Amendment protection extends to gestures, signs, and even some actions (“symbolic speech”). A Facebook “like” is no different; depending on the context, it can be an expression of endorsement, approval, or gladness that something was posted. And I’m sure that it can have other meanings that I’m not thinking of. But the point is that pressing the “like” button can, indeed, carry a message that can and should be protected by the First Amendment.

Unfortunately, the Fourth Circuit doesn’t have an official Facebook page to “like.” (Although there’s a page for former clerks.) But you can go to Justia’s Facebook page for Fourth Circuit opinions and like the Bland v. Roberts decision.

By doing so, you’ll be making an expression of support — and enjoy the protection of the First Amendment for doing so.


Eric P. Robinson is co-director of the Program in Press, Law and Democracy at the Manship School of Mass Communication at Louisiana State University.