How Facebook Quietly Blocks Researchers From Studying Site’s Harmful Effects


Scientists and journalists have been banned from the platform and blocked from collecting data on its services


Even as Facebook continues to come under fire after a devastating series of leaks on its internal research, the company is quietly making changes to its code — actions that are blocking researchers from accessing clean data and studying potentially harmful effects of the platform.

Facebook in September started rolling out an update to its code that watchdog groups, researchers and journalists say interferes with their ability to monitor the platform’s ads and content. That means organizations like the news nonprofit The Markup, which had been researching Facebook data through its Citizen Browser project, have had trouble determining how alternative right-wing content in Germany garnered more reach than content from other political parties.

Corin Faife, a data reporter at the Citizen Browser project, and his team say they need to be able to parse the HTML in order to see what’s relevant. Now that Facebook is injecting “dummy text” into the data it shares, “they just made the code more confusing,” Faife told TheWrap. “We can’t prove that they’re doing it, but it seems like a reasonable inference.”
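Faife’s point about parsing HTML is easier to see with a toy example. The sketch below is hypothetical: the markup, class names and “decoy” spans are invented for illustration and are not Facebook’s actual code or the Citizen Browser project’s real pipeline. It shows how a scraper that simply reads a post’s text out of the page can return polluted output once extra hidden characters are injected into the markup.

```python
# A minimal, hypothetical sketch of why injected "dummy text" breaks scrapers.
# The HTML and class names below are invented for illustration only.
from bs4 import BeautifulSoup

# Before: a post's text sits in one readable element.
clean_html = '<div class="post"><span>Vote for Party X</span></div>'

# After: the same text is split across elements interleaved with hidden
# decoy characters, so naive text extraction returns garbage.
obfuscated_html = (
    '<div class="post">'
    '<span>Vo</span><span aria-hidden="true">qz</span>'
    '<span>te for Par</span><span aria-hidden="true">xk</span>'
    '<span>ty X</span>'
    '</div>'
)

def extract_post_text(html: str) -> str:
    """Naive extraction: concatenate all text inside the post container."""
    soup = BeautifulSoup(html, "html.parser")
    post = soup.find("div", class_="post")
    return post.get_text() if post else ""

print(extract_post_text(clean_html))       # "Vote for Party X"
print(extract_post_text(obfuscated_html))  # "Voqzte for Parxkty X" -- polluted
```

A browser renders both versions identically to a human reader, which is why researchers describe this kind of change as invisible to users but disruptive to automated monitoring.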

When asked about this alteration, Facebook spokesperson Lindy Wagner contended that the company had no intent to thwart researchers. “We constantly make code changes across our services, but we did not make recent code changes to block these research projects. Our accessibility features largely appear to be working as normal; however, we are investigating the claimed disruptions,” Wagner said in a statement to TheWrap.

The change to the HTML largely affects how researchers automate the collection of Facebook News Feed posts. It’s a method that academic researchers, including those behind New York University’s Ad Observatory, rely on to monitor trends at a larger scale. In 2020, NYU researchers built a browser extension to study the ads Facebook surfaced to users, only to be shut down and banned from the platform this August for allegedly violating Facebook’s terms around user privacy.

“I don’t fully understand why Facebook has taken such a hostile stance,” Laura Edelson, a researcher at NYU, told TheWrap. “Many researchers in the field (including me) have gone out of our way to try to work with Facebook. I have always provided my security findings to the company before I present them publicly, so they have the opportunity to improve their systems before vulnerabilities are publicly known.”

The Ad Observer tool allowed Edelson and her team to identify ads promoting QAnon and far-right militias, revealing that Facebook failed to identify some 10% of the political ads on its platform. Edelson and her colleagues Damon McCoy and Paul Duke remain banned from Facebook.

“I would still rather work with the platform. I see the problem of disinformation and hate on Facebook as a social problem that affects everyone, and we’d be able to solve it a lot faster by working together. This is why I continue to call on Facebook to reinstate my account — I want to work with them,” she told TheWrap.

Despite the Federal Trade Commission responding that Facebook was wrong to block the NYU group on the basis of its user privacy agreement, the company has not reinstated the accounts. The Facebook spokesperson could not say when Edelson and her colleagues’ accounts would be reactivated, or whether a plan or investigation is in progress.

Time and again, Facebook has been scrutinized over its lack of transparency when it comes to sharing data or information around how its algorithm works. In 2018, the company faced perhaps its biggest scandal over Cambridge Analytica, a data firm hired by former President Donald Trump during the 2016 campaign that managed to access information on 50 million Facebook users. Since then, the company has only tightened its grip on access and control over sharing data with academics — especially if the information happens to be unflattering.

“The research we’ve published has been kind of embarrassing for Facebook,” Faife said. “As a company, they are not very good with transparency if it inconveniences them. There is a point where we need to see independent researchers have access to Facebook, because it is so big — larger than some countries in some cases. That deserves greater scrutiny.”

Meanwhile, Facebook is concealing its own findings about the harms propagated by its services. Recent investigations by the Wall Street Journal have shed light on Facebook’s own research showing that Instagram is harmful and toxic for teenage users, and that its platform is home to drug cartels, human traffickers and anti-vaccination sentiment. Many, including regulators, are calling on Facebook to release these files, as well as make more of its research available to the public.

“They are being roped in from all sides,” said Cory Doctorow, advisor at the Electronic Frontier Foundation. “They don’t like what researchers find when conclusions are very unflattering. Facebook’s just really bad at gathering data on itself. They are a company that is committed to appearance, not doing good.”

Madelyn Webb, associate research director at nonprofit research organization Media Matters, said that monitoring Facebook has always been a struggle. In instances where the company is willing to make data available, much of it is incomplete or “riddled with caveats,” she said.

“Every single project we do on Facebook is us trying to scrounge. When you see researchers doing what I do, a lot of that is manually produced. We have only a tiny, tiny fraction, and the only reason we have access to that is because we have begged for it,” Webb said. “They are trying to do this, because the information doesn’t look good for them.”

There have been other recent instances where Facebook has made it difficult for outside researchers to do their jobs, whether intentionally or not. Earlier this year, the German group AlgorithmWatch said it shut down a project after Facebook cited privacy concerns about its crowdsourced research into Instagram’s political content. Princeton University researchers studying misinformation and election ad targeting also halted their project out of concern over Facebook’s right to review their research before publication.

Only recently has Facebook started to open up about what it demotes in the News Feed. CEO Mark Zuckerberg also signed off this month on Project Amplify, a communications strategy to promote favorable stories about Facebook in the feed. To Webb, these moves are anything but transparent.

“There are some modicums of Facebook trying to improve their PR by granting access that is incomplete,” Webb said. “You can’t give somebody half a car and call that a gift. I wouldn’t say Facebook has made any real commitment to transparency. If anything, they’ve gotten better at spinning.”
