The Wall Street Journal has published "The Facebook Files," a series of articles based on a review of the social media giant's internal documents, research, draft presentations, and online employee discussions. The outlet alleges that the company knows its platforms – including Facebook and Instagram – are "riddled" with flaws that "cause harm, often in ways only the company fully understands," and that these issues are known all the way up to chief executive Mark Zuckerberg.

According to the WSJ, changes Facebook made to its algorithms three years ago to improve user connectivity and well-being instead made the platform "angrier," with staff members warning of the potential damage being done. Further changes were then allegedly resisted over concerns about declining user engagement.

The publication also says that researchers inside Instagram have found the app to be "harmful" and "toxic" for some younger users, teenage girls in particular. "In response, Facebook says the negative effects aren't widespread, that the mental-health research is valuable and that some of the harmful aspects aren't easy to address," the WSJ says.

Furthermore, an internal program known as "cross check," or XCheck, allegedly exempts some high-profile users from the rules applied to typical users, shielding these individuals from the sanctions normally imposed when material is posted that may break Facebook's terms of service, such as posts inciting violence.

In response, former UK politician and now Facebook Vice President of Global Affairs Nick Clegg said in a blog post on Saturday that the series "contained deliberate mischaracterizations of what we are trying to do, and conferred egregiously false motives to Facebook's leadership and employees."

Clegg also says that the accusation at the core of the reports, that Facebook conducts research and dismisses anything that is not of benefit to the company, "is plain false" and rests on a "cherry-picked" selection of leaked documents.

"With any research, there will be ideas for improvement that are effective to pursue and ideas where the tradeoffs against other important considerations are worse than the proposed fix," the executive says. "The fact that not every idea that a researcher raises is acted upon doesn't mean Facebook teams are not continually considering a range of different improvements."

Clegg also referenced one of the WSJ's reports on how COVID-19 misinformation and "barrier to vaccination" content has been handled over the course of the pandemic. The publication writes that anti-vaxxers have been able to abuse Facebook's own tools to sow doubt, flooding the platform with negative comments and potentially undermining initiatives to drive up vaccine acceptance rates. Clegg counters that health organizations continue to post because, by their own measurements, promotion remains effective despite the negative commentary.

"Facebook understands the significant responsibility that comes with operating a global platform," Clegg says. "We take it seriously, and we don't shy away from scrutiny and criticism. But we fundamentally reject this mischaracterization of our work and impugning of the company's motives."
Previous and related coverage
Quick, easy (and free) way to make Facebook more bearable
Facebook is the AOL of 2021
Facebook says Chinese hackers used its platform in targeted campaign to infect, surveil user devices
Have a tip? Get in touch securely via WhatsApp | Signal at +447713 025 499, or over at Keybase: charlie0