A new report from British lawmakers on how social media is used to spread disinformation finds that Facebook and other big tech companies are failing their users and dodging accountability.
"The guiding principle of the 'move fast and break things' culture often seems to be that it is better to apologise than ask permission," said Damian Collins, chair of the Digital, Culture, Media and Sport Committee that drafted the report. "We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self regulation must come to an end."
The 108-page report is often scathing about Facebook's practices and corporate conduct. The committee's inquiry into disinformation began in September 2017, as revelations emerged that Facebook had been used to spread disinformation during the U.S. presidential election and the U.K. Brexit referendum, both in 2016. In March 2018, the Cambridge Analytica scandal broke, showing how users' data could be harvested and misappropriated.
"Companies like Facebook should not be allowed to behave like 'digital gangsters' in the online world, considering themselves to be ahead of and beyond the law," the authors write.
Social media companies have been able to evade responsibility by claiming to be merely "platforms," the committee notes. It recommends creating a new category of tech company — neither exactly a "platform" nor a "publisher" — that clarifies companies' liability for harmful content posted by users.
The committee is particularly irritated by Mark Zuckerberg's refusal to appear before Parliament. "Facebook seems willing neither to be regulated nor scrutinized," it notes. "By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt" toward Parliament and an international committee of legislators, they write.
Greater government oversight
In its interim report, the committee called for the U.K.'s Information Commissioner's Office to be an effective "sheriff in the Wild West of the Internet." The authors add that the ICO "needs to have the same if not more technical expert knowledge as those organisations under scrutiny," and recommend that a levy on tech companies operating in the U.K. be used to pay for its work.
Some of the report's recommendations take aim at tech companies' most tightly held information. It suggests that an independent regulator have the power to obtain any information from social media companies that is relevant to its inquiries, including what data are held on an individual user, if the user requests it. The regulator "should also have access to tech companies' security mechanisms and algorithms, to ensure they are operating responsibly." This public body, it says, should also be able to take complaints from the public about social media companies.
The report also recommends an investigation into whether Facebook has engaged in anti-competitive practices, "to decide whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail."
Facebook "open to meaningful regulation"
In a statement, Facebook says it "share[s] the Committee's concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation," but it does not directly dispute the report's findings.
"We are open to meaningful regulation and support the committee's recommendation for electoral law reform," Karim Palant, Facebook's U.K. public policy manager, says in the statement. "No other channel for political advertising is as transparent and offers the tools that we do. We also support effective privacy legislation that holds companies to high standards in their use of data and transparency for users."
"While we still have more to do, we are not the same company we were a year ago," Palant says, pointing to the company's tripling the size of the team of people whose job it is to "detect and protect users from bad content."
While the committee's report focuses on social media companies and how to more effectively regulate them, its last recommendation speaks to the social media platforms' users, too, as it calls for "more pause for thought."
"More obstacles or 'friction' should be both incorporated into social media platforms and into users' own activities—to give people time to consider what they are writing and sharing," they write. "Techniques for slowing down interaction online should be taught, so that people themselves question both what they write and what they read—and that they pause and think further, before they make a judgement online."