People are mad at Facebook — for being too big, too powerful, too omniscient. Last week a co-founder of the company joined the calls from politicians like Senator Elizabeth Warren for the government to break up the social media giant.
Facebook has repeatedly asked for regulation, with CEO Mark Zuckerberg appealing to the government in a Washington Post op-ed this spring. Thus far, regulatory proposals have treated Facebook as a traditional telecom or media company. But the unending public anger towards the company, coupled with a sense of helplessness towards the influence it has over society, suggests that regulation shouldn’t be approached with benevolence. Given the numerous studies describing the addictive nature of social media, perhaps Facebook should be regulated the same way as other vices.
Earlier this month, Facebook co-founder Chris Hughes argued that Facebook should be broken up because of its lack of accountability. In the column, he describes himself compulsively scrolling Instagram as his infant son plays alone on the floor. Hughes acknowledges that his behaviour is not good for the child, but he says he can't pull his attention from the screen. It's reminiscent of an old anti-drug commercial in which an elderly woman sits woeful and alone at a prepared dinner table. Hughes's compulsion is the result of ruthless optimisation for user engagement.
Studies have repeatedly found that social media can damage the emotional health of frequent users, and that excessive use is associated with addictive-like symptoms. Sean Parker, Facebook's founding president, described the application as "a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."
Antitrust regulation won’t help here. Facebook achieved outsize market share with an addictive product. A competing platform would need to do an even better job of exploiting psychological vulnerability to topple the incumbent. But the solution to a harmful industry dominated by a monopoly is not to foster equally harmful competitors; it’s to reduce our dependence on industry as a whole.
Regulatory proposals should begin with protecting children. Facebook has a messaging app designed for kids under 13, but expecting the platform to protect children from harmful content is like asking the tobacco industry to make a kid-friendly cigarette. The conflict of interest is obvious. If Silicon Valley execs refuse to let their own children use apps, perhaps they shouldn't be allowed to market their apps to other people's children either.
A lot of our complacency towards social media platforms stems from a lack of understanding of how they take advantage of emotional vulnerabilities to keep users engaged. This is not unintentional. And much like the tobacco companies that spent four decades denying a link between smoking and lung cancer, Facebook has been equivocal in acknowledging its own harmful effects.
—Bloomberg
Elaine Ou is a Bloomberg Opinion columnist.