Zhu: What we let social media companies get away with these days

Caroline Zhu, Staff Columnist

With news commentators comparing our current political and economic state to historical periods like the 1880s, it is little wonder that antitrust concerns continue to grow, particularly in light of the recent attention Facebook has been getting. The early 21st-century boom in fast-growing technology companies has led to sobering realizations about the state of our nation and its dependence on social media. These become particularly evident as Facebook comes under scrutiny, in the form of a U.S. Justice Department antitrust investigation and a congressional inquiry into its attempt to launch its own cryptocurrency, Libra.

This attention is well-deserved: Facebook's Cambridge Analytica scandal opened the most visible case on consumer data privacy in years. The exploitation of 50 million Facebook profiles to influence the 2016 election, under the direction of President Trump's former advisor Steve Bannon, has in turn opened many adjacent conversations about a social media platform's role in a larger social context. Increasingly, companies claim that the social media platforms they run are just that: platforms. They provide a space for people to talk and circulate ideas, but the companies have no responsibilities beyond that. By this logic, a social media platform is no more at fault for misinformation or unmoderated speech than a cafeteria is for mess hall gossip.

However, with the growth of extremist groups that start on the internet and take advantage of these platforms, there is good reason to ask: Should companies take responsibility for the content on their platforms, even if it is user-generated? Should corporations be charged with fact-checking and moderating the content on their sites to control hate speech and other harmful language? Given the reach of social media, with 70 percent of Americans using some form of it, companies must take on this responsibility. Because of their integral role in the social functioning of modern society, social media platforms have placed themselves in the position of having to regulate their content.

For those who cry "free speech" and fear that moderating online content will crush the rights granted by the First Amendment, remember that there are significant exceptions to the freedom of speech promised by our Constitution. These include, but are not limited to, fraud, speech integral to illegal conduct, speech that incites imminent lawless action and true threats. Consider how much of what is shared on social media falls into these exceptions.

The algorithms that decide what content users see are both the draw and the danger inherent to social media. On the surface, these algorithms collect information on what users like and serve them more of the same. The danger, however, is twofold. First, they cluster users into communities of similar demographics that can easily be targeted, often through advertising, by those who seek to exploit user data for monetary or personal gain. Second, they generate echo chambers where harmful behavior and language quickly find footholds, creating many of the violent communities we see taking these ideas to dangerous extremes.

Facebook has rebranded itself in the past year around groups of people who use the platform to create communities. Although that is an admirable goal, it is naive for Facebook to assume that only amiable communities will form. It must be the company's responsibility to regulate its platform accordingly.

These situations are precisely what members of Congress pointed to in their questioning during Mark Zuckerberg's recent testimony: specifically, the lax way social media platforms have handled the information that crosses their applications, and the resulting problems in the nation's political and social landscape.

Unless these problems are fixed, misinformation in advertisements and the cultivation of extremist groups on social media platforms will only continue to grow. Although some argue that communities will simply migrate to platforms with fewer rules, this argument is a nonstarter. Given the market share and social influence that platforms like Facebook command, regulating them will still have an impact. Regardless of where users go, a company that claims social responsibility must use the full extent of its power to work for the social good.

If a social media platform defines its success by the size of its user base and its social influence, it cannot then refuse the social responsibility that comes with the benefits of popularity. Refusing grows increasingly dangerous in the current economic and political climate, where data collection and advertising have already helped sway a major presidential election and could do so again. These concerns may be addressed in the U.S. Justice Department's ongoing antitrust probe into major technology corporations, but if they are not, then these companies must be held responsible, to the fullest extent, for the information disseminated on their platforms.