In the last few years, there has been increasing dissatisfaction with social platforms like Facebook and Twitter—much of it deserved. The criticism stems from social media companies' unwillingness to act as gatekeepers against the harmful and dangerous rhetoric on their sites—all in exchange for profits. But despite growing concern about social media's enormous reach, influence, and effect on society, the industry has largely escaped regulation.

Part of the reason is that the growth of online technology—including social media—initially caught many lawmakers flat-footed and ill-equipped to deal with it. And in an attempt to head off any call for regulation, social media companies have proactively adopted measures to self-regulate. To many, these efforts have seemed cosmetic rather than sincere attempts to effect real change. But increasingly disturbing content and rising violence in the U.S. linked to social media, including the horrific storming of the U.S. Capitol, have put social media's ills in a glaring spotlight. Consequently, social media's day of reckoning may soon be at hand.

We should commend companies when they’re good corporate citizens—they’re not required to be. But we also need to hold companies accountable for any harm their products cause—particularly if they’re aware of a problem but deliberately choose to ignore it.

In the case of social media, regulation is a way not only to fix the problem but to keep it from happening again. Yet coming up with a solution may prove difficult, if not elusive. It will require an understanding of the legal, moral, political, and economic implications. It will also require a consensus among different branches of government and other stakeholders. Right now, there isn't even agreement on who is responsible for the content that appears on these sites. Some contend that social media companies provide a service similar to a phone company's—they're simply a pipe that transmits content between two or more parties. As such, they have no responsibility for the content. Others disagree, claiming the companies are more like publishers. And like publishers, they bear a level of responsibility for the accuracy and truthfulness of the content—even if they didn't write it. A number of other issues will need to be addressed, including:

• Who will be responsible for content—the originators of the content or the company distributing it?
• Who will decide what kind of content will and won’t be permitted?
• Will all content be required to be truthful or even accurate?
• Who will be responsible for overseeing social media companies?
• Is it possible to effectively monitor content on such a massive scale?
• Can regulations be set up in any realistic way that won’t interfere with freedom of speech and other individual rights?
• Who gets punished for a violation and what will the punishment be? Who enforces it?

The good news is that we have the beginnings of a road map for lawfully regulating content, having worked through similar issues with radio, television, newspapers, and the publishing industry. Additionally, regulations governing marketing and communication in healthcare, finance, and other industries have been in place for a long time without imposing undue restrictions.

In conclusion, rather than resisting change, social media companies should be the ones championing reform—true reform, not half-measures intended to keep the hounds at bay. It will only benefit them in the long run. The bad behavior of even a few companies can diminish the reputation of an entire industry, drive customers away, and require an inordinate amount of effort and money to repair. By not being proactive, social media companies may see their worst fears come to pass: they'll become heavily regulated anyway.