Liability Rests With The Users
Blaming companies for all interactions on their platforms threatens free speech and the state of the Internet as we know it.
Reading Time: 4 minutes
As children and young adults are introduced to the freedoms of technology, many experience immediate backlash from anonymous web surfers, creating a toxic and harmful environment. Verbal taunts quickly escalate into an influx of offensive, derogatory, and hurtful messages. As a result, many schools and nonprofit organizations around the world have taken steps to combat the dangers of cyberbullying and excessive Internet use.
But the dangers of the Internet have long since expanded into the adult world, where they manifest on a larger scale. The Internet can now be used to carry out countless illegal activities with severe consequences. Cases continue to arise that reflect both the difficulty the judicial system has in keeping pace with the Internet's rapidly changing uses and the abuses individuals can perpetrate through it.
After Matthew Herrick fell victim to stalking and harassment by his ex-boyfriend, he filed multiple police reports and requested a restraining order against his harasser. Herrick's ex had used the dating app Grindr to send hundreds of men to Herrick's workplace, prompting complaints to both Grindr and the police. Dissatisfied with the police response and afraid of his ex-boyfriend, Herrick filed a lawsuit against Grindr, arguing that the platform was inadequately regulated and designed without regard for the damage it could cause. Rather than pursuing his harasser, Herrick shifted his focus to the platform itself, claiming that Grindr (and, by extension, all online networks) is a product, not a service, and should thus be held accountable for the damage it can cause, much like any tangible object.
In his claims, Herrick attacked the tech industry's unchecked power over the spread of online content, as seen in recent Facebook and YouTube scandals. But Section 230 of the Communications Decency Act (CDA) of 1996, a law originally meant to restrict the spread of pornographic content on the Internet, has been interpreted as shielding such apps, deflecting accountability from the company to the individuals responsible for the acts. This is not to say that companies do not regulate their platforms: nearly all social networking sites monitor illegal, dangerous, and abusive content through filters, account removal, and user-initiated reporting features.
In order for Grindr to be directly responsible for any misuse of its services, it must be proven that the damage done to the affected user resulted from Grindr's knowledge of, and subsequent negligence on, the issue. Yet Grindr has followed its own policy of removing all impersonating profiles and blocking the users associated with the flagged activity. A crackdown on Grindr's policies would require efforts to filter sent messages, block VPNs (virtual private networks), and prevent banned users from creating new profiles. In other words, a serious effort to force Grindr-like platforms to police their users in these ways would directly violate both digital privacy and the First Amendment.
There is no clear way to foresee how heavy the restrictions on speech would become if social media sites were forced into such reforms. Section 230 of the CDA would not only be weakened; liability would fall directly on the platform through which malicious content was spread. To avoid legal punishment, companies would institute layers of filters restricting anything deemed offensive, whether curse words, spam, or controversial humor. This rollback could then easily be bent to a company's own views: censoring publications it disagrees with, banning “hate speech,” and permitting the spread of propaganda. In other words, the First Amendment would cease to apply in the digital world.
If companies were handed such sweeping jurisdiction over the regulation of their sites, one must consider what the face of the “new” social media platforms would look like. With online messages constantly filtered and flagged users removed entirely, the audiences of such sites would shrink dramatically. Nor is there any guarantee that censorship would stop at social media: e-mail servers, blogs, and every other online communication platform would be susceptible to the same reform. It is thus crucial for the Communications Decency Act to remain intact and preserve the right to free speech. Abuses of that freedom are bound to occur, and they must be dealt with by the proper branch of government: the judicial system.
Herrick's case serves less as an example of an oversight on Grindr's part than as a failure by the judicial system to recognize his harassment claims. Despite 14 police reports and a requested restraining order, Herrick's former partner was never questioned by the police. This lack of inquiry into accusations of harassment and stalking can be attributed to the prevailing lack of accountability online. Because a screen serves as a mask between the sender and recipient of a message, it is increasingly difficult to pinpoint an individual's identity online. Many services, including the messaging platform WhatsApp, even offer end-to-end encrypted messages whose contents cannot be read by anyone but the sender and recipient. Despite such obstacles, the judicial system must uphold and continue to enforce its prohibition of harassment and stalking by launching a traditional investigation, one that accounts for the victim's personal conflicts and immediate suspects. The shield of anonymity means that not every perpetrator can be identified, but cases of online harassment must nonetheless be recognized and investigated to the fullest extent by the authorities.