Social networking sites and online message boards are an integral part of the Internet. These websites are able to provide unfettered forums for the exchange of ideas because they enjoy certain legal immunities. But those immunities are eroding.
The Digital Millennium Copyright Act (DMCA) provides immunity from copyright infringement claims, and 47 USC § 230, commonly known as Section 230, covers most other content. These laws essentially say that a service provider is not a “publisher” of third-party content on its website, and thus is not liable for what its users do.
But some recent cases have chipped away at this immunity. The Ninth Circuit Court of Appeals recently ruled that LiveJournal may not be protected by the DMCA safe harbor because LiveJournal uses volunteer moderators. The logic? The Ninth Circuit found that the district court gave short shrift to common law agency theory, and now the issue may be in the hands of a jury.
Similarly, the Southern District of New York denied a motion to dismiss in Enigma v. Bleeping Computer last year, finding that a complaint over a “defamatory review” plausibly alleged that a volunteer moderator could be an implied agent of the website. The court entertained the idea that a volunteer “superuser” who was not an employee could nonetheless be an agent of Bleeping Computer, which would make the protections of Section 230 inapplicable.
LiveJournal and Online Communities
One of LiveJournal’s more popular communities (with 52 million views per month) is a celebrity gossip group called “Oh No They Didn’t” (“ONTD”), and naturally, pictures of celebs abound on ONTD. This led LiveJournal to hire a paid moderator to work with the volunteer moderators for the group, to maximize advertising revenue.
Mavrix Photographs accused ONTD of posting its celebrity photos and sued LiveJournal directly for copyright infringement, under the theory that the volunteer moderators were agents of LiveJournal. Although the district court rejected this argument, the Ninth Circuit’s reversal was harsh.
Who Uses Moderators?
Here’s where it gets tricky. Many websites (Facebook included) use moderators to maintain cohesion within an internet community. Moderators are generally people in charge of managing comments for a website, and they are usually volunteers. Under the common law doctrine of agency, an “agent” is someone who is authorized to act on behalf of another, called a principal. Usually, an agent is a paid employee of a company, but a paycheck is not always the deciding factor.
In the Bleeping Computer case, the court seemed open to the notion that moderators who volunteered for Bleeping Computer could be implied agents.
These decisions mean that courts are now dissecting the distinctions between publishers, third parties, and agents, specifically focusing on websites that use moderators and volunteers to curate some of the content.
What Website Owners and Service Providers Need to Know
New businesses usually weigh risks and liability when developing a model. If a website allows third parties to publish content, there are two main areas of law service providers should be aware of: 1) intellectual property infringement and 2) defamation.
Intellectual property infringement occurs when someone uses a copyright, patent, or trademark without permission. One example would be posting a photo of a model without the photographer’s permission. Defamation is a statement or comment that damages someone’s reputation.
Section 230 protections versus DMCA “safe harbors”
Section 230 says that a service provider shall not be treated as the publisher or speaker of any information posted by a third party. This distinction between a “publisher” and a “service provider” took root in early defamation case law, which compared defamatory statements in newspapers with defamatory statements spoken over the telephone.
If you have ever read a comments section on a website or used your thumb to scroll through tweets, you have probably come across lots of pictures of celebrities (copyright infringement?) and harsh comments that could “damage someone’s reputation.” Many of these posts likely violate copyright law or could be defamatory. Websites are generally immune from liability for these posts or comments because of §230 and the DMCA §512 safe harbors.
The courts surmised that because newspapers had editors, they exercised more control over the publication, and thus newspapers could be held liable for a defamatory statement. Telephone companies, on the other hand, were immune from defamation liability because they were merely passive conduits for third-party expression. Section 230 codifies the spirit of this premise: website owners who allow users to generate content are usually mere conduits of expression, so website owners tend to be protected from defamation and copyright infringement suits related to third-party content.
DMCA §512 extended protections to service providers against copyright infringement liability by establishing “safe harbors,” as long as the provider maintains effective notice-and-takedown procedures and has no knowledge of the infringing material.
Using a Moderator Throws a Wrench Into the Works
This line between who is a publisher and who is a “mere conduit of expression” seems neat and tidy on paper, but the internet by nature is not neat and tidy. If your website uses a moderator to curate content, that moderator throws a wrench into the tidy classification.
What is noteworthy about the LiveJournal case is that the Ninth Circuit did not specifically differentiate between the acts of the paid moderator and the volunteer moderators, so the decision could be read to mean that even websites with all-volunteer moderators could be liable if the volunteers are deemed to be agents. The Ninth Circuit held that the acts of the moderators could be attributed to LiveJournal because it was not clear whether the photos were truly “posted at the direction of the user” or whether LiveJournal itself posted them.
Takeaways
So what is the takeaway? Courts may be more willing to chip away at §230 and DMCA §512 protections if a website uses a moderator, so website owners need to rethink their models. Since “moderator jurisprudence” is still developing, there are no black and white answers yet.
Here are a few points to consider:
- The difference between a “publisher” and a “mere conduit of expression” should be the guidepost a service provider uses: is the website like a telephone company that allows people to speak freely, or is the website more like a newspaper, where editors (maybe even a volunteer moderator) exercise some control?
- If the service provider chooses to allow a volunteer moderator to curate content, the company should carefully analyze its relationship with the moderator. Does a representative from the company communicate with the moderator, or does the moderator have free rein to “moderate” as they see fit?
- What do the website’s terms of use say about its moderators? Remember, courts will heavily analyze the terms during litigation. Terms that say nothing about moderators could be as damaging as terms that imply the moderators are agents or employees.
- Is the website’s moderator a paid employee? Service providers and website owners may want to analyze their contracts with paid moderators and decide whether the additional ad revenue is worth losing §230 and DMCA protections.
- If a service provider decides that having a moderator is worth the risk of losing §230 and DMCA protections, it is probably wise to invest in additional training for the moderators, even if that training consists merely of implementing new policies.
Regardless, anyone who runs a website with moderators and relies on §230 and DMCA protections, even if the moderators are volunteers, should review their current policies about moderators. Courts have not looked kindly on websites that use moderators lately, so a little investment now beats a lawsuit later. Agency theory may be a new arrow in the plaintiff’s attorney’s quiver, and website operators must be careful about how they interact with moderators.