When you go online and write a comment that's nasty or defamatory about a person or even a business, you risk being sued for defamation.
But if someone else writes a nasty comment about a person on your public social media page, can you (as the page owner) be held liable for that comment even though you didn't write it?
A recent decision of Australia’s High Court determined that media companies are liable for the defamatory comments made on news stories on their public Facebook pages.
That is, media outlets will now be held accountable for the comments of random people on their public forums. But what's the reason behind it, what made the Australian High Court pass this order, and what are its implications now? Read on to find out.
- The Background
- Final Verdict With Respect To Liability Of Comments
- Some Other Verdicts
- Implications For Businesses
- How To Moderate Comments Using Statusbrew
You can directly jump to a section of your choice or keep scrolling.
The Background
Most media outlets maintain public Facebook pages to drive traffic to their websites. They also aim to increase the engagement and popularity of their Facebook profiles and pages through "likes," "comments," and "shares" on individual posts.
The increase in page engagement and readership results in increased advertising revenue from both the Facebook pages and the media outlets' websites.
In 2019, Dylan Voller filed a claim in the Supreme Court of NSW against three media titans—Fairfax Media Publications, Nationwide News (a subsidiary of News Corp), and Australian News Channel (Sky News). Mr. Voller claimed that they were responsible for defaming him via the comments made on videos of him that were posted on their Facebook pages. No complaints were made about the posts themselves.
Mr. Voller found his way into the public eye after his mistreatment in a Northern Territory detention center that was featured on ABC's Four Corners (Australia's leading investigative journalism program). Outlining the mistreatment, Voller participated in several interviews. And in doing so, he soon became an advocate for youth detention reform.
After the story was aired, a federal investigation into the youths' treatment within detention centers was announced by the prime minister. Different news outlets started to report the issue, thus amplifying the unraveling controversy.
In this reporting process, Facebook accounts that were managed and owned by Fairfax, Nationwide, and Sky posted a video of Voller within the detention center in a compromising state.
A variety of Facebook users commented on this video. While some expressed support for Voller's plea to reform juvenile detention, others accused him of an array of violent crimes, perhaps implicitly attempting to justify his treatment within the detention center. Some of the comments posted on the Facebook pages made allegations that he claims are false and defamatory.
Voller then sued Fairfax, Nationwide, and Sky, alleging that they were responsible for the defamatory comments made about him because they owned and managed the pages on which the comments were published.
However, due to the sheer volume of content on their Facebook pages, the media organizations had no knowledge of these comments. They were not given an opportunity to take them down before proceedings were commenced.
In those circumstances, the media organizations did not consider themselves publishers of the third-party comments and applied to the Court to determine the issue of publication as a preliminary question, ahead of the substantive defamation case.
At the time the comments were published, Facebook did not give page moderators the option of turning off comments on posts; it has since changed that.
Final Verdict With Respect To Liability Of Comments
The final verdict of the Australian High Court runs counter to how virtually everyone thinks about liability for content on the internet. The question before the High Court was the definition of "publisher," which is not easily defined in Australian law.
> High Court 5-2 in the Voller case has held that media publishers are liable for defamatory third party comments posted on their social media pages, because by facilitating and encouraging the comments they assist in their publication. Implications: huge. — marquelawyers (@marquelawyers) September 8, 2021
The primary judge concluded, based on the evidence presented by the media outlets, that they were aware that some of the posts were made to invoke engagement from the public. They were also likely to "excite" comments that may be defamatory.
The media outlets contended that for them to have "published" those third-party comments, they must have had knowledge of the intent of the defamatory comments to be conveyed, which they did not.
The court ultimately found that, by creating a public Facebook page and posting content on it, the media outlets had encouraged and assisted the publication of comments on their content from third-party Facebook users. Therefore, they were held to be the publishers of those comments.
In other words, ignoring defamatory content on your public page means that you, as the page owner, bear responsibility for its publication.
The Judge's ruling was driven by findings that the media outlets had the means to monitor third-party comments for defamatory content before releasing them to the general readership, and to delay their publication.
The news outlets were not found to be mere conduits of the comments.
But what about comments that were unrelated to the subject matter of the original post? Would media outlets be liable for those as well? The short answer is yes.
In short, the effect of the decision means that the owner of a public social media account or page (be it an individual, association, or business) is a potential publisher of every comment made by anyone on their page.
You might be wondering: under defamation law, can the person who posted the comment also be held responsible for their comment?
The answer is yes. But from the plaintiff's perspective, it might not be worth going after the individual social media user or troll. A plaintiff is more likely to go after the media company itself as the publisher, given its deeper pockets.
The result of this decision is that the position in Australia has now become different from other Western democracies, including the United Kingdom, the United States, and New Zealand. In the most similar case, the New Zealand Court of Appeal ruled that an individual internet user who was the administrator or operator of a private Facebook page and who had no "actual knowledge" of the third-party comments posted on their page was not liable for defamation.
Following the ruling, CNN, owned by AT&T Inc, pulled its Australian Facebook presence, becoming the first major news organization to do so. CNN will, however, continue to publish content on its own platforms in Australia. Schwartz Media (an Australian publishing house) and Peter Gutwein (an Australian politician who has served as the 46th premier of Tasmania) have also stopped allowing comments on their Facebook pages.
Facebook itself was not part of the court case. A company spokesperson said they looked forward to "greater clarity and certainty in this area." The spokesperson added that it was not Facebook's place to provide legal guidance to CNN, though it had given the network the latest information on the tools it makes available to help publishers manage comments on the platform.
Some Other Verdicts
One of the interesting things about the Mr. Voller case is that his legal team sued straight away. That means they didn't issue a concerns notice first (which is a legal letter sent to the organization alleged to have made the defamatory comments to give them a chance to respond).
Going forward, though, that wouldn't be allowed. Under the new defamation laws that came into effect this July in NSW, South Australia, Queensland, Victoria, and the ACT, plaintiffs must now serve a concerns notice on each defendant and wait at least a fortnight before suing.
The same law introduced a "serious harm threshold," under which the plaintiff must prove that the published comments caused serious harm to their reputation.
This clause aims to rule out trivial defamation cases. While it's true that anyone can cause serious harm to a person's reputation on social media, there is also a lot of banter that might be offensive without seriously damaging anyone's reputation. This clause is expected to give admins of social media pages some protection in the future.
In addition, social media platforms are required to establish a standardized complaints system. Such a system aims to ensure that defamatory remarks can be removed and trolls can be identified easily.
Social media organizations are also required to disclose trolls' identifying details to victims, without needing the trolls' consent, so that a defamation case can be lodged.
Implications For Businesses
The High Court's decision explicitly concerns Facebook, but it could potentially apply to other social media platforms such as Twitter or Instagram. The executive chairman of News Corp Australia, Michael Miller, told the Sydney Morning Herald that the court decision was significant for anyone maintaining a public social media page.
It also has extensive consequences for any individual or organization operating a public social media page, right from local sporting and community groups to small businesses and multinational corporations.
In effect, the High Court found that administrators and operators of public social media pages have a positive obligation to assess and monitor comments made by third-party users on their public Facebook page. They need to remove any comments that may be considered defamatory.
However, this could create an unreasonable burden for social media page operators depending on the number of users. In addition, the requirement for these administrators to self-assess whether a comment may be defamatory is quite problematic as it involves objective consideration of what can be a complex legal concept.
In today's tech-savvy world, along with several benefits associated with operating a public social media page, businesses should consider their capacity to adequately monitor third-party comments and remove any that may be regarded as "defamatory." In some cases, it may be in the business's best interests to disable the comments function on their page or a particularly controversial post rather than risk a defamation claim.
Of course, that severely limits engagement and essentially defeats the purpose of having a public social media page.
An alternative is to use an automated moderation tool to either approve comments or immediately remove those that may be considered defamatory. While that involves a significant cost, it is a worthwhile investment compared with the risk of being sued.
How To Moderate Comments Using Statusbrew
The risk of defamation claims will affect brands and organizations that use social media to drive engagement and that manage Facebook pages and events.
Now, if you are concerned about this risk, you can choose to turn off public comments on some or all of your social media pages. However, this move will cut down on the possible engagement you can gain from users and may not be ideal.
Facebook and Instagram allow you to automatically hide comments that feature certain keywords.
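As an illustration, the same keyword-hiding idea can be scripted against the Facebook Graph API, which lets a page hide a comment by POSTing `is_hidden=true` to the comment's node. This is a minimal sketch, not Facebook's or Statusbrew's actual implementation: the blocklist, the API version in the URL, and the access token are placeholder assumptions.

```python
import re

GRAPH_API = "https://graph.facebook.com/v17.0"  # assumed API version

def should_hide(comment_text, blocklist):
    """True if the comment contains any blocked keyword (case-insensitive, whole word)."""
    words = {w.lower() for w in re.findall(r"[\w']+", comment_text)}
    return any(term.lower() in words for term in blocklist)

def hide_comment_request(comment_id, page_token, hide=True):
    """Build (but do not send) the Graph API request that hides or unhides a comment."""
    return {
        "method": "POST",
        "url": f"{GRAPH_API}/{comment_id}",
        "params": {"is_hidden": "true" if hide else "false",
                   "access_token": page_token},
    }

# Example: flag comments against a placeholder blocklist.
blocklist = ["scammer", "fraud"]
print(should_hide("This guy is a total fraud", blocklist))  # True
print(should_hide("Great article, thanks!", blocklist))     # False
```

Hidden comments remain visible to their author and the page admins, so a keyword match hides the comment from the general audience without deleting it outright.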
Learn more: How To Hide Comments On Facebook & Instagram Ads, and How To Turn Off Comments On A Facebook Post.
A more optimal option is to moderate the comments you receive on your social pages. As of now, Facebook does not have an option that allows page owners to approve the comments before they are posted.
Statusbrew can help you create automation that auto-hides all incoming Facebook comments until they are reviewed and approved. Here's how it works:
- Teams create a rule that automatically hides all incoming Facebook comments.
- Next, a tag is auto-applied to each hidden comment, and the comment is auto-assigned to the community managers.
- Community managers will then filter out all the hidden conversations using the Tag filter in the Engage inbox.
- They can then review each conversation, unhide the appropriate ones, and respond where required.
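The hide-first, review-later workflow above can be modeled as a small in-memory sketch. This is purely illustrative of the rule logic (hide, tag, assign, filter, approve); it is not Statusbrew's actual rule engine, and the tag and assignee names are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    id: str
    text: str
    hidden: bool = False
    tags: set = field(default_factory=set)
    assignee: str = ""

def ingest(comment, queue):
    """Steps 1-2: auto-hide every incoming comment, tag it, and assign it."""
    comment.hidden = True
    comment.tags.add("pending-review")        # hypothetical tag name
    comment.assignee = "community-managers"   # hypothetical team name
    queue.append(comment)

def pending(queue):
    """Step 3: reviewers filter hidden conversations by the review tag."""
    return [c for c in queue if c.hidden and "pending-review" in c.tags]

def approve(comment):
    """Step 4: unhide a reviewed comment and clear its review tag."""
    comment.hidden = False
    comment.tags.discard("pending-review")

# Example run
queue = []
incoming = Comment(id="c1", text="Thanks for covering this story!")
ingest(incoming, queue)
print(len(pending(queue)))  # 1
approve(incoming)
print(len(pending(queue)))  # 0
```

The key property of the workflow is that no comment reaches the public feed until a human reviewer explicitly approves it, which is exactly the safeguard the Voller decision makes valuable.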
Statusbrew also allows you to configure Notifications for any custom rule in Rule Engine. You can even set up notifications to ensure that you are notified of conversations that are of importance to your brand.
Want to discuss further? Book a free demo or start your free trial today!
Statusbrew is an all-in-one social media management tool that supports Facebook, Instagram, Twitter, LinkedIn, YouTube, and even Google My Business.
Explore the Statusbrew range of social media tools