This article was originally published in The Hill Times on March 6, 2017.

Facebook has been taking steps to limit fake news, but they have important shortcomings, writes Ellis Westwood.

On Dec. 28, 1917, readers of the New York Evening Mail learned a colourful new piece of American history. Writer H. L. Mencken claimed that Dec. 20, a date that had passed just days earlier, was an important but overlooked anniversary: the birth of the bathtub in the U.S.

The country’s first tub, he explained, had been installed by Mr. Adam Thompson of Cincinnati on Dec. 20, 1842. But, due to public safety concerns (the tubs were lined with lead), baths only became popular after President Millard Fillmore installed one in the White House in 1850.

Mencken’s article soon began to appear in other publications. Within a few years, it was being cited in academic journals and even on the floor of the U.S. Congress.

The trouble is, Mencken made the whole thing up. He confessed to the ruse eight years later, saying “my motive was simply to have some harmless fun in war days.” Mencken’s trick is still cited as one of the greatest news hoaxes of all time, and it’s perhaps one of the first modern examples of fake news.

We saw the power of fake news during the recent U.S. presidential election, where untruthful stories, deliberately created and amplified on social media platforms like Facebook, may have influenced people’s views and ultimately the election’s outcome.

BuzzFeed found that in the pivotal last three months of the campaign, the top-performing fake news election stories on Facebook generated more engagement than the top real stories from major news outlets, including The New York Times and The Washington Post.

Since then, Facebook has been taking steps to limit fake news.

One step is to allow users to report fake news articles on Facebook.

Another is third-party fact-checking. Facebook has partnered with the International Fact-Checking Network (IFCN) to review and fact-check stories. When users suspect a story of being fake, they can flag it. Flagged stories are then reviewed by third-party fact-checkers, often journalists, who belong to the IFCN.

If confirmed as false, the story gets a “disputed” tag. The tag will also harm the story’s score in Facebook’s algorithm, so fewer people will see it in their news feeds.

These measures are welcome first steps from Facebook. But, they have important shortcomings.

First, users can still share fake news. Even articles tagged as “disputed” aren’t removed by Facebook, so they continue to circulate.

Second, there’s a delay between when fake news articles start to trend and when they are tagged as disputed. During that time, the story will continue to spread. This could heavily impact debates, announcements, and elections.

Third, fact-checkers will only tag articles as disputed if they contain clear-cut falsehoods. Misleading articles will remain untouched as long as they have some basis in reality.

There’s also the risk that flagging articles as disputed becomes just another tactic for political parties or online trolls.

And, what about meta cases, like President Donald Trump labelling accurate reporting as fake news?

Finally, fact-checking itself is perceived differently depending on a person’s political views. According to research by the American Press Institute, fact-checking is viewed more favourably by liberals than by conservatives. A “disputed” tag might even make conservatives more likely to share a piece of fake news.

What does this mean for you?

If your organization—a political party, government department, or office supporting an elected official—is the victim of a fake news campaign, you have to implement rapid counter-measures. First, you need quick-reaction social media tracking and analytics to identify inaccurate, hostile content. Then, you need to mobilize your social media communities (supporters, followers) quickly to dispute the fake news before it trends. That’s playing good defence.

But, there’s also a role for good offence. Use social media to tell your story better and to more people. Do that by improving the quality of your social content, using richer storylines, and making paid plays to amplify it to target audiences.

Fake news probably isn’t going to disappear any time soon. As the Mencken hoax shows, people are drawn to compelling stories, even ones they suspect might not be true.