Does Facebook’s clickbait crackdown signal a move towards higher quality content across the web?

Posted by Rebecca Appleton on 10 Aug, 2016
Facebook has announced an important update to its News Feed, confirming that it will actively work to reduce the number of clickbait-style headlines appearing in users’ feeds.

You know the type – “A Woman Walked Into A Supermarket With Her Son… You Won’t BELIEVE What Happened Next.”

As part of its pledge to rid News Feed of clickbait, the social media platform has developed a system to categorize thousands of article headlines. It focuses on two criteria: whether the headline withholds the information necessary to understand the article’s content, and whether it exaggerates that content to create unrealistic expectations for the reader. A headline that does either will now be classed as clickbait. The new approach works like a spam filter on a large scale – posts falling foul of the basic requirements will be removed from News Feed. This is one of a number of changes Facebook has rolled out on News Feed as part of a move towards higher quality content on the social network (it recently said it would prioritize updates from friends and family for the same reason).
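
To make those two criteria concrete, here’s a minimal, purely illustrative sketch of a phrase-based filter. The phrase lists and the is_clickbait function are hypothetical assumptions of ours – Facebook hasn’t published how its categorization system actually works.

```python
# Illustrative sketch only: the phrase lists and scoring below are
# hypothetical, not Facebook's actual (unpublished) system.

# Phrases that withhold the information needed to understand the story
WITHHOLDING_PHRASES = [
    "you won't believe",
    "what happened next",
    "this one trick",
]

# Phrases that exaggerate content to create unrealistic expectations
EXAGGERATION_PHRASES = [
    "will change your life",
    "mind-blowing",
    "shocking",
]


def is_clickbait(headline: str) -> bool:
    """Flag a headline that matches either criterion Facebook describes:
    withholding information, or exaggerating the article's content."""
    text = headline.lower()
    withholds = any(phrase in text for phrase in WITHHOLDING_PHRASES)
    exaggerates = any(phrase in text for phrase in EXAGGERATION_PHRASES)
    return withholds or exaggerates


print(is_clickbait(
    "A Woman Walked Into A Supermarket With Her Son... "
    "You Won't BELIEVE What Happened Next."
))  # True
```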

Facebook will also bear social shares and likes in mind when deciding where a piece should appear in News Feed. Some clickbait – Buzzfeed or Upworthy-style content, for instance – is incredibly popular on the social network, with certain articles shared by hundreds of thousands of people despite their titles. Articles with a high click-through rate (CTR) but low engagement are a big red flag and could also be penalized under the new system.
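
As a rough illustration of that red flag, the sketch below flags articles whose shares and likes are tiny relative to their clicks. The thresholds and field names are assumptions made for the example, not Facebook’s published signals.

```python
# Illustrative "high clicks, low engagement" check. The thresholds
# (min_clicks, min_ratio) are hypothetical assumptions for this example.

def low_engagement_flag(clicks: int, shares: int, likes: int,
                        min_clicks: int = 10_000,
                        min_ratio: float = 0.01) -> bool:
    """Flag articles that attract many clicks but few shares or likes."""
    if clicks < min_clicks:
        return False  # not enough traffic to judge either way
    engagement_ratio = (shares + likes) / clicks
    return engagement_ratio < min_ratio


# An article clicked 50,000 times but shared/liked only 120 times
print(low_engagement_flag(clicks=50_000, shares=80, likes=40))  # True
```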

Google’s treatment of low-quality content

Google has long clamped down on low-quality content – perhaps this is why so many clickbait creators have turned to social media to boost their clicks. Google judges quality on a number of criteria. Pages that:
·         have little to no original content
·         use sneaky redirects
·         contain hidden links
·         feature scraped content, or
·         rely on automatically generated content

are likely to be deemed low-quality and treated as such when it comes to determining their search engine ranking.

There have long been rumors that Google uses a high bounce rate to judge whether content is low-quality. This makes sense, as most clickbait articles have an incredibly high bounce rate: users pulled in by a clever headline soon realize the page is not what they expected and leave right away. The theory is hotly debated, though, with many insisting that Google doesn’t have access to onsite data such as page bounce rates. Detractors also point out that some pages – a contact form, for example – are expected to have a high bounce rate, which makes the metric too unreliable to serve as a ranking factor.
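
For reference, bounce rate is conventionally the share of sessions that view only a single page. The sketch below computes it from a toy session log – it illustrates the metric itself, not anything Google measures.

```python
# Bounce rate = single-page sessions / total sessions.
# The session log here is invented purely for illustration.

sessions = [
    ["/clickbait-article"],             # bounce: one page only
    ["/home", "/pricing", "/contact"],  # engaged visit
    ["/clickbait-article"],             # bounce
    ["/blog", "/blog/post-1"],          # engaged visit
]

bounces = sum(1 for pages in sessions if len(pages) == 1)
bounce_rate = bounces / len(sessions)
print(f"Bounce rate: {bounce_rate:.0%}")  # Bounce rate: 50%
```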

At a Google Q&A hangout in March, Andrey Lipattsev, a senior search quality strategist at Google, said it would be difficult to make this data a ranking signal – which stands to reason, given that some websites don’t use Google Analytics and a high bounce rate is expected behaviour on some sites. Lipattsev explained why Google does not use bounce rate as a quality and ranking factor, saying, “The disadvantages that I’ve most often seen described for this approach on a clear, pure ranking factor basis is that we’d need to have broad enough and reliable enough data about bounce rates, click-through rates, depth of view for the vast majority of pages and the vast majority of websites everywhere, in order to be able to make meaningful comparisons all the time.

“That is impossible, because we don’t have the technical means to do it. Even when you think about Google Analytics, not everybody has a Google Analytics code by far, so we can’t use that.

“If we don’t use that, what else are we gonna use? Start trying to come up with something we could use, but it’s always going to be a struggle.”

A move towards higher quality across the board

Facebook’s crackdown on clickbait does signal a move towards better-quality content across the board, with the social network now encouraging publishers to create original, well-researched and informative content if they want a prominent position in News Feed. If other social networks – Twitter, for example – were to follow suit, clickbait could be stamped out across the web once and for all.

Interestingly, Facebook has come right out and said it will use bounce rates as a measure of quality. Google, as we saw above, says just the opposite, pointing to the difficulty of measuring this metric reliably.

What does this mean for your content?

The clear message is that clickbait is being outlawed. This means you’ll need to think carefully about titles and relevance from the outset. While it’s tempting to have an enticing headline which draws people in, you need to follow this up with powerful content that actually fulfils the promise.

·         Be clear, accurate, relevant and truthful in your headlines
·         Avoid sensationalism and misleading titles
·         Back up strong titles with equally strong content
·         Content should be original and unique
·         Don’t scrape web content or use content spinners to flesh out your articles
·         Focus on creating articles that are genuinely useful and entertaining, educational or shareworthy
