Google is the largest search engine in the world. In the modern business landscape, performing well within Google’s search engine results is crucial for online success. On average, there are 3.5 billion searches made on Google every day, which highlights the huge opportunity for businesses to attract new customers.
During 2019, Google’s global market share averaged 79%. By comparison, the next largest search engine, Bing, averaged a 12% global market share over the same period.
Search Engine Market Shares – Feb 2019 to Feb 2020
Source – www.netmarketshare.com
It’s widely accepted that Google releases multiple algorithm updates almost every day. The more notable updates, however, are referred to as “core updates”. These are typically more significant in nature and cause more turbulence in search engine results pages (SERPs), so it is important to monitor your website’s performance when they are released.
The aim of this guide is to teach you how to understand whether your website has been hit by an algorithm update, the steps you should take to recover, and how to build strong foundations for your website based on best practices.
- What Are Algorithm Updates?
- Google Panda
- Google Penguin
- Google Hummingbird
- Mobile First Indexing
- How To Tell If You’ve Been Hit By A Google Core Algorithm Update
- How To Recover From A Google Algorithm Update
- Technical Considerations
- Content Considerations
- Off-Page Link Considerations
- On-Page Link Considerations
- What to Avoid
- What Recovery Looks Like
- Conclusion
What Are Algorithm Updates?
An algorithm update is a change to Google’s core ranking algorithms, adjusting the way they rank web pages and deliver search results for the user. As there are many different ranking factors in use and specific information regarding updates is often withheld, it can be difficult to prepare for and adjust to these updates.
Google is constantly making changes to its algorithm to ensure that users are given the most relevant search results for their query. The largest updates are called “broad core algorithm updates”. These can negatively affect a website’s rankings if it does not follow best practices, as outlined in the Google Webmaster Guidelines.
This website was hit by the March Broad Core Algorithm update in 2019.
Whilst most broad core algorithm updates don’t target specific issues, some past updates have focused on key areas that were degrading the quality of search results.
Some of the most notable algorithm updates over the last decade include:
Google Panda
The Google Panda update was released in 2011 and focused on content quality. The update targeted low quality websites, such as content farms and spam websites. Google wanted to serve users the most relevant search results for their search term, so they stopped passing value to these low-quality websites.
Websites that recovered from the Google Panda update did so by updating the content across the site, removing annoying “above the fold” advertisements and improving the overall user experience. This should be the standard for every website, so ensure that the content on your site is of a high quality.
Google Penguin
The first Google Penguin update was released in 2012 as an extension of the Google Panda update, and targeted poor link building practices. Purchasing links towards your website used to be standard practice, as more links increased your PageRank and therefore your perceived authority. However, this was abused by spam websites in a number of ways and Google needed to address this to improve the overall quality of their results.
The Penguin update meant that Google gained a better understanding of low-quality links and was able to penalise websites that participated in these practices. Natural and relevant links were rewarded, whilst websites with low-quality or paid-for links quickly lost traffic and rankings.
This client was positively affected by the Google Penguin update in 2016.
Following Penguin 1.1, Google released a further five iterations of the algorithm, culminating in Penguin 4.0 in September 2016, when it became part of the core ranking algorithm.
Google Hummingbird
The Google Hummingbird update was released in 2013, and focused on user intent and natural language queries. The update did not result in huge fluctuations within the search engine results pages (SERPs) and was announced one month after its release.
Hummingbird introduced semantic search, in which Google is able to understand the wider context surrounding a keyword. The update was also understood as a change to Google’s Knowledge Graph – a set of SERP features on the right-hand side of the search results which provides the user with key facts about a person or topic. The Knowledge Graph panel displayed depends upon the intent behind the search query.
Mobile First Indexing
In March 2021, Google switched almost all sites to Mobile First Indexing in response to the changing ways that we search. In 2021, mobile devices generated more than half of all online traffic – 54.8% of total website traffic – highlighting the need for a great experience for users on mobile devices.
While Mobile First Indexing has been gradually implemented since its announcement in 2016 (and officially launched in March 2018), some sites are still not quite ready for Mobile-First Indexing. However, for most sites, Google now crawls and indexes the mobile version of content rather than the desktop versions. Get started with Mobile First Indexing.
How To Tell If You’ve Been Hit By A Google Core Algorithm Update
This website was hit by the BERT update in October 2019, but is now recovering.
Although Google’s core algorithm updates can result in positive changes for many websites, the opposite is true for those whose websites were not optimised in line with the Webmaster Guidelines.
Once you find out that a core algorithm update has been released, it is important to monitor your website’s performance. You can use Google Analytics and Google Search Console to highlight any sudden changes. If you work in a digital marketing agency, you may also have access to industry-leading ranking tools, such as Ahrefs, STAT and Searchmetrics. These can help you better understand and pinpoint any further losses in organic visibility and keyword rankings.
If you notice changes in your visibility and rankings for specific keywords, however large or small, you should check the competitive landscape. It’s likely that some of your competitors have been rewarded for the quality of their content, therefore improving their position for these keywords and resulting in a drop in your position. From here, you can begin to identify key areas for improvement. How often are they writing blog posts and producing quality content that’s cited by credible sources? Do they have extensive guides that offer value to the user? Review your competitors closely and make a note of any gaps in your content.
If you notice sharp decreases across your site as a whole, a full audit should be conducted to identify why your website has been hit by the algorithm update. There could be a combination of reasons, such as your backlink profile, internal linking structure, or the overall quality of content across your website. The next section covers the technical checks that should be conducted, as well as best practices for content across your website.
Whilst it is important to monitor your performance after an algorithm update, you should avoid “knee jerk” reactions to any sudden changes. Your performance may recover as the update is rolled out completely – this can often take a few days. After a week or so, if you are still seeing a loss in traffic and/or rankings, you should conduct a full website audit and create a strategy based on your initial findings.
How To Recover From A Google Algorithm Update
It is impossible to predict when Google is going to make a change to the algorithm or roll out a core algorithm update, and therefore you can’t prepare for it beyond ensuring your website adheres to Google’s webmaster guidelines.
When asked why specific sites gain or lose rankings after an update, John Mueller, a Senior Webmaster Trends Analyst at Google, said:
“It’s essentially just saying, from the way that we determine which pages are relevant, we’ve kind of changed our calculations and found other pages that we think are more relevant.
So it’s not a matter of doing anything wrong and then you fix it (and then Google recognizes it and shows it properly).
…More it’s a matter of well we didn’t think these pages were as relevant as they originally were. And these kinds of changes can happen over time.”
So there you have it. There is no way to prepare for an algorithm update; however, you can make sure that your content is relevant and of a high quality to serve your customers’ exact needs, and that your website has strong technical foundations.
If your performance is affected negatively by an update, your priorities for an initial recovery should be conducting technical checks and improving the quality of your content, to ensure that it is relevant and valuable for the user. We have outlined some of the changes you can make to your website to ensure that it’s in line with Google’s Webmaster Guidelines.
Technical Considerations
Health Check
If you’ve been negatively affected by an algorithm update, you should start with an overall health check of your website. There are many tools you can use for this. Use Google Search Console to check for any manual actions, crawlability and/or indexing issues. Google Analytics can help you to identify any drops in traffic or revenue and the specific areas of your site which have seen the largest decrease.
You can run a crawl on your website using tools such as Screaming Frog or DeepCrawl, to better understand how Google is viewing your website. From this data, you will be able to identify patterns and themes, and hopefully gain further insight into why your performance has declined.
Once you have identified any technical issues, you can put these into a roadmap, prioritised by impact and resource. Work through these fixes and track your progress. It is important to develop a working relationship with your website developers, as there are some elements that may require a developer to implement. Within your roadmap, you can assign responsibility to each technical fix, to ensure that both parties understand what needs to be completed.
Improve Page Speed
If your website is too slow to load, it’s likely to affect the way users interact with your site, causing them to bounce, fail to convert and look elsewhere. It can also cost users money if they are browsing on mobile data.
In such a competitive landscape, keeping users on your website is vital for business. You can use PageSpeed Insights or Lighthouse to get a score out of 100 for your desktop & mobile page speed, and receive feedback as to how you can improve your score.
Unused JavaScript and CSS can also slow down your page load time. This redundant code should be removed to ensure a faster load.
Remember to test a variety of pages across your website, and not just your homepage. Check your product and category pages, as this content will be different from your homepage and may take longer to load.
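If you want to test pages in bulk, the PageSpeed Insights API exposes the same Lighthouse data programmatically. Below is a minimal Python sketch; the example.com URLs are placeholders for your own pages, and for regular use you would add a Google API key to the request parameters.

# A minimal sketch for checking page speed in bulk via the
# PageSpeed Insights v5 API. The URLs below are placeholders -
# swap in your own homepage, category and product pages.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

pages = [
    "https://www.example.com/",
    "https://www.example.com/category/",
    "https://www.example.com/category/product/",
]

for page in pages:
    for strategy in ("mobile", "desktop"):
        response = requests.get(
            PSI_ENDPOINT,
            params={"url": page, "strategy": strategy},
            timeout=60,
        )
        data = response.json()
        # Lighthouse reports performance as 0-1; multiply for a /100 score.
        score = data["lighthouseResult"]["categories"]["performance"]["score"]
        print(f"{page} ({strategy}): {score * 100:.0f}/100")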
Analyse Local SERPs
If you have seen a decrease in rankings, it’s important to analyse the local SERPs for your most valuable keywords. Your local competitors may be ahead of you in terms of optimisation and content. Check their targeting and Google My Business panel – what are they doing differently and how could you improve in these areas?
Mobile First
It cannot be emphasised enough how important it is for your website to be optimised for mobile. Luckily, webmasters at Google have released a helpful guide that is continuously updated. With the development of Core Web Vitals alongside Mobile First Indexing, tools such as Google’s Lighthouse and the Mobile-Friendly Test can help you regularly check your website and identify improvements for both mobile and desktop.
Improve or Remove Low Quality Content
If you find thin or low quality content when completing your technical checks, it may be worth improving these pages or consolidating them into more detailed content. More drastic measures include no-indexing such pages, or removing them and redirecting their URLs, to make sure they’re not holding back your website’s performance.
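As a quick illustration, no-indexing a thin page is usually a one-line change: a robots meta tag in the page’s <head> tells Google to drop the page from its index while leaving it accessible to users.

<head>
  <meta name="robots" content="noindex">
</head>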
This step was recommended by John Mueller in a Google Webmaster chat. If you feel that the content could be improved, there are ways you can optimise it. Keep reading for our content optimisation tips.
Content Considerations
When optimising your content, you should consider the Search Quality Evaluator Guidelines, which are used by Google’s third-party search quality raters. The responses from these raters help Google to evaluate the impact of changes to its algorithms.
The guidelines discuss the characteristics of a high quality website, along with things that should be avoided. Low quality websites are often lacking in Expertise, Authority and Trust, which are important signals for the user when looking for information on the web.
E-A-T and the Importance of Understanding Google’s Quality Rater Guidelines
E-A-T stands for:
- Expertise
- Authority
- Trust
Google has highlighted E-A-T as an “important quality characteristic” within their quality rater guidelines. There is no scoring system for E-A-T, as it is a concept which should be considered across every aspect of your website.
The content on your website should be in-depth and relevant to your product or service offering. Informational guides, blog posts, reports and whitepapers will help to establish you as an expert within your industry and provide valuable information for the user. You should always cite and link to relevant sources where possible, ensuring that the websites you are linking to offer quality content for their users.
This is especially important if your business is within the “your money your life” (YMYL) sector, and your advice “could potentially impact a person’s future happiness, health, financial stability, or safety.” Our Head of Digital PR, Laura Hampton, has written about this in a blog post.
A link profile containing backlinks from relevant industry sources can also help to improve the E-A-T of your website. Google will understand that your website is part of the wider industry and respected as an authoritative source.
If your business is within the Financial, Medical or Legal sector, E-A-T should be prominent across your website. However, every business should take E-A-T into consideration. Follow our advice to ensure that your website showcases Expertise, Authority and Trust, to both Google and the search user.
We have highlighted some key areas to improve the quality of content on your website, with E-A-T in mind.
Showcase Awards & Achievements
Awards and achievements should be showcased across your website, as they highlight your level of expertise. These could be embedded in your header, footer, or on your homepage. Take the Impression website, for example – our awards feature on our homepage, and we have a dedicated awards page.
Add Reviews To Your Website
Users rely on reviews to give them further insight into a product or service. If there are no reviews on your website, users may not trust you enough to convert. There are many review platforms that you can set up for your business, and these reviews can then be pulled through onto your website.
Though company-focused reviews are recommended, we also advise generating and showcasing product and/or service-related reviews too. Through structured data, you can then mark up this content to secure star ratings in the SERPs and increase your organic real estate.
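As an illustrative sketch of what that markup can look like, here is a minimal schema.org snippet in JSON-LD. The product name and rating values are placeholders, and you should validate your own markup with Google’s Rich Results Test before relying on it.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "89"
  }
}
</script>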
Implement Author Bios
Providing the user with an author biography can help them to understand whether the author is an expert on the topic. The biography should include the author’s name, any titles, and a brief overview of their experience within the industry.
Author bios are an important E-A-T consideration. Our SEO Executive Liv-Mae discussed the importance of adding author bios to your content in this helpful blog post.
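Alongside the visible biography, you can reinforce authorship in your article markup. The snippet below is an illustrative sketch using schema.org’s Article and Person types – the headline, name, job title and URL are all placeholders.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "author": {
    "@type": "Person",
    "name": "Jane Author",
    "jobTitle": "Senior Industry Specialist",
    "url": "https://www.example.com/authors/jane-author/"
  }
}
</script>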
Articles Reviewed By An Expert
If you do not have experts within your business to write the content, it can be reviewed and fact-checked by an expert. The importance of the author’s expertise is highlighted in the search quality rater guidelines, as low-quality content is often identified by a lack of an authoritative author.
Medical News Today are great at ensuring their content is fact checked by an expert, and explicitly state this in each article.
Source – www.medicalnewstoday.com
Improve Heading Structure
Improving the heading structure of your content is important for readability and accessibility. The heading tags <h1> up to <h6> split up your content into sections, making it easier to read, understand and digest. It is especially important to follow the recommended heading structure so that users with screen readers can understand and navigate through your content.
Recommended heading structure:

<h1> This is your main heading </h1>
  <h2> This is your subheading </h2>
    <h3> Example 1 </h3>
    <h3> Example 2 </h3>
    <h3> Example 3 </h3>
  <h2> This is your next subheading </h2>
Add Alt-Text to Images
You should optimise the images on your website with alt-text – a short description of the image. This provides context to Google, and helps users with screen readers to understand what the image is of. This can be updated within your content management system (CMS), e.g. WordPress, by editing the image and changing the “Alt Text” field.
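In the underlying HTML, alt-text is simply the alt attribute on the image tag – the file name and description below are placeholders:

<img src="/images/red-running-shoes.jpg" alt="Pair of red running shoes on a white background">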
Improve Content Quality
Poor-quality or uninformative content alone can cause a page to lose performance. Ensure your content provides valuable information on the target topic – review the top-performing pages in your target SERP to compare against – and look to optimise, remove or combine content that is performing poorly.
Link Considerations
When we think of links for SEO, our minds often jump to link building (the process of acquiring hyperlinks from other websites to point to our own), and for good reason. Backlinks are still a powerful mechanism for improving organic rankings, and their importance in SEO success cannot be overstated. However, the internal linking structure of your website is also important for building authority and improving rankings for individual landing pages.
This all stems back to a ranking factor, created by Google founders Larry Page and Sergey Brin, named “PageRank”. The PageRank algorithm was the backbone of Google’s original ranking algorithm and was the formula that differentiated its product from other search engines at the time, such as Yahoo and AltaVista.
PageRank measures how important a webpage is by calculating the following:
- The number and the quality of links pointing towards it
- The number of outbound links that page contains
- The PageRank of each given page in the “network”, i.e. the World Wide Web
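For reference, these three factors come together in the simplified formula from Page and Brin’s original paper:

PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}

where T_1 to T_n are the pages linking to page A, C(T_i) is the number of outbound links on page T_i, and d is a damping factor, usually set to 0.85.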
Before 2016, SEOs could see the PageRank of a page using Google’s own PageRank toolbar.
However, SEOs soon took advantage of this ranking factor, reducing the practice of link building to a simple numbers game and acquiring links through any means possible. The more links they could acquire, the greater their website’s ranking potential.
The shortfalls of the original PageRank algorithm were exposed and Google had to take swift action to improve the quality of their ranking results and ensure their ranking algorithms weren’t exploited.
They eventually removed the PageRank Toolbar from public visibility, introduced the “nofollow” attribute and released the Penguin update (and multiple updates since), to combat bad link practices. However, PageRank is still a part of Google’s algorithm (though it’s likely to be a more sophisticated iteration), and it is important not to overlook this. It takes into consideration both external and internal links to a webpage.
If you have been hit by an algorithm update, it’s important to audit both your external and internal links.
Off-Page Link Considerations
Disavow Links Tool
Google’s Disavow Links tool is part of the Search Console platform and allows you to tell Google to ignore certain backlinks. Its purpose is to remove your association with negative backlinks that are out of your control. The definition of a ‘bad backlink’ is of course subjective, but if a link falls under one of the following categories, you should consider submitting it within a disavow file:
- Spammy directory or forum link
- Paid link
- Link from a hacked website
- Link that coincides with a sudden, unexplained increase in referring domains (a possible negative SEO attack)
The disavow file is a text file containing the URLs and domains that you no longer want to be associated with. These should be added with caution, after a careful review of your backlink profile. It is worth noting that Google treats the disavow file as a suggestion rather than a directive, but will typically act on it and ignore the backlinks listed.
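For illustration, a disavow file follows a simple documented format: one URL or domain per line, with optional comment lines starting with “#”. The domains below are placeholders.

# Spammy directory links - removal requested, no response
domain:spammy-directory.example.com
# Individual paid link
https://paid-links.example.net/our-sponsors/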
Google will first have to recrawl and reindex the URLs that you have disavowed, which could take multiple weeks. You should log the date of the disavow submission within Google Analytics, as well as any other tools you use to track performance. For further information on submitting a disavow file, you can read Google’s webmaster blog post.
Anchor Text
Anchor text is the written text that contains a hyperlink. It is an important SEO consideration, as Google began to review anchor text profiles after the Penguin update. If the anchor text pointing towards a page is not varied enough, it can suggest unnatural link practices.
Whilst you have limited control over the anchor text used to link to your website, you can ensure it is varied when writing guest posts or sharing press releases with a journalist. You can also improve the anchor text on your internal links, ensuring that the text is relevant and varied.
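As a quick sketch of the difference (the URL and wording are illustrative), descriptive anchors read naturally, whilst repeated exact-match anchors look manipulative at scale:

<!-- Natural, descriptive anchor text -->
<a href="https://www.example.com/guides/site-migration/">our step-by-step site migration guide</a>

<!-- Exact-match anchor text - suspicious when repeated across many referring domains -->
<a href="https://www.example.com/guides/site-migration/">cheap site migration services</a>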
Backlink Reclamation
Backlink reclamation is the act of fixing lost or broken links towards your website. These links may have broken due to a URL changing or being removed from the website. You can identify broken backlinks using tools such as Ahrefs, within the “broken backlinks” section.
The process of reclaiming your broken backlinks involves contacting the webmaster or marketing manager in charge of the referring domain. Reach out to this contact and politely ask for the link to be updated. Some may not respond, but it’s worth the effort for those who do. This is also an ideal moment to suggest collaborating on a new blog post. If the site owner doesn’t respond, simply 301 redirect the broken URL to a live destination to reclaim the lost equity and authority.
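If your site runs on an Apache server, for example, that 301 redirect can be a one-line rule in your .htaccess file – the paths below are placeholders, and other servers and CMSs have their own equivalents:

Redirect 301 /old-broken-page/ https://www.example.com/live-destination/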
On-Page Link Considerations
Site Architecture
Site architecture is a very important consideration for your website, as it will dictate which pages have the highest PageRank. The crawler will navigate through your site via links before the indexer determines which pages are the most important.
Your core pages, such as service or category pages, should be one level deep. Secondary pages, such as product pages or case studies, should be nested under these. As a rule of thumb, important landing pages should be no more than three levels deep in an effective site architecture.
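As an illustrative sketch of this hierarchy (the URLs are hypothetical), “levels deep” refers to the number of clicks needed to reach a page from the homepage:

Level 0: example.com/                        (homepage)
Level 1: example.com/trainers/               (category page)
Level 2: example.com/trainers/red-runner/    (product page)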
If you need further information on how to plan your site structure, our Director, Aaron Dicks, has written an in-depth blog post on how to engineer your website architecture for great SEO.
Breadcrumbs
Breadcrumbs are a series of links, often nested at the top of a webpage, which track the user’s journey to a page. This allows them to navigate back to a category page or the homepage.
They are not only good for usability, but also for crawlability and link equity. They allow Google to gain context about deeper pages on the site.
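You can also reinforce breadcrumbs with schema.org’s BreadcrumbList markup, which makes the trail eligible to display in search results. The sketch below uses placeholder pages and should be validated with Google’s Rich Results Test:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Trainers", "item": "https://www.example.com/trainers/" },
    { "@type": "ListItem", "position": 3, "name": "Red Runner" }
  ]
}
</script>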
Orphaned Pages
We know that Google uses links to discover new pages. If pages on your website are “orphaned”, they aren’t linked to from anywhere else on your website, meaning that Google may struggle to crawl and index them. This is a larger problem if important pages are left unlinked. You can identify orphaned pages using crawling software such as DeepCrawl or Screaming Frog.
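If you prefer a scripted check, the rough Python sketch below compares the URLs listed in your XML sitemap against the URLs your crawler discovered by following internal links – anything in the sitemap but not in the crawl is potentially orphaned. The sitemap URL and export file name are placeholders.

# A rough sketch for surfacing potentially orphaned pages: compare the
# URLs in your XML sitemap against the URLs your crawler actually found
# by following internal links. Export "crawled_urls.txt" (one URL per
# line) from your crawling tool.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NAMESPACE = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# URLs the site says exist, according to the sitemap.
sitemap_xml = requests.get(SITEMAP_URL, timeout=30).text
root = ET.fromstring(sitemap_xml)
sitemap_urls = {loc.text.strip() for loc in root.iter(f"{NAMESPACE}loc")}

# URLs discovered by following internal links.
with open("crawled_urls.txt") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

# Pages in the sitemap that no internal link points to.
for url in sorted(sitemap_urls - crawled_urls):
    print(url)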
Outbound Links
It is important to audit your outbound links, to ensure that they are linking to relevant and reputable sources. Linking out to reputable sources can add value to your content, for both Google and the user. Google will better understand your niche, and the user will trust you for adding further context to your content. Be sure that your outbound links are not linking to spammy or untrustworthy websites, as this could negatively affect your website.
Links to Non-200 Pages
As well as conducting backlink reclamation for external links, you should also regularly audit your internal links. Broken links on your website will result in a poor user experience and a loss of link equity.
You can identify broken links using a crawling tool such as Screaming Frog. Crawl your website and filter the response codes by “client error”. You can then click the “inlinks” tab across the bottom and identify any internal links pointing towards these broken pages. From here, you can update these links to point at live pages, redirect the broken URLs, or remove the links entirely.
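For ongoing monitoring between crawls, a short script can re-check the status codes of your internal link targets. The sketch below assumes a hypothetical two-column CSV export (source page, link target) from your crawling tool.

# A simple sketch for re-checking the status codes of internal link
# targets. "internal_links.csv" is a placeholder two-column export:
# source page, link target.
import csv
import requests

broken = []
with open("internal_links.csv", newline="") as f:
    for source, target in csv.reader(f):
        try:
            status = requests.head(target, allow_redirects=False, timeout=10).status_code
        except requests.RequestException:
            status = None
        # 4xx responses (or failed requests) indicate a broken link.
        if status is None or 400 <= status < 500:
            broken.append((source, target, status))

for source, target, status in broken:
    print(f"{source} -> {target} ({status})")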
Avoiding links to 404 pages is arguably the highest-priority task here, but really it’s about linking to indexable content wherever you can. Avoiding links to redirecting URLs, for example, also has a part to play. This is something Barry Adams has historically written about at some length, and more information on the reasons why can be found in Polemic Digital’s guide to PageRank.
What to Avoid
Intrusive Notifications – Popups/Modals
We’ve all visited a website with too many advertisements and found ourselves clicking the cross. Not only are these adverts annoying, but they can slow down the page speed and obscure the content on the page. We’re also inundated with notification popups, asking us to allow or block notifications from the site.
In January 2017, Google released the Intrusive Interstitial Penalty, which aims to help users to easily access the content on a webpage.
This penalty is still in place, which was clear to see after the Daily Mail’s 50% drop in traffic. They were hit by the June 2019 Broad Core Algorithm Update, which focused on improving many areas of E-A-T, and were speculated to have been affected in particular because of poor usability across the site.
Best Practice
You may still want to place display ads on your webpages, but how can you do so without affecting usability?
- Adding “advertisement” text above or below the advert serves as a mini trust signal, showing Google that you are being transparent with your users.
- Place the advert next to the article, rather than within it, so that you don’t obstruct the text.
- Allow users some time on the site before asking them to sign up for email marketing – if they’ve not visited your site before, it’s likely they’d want to learn more about your brand first.
You should also clearly inform users when a piece of content is sponsored, to ensure that you’re following ASA guidelines.
Thin Content
You should avoid thin content on your website as, the majority of the time, it adds no value for the user and can even negatively impact other content on-site. This also includes duplicate content that provides no additional information. There is a balance between writing concisely for readability and not writing enough: ensure that each piece of content answers the potential search query, gives valuable information, and reads well.
“Salesy” Language
Not every blog post has to be tied to your product offering. If you’re only writing a blog post or guide to try and upsell something, it will be pretty obvious. Search users can see through this and may bounce off the page if the content is not written with their query in mind. Instead, write informative content that shows you’re an expert in the area and offers valuable insight or opinion that the user cannot get elsewhere.
Bad Link Practices
Bad link practices include:
- Buying links back to your website
- Irrelevant guest posts linking back to your website
- Link stuffing – multiple links in one paragraph
- Anchor text manipulation – where linking sites use purposefully placed exact match keywords to describe the nature of the link.
Although Google Penguin was released to target spammy link practices, they still frequently occur. With Penguin now part of the core ranking algorithm, your website can still be affected by them – though the effects may not be as marked as they once were. Instead, look out for gradual decreases in visibility over time if you feel your site is being affected by Penguin. In any case, it’s important to invest in sustainable link building practices to avoid this happening in the first place.
What Recovery Looks Like
If you have been hit by a Google algorithm update, it’s important not to panic and make any immediate changes. Monitor the situation over the next couple of days/weeks to see whether fluctuations appear momentary or longer-term. Take this opportunity to collect data and note any areas of the site that seem to be performing the worst.
Recovering from an algorithm update, especially if your website was badly affected, can take time and resource. Recovery may be slow and it could be a while before your website returns to previous performance levels. However, it’s important not to make any knee-jerk changes as data has shown that badly affected websites can recover after the next algorithm update – as long as you commit to an SEO strategy that offers the best possible experience for your users.
If your performance doesn’t recover to previous levels, you should first check whether this is a site-wide issue or whether just a few pages have been affected. If only a few pages are affected, it’s likely that your website has suffered within key SERPs. Check to see how your competitors have fared and make a note of anything they are doing differently that could be benefiting them. The key is to at least match their depth of content, user experience and brand awareness, if not surpass it.
If your performance has decreased site-wide, this is when you should complete a full audit. Break this down into three sections – Technical, Content and Link Practices. Once you have identified key issues, you can create your strategy. Prioritise tasks by impact and resource to ensure that you are fixing the problems which will have the greatest impact first.
Conclusion
There is no “quick fix” if your website has been hit by a Google core algorithm update. Optimising the content across your website and following advice from Google’s Quality Rater Guidelines should be a part of your ongoing SEO strategy. You should be aiming to provide your users with informative content, and ensuring that your website is seen as a trustworthy authority within your industry.
If you’re looking for advice related to a specific update, our internal Algorithm Committee keeps on top of any changes to Google’s algorithm. You can find their algorithm roundups on our blog.