
27.04.2018

5 min read

Brighton SEO: Barry Adams – Technical SEO in the Mobile First Indexing Era

This article was updated on: 07.02.2022

We’ve all seen that Google is rolling out mobile-first indexing with a lot of experimentation. We think we’ve passed the mobile-friendly test and that’s it…but no. You need to be a bit more careful than that.

There are some key things you need to watch out for, even if you think you’ve ticked everything off.

Responsive design is Google’s recommended design pattern, but you still need to double-check everything! It’s very easy for problems to slip in when you’re not paying attention.

Start with the basics. Do you have the same metadata and body content in place? This is especially important to check on dynamically served and m. sites. It seems basic, but have you done a thorough, page-by-page check? No? Then you need to. We’re all professionals sitting behind laptops, and we very rarely navigate our own websites the way mobile users do. It’s easy to think that optimising your desktop site is enough, but the mobile site might behave differently, even if it’s responsive – CSS can still screw things up.
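To make that page-by-page check less tedious, you can script the basics. Here’s a minimal sketch in Python (using the requests and BeautifulSoup libraries; the URL and user-agent strings are just placeholders) that compares the title and meta description a page serves to desktop and mobile user agents:

```python
# Compare basic metadata between the desktop and mobile versions of a page.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (Linux; Android 12; Pixel 6) AppleWebKit/537.36 Mobile"

def get_metadata(url, user_agent):
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else None
    desc = soup.find("meta", attrs={"name": "description"})
    description = desc["content"].strip() if desc and desc.get("content") else None
    return {"title": title, "description": description}

url = "https://example.com/some-page"  # placeholder: swap in your own pages
desktop = get_metadata(url, DESKTOP_UA)
mobile = get_metadata(url, MOBILE_UA)
for field in ("title", "description"):
    status = "OK" if desktop[field] == mobile[field] else "MISMATCH"
    print(f"{field}: {status}\n  desktop: {desktop[field]}\n  mobile:  {mobile[field]}")
```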

One thing to look out for is hidden content. Are you hiding anything? Google has said that hidden content doesn’t count against you in the mobile index because it recognises that you have less space to work with. However, if you use any sort of JS to load this content, then there might still be an issue with how well Google can render your page. It can take weeks for JS content to be crawled and indexed. Check that everything behaves correctly and can be indexed straight away.
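A quick way to spot JS-dependent content is to check whether it appears in the raw HTML at all. A rough sketch (the URL and phrase are placeholders):

```python
# Check whether a key phrase appears in the raw HTML, i.e. before any
# JavaScript runs. If it only shows up after JS execution, indexing of
# that content may be delayed.
import requests

url = "https://example.com/some-page"        # placeholder
key_phrase = "free delivery on all orders"   # placeholder: text you expect indexed

raw_html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
if key_phrase.lower() in raw_html.lower():
    print("Found in raw HTML - indexable without JS rendering.")
else:
    print("NOT in raw HTML - probably injected by JavaScript; expect a delay.")
```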

Also look at structured data. Is it all there on your mobile site? All those nice rich snippets may look good on the desktop site, but Google might not find the data behind them when it crawls your site with a mobile bot.
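One way to sanity-check this is to pull the JSON-LD blocks from both versions of a page and compare the schema types. A sketch under the same assumptions as above (placeholder URL, simple user-agent switching):

```python
# Compare JSON-LD structured data served to desktop vs mobile user agents.
import json
import requests
from bs4 import BeautifulSoup

def jsonld_types(url, user_agent):
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    types = []
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError:
            continue  # malformed JSON-LD is itself worth flagging
        items = data if isinstance(data, list) else [data]
        types += [item.get("@type", "?") for item in items if isinstance(item, dict)]
    return sorted(types)

url = "https://example.com/product"  # placeholder
desktop = jsonld_types(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
mobile = jsonld_types(url, "Mozilla/5.0 (Linux; Android 12) Mobile")
print("desktop:", desktop)
print("mobile: ", mobile)
if desktop != mobile:
    print("Structured data differs between versions - investigate.")
```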

If hreflang tags are being used, check that they point to the correct version of each page.
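A small sketch for that check too – it lists each hreflang annotation on a page and flags any target that doesn’t return a 200 (placeholder URL; some servers don’t answer HEAD requests, so treat oddities with care):

```python
# List hreflang annotations and flag targets that don't respond with a 200.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/en/page"  # placeholder
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("link", rel="alternate", hreflang=True):
    target = link["href"]
    status = requests.head(target, allow_redirects=False, timeout=10).status_code
    flag = "" if status == 200 else "  <-- check this"
    print(f"{link['hreflang']}: {target} ({status}){flag}")
```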

What about overlays? Everyone hates them and they make for a negative user experience, especially on mobile. At a bare minimum, make sure the close button can actually be tapped. “There’s no fucking reason to have an overlay.” – Barry.

Next up is pagination. It’s easy to use infinite scrolling on mobile sites, but using JS for it makes things tricky for search engines. The mobile crawler renders down to around 16,000px, which is quite a lot, but it might not be enough for your site. Are you using the right tags to implement pagination? Check everything.
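One way to check is to look at what pagination signals exist in the raw HTML, without JS. A sketch (placeholder URL; rel next/prev was the recommended markup at the time of this talk):

```python
# Check a paginated listing for crawlable pagination signals in the raw HTML:
# rel="next"/"prev" link tags and plain <a href> links to further pages.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/category?page=1"  # placeholder
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for rel in ("next", "prev"):
    tag = soup.find("link", rel=rel)
    print(f'rel="{rel}":', tag["href"] if tag else "missing")

page_links = [a["href"] for a in soup.find_all("a", href=True) if "page=" in a["href"]]
print(f"{len(page_links)} crawlable pagination links, e.g.:", page_links[:5])
```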

Be wary of JS content. It will get indexed, but very slowly. It’s also bad for users. Client-side JS shifts the rendering work from your servers onto the user’s device, which Barry thinks is a bit contemptuous of users, especially those on older smartphones – and you need to think about them.

Remember also to look at your internal link structure. Sitebulb produces a really nice internal link map that visualises your site structure, and it lets you compare your desktop structure with your mobile structure. Look for discrepancies where your mobile site isn’t linking in the same way. DeepCrawl’s DeepRank metric is another good way to compare your site versions: crawl them both and export the URLs with their DeepRank scores. Where there’s a discrepancy, you won’t be sending the same amount of link value. If you screw up internal links, you might be sending value to the wrong pages and losing out on rankings. Pay particular attention to hamburger menus and other compact mobile menus, which can leave out critical links. Don’t ignore footer links either.
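If you don’t have those tools to hand, you can approximate the comparison yourself: fetch a page with a desktop and a mobile user agent and diff the internal link sets. A rough sketch (placeholder domain; big gaps usually point at hamburger menus or trimmed footers):

```python
# Compare the internal links exposed to desktop vs mobile user agents.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def internal_links(url, user_agent):
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    host = urlparse(url).netloc
    links = set()
    for a in soup.find_all("a", href=True):
        absolute = urljoin(url, a["href"]).split("#")[0]
        if urlparse(absolute).netloc == host:  # keep internal links only
            links.add(absolute)
    return links

url = "https://example.com/"  # placeholder
desktop = internal_links(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
mobile = internal_links(url, "Mozilla/5.0 (Linux; Android 12) Mobile")
print("Only on desktop:", sorted(desktop - mobile))
print("Only on mobile: ", sorted(mobile - desktop))
```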

This leads us to link prominence. The “reasonable surfer” patent describes a method Google could use to weight links by how likely a user is to click them. A link that is very likely to be clicked passes a lot of value; one that is unlikely to be clicked passes much less. Barry isn’t sure how important this will be for mobile, but thinks it’s likely to have some significance. Look at the flow of PageRank – how do people click through from your home page?

The thinking behind Google’s initial philosophy is still important today. There is a damping factor of around 0.85 applied at each level down. Barry thinks that subdomains have a higher damping factor versus a link that’s within one domain, which means internal links in subfolders are much more powerful than links from subdomains.
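As a rough illustration of why depth matters, here’s the textbook PageRank formulation (not something from the talk – just the classic formula the damping idea comes from):

```latex
% PR(A): value of page A; T_1..T_n are the pages linking to it,
% C(T_i) the number of outlinks on T_i, d the damping factor.
PR(A) = (1 - d) + d \sum_{i=1}^{n} \frac{PR(T_i)}{C(T_i)}, \qquad d \approx 0.85
% Each hop applies d again, so a page three clicks from the home page
% receives on the order of 0.85^3 \approx 0.61 of the value passed per hop.
```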

Now let’s look at site configuration. How is your robots.txt set up for mobile? This is especially important to look at if you have an m. site. Is your XML sitemap referenced in your mobile robots.txt file, and does it contain all the right URLs? You’d be surprised by all the things that can go wrong with a mobile sitemap.
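A quick script can cover the obvious failure modes: is a sitemap declared in the m. robots.txt at all, and does it parse? A sketch (hostnames are placeholders):

```python
# Fetch robots.txt for the m. host, find Sitemap directives, and count the
# <loc> entries in each declared sitemap.
import requests
from xml.etree import ElementTree

robots = requests.get("https://m.example.com/robots.txt", timeout=10).text
sitemaps = [line.split(":", 1)[1].strip()
            for line in robots.splitlines()
            if line.lower().startswith("sitemap:")]
print("Sitemaps declared:", sitemaps or "NONE - that's your first fix")

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for sitemap_url in sitemaps:
    root = ElementTree.fromstring(requests.get(sitemap_url, timeout=10).content)
    print(f"{sitemap_url}: {len(root.findall('.//sm:loc', NS))} <loc> entries")
```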

What about redirects? Make sure you look at how redirects have been implemented for the mobile version of your site. If you have an old m. site, have all its redirects been updated to point at your new responsive website? Look through them all and see if any have been missed. These can be valuable quick fixes if you catch them.
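For a known list of old m. URLs, this is easy to automate. A sketch (the URL pairs are placeholders for your own mapping):

```python
# Verify that legacy m. URLs 301 to the equivalent page on the responsive site.
import requests

checks = [  # (old m. URL, expected destination) - placeholders
    ("https://m.example.com/products/widget", "https://example.com/products/widget"),
    ("https://m.example.com/about", "https://example.com/about"),
]

for old_url, expected in checks:
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    first_hop = resp.history[0].status_code if resp.history else None
    ok = resp.url == expected and first_hop == 301
    print(f"{'OK ' if ok else 'FIX'} {old_url} -> {resp.url} (first hop: {first_hop})")
```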

And of course, make sure your server can handle all the crawling. The smartphone crawler is likely to crawl at a higher rate than the desktop crawler, so your server has to cope with the rate Google wants to crawl; otherwise Google will slow it down and your site won’t be crawled as regularly.
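Your access logs will tell you what rate Googlebot is actually hitting you at. A sketch that counts smartphone-Googlebot requests per hour (the log path and combined log format are assumptions; adjust for your setup):

```python
# Count smartphone-Googlebot hits per hour in a standard access log.
import re
from collections import Counter

hour_pattern = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2})")  # e.g. [27/Apr/2018:14
hits = Counter()

with open("/var/log/nginx/access.log") as log:  # placeholder path
    for line in log:
        # Googlebot's smartphone crawler includes both tokens in its UA string.
        if "Googlebot" in line and "Mobile" in line:
            match = hour_pattern.search(line)
            if match:
                hits[match.group(1)] += 1

for hour, count in sorted(hits.items()):
    print(f"{hour}:00  {count} smartphone Googlebot requests")
```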

Read Google’s best practice guidelines, but also read between the lines. Don’t take it all as writ; look at what they’re implying. Dig deep and question it.