Technical SEO

Technical SEO is a huge area, deserving of a whole guide devoted to it. Rather than go into great length about it here, we’re going to highlight the main areas which, from a local SEO standpoint, will give you the maximum impact.

Schema and Structured Data

Schema and structured data is a way of marking up content so that search engines can easily understand it. This data can then be used in a variety of ways. For instance, if you run an e-commerce store, data about a product’s price, availability and reviews can be marked up so that it can be shown in the search results themselves. This can improve click-through rate for the website using it.

Another application, from a local SEO perspective, is the LocalBusiness schema. Here you can tag content such as:

  • Business name
  • Address
  • Telephone number
  • URL
  • Reviews

It’s worth spending the time to create the schema markup code for your business and add it to your website. Google has previously stated that it’s not a ranking signal but has indicated that it may be added in the future. I think that’s reason enough to take it seriously.

It sounds technical and daunting, but it’s actually very simple to add schema data to your website.
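
To show you what you’re aiming for, here’s a minimal sketch of LocalBusiness markup in JSON-LD format, which sits inside your page’s HTML. All the business details below are invented placeholders, so substitute your own:

    <!-- Example only: replace every value with your own business details -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Acme Plumbing",
      "telephone": "+44 1234 567890",
      "url": "https://www.example.com/",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Anytown",
        "postalCode": "AB1 2CD",
        "addressCountry": "GB"
      }
    }
    </script>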

There’s even a free tool to generate the code for you. Visit http://makeschema.com/, enter your details, click submit and away you go. Add the code as instructed and Google will pick it up.

For those of you who’ve set up Google Search Console, there is a handy little tool called Data Highlighter. This allows you to mark up content on your website and tell Google exactly what it is.

XML Sitemap

Not to be confused with an ordinary sitemap that visitors might see, the XML sitemap is a file (or collection of files) that is “read” by search engines. It includes information about all the pages on your website, such as each page’s URL, when it was last updated, how important it is and how often it changes.
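
As an illustration, a minimal sitemap containing a single page looks something like this (the URL and values are invented placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Example entry: loc, lastmod, changefreq and priority are placeholders -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2018-06-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>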

Most modern Content Management Systems (CMSs) will either produce an XML sitemap for you or have an add-on that will produce one and keep it up to date.

This is then submitted via Google Search Console. From the menu on the left choose Crawl > Sitemaps and click the red button Add/Test Sitemap. Add the location of your file and click submit.

Indexing

Ensuring that as many of your web pages as possible are crawled and indexed by Google is key to being able to rank for as many relevant search queries as possible.

Again, you can check this in Search Console. Choose Crawl > Sitemaps and see how many pages Google says are indexed. How does this differ from the number of pages that were submitted via the sitemap?

You can also review this by choosing Google Index > Index Status. This shows you a graph of how many pages Google has in its index for your website.

If you have too few, then you know that for some reason Google can’t index all of your web pages. Something is stopping it, and that will need further investigation.

Too many can be just as bad. It’s possible that Google is indexing pages that you don’t want it to, or that it’s indexing pages twice. Both of these problems can cause ranking issues due to duplicate content or the wrong page being returned for a search query.
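
If the index contains pages that shouldn’t be there, the standard fix is a noindex meta tag in the <head> of the page in question. A minimal sketch (the tag itself is standard HTML):

    <!-- Ask search engines not to index this page -->
    <meta name="robots" content="noindex">

You can also stop search engines from crawling a whole area of your site with a rule in your robots.txt file. Note that this blocks crawling rather than indexing, so use noindex for pages that must stay out of the results. The /admin/ path below is a made-up example:

    # Applies to all crawlers; /admin/ is an example path
    User-agent: *
    Disallow: /admin/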

Crawl errors

This section of Search Console is accessed by selecting Crawl > Crawl Errors. The important report we want to look at here is the Not Found report.

These are pages that are in Google’s index that it has tried to recrawl and, for some reason, has been unable to. If Google can’t find a page, it’s likely that a user won’t be able to find it either. If it’s an important page then this is a problem. It’s not a great experience to search for something, click your result and be told that the web page can’t be found.

In “web speak” we call this a 404 error. That is the HTTP status code returned for a page that cannot be found.

Reduce 404s

All websites build up 404 errors over time. It’s best practice to keep these to a minimum. Depending on the size of your website, you may want to check your Crawl Errors report once a month, or at least once a quarter.

The report will include a link to the missing page and also tell you where on your website it’s linked from.

You can then find out why it’s disappeared. Most often it’s because it’s been deleted. Maybe it’s an old page, a product that you no longer stock or a service you no longer offer.

Deleting pages on your website with no forethought is a bad idea. That page is likely to have built up “authority” and may even have links pointing to it. By deleting it you’re losing those benefits.

If you must delete the page then you should “redirect” the old web page address to a new page. This is done using something called a “301 redirect”. It tells the visitor’s browser or a search engine that the page has permanently moved and takes them to the new, relevant page. Any authority or links pointing to the old page will be passed on to the new page.
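
How you create a 301 redirect depends on your platform; most CMSs have a built-in option or a plugin for it. As one hedged example, if your site runs on an Apache web server, a single line in your .htaccess file will do it (both addresses below are made up):

    # Permanently redirect the old address to the new page
    Redirect 301 /old-page/ https://www.example.com/new-page/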

In the case of e-commerce sites, deleting products is a bad idea full stop. Most e-commerce platforms will have a way of dealing with products that are either out of stock or no longer carried. This keeps the page in place but makes the product unpurchasable. You should also be able to define alternatives, so that should a visitor land on that old product page, they’re not completely disappointed.

Page load speed

There’s a growing body of evidence linking the speed of your website (how quickly it loads in a browser) to how long visitors will stick around. If your website takes too long to load, visitors will get frustrated, give up and go back to the search results. This is bad for your website and for your rankings.

There are many reasons your website could be slow. Use one of the many tools out there, such as Google’s PageSpeed Insights, to check its speed and troubleshoot the cause.

You don’t necessarily have to achieve perfection, just don’t be really bad. These tools are not always 100% accurate; they just give you an indication.

Unless you know what you’re doing here, I would strongly advise you to seek professional support.

Thin content

Thin content is a term that can be used at either the page level or sitewide. It describes a lack of quality, depth and expertise. This isn’t necessarily the same as the quantity of content: a 1,500-word article of garbage is still thin content, whilst a 300-word article could be insightful and packed with everything the reader needs to know. It’s all about fulfilling the readers’ needs.

How do you know if your web pages, or your website as a whole, are suffering from a thin content issue?

  • Firstly, you need to read it and be honest with yourself: is it the best it could be?
  • If you know the search queries you’d like this page to rank for on Google, check out the pages that rank in the top 5 positions. How does your page compare?
  • You can also look at Google Analytics.
    TIP: Make sure you enable the Organic Traffic segment so you can see how search engine visitors engage with your site. Click the + symbol, scroll through the segment names, check the box next to Organic Traffic and click Apply.
  • Next, click Behaviour > Landing Pages. Look for your pages on this list; you may need to show more than 10 rows to see the full list of pages. Analyse:
    • How many visits the page gets
    • What’s the bounce rate for this page? (a high bounce rate is not necessarily a problem)
    • How many pages does the average visitor look at following this page?
    • What’s the average session duration?
  • Also, click Behaviour > All Pages. Perform a similar comparison. The main indicator we’re looking for here is Avg. Time on Page.

Looking at the Bounce Rate, Avg. Session Duration and Avg. Time on Page can give you an indication of the overall engagement. A high bounce rate combined with a low session duration and time on page shows that users aren’t really interacting with or reading your content. Obviously, it depends on the purpose of a page as to whether this is an issue.

Duplicate content

The real problem with duplicate content is that it forces Google to choose between competing pages. It will choose to show one over the other(s), and that may not be the page you want.

A small paragraph of text repeated from one page to another is not considered duplicate content. And a page doesn’t have to be an exact copy to suffer; it can be a close match.

Depending on the size of your website, checking for duplicate content can either be a manual process or require a specific tool.

Finding out whether the content on your website duplicates that on another website is far easier with a tool such as https://www.copyscape.com.
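
Where two near-duplicate pages on your own site both need to stay live, one widely used remedy (a suggestion rather than something covered above) is a canonical tag in the <head> of the duplicate, telling Google which version you’d prefer it to show:

    <!-- The href below is a placeholder for your preferred page -->
    <link rel="canonical" href="https://www.example.com/preferred-page/">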

Keyword Cannibalization

This is where you have two or more pages optimised for the same keyword. It ends up causing a “fight” between the pages on Google, with each page being chosen alternately as the most relevant for the query.

You can check for this in Google Search Console. Click on a particular search query and then select the Pages radio button. This will show you all the pages on your website being returned for that query.

You will need to choose which page is the correct one and de-optimise the others. If it’s really serious then it may require the content being completely rewritten, with the focus changed so it’s no longer competing. The old URL may also need to be redirected (using a 301 redirect, as described above) to the page that should be the correct result.