All Posts by Aires Loutsaris

About the Author

My name is Aires Loutsaris and I am an eCommerce Search Engine Optimization Consultant working with some of the world's biggest VC-funded startups and eCommerce companies. I have over 14 years of experience in SEO, search marketing and user acquisition. I have an SEO course with over 2.5k paying students on Udemy, I was an SEO lecturer at the University of the Arts London (London College of Fashion), and I have been shortlisted for five awards, winning two. Fun fact: I have performed SEO services for all three of the universities I studied at: The Open University, the University of Hull and King's College London. I have also been Head of SEO at the agencies SEO.co.uk and Net Natives. Contact me for any reason – I'm happy to help and very approachable. Additionally, feel free to connect with me on LinkedIn.

Expertise, Authoritativeness and Trustworthiness
Apr 15

Everything You Need to Know About the Google EAT or Medic Update

By Aires Loutsaris | SEO Tips


Everything you need to know about Expertise, Authoritativeness and Trustworthiness.

Google update their algorithms thousands of times each year, ranging from minor changes to updates that rock the industry, such as the ‘Panda’ and ‘Penguin’ updates of years gone by.

In August and September 2018, Google rolled out the “Medic” update (an unofficial name), where sites across several sectors/niches saw huge drops in visibility almost overnight – most notably in the medical sector, hence the name.

There was also an update around March 12th 2019, as confirmed by Google on Twitter:

<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">We understand it can be useful to some for updates to have names. Our name for this update is &quot;March 2019 Core Update.&quot; We think this helps avoid confusion; it tells you the type of update it was and when it happened.</p>&mdash; Google SearchLiaison (@searchliaison) <a href="https://twitter.com/searchliaison/status/1106445826925064192?ref_src=twsrc%5Etfw">March 15, 2019</a></blockquote>

<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

There have been several reports speculating on what the biggest factor was in this update (e.g. content, links etc., as with previous updates); however, this update focussed on several factors that come under the collective term “E-A-T” (Expertise, Authoritativeness and Trustworthiness).

With this in mind, here’s everything you need to know about E-A-T to ensure your site doesn’t get penalised – or, if you have seen a drop off in visibility, what you can do to help recover.

Before we dive into our guide on E-A-T, here are some abbreviations that crop up several times throughout, and what they mean:

  • E-A-T: Expertise, Authoritativeness, Trustworthiness
  • SQRG: Search Quality Rater’s Guidelines
  • MC: Main Content
  • YMYL: Your Money or Your Life

Check out Google’s Search Quality Rating Guidelines here.

What is E-A-T?

Expertise, Authoritativeness and Trustworthiness (E-A-T) is effectively what Google’s Search Quality Raters look for in a site, and in Google’s own words is one of the most important factors to consider when evaluating a site:

Expertise, Authoritativeness and Trustworthiness (E-A-T)

E-A-T can apply to an author or the entire site itself.

This is explained in more detail below:

What the search quality raters want to see is a level of expertise, either from the author or the site as a whole. For example, if there are two articles on the same subject – one a post by Martin Lewis from Moneysavingexpert.com, the other from “John Smith” on a brand new finance blog – which one is a user more likely to trust? Martin Lewis is regularly featured both online and offline, and is known to have a huge amount of experience in the field of finance.

Google want to know that the person creating the content is an expert on the subject – do they have qualifications? Are they cited as a source anywhere? Whilst you may be financially qualified to give advice, if you’re new to the game then Google may not consider you an expert just yet. Things like journalism awards can help contribute to making you an ‘expert’ in the eyes of a search engine.

If the site as a whole is focused on a particular subject such as medicine or finance – do they have any awards from external groups or societies? If so, this can also help.

Overall – there needs to be some sort of proof that you are an expert; if in the medical profession do you have a page on a hospital/treatment centre website that highlights your credentials?

Work on building your online reputation – this could be as simple as making sure that any mentions of your name link back to you; if you have any qualifications, include these in your author bio, and link to any pieces you’ve had published online.

The other main factors are links from high authority sites and reviews from customers. If you have links from news sites, Wikipedia, industry websites, governing bodies etc., you’ll be seen as trustworthy; however, you also need strong customer reviews on external sites like Trustpilot and Feefo, or reviews on your Google My Business listing.

Also, having the basics in place, such as a secure checkout/shopping page, is a signal of trust – so make sure you’ve switched to HTTPS on your site if you haven’t already done so.

There are quite a few elements to E-A-T, so we’ve put together this guide to explain what they are and what you can do to ensure your site is meeting Google’s requirements.

Who Does It Affect?

In a nutshell – every website owner!

That said, websites that cover YMYL (Your Money or Your Life) topics are held to stricter guidelines than those in other sectors, so sites in the health and finance sectors especially need to make sure their sites are on point and don’t contravene Google’s guidelines in any way.

Whilst that’s easier said than done, we’ve got everything you need to know here, so read on.

What Is a ‘YMYL’ Site?

A ‘Your Money or Your Life’ page is any site or page that can have an impact on your finances, lifestyle or wellbeing. Below are a few examples:

  • Shopping/financial transactions – any online store, online banking or money transfer/payment service
  • Finance websites – any site that provides information on finance; such as mortgages, tax, loans, insurance etc.
  • Medical/health websites – a page that provides information about a particular condition such as treatment, symptoms, causes etc. It could also cover drugs/medication, or diet advice
  • Legal information – anything that provides legal advice on important life decisions such as wills, divorce etc.
  • News – news about major events/important topics; these should be factual and cite information in line with what other news outlets are reporting.

If the content on the page can influence someone’s decision and impact their wellbeing, safety or finances, then it is considered a YMYL page, and therefore falls under stricter guidelines than something like a personal blog highlighting what someone has had for dinner, for example.

Examples of Sites Penalized for Lacking E-A-T

Some sites saw bigger drops in visibility than others, but overall it was YMYL sites that were impacted the most, particularly in the health sector.

There have already been case studies looking at the likes of Draxe.com which saw huge drops following the medic update last year (see below), losing over half a million keywords from the top 100 results:

organic keywords trend

Therefore, we’ve picked out a couple of other sites that dropped in visibility, and looked into why this might have happened for each site.

Example 1 – a Mortgage Broker Website


Landc.co.uk are a free mortgage broker service in the UK, and are considered to be one of the market leaders in this area, however following the updates in August and September, their visibility dropped off by around 25%. This is based on the total number of keywords ranking in the top 100 results (the figure was 15,501 in July, compared to 11,473 in November).

The below shows the number of keywords they were ranking for (according to SEMrush) – when looking at top 3 rankings, they dropped by around 35%, from 1,103 to 704 keywords in the top 3.

Looking at ahrefs, there’s also a significant drop off in both traffic and the number of keywords ranking from the end of July onwards, around the time of the updates:

a Mortgage Broker Website result

So, why did Landc drop off?

At first glance, they’re doing a lot right in terms of external reputation.

Looking at their links, they’ve got some links from highly authoritative sites:


They’ve got links from The Guardian, The Telegraph, Moneysavingexpert, Yahoo Finance, Express.co.uk, ITV.com, bbc.co.uk and more:

links from The Guardian, The Telegraph, Moneysavingexpert, Yahoo Finance, Express.co.uk, ITV.com, bbc.co.uk

If you were to ask any digital marketer where they’d want links for a finance site, they’d point to those huge websites – so if Landc have a dream backlink profile, what else could it be?

It clearly isn’t their reviews, as they’ve got a lot of happy customers:

mortgages google reviews

If you go to Trustpilot, the vast majority of their reviews are generally “excellent” or “good”, so their reputation can’t be the issue:

L&C Mortgages

Their about page (https://www.landc.co.uk/about-us/) has a lot of information about who they are, how long they’ve been around, and the fact they’ve won several awards:

L&C Mortgages about us

They’re clearly a brand that can be trusted, so what caused the drop in rankings?

A core part of their site is their ‘guides’ section, made to help educate home buyers about all aspects of moving house and getting a mortgage, such as their first time buyers section:

L&C Mortgages guide

There’s a lot of content that is clearly useful and answers most of the questions that a first time buyer would ask.

The problem, however, is that there’s nothing about the author on these guides, for example:

fixed rate mortgages guide

This post was written by “Lisa Parker” – who is she? What are her financial qualifications? Is she a mortgage advisor?

Can the user trust that what she has said is factually correct? Probably, however to a search engine quality rater, there’s nothing that says why she’s an authority on fixed rate mortgages.

Looking at another guide, there’s a similar issue:

L&C Mortgages

Who is Adam Jones? Is he financially qualified? Again, whilst Landc as a company are clearly experts in this field, there’s nothing to back up the credentials of the author – it could be a simple fix to put this right and see their rankings recover.

If it’s not down to external reputation, and they meet most of the other E-A-T requirements as stated in the SQRG, then surely the authority of the content creator is playing a part in this.

Example 2 – an E-Commerce Site in the Fitness Supplement Sector


 an E-Commerce Site in the Fitness Supplement Sector

GNC.co.uk saw a clear drop off in their rankings according to SEMrush, but why?

Most of their core pages on the site haven’t changed since before the update hit in August (see below).

However – they are owned by Holland and Barrett, who have a far more in-depth page that shows off their credentials in the health supplement sector: https://web.archive.org/web/20180129155049/https://www.hollandandbarrett.com/info/who-we-are

It’s clear who Holland and Barrett are, how long they’ve been around, and they’re an internationally known brand:

Holland and Barrett

Also, they’ve won a string of awards from external companies who vouch for their products and expertise:

Holland and Barrett won a string of awards from external companies

So, if GNC’s site is largely unchanged, and their parent company is a well-known brand, was their drop off down to external reputation?

At first glance, a brand search shows they’ve got a knowledge graph with all their social profiles and information on, and cited on Wikipedia:


If you search for “GNC reviews”, it’s immediately clear that there’s a lot of room for improvement: their average rating on several review sites is poor. Looking at Trustpilot.co.uk (one of the most well-known review sites), their customers clearly aren’t happy:

GNC reviews

They’re really not hitting the E-A-T requirements. For example, they’ve got an information and advice section that hasn’t been updated in months, and there’s no information about the author of the content to back up what they’re saying. If we look at the following post:


  • No author information – who wrote the content? What is their background, are they experienced in HIIT to create a piece of content around it?
  • No publishing date/ No last reviewed date – is it still relevant?

Despite being on GNC.co.uk, it could be a guest blog from someone with no experience in this sector, and therefore as it’s related to health, it falls under the ‘YMYL’ category, so is subject to stricter review guidelines:

stricter review guidelines

So, if we look at everything together:

  • Not much information on their about page as to their history, their expertise etc.
  • Really bad customer reviews on several different review sites
  • Content isn’t kept up to date, is lacking in depth and doesn’t have any information about who wrote the content and what their credentials are.

GNC did recover slightly following the Google update in March, however they’re still not really meeting the requirements set out in the search quality rater’s guidelines, so unless they change direction, it’s unlikely they’ll increase visibility in the future.

What Can You Do to Recover If Your Site Has Dropped as a Result of the E-A-T Updates?

So, if your site dropped as a result of Google’s medic update (or you’re unsure why it’s dropped), don’t panic.

Unless you’ve been actively spamming links or duplicating content; chances are you can restore your rankings to their former glory.

Here’s everything you need to ensure you have in place, along with why you need it (with examples from Google’s SQRG).

Author Information

As outlined above – make sure there’s enough information about who the author is: their name, any qualifications they have, how long they’ve been in the sector, links to any profiles on governing bodies’ websites, and links to any external coverage of them (such as awards, testimonials etc.).

Make it clear who has created your content and their credentials – if for example you’re talking about mortgages, make it clear your author has the relevant qualifications, how long they’ve been in their field etc. – give that element of trust to the user that what you’re telling them is correct and won’t impact their finances negatively.

The easiest way to do this is to create a bio page for your authors, and then link to this page from your author bio section. A great example of this is on Moneysavingexpert.com:


money saving expert
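
On the technical side, one common way to make author credentials machine-readable is schema.org `author` markup in JSON-LD. The sketch below is illustrative only – the headline, names, URLs and job title are hypothetical placeholders, not taken from any of the sites discussed here:

```html
<!-- Illustrative JSON-LD sketch: all names/URLs below are hypothetical placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Fixed rate mortgages explained",
  "author": {
    "@type": "Person",
    "name": "Jane Example",
    "url": "https://www.example.com/authors/jane-example",
    "jobTitle": "Qualified Mortgage Advisor",
    "sameAs": ["https://www.linkedin.com/in/jane-example"]
  }
}
</script>
```

The `url` property can point at the author’s bio page, and `sameAs` at external profiles that corroborate their expertise.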

In the example with landc.co.uk earlier, we highlighted that the authors of their guides don’t showcase their credentials when it comes to finances – there’s literally just a name and a photo, without a link to any more information about the author’s background or financial qualifications:

authors of their guides don’t highlight their credentials

There’s a very similar example in the rater’s guidelines, where it’s specifically mentioned that there’s no evidence that the author has any financial expertise, and this is a characteristic of a low quality page:

rater’s guidelines

If you’re a YMYL site and don’t have author information – get it in place ASAP!

About Info/Who Runs the Website

Ensure there’s an ‘about’ page, or a section about who is responsible for the content or keeping the site up to date. Whilst this may seem trivial, it’s an important factor in a page’s quality rating:

About Info Who Runs the Website

If it’s a YMYL site, you really need to have ‘about’ information on there: who the company is, the registered office address, and the history of your company. Profiles of your senior management team will also help reinforce who your company is.

If you’ve had significant press coverage, or won any awards, include this information too.

Contact Information Is Easily Accessible

Make it easy for someone to get in touch with your website – have the contact information prominent, or a link to a contact page where there’s a phone number, email address or a contact form. This is specifically in the SQRG:


By not having contact information readily available, particularly on YMYL sites, you’re really shooting yourself in the foot. Raters are told to rate a page as low quality if there’s not much contact information available (amongst other things).

YMYL sites

In their examples of “lowest quality” pages, having no contact information is called out too:

lowest quality pages

So, if you want to make sure you’ve got this one covered:

  • Create a contact page (if you don’t have one already)
  • Link to it in your site’s navigation (or footer)
  • Make sure you’ve got a contact form, phone number and email address. If you’ve got physical stores, make sure the contact info for these is easy to find.
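
If you want that contact information to be machine-readable as well, schema.org’s `ContactPoint` markup is one option. Again, this is a hedged sketch – the company name, phone number, email and URL are placeholders:

```html
<!-- Illustrative JSON-LD sketch: all contact details below are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Ltd",
  "url": "https://www.example.com",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+44-800-000-0000",
    "email": "support@example.com",
    "contactType": "customer service"
  }
}
</script>
```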

External Reviews & Your Website’s/Company Reputation

Customer reviews can help boost the trust around your brand. If you’ve got lots of happy customers, it’s a signal that people can be safe buying from you, or that you’ll give great quality service.

Make sure your company is on the likes of Trustpilot and Feefo, so that users can write open, honest reviews about your brand.

Once someone has made a purchase or signed up to your service, follow up with an email encouraging them to leave a review. It doesn’t matter if there are a few negative reviews about your brand, as these exist in every walk of life – the people reviewing your site are told to use their own judgement, so if you have 200 positive reviews and 3-4 bad ones, don’t sweat it.


Search raters are told to look for as many different sources of reviews as possible, so it’s worth also considering setting up a Google My Business listing and a Facebook page if you haven’t already, so people can leave reviews about you here too. Remember, the more reviews, the better!
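
If you collect reviews on your own pages, you can also mark them up with schema.org’s `AggregateRating`, which can make star ratings eligible to appear in search results. The snippet below is a sketch with placeholder figures, and note that Google’s structured data guidelines restrict “self-serving” review markup for some page types, so check the current documentation before rolling this out:

```html
<!-- Illustrative JSON-LD sketch: product name and rating figures are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "200"
  }
}
</script>
```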

The below is taken from the guidelines, where they are told to look for external mentions/the reputation of the brand or website:


Therefore, a combination of both great customer reviews and mentions of your company from external websites should help you meet the guidelines’ requirements.

Links from Authoritative Websites

The holy grail – getting links from the high-end websites. Easier said than done, but links from high authority websites such as news sites, government sites etc. will be beneficial.

Consider responding to #journorequests on Twitter, or creating a PR campaign that might attract links/coverage of your brand – not every mention has to be linked, but other sites talking about and linking to your brand will make the world of difference.

Links count as a ‘vote’ from one website to another – if you’re linking to a site you’re effectively vouching for its content. Have a look at what your competitors are doing, or what other content has worked well and got coverage, and look to create something better.

If there are specific industry news sites, reach out to them and see if there’s a case study or guide that you can create for them, and include a link/mention to your own site.

Gary Illyes was asked about this at Pubcon last year (see the tweet below from E-A-T expert Marie Haynes):

<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">I asked Gary about E-A-T. He said it&#39;s largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that&#39;s good.<br><br>He recommended reading the sections in the QRG on E-A-T as it outlines things well.<a href="https://twitter.com/methode?ref_src=twsrc%5Etfw">@methode</a> <a href="https://twitter.com/hashtag/Pubcon?src=hash&amp;ref_src=twsrc%5Etfw">#Pubcon</a></p>&mdash; Marie Haynes (@Marie_Haynes) <a href="https://twitter.com/Marie_Haynes/status/966325146968559616?ref_src=twsrc%5Etfw">February 21, 2018</a></blockquote>

<script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

Content Is Kept up to Date

Content should be kept up to date, and reviewed regularly, especially if your content falls under the YMYL bracket.

If content is outdated, such as old medical advice, or financial legislation that has changed, then you’re less likely to rank over someone who reviews their content regularly and keeps it up to date. Under the rater’s guidelines, if the content “fails to meet” user needs, it specifically highlights content being outdated as a reason for not meeting user requirements:

fails to meet

Elsewhere in the guidelines document, it highlights on several occasions the importance of content being updated regularly:

importance of content being updated regularly

There will be times when this isn’t relevant or possible; however, having your content regularly reviewed is at least a great way to ensure it is kept up to date.

Check out the NHS as a great example of this, for example on their flu page; they have a note to say when it was last reviewed, and when it will be reviewed again:

their flu page

Even if your content is unlikely to ever change, consider adding a ‘last reviewed’ date on your site – especially if you’re in a “Your Money or Your Life” category.
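
As well as a visible ‘last reviewed’ note like the NHS uses, you can expose review dates in structured data via schema.org’s `datePublished` and `dateModified` properties. A sketch with placeholder dates and headline:

```html
<!-- Illustrative JSON-LD sketch: headline and dates are placeholders;
     dateModified should only change when the content has genuinely been reviewed/updated -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Flu: symptoms and treatment",
  "datePublished": "2018-10-01",
  "dateModified": "2019-04-01"
}
</script>
```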

Content Is of a High Standard

Content has always been one of the most important ranking factors, and today it plays a bigger part than ever in how a site ranks.

The content on your site needs to be as useful as possible, and provide answers to questions that people are searching for. The content of your page is evaluated based on the E-A-T of your site, but what does this mean?

Below is an extract from Google’s SQRG, where they look at the most important factors:

With this in mind, here’s an example scenario of a “Your Money or Your Life” query:

Content Is of a High Standard

A user is looking to buy a home for the first time, and is looking for information about the process of buying a house, getting a mortgage, what are the costs involved etc.

There are two sites trying to rank for “first time buyer mortgages” – which one would you rank highest out of the two below?

Site 1

A mortgage comparison site, that offers the user a list of the top 10 mortgage deals available to first time buyers, but doesn’t allow them to enter their details (such as how much deposit they have, their salary etc.), and there’s no information about the website, or about the process of buying a home.

Alongside the mortgage deals list there are banner ads that make it hard to differentiate between the site’s content and the adverts.

Site 2

A similar mortgage comparison site that has a landing page for first time buyers, that also includes the following:

  • A mortgage calculator to show what your monthly payments might be
  • A stamp duty calculator explaining the additional fees you’ll have to pay depending on the value of your home
  • A guide to the most commonly asked questions that first time buyers ask
  • Tips for viewing a home for the first time
  • How to negotiate on price
  • A checklist of things you’ll need to buy
  • What information you’ll need for a mortgage application
  • Reviews/testimonials from happy customers

Clearly, site 2 would be providing a better user experience, and therefore should rank higher within Google’s results.

Google have previously said that having additional supplementary content can help a site perform better – although supplementary content can be in the form of additional content or simply a link to another page on the site:

“Supplementary Content contributes to a good user experience on the page, but does not directly help the page achieve its purpose. SC is created by webmasters and is an important part of the user experience.”

In other words – make it easy for your customers to find everything they need to know about your product or service, and try to make it better than anything else out there.

Take a look at your own site’s pages, and search for your top keywords, and check out the content of the sites that rank above you, then ask yourself:

  • Is it better than my content?
  • Does it answer the most common questions around our niche?
  • Is the content more in-depth?
  • Does the site ranking above us have more positive external reviews?

You should be able to quickly identify areas for improvement within your own content.

What Do the Highest Quality Pages Have?

The highest quality pages are the ones leading the way in terms of E-A-T overall, and usually have the following:

Very High Quality Content

By ‘very high quality content’, we mean better than the rest. Huge, in-depth guides that cover everything you need to know about a particular subject, with different types of content such as text, imagery and videos, created by someone who has significant expertise:

Very High Quality Content

Very Positive Reputation

Whilst you’ll never have 100% excellent ratings/reviews, the best sites or companies tend to have the vast majority of their customers’ reviews as positive.

Also, they’re linked to and cited often as a source of information, or asked to provide information for a particular news piece, and win awards for the work they’ve done or the service offered.

Very High Level of E-A-T

A very high level of E-A-T means that they’re doing everything right – they’ve got some or all of the following:

  • Content created by experts in their field, renowned for their work
  • In-depth main content that is the result of a lot of research and expertise
  • A range of different content types including links to other resources
  • On a secure website that will protect your data
  • Often linked to or cited as a source of content
  • Includes lots of product information
  • Has a significant number of positive reviews
  • Includes step by step instructions on how to complete a task

Example of a High Quality Site

Depending on the sector you’re in, not all of these may be achievable, but let’s look at an example of a high quality site – ao.com, an e-commerce white goods and electrical site here in the UK.

What makes Ao.com so good? Let’s have a look.

External Reviews

Ao.com are flying in terms of reviews, with an average rating of 9.1 from nearly 120,000 reviews.

A whopping 95% of these are ‘excellent’ or ‘good’ – that’s over 113,000 customers who are happy with the service they’ve received from ao.com.

Quality content

Ao.com lead the way in terms of how they advise users on which are the best products to buy – they’ve created a ‘best buy’ section that explains which products to go for, such as: https://ao.com/best/fridges

best buy section

We’ve highlighted a few elements that are important to a user – customer reviews, the dimensions of the product so a customer knows it will fit in their kitchen, and an image gallery to give the user more pictures of what it looks like.

These ‘best buy’ pages are separate from the main product pages – they’re not always the most expensive products that are highlighted, but usually the ones that have the best reviews or represent great value for money.

If you look at a particular product page, they’re even better than the best buy pages: https://ao.com/product/fmn431w20c-hisense-american-fridge-freezer-stainless-steel-42307-27.aspx

best buy pages

What makes this page so good?

  • Product video review by ao.com
  • Product information unique to ao.com, not just the standard manufacturer information
  • Tools and time required to install the product
  • Key features
  • Reviews from customers
  • Size dimensions
  • What materials the product is made from
  • Prominent link to a help and advice section for FAQs
  • Easily accessible links to contact/about information
  • Secure website so you know your payment details are safe

Alongside the best buy and product pages, there are buying guides that highlight things to look for when buying a fridge freezer: https://ao.com/help-and-advice/help-me-choose/kitchen-and-home/buying-guides/integrated-fridge-freezer

buying guides that highlight things to look for

Although they don’t have authors on each product page, they have a great ‘about us’ page that explains who they are, what they do, where they’re based etc., so this satisfies Google’s requirements for having this information available.

Overall, they’re a perfect example of how an e-commerce site should be done, particularly with the way they present their content and handle customer enquiries.

What Makes a Page ‘Low Quality’?

When a site gets marked as low quality, it can be for one of a number of things (listed below), although sometimes these may be things that you’ve accidentally overlooked.

The things that the raters look out for are:

  • Lacking E-A-T: as outlined earlier, sites need to have a high level of E-A-T, so this means the content needs to be created by someone with expertise in a particular field (such as financial content created by someone with adequate qualifications), the content has to be relevant and on a trustworthy site, and should be secure if it’s an e-commerce platform.
  • Low quality main content: this is pretty self-explanatory, if the content isn’t actually useful or informative, or isn’t in-depth when it needs to be, then it will potentially be labelled as low quality. Raters are also told to look out for clickbait/misleading page titles, or content that doesn’t answer a user’s query.
  • Unsatisfying amount of main content – again, pretty self-explanatory; if there’s not a lot of content about a subject where there should be, then this can lead to being rated as low quality.
  • Intrusive/distracting ads – if a page is littered with ads and popups to the point where the main content is inaccessible to the user, or the ads are difficult to close, then this can also lead to your site being marked down.
  • Negative reputation of the content creator or the site itself: if a particular creator of content is known for stealing other people’s content or providing incorrect/misleading information, this can often have a negative impact on a site’s visibility. Also, if there are a lot of external reviews where customers are unhappy with the service or information provided, then this can also lead to a low rating.
  • Unsatisfying amount of information about the website or main content creator – as covered earlier, there needs to be information such as who the company is, an ‘about us’ page that describes who the company are, contact information that is easy to find, a physical address etc. If the person who has created the content doesn’t have any information about them available anywhere, this can also lead to a low quality rating.

Those things don’t seem that bad, right? Well, there’s one rating worse: “lowest quality” – and if your site is rated ‘lowest quality’, you’re in real trouble.

How to Avoid Getting Penalized in Future

Chances are, however, that unless you’ve outsourced your SEO to the shadiest person ever or your site has been hacked, you won’t be graded so badly.

To be considered ‘lowest quality’ there are some pretty strong criteria that raters are told to look out for, and it’s easy to see why a site would be graded that way when you see the type of things they’re checking for:

  • Lowest amount of E-A-T: If there’s no E-A-T or a low amount of it, then a site “fails to achieve its purpose” and therefore won’t rank as highly as a result.
  • No MC (main content): If pages exist purely for SEO purposes and don’t actually provide any information to a user, or there’s very little content on the page, then this will hinder your site’s performance
  • Lowest quality main content: If content is factually incorrect, or is difficult to read due to the way it is presented, or because of the site’s design, then this won’t benefit your users and therefore you shouldn’t be ranking well in the SERPs.
  • Copied MC: Self-explanatory really; if your content is copied from elsewhere and is word for word the same as another site (or only changed very slightly), then this won’t help a user either, and can actually lead to your site being penalised.
  • Auto-generated MC: if all the content on a site is replicated via feeds or simply copied automatically from elsewhere, without any original content or additions from the site copying the content, then this will be rated as lowest quality when reviewed.
  • Obstructed or inaccessible MC: If it is impossible to view the content without having to view an ad first, or clicking an advert then this is hindering the user’s ability to access your site, so will again be considered as of the lowest quality.
  • No information about the creator of the content or the website itself: As outlined above, there needs to be information in place about who owns the site, who the company is, customer service information etc., and author bios in place. Without this information, someone reviewing your site can’t determine if you are trustworthy or not, and therefore may give you a lower rating.
  • Pages that misinform users: If for example you’re in the medical sector and providing medical advice that is incorrect, or misleading financial information then you’re likely to be considered as ‘misinforming users’ so you’ll be rated accordingly.

There are other things that can also lead to a lowest quality rating such as sites that are made to spread hate/cause upset by targeting and discriminating against people based on their race, gender, nationality etc. Anything that can cause harm to a user will be marked as lowest quality.

Summary and E-A-T Checklist

So, we’ve covered what E-A-T is, who is impacted by it, examples of sites that have seen a reduction in visibility due to the E-A-T of their site, things to avoid doing, and examples of high quality sites.

When evaluating the E-A-T of your own website, remember the following:

  • Make it clear who runs the website – create an about us page that explains your company history, and link to any external press coverage, awards you’ve won etc.
  • Make contact information easily accessible – if someone needs to get in touch, don’t make it hard for them to do so!
  • Author information – if you’re providing guidance/information, get your experts on the case! Create author bio pages that highlight your credentials as an expert in the field, and link to them from your guides
  • External reviews – encourage your customers to leave reviews on Trustpilot, Feefo etc. The best way to do this is to send a follow-up email once your product or service has been delivered. If you don’t ask for reviews, you won’t get them (unless they’re negative!)
  • Content reviewed and refreshed – check your core pages; is your content up to date? Does it need adding to? Are your competitors’ pages better than yours? If so, revise yours so it is more in-depth and answers your users’ queries
  • Check for any copied content – it may have been done accidentally, or if several sites are copying your content, refresh yours so that yours is unique
  • Make your content easily accessible – don’t cover the page in adverts that are intrusive
  • Try and get external links from authoritative websites – if you’re cited as a source of information you’ll be considered more trustworthy
  • Make the switch to https if you haven’t already – having a secure site is already a confirmed ranking factor, and if you’re handling any customer information it needs to be secure.

Finally, have a read of Google’s Search Quality Raters’ Guidelines and ensure your content matches their requirements.

Has Your Site Been Affected by the E-A-T Updates? Let us Know Below!

How to Do an SEO Technical Audit
Mar 04

By Aires Loutsaris | Marketing, SEO Tips

What Is It and Why Is It Important?

Technical SEO is one of the most important things to consider as part of any SEO campaign. Whether you’re a seasoned pro or just starting out in SEO, you need to get the basics right; otherwise you’ll spend a lot of time (and money!) on content and links and you’ll always be hindered by a poor site.

If you’re not sure what things to consider or how to check for certain issues, then look no further – our guide on how to complete a technical SEO audit will answer your questions.

Our technical SEO audit will cover the following core areas:

  • Indexation
  • Redirects & response codes
  • Mobile
  • Content
  • Search Console
  • International targeting

Within some sections there’ll be sub-sections for you to consider.

So – let’s get on with the audit!

Tools you’ll need

There are some manual checks involved in a technical audit, but to help you along the way and speed things up, we recommend the following tools. We’ll touch on each one in more detail as we come to use it:

  • Screaming Frog – download it here. Screaming Frog crawls your site and compiles a list of every page on the site, along with other useful information such as response codes, page titles, and much more.
  • Google Search Console – set it up here. GSC is a dashboard you can use to monitor issues that Google is seeing and flagging on your site.
  • Google Page Speed Insights – measure the loading time for your site and find issues hindering performance.
  • GTmetrix – similar to Google’s page speed tool, but offers a bit more detail and guidance on fixing the major issues.
  • Google’s Mobile Friendly testing tool – pretty self-explanatory, but highlights if your site works properly on mobile devices, and what you can do to fix it if not.
  • Excel (or any other spreadsheet tool) – this is to save all the lists of URLs that need action.

Crawl Your Site with Screaming Frog

If you’ve got Screaming Frog installed, simply open it and paste in your domain. If not, download it from here (if your site has more than 500 URLs, you’ll need to buy a licence to crawl the full site).

If you’ve not used Screaming Frog before, then check out their full user guide which explains how to do each task.

If you’ve used it before, then bookmark it as it will come in handy in future.

As a side note – Seer Interactive have an epic guide on how to do almost anything in Screaming Frog. Whether you’ve never used it before or have been using it for years, it’s really, really useful.

Paste your URL into the address bar at the top and click “start”. Depending on the size of your site, the crawl will take anything from a few minutes to several hours. Once it’s finished – save your crawl! This saves you having to re-run it and wait for it all over again later.

Once the crawl has completed, there are several things to export from SF depending on the size of your site and what you’re hoping to achieve, but at a top level we’d recommend:

  • Internal all
  • Directives
  • Response codes
  • Image alt tags

Remember to keep your crawl open for the duration of your audit, as you’ll need to export other things along the way.

What We’re Looking For

Ideally, we want to find major issues with our site that may be hindering performance – this may come in several forms, but what we want to know is that the users and search engines can find our content easily on any device.


There are several things that can hinder a site’s performance in the SERPs, so let’s dive in and get the basics right.

Duplicate Versions of Your Site

So, a site can resolve (load) at the following URLs:

  • http://yoursite.com
  • http://www.yoursite.com
  • https://yoursite.com
  • https://www.yoursite.com

That’s four different versions of the same site. To avoid confusion for users and search engines, we only want one version of the site indexed, so we need to pick ONE preferred version and ensure that proper redirects are in place.

To check if you have this issue, simply try your website domain with and without the different variations. You *should* see the non-www version of the domain redirecting to the www version (or vice versa, depending on how you’ve set it up). The same goes for http:// to https:// – simply try both versions of your URL and see what happens.
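As a quick sketch, the four variants you need to test can be generated like so (a hypothetical helper, not a standard tool – pair it with `curl -I` or your browser to check where each one redirects):

```python
def site_variants(domain):
    # domain: the naked hostname, e.g. "yoursite.com".
    # Returns the four URLs the same site can typically resolve at.
    return [f"{scheme}://{host}{domain}/"
            for scheme in ("http", "https")
            for host in ("", "www.")]
```

Running `site_variants("yoursite.com")` gives you the http/https and www/non-www combinations to check one by one – three of the four should 301-redirect to your single preferred version.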

If there’s no redirect in place, then there’s no need to panic – it’s an easy fix!

How to Fix This Issue

So, if you’ve got duplicate versions of your site – first, check if there are canonical tags in place. A canonical tag tells search engines which is the correct version of your page to crawl/index. To check this, we can use the exports from Screaming Frog (under the ‘directives’ heading on your ‘internal-all’ export, or in the ‘directives’ export itself), or simply check the source of the page (Ctrl+U) and search for ‘canonical’. It should look like this:

[Image: a canonical tag in a page’s source]

If there’s a canonical tag in place, then although it’s not ideal to have several different versions, if all your canonical tags point to the preferred URL then there shouldn’t be an issue.

The best way to solve this permanently is to redirect everything to the preferred version – so, if for example the preferred URL is https://www.yoursite.com, we need a redirect that pushes everything that doesn’t contain www. to the www. URL, and any http:// visits to the https:// version of your site.

If your site is in WordPress, it normally does a pretty good job of handling these things. Otherwise, ask your web developer to redirect them for you. If you’d like to handle them yourself, then use the code from these URLs to force the www. URL and the https:// version:



Add these to your htaccess file and this will sort the redirects for you.
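For an Apache server with mod_rewrite enabled, the rules typically look something like this (a sketch assuming https://www.yoursite.com is your preferred version – test carefully before deploying):

```apache
RewriteEngine On

# Send any http:// request to the https:// equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Send any non-www host to the www. version
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag makes these permanent redirects, which is what we want for consolidating duplicate versions.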


Robots.txt

A robots.txt file is a guide for search engine crawlers to follow (or avoid, should you wish). The main things we want to ensure are:

  • Only parts of your site you don’t want indexed (such as login pages, or pages behind a login/paywall, or a dev/UAT site) are blocked
  • Your whole site isn’t being blocked (DISASTER!)

If you want your site to be seen by search engines, you do not want your robots.txt file to look like this:

[Image: a robots.txt file that blocks the entire site]

This directive tells ALL web spiders not to crawl your site. If this is how your robots.txt file looks and you want your site to be indexed, then you need to delete the “/” ASAP.
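In plain text, that worst-case file is just:

```
User-agent: *
Disallow: /
```

The `*` matches every crawler, and `Disallow: /` blocks everything from the root down – remove the trailing slash (or the whole rule) if you want your site crawled.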

To test this, go to https://www.yoursite.com/robots.txt and see what it looks like. In most cases it’s fine and there’s nothing to worry about, but it’s worth a check.

Also, check that your XML sitemap location is added to your robots.txt file, so it looks like the below:

[Image: a robots.txt file with the XML sitemap location added]

If you want to block a certain file type or path, simply add them to your robots.txt file as pictured below:

[Image: a robots.txt file blocking a specific file type and path]
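As a text sketch (the paths here are illustrative – swap in your own), blocking a folder and a file type looks like:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /*.pdf$
```

The `*` and `$` wildcards are supported by Google’s crawler; see the directive list linked below for the full rules.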

Google has a full list of robots.txt directives here, and a list of rules you can apply to your robots.txt file.

If you want to test that you have your robots.txt file set up correctly, use Google’s robots.txt tester – if there are any errors, they’ll be flagged in the testing tool:

[Image: Google’s robots.txt tester]

XML Sitemap

An XML sitemap is a list of the pages on your website that helps search engine spiders crawl and index your content.

Whilst it isn’t an absolute necessity to have one – it’s definitely worthwhile. Google’s own description of sitemaps is as follows:

“A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to more intelligently crawl your site.”

Using a sitemap doesn’t guarantee that all the items in it will be crawled and indexed, as Google’s processes rely on complex algorithms to schedule crawling. However, in most cases your site will benefit from having one, and you’ll never be penalized for having it.
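A minimal XML sitemap, per the sitemaps.org protocol, looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yoursite.com/</loc>
    <lastmod>2019-03-01</lastmod>
  </url>
</urlset>
```

Each page gets its own `<url>` entry; `<lastmod>` is optional but helps crawlers prioritise recently updated pages.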

If you’re not sure if you have an XML sitemap – ask your web developer. In most cases, it will reside at /sitemap.xml, however it does depend on the filename of your sitemap.

If you need to create one, you can do this in Screaming Frog by clicking “Sitemaps” and “create XML sitemap”

[Image: creating an XML sitemap in Screaming Frog]

This will export an XML sitemap for you to upload to the root of your site.

If you’re using WordPress, then there are several plugins that can create one for you (such as Google XML sitemaps or Yoast)

Once you’ve got an XML sitemap, submit it to Google Search Console (click to jump to that section). This will help flag any errors with your sitemaps, which you can then check out in more detail. When you’ve submitted your sitemap, you’ll see a screen similar to the below:


If there’s an error with a particular sitemap, just click on the corresponding sitemap and it will explain what the issue is:


Once you’ve created and submitted your sitemap, just check regularly for any errors in GSC. To find out more about sitemaps, check out Google’s sitemap resource section.

Check Which Pages Google Is Seeing in Its Results

A site: search will give us an indication of how many pages Google has indexed from your site.

Simply type “site:yourdomain.com” (don’t include the https://www), as pictured below:

[Image: a site: search showing how many pages Google has indexed]

As you scroll down Google’s results, you can see if there are any pages indexed that we don’t want or that shouldn’t be there.

The number of results should be close to the number of pages on your site (it will never be exact), but if it’s out by, say, thousands of URLs, then there’s a problem somewhere.

Submit Your XML Sitemap

One of the best ways to get your content seen by Google is to include it in your XML sitemap and submit it to Google Search Console. Log in to GSC, and under “Index”, click ‘Sitemaps’:

[Image: the Sitemaps section in Google Search Console]

At the top of the page you’ll see a field that looks like the below, where you can add your own sitemap – simply enter the URL of your sitemap and click ‘submit’ – it’s as easy as that!

[Image: the ‘Add a new sitemap’ field]

Once your sitemap has been submitted, if there are any issues, Google will flag them here.

Redirects and Response Codes

How you pass users through your site can have an impact on your site’s rankings, so it is important that this is done right.

Earlier on, we said export the “response codes” tab from Screaming Frog – you’ll need this now. Here’s a quick overview of some of the most common response codes you’ll see:

  • 200 – OK: this means that the page has loaded correctly and there are no issues
  • 301 – permanent redirect
  • 302 – temporary redirect
  • 404 – broken link/page
  • 500 – internal error

You may sometimes see a 403 or 502 code – in these instances, the solution is to fix the broken page or redirect it to the most relevant page.
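If you’d rather sort your ‘response codes’ export programmatically than filter by hand, a short sketch (assuming rows of URL/status pairs pulled from the CSV):

```python
from collections import defaultdict

def bucket_by_status(rows):
    # rows: iterable of (url, status_code) pairs from your crawl export.
    # Returns {status_code: [urls]} so each bucket can go in its own sheet.
    buckets = defaultdict(list)
    for url, status in rows:
        buckets[int(status)].append(url)
    return dict(buckets)
```

`buckets[301]` then gives you every redirecting URL to check for chains, and `buckets[404]` every broken page to fix or redirect.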

Ideally, if a page is no longer valid (such as a discontinued service or product, or a renamed page), rather than leaving it broken, we want to point users to another relevant part of the site.

The way to do this is through a 301 (permanent) redirect – this tells search engines that the address of that page has permanently changed to a new address. These will show in the SF export as “301” in the ‘status codes’ column on your export:

[Image: 301 status codes in the Screaming Frog export]

What we’re looking for are any pages that redirect from A -> B -> C; this is known as a redirect chain. Ideally, a URL should redirect as follows:

https://www.yoursite.com/old-url/ to https://www.yoursite.com/new-url/. If there’s an instance where it redirects from /old-url/ to /new-url/ to /new-url-2/, this is a redirect chain; we want to remove the middle redirect and instead redirect the first URL straight to the last.
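Flattening chains can be expressed as a small function (a sketch – in practice your redirect map would come from your server config or crawl export):

```python
def final_target(redirects, url, max_hops=10):
    # Follow the chain until a URL that is not itself redirected,
    # with loop protection via `seen`.
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

def collapse_chains(redirects):
    # Rewrite every redirect so it points straight at its final destination.
    return {src: final_target(redirects, dst) for src, dst in redirects.items()}
```

So `{"/old": "/new", "/new": "/new-2"}` collapses to both URLs pointing directly at `/new-2` – one hop for users and crawlers instead of two.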

Any 302 redirects will show as ‘302’ in the status codes column:

[Image: 302 status codes in the Screaming Frog export]

There may be times when a 302 redirect is useful, such as redirecting users to a seasonal page (a summer or Christmas page, for example); however, if that redirect is to remain in place permanently, it should be updated to a 301 rather than a 302.

Copy all of the URLs that return a 302 response code, paste them into a new sheet in your Excel audit, and rename the sheet ‘302 response codes’ – that way, if you pass the audit on to someone to implement the recommendations, they’ll have a list of everything in one handy file.


Mobile

The majority of searches now take place on a mobile device, and that share is only going to increase; accordingly, mobile friendliness is a ranking factor in Google’s results. If your content doesn’t render properly or takes a long time to load on a mobile device, you’re less likely to outrank a competitor that serves mobile content correctly.

Mobile Friendly Test

Head over to Google’s mobile friendly testing tool: https://search.google.com/test/mobile-friendly , and enter in your site’s URL to run the test and see if your site is mobile friendly or not:

[Image: mobile friendly test results]

If your site isn’t mobile friendly, you’ll be met with something similar to the below highlighting the errors:

[Image: mobile usability errors highlighted by the testing tool]

The most common issues are:

  • Viewport not set
  • Viewport not set to ‘device-width’
  • Content wider than screen
  • Text too small to read
  • Clickable elements too close together

These items mean that your site either isn’t responsive or doesn’t serve content correctly, or that users will find it difficult to read text or click items on your site. An example can be seen below:

[Image: example of a non-mobile-friendly page]

Img source: https://developers.google.com/web/fundamentals/design-and-ux/responsive/

To fix the viewport & the content wider than the screen issues, we can kill three birds with one stone, simply by adding the following code to your site’s header:

<meta name="viewport" content="width=device-width, initial-scale=1">

This should also help with the ‘text too small to read’ issue. It’s also advisable to use a standard font, and there are a few guidelines from Google to ensure your text can be read properly:

  • Use a font size of 16 CSS pixels as a minimum
  • Use sizes relative to the base size to define the typographic scale.
  • Ensure there’s enough space between the lines of text – use the browser default line height of 1.2em.
  • Don’t use too many different fonts on a page
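Those guidelines translate to CSS along these lines (a sketch, not a complete stylesheet):

```css
body {
  font-size: 16px;     /* 16 CSS pixels minimum base size */
  line-height: 1.2em;  /* browser-default line height */
}
h1 {
  font-size: 2rem;     /* sized relative to the base, not hard-coded */
}
```

Using `rem` units for headings keeps the whole typographic scale relative to the base size, so one change to `body` rescales everything consistently.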

Mobile URLs

Sometimes, sites serve content to users on different devices at different URLs, such as:

[Image: desktop and mobile versions of a site at separate URLs]

Img source: https://developers.google.com

If this is the case on the site you’re auditing, the main thing to look for is annotations that tell search engines that the same site exists at another URL. This looks like:

  • On your desktop site, a line of code that looks like:

<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">


  • On the mobile version, there has to be a canonical tag pointing to the desktop version of the site, that looks like:

<link rel="canonical" href="http://www.example.com/page-1">

These tags need to exist on a page-by-page basis – for every page on the desktop site, there needs to be a corresponding tag on the mobile version of the site, rather than just one at domain level.

This tells search engines that the same content is being served at another URL, but it is part of the same site, rather than two separate sites.

The alternative to this is to ensure that your site is responsive; although Google has no preference as to which setup you have, as long as it is accessible to users and all Googlebot user-agents.

Page Speed

Page speed is one of Google’s confirmed ranking factors – particularly on mobile devices.

No one wants to wait forever for a site to load. If your site takes a long time, it’s likely that a user will click the back button and go to another site, so it’s important to get the page speed sorted on your site.

To test your site’s page speed, use either of the two tools mentioned earlier: Google PageSpeed Insights or GTmetrix.

They will both usually give similar recommendations to improve your page speed, but it’s good to test in both as you’ll get confirmation of which areas to focus on.

To test your page speed, simply paste your URL into both tools and press “analyse”

For the purpose of this post, we’ve picked a site that we know needs a bit of improvement page speed-wise:

[Image: page speed test results for an example site]

The suggested improvements are:

[Image: suggested page speed improvements]

With most sites, it’s usually images that slow things down. The best thing you can do is compress your images so that they load quickly – if you’ve got Photoshop, make sure you use “Save for Web and Devices” when exporting any images.

If your site already has a lot of images and you’re using WordPress, there are plugins such as Smush It that compress images for you, meaning you can serve properly scaled images to your users.

The main considerations for most page speed tools are:

  • Optimise your images
  • Enable Gzip Compression – this compresses any style sheets and your whole web page before it hits the user’s browser, so that files transferred between your server and the user’s browser are displayed quicker. Add the code from this link: https://gtmetrix.com/enable-gzip-compression.html to your site’s htaccess file.
  • Leverage Browser Caching – when a user loads your page, it has to download all the images and style sheets to the browser. Browser caching stores these images locally in the user’s browsing data, so if they revisit your site, the page loads significantly faster. Get the code to add to your site’s htaccess file here: https://gtmetrix.com/leverage-browser-caching.html
  • Minify JS/CSS/HTML files – this is essentially just compressing your core files so that they load quicker.
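On an Apache server, the gzip and caching items typically end up as htaccess rules along these lines (a sketch – the GTmetrix links above have fuller versions):

```apache
<IfModule mod_deflate.c>
  # Gzip-compress text resources before they are sent to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
  # Browser caching: let returning visitors reuse downloaded assets
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
```

The `<IfModule>` wrappers mean the rules are simply skipped (rather than breaking the site) if a module isn’t enabled on your server.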

Sometimes, depending on your site’s theme/layout, there will be recommendations to remove JS code from above the fold of your page – more information on how to do this can be found in Google’s developer insight guidelines.

International Targeting

International SEO is essentially ensuring your site is effectively targeting other countries (and languages where appropriate).

Unfortunately, if you get it wrong it can have a significant negative impact on your site… but fear not, as we’ve got you covered here.

The main thing is to ensure that your site is set up to effectively target other countries – especially if you are an international company. An example could be a retailer that sells clothes to the UK, US, Australia, Spain, France and so on.

With this in mind, there are a few things to consider in how you approach this:

  • Does your content all sit on the same domain, or separate domains?
  • Is your content being translated for each site?
  • Ensure there’s hreflang tags in place

You could also look at redirecting users based on their IP/location, however Google have advised against this in the past, so it’s best to ensure that your site is set up to serve international users correctly.

So, with those above considerations in mind, here’s how you go about it.

Subfolder or ccTLD (Country Code Top-Level Domain)

Firstly, does your site have everything at www.yoursite.com with country subfolders (such as www.yoursite.com/us/ or www.yoursite.com/au/), or does each country have its own domain?

Either way, the first thing to do is check that they’re added to Google Search Console, and set to target the correct location – you’ll need to use the old version of GSC to do this:

[Image: international targeting settings in the old Search Console]

Check that a country is set (sometimes this is missed during the initial GSC setup), and if there isn’t one defined, simply click the tick box and select your target country.

A ccTLD is a great signal to search engines that you want to target a specific country – www.yoursite.it, for example, makes it obvious that you’re targeting users in Italy. However, it means that no link equity is passed from the main domain, compared to the content residing at www.yoursite.com/it/.

Check what solution your site has in place (if any), and ensure that international targeting is set up in GSC.

Is Your Content Translated Correctly?

While having translated content is a great signal to search engines as to where you want to target, poorly translated content is a big no-no and can do you more harm than good.

Make sure your content is translated by someone who speaks the target language fluently – ideally a native speaker; relying on machine translation or Google Translate alone is a recipe for disaster.

Hreflang Tags

A hreflang tag is a way of telling search engines that another version of your site exists in another language, or in the same language targeting a different country (for example, you may have an English site that targets England, Ireland, USA and Australia).

Without the tags, it’s duplicate content, which could stop your sites ranking as highly as they could/should. Check out Google’s advice on hreflang tags here.

Check the source of the site you’re auditing, and see if there’s code similar to the below:

<link rel="alternate" hreflang="en-gb"
href="http://en-gb.example.com/page.html" />
<link rel="alternate" hreflang="en-us"
href="http://en-us.example.com/page.html" />
<link rel="alternate" hreflang="en"
href="http://en.example.com/page.html" />
<link rel="alternate" hreflang="de"
href="http://de.example.com/page.html" />
<link rel="alternate" hreflang="x-default"
href="http://www.example.com/" />

Important: whilst you’re telling search engines that there are several versions of your page, the set also needs a ‘self-referencing’ tag – an entry pointing at the page the tags sit on – so search engines know which version they’re looking at.
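If you’re producing these blocks for many pages, a small helper sketch (a hypothetical function, not a standard tool) shows the shape of the output, including the x-default entry:

```python
def hreflang_tags(versions, default=None):
    # versions: {"en-gb": url, "de": url, ...} – every language/region
    # version of ONE page, including the page the tags will sit on
    # (that entry becomes the self-referencing tag).
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in sorted(versions.items())]
    if default:
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="{default}" />')
    return "\n".join(tags)
```

Every page outputs the complete set, so two versions of a page plus an x-default produce three link tags on each of them.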

The Screaming Frog crawl should find all pages that have hreflang tags – when you’ve crawled the site, click the “hreflang” tab:

[Image: the hreflang tab in Screaming Frog]

Img source: Screaming Frog

This will then give you a list of URLs along with any issues with the hreflang tags, such as tags set up incorrectly or missing a self-referencing entry.

If there are no tags in place, but you have an international site, you’ll need to create tags for each site.

Aleyda Solis has covered this in her blog post about how to avoid issues with hreflang tags, and as an additional bonus she’s created a fantastic, easy to use hreflang generator that creates the tags for you.

If you want to check the tags on your own site, visit https://hreflangchecker.com/#/

Important: sometimes hreflang tags are placed in the XML sitemap instead, so if you don’t see the tag in the HTML of the page, don’t automatically assume that there are no tags in place.


Content

The content on your site is one of the most important factors in where your site ranks within a search engine’s results – does it match users’ queries, is it factually correct, is it unique to your site, is it up to date? All of these things play a part, so it’s vital that your content is as optimised as it can be.

Duplicate Content

Duplicate content is when the text on a page of your site appears, word for word, somewhere else – either on another page of your own site or on an external website. Search engines reward great, unique content; so if your content is duplicated across several pages of your own site, or copied from somewhere else, you’re not going to rank well.

There is no actual penalty for having duplicate content, but Google will crawl your site less frequently and favour another site that has regular, informative, unique content.

To check whether you have duplicate content on your own site, or whether someone else is using your content, you can use Siteliner or Copyscape – both offer free and premium versions. One of the best and simplest ways, however, is to take a few lines of text from one of your pages and paste them into a Google search wrapped in quotation marks, for example:

“The use of mobile devices to surf the web is growing at an astronomical pace”

[Image: Google results for the quoted search]

There are several thousand results all using the same text. If someone has duplicated your text, you can email the site to ask them to alter it; chances are, though, you’ll not get a reply. Instead, it might be worth rewriting your copy slightly so that it is unique to your site.

If you have duplicate content across several pages of your own site, then make a list of the affected pages, and create unique content for each page, removing all instances of duplicate content.
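If you’d rather check overlap in bulk than paste snippets into Google one at a time, a rough sketch of the idea – compare shared word sequences between two pages (the gram length is illustrative):

```python
def ngram_overlap(a, b, n=8):
    # Fraction of a's word n-grams that also appear in b.
    # 1.0 means a is entirely duplicated in b; 0.0 means no shared runs.
    def grams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    ga, gb = grams(a), grams(b)
    if not ga:
        return 0.0
    return len(ga & gb) / len(ga)
```

Run it across every pair of core pages; anything scoring high is a candidate for a rewrite so each page ends up unique.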

Missing Page Titles

The page title is one of the most important on-page SEO elements, as it tells users and search engines what your page is about, and helps your site stand out in the SERPs.

In Screaming Frog, click the ‘page titles’ tab, and filter by “Missing” – this will show any pages that have no title tag on the page. Export this and add it to a new tab in your excel doc, so you have a list of all issues in one place.

[Image: the ‘Missing’ page titles filter in Screaming Frog]

For all pages that should have a tag, create a new page title that meets the following requirements:

  • Unique to that page
  • Within 65 characters
  • Is descriptive of the page
  • Contains your target keywords
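The requirements above can be checked in bulk with a quick sketch (a hypothetical helper – run it over the title column of your Screaming Frog export):

```python
def check_title(title, keyword, seen_titles):
    # Flag a page title against the guidelines above.
    # seen_titles: set of titles already used on other pages.
    issues = []
    if not title:
        issues.append("missing")
    else:
        if len(title) > 65:
            issues.append("over 65 characters")
        if keyword.lower() not in title.lower():
            issues.append("target keyword missing")
        if title in seen_titles:
            issues.append("duplicate of another page")
    return issues
```

An empty list means the title passes; anything else goes into your spreadsheet of titles to rewrite.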

Duplicate Page Titles

A duplicate title tag is confusing to search engines – if you have several pages with the same title tag, how is a search engine spider meant to know which is the correct page to rank/index within its results?

In Screaming Frog, click the ‘page titles’ tab, and filter by “Duplicate” – export these and add to a new tab in your audit file.

Following the same guidelines as above, create a unique title for each page that has a duplicate title tag.

Short Page Titles

Whilst not a huge issue, page titles can contain up to 65 characters, so if your titles aren’t utilising this space, you could be adding extra keywords to help your pages rank higher. In Screaming Frog, click the “Page titles” tab and filter by “Below 30 characters” – this gives you a list of all page titles that could potentially be expanded on.

Missing H1 Tags

An h1 tag is an HTML tag that describes what is on the page, and usually acts as the main heading on the page.

Search engine spiders use the page title, headings and content on the page to understand what the page is about, so it’s important to have an optimised h1; however, depending on your site’s theme, h1s can sometimes be missing completely.

Using Screaming Frog, click the h1 tab, and filter by “missing” (see below)

[Image: the ‘Missing’ h1 filter in Screaming Frog]

This will give you a list of all pages missing an h1 tag. Export these and add them to your spreadsheet so you have a list of all pages that need new h1s.

Multiple H1 Tags

Generally, a page should only have one h1 tag, but it can have as many h2s/h3s and so on as you’d like. When there’s more than one h1 on a page, they can sometimes be ignored by search engines, so you’re wasting a great opportunity to send an extra signal to search engines.

Use Screaming Frog again, this time filtering by “Multiple” – this will show you all the pages that have multiple h1 tags. Add these to your missing-h1s tab, and you’ll have a list of all pages that need their h1s redoing.

Meta Descriptions

The meta description (pictured below) is the messaging you see in the search results below the page title and URL:

[Image: a meta description in Google’s search results]

While not a ranking factor, the right messaging can help improve click through rate (CTR), and therefore they should be addressed alongside your on page content.

Within the Screaming Frog exports, we’re looking for the following:

  • Missing meta descriptions
  • Short descriptions
  • Duplicate descriptions

Using Screaming Frog, click the “meta description” tab, and as before, filter for each of the description types:

[Image: meta description filters in Screaming Frog]

Export the missing, duplicate and ‘below 70 characters’ lists and save them to a new tab in your spreadsheet, as new meta descriptions will need to be created for all of them.

Guidelines for Creating Meta Descriptions:

  • Use up to 165 characters – anything more may be truncated in Google’s results. We have previously seen some descriptions of up to 300 characters, but in order to ensure we’re not overdoing it, stick to 165 characters
  • Be descriptive of the page, but include a call to action such as “visit us today”, “contact us now”, “buy online now from ___”
  • Ensure your meta descriptions contain the target keyword of the page, and are unique to each page
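The length checks above can be sketched as a quick script. The 70- and 165-character thresholds come from this audit’s guidelines, and the sample tags are hypothetical:

```python
from html.parser import HTMLParser

MIN_LEN, MAX_LEN = 70, 165  # character thresholds used in this audit

class MetaDescriptionFinder(HTMLParser):
    """Grabs the content of <meta name="description"> if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def audit_meta_description(html):
    """Classify a page's meta description as missing/short/long/ok."""
    finder = MetaDescriptionFinder()
    finder.feed(html)
    if not finder.description:
        return "missing"
    if len(finder.description) < MIN_LEN:
        return "short"
    if len(finder.description) > MAX_LEN:
        return "long"
    return "ok"

# Hypothetical example: 17 characters is well under the 70-character minimum
print(audit_meta_description('<meta name="description" content="Buy shoes online.">'))  # short
```

The keyword and uniqueness checks still need a human (or at least a comparison across your full page list), but length is trivial to automate.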

Canonical Tags

A canonical tag is a line of code that tells search engines that there’s a specific version of a page or URL that we want them to index. As mentioned earlier in the audit, a page can resolve at several URLs.

When this happens, we need to tell search engines which page to index, so we need canonical tags in place. Below is a live example from Thomas Cook’s website:

Canonical tags


We’re looking for any pages that are either missing a canonical tag, or are pointing to a non-indexable page. Export the ‘missing’ and ‘non-indexable canonical’ tabs, and add them to your spreadsheet. For any page that doesn’t have a canonical, ensure one is put in place – if there’s no correct page to point to, simply add a self-referencing canonical (a page pointing to itself).

Any non-indexable canonicals (such as those pointing to a page that returns a 404 error) should be updated so that they point to live pages.
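As with the other tag checks, a canonical spot check can be sketched with the standard library. This is a simplified illustration (the URLs are hypothetical, and the trailing-slash normalisation is an assumption about how your URLs vary):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grabs the href of <link rel="canonical"> if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

def audit_canonical(html, page_url):
    """Return 'missing', 'self-referencing', or the canonical target URL."""
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical is None:
        return "missing"
    # Trailing slashes often vary between otherwise identical URLs
    if finder.canonical.rstrip("/") == page_url.rstrip("/"):
        return "self-referencing"
    return finder.canonical

# Hypothetical page:
print(audit_canonical(
    '<link rel="canonical" href="https://example.com/page/">',
    "https://example.com/page",
))  # self-referencing
```

Any page that returns “missing” needs a canonical added; any page whose canonical target isn’t a live, indexable URL needs the tag updating.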

Search Console

Search Console is a free tool from Google that helps you track the performance of your website within Google’s results. It can be used to highlight technical issues, show which terms people are finding your site with, and flag any other issues Google is having with your website.

Firstly, check if you have access to your site’s search console account – if you’re working for a client, ask for full access so you can do your checks effectively.

Search Console has recently been updated, with some features removed and others set to follow in future, so for the purposes of this review we’re focusing on the latest version of GSC.

When you first login, you will see a page that looks similar to the below:

Search Console

Select your site from the top left of the screen, and you’ll see a menu that looks like the below:

Search Console

We’ll run through these in more detail.


Performance

The performance section highlights how your site is doing in Google’s results in terms of search queries, clicks, impressions, CTR and average position:

queries, clicks, impressions, CTR and average position

Simply click on each of the headings highlighted, and you’ll see which pages are performing best and which queries are driving the most traffic to your site, along with which countries your users are coming from and which device types they’re using.

  • Clicks – self-explanatory; the number of people clicking on your site from Google’s results. If the click volume is low but there’s high impression volume, we know it’s worth targeting that term with our on-page copy
  • Impressions – this is the number of times your site was seen in Google’s results for a particular query.
  • Average CTR – average click through rate, so what % of users actually click through to your site from Google’s results
  • Average position – the average ranking for that particular keyword within Google’s results. This figure can often be misleading, so it’s always best to do a manual check, and use any 3rd party tools if you have access to them.

From this data, you can see which terms people are searching for, and where you potentially need to improve – if there’s a high impression term that has a low CTR, then you can start optimising your site for these terms.
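That “high impressions, low CTR” filter is easy to run yourself once you export the query rows. The sketch below assumes rows shaped as (query, clicks, impressions) tuples, and the thresholds are illustrative assumptions, not Google guidance:

```python
def find_ctr_opportunities(rows, min_impressions=1000, max_ctr=0.02):
    """Flag queries with lots of impressions but a low click-through rate.

    rows: (query, clicks, impressions) tuples, as exported from the
    Performance report. Thresholds are illustrative, not Google guidance.
    """
    opportunities = []
    for query, clicks, impressions in rows:
        ctr = clicks / impressions if impressions else 0.0
        if impressions >= min_impressions and ctr <= max_ctr:
            opportunities.append((query, round(ctr, 4)))
    return opportunities

# Hypothetical export rows:
rows = [
    ("red trainers", 300, 5000),  # CTR 6% – already performing
    ("blue boots", 15, 2000),     # CTR 0.75% – worth optimising for
]
print(find_ctr_opportunities(rows))  # [('blue boots', 0.0075)]
```

Queries that come back from this filter are the ones where a better page title or meta description is most likely to pay off.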

URL Inspection Tool

The URL inspection tool gives you up to date information about how Google is seeing and indexing your content within its results. Simply click the “URL inspection” tab on the menu, and paste in your URL – you’ll be greeted with a page similar to the below with a list of any issues:

URL Inspection tool

Within the tool we can see:

  • Whether the page is indexed – if not, we can request indexing simply by clicking the “request indexing” button
  • A rendered version of the page
  • A live inspection of a particular page
  • A list of loaded resources
  • Whether your URL is in your XML sitemap

Depending on how your site is seen by Google, you’ll get varying results depending on whether it’s indexed or not, and on any issues Google may have faced (such as the page being blocked, being an alternate version of another URL, or returning a 404/500 error). A full list of status issues can be found here.

Check your most important URLs with the inspection tool, and if any issues are flagged, work on these as a priority.


Coverage

The coverage section shows you how many URLs are in Google’s index, how many pages are missing, and how many (if any) are showing errors. Click “coverage” and you’ll see a screen similar to the below:


This is probably one of the most important sections within the new search console, as it will flag pages that are returning crawl errors. If your site is showing any errors, click on the error and you’ll see the offending pages:

server error

If, for example, you have pages that are returning a 500 error, you can export this list and set about fixing the pages or redirecting them. Once you’ve fixed them, click the “validate fix” button, which tells Google you’ve fixed these pages, so they should stop showing in future reports.

Some of the most common errors are:

  • Server error (5xx)
  • Redirect error (such as a redirect loop where a page redirects from one URL to another, to another and so on)
  • Submitted URL blocked by robots.txt – sometimes this may have been done accidentally, but at least it’s flagged in here and we can update the robots.txt file if necessary
  • Submitted URL marked ‘noindex’ – a noindex tag tells search engines not to include that particular page within its results; any pages with this tag will be flagged here.
  • Submitted URL seems to be a soft 404 – a soft 404 is a page that is broken, but returns a 200 status code (which means that the page is working correctly)
  • Submitted URL returns unauthorized request – this could be that Googlebot was blocked, or the content is behind a login/paywall
  • Submitted URL not found (404) – the page isn’t working and is returning a 404 error
  • Submitted URL has crawl issue – use the inspection tool to see what the issue is, if it is not one of the above specified issues.
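The soft 404 case is worth a closer look, because it’s the one your server logs won’t flag. A rough heuristic is sketched below; the phrase list and length threshold are illustrative assumptions, not how Google actually detects soft 404s:

```python
# Phrases and length threshold are illustrative assumptions only
NOT_FOUND_PHRASES = ("page not found", "cannot be found", "404")

def looks_like_soft_404(status_code, body_text):
    """Heuristic: a 'soft 404' returns 200 OK but reads like an error page."""
    if status_code != 200:
        return False  # a real error status code is not a *soft* 404
    text = body_text.lower()
    if any(phrase in text for phrase in NOT_FOUND_PHRASES):
        return True
    return len(text.strip()) < 50  # near-empty pages are also suspicious

print(looks_like_soft_404(200, "Sorry, this page cannot be found."))  # True
print(looks_like_soft_404(404, "Not here"))                           # False
```

The proper fix for a soft 404 is to make the page return a real 404/410 status (or redirect it somewhere useful), rather than serving an error message with a 200 status code.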

If a URL is being flagged here, click on the URL and you can dive into this further:

flagged url

Click “fetch as Google” to see what Google is seeing for that particular page, and you may begin to understand why the page is broken.


Sitemaps

We touched on submitting your sitemap earlier; this section gives an overview of any issues that may have arisen with your XML sitemap, and you can also submit another sitemap here.

Ideally, you want to see something similar to the below with no issues:


If there are any errors, they’ll be flagged in here – there are quite a lot of potential sitemap errors, outlined in Google’s page about them here.

Mobile usability

The mobile usability section shows which pages (if any) have problems when viewed on mobiles. The first page (pictured below) gives an overview of some of the issues, then you can drill down into each individual issue:

Mobile usability

Click on each issue, and it will show the offending URLs:

Mobile usability

To fix these issues, simply head back to the mobile friendly section of the audit and implement those fixes, and click “validate fix” within search console.

Manual Actions

A manual action is effectively a penalty from Google, usually handed out for doing something that is against Google’s guidelines.

What you really, really want to see on this tab is the below:

Manual Actions

A manual action is quite hard to remove – you’ll have to clean up whatever you’ve done wrong, and request a review from Google.

The most common manual actions are for:

  • User generated spam – for example if there are 1000s of profiles on your site with usernames related to products, linking to external sites
  • Unnatural links – either pointing to your site through link building practices, or from your site if you’re selling links to 3rd parties, or if your site has been hacked and lots of links have been placed within the code of your site
  • Thin content – this is mainly if you’ve been copying other people’s content, or have poor quality pages that exist purely to make money, such as low quality affiliate pages or doorway pages
  • Pure spam – this is one of the worst and hardest to overcome; you’ll need to remove all offending code from your site, clean up your content, sort your links and request a review, and hope that you’ve done enough to clear your site.

There are other types of manual actions – head over to Google’s help section for more information on the less common issues.

One of the most common causes of a manual action occurs when someone buys an expired domain, puts new content on there and then receives a penalty. If this happens to you, request a review, and include a snapshot of what the old site looked like from https://archive.org/web/, highlighting what you’ve changed, and you should get the manual action removed.

Security Issues

If your site has been hacked, users may be alerted with a warning in Google’s Chrome browser. Any security issues will also be flagged in here, usually showing the URLs that are affected and what the problem is. Typically, the cause is that the site has been hacked through a 3rd party plugin, and your content has been changed without you knowing.

Sometimes, phishing pages can be set up to try and capture users’ personal details, so keep an eye on this section to see if there’s anything wrong with your site.


Links

The links section shows you all of the links to your site that Google has seen. There may be some that you know are live that aren’t showing in here yet, but don’t worry – over time they’ll start to show.

Within this section, you can see:

  • Which sites link to you
  • What anchor text is used
  • Which pages are linked to most often
  • Internal links throughout your site

external links

With this data, you can identify which areas need more internal and external links to try and help boost your visibility overall.

How to Set Up a New GSC Account

If you haven’t already got a GSC account, it’s really simple to set one up. Using the menu, click “add property”:

How to setup a new GSC account

You’ll then be asked which type of property you want to set up – we recommend the domain-level one highlighted below:

How to setup a new GSC account

You’ll then be required to verify the site via a DNS record, using the code provided:

How to setup a new GSC account

If you don’t have access to this, set up a URL-prefix property instead and add the HTML file to your site; alternatively, the easiest way is to copy and paste the HTML tag into your site’s header:

add HTML file to your site

Once you’ve added the tag, click “verify”, then ‘done’, and you’re ready to go!

If you’re using WordPress, you can use the Yoast SEO plugin: just copy the HTML tag, paste it into the plugin’s dashboard, and that’s it.

Yoast SEO plugin

Once you’re verified, give it a few days for the data to populate in Search Console, then you can monitor your site’s performance on a regular basis.


So, you now know how to audit a site, and which parts are the most important (indexation, mobile friendliness, how your content is served to both users and search engines). All of the elements outlined in this audit play a vital part in how your site runs technically – so, now you can go and build solid foundations for your own successful SEO campaign.


Sep 02

10 Reasons Why You Need SEO

By Aires Loutsaris | Marketing

Your business certainly needs SEO. Google has never revealed exact numbers, but it is said that over 1 billion searches are performed every single day, and that around 20% of these are attempts to find local businesses. If you want your customers to be able to find you on the internet, you need to do everything you can to get noticed, and the best way to do this is through SEO.

  1. Branding

SEO can help you to establish your brand and even your business online. It gives you the chance to lead your customers and even your clients to your site and it is a great way for you to build the presence that you have online as well. When you take a look at your website, is it easy to navigate? If it’s not then this could go against your SEO and your brand authority in general so you need to do everything you can to establish your brand online through SEO if you want to be successful.

  2. Cost Reductions

If you think about how much you spend on advertising in the local paper and anything else of the sort, you’ll often find that the return doesn’t justify the cost and the money lost on it is significant. If you spend your money on SEO, however, a return is far more likely – provided you go through the right provider.

  3. Traffic

With SEO, your business website will receive targeted traffic. Without it, you won’t be able to target your traffic, and you’ll find it much harder to generate the leads you need. Customers will also consider you an expert if you are at the top of Google.

  4. Sales

SEO will boost your sales. Think about it, if a client finds your site, this will help you to bring in more sales and if you have a good level of SEO then your sales can be boosted even more.

  5. Targeting

There is a whole market out there that is trying to find you, and SEO can be a great way for you to tap into this market.

  6. Competitiveness

With SEO, you can outrank your competitors and gain a solid edge in your local market. Your online presence can help you boost your brand image, and you can even win customers from your competitors.

  7. 24/7 Operation

When you have a good SEO plan, you’ll find that it works around the clock to drive business for you. This is cheaper than most other marketing plans, and even a plan perfectly suited to your business is unlikely to run 24 hours a day, so that’s something to think about.

  8. ROI

SEO can provide a huge boost for your business. The problem with traditional advertising is that it is costly and can deliver zero results. With SEO, however, you can reach thousands of people who are genuinely interested in your product, because you are targeting them specifically, so you always have the chance to make your money back – and then some profit – through your own sales.

  9. Authority

If your business is very prestigious, that’s great news but if you do not have any authority online then this will really hinder your site. Your customers and visitors won’t consider you to be an expert and you will lose sales. If you have a good level of SEO however then you can easily avoid all of this.

  10. Get Found

If you have good SEO then you can easily open up your business to a ton of new customers and it is a great way for you to make sure that you are doing everything you can to expand. SEO really could hold the key to success and it is a great way for you to get started with your business online.

So as you can see, there are a ton of benefits to getting involved with SEO when starting up a business. The two work very well together, and it isn’t hard to get to the top of Google for your chosen keywords either. If you want to get started, contact your local SEO provider today and they will work with you to get the best result out of your marketing efforts and your content in general.

Sep 02

Top 25 SEO Blogs You Need To Read

By Aires Loutsaris | Marketing

If you want some quality reading material to kick-start your morning, you’ve come to the right place. Here are 25 of the top SEO blogs you should be reading on a day-to-day basis.

  1. Search Engine Land

Search Engine Land is one of the most popular SEO sources. You’ll find trends, guides, analysis and even tips and tricks.

  2. Search Engine Watch

This is one of the most important SEO resources on the web. It covers news and any Google updates, with plenty of readable posts by some of the top experts in the industry.

  3. Search Engine Journal

This is a very popular blog and it thrives on guest posts. This means that you can find a ton of different voices on a range of topics.

  4. The Moz Blog

If you have been in the world of SEO for quite some time then you will most likely have heard of Moz. They post every day, with valuable new reading material.

  5. HubSpot

HubSpot is a top-rated blog known for its distinctive writing style, which makes its content easy to understand.

  6. Google Webmaster Central

If you run a site or blog, Google Webmaster Central – Google’s own blog for site owners – can help you run it successfully.

  7. SEMrush

SEMrush is an SEO blog that can offer you a very competitive analysis tool and it also has a great blogging platform.

  8. Yoast

Yoast is super user friendly and it is designed so that it can provide you with the knowledge you need to feel confident on a huge range of topics. This includes social media, conversions and more.

  9. Search Engine Roundtable

This site is a very popular source if you want to find out more about SEM. It’s run by experts and it can help you to boost your blog statistics.

  10. Akash Srivastava’s SEO Blog

This blog is focused on providing actionable tips. It’s a great way for you to learn everything that you need to know.

  11. HigherVisibility

This SEO blog can put you on the right path to getting more visibility for your site.

  12. Content Marketing Institute

The Content Marketing Institute provides excellent content alongside guidance on SEO best practices.

  13. SEO Book

SEO Book was founded in 2003 with the purpose of training readers in online business and marketing, through tips and training sessions.

  14. CopyBlogger

CopyBlogger focuses on copywriting as well as helping you to create SEO friendly content, starting today.

  15. KISSmetrics

Kissmetrics was launched by a digital marketer, and the site is all about optimizing your site to get the most out of it.

  16. Quick Sprout

Quick Sprout focuses on traffic and analytics, with practical tips that can help you to boost your site.

  17. Clickz

Clickz is all about the latest SEO updates, and it can also help you keep up with what’s happening in the wider world of search.

  18. Distilled

Distilled is an online marketing company that can help you with creative writing, consulting, technical SEO and much more.

  19. SEO Hacker

SEO Hacker provides some of the best guides on SEO, and can also help you improve your conversion rate and your optimization.

  20. Seer Interactive

Seer Interactive is a top marketing company that can help you to optimize your blog and it can also help you to get all of the knowledge you need on paid search, SEO tools and more.

  21. SEO.com

SEO.com can show you how to take advantage of the latest SEO trends, and you can learn a lot about link building on this site.

  22. SEO by The Sea

SEO by the Sea can show you how to study SEO in depth, as well as understand the future ranking signals Google plans to introduce.

  23. iAcquire

iAcquire is a blog that offers a different perspective on various marketing strategies. It can help you understand the psychology behind SEO techniques, as well as whether they actually work.

  24. BacklinkO

This site is one of the top blogs around, and a great read if you want to get the best results out of your site.

  25. TopRank Online Marketing

TopRank Marketing helps companies and organizations integrate their marketing services, whether you need help with SEO, SMM or anything in between.


Sep 01

SEO Vs PPC: Who Wins?

By Aires Loutsaris | Marketing

If you want to know the difference between SEO and PPC then you certainly have a lot to think about. The first thing that you need to think about is whether or not you plan on selling a product, and what your company goals are. For example, if you know that you plan on selling a product or even a service then there is a huge difference between the two. If you want to find out more, then take a look at the difference between PPC and SEO.


SEO isn’t just about optimizing your site, and it isn’t just about ranking highly on Google either. It’s about being an authority and continually proving yourself within the search engine. Major search engines such as Google, Yahoo and Bing look at how people react when they come to your site, whether they come back, and which sites are linking to you. PPC, on the other hand, is about paying for the advertisement space you want for your targeted keywords – but you need to make sure you understand the concept before implementing it on your own site.

With pay per click, your advertisement will show at the top of Google, marked as an advertisement. This is great if you are trying to sell a product or build brand awareness, but if you are just trying to drive traffic to your blog or anything of the sort, this is not the way to go.

Your Strategy

The campaign you run depends on your advertising strategy. The major advantage of SEO is higher quality leads, and it can also tell you whether you are actually reaching a targeted customer base. Many users have trained themselves to ignore paid results when browsing the web, often because they are not in the audience the advertisement is trying to target. For this reason, you need to do everything you can to target the right audience with your advertisements, with flawless efficiency.

The truth is that mountains of data suggest you should use natural search to gain visitors, and that customers arriving through natural search are much more likely to trust your company. Ranking high for a specific keyword is a clear sign that you are credible and an expert in the industry. Of course, it is important to know that SEO is not free: whether it’s your own time or a vendor you hire to take care of your SEO, it always comes with a cost.

So Should You Use PPC?

Don’t dismiss PPC just yet – it comes with plenty of extra benefits. If you set up your campaign well, you can see a strong return on your investment, and it can be a quick way to reach the number one slot. Remember that it isn’t the search engine’s number one priority to give the top slot to whoever pays the most, but PPC can deliver a short-term gain on your investment that could potentially turn into a long-term return.

So as you can see, there are some real differences between PPC and SEO, and it helps to know them so you can take advantage of what both have to offer. If you want to find out more about PPC or SEO, get in touch with us today by phone or email – we can’t wait to show you why we are the best at what we do. We are always happy to help, and we can discuss your own site with you at any time.