
About the Author

My name is Aires Loutsaris and I am an eCommerce Search Engine Optimization Consultant working with some of the world's biggest VC-funded startups and eCommerce companies. I have over 14 years of experience in SEO, search marketing and user acquisition. I have an SEO course with over 2.5k paying students on Udemy, I was an SEO Lecturer at the University of the Arts London, London College of Fashion, and I have been shortlisted for five awards, winning two. Fun fact: I have performed SEO services for all three of the universities I studied at: The Open University, The University of Hull and King's College London. I have also been the Head of SEO at the agencies SEO.co.uk and Net Natives. Contact me for any reason; I'm happy to help and I am very approachable. Additionally, feel free to connect with me on LinkedIn.

How to do an SEO Technical Audit
Mar 04


By Aires Loutsaris | Marketing , SEO Tips


What Is It and Why Is It Important?

Technical SEO is one of the most important things to consider as part of any SEO campaign. Whether you’re a seasoned pro or just starting out in SEO, you need to get the basics right; otherwise you’ll spend a lot of time (and money!) on content and links and you’ll always be hindered by a poor site.

If you’re not sure what things to consider or how to check for certain issues, then look no further – our guide on how to complete a technical SEO audit will answer your questions.

Our technical SEO audit will cover the following core areas:

  • Indexation
  • Redirects & response codes
  • Mobile
  • Content
  • Search Console
  • International targeting

Within some sections there’ll be sub-sections for you to consider.

So – let’s get on with the audit!

Tools you’ll need

There are some manual checks involved in a technical audit, but to help you along the way and speed things up a bit, we recommend using the following tools. We’ll touch on each one in more detail as we come to use it:

  • Screaming Frog – download it here. Screaming Frog crawls your site and compiles a list of every page on the site, along with other useful information such as response codes, page titles, and much more.
  • Google Search Console – set it up here. GSC is a dashboard that you can use to monitor issues that Google is seeing and highlighting on your site.
  • Google Page Speed Insights – measure the loading time for your site and find issues hindering performance.
  • GTmetrix – similar to Google’s page speed tool, but offers a bit more detail and guidance on fixing the major issues.
  • Google’s Mobile Friendly testing tool – pretty self-explanatory, but highlights if your site works properly on mobile devices, and what you can do to fix it if not.
  • Excel (or any other spreadsheet tool) – this is to save all the lists of URLs that need action.

Crawl Your Site with Screaming Frog

If you’ve got Screaming Frog installed, then simply open it and paste in your domain. If not, download it from here (if your site is more than 500 URLs, you’ll need to buy a licence to crawl the full site).

If you’ve not used Screaming Frog before, then check out their full user guide which explains how to do each task.

If you’ve used it before, then bookmark it as it will come in handy in future.

As a side note – Seer Interactive have this epic guide on how to do almost anything in Screaming Frog. Whether you’ve never used it before or have been using it for years, it’s really, really useful.

Paste your URL into the address bar at the top, and click “start”. Depending on the size of your site, it will take anything from a few minutes to several hours. Once it’s finished – save your crawl! You’ll thank us for this one; it saves you having to re-do the crawl and wait for it to run again.

Once the crawl has completed, there are several things to export from SF depending on the size of your site and what you’re hoping to achieve, but at a top level we’d recommend:

  • Internal all
  • Directives
  • Response codes
  • Image alt tags

Remember to keep your crawl open for the duration of your audit, as you’ll need to export other things along the way.

What We’re Looking For

Ideally, we want to find major issues with our site that may be hindering performance – this may come in several forms, but what we want to know is that the users and search engines can find our content easily on any device.

Indexation

There are several things that can hinder a site’s performance in the SERPs, so let’s dive in and get the basics right.

Duplicate Versions of Your Site

So, a site can resolve (load) at the following URLs:

  • http://yoursite.com
  • http://www.yoursite.com
  • https://yoursite.com
  • https://www.yoursite.com

That’s four different versions of the same site. To avoid confusion for users and search engines, we only want one instance of the site indexed, so we need to pick ONE preferred version, and ensure that there are proper redirects in place.

To check if you have this issue, simply try your website domain with and without the different variations. You *should* see the non-www version of the domain redirecting to the www version (or vice versa, depending on how you’ve set it up). The same goes for http:// to https:// – simply try both versions of your URL and see what happens.

If there’s no redirect in place, then there’s no need to panic – it’s an easy fix!
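To make the check concrete, here’s a minimal Python sketch showing how all four variants should collapse to a single preferred version (the domain and the https + www preference are placeholder assumptions – swap in whichever version your site standardises on):

```python
from urllib.parse import urlsplit, urlunsplit

# Normalise any of the four variants to one preferred version.
# The preferred scheme/host (https + www) is an assumption for this example.
def preferred_url(url, scheme="https", host="www.yoursite.com"):
    parts = urlsplit(url)
    return urlunsplit((scheme, host, parts.path, parts.query, parts.fragment))

variants = [
    "http://yoursite.com/page/",
    "http://www.yoursite.com/page/",
    "https://yoursite.com/page/",
    "https://www.yoursite.com/page/",
]
targets = {v: preferred_url(v) for v in variants}
print(set(targets.values()))  # all four variants collapse to one URL
```

Every variant should 301 to that single target; if your manual checks show anything else, the redirects need fixing.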

How to Fix This Issue

So, if you’ve got duplicate versions of your site – first, check if there are canonical tags in place. A canonical tag tells search engines which is the correct version of your page to crawl/index. To check this, we can use the exports from Screaming Frog (under the ‘directives’ heading on your ‘internal-all’ export, or in the ‘directives’ export itself), or simply check the source of the page (Ctrl+U) and search for ‘canonical’. It should look like this:

<link rel="canonical" href="https://www.yoursite.com/page/" />

If there’s a canonical tag in place, then although it’s not ideal to have several different versions, if all your canonical tags point to the preferred URL then there shouldn’t be an issue.

The best way to solve this permanently is to redirect everything to the preferred version – so, if for example the preferred URL is https://www.yoursite.com, we need a redirect that pushes everything that doesn’t contain www. to the www. URL, and any http:// visits to the https:// version of your site.

If your site is in WordPress, it normally does a pretty good job of handling these things. Otherwise, ask your web developer to redirect them for you. If you’d like to handle them yourself, then use the code from these URLs to force the www. URL and the https:// version:

https://www.inmotionhosting.com/support/website/htaccess/force-www-htaccess

https://www.inmotionhosting.com/support/website/ssl/how-to-force-https-using-the-htaccess-file

Add these to your .htaccess file and this will sort the redirects for you.

Robots.txt

A robots.txt file is a guide for search engine crawlers to follow (or avoid should you wish). The main things we want to ensure are:

  • Only parts of your site you don’t want indexed (such as login pages, or pages behind a login/paywall, or a dev/UAT site) are blocked
  • Your whole site isn’t being blocked (DISASTER!)

If you want your site to be seen by search engines, you do not want your robots.txt file to look like this:

User-agent: *
Disallow: /

This directive tells ALL web spiders not to crawl your site. If this is how your robots.txt file looks and you want your site to be indexed, then you need to delete the “/” ASAP.

To test this, go to https://www.yoursite.com/robots.txt and see what it looks like. In most cases, it is usually fine and there’s nothing to worry about, but it’s worth a check.

Also, check that your XML sitemap location is added to your robots.txt file, so it looks like the below:

User-agent: *
Disallow:
Sitemap: https://www.yoursite.com/sitemap.xml

If you want to block a certain file type or path, simply add them to your robots.txt file, for example:

User-agent: *
Disallow: /admin/
Disallow: /*.pdf$

Google has a full list of robots.txt directives here, and a list of rules you can apply to your robots.txt file.

If you want to test that you have your robots.txt file set up correctly, use Google’s robots.txt tester – if there are any errors, they’ll be flagged in the testing tool:

Google’s robots.txt tester
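If you’d rather sanity-check your rules offline as well, Python’s built-in robots.txt parser gives a quick second opinion. The file contents below are hypothetical – paste in your own site’s rules to test them:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt -- replace with your own file's contents.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Sitemap: https://www.yoursite.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Public content should be crawlable; the blocked path should not be.
print(parser.can_fetch("*", "https://www.yoursite.com/blog/post/"))   # True
print(parser.can_fetch("*", "https://www.yoursite.com/wp-admin/page"))  # False
```

If `can_fetch` returns False for pages you want indexed, your Disallow rules are too broad.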

XML Sitemap

An XML sitemap is a list of the pages on your website that helps search engine spiders crawl and index your content.

Whilst it isn’t an absolute necessity to have one – it’s definitely worthwhile. Google’s own description of sitemaps is as follows:

“A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to more intelligently crawl your site.”

Using a sitemap doesn’t guarantee that all the items in your sitemap will be crawled and indexed, as Google’s processes rely on complex algorithms to schedule crawling. However, in most cases, your site will benefit from having a sitemap, and you’ll never be penalised for having one.

If you’re not sure if you have an XML sitemap – ask your web developer. In most cases, it will reside at /sitemap.xml, however it does depend on the filename of your sitemap.

If you need to create one, you can do this in Screaming Frog by clicking “Sitemaps” and then “Create XML sitemap”.

XML sitemap

This will export an XML sitemap for you to upload to the root of your site.

If you’re using WordPress, then there are several plugins that can create one for you (such as Google XML sitemaps or Yoast)
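If you’d rather script it, a bare-bones sitemap can also be generated with Python’s standard library. The page list below is a placeholder – in practice you’d pull it from your CMS or a crawl export:

```python
import xml.etree.ElementTree as ET

# Hypothetical page list -- pull this from your CMS or crawl export in practice.
pages = [
    "https://www.yoursite.com/",
    "https://www.yoursite.com/about/",
    "https://www.yoursite.com/blog/",
]

# Build the <urlset> document per the sitemaps.org protocol.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = page

sitemap_xml = '<?xml version="1.0" encoding="UTF-8"?>' + ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Save the output as sitemap.xml in the root of your site.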

Once you’ve got an XML sitemap, submit it to Google Search Console (see the “Submit Your XML Sitemap” section below). This will help flag any errors with your sitemaps, which you can then check out in more detail. When you’ve submitted your sitemap, you’ll see a screen similar to the below:

sitemap

If there’s an error with a particular sitemap, just click on the corresponding sitemap and it will explain what the issue is:

sitemap

Once you’ve created and submitted your sitemap, just check regularly for any errors in GSC. To find out more about sitemaps, check out Google’s sitemap resource section.

Check Which Pages Google Is Seeing in Its Results

A site: search will give us an indication of how many pages Google has indexed from your site.

Simply type “site:yourdomain.com” (don’t include the https://www), as pictured below:

indication of how many pages Google has indexed from your site

As you scroll down Google’s results, you can check whether any pages are indexed that we don’t want or that shouldn’t be there.

The number of results should be close to the number of pages on your site (it will never be exact), but if it’s out by thousands of URLs, then there’s a problem somewhere.

Submit Your XML Sitemap

One of the best ways to get your content seen by Google is to include it in your XML sitemap and submit it to Google Search Console. Log in to GSC, and under “Index”, click “Sitemaps”:

Submit your XML sitemap

At the top of the page you’ll see a field that looks like the below, where you can add your own sitemap – simply enter the URL of your sitemap and click ‘submit’ – it’s as easy as that!

add a new sitemap

Once your sitemap has been submitted, if there are any issues, Google will flag them here.

Redirects and Response Codes

How you pass users through your site can have an impact on your site’s rankings, so it is important that this is done right.

Earlier on, we said export the “response codes” tab from Screaming Frog – you’ll need this now. Here’s a quick overview of some of the most common response codes you’ll see:

  • 200 – OK: this means that the page has loaded correctly and there are no issues
  • 301 – permanent redirect
  • 302 – temporary redirect
  • 404 – broken link/page
  • 500 – internal error

You may sometimes see other codes, such as a 403 (forbidden) or 502 (bad gateway) – in these instances, the solution is to fix the broken page or redirect it to the most relevant page.

Ideally, if a page is no longer valid (such as a discontinued service or product, or if you’ve renamed a page), rather than the page simply not working, we want to point users to another relevant part of the site.

The way to do this is through a 301 (permanent) redirect – this tells search engines that the address of that page has permanently changed to a new address. These will show in the SF export as “301” in the ‘status codes’ column on your export:

Redirects and response codes

What we’re looking for are any pages that are redirecting from A->B->C, this is known as a redirect chain. Ideally, a URL should redirect as follows:

https://www.yoursite.com/old-url/ to https://www.yoursite.com/new-url/.  If there’s an instance where it redirects from /old-url/ to /new-url/ to /new-url-2/ this is a redirect chain, and really we want to remove the middle redirect, and instead redirect from the first URL to the last URL.
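The chain-spotting logic is easy to automate. Given a “source → target” redirect map like the one from your crawl export, a small sketch like this (URLs are hypothetical) flags any URL that takes more than one hop:

```python
# "source -> target" pairs, e.g. pulled from a Screaming Frog redirect export.
redirects = {
    "/old-url/": "/new-url/",
    "/new-url/": "/new-url-2/",  # middle hop: this creates a chain
    "/legacy/": "/current/",
}

def final_destination(url, redirects, max_hops=20):
    """Follow redirects to the final URL and count the hops taken."""
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > max_hops:  # guard against redirect loops
            raise ValueError(f"redirect loop starting near {url}")
    return url, hops

dest, hops = final_destination("/old-url/", redirects)
print(dest, hops)  # anything with more than 1 hop should be collapsed
```

Anything reported with more than one hop should be rewritten to redirect straight from the first URL to the last.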

In instances where there’s a 302 redirect; these will show as ‘302’ in the status codes column:

302 redirect

There may be times when a 302 redirect is useful; such as if you’re redirecting users to a seasonal page (such as a summer or Christmas page for example), however if that redirect is to remain in place permanently, these redirects should be updated to a 301 redirect rather than a 302.

Copy all of the URLs that are returning a 302 response code, create a new sheet in your Excel audit, and paste them into it. Rename the sheet “302 response codes” – that way, if you’re passing the audit on to someone to implement the recommendations, they’ll have a list of everything in one handy file.

Mobile

The majority of searches now take place on mobile devices, and this share is only going to increase in future; accordingly, mobile friendliness is a ranking factor in Google’s results. If your content doesn’t render properly or takes a long time to load on a mobile device, then you’re less likely to rank over a competitor that serves mobile content correctly.

Mobile Friendly Test

Head over to Google’s mobile friendly testing tool: https://search.google.com/test/mobile-friendly, and enter your site’s URL to run the test and see if your site is mobile friendly or not:

Mobile friendly test

If your site isn’t mobile friendly, you’ll be met with something similar to the below highlighting the errors:

site isn’t mobile friendly

The most common issues are:

  • Viewport not set
  • Viewport not set to ‘device-width’
  • Content wider than screen
  • Text too small to read
  • Clickable elements too close together

These items mean that your site either isn’t responsive or doesn’t serve content correctly, and that users will find it difficult to read text or click items on your site. An example of this can be seen below:

mobile friendly test 2

Img source: https://developers.google.com/web/fundamentals/design-and-ux/responsive/

To fix the viewport & the content wider than the screen issues, we can kill three birds with one stone, simply by adding the following code to your site’s header:

<meta name="viewport" content="width=device-width, initial-scale=1">

This should also help with the ‘text too small to read’ issue. It’s also advisable to use a standard font, and Google has a few guidelines to ensure your text can be read properly:

  • Use a font size of 16 CSS pixels as a minimum
  • Use sizes relative to the base size to define the typographic scale.
  • Ensure there’s enough space between the lines of text – use the browser default line height of 1.2em.
  • Don’t use too many different fonts on a page
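To check whether a page already has the viewport tag, you can scan its HTML with Python’s built-in parser – a rough sketch, with the sample markup made up for illustration:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Record the content of any viewport meta tag found in the page."""
    def __init__(self):
        super().__init__()
        self.viewport = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            self.viewport = attrs.get("content", "")

# Hypothetical page source -- feed in real HTML fetched from your site.
html_page = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
checker = ViewportChecker()
checker.feed(html_page)
print(checker.viewport)  # None would mean the tag is missing
```

If `viewport` comes back as None, the page is missing the tag and will fail the mobile-friendly test.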

Mobile URLs

Some sites serve content to users on different devices at different URLs, such as:

Mobile URL

Img source: https://developers.google.com

If this is the case on the site you’re auditing, the main thing we need to look for is annotations that tell search engines that the same site exists at another URL. What this looks like is:

  • On your desktop site, a line of code that looks like:

<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.yoursite.com/page-1">

  • On the mobile version, there has to be a canonical tag pointing to the desktop version of the site, that looks like:

<link rel="canonical" href="http://www.example.com/page-1">

These tags need to exist on a page-by-page basis – for every page on the desktop site, there needs to be a corresponding tag on the mobile version, rather than just one at domain level.

This tells search engines that the same content is being served at another URL, but it is part of the same site, rather than two separate sites.

The alternative to this is to ensure that your site is responsive; although Google has no preference as to which setup you have, as long as it is accessible to users and all Googlebot user-agents.
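Because these annotations have to exist for every page, it’s worth generating them rather than writing them by hand. A minimal sketch – the domain names are placeholders:

```python
# Build the desktop-side alternate tag and the mobile-side canonical tag
# for a single path. Domain names here are hypothetical examples.
def mobile_annotations(path):
    desktop = f"https://www.yoursite.com{path}"
    mobile = f"https://m.yoursite.com{path}"
    alternate = (f'<link rel="alternate" '
                 f'media="only screen and (max-width: 640px)" href="{mobile}">')
    canonical = f'<link rel="canonical" href="{desktop}">'
    return alternate, canonical

alt_tag, canon_tag = mobile_annotations("/page-1")
print(alt_tag)    # goes in the desktop page's <head>
print(canon_tag)  # goes in the mobile page's <head>
```

Run it over your full URL list and you have the complete set of page-level annotations.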

Page Speed

Page speed is one of Google’s confirmed ranking factors – particularly on mobile devices.

No one wants to wait forever for a site to load, so…

If your site takes a long time to load, then it’s likely that a user will click the back button and go to another site, so it’s important to get the page speed sorted on your site.

To test your site’s page speed, use the two tools mentioned earlier: Google PageSpeed Insights and GTmetrix.

They will both usually give similar recommendations to improve your page speed, but it’s good to test in both as you’ll get confirmation of which areas to focus on.

To test your page speed, simply paste your URL into both tools and press “analyse”

For the purpose of this post, we’ve picked a site that we know needs a bit of improvement page speed-wise:

page speed

The suggested improvements are:

suggested improvements page speed test

With most sites, it’s usually images that are slowing things down. The best thing you can do is compress your images so that they load quickly – if you’ve got Photoshop, then make sure you use “Save for Web and Devices” when saving any images.

If your site already has a lot of images and you’re using WordPress, there are plugins out there such as Smush which compress images for you, meaning that you can serve properly sized images to your users.

The main considerations for most page speed tools are:

  • Optimise your images
  • Enable Gzip Compression – this compresses any style sheets and your whole web page before it hits the user’s browser, so that files transferred between your server and the user’s browser are displayed quicker. Add the code from this link: https://gtmetrix.com/enable-gzip-compression.html to your site’s htaccess file.
  • Leverage Browser Caching – when a user loads your page, it has to download all the images and style sheets to the browser. Browser caching stores these images locally in the user’s browsing data, so if they revisit your site, the page loads significantly faster. Get the code to add to your site’s htaccess file here: https://gtmetrix.com/leverage-browser-caching.html
  • Minify JS/CSS/HTML files – this is essentially just compressing your core files so that they load quicker.
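To see why Gzip compression matters, you can compress a sample payload with Python’s standard library – HTML tends to compress extremely well because it’s so repetitive (the payload below is a made-up stand-in for a real page):

```python
import gzip

# A repetitive HTML-like payload standing in for a real page.
page = ("<div class='product'><h2>Item</h2><p>Description text</p></div>" * 200).encode()
compressed = gzip.compress(page)

print(len(page), "bytes raw vs", len(compressed), "bytes gzipped")
# The compressed payload is a small fraction of the original transfer size,
# which is exactly the saving Gzip gives you on the wire.
```

The same principle applies to your stylesheets and scripts: the server compresses, the browser decompresses, and far fewer bytes travel in between.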

Sometimes, depending on your site’s theme/layout, there will be recommendations to remove JS code from above the fold of your page – more information on how to do this can be found in Google’s developer insight guidelines.

International Targeting

International SEO is essentially ensuring your site is effectively targeting other countries (and languages where appropriate).

Unfortunately, if you get it wrong it can have a significant negative impact on your site… but fear not, as we’ve got you covered here.

The main thing is to ensure that your site is set up to effectively target other countries – especially if you are an international company. An example could be a retailer that sells clothes to the UK, US, Australia, Spain, France and so on.

With this in mind, there are a few things to consider in how you approach this:

  • Does your content all sit on the same domain, or separate domains?
  • Is your content being translated for each site?
  • Ensure there’s hreflang tags in place

You could also look at redirecting users based on their IP/location, however Google have advised against this in the past, so it’s best to ensure that your site is set up to serve international users correctly.

So, with those above considerations in mind, here’s how you go about it.

Subfolder or ccTLD (a Country-Code Top-Level Domain)

Firstly, does your site have everything at www.yoursite.com, and then a subfolder (such as www.yoursite.com/us/, or www.yoursite.com/au/), or will they have their own domain?

Either way, the first thing to do is check that they’re added to Google Search Console, and set to target the correct location – you’ll need to use the old version of GSC to do this:

international targeting

Check that a country is set (sometimes this is missed during the initial GSC setup), and if there isn’t one defined, simply click the tick box and select your target country.

A ccTLD is a strong signal to search engines that you want to target a specific country – for example, with www.yoursite.it it is obvious that you want to be targeting users in Italy. However, it means that no link equity is passed from the main domain, compared to if the content were to reside at www.yoursite.com/it/.

Check what solution your site has in place (if any), and ensure that international targeting is set up in GSC.

Is Your Content Translated Correctly?

While having translated content is a great signal to search engines as to where you want to target, having poorly translated content is a big no-no, and can do you more harm than good.

Make sure that your content is translated into the language you’re targeting by someone who speaks it fluently or natively – relying on machine-based translations or solely on Google Translate is a recipe for disaster.

Hreflang Tags

A hreflang tag is a way of telling search engines that another version of your site exists in another language, or in the same language targeting a different country (for example, you may have an English-language site with versions targeting the UK, Ireland, the USA and Australia).

Without the tags, the pages look like duplicate content, which could lead your sites to not rank as highly as they could or should. Check out Google’s advice on hreflang tags here.

Check the source of the site you’re auditing, and see if there’s code similar to the below:

<link rel="alternate" hreflang="en-gb" href="http://en-gb.example.com/page.html" />
<link rel="alternate" hreflang="en-us" href="http://en-us.example.com/page.html" />
<link rel="alternate" hreflang="en" href="http://en.example.com/page.html" />
<link rel="alternate" hreflang="de" href="http://de.example.com/page.html" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />

Important: whilst you’re telling search engines that there are several versions of your page, each page also needs a ‘self-referencing’ hreflang tag that points to itself.
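Generating the full cluster programmatically helps avoid missing the self-referencing or x-default entries. A minimal sketch, using the same example.com URLs as the snippet above:

```python
# Build the complete hreflang set for one page. Every language version of the
# page should output this whole block, so each page references itself too.
def hreflang_tags(variants, x_default):
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in variants.items()]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return tags

variants = {
    "en-gb": "http://en-gb.example.com/page.html",
    "en-us": "http://en-us.example.com/page.html",
    "de": "http://de.example.com/page.html",
}
for tag in hreflang_tags(variants, "http://www.example.com/"):
    print(tag)
```

Emit the same block on every variant of the page and the self-referencing requirement is satisfied automatically.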

The Screaming Frog crawl should find all pages that have hreflang tags – when you’ve crawled the site, click the “hreflang” tab:

Hreflang tags

Img source: Screaming Frog

This will then give you a list of URLs along with any issues with the hreflang tags, such as tags that are incorrectly set up or missing a self-referencing tag.

If there are no tags in place, but you have an international site, you’ll need to create tags for each site.

Aleyda Solis has covered this in her blog post about how to avoid issues with hreflang tags, and as an additional bonus she’s created a fantastic, easy to use hreflang generator that creates the tags for you.

If you want to check the tags on your own site, visit https://hreflangchecker.com/#/

Important: Sometimes; hreflang tags can be found in the XML sitemap instead, so if you don’t see the tag in the HTML of the page, don’t automatically assume that there are no tags in place.

Content

The content on your site is one of the most important factors in where your site ranks within a search engine’s results – does it match users’ queries? Is it factually correct? Is it unique to your site? Is it up to date? All these things play a part, so it’s vital that your content is as optimised as it can be.

Duplicate Content

Duplicate content is when the text on a page of your site also appears either somewhere else on your own site, or on an external website. Search engines reward great, unique content; so if your content is duplicated across several pages of your own site, or copied from somewhere else, you’re not going to rank well.

There is no actual penalty for having duplicate content, but Google will crawl your site less frequently, and de-value your site over another that has regular, informative, unique content.

To check if you have duplicate content on your own site, or if someone else is using your content, you can use Siteliner or Copyscape – they both offer free and premium versions. However, one of the best and simplest ways is to take a few lines of text from one of your pages and paste them into a Google search wrapped in quotation marks, for example:

“The use of mobile devices to surf the web is growing at an astronomical pace”

Duplicate content

There are several thousand results all using the same text – if someone has duplicated your text, you can email the site to ask them to alter it, however chances are you’ll not get a reply. Instead, it might be worth rewriting your copy slightly so that it is unique to your site.

If you have duplicate content across several pages of your own site, then make a list of the affected pages, and create unique content for each page, removing all instances of duplicate content.
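For a rough in-house check across your own pages, Python’s difflib can score how similar two blocks of copy are – a quick sketch, with made-up sample text:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0..1 similarity ratio between two blocks of copy."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical page copy for illustration.
page_a = "The use of mobile devices to surf the web is growing at an astronomical pace."
page_b = "The use of mobile devices to surf the web is growing at an astronomical pace."
page_c = "Our hand-made leather bags are crafted in small batches by local artisans."

print(similarity(page_a, page_b))  # 1.0 -- an exact duplicate
print(similarity(page_a, page_c))  # far lower -- this copy is unique
```

Pairs that score very high are your candidates for rewriting into unique content.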

Missing Page Titles

The page title is one of the most important on-page SEO elements, as it tells users and search engines what your page is about, and helps your site stand out in the SERPs.

In Screaming Frog, click the ‘page titles’ tab, and filter by “Missing” – this will show any pages that have no title tag on the page. Export this and add it to a new tab in your excel doc, so you have a list of all issues in one place.

Missing page titles

For all pages that should have a tag, create a new page title that meets the following requirements:

  • Unique to that page
  • Within 65 characters
  • Is descriptive of the page
  • Contains your target keywords

Duplicate Page Titles

A duplicate title tag is confusing to search engines – if you have several pages with the same title tag, how is a search engine spider meant to know which is the correct page to rank/index within its results?

In Screaming Frog, click the ‘page titles’ tab, and filter by “Duplicate” – export these and add to a new tab in your audit file.

Following the same guidelines as above, create a unique title for each page that has a duplicate title tag.

Short Page Titles

Whilst not a huge issue, page titles can contain up to 65 characters, so if your titles aren’t utilising this space, you could be adding extra keywords in there to help your pages rank higher. In Screaming Frog, click the “Page titles” tab, and filter by “Below 30 characters” – here you’ll have a list of all page titles that potentially could be expanded on.

Missing H1 Tags

An h1 tag is an HTML tag that describes what is on the page, and usually acts as the main heading on the page.

Search engine spiders use the page title, headings and content on the page to understand what the page is about, so it’s important to have an optimised h1, however sometimes depending on your site’s theme, h1s can be missing completely.

Using Screaming Frog, click the h1 tab, and filter by “missing” (see below)

Missing h1 tags

This will give you a list of all pages missing an h1 tag. Export these and add them to your spreadsheet so you have a list of all pages that need new h1s.

Multiple H1 Tags

Generally, a page should only have one h1 tag, though it can have as many h2s, h3s and so on as you’d like. When there’s more than one h1 on the page, they can sometimes be ignored by search engines, so you’re wasting a great opportunity to send an extra signal to search engines.

Use Screaming Frog again, and instead filter by “Multiple”, this will show you all the pages that have multiple h1 tags. Add these to your missing tab, and you’ll have a list of all pages that need h1s redoing.
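Spot checks on individual templates are also easy with Python’s built-in HTML parser – a sketch with made-up markup:

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Count the <h1> tags on a page -- there should be exactly one."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

# Hypothetical page source with a duplicated h1.
page = "<body><h1>Main heading</h1><h2>Sub</h2><h1>Second h1!</h1></body>"
counter = H1Counter()
counter.feed(page)
print(counter.h1_count)  # 2 -- this template needs its headings reworking
```

Any count other than exactly 1 puts the page on your fix list.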

Meta Descriptions

The meta description (pictured below) is the messaging you see in the search results below the page title and URL:

Meta Descriptions

While not a ranking factor, the right messaging can help improve click through rate (CTR), and therefore they should be addressed alongside your on page content.

Within the Screaming Frog exports, we’re looking for the following:

  • Missing meta descriptions
  • Short descriptions
  • Duplicate descriptions

Using Screaming Frog, click the “meta description” tab, and as before, filter for each of the description types:

meta description types

Export the missing, duplicate and below-70-characters lists, and save them to your spreadsheet in a new tab, as new meta descriptions will need creating for all of them.

Guidelines for Creating Meta Descriptions:

  • Use up to 165 characters – anything more may be truncated in Google’s results. We have previously seen some descriptions up to 300 characters, however in order to ensure that we’re not overdoing it, stick to 165 characters
  • Be descriptive of the page, but include a call to action such as “visit us today”, “contact us now”, “buy online now from ___”
  • Ensure your meta descriptions contain the target keyword of the page, and are unique to each page
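As with page titles, these guidelines can be checked in bulk. A hypothetical helper following the rules above (the sample description and keyword are made up):

```python
def audit_description(description, keyword, max_len=165):
    """Return a list of problems with a meta description, per the guidelines above."""
    issues = []
    if not description:
        return ["missing"]
    if len(description) > max_len:
        issues.append("too long")
    if keyword.lower() not in description.lower():
        issues.append("keyword absent")
    return issues

description = ("Learn how to run a full technical SEO audit, step by step. "
               "Contact us now for help with your site.")
print(audit_description(description, "technical SEO audit"))  # []
```

Anything flagged goes into the rewrite pile alongside your missing and duplicate descriptions.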

Canonical Tags

A canonical tag is a line of code that tells search engines that there’s a specific version of a page or URL that we want them to index. As mentioned earlier in the audit, a page can resolve at several URLs (with and without www., over http:// and https://, and so on).

When this happens, we need to tell search engines which page to index, so we need canonical tags in place. Below is a live example from Thomas Cook’s website:

Canonical tags

Canonical tags

We’re looking for any pages that are either missing a canonical tag, or are pointing to a non-indexable page. Export the ‘missing’ and ‘non-indexable canonical’ tabs, and add them to your spreadsheet. For any page that doesn’t have a canonical, ensure one is in place – if there’s not a correct page to point to, simply add a self-referencing canonical (a page pointing to itself).

Any non-indexable ones (such as where the canonical page is returning a 404 error) should be updated so that they point to live, indexable pages.
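Extracting canonicals at scale is what Screaming Frog is for, but for a one-off page you can pull the tag with a few lines of Python – the sample markup here is hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Capture the href of a page's rel="canonical" link, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page source -- feed in real HTML fetched from your site.
html_page = '<head><link rel="canonical" href="https://www.yoursite.com/page/"></head>'
finder = CanonicalFinder()
finder.feed(html_page)
print(finder.canonical)  # None would mean the canonical tag is missing
```

If the result is None, add a canonical; if it points at a dead URL, update it to a live one.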

Search Console

Search Console is a free tool from Google that helps you track the performance of your website within Google’s results. It can be used to highlight technical issues, the terms people are finding your site with, and any other issues Google is having with your website.

Firstly, check if you have access to your site’s search console account – if you’re working for a client, ask for full access so you can do your checks effectively.

Search console has recently been updated, with some features having been removed, and others set to follow in future, so for the purposes of this review, we’re focusing on the latest version of GSC.

When you first login, you will see a page that looks similar to the below:

Search Console

Select your site from the top left of the screen,  and you’ll see a menu that looks like the below:

Search Console

We’ll run through these in more detail.

Performance

The performance section highlights how your site is doing in Google’s results in terms of search queries, clicks, impressions, CTR and average position:

queries, clicks, impressions, CTR and average position

Simply click on each of the different headings highlighted, and you’ll see which pages are performing best and which queries are driving the most traffic to your site, along with which countries your users are coming from and on what device types.

  • Clicks – self-explanatory; the number of people clicking through to your site from Google’s results. If the click volume is low but the impression volume is high, we know it’s worth targeting that term with our on-page copy
  • Impressions – the number of times your site was seen in Google’s results for a particular query
  • Average CTR – the average click-through rate, i.e. what % of users actually click through to your site from Google’s results
  • Average position – the average ranking for that particular keyword within Google’s results. This figure can often be misleading, so it’s always best to do a manual check, and cross-check with any third-party tools you have access to

From this data, you can see which terms people are searching for, and where you potentially need to improve – if there’s a high impression term that has a low CTR, then you can start optimising your site for these terms.
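That "high impressions, low CTR" check is easy to automate on exported performance data. A rough sketch – the `rows` list below is made-up sample data, not real GSC output, and the 1% threshold is an arbitrary assumption:

```python
# Hypothetical rows exported from Search Console: (query, clicks, impressions)
rows = [
    ("blue widgets", 12, 4800),
    ("buy widgets online", 90, 1200),
    ("widget repair", 3, 150),
]

def ctr(clicks, impressions):
    """Click-through rate as a fraction (0.0 when there are no impressions)."""
    return clicks / impressions if impressions else 0.0

# Flag terms with plenty of impressions but a weak click-through rate
opportunities = [
    q for q, c, i in rows
    if i >= 1000 and ctr(c, i) < 0.01
]
print(opportunities)  # ['blue widgets']
```

Terms flagged this way are visible but under-clicked, so they're the first candidates for better titles and meta descriptions.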

URL Inspection Tool

The URL inspection tool gives you up-to-date information about how Google is seeing and indexing your content within its results. Simply click the “URL inspection” tab on the menu and paste in your URL – you’ll be greeted with a page similar to the below, with a list of any issues:

URL Inspection tool

Within the tool we can see:

  • Whether the page is indexed – if it’s not, we can request indexing simply by clicking the “request indexing” button
  • A rendered version of the page
  • A live inspection of a particular page
  • A list of loaded resources
  • Whether your URL is in your XML sitemap or not

Depending on how your site is seen by Google, you’ll get varying results based on whether it is indexed or not, and on any issues Google may have faced (such as the page being blocked, being an alternate version of another URL, or returning a 404/500 error). A full list of status issues can be found here.

Check your most important URLs with the inspection tool, and if any issues are flagged, work on these as a priority.

Coverage

The coverage section shows you how many URLs are in Google’s index, how many pages are missing, and how many are showing errors (if any). Click “coverage” and you’ll see a screen similar to the below:

Coverage

This is probably one of the most important sections within the new search console, as it will flag pages that are returning crawl errors. If your site is showing any errors, click on the error and you’ll see the offending pages:

server error

If, for example, you have pages that are returning a 500 error, you can export this list and set about fixing the pages or redirecting them. Once you’ve fixed them, click the “validate fix” button, which tells Google you’ve fixed these pages, so they should stop showing in future reports.

Some of the most common errors are:

  • Server error (5xx)
  • Redirect error (such as a redirect loop where a page redirects from one URL to another, to another and so on)
  • Submitted URL blocked by robots.txt – sometimes this may have been done accidentally, but at least it’s flagged in here and we can update the robots.txt file if necessary
  • Submitted URL marked ‘noindex’ – a noindex tag tells search engines not to include that particular page within its results; any pages with this tag will be flagged here.
  • Submitted URL seems to be a soft 404 – a soft 404 is a page that is broken but returns a 200 status code (which tells search engines the page is working correctly)
  • Submitted URL returns unauthorized request – this could be that Googlebot was blocked, or the content is behind a login/paywall
  • Submitted URL not found (404) – the page isn’t working and is returning a 404 error
  • Submitted URL has crawl issue – use the inspection tool to see what the issue is, if it is not one of the above specified issues.
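Redirect errors in particular lend themselves to a quick scripted check. Here's a sketch of loop detection over a URL-to-target redirect map – the `redirects` dict is hypothetical sample data you'd normally build from a crawl export:

```python
def find_redirect_loop(start, redirects, max_hops=20):
    """Follow a chain of redirects; return the looping path if one is found, else None."""
    seen = []
    url = start
    while url in redirects and len(seen) < max_hops:
        if url in seen:
            return seen[seen.index(url):]  # the repeating portion of the chain
        seen.append(url)
        url = redirects[url]
    return None

# Hypothetical redirect map: /a -> /b -> /c -> /a (a loop)
redirects = {"/a": "/b", "/b": "/c", "/c": "/a"}
print(find_redirect_loop("/a", redirects))  # ['/a', '/b', '/c']
```

A chain that simply ends at a live page returns `None`; anything that comes back as a path is a loop to break by pointing one of the hops straight at the final destination.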

If a URL is being flagged here, click on the URL and you can dive into this further:

flagged url

Click “fetch as Google” to see what Google is seeing on that particular page, and you may begin to understand why the page is broken.

Sitemaps

We touched on submitting your sitemap earlier; this section gives an overview of any issues that may have arisen with your XML sitemap, and also lets you submit another sitemap here.

Ideally, you want to see something similar to the below with no issues:

Sitemaps

If there are any errors, they’ll be flagged in here – there are quite a lot of potential sitemap errors, outlined in Google’s page about them here.
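Before submitting, you can sanity-check a sitemap file yourself. A minimal sketch with the Python standard library – the sitemap snippet below is a hypothetical example, not one of the files from this audit:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the <loc> values from a sitemap, so they can be spot-checked."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Hypothetical sitemap snippet
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(sitemap_urls(sitemap))
```

If the XML is malformed, `ET.fromstring` raises a parse error – a useful early warning before Google flags the same problem.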

Mobile usability

The mobile usability section shows which pages (if any) have problems when viewed on mobiles. The first page (pictured below) gives an overview of some of the issues, then you can drill down into each individual issue:

Mobile usability

Click on each issue, and it will show the offending URLs:

Mobile usability

To fix these issues, simply head back to the mobile friendly section of the audit and implement those fixes, and click “validate fix” within search console.

Manual Actions

A manual action is effectively a penalty from Google, usually handed out for doing something that is against Google’s guidelines.

What you really, really want to see on this tab is the below:

Manual Actions

A manual action is quite hard to remove – you’ll have to clean up whatever you’ve done wrong, and request a review from Google.

The most common manual actions are for:

  • User generated spam – for example if there are 1000s of profiles on your site with usernames related to products, linking to external sites
  • Unnatural links – either pointing to your site through link building practices, or from your site if you’re selling links to 3rd parties, or if your site has been hacked, for example, and lots of links have been placed within the code of your site
  • Thin content – this is mainly if you’ve been copying other people’s content, or have poor quality pages that exist purely to make money, such as low quality affiliate pages or doorway pages
  • Pure spam – this is one of the worst and hardest to overcome; you’ll need to remove all offending code from your site, clean up your content, sort your links and request a review, and hope that you’ve done enough to clear your site.

There are other types of manual actions – head over to Google’s help section for more information on the less common issues.

One common scenario is buying an expired domain that already carries a penalty: you put new content on it and inherit the manual action. If this happens to you, request a review, and include a snapshot of what the old site looked like from https://archive.org/web/, highlighting what you’ve changed, and you should get your manual action removed.

Security Issues

If your site has been hacked, users may be alerted with a warning in Google’s Chrome browser. Any security issues will also be flagged here, usually showing the URLs that are affected and what the problem is. Typically, the cause is that the site has been hacked through a 3rd party plugin, and content has been changed on your site without you knowing.

Sometimes, phishing pages can be set up to try and capture users’ personal details, so keep an eye on this section to see if there’s anything wrong with your site.

Links

The links section shows you all of the links to your site that Google has seen. There may be some that you know are live that aren’t showing in here yet, but don’t worry – over time they’ll start to show.

Within this section, you can see:

  • Which sites link to you
  • What anchor text is used
  • Which pages are linked to most often
  • Internal links throughout your site

external links

With this data, you can identify which areas need more internal and external links to try and help boost your visibility overall.
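If you export this link data, a quick tally shows which pages attract the most links. A sketch – the `links` rows below are hypothetical sample data standing in for a real export:

```python
from collections import Counter

# Hypothetical link rows: (linking site, anchor text, target page)
links = [
    ("blog-a.com", "widget guide", "/guides/widgets"),
    ("blog-b.com", "widgets", "/guides/widgets"),
    ("news-site.com", "home", "/"),
]

# Count how many links each target page receives
most_linked = Counter(target for _, _, target in links)
print(most_linked.most_common(1))  # [('/guides/widgets', 2)]
```

Pages near the bottom of this tally are the ones most likely to benefit from extra internal linking or outreach.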

How to Set Up a New GSC Account

If you’ve not already got a GSC account, it’s really simple to get started and set one up. Using the menu, click “add property”:

How to setup a new GSC account

You’ll then be asked which type of property you want to set up – we recommend the domain-level one highlighted below:

How to setup a new GSC account

You’ll then be required to verify the site via a DNS record, using the code provided:

How to setup a new GSC account

If you don’t have access to this, set the site up as a prefix property instead, and either add the HTML file to your site or – the easiest way – copy and paste the HTML tag into your site’s header:

add HTML file to your site

Once you’ve added the tag, click “verify”, then “done”, and you’re ready to go!
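If verification fails, it's worth confirming the tag actually made it into your live page source. A rough sketch of that check using only the standard library – the `html_source` string below is hypothetical, as is the `abc123` token:

```python
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Finds the content of a <meta name="google-site-verification"> tag."""
    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "google-site-verification":
            self.token = attrs.get("content")

def verification_token(html_source):
    """Return the verification token found in the page source, or None."""
    finder = VerificationTagFinder()
    finder.feed(html_source)
    return finder.token

# Hypothetical homepage source with the tag pasted into the header
html_source = '<head><meta name="google-site-verification" content="abc123"></head>'
print(verification_token(html_source))  # abc123
```

If this returns `None` against your live homepage source, the tag was stripped or pasted outside the `<head>`, which is a common cause of failed verifications.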

If you’re using WordPress, you can use the Yoast SEO plugin: copy the HTML tag, paste it into the dashboard, and that’s it.

Yoast SEO plugin

Once you’re verified, give it a few days and wait for the data to populate in search console, then you can monitor your site’s performance on a regular basis.

Summary

So, you now know how to audit a site, and which parts are the most important (indexation, mobile friendliness, how your content is served to both users and search engines). All of the elements outlined in this audit play a vital part in how your site runs technically – so, now you can go and build solid foundations for your own successful SEO campaign.

 

Sep 02

10 Reasons Why You Need SEO

By Aires Loutsaris | Marketing

Your business certainly needs SEO. Google has refused to reveal the exact numbers, but it is said that there are over 1 billion searches done every single day, and that 20% of these are searches for local businesses. If you want your customers to be able to find you on the internet, then you need to do everything you can to get yourself noticed, and the best way to do this is through SEO.
Reason-why-you-need-seo

  1. Branding

SEO can help you to establish your brand and even your business online. It gives you the chance to lead your customers and even your clients to your site and it is a great way for you to build the presence that you have online as well. When you take a look at your website, is it easy to navigate? If it’s not then this could go against your SEO and your brand authority in general so you need to do everything you can to establish your brand online through SEO if you want to be successful.

  2. Cost Reductions

If you think about how much you spend on advertising in the local paper and anything else of the sort, you’ll find that the return isn’t justifiable and the money that you lose on it is significant. If you spend your money on SEO, however, a return is far more likely – provided you go through the right provider.

  3. Traffic

When it comes to SEO, your business website will receive traffic. If you do not use SEO then you won’t be able to target your traffic and you will also find it really hard to get the leads that you need as well. Your customer will consider you an expert if you are at the top of Google.

  4. Sales

SEO will boost your sales. Think about it, if a client finds your site, this will help you to bring in more sales and if you have a good level of SEO then your sales can be boosted even more.

  5. Targeting

There is a whole market out there that is trying to find you, and SEO can be a great way for you to tap into this market.

  6. Competitiveness

When you have SEO, you can outrank your competitors and gain a solid edge in the local market as well. Your online presence can help you to boost your brand image, and you can also win their customers from under them.

  7. 24/7 Operation

When you have a good SEO plan you’ll find that it will work around the clock to drive business for you. This is cheaper when compared to any other marketing plan out there and even if you did find one that is suited to your business, you’ll find that it doesn’t run 24 hours a day, so that’s something for you to think about.

  8. ROI

SEO can provide you with a huge boost for your business. The problem is that advertising is costly, and it can also give you zero results. With SEO, however, you can reach thousands of people who are genuinely interested in your product, because you are targeting them specifically – so you will always have the chance to make your money back, and then some profit, through your own sales.

  9. Authority

If your business is very prestigious, that’s great news but if you do not have any authority online then this will really hinder your site. Your customers and visitors won’t consider you to be an expert and you will lose sales. If you have a good level of SEO however then you can easily avoid all of this.

  10. Get Found

If you have good SEO then you can easily open up your business to a ton of new customers and it is a great way for you to make sure that you are doing everything you can to expand. SEO really could hold the key to success and it is a great way for you to get started with your business online.

So as you can see, there are a ton of benefits available if you do want to start up a business and there are plenty of benefits to getting involved with SEO as well. These two concepts work very well together and it really isn’t hard for you to get to the top of Google for your chosen keywords either. If you do want to get started then all you have to do is contact your local SEO provider today and they will work with you to give you the best result out of your marketing efforts and even your content in general so do keep that in mind.

Top 25 SEO Blogs You Need To Read
Sep 02

Top 25 SEO Blogs You Need To Read

By Aires Loutsaris | Marketing

If you want to get some quality reading material to kick-start your morning then you know that you have come to the right place. Here you can find out everything you need to know about SEO as well as finding out some of the top blogs that you should be reading on a day to day basis.
SEO-Blog

  1. Search Engine Land

Search Engine Land is one of the most popular SEO sources. You’ll find trends, guides, analysis and even tips and tricks.

  2. Search Engine Watch

This is one of the most important SEO resources on the web. It covers news and any Google updates, with plenty of readable posts by some of the top experts in the industry.

  3. Search Engine Journal

This is a very popular blog and it thrives on guest posts. This means that you can find a ton of different voices on a range of topics.

  4. The Moz Blog

If you have been in the world of SEO for quite some time then you will most likely have heard of Moz. They post every day, with valuable new reading material.

  5. HubSpot

Hubspot is a top rated blog and it is known for its very unique way of writing. This makes the whole thing easier to understand so you know that you won’t have any issues there.

  6. Google Webmaster Central

If you know that you have a blog then you will have to find ways that you can run it successfully. This site can help you to do that.

  7. SEMrush

SEMrush is an SEO blog that can offer you a very competitive analysis tool and it also has a great blogging platform.

  8. Yoast

Yoast is super user friendly and it is designed so that it can provide you with the knowledge you need to feel confident on a huge range of topics. This includes social media, conversions and more.

  9. Search Engine Roundtable

This site is a very popular source if you want to find out more about SEM. It’s run by experts and it can help you to boost your blog statistics.

  10. Akash Srivastava’s SEO Blog

This blog is focused on providing actionable tips. It’s a great way for you to learn everything that you need to know.

  11. HigherVisibility

This SEO blog can put you on the right path to getting more visibility for your site.

  12. Content Marketing Institute

The Content Marketing Institute can provide you with the best content and they can also give you the best SEO practices.

  13. SEO Book

SEO Book was founded in 2003 with the purpose of training those who need it in online business tips, training sessions and marketing tips.

  14. CopyBlogger

CopyBlogger focuses on copywriting as well as helping you to create SEO friendly content, starting today.

  15. KISSmetrics

Kissmetrics was launched by a digital marketer and the site is all about how you can optimize your site to get the most out of it in general.

  16. Quick Sprout

Quick Sprout focuses on traffic and analytics, with practical tips that can help you to boost your site.

  17. Clickz

Clickz is all about the latest SEO updates and it can also help you to find out what’s happening in the world.

  18. Distilled

Distilled is an online marketing company and they can help you with creative writing, consulting, technical SEO and so much more.

  19. SEO Hacker

SEO Hacker can provide you with the best guides on SEO and they can also help you to boost your conversion rate and even your optimization as well so do keep that in mind.

  20. Seer Interactive

Seer Interactive is a top marketing company that can help you to optimize your blog and it can also help you to get all of the knowledge you need on paid search, SEO tools and more.

  21. SEO.com

SEO.com can show you how to take advantage of the latest SEO trends, and you can learn a lot about link building on this site.

  22. SEO by The Sea

SEO by the Sea can show you how you can study SEO intensively as well as understanding the future ranking signals that Google plans on imposing.

  23. iAcquire

iAcquire is a blog that can offer you a different perspective on various marketing strategies. It can help you to understand the psychology behind SEO techniques, as well as helping you to know whether they work.

  24. BacklinkO

This site is one of the top blogs around. It is easily the best blog if you want to capture the best result out of your site.

  25. TopRank Online Marketing

TopRank Marketing helps companies and organizations to integrate their marketing services, whether you need help with SEO, SMM or anything else.

 

Sep 01

SEO Vs PPC: Who Wins?

By Aires Loutsaris | Marketing

If you want to know the difference between SEO and PPC then you certainly have a lot to think about. The first thing that you need to think about is whether or not you plan on selling a product, and what your company goals are. For example, if you know that you plan on selling a product or even a service then there is a huge difference between the two. If you want to find out more, then take a look at the difference between PPC and SEO.

SEO VS PPC
SEO-PPC

SEO isn’t just about optimizing your site, and it isn’t just about ranking highly on Google either. It’s about being an authority, and you need to make sure you continually earn your place in the search results as well. Major search engines such as Yahoo, Google and even Bing look at how people react when they come to your site, whether they come back, and which sites are linking to you. PPC, on the other hand, is all about paying for the advertisement space that you want for your targeted keywords – but you need to make sure that you understand the concept before you go and implement it with your own site.

With pay per click, your advertisement will show at the top of Google and it will also have the word advertisement next to it. This is great if you are trying to sell a product and it is also great if you are trying to do everything you can to get the best result out of your site in terms of your brand awareness, but if you are just trying to drive traffic to your blog or anything else of the sort then this is not the way to go.

Your Strategy

The campaign that you run all depends on your advertising strategy. The major advantage that SEO can offer you is higher quality leads, and it can also help you to know whether you are actually reaching a targeted customer base. A lot of users have trained themselves to ignore paid results when browsing the web, because the advertisements are so often aimed at someone other than them. For this reason, you need to do everything you can to target the right audience with your advertisements, and with flawless efficiency.

The truth is that there are mountains of data suggesting that you should use natural search to gain visitors, and that with natural search your customer is also much more likely to trust your company. If you rank high for a specific keyword, it is a clear sign that you are credible, and it also shows that you are an expert in the industry. Of course, it is important to know that SEO is not free. No matter how you look at it – whether it is your own time, or whether you hire a vendor outside of your business to take care of your SEO for you – it always comes with a cost.

So Should You Use PPC?
SEO-vs-PPC

Don’t get rid of the thought of PPC just yet. PPC comes with a ton of extra benefits. For example, if you set up your campaign well enough then you will certainly see a huge return on your investment and you will also find that it is a great way for you to get to the number one slot. Remember that it isn’t the search engine’s number one priority to give the person who pays the most the top slot, but it can be a good way for you to get a short-term gain on your investment that could potentially turn into a long term return.

So as you can see, there are some differences between PPC and SEO, and it really does help to know them so you can take advantage of what they both have to offer. If you want to find out more about PPC or SEO, all you have to do is get in touch with us today to find out if we can help you. We can’t wait to show you why we are the best at what we do. You can get in touch by phone or email at any time – we are always happy to help, and we can even discuss your own site with you as well.
