19 Actionable SEO Tips


In this guide, I will be providing 19 SEO tips for businesses, webmasters and those wanting to learn more about SEO, as well as offering expert advice from some of the best-known names in the industry.

  1. Advanced keyword research

  2. Aim for Featured Snippets inclusion

  3. Try video optimisation

  4. Better image optimisation

  5. Don’t overlook local SEO

  6. Understand why internal linking is so important

  7. Investigate topic modelling

  8. Look to intelligently raise your click-through rates

  9. Find new ways of using the new Google Search Console

  10. Decrypt “not_provided” organic keywords in Google Analytics

  11. Implement Hreflang for international SEO

  12. Consider your page load speeds

  13. Use the correct redirects

  14. Ensure that your site is on HTTPS

  15. Optimise your site for mobile

  16. Optimise your site for voice search

  17. Don’t overlook schema.org markup

  18. Utilise social media to your advantage

  19. Bonus tip: Log file Analysis

It goes without saying that over the past ten years or so, the world of search engine optimisation (SEO) has become increasingly technical, with webmasters and marketers constantly needing to adhere to and keep up with a range of evolutions regarding algorithms, web development, and security.

With this fact in mind, it’s important to learn and understand some of the latest technologies, advancements, and guidelines in and around SEO and website optimisation.

Advanced keyword research

Often the starting point of any campaign, keyword research is a fundamental part of SEO, and it helps webmasters and businesses provide the right information to search engines so that users can find them.

Like most other elements of SEO however, it is also one that has seen great evolution over the past few years.

Google launched the Hummingbird update several years back, and it marked an important milestone for search: it meant that Google now works to understand the semantic relevance of words.

This means, therefore, that significant thought must be put into researching search terms so that they encompass a range of considerations, such as semantic keywords, long-tail queries, queries that form questions, and queries that could be unique to voice searches on mobile devices (such as “where is the nearest independent coffee shop in London?”).

Advanced research means understanding query syntax, understanding your audience and their search intent, and, of course, knowing how to map the keywords to your content. You can use Google Trends to analyse trends and predict search intent changes.

Here is a great article about how to predict search intent changes.

When doing keyword research, one of my favourite things to do is to categorise and group keywords.

I often create keyword buckets and split the groups into different funnel stages, such as TOFU (top of the funnel), MOFU (middle of the funnel) and BOFU (bottom of the funnel).

Image credit: weidert.com

It helps me identify and prioritise keywords for the different stages of a buyer’s cycle. Let’s take a look at an example.

Top of the funnel – In this stage, a user is often looking for a solution to a problem or trying to find information. It’s often considered the ‘Awareness’ stage.

e.g. ‘Is it possible to repair a broken iPhone screen myself?’

Middle of the funnel – In this stage, a user has often acknowledged the problem and is looking for a solution. It’s known as the ‘Consideration’ stage.

e.g. ‘Repair a broken iPhone screen’

Bottom of the funnel – In this stage, a user is likely looking to buy or choose a solution. It’s the ‘Ready to buy’ stage.

e.g. ‘iPhone screen repair specialist’

So when you do keyword research and create a content plan, it’s important to convey this to your writers, so that they can write content covering all stages of the user’s buying cycle.

You should always go beyond the traditional approach of looking at the keywords with the highest search volume and CPC, and instead look at the search query intent and serve the right type of content for the right stage of the buying cycle. Here is a fantastic article about cohort keyword analysis if you want to take your keyword research to the next level.
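The bucketing idea above can be sketched as a small Python heuristic. The modifier lists below are my own illustrative assumptions, not an industry standard; in practice you would tune them to your niche:

```python
# A minimal sketch of bucketing keywords into funnel stages by modifier
# words. The modifier lists are illustrative assumptions, not a standard.
FUNNEL_MODIFIERS = {
    # Check transactional cues first, then informational question cues,
    # then consideration terms, so e.g. "repair specialist" lands in BOFU.
    "BOFU": ["buy", "price", "specialist", "near me"],
    "TOFU": ["how", "what", "why", "is it possible", "guide"],
    "MOFU": ["repair", "fix", "best", "vs"],
}

def bucket_keyword(keyword: str) -> str:
    """Assign a keyword to the first funnel stage whose modifiers match."""
    kw = keyword.lower()
    for stage, modifiers in FUNNEL_MODIFIERS.items():
        if any(modifier in kw for modifier in modifiers):
            return stage
    return "UNCLASSIFIED"

for query in ("Is it possible to repair a broken iPhone screen myself?",
              "Repair a broken iPhone screen",
              "iPhone screen repair specialist"):
    print(query, "->", bucket_keyword(query))
```

A real workflow would combine this kind of string matching with intent data from rank trackers and the SERP features each query triggers.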

Aim for Featured Snippets inclusion

Often referred to as “position zero”, Featured Snippets are the small segments of information that often appear at the top of search results, just above the top organic search results.

Featured Snippets offer users direct answers to their queries without having to search through the results to locate information.

Go to www.google.com and search “Who is Suganthan Mohanadasan?”

As you can see I’m capturing the Featured snippet.

Here is an example from one of my clients in the UK.

Here is an example of how I’m outranking influential sites in one of the most competitive niches in the UK and landing Featured snippets for my client.

This means, therefore, that the content websites offer must be original, informative and concise. One of the most effective ways to get Featured Snippets is to provide crystal-clear information in the best possible format. You can optimise for queries in different ways. Here is a great article about how to capture Featured Snippets.

Here is another example of how my client is outranking a high authority site using Featured snippets.

Having a page appear in a featured result can provide a great range of benefits for a website, including becoming the authority on the subject, appearing above competitors in results, and receiving more traffic.

I asked James Norquay at Prosperity Media how to acquire Featured Snippets. His answer was: “You need to reverse engineer who is already showing for the snippet and plan how to rank for it. As there are multiple types of Featured Snippets it can differ: answering the question, ensuring that you use featured snippet markup, formatting of images, ensuring that if it’s a table result it’s well formatted.”

He continues, saying that: “It also comes down to the word count of the snippet. Other factors which I’ve seen come into play are the authority of the site, ensuring the site’s responsive, using ordered list and ensuring great social shares on the site.”

Try video optimisation

Now more than ever, other forms of media such as images and videos are being used to provide detailed information to both users and search engines.

Google started showing video carousels for different queries, and if you have a business that has video opportunities, then I encourage you to create and optimise them for capturing these carousel positions. You will drive traffic and views to your videos, and that will help them rank on YouTube as well.

Here is an example where a video carousel is showing in the SERPs.

Early in 2018, Google began providing users with videos within Featured Snippets that could answer their queries. What’s more, it began highlighting segments within the video that directly concerned the search query used. Keep in mind, though, that appearing in the video carousel might not always be a good idea, especially if you have an e-commerce site.

You can see Google identified the exact time stamp when the actual coffee making is taking place.

This means that webmasters, businesses, and even filmmakers (in some cases) must do more than just provide basic descriptions and titles to videos to drive traffic to them:

For example, providing accurate and faultless transcriptions yourself provides a slight edge over competing videos with automatic transcriptions.

What’s more, if your video contains steps (much like a recipe video would have), identify these in the video, including the timestamp of each one. You should also make the voice over clear and align it closely with your transcript.
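As a sketch of the idea, the steps and timestamps can be expressed as schema.org VideoObject markup with Clip segments. The video title, description and timings below are placeholder values:

```python
import json

# A sketch of schema.org VideoObject markup with Clip parts marking key
# steps and their timestamps. All concrete values are placeholders.
def video_jsonld(name, description, steps):
    """steps: list of (label, start_seconds, end_seconds) tuples."""
    return {
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": name,
        "description": description,
        "hasPart": [
            {"@type": "Clip", "name": label,
             "startOffset": start, "endOffset": end}
            for label, start, end in steps
        ],
    }

markup = video_jsonld(
    "How to brew pour-over coffee",           # placeholder title
    "A step-by-step pour-over tutorial.",     # placeholder description
    [("Grind the beans", 15, 40), ("Bloom the grounds", 40, 75)],
)
print(json.dumps(markup, indent=2))
```

A production implementation would also need the properties Google requires for video structured data, such as uploadDate and thumbnailUrl.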

Read more about optimising video in greater detail here.

Not only that, it’s a smart idea to create and populate your YouTube channel with videos related to your business; it can also help you with reputation management.

Better image optimisation

Images have always been a vital SEO tactic, if sometimes overlooked. Ensuring that your images have the appropriate descriptions, tags, and compression has always been necessary, especially for mobile users.

Like other areas of SEO however, Image Search is continually changing, with an algorithm update having occurred only last September.

In an announcement on its blog, Google said that it was going to provide more immersive visual content with AMP stories, that topics were to be visually viewable in search engine results, and that it was going to provide more information with every image shown.

Google recently confirmed that they use image recognition to identify content in an image. However, they do not use image recognition on all images.

This means that webmasters must provide as much information as possible with every single image.

In the following examples, you can see how powerful Google’s vision system is:

Google Vision is able to accurately detect the uploaded image as a sports car (a Tesla Roadster 2020).

In this example, the vision system is able to detect sentiment.

You should also think about using next-gen image formats if possible. I have covered this in more detail in a separate article. Google recommends this, but it’s important to remember that not all browsers support next-gen image formats.

Google Lighthouse audits recommend the use of next-gen image formats.
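One common way to adopt next-gen formats while keeping a fallback for older browsers is the HTML picture element. Here is a small sketch that generates such markup; the file names and alt text are placeholders:

```python
# A sketch of generating a <picture> element that serves WebP to browsers
# that support it, with a JPEG fallback for those that don't.
def picture_tag(basename: str, alt: str) -> str:
    """Emit a <picture> block with a WebP source and a JPEG fallback."""
    return (
        "<picture>\n"
        f'  <source srcset="{basename}.webp" type="image/webp">\n'
        f'  <img src="{basename}.jpg" alt="{alt}">\n'
        "</picture>"
    )

print(picture_tag("tesla-roadster", "Red Tesla Roadster 2020 sports car"))
```

Note how the fallback img tag still carries the alt text, so the descriptive information discussed above survives regardless of which format is served.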

What’s more, Google also investigates the context of images, which means that if your image doesn’t provide the same contextual information as your competitors, the image might be considered to be less useful, which means the image will be outranked.

Furthermore, Google confirmed the importance of adding text alongside images.

You should also ensure your images are correctly tagged (names, alt text, captions) and used in context to be ranked by Google. You can win image carousels in the SERPs and rank in image search results.

Example of how my name triggers an image carousel in the SERPs

Don’t overlook local SEO

Local SEO is imperative for local businesses: by investing in it, small businesses and start-ups can cater to audiences and potential customers that are local to them.

In October 2018, it was reported that 46% of searches now have a local intent, which means that companies must optimise for their local areas to keep on top of competition.

There is a variety of ways in which you can optimise for local searches, including ensuring that you accurately provide your NAP information (Name, Address, Phone number) and that you are registered with Google My Business as well as Bing Places for Business.

In this example, I have optimised my consulting business ‘Basic Gravity AS’ to rank for local search. I’m outranking bigger and more established agencies in my city.

Other factors, such as having citations in local pages, using locally relevant keywords, appearing in both local and industry directories (a form of “natural” link building), and curating your reviews, can help your exposure within local SEO.

If your business has multiple locations, it is also worth considering optimising for them, but only when relevant.

Optimising your Google business page can help you rank for ‘near me‘ and ‘around me‘ searches as they’re becoming popular with mobile devices. Here is an excellent post about Local SEO from Accuranker and I have contributed to this article about optimising for ‘Near me’ searches.

I have written about Dental SEO in detail which covers a lot of Local SEO tactics.

It’s important to note that you can post to your business page, and your posts will show up in Google Search when someone searches for your brand.

If you have a nationwide business you should consider creating landing pages to target different cities.

Here is an excellent example of city-based landing pages done by Interflora (a florist in Norway). You can see how they have a dedicated page for every city, with an optimised URL, content and breadcrumbs.

Understand why internal linking is so important

Internal links are incredibly important to any website, as they help both internet users and crawlers discover new pages and areas.

This means that users and crawlers should be able to navigate through your website seamlessly; being able to reach the homepage from any other page in just a few clicks (two to three is recommended).
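A quick way to check click depth is a breadth-first search over your internal-link graph. Here is a minimal sketch; the site structure is an invented example:

```python
from collections import deque

# A minimal sketch of measuring click depth with breadth-first search
# over an internal-link graph. The site map below is a made-up example.
def click_depths(links, start):
    """links: {page: [linked pages]}; returns {page: clicks from start}."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/services", "/blog"],
    "/services": ["/services/seo"],
    "/blog": ["/blog/keyword-research"],
}
print(click_depths(site, "/"))
```

Any page whose depth exceeds two or three clicks, or which never appears in the result at all (an orphan), is a candidate for better internal linking.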

Although poor internal linking structure isn’t something that is penalised, it does offer poor usability, which is an important consideration for Google.

Ultimately, a poor linking structure can hamper organic growth, so ensure that all your important landing pages are linked, and if you create blog posts, ensure that any services or products that are mentioned contain textual links to their relevant pages.

At the same time, however, keep the links on any page to a reasonable number, as an excessive amount on any one page can indicate that the page is being used for spam and may hamper your SEO efforts. That said, in a recent Reddit AMA, Gary Illyes from Google confirmed that there is no over-optimisation penalty for excessive internal linking.

We hope this is a legitimate answer from Gary

I spoke to Andy Drinkwater (A prominent SEO based in the UK) about this and he was sceptical about Google’s stance. He said: “I wouldn’t believe everything Google tells you. I have seen things go well by getting the anchors and ratios correct and then go downhill when it’s gone beyond a certain point. Worth keeping in mind that these penalties will be algorithmic rather than manual.”

If you’re struggling to build a process for Internal links, Kevin Indig has developed a fantastic internal linking framework called TIPR. You can read about it here.

Investigate topic modelling

MarketMuse did a study and found that one of the key components of topic modelling is creating comprehensive, in-depth content around your topics and making it easily accessible to search engine crawlers.

Over the years, leading engines have modified their search algorithms so that they concentrate on and favour topic-based content.

As a result, SEOs and webmasters are having to adapt to topic-based algorithms, and one solution is to create a topic cluster model of site architecture.

I asked Nick Eubanks, founder and CEO of From the Future, about the importance of topic modelling. He said: “At this point, topic modelling has become an essential part of crafting content you hope to rank. Google’s NLP algorithms have reached an inflection point of sophistication where it’s really not about word use but idea representation, and if your page isn’t checking all the boxes for matching the most specific intent of the query, you’re simply not going to rank any more (at least not for long).”

He continues: “A topic model is like an SEO treasure map in that it shows you all of the concepts that one would expect to find, whether human or robot, on a page about a given subject – so you know which path to follow to dig up that top ranking position.”

The idea behind this kind of structure is to create a page (such as for a service or product) and to build related pages and articles around it. Bots and crawlers are then able to locate the pages and understand that they are semantically related.

This also means that users can easily flow from one page to the next while keeping within the desired topic. It also indicates that there is depth and thought being invested into the subject, which can boost the authority of that central page in Google’s index.

Here is an article I wrote about website redesign SEO, in which I cover topic modelling in detail.

Look to intelligently raise your click-through rates

Although some tend to overlook organic click-through rates (CTRs), there are a whole host of reasons why you should take the time out to improve them, including the simple fact that they can help your exposure within search engine rankings.

What’s more, pages with high CTRs tend to have higher conversion rates; doubling your CTR can increase your conversion rate by as much as 50%.

There is a lot of speculation about whether CTR is a ranking factor and there are arguments supporting and denying the theory.

Dan Taylor, a prominent SEO, believes CTR data is just used as part of the wider algorithm assessment.

Britney Muller found references to CTR in Google’s documentation recently.

A recent Tweet about CTR’s impact on Rankings.

I asked Britney Muller (Senior SEO Scientist at Moz) about her view on CTR and rankings, and she responded by saying:

“Why wouldn’t Google use pogo-sticking, CTRs and other engagement metrics as a measure of delivered SERP success? We know that CTRs shape personalized results. There are many examples of this. –One of my favourites is when Rand and I discovered Moz’s help hub pages were ranked above product pages on the Seattle office IP. We discovered this was most likely due to our incredible Help Team clicking on our help/support pages frequently to help customers. Google saw that our internal traffic was seeking those pages frequently and they delivered them above the primary product pages you see in external Moz searches.”

There are a range of intelligent ways of increasing CTRs, including:

  • Dispose of keyword-heavy formats in favour of high-quality content.

  • Increase the quality and emotional range of your content.

  • Ensure that your audience can relate to your brand.

  • Use lists and bullet points when providing information.

  • Ensure your title tags use powerful words.

  • Keep your URLs descriptive.

I’m a big fan of SanityCheck, a powerful tool that allows you to run experiments and A/B split tests. One of my favourite ways of using this tool is to change titles and meta descriptions and run experiments to see whether this increases CTR on poorly performing pages.

Here is an example A/B test I did for a client page. As you can see, the test was negative: changing the title didn’t have any positive impact on the CTR.

I asked Nick Swan at SanityCheck about the importance of running split tests.

He said: “Improving Click-Through Rates from Google by crafting more interesting titles and meta descriptions is like getting additional ‘free’ traffic from Google. But unless you test and record the results of title and meta description changes – how will you know what works?” He continues, saying that: “AB split testing titles and meta descriptions are the best way to get results you can trust. It removes seasonality changes (e.g: Bank holiday weekends) and reduces the likeliness of algorithm updates affecting the outcome of results. By having a control group and a test group you also don’t need to worry about the pages having the exact same amount of clicks before you make any changes as you calculate a new baseline based on the two groups. AB split testing is more work up front – but it gives the most reliable results.”
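To make the comparison concrete, here is a small sketch of computing CTR uplift between a control group and a test group of pages; the click and impression figures are invented for illustration:

```python
# A sketch of comparing CTR between a control group (titles unchanged)
# and a test group (titles rewritten). All numbers are invented.
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate; zero when there are no impressions."""
    return clicks / impressions if impressions else 0.0

control = {"clicks": 120, "impressions": 8000}  # pages left unchanged
variant = {"clicks": 180, "impressions": 8000}  # pages with new titles

uplift = ctr(variant["clicks"], variant["impressions"]) / ctr(control["clicks"], control["impressions"]) - 1
print(f"CTR uplift in the test group: {uplift:.0%}")
```

A real split test would also need a significance check before acting on the difference, which is exactly what tools like SanityCheck handle for you.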

Keep in mind that there are other user behaviour signals used by Google.

Find new ways of using the new Google Search Console

In January 2018 Google announced that it was releasing the Beta version of its new Search Console to a broader audience.

With the new version came a whole new range of reports including:

  • Search performance

  • Index coverage (Here is a great article about it)

  • AMP status

  • Job posting

Although the new Search Console and the old version lived side by side for quite some time, it was predicted that Google would shut down the older version by March 2019. But John Mueller recently confirmed that the old version is not closing in March.

There is no better time to investigate and utilise some of the new functions and changes that Google has employed since its original announcement.

Even though Google has taken away some features from GSC, they have introduced some new and exciting features.

Only last month, for example, Google announced a range of new features within the URL Inspection tool, including HTTP response, page resources, JavaScript logs and rendered screenshots.

Keeping up to date with all the new features is key to making the most of the incredible amounts of information that the new Search Console provides.

Decrypt “not_provided” organic keywords in Google Analytics

The term “not provided” in Google Analytics first started appearing in 2011, when Google decided to encrypt this data to protect users’ privacy.

With this change, website owners were no longer able to see the organic keywords driving traffic to their sites. But why is it essential for website owners to know this vital piece of information?

Daniel Schmeh at keywordhero.com told me: “It’s important to map search queries with their consecutive sessions to understand how well the content resonates with a particular user’s need as expressed through his query.”

He continues, saying that: “Metrics involved could be sales or time-on-site for example. Webmasters can use this information to; A) Better understand their users and B) Improve the site to better fit their needs.”

There are, however, a range of ways to work around this issue, including the use of tools.

You can reverse engineer this process using a tool like SEMrush or Ahrefs (both paid tools).

Using Ahrefs’ ‘Top pages’ report, you can easily see the ranking keywords, broken down by individual page.

As you can see the highlighted page ranks for over 119 keywords.

You can also automate this using a tool like Keyword Hero. This tool uses machine learning to match queries from Google Search Console (and potentially clickstream data) and map them into Google Analytics. It’s an excellent solution for large sites, and you can put it on autopilot, as the system works in the background and creates a new view in Google Analytics.

Keyword hero uses Machine learning to uncover encrypted organic keywords.
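The manual reverse-engineering approach can be sketched as a simple join on the landing-page URL, apportioning each page's Analytics sessions to queries by their share of Search Console clicks. The sample rows are invented, and this is only a rough approximation of what a tool like Keyword Hero does:

```python
# A sketch of joining Search Console query data to Analytics sessions on
# the page URL. The rows below are invented sample data.
gsc_rows = [
    {"page": "/iphone-screen-repair", "query": "iphone screen repair", "clicks": 90},
    {"page": "/iphone-screen-repair", "query": "fix cracked iphone", "clicks": 30},
]
ga_rows = [{"page": "/iphone-screen-repair", "sessions": 130}]

sessions_by_page = {row["page"]: row["sessions"] for row in ga_rows}
clicks_by_page = {}
for row in gsc_rows:
    clicks_by_page[row["page"]] = clicks_by_page.get(row["page"], 0) + row["clicks"]

# Apportion each page's GA sessions to queries by their share of clicks.
for row in gsc_rows:
    share = row["clicks"] / clicks_by_page[row["page"]]
    row["est_sessions"] = sessions_by_page[row["page"]] * share
    print(f'{row["query"]}: ~{row["est_sessions"]:.0f} sessions')
```

The result is an estimate, not a decryption: it assumes sessions are distributed across queries in the same proportion as clicks.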

Implement Hreflang for international SEO

Put simply, Hreflang is an HTML tag attribute that tells crawlers and bots how pages relate to one another within different languages and countries.

Google uses the tag to serve the correct regional information or language to a user based on their country and language preferences.

If you have a website that tends to attract people from different countries and regions, it is integral that they are catered for. Although it might not help increase traffic, the tag sends positive signals to Google that you are catering to your audience and providing them with good usability.

I asked Matthew Howells-Barby, co-founder of Traffic Think Tank about the importance of Hreflang and this is what he said: “If your site contains content in multiple languages, or your site has alternative pages that target individuals from different countries, then there’s no excuse not to use hreflang. Hreflang solves two major problems with having a multi-language/country site.

The first is that it removes any duplicate content issues. Instead of Google looking at the content you created for France visitors as a duplicate of the content you created for UK visitors, Google will see that these are both pages that should remain in the index and be used for different audiences.”

He continued, saying that: “The second problem this solves is that you will be able to tell Google which version of a page should be served to which person. If someone is searching in google.fr and they’re French-speaking, they should be sent to the French version of your page, targeting visitors from France. Without setting up hreflang, you leave it up to Google to surface the right version of the page, and while Google is great at many things, it often leaves a lot to be desired when it comes to surfacing the correct version of a page with zero guidance from the website owner.”

For example, my consulting agency website www.basicgravity.com is available in two languages, English and Norwegian, and I have set up hreflang attributes to tell Google which version belongs to which language and country.

Because of this setup, my site ranks for keywords in Norwegian as well as English and shows up correctly in the corresponding locale.

Example Hreflang implementation
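As a sketch, here is how a set of hreflang link tags for a bilingual site could be generated. The example.com URLs and the nb-no locale code are stand-ins for your real alternates:

```python
# A sketch of generating hreflang link tags for a bilingual site. Every
# version should list all alternates plus an x-default fallback. The
# URLs and locale codes below are placeholders.
def hreflang_tags(alternates: dict, x_default: str) -> str:
    """Emit one link tag per language/region plus an x-default fallback."""
    lines = [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in alternates.items()
    ]
    lines.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(lines)

print(hreflang_tags(
    {"en": "https://www.example.com/en/", "nb-no": "https://www.example.com/no/"},
    "https://www.example.com/en/",
))
```

Remember that hreflang annotations must be reciprocal: each language version has to carry the full set of tags, including a self-reference, or Google may ignore them.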

Hreflang is often difficult to get right, and there are several ways to implement it on your site. Make sure you understand how it works, and whether you actually need it, before implementing. For WordPress sites, if you use the WPML plugin for multilingual sites, the hreflang codes are automatically added to your pages.

WPML Automatically adds Hreflang Tags

Alternatively, you can use the Hreflang Tags plugin to manually add the tags to your site.

Hreflang tags plugin is easy to configure.

You can also use Aleyda Solis’s hreflang tag generator to create your tags. It’s important to test your tags before you go live, and you can use a hreflang validator to ensure your tags are correct. I use Dejan’s validator, and it works great.

Dejan’s hreflang validator is a great tool.

Consider your page load speeds

With more people searching the internet on mobile devices, ensuring that you provide good page load speeds is integral for a wide range of reasons.

Not only do slow loading web pages negatively impact your bounce rates, but load speed is also an incredibly important ranking factor, especially as more sites are placed on the Mobile-First Index.

There is a range of tools for checking how user-friendly your pages are for both mobile and desktop users, but Google’s PageSpeed Insights tool is incredibly useful.

If you have a slow site, your developer or a CMS specialist can help speed things up, and there are different ways to improve your site’s performance. However, what happens when you have a large site that changes often? How do you keep up with the changes? An unexpected change could reduce the speed of the site significantly, and it’s challenging to keep monitoring speed manually.

But, you can automate this task using Speed Monitor.

SpeedMonitor signup page

Once you set up your website in the tool, it will continuously track your site’s speed and provide you with a history. You can set up speed thresholds and get notified if performance drops below a certain level.

Page speed data history is a killer feature.
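The threshold idea can be sketched in a few lines: keep a history of load-time samples and flag any run over a limit. The timings and the 2000 ms limit below are illustrative assumptions, not recommendations:

```python
# A sketch of speed-threshold alerting over a history of load-time
# samples. The timings and the 2000 ms limit are made-up numbers.
THRESHOLD_MS = 2000

history = [1400, 1550, 1480, 2600, 1500]  # load times in milliseconds

def breaches(samples, threshold=THRESHOLD_MS):
    """Return the (index, value) of every sample over the threshold."""
    return [(i, ms) for i, ms in enumerate(samples) if ms > threshold]

for i, ms in breaches(history):
    print(f"Alert: run {i} took {ms} ms (limit {THRESHOLD_MS} ms)")
```

A monitoring tool does exactly this continuously, so a regression introduced by an unrelated site change surfaces as an alert instead of a slow decline nobody notices.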

A speed update was rolled out as recently as July 2018, although, according to John Mueller (Google’s Senior Webmaster Trends Analyst), the update only affects the slowest websites. You can see an interaction on Twitter below:

I asked Cindy Krum at MobileMoxie how page load speed can affect a site’s rankings, and her response was: “As SEOs have been talking about for a long time, page speed is important. Google says that page speed is only a negative ranking factor if pages are especially slow, but in very competitive searches, there is a chance that it will be the thing that gets you to rank better than your competitor. What is equally important though, is that fast page speed drives engagement and conversion, so even if you don’t get better rankings, you can hit more KPIs with the same rankings.”

She continued, saying that: “Smart SEOs will be thinking about page speed in terms of Deferred JavaScript Rendering. Making this process quick and painless for Google will probably ensure a more comprehensive crawl. Do this by using the most standard JavaScript that you can – including using AMP JavaScript whenever you can, even if it does not validate, and never will. The same is true of CSS – either use AMP CSS, or limit your CSS as much as you can, eliminating unused classes and styles. The last thing is to combine external assets whenever you can, to minimize the number of round trips – then compress the files, and host them on a fast CDN. These steps have always helped, but may also ensure error-free Deferred JavaScript Rendering.”

I also asked Tommy Roved, a digital marketing consultant from Semto, for a further opinion on why site load speed is important for SEO, and this is what he said: “Site speed has always been important when working with SEO, but in the last couple of years this factor has become increasingly important. Google’s focus on mobile (mobile-first indexing) and the Google Speed Update in 2018 are two good reasons to realise the ‘need for speed’. If your site has a page load over 2 seconds, I would try to improve it if possible. When you speed up your site, it has an important effect on things like conversion, bounce rate and engagement, which is good for your site and business performance, but these are also good signals to send to Google. The behaviour of users and their experience when they visit the site is a ranking factor, and when you’re on page one, these signals are getting more and more important. Site speed matters when working with SEO in 2019, and it will matter for many years to come.”

Use the correct redirects

Redirects are an incredibly important aspect of SEO, and there are a range of different types that you need to consider before implementing them.

For example, a 301 redirect should be used for pages that will be “moved permanently”. Such a redirect will also pass up to 99% of the link equity to the redirected page. The number refers to the HTTP status code, and is often considered to be the best one for implementing redirects.

A 302 redirect is for when pages are “Moved Temporarily”. It has been indicated that Google could treat 301s and 302s similarly, although the former is preferred.

You can use 302 redirects if you are doing maintenance on your site, although it is rare that they are necessary.

If you ever come across a 404 error, it means that the page no longer exists at that URL and no redirect has been put in place.
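The status codes above can be illustrated with a tiny routing table; the paths below are invented examples:

```python
# A sketch of redirect semantics: 301 for permanent moves, 302 for
# temporary ones, 404 when a page is gone with no replacement.
# The URL paths are invented examples.
PERMANENT = {"/old-services": "/services"}   # moved for good -> 301
TEMPORARY = {"/shop": "/maintenance"}        # back soon -> 302
LIVE = {"/", "/services", "/maintenance"}

def resolve(path):
    """Return (status_code, location) for a requested path."""
    if path in PERMANENT:
        return 301, PERMANENT[path]
    if path in TEMPORARY:
        return 302, TEMPORARY[path]
    if path in LIVE:
        return 200, path
    return 404, None
```

In practice the same logic lives in your server or CMS configuration rather than application code, but the decision table is identical: permanent moves pass link equity, temporary ones signal that the original URL will return.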

You can use a Web crawler like Sitebulb to easily locate redirect issues.

You also have to ensure that your www/non-www and HTTP-to-HTTPS redirects are in place; otherwise, search engines may index both versions, causing duplicate content issues that can seriously affect your site’s rankings.

One of the easiest ways to check redirects is to use a crawler. One of my go-to tools is Sitebulb: it can show you the redirects and highlight any issues, as well as offer guidance on how to fix them, among many other things. I asked Sitebulb founder Patrick Hathaway about common redirect issues and how to fix them, and this is what he said:

“Redirects are an important part of the web, and most active internet users will encounter them multiple times every day. They primarily exist to ensure that site visitors end up on the correct page, and considering the ever-changing nature the internet, are an absolutely necessary cog in the machine.

Often, redirects come about because of an underlying ruleset or ‘catchall’ that exists to avoid accidental errors (e.g. all HTTP requests get redirected to the HTTPS equivalent). The net result of this is that, on most sites, there are lots of redirected URLs that could exist, and it can become the SEO’s job to track down redirects and make sure that they only come into play when necessary.

There are 3 things that can go wrong with redirects:

  1. The redirect points to an incorrect destination URL.

  2. The wrong status code is being served.

  3. There exist links on the website to URLs that redirect.

On the face of it, none of these cause BIG problems. Similarly, the scale on which these issues occur is a factor – if it’s a very small scale, or pages of little importance, then fixing the issues is unlikely to move the SEO needle.

Redirects become a real problem when these issues exist at scale, or through some systemic process that means the issue is continually recreated.”

#1 The redirect points to an incorrect destination URL

This is obviously bad if it leads to a page that no longer exists (404), or to another URL that itself redirects (creating a redirect chain).

But a more subtle situation you may encounter is when URLs are redirected to a URL with a completely different topic. Redirects exist in the first place to tell search engines, ‘hey, this page has moved, please find it over there now.’ So if you have a page about dog bowls, and then redirect it to a page about cat food, there is a big disconnect between the ‘old’ content and the ‘new’ – this is a bad use of a redirect.

The significance of this is that as little as 4 years ago, it was considered received wisdom to deal with discontinued products by redirecting ‘up a level’ to a category or sub-category page. So if your product: ‘Awesome Dog Bowl’ is no longer available, you’d set a 301 redirect to point at the ‘Dog Bowls’ category page – in order to pass link equity through the 301 from the old page up to the dog bowl category page. However, Google got wise to this type of optimisation, and now treats this kind of redirect as a soft 404.

#2 The wrong status code is being served

Most redirects you will encounter will be 301 (permanent) and 302 (temporary) redirects.

Whilst some Googlers may try to argue otherwise, a solid rule of thumb is to use 301 redirects whenever you want to pass link equity from one page to another. In our dog bowl analogy above, the ideal way to handle this would be to introduce another page, advertising a replacement product (e.g. ‘Awesomer Dog Bowl’) and setting a permanent 301 redirect from the old product page to the new one.
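As an illustration, assuming an Apache server, that 301 from the old product page to its replacement could be declared in .htaccess like this (the paths are hypothetical):

```apache
# Permanent (301) redirect from the discontinued product to its replacement.
Redirect 301 /awesome-dog-bowl /awesomer-dog-bowl

# A genuinely temporary move would use a 302 instead:
# Redirect 302 /awesome-dog-bowl /awesome-dog-bowl-holding-page
```

The equivalent on nginx or via your CMS will look different, but the principle is the same: the status code tells search engines whether the move is permanent.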

#3 There exist links on the website to URLs that redirect

You will rarely, if ever, complete a technical SEO audit without coming across redirects. And the way you normally find them is through links to redirected URLs on your website.

They often come from old, hard-coded links to URLs that have been moved or replaced over time. But they can also come from more systemic issues, such as navigation links to redirected URLs, or persistent links to old HTTP pages following an HTTPS migration.

And this is where the bigger problems can lie, as these redirects are not only accessible to users, but also to search engine crawlers. If you have a large scale redirect issue, you could be asking search engine crawlers to spend additional crawl budget jumping through tens of thousands of redirects in order to find URLs that “should” return a 200 status in the first place.
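As a rough sketch of how such links can be surfaced, the snippet below (hypothetical URLs; a real crawler does this across the whole site) extracts anchor targets from a page and checks them against a known redirect map:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def links_needing_update(html, redirect_map):
    """Return (old_link, final_target) pairs for links that hit a redirect."""
    parser = LinkExtractor()
    parser.feed(html)
    return [(href, redirect_map[href]) for href in parser.links
            if href in redirect_map]

# Hypothetical page still linking to a pre-HTTPS-migration URL.
page = '<a href="http://example.com/dog-bowls">Dog bowls</a>'
redirects = {"http://example.com/dog-bowls": "https://example.com/dog-bowls"}
print(links_needing_update(page, redirects))
```

Each pair the function returns is a hard-coded link that should be updated to point straight at the final URL, so neither users nor crawlers have to take the detour.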

Redirects are, and should be, an important part of your website. They should exist to preserve link equity in a strategic manner, and provide users with a fallback for when changes are made. Fixing redirect issues should be carried out periodically as part of an audit process.

Ensure that your site is on HTTPS

Google has always put security at the front of nearly everything it does, and this has never been more evident than its requirement for sites to be on HTTPS.

Put simply, HTTPS is the secure version of HTTP, the protocol over which data is sent between a website and a browser. Having an SSL certificate installed on your site means that all communication between it and its users is encrypted.

As of July 2018, Google marks all non-HTTPS sites as “not secure” in its Chrome browser. This means that if a person visits a site that is not secure, Chrome will inform them that the website is not safe.

Google has also hinted that there is a small ranking boost for sites that are on HTTPS.

You can buy an SSL certificate or get a free one from Let’s Encrypt.

If you’re using WordPress and are not sure how to migrate your site to SSL, then use this free plugin to help with the migration.

Sometimes, even after installing an SSL certificate, you won’t see the secure green padlock. Instead, you will see an error like ‘Connection not secure’, and it’s a common problem caused by mixed content. There are two types of mixed content errors:

  1. Mixed active content (scripting) – happens when an HTTPS page loads a script file over HTTP.

  2. Mixed passive content – happens when an HTTPS page loads a resource such as an image over HTTP.

An easy way to solve this problem without using a developer is to use the Really Simple SSL plugin. It will help you with redirects and fix any mixed content issues.
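If you want to check for mixed content yourself first, a minimal sketch like the following (illustrative only, and no substitute for the plugin or your browser’s console) can scan a page’s HTML for resources still requested over plain HTTP:

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Flag resources an HTTPS page still loads over plain HTTP."""
    ACTIVE = {"script", "iframe"}   # content that browsers block outright

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Scripts, iframes and images load via src; stylesheets via <link href=...>.
        url = attrs.get("src") or (attrs.get("href") if tag == "link" else None)
        if url and url.startswith("http://"):
            kind = "active" if (tag in self.ACTIVE or tag == "link") else "passive"
            self.issues.append((kind, tag, url))

scanner = MixedContentScanner()
scanner.feed('<script src="http://example.com/app.js"></script>'
             '<img src="http://example.com/logo.png">')
print(scanner.issues)   # one active (script) and one passive (image) issue
```

The fix in each case is the same: serve the resource over HTTPS, or use a protocol-relative or absolute https:// URL.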

Optimise your site for mobile

Now that Google is quickly adding sites to its mobile-first index (which means that sites are judged by their mobile versions before their desktop versions), it is becoming ever more important that your site is usable on a range of mobile devices.

Back in 2016 Google said that: “Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results.”

The new way of indexing began rolling out in March 2018, and there are a range of best practices that you should adhere to, including adding XML and media sitemaps, optimising your meta descriptions and media (images and videos), setting up structured data, and utilising AMP pages.

You can also check how mobile-friendly your site is using Google’s Mobile-Friendly Test.
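One basic signal of mobile-friendliness that you can check programmatically is the responsive viewport meta tag. The sketch below (illustrative, and no substitute for Google’s own test) looks for it in a page’s HTML:

```python
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    """Check whether a page declares a responsive viewport meta tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "viewport":
            # A responsive page typically declares width=device-width.
            self.has_viewport = "width=device-width" in attrs.get("content", "")

checker = ViewportChecker()
checker.feed('<meta name="viewport" content="width=device-width, initial-scale=1">')
print(checker.has_viewport)   # True
```

A missing or fixed-width viewport is one of the most common reasons a page fails mobile-friendliness checks.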

When your site is added to the mobile-first index, you will receive an email from Google Search Console.

You will get an email like this when your site goes into the mobile-first index.

Optimise your site for voice search

Although Google Voice Search was once considered by sceptics to be a passing fad, it is now one of the fastest-growing areas of search.

With the introduction of Siri, Cortana, Google Now, Alexa, and Google Assistant, more people are searching for information by voice, which means that websites need to be optimised so that their content is served to audiences using these technologies.

Webmasters must take into account the fact that people now use longer queries when searching by voice, with 22% of voice searches being for local information and content.

https://youtu.be/t3E4hTzpOdo

Note: Basic Gravity is my consulting company, and I’m still exploring Speakable markup (covered later in this article).

There are a range of ways to optimise for voice search, including ensuring that you have optimised and well-written FAQ pages, that your site contains conversational keywords, that your pages have structured data markup, and that you have claimed your Google My Business listing.

You can also read Google’s official guidelines.

Saijo George, founder of tl;dr Marketing, says: “Right now, all we’ve seen is voice search being used for informational queries. If that’s your focus, then do pay attention to voice search but if transactional queries are what you’re aiming for, I would recommend focusing on image/video. With all the strides made in image processing in recent years there will be a lot of transactional focused search happening in that space.”

He continues: “That said, I would not ignore optimising your site for voice search as this is a relatively new opportunity and it’s certainly a growing field of search. Therefore it’s a good way to reach a new segment of the market – which your competition might not be even aware of!”

  • Focus on optimising for the featured snippet. Right now, most Google Assistant answers are based on that.

  • Add structured data if applicable. Focus on Speakable (which is in Beta) and Q&A Page.

  • Consider creating your own Google Actions and Alexa Skills.
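To illustrate the Speakable markup mentioned above (still in beta, so treat this as a sketch based on the schema.org definition rather than production markup; the URL and CSS selectors are hypothetical), here is how the JSON-LD could be generated:

```python
import json

def speakable_jsonld(url, css_selectors):
    """Build Speakable JSON-LD marking page sections suitable for text-to-speech."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "WebPage",
        "url": url,
        "speakable": {
            "@type": "SpeakableSpecification",
            # Which parts of the page a voice assistant may read aloud.
            "cssSelector": css_selectors,
        },
    }, indent=2)

markup = speakable_jsonld(
    "https://www.example.com/seo-tips",
    ["#summary", ".key-takeaways"],   # hypothetical selectors
)
print(f'<script type="application/ld+json">\n{markup}\n</script>')
```

The resulting script tag goes in the page’s head or body; the selectors should point at short, self-contained passages that make sense when spoken.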

Don’t overlook schema.org markup

Schema.org markup (often referred to as schema markup) is a vocabulary of tags that you can add to your HTML to give crawlers key information about your pages and how they should be represented within SERPs.

Example: See how this website is getting star ratings using Schema markup.

As you can see, the webmaster did a good job of adding all the schema elements.

Schema.org can be used to describe a range of elements including:

  • Creative work

  • Events

  • Organisations

  • People

  • Places

  • Products

You can find a list of items for markup on the Schema.org website.

Schema.org markup is used by a range of major search engines (including Yahoo and Bing) and is part of a great collaboration between them.

As you can imagine, there are a range of benefits to implementing the markup on your site.

I reached out to Dave Ojeda, a leading schema markup expert, and asked him about the future of schema and why it’s important to implement it. His response: “Google has already proven that it is essential by providing benefits via search features if structured data is set up correctly. Moving forward, with the advancement of voice search, Google Speakable (beta) gives us a glimpse of how Google may let us influence those Q&A-type searches.

I’m looking forward to when Speakable is available to all sites, not just verified news sites.”

John Mueller confirmed that Google prefers JSON-LD schema markup, so it’s recommended that you use it.

Here is an example Person schema I have implemented on this website.

Person JSON Schema example
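For reference, a Person block in JSON-LD can be generated like this; the name, job title, and profile URLs below are placeholders rather than my actual markup:

```python
import json

# Placeholder values — substitute your own details before publishing.
person = {
    "@context": "https://schema.org",
    "@type": "Person",
    "name": "Jane Doe",
    "jobTitle": "SEO Consultant",
    "url": "https://www.example.com",
    # sameAs links your social profiles to the same entity.
    "sameAs": [
        "https://twitter.com/janedoe",
        "https://www.linkedin.com/in/janedoe",
    ],
}
print(f'<script type="application/ld+json">\n{json.dumps(person, indent=2)}\n</script>')
```

Once the script tag is on the page, you can validate it with Google’s structured data testing tools before relying on it.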

There are several standalone schema plugins for WordPress; Schema Pro is a popular choice. You can also use an SEO plugin for this purpose, as most of the popular SEO plugins now support schema. My favourite is Rank Math. You can also check out my review of Rank Math here.

Utilise social media to your advantage

Google has always been hazy about whether active social media channels offer a ranking boost to websites, although there are some advantages to having and using them on a regular basis.

The first thing to remember is that social media profiles can also rank in search results. Although social shares might not affect how well a page ranks, they can affect how much exposure a profile receives in results pages.

This is particularly important for brand names, as keeping consistent and informative profiles across platforms will be key for the impression that they leave on people searching for them in search engines.

It’s also worth noting that many social platforms, including Twitter and Facebook, have their own search facilities, so maintaining a good profile on them is important for driving traffic to your site.

Another helpful tip: sharing your content on social media helps Google index your links quickly. One of my favourite ways of getting a link indexed is to share it on Twitter.

Bonus tip: Log file Analysis

What if you could understand how Googlebot behaves on your website? This is where log analysis comes into play. In simple terms, every time you visit a site, an entry is recorded in a log file on the server.

The log file stores specific information about each visitor, and the same goes for Googlebot, so the process involves analysing the Googlebot data. You can see which type of bot is hitting your pages, the top folders, the top pages, and so on.

Example log entry from Googlebot

Once you have this information, it’s easy to locate issues and make the necessary changes.
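As a small illustration (assuming the common ‘combined’ log format; real formats vary by server), here is how you might filter Googlebot hits and count the top requested pages; the log lines below are made up for the example:

```python
import re
from collections import Counter

# Combined log format: IP - - [time] "METHOD path HTTP/x" status size "referrer" "user agent"
LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[\d.]+" (\d{3}) .*"([^"]*)"$')

def googlebot_top_pages(log_lines):
    """Count requested paths for lines whose user agent mentions Googlebot."""
    hits = Counter()
    for line in log_lines:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group(3):
            hits[match.group(1)] += 1
    return hits.most_common()

# Hypothetical log lines for illustration.
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /dog-bowls HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2023:13:55:40 +0000] "GET /dog-bowls HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/Oct/2023:13:56:02 +0000] "GET /cat-food HTTP/1.1" 200 256 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_top_pages(sample))   # [('/dog-bowls', 2)]
```

Note that anyone can claim a Googlebot user agent, so serious log analysis also verifies the crawler’s IP addresses; the dedicated tools below handle that and the format differences for you.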

According to John Mueller and many SEO experts, log analysis is one of the most underrated strategies in this space.

One of the primary outcomes of log analysis is optimising for crawl budget. Having said that, if you have a small to medium-sized site, you shouldn’t worry too much about crawl budget.

I’m keeping this as non-technical as possible; however, if you’re interested in learning more about this, check the following resource.

When it comes to tools, it’s difficult to recommend a specific one, since log files don’t always have a standard format: the format changes depending on the server type, and there are custom types of log files as well.

There are many log analysers, and my favourite is the Screaming Frog Log File Analyser (paid), but there are free alternatives available. Seolyzer is an excellent tool and it’s free to use. Choose your tool based on your needs and technical stack.

Log Hero works nicely with Google Analytics. Image credit: log-hero.com

Conclusion

I hope you find these SEO tips useful and put them to good use in your upcoming campaigns. Let me know your feedback in the comments.