WE'RE A FULL SERVICE CREATIVE DESIGN AND DIGITAL AGENCY.
DESIGN. DEVELOP. DELIVER.

ARCHIVE POSTS


Searchmetrics recently released its annual study of Google’s top search ranking factors. The study is used as a comparative benchmark by webmasters, online marketers and SEOs to identify patterns and trends. The company’s historical database contains over 250 billion pieces of information, such as keyword rankings, search terms, social links and backlinks, with global, mobile and local data covering organic and paid search plus social media.

 

The full 63-page report can be downloaded here. We thought we’d take a look at the key takeaways from the report, which measures the top 20 search results for 10,000 keywords:

 

Ranking factors are more personal

 

Much has changed in recent years, and universal ranking factors are certainly a thing of the past. These days each individual search query has its own ranking factors, which are continually changing. Benchmarks like the Searchmetrics annual report provide key insights that search specialists can use to understand what impact Google’s evaluations will have on their clients’ rankings.

 

Content remains king, as does user intent

 

We all know content is king, with the main challenge for SEOs being to ensure each post is relevant to users. Word count continues to be important, with the top-ranking sites exceeding an average of 1,000 words. Continuing to write content specific to your readers, and on a regular basis, remains of prime importance. Search queries can vary enormously, however, so the more specific the content is, the more likely it is to be found and shared.

 

User intent is an essential part of the mix for SEOs. Regularly studying the SERPs for a brand’s basic keywords will help uncover what people are looking for. Likewise, customer data will keep SEOs up to speed with industry developments, while focusing on user intent, rather than just keyword research, will help you better answer search queries.

 

Technical factors continue to be important  

 

In addition to having relevant content, web pages need to provide a smooth user experience and be fully optimised. Ensure your website is mobile friendly, as these sites typically rank higher in Google search than those that are not. Tips on how to make your website mobile friendly can be found here. You should also review how quickly your site loads and how large its files are, as well as covering your on-page technical basics. Nearly half the pages in the top 20 of the Searchmetrics report were encrypted using HTTPS.
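If you want a quick way to sanity-check some of these basics yourself, the sketch below (in Python, using the requests library, with a placeholder URL) checks whether a page redirects to HTTPS and reports its response time and download size. It’s illustrative only, not a substitute for a full technical audit.

  import requests

  # Minimal sketch: does the page redirect to HTTPS, how fast does it respond,
  # and how big is the download? The URL is a placeholder - use your own.
  url = "http://www.example.com/"
  response = requests.get(url, timeout=10)

  print("Final URL:", response.url)  # shows whether HTTP redirected to HTTPS
  print("Uses HTTPS:", response.url.startswith("https://"))
  print("Response time:", round(response.elapsed.total_seconds(), 2), "seconds")
  print("Page size:", round(len(response.content) / 1024), "KB")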

 

User signals provide Google with feedback

 

Google has a wealth of data generated from its search results, browsers and analytics, so it’s able to identify how satisfied a user is with their search results, with the evaluation happening in real time. Top ranking factors include click-through rates, time spent on a site and bounce rates. The average bounce rate for websites on the first page of Google is 46%, and the time on site for the top 10 sites is 3 minutes 10 seconds. Websites in positions 1 to 3 typically have an average CTR of 36%. When working on your keyword research and user intent, also consider local search as well as the topic.
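For clarity, here is how the metrics quoted above are calculated. The numbers in this Python snippet are made up purely to mirror the averages mentioned; in practice they would come from your own analytics and Search Console data.

  # Made-up counts chosen only to mirror the averages quoted above.
  clicks = 360
  impressions = 1000
  single_page_sessions = 46
  total_sessions = 100

  ctr = clicks / impressions * 100                           # click-through rate, as a percentage
  bounce_rate = single_page_sessions / total_sessions * 100  # sessions that left after one page

  print(f"CTR: {ctr:.0f}%")                  # 36%
  print(f"Bounce rate: {bounce_rate:.0f}%")  # 46%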

 

Backlinks have less influence on rankings

 

Although a website with an established link structure should never be underestimated, links are less influential when it comes to search results. There has been a dramatic drop over the past 12 months, with relevant content and user intent now ranking above them. It’s now possible for a site to have a higher Google ranking than a competitor even if it has fewer links. While this is topic dependent, it has largely come about because mobile users rarely link to content, even when they share or like it.

 

Backlinks do form part of Google’s algorithm, but they’re certainly not the driving force they were previously. Penguin is now a factor in the algorithm too, which means less stability: websites can move up and down rankings quickly as a result of others’ efforts. While you should ensure you keep your backlink profile clean, it’s important to continue with your outreach activity too. Links pointing to your website from sites with high domain authority will ensure you are seen as an authority in your niche.

 

So to summarise, we can expect content and user intent to increase in importance, with technical factors also a key driver in search results. Backlinks are on the decline and are now just one of many contributing factors in a site’s visibility.

 

 

For further information, complete our contact form today or call our Digital Marketing Manager Paul Mackenzie Ross on 020 3146 4341



In our first post tackling common SEO problems and how to overcome them, we covered:

  • duplicate content
  • title tags
  • meta descriptions
  • broken links and
  • image optimisation

These are just some of the technical issues faced on-page and off-page. In part 2, we take a look at another five SEO issues and what you can do to resolve them.

6. Missing alt attributes & broken images

Alt attributes (commonly and mistakenly called ALT tags – ALT is an attribute of an IMG tag) help search engines like Google understand what an image is about. If the attributes associated with that image are missing and there’s no description, it can cause SEO problems. Image alt attributes should include your keywords to ensure they’re categorised in the right way. We covered broken links in our last post – broken images cause similar issues in that they can lead to a poor user experience. Both these issues can be overcome by ensuring your alt attributes accurately describe your images. That way they will be properly indexed in search results too.
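As a rough illustration, the Python sketch below (using the requests and BeautifulSoup libraries, with a placeholder URL) flags images on a page that are missing alt text or that no longer load. It’s a simplified example rather than a complete image audit.

  import requests
  from bs4 import BeautifulSoup
  from urllib.parse import urljoin

  # Flag images with missing alt text, and images that no longer load. Placeholder URL.
  page_url = "https://www.example.com/"
  html = requests.get(page_url, timeout=10).text
  soup = BeautifulSoup(html, "html.parser")

  for img in soup.find_all("img"):
      src = urljoin(page_url, img.get("src", ""))
      alt = (img.get("alt") or "").strip()
      if not alt:
          print("Missing alt text:", src)
      # A HEAD request is usually enough to spot a broken image.
      if requests.head(src, timeout=10).status_code >= 400:
          print("Broken image:", src)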

7. Low text to HTML ratio

A low text-to-HTML ratio means there’s much more back-end HTML code than there is visible text. Often it’s a sign of a poorly coded website (for example, with above-average Javascript, Flash and inline styling), hidden copy, or a slow-loading site. You can increase the speed of your site by removing code that’s not needed. Also, move inline scripts and styles to separate files and add relevant on-page text where it’s required. Other aspects you might want to check include removing white space, using CSS for styling and formatting, resizing images (and removing those you don’t need) and keeping the size of your page under 300kb.
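As a rough guide, you can estimate the ratio yourself. The Python sketch below (requests plus BeautifulSoup again, with a placeholder URL) compares the length of the visible text with the length of the full HTML source.

  import requests
  from bs4 import BeautifulSoup

  # Compare visible text length with total HTML length. Placeholder URL.
  url = "https://www.example.com/"
  html = requests.get(url, timeout=10).text
  visible_text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

  ratio = len(visible_text) / len(html) * 100
  print(f"Text-to-HTML ratio: {ratio:.1f}%")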


8. H1 tag and title issues

A title tag is what appears in your search results, while an H1 tag is what visitors to your website see on the page itself. While multiple H1 tags can appear on a page, it’s important to get the hierarchy right to ensure your website is indexed in the right way. H1s should be consistent with title tags but not identical. Ideally, you should use one H1 tag per page, with H2 tags breaking up the content.
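A quick way to spot these issues on a single page is sketched below in Python (requests and BeautifulSoup, placeholder URL): it pulls the title tag, counts the H1s and flags an H1 that simply duplicates the title.

  import requests
  from bs4 import BeautifulSoup

  # Pull the title tag and count the H1s on a single page. Placeholder URL.
  url = "https://www.example.com/"
  soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

  title = soup.title.get_text(strip=True) if soup.title else ""
  h1s = [h1.get_text(strip=True) for h1 in soup.find_all("h1")]

  print("Title tag:", title)
  print("Number of H1 tags:", len(h1s))  # ideally just one per page
  if h1s and h1s[0] == title:
      print("The H1 duplicates the title tag exactly - consider varying them.")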

9. Low word count

When Google introduced its Panda updates way back in 2011, the idea was to reduce the amount of “thin content” in the search results. Around this time the notion that web pages should contain a minimum of 300 words came about, and that thinking still persists today, with even the popular Yoast SEO plugin for WordPress still touting the “recommended minimum 300 words”.

While there is no set word count needed to rank with a search engine, the preference is for long-form pages whose text includes relevant keywords and phrases. Google is known for ranking websites with more depth and longer content. Equally, visitors to your website want to see content that is relevant to the topic they searched for. Even if you’re sharing an image-led post or infographic, it will need some context behind it. Evergreen content is often popular, with lists, tips and how-to guides among the most well received.
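If you want to check where a page sits against that oft-quoted 300-word guideline, something like the Python sketch below will do (placeholder URL; the threshold is the informal one discussed above, not an official Google figure).

  import requests
  from bs4 import BeautifulSoup

  # Rough word count of the visible text on a page. Placeholder URL; the
  # 300-word figure is the informal minimum discussed above, not a Google rule.
  url = "https://www.example.com/blog/some-post/"
  soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

  words = soup.get_text(separator=" ", strip=True).split()
  print("Word count:", len(words))
  if len(words) < 300:
      print("This page may be considered thin content.")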

So remember: Google likes high-quality content. In its own words:

“…sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

10. Too many on-page links

While all websites will include on-page links, having too many links looks unnatural and can dilute the value of a page. It’s important, therefore, that links are relevant and useful. This way you can ensure your website will rank well and have a natural link profile. If you remove the low-quality links from your website, you will provide a better user experience, particularly for those accessing your website via a mobile or tablet. High-quality links will improve your SEO ranking.
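To get a feel for how many links a page is carrying, a simple tally like the Python sketch below can help (placeholder URL); it splits links into internal and external so you can review the external ones for relevance.

  import requests
  from bs4 import BeautifulSoup
  from urllib.parse import urlparse

  # Tally the links on a page and separate internal from external. Placeholder URL.
  url = "https://www.example.com/"
  domain = urlparse(url).netloc
  soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

  links = [a["href"] for a in soup.find_all("a", href=True)]
  external = [href for href in links if urlparse(href).netloc not in ("", domain)]

  print("Total links on page:", len(links))
  print("External links:", len(external))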

There are so many ranking signals that Google considers for SEO, and they are constantly changing and evolving. If you or your company needs help navigating the minefield that is search engine optimisation, feel free to get in touch and ask for a free SEO audit. Better still, you can let us evaluate your website speed and performance, security, mobile-friendliness and SEO in a complete website audit – claim your free website audit now.


If you need further help and assistance with these 10 SEO problems, get in touch with Woking web agency Clever Marketing on 020 3146 4341.

Our Digital Marketing Manager alone has nearly 20 years of SEO, PPC and content marketing experience, so he’ll be able to help you out.



Initially introduced back in 2012, Google’s Penguin algorithm is designed to identify unnatural backlinks in Google search results. As of September 2016, Penguin 4.0 was released and is now running in real time – this is set to be the last Penguin update. But what does it mean and how could it affect you? We take a look at the questions everyone is asking.


What is Google Penguin?

Google Penguin is a webspam algorithm designed to capture websites that have created unnatural backlinks to gain an advantage in search results. While other factors are taken into consideration to ensure websites meet webmaster guidelines, the primary focus is backlinks. Penguin finds unnatural links that webmasters use to manipulate search results.

What makes links important

If you have a well-respected site with good domain authority linking to your website, it’s like a recommendation. Equally, if you have a large number of smaller sites linking back to your website, this too can be effective. Anchor text can also play a part, as the clickable text of a hyperlink suggests that the website in question should be trusted.

What’s different about Penguin 4.0

With Penguin data now processed in real time, Google can continually re-crawl and re-index web pages. Refreshing data in this way means bad links devalue individual rankings (rather than the site receiving old-fashioned penalties), and those rankings can be recovered in real time. In previous versions, a Penguin update would penalise an entire domain; Penguin 4.0 is more granular, with ‘penalties’ issued for specific pages. It works by devaluing spammy links and adjusting a site’s Google ranking based on spam signals. Penguin is now part of the core Google algorithm, which takes into account more than 200 other signals that can affect rankings.

How to avoid being penalised

As has always been the case, webmasters should focus on creating compelling content that is updated regularly. The focus should be on the end user, making sites unique, valuable and engaging.

Avoid duplicate or thin content, use rich anchor text and have relevant links. Any links pointing to a web page need to have value to the end user, providing relevant information related to the product or service. Penguin penalties will mostly relate to links and anchor text, whether it’s external links from your website or incoming links.

What to do if rankings have been affected

With Penguin now running in real time, penalties can be cleared much quicker than they were previously, so don’t panic. In fact, Penguin recoveries are already being reported. The process for cleaning up your site will likely include checking backlinks and undertaking a new link-building campaign, supported by social media, to re-establish authority in search results.


Worried that you may have been affected by the Penguin update? Get in touch and see how our SEO services could help!

Call us on 020 3146 4341, email [email protected] or fill in our contact form.



We all know that content is king, but with search engines ranking sites by popularity, it’s important to maintain your site regularly to iron out any on-page and off-page technical issues. We take a look at five of the most common SEO problems and how to overcome them.


(more…)


How Do I Rank Higher on Google?
26th February 2018

This New F1 Logo
14th February 2018

Never Be Afraid to Link Out
10th December 2017

Woking Works for SMEs
16th November 2017

Meet Flossy – The New UKIP Logo
29th September 2017

Happy Birthday, HG Wells
21st September 2017

How has Link Building Changed?
10th September 2015

Go Big or Go Home?
11th June 2015

Should You Build An App?
5th March 2015