WE'RE A FULL SERVICE CREATIVE DESIGN AND DIGITAL AGENCY.
DESIGN. DEVELOP. DELIVER.

ARCHIVE POSTS


Searchmetrics recently released its annual study of Google’s top search ranking factors. The study is used as a comparative benchmark by webmasters, online marketers and SEOs to identify patterns and trends. The company’s historical database contains over 250 billion pieces of information, such as keyword rankings, search terms, social links and backlinks, with global, mobile and local data covering organic and paid search plus social media.

The full 63-page report can be downloaded here. We thought we’d take a look at the key takeaways from the report, which measures the top 20 search results for 10,000 keywords:

 

Ranking factors are more personal

Much has changed over recent years, with universal ranking factors now a thing of the past. These days each individual search query has its own ranking factors, which are continually changing. Benchmarks like the Searchmetrics annual report provide key insights that search specialists can use to understand the impact Google’s evaluations will have on their clients’ rankings.

 

Content remains king, as does user intent

We all know content is king, and the main challenge for SEOs is ensuring each post is relevant to users. Word count continues to be important, with the top-ranking sites exceeding an average of 1,000 words. Writing content specific to your readers, and doing so regularly, remains of prime importance. Search queries can vary enormously, however, so the more specific the content is, the more likely it is to be found and shared.

 

User intent is an essential part of the mix for SEOs. Regularly studying the SERPs for a brand’s core keywords will help uncover what people are looking for. Likewise, customer data will keep SEOs up to speed with industry developments, while focusing on user intent, rather than just keyword research, will help you better answer search queries.

 

Technical factors continue to be important

In addition to having relevant content, web pages need to provide a smooth user experience and be fully optimised. Ensure your website is mobile friendly, as these sites tend to rank higher in Google search than those that are not. Tips on how to make your website mobile friendly can be found here. You should also review how quickly your site loads and how large its files are, as well as covering the on-page technical basics. Nearly half the pages in the top 20 of the Searchmetrics report were encrypted using HTTPS.
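As a rough illustration of those on-page basics, the short sketch below flags two of them, HTTPS and a mobile viewport tag, from a page’s URL and raw HTML. The function is our own illustration, not part of any SEO tool, and a real audit goes much deeper:

```python
# A coarse sketch of two on-page technical checks: HTTPS and a mobile
# viewport tag. Illustrative only -- real audits cover far more.

def basic_technical_checks(url: str, html: str) -> dict:
    """Return simple pass/fail signals for a page's URL and HTML."""
    return {
        # Encrypted pages serve over HTTPS.
        "https": url.lower().startswith("https://"),
        # Mobile-friendly pages normally declare a viewport meta tag.
        "viewport_meta": 'name="viewport"' in html.lower(),
    }

page = '<html><head><meta name="viewport" content="width=device-width"></head></html>'
print(basic_technical_checks("https://example.com/", page))
# {'https': True, 'viewport_meta': True}
```

Checks like these only confirm the basics are in place; load time and file sizes still need measuring with proper tooling.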

 

User signals provide Google with feedback

Google has a wealth of data generated from its search results, browsers and analytics, so it’s able to identify how satisfied a user is with their search results, with an evaluation created in real time. Top ranking factors include click-through rates, time spent on a site and bounce rates. The average bounce rate for websites on the first page of Google is 46%, and time on site for the top 10 sites averages 3 minutes 10 seconds. Websites in positions 1 to 3 typically have an average CTR of 36%. When working on your keyword research and user intent, also consider local search as well as the topic.
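To make those benchmarks concrete, here is a small sketch that compares a page’s own analytics figures against the averages quoted above (46% bounce rate, 3 minutes 10 seconds on site, 36% CTR for positions 1 to 3). The thresholds come from the report; the helper function itself is hypothetical:

```python
# First-page averages quoted in the Searchmetrics report.
BENCHMARKS = {
    "bounce_rate": 0.46,     # first-page average; lower is better
    "time_on_site_s": 190,   # 3 minutes 10 seconds; higher is better
    "ctr": 0.36,             # positions 1-3 average; higher is better
}

def vs_benchmark(bounce_rate: float, time_on_site_s: int, ctr: float) -> dict:
    """Flag which of a page's engagement signals beat the benchmark."""
    return {
        "bounce_ok": bounce_rate <= BENCHMARKS["bounce_rate"],
        "time_ok": time_on_site_s >= BENCHMARKS["time_on_site_s"],
        "ctr_ok": ctr >= BENCHMARKS["ctr"],
    }

print(vs_benchmark(bounce_rate=0.52, time_on_site_s=200, ctr=0.30))
# {'bounce_ok': False, 'time_ok': True, 'ctr_ok': False}
```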

 

Backlinks have less influence on rankings

Although a website with an established link structure should never be underestimated, links are now less influential when it comes to search results. There has been a dramatic drop over the past 12 months, with relevant content and user intent ranking above them. It’s now possible for a site to rank higher in Google than a competitor even if it has fewer links. While this is topic dependent, it has largely come about because content found via mobile search is often shared or liked rather than linked to.

 

Backlinks do form part of Google’s algorithm, but they’re certainly not the driving force they once were. Penguin is now a factor in the algorithm too, which means less stability: websites can move up and down the rankings quickly as a result of others’ efforts. While you should keep your backlink profile clean, it’s important to continue with your outreach activity too. Links pointing to your website from sites with high domain authority will help ensure you are seen as an authority in your niche.

 

So to summarise, we can expect content and user intent to increase in importance, with technical factors remaining a key driver in search results. Backlinks are on the decline and are now just one of many factors contributing to a site’s visibility.

 

 

For further information, complete our contact form today or call our Digital Marketing Manager, Paul Mackenzie Ross, on 020 3146 4341.



Initially introduced back in 2012, Google’s Penguin algorithm is designed to identify unnatural backlinks in Google search results. In September 2016, Penguin 4.0 was released and is now running in real time; Google has said this will be the last Penguin update. But what does it mean, and how could it affect you? We take a look at the questions everyone is asking.

 

What is Google Penguin?

Google Penguin is a web spam algorithm designed to capture websites that have created unnatural backlinks to gain an advantage in search results. While other factors are taken into consideration to ensure websites meet webmaster guidelines, the primary focus is backlinks. Penguin finds unnatural links that webmasters use to manipulate search results.

 

What makes links important?

If you have a well-respected site with good domain authority linking to your website, it acts like a recommendation. Equally, if you have a large number of smaller sites linking back to your website, this too can be effective. Anchor text can also play a part: as the clickable text of a hyperlink, it suggests the website in question should be trusted.
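Anchor text is simply the visible text inside a link. As a minimal sketch of the data an anchor-text audit starts from, the standard-library parser below pulls each link’s destination and anchor text out of a snippet of HTML (the class name is our own invention):

```python
# Collect (href, anchor text) pairs from HTML using only the standard
# library -- the raw material for a simple anchor-text audit.
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []      # list of (href, anchor_text) tuples
        self._href = None    # href of the <a> tag we are inside, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        # Accumulate text only while inside an <a> tag.
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorCollector()
parser.feed('<p>See <a href="https://example.com">our guide</a> here.</p>')
print(parser.links)  # [('https://example.com', 'our guide')]
```

Run over a full page, a list like this makes it easy to spot patterns such as repeated exact-match anchor text, which is exactly what Penguin looks for.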

 

What’s different about Penguin 4.0?

With Penguin data now in real time, Google can continually re-crawl and re-index web pages. Refreshing data in this way means bad links devalue the rankings of individual pages (rather than attracting old-fashioned penalties), and those pages can recover in real time. In previous versions, a Penguin update would penalise an entire domain; Penguin 4.0 is more granular, with ‘penalties’ applied to specific pages. It works by devaluing spammy links and adjusting a site’s Google ranking based on spam signals. Penguin is now part of the core Google algorithm, which comprises more than 200 signals that can affect rankings.

 

How to avoid being penalised

As has always been the case, webmasters should focus on creating compelling content that is updated regularly. The focus should be on the end user, making sites unique, valuable and engaging. Avoid duplicate or thin content and over-optimised, keyword-rich anchor text, and use relevant links. Any links pointing to a web page need to have value to the end user, providing relevant information related to the product or service. Penguin penalties will mostly relate to links and anchor text, whether it’s external links from your website or incoming links.

 

What to do if rankings have been affected

With Penguin now running in real time, penalties can be cleared much more quickly than before, so don’t panic. In fact, Penguin recoveries are already being reported. The process for cleaning up your site will likely include checking backlinks and undertaking a new link-building campaign, supported by social media, to re-establish authority in search results.

 

Worried that you may have been affected by the Penguin update? Get in touch and see how our SEO services could help!

 

How has Link Building Changed?

10th September 2015

Go Big or Go Home?

11th June 2015

Should You Build An App?

5th March 2015

Print’s Not Dead!

26th September 2014


Marketing Automation

6th July 2014