Searchmetrics recently released its annual study of Google’s top search ranking factors. The study is used as a comparative benchmark by webmasters, online marketers and SEOs to identify patterns and trends. The company’s historical database contains over 250 billion data points, such as keyword rankings, search terms, social links and backlinks, with global, mobile and local data covering organic and paid search plus social media.
The full 63-page report can be downloaded here. We thought we’d take a look at the key takeaways from the report, which measures the top 20 search results for 10,000 keywords:
Much has changed over recent years, with universal ranking factors certainly a thing of the past. These days each individual search query has its own ranking factors, and these are continually changing. Benchmarks like the Searchmetrics annual report, which provide key insights, can be used by search specialists to understand what impact Google’s evaluations will have on their clients’ rankings.
We all know content is king, with the main challenge for SEOs being to ensure each post is relevant to users. Word count continues to be important, with the top-ranking sites exceeding an average of 1,000 words. Writing content specific to your readers, and on a regular basis, remains of prime importance. Search queries can vary enormously, however, so the more specific the content, the more likely it is to be found and shared.
User intent is an essential part of the mix for SEOs. Regularly studying the SERPs for a brand’s basic keywords will help uncover what people are looking for. Likewise, customer data will keep SEOs up to speed with industry developments, while focusing on user intent, rather than just keyword research, will help you answer search queries more effectively.
In addition to having relevant content, web pages need to provide a smooth user experience and be fully optimised. Ensure your website is mobile-first, as mobile-friendly sites tend to rank higher in Google search than those that are not. Tips on how to make your website mobile friendly can be found here. You should also review how quickly your site loads and how large its files are, as well as covering your on-page technical basics. Nearly half the pages in the top 20 of the Searchmetrics report were encrypted using HTTPS.
Google has a wealth of data generated from its search results, browsers and analytics. As such, it can evaluate in real time how satisfied a user is with their search results. Top ranking factors include click-through rates, time spent on a site and bounce rates. The average bounce rate for websites on the first page of Google is 46%, with the time on site for the top 10 sites being 3 minutes 10 seconds. Websites in positions 1 to 3 typically have an average CTR of 36%. When working on your keyword research and user intent, also consider local search as well as the topic.
Although a website with an established link structure should never be underestimated, links are now less influential when it comes to search results. There has been a dramatic drop over the past 12 months, with relevant content and user intent ranking above them. It’s now possible for a site to outrank a competitor even if it has fewer links. While this is topic dependent, it has largely come about because mobile users rarely link to content, even when they share or like it.
Backlinks do form part of Google’s algorithm but they’re certainly not the driving force they were previously. Penguin is now a factor in the algorithm too, which means less stability: websites can move up and down rankings quickly as a result of others’ efforts. While you should ensure you keep your backlink profile clean, it’s important to continue with your outreach activity too. Links pointing to your website from high-domain-authority sites will help establish you as an authority in your niche.
So to summarise, we can expect content and user intent to increase in importance, with technical factors also a key driver in search results. Backlinks are in decline and are now just one of many contributing factors in a site’s visibility.
For further information, complete our contact form today or call our Digital Marketing Manager Paul Mackenzie Ross on 020 3146 4341
In our first post tackling common SEO problems and how to overcome them, we covered:
Those were just some of the technical issues faced on-page and off-page. In part 2, we take a look at another five SEO issues and what you can do to resolve them.
Alt attributes (commonly and mistakenly called ALT tags – ALT is an attribute of an IMG tag) help search engines like Google understand what an image is about. If the attributes associated with that image are missing and there’s no description, it can cause SEO problems. Image alt attributes should include your keywords to ensure they’re categorised in the right way. We covered broken links in our last post – broken images cause similar issues in that they can lead to a poor user experience. Both these issues can be overcome by ensuring your alt attributes accurately describe your images. That way they will be properly indexed in search results too.
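To see how you might audit a page for missing alt attributes yourself, here is a minimal sketch using Python’s standard-library `html.parser`. The class name and sample markup are illustrative, not part of any particular tool.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> whose alt attribute is absent or empty."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing alt, or alt=""
                self.missing.append(attrs.get("src", "(no src)"))

# Hypothetical page fragment: one good image, one empty alt, one missing alt.
page = """
<img src="logo.png" alt="Clever Marketing logo">
<img src="hero.jpg" alt="">
<img src="chart.png">
"""

checker = MissingAltChecker()
checker.feed(page)
print(checker.missing)  # → ['hero.jpg', 'chart.png']
```

In practice you would feed the checker the HTML of each page on your site and write descriptive, keyword-relevant alt text for every image it flags.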
You can increase the speed of your site by removing code that’s not needed. Also, move inline scripts and styles to separate files and add relevant on-page text where it’s required.
Other aspects you might want to check include removing white spaces, using CSS for styling and formatting, resizing images (removing those you don’t need) and keeping the size of your page under 300kb.
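As a quick way to sanity-check the 300kb guideline above, the sketch below measures a page’s raw and gzip-compressed weight. The budget constant and sample markup are assumptions for illustration; real pages would be read from disk or fetched over HTTP.

```python
import gzip

PAGE_BUDGET_BYTES = 300 * 1024  # the 300kb guideline mentioned above

def page_weight(html: str) -> dict:
    """Report raw and gzipped size of a page and whether it fits the budget."""
    raw = html.encode("utf-8")
    return {
        "raw_bytes": len(raw),
        "gzipped_bytes": len(gzip.compress(raw)),
        "within_budget": len(raw) <= PAGE_BUDGET_BYTES,
    }

# A tiny illustrative page; swap in your real HTML source.
sample = ("<html><head><title>Demo</title></head><body>"
          + "<p>hello</p>" * 50
          + "</body></html>")

report = page_weight(sample)
print(report["within_budget"])  # → True for this tiny sample
```

Most servers send pages gzip-compressed, so comparing both numbers shows how much of your weight problem compression already solves and how much needs real trimming.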
A title tag is what appears in your search results, while an H1 tag is what visitors to your website see on a page. While multiple H1 tags can appear on a page, it’s important to get the hierarchy right to ensure your website is indexed correctly. H1s should be consistent with title tags, but not identical. Ideally, use one H1 tag per page, with H2 tags breaking up the content.
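The one-H1 guideline is easy to check programmatically. A minimal sketch, again using the standard-library `html.parser` with illustrative markup:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Counts H1 and H2 tags so the one-H1-per-page guideline can be verified."""

    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Hypothetical page: one H1, with H2s breaking up the content.
page = """
<h1>Common SEO Problems</h1>
<h2>Duplicate content</h2>
<h2>Missing meta descriptions</h2>
"""

audit = HeadingAudit()
audit.feed(page)
print(audit.counts)             # → {'h1': 1, 'h2': 2}
print(audit.counts["h1"] == 1)  # → True: exactly one H1, as recommended
```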
When Google introduced its Panda updates way back in 2011, the idea was to reduce the amount of “thin content” in the search results. Around this time the notion that web pages should contain a minimum of 300 words came about and that thought still persists today with even the popular Yoast SEO plugin for WordPress still touting the “recommended minimum 300 words”.
While there is no set word count to rank with a search engine, the preference is long-form pages with text that includes keywords and phrases. Google is known for ranking websites with more depth and longer content. Equally, visitors to your website want to see content that is relevant to the topic they searched for. Even if you’re sharing an image-led post or infographic, it will need some context behind it. Evergreen content is often popular, with lists, tips and how-to guides the most well-received.
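If you want to check drafts against the oft-cited 300-word minimum, a simple word count is enough. A minimal sketch (the threshold is the Yoast guideline mentioned above, not a hard Google rule):

```python
RECOMMENDED_MINIMUM = 300  # the oft-cited Yoast guideline, not a Google requirement

def word_count(text: str) -> int:
    """Rough word count: whitespace-separated tokens."""
    return len(text.split())

# Hypothetical draft snippet.
draft = ("Evergreen content is often popular, with lists, tips "
         "and how-to guides the most well-received.")

count = word_count(draft)
print(count)                         # → 14
print(count >= RECOMMENDED_MINIMUM)  # → False: this draft needs more depth
```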
So remember – Google likes high-quality content. In its own words:
“…sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
While all websites will include on-page links, having too many links is unnatural and can dilute the value of a page. It’s important therefore that links are relevant and useful. This way you can ensure your website will rank well and have a natural link profile. If you remove the low-quality links from your website, you will provide a better user experience, particularly for those accessing your website via a mobile or tablet. High-quality links will improve your SEO ranking.
There are so many ranking signals that Google considers for SEO, and they are constantly changing and evolving. If you or your company needs help navigating the minefield that is search engine optimisation, feel free to get in touch and ask for a free SEO audit. Better still, you can let us evaluate your website speed & performance, security, mobile-friendliness and SEO in a complete website audit – claim your free website audit now.
If you need further help and assistance with your 10 SEO problems, get in touch with Woking web agency Clever Marketing on 020 3146 4341.
Our Digital Marketing Manager alone has nearly 20 years of SEO, PPC and content marketing experience, so he’ll be able to help you out.
Initially introduced back in 2012, the purpose of Google’s Penguin algorithm is to identify unnatural backlinks in Google search results.
As of September 2016, Penguin 4.0 was released and is now running in real time – this will be Google’s last Penguin update. But what does it mean and how could it affect you? We take a look at the questions everyone is asking.
Google Penguin is a webspam algorithm designed to capture websites that have created unnatural backlinks to gain an advantage in search results. While other factors are taken into consideration to ensure websites meet webmaster guidelines, the primary focus is backlinks. Penguin finds unnatural links that webmasters use to manipulate search results.
If you have a well-respected site with a good domain authority linking to your website, it’s like a recommendation. Equally, if you have a large number of smaller sites linking back to your website, this too can be effective. Anchor text can also play a part as it’s clickable text with a hyperlink suggesting the website in question should be trusted.
With Penguin data now processed in real time, Google can continually re-crawl and re-index web pages. Refreshing data in this way means bad links devalue individual rankings (rather than attracting old-fashioned penalties), and sites can recover in real time. In previous versions, Penguin updates would penalise an entire domain; Penguin 4.0 is more granular, with ‘penalties’ applied to specific pages. It works by devaluing spammy links and adjusting a site’s Google ranking based on spam signals. Penguin is now part of the core Google algorithm, which comprises around 200 other signals that can affect rankings.
As has always been the case, webmasters should focus on creating compelling content that is updated regularly. The focus should be on the end user, making sites unique, valuable and engaging.
Avoid duplication or thin content, use rich anchor text and have relevant links. Any links pointing to a web page need to have value to the end user, providing relevant information related to the product or service. Penguin penalties will mostly relate to links and anchor text, whether it’s external links from your website or incoming links.
With Penguin now in real time, penalties can be cleared much quicker than they were previously, so don’t panic. In fact, Penguin recoveries are already being reported. The process for cleansing your site will likely include checking backlinks and undertaking a new link building campaign, supported by social media, to re-establish authority in search results.
Worried that you may have been affected by the Penguin update? Get in touch and see how our SEO services could help!
We all know that content is king, but with search engines ranking sites by popularity, it’s important to maintain your site regularly to iron out any on-page and off-page technical issues.
We take a look at five of the most common SEO problems and how to overcome them.
Duplicate content occurs when content appears on the Internet in more than one place. This makes it difficult for search engines to identify which version to include in their index and rank for search queries. Although there isn’t a specific penalty for duplicated content, it can affect ranking and traffic.
Your SEO services agency can run an SEO audit initially so you can begin to assess the situation. If the duplicated content is something you can control, you can re-direct to the most authoritative page. You can also use the rel=canonical attribute which prioritises which duplicate page to rank.
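A `rel="canonical"` link is just a `<link>` element in the page’s `<head>`, so checking which page a URL declares as canonical is straightforward. A minimal sketch with `html.parser`; the example domain is hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the rel="canonical" href – the version search engines should prefer."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

# Hypothetical <head> fragment from a duplicate page.
head = '<link rel="canonical" href="https://www.example.com/blog/seo-problems/">'

finder = CanonicalFinder()
finder.feed(head)
print(finder.canonical)  # → https://www.example.com/blog/seo-problems/
```

Running this across all the duplicate versions of a page quickly shows whether they agree on a single authoritative URL.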
Title tags are used to tell search engines and visitors what a page is about in an accurate and succinct way. This same title will also form the anchor text when shared on other websites and across social media.
Title tags are an important element within SEO, ranking more highly the closer the start of it is to a keyword-based query. Google can override your title if it doesn’t like it. If you’re working with an SEO services agency, they can help optimise your title tags when it comes to length, keywords and relevance.
A meta description is a short description that appears under your URL in search results. Placed in the HTML of a website, it’s a roughly 150-character snippet of the content – a great way to show visitors the page is relevant to their search. This is something you can rectify via your SEO services agency, or do yourself, depending on your site platform and expertise. Meta description errors can be found in SEO audit software, for example under SEO > Site Auditor > Meta. If you’re a WordPress user, you can use the Yoast SEO plugin to edit the meta description. These extracts are crucial as they influence the traffic you receive from specific search results.
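Because the meta description is just a `<meta name="description">` element, you can pull it out and check its length against the 150-character snippet limit mentioned above. A minimal sketch with illustrative markup:

```python
from html.parser import HTMLParser

MAX_LENGTH = 150  # approximate snippet length shown in search results

class MetaDescriptionAudit(HTMLParser):
    """Extracts the content of <meta name="description">, if present."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "description":
                self.description = attrs.get("content", "")

# Hypothetical <head> fragment.
head = '<meta name="description" content="Five common SEO problems and how to fix them.">'

audit = MetaDescriptionAudit()
audit.feed(head)
print(audit.description)
print(len(audit.description) <= MAX_LENGTH)  # → True
```

A page where `description` comes back as `None` is exactly the kind of error an SEO audit tool would flag.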
A broken link and subsequent 404 error results in a poor user experience and can have a negative impact on search results too. Broken links normally come about if an old page is deleted, moved or if the URL has been changed. The same can apply with outbound links whereby a website you’re linking to alters so it’s worthwhile checking for broken links regularly. You can use your SEO services agency to crawl for internal and external errors, these can then be fixed or removed manually.
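A basic broken-link check is: collect every `href` on the page, request each one, and flag anything returning a 4xx status. The sketch below keeps the fetching injectable so it runs offline; the page fragment and status table are hypothetical, and in real use `fetch_status` would wrap an HTTP HEAD request.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gathers the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken(html, fetch_status):
    """fetch_status(url) -> HTTP status code; injected so the check runs offline."""
    collector = LinkCollector()
    collector.feed(html)
    return [url for url in collector.links if fetch_status(url) >= 400]

# Hypothetical page and status lookup standing in for live HTTP requests.
page = '<a href="/about">About</a> <a href="/old-page">Old</a>'
statuses = {"/about": 200, "/old-page": 404}

print(find_broken(page, statuses.get))  # → ['/old-page']
```

Links the check flags can then be fixed with a redirect or removed, exactly as described above.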
Images are the best way to bring a website or post to life, making the content more shareable. But in much the same way as broken links, broken images can lead to a poor user experience. Images need to be indexed properly too, with alt attributes used to provide an accurate description for search engines. Google image search receives over a billion page views every day. Optimised images are more likely to receive regular traffic from Google and image-based sites like Pinterest. You can work with your SEO services agency to identify which images are impacting your website, replacing or deleting those affected.
These are just a few of the most common SEO problems companies currently face; we’ll be back to address more SEO issues in part 2. If you have a specific challenge you’d like us to include, feel free to contact Paul Mackenzie Ross at Clever Marketing, Surrey’s digital marketing agency, on 020 3146 4341.