The functionality of Google Search Console
The Performance section shows you how your website is performing in Google Search. The data is presented as a line graph so you can monitor your improvements over time, and you can break it down into smaller segments to get specific query and page data.
Total clicks refers to the number of times people have clicked through to your website from Google Search.
Total impressions does not count the number of people who have actually seen your webpage; it counts how many times your webpage was shown in the search results.
Average CTR is the percentage of searchers who clicked through to your webpage after seeing it in the results. This number will vary as you add new search terms and pages.
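To make the relationship between these three metrics concrete, here is a minimal sketch of the arithmetic behind the average CTR figure (the numbers are made up for illustration):

```python
def average_ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: clicks divided by impressions."""
    if impressions == 0:
        return 0.0
    return 100 * clicks / impressions

# e.g. 50 clicks from 1,000 impressions:
print(average_ctr(50, 1000))  # 5.0
```

So a page shown 1,000 times that earns 50 clicks has a 5% CTR; Search Console averages this across whichever queries and pages you have filtered.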
Average position is where your pages rank, on average, across your search terms when they are displayed to searchers. As you target more competitive keywords, this position will drop until you start building authority. You can break this down by page, which is more helpful.
Queries are the keywords Google searchers have used to discover your page. Usually, the keywords you rank in positions 1-10 for will have the highest number of clicks associated with them. Low clicks just mean the keywords are present; you will want to build on these to improve your average position.
Location is important as it tells you which countries searchers are in when they discover your website. If you have a lot of traffic from India but only sell to the UK, that traffic will not convert, which lowers the engagement rate on your website and, in turn, your overall SEO performance. Monitor this to make sure your visitors are finding you for the right search terms in the right countries.
Pages is a good category because it allows you to see which pages are getting the most traffic. You want to make sure these pages have a user journey directing visitors from the page they land on to converting into a lead for your business.
Devices are often overlooked, but this report is very useful because it allows you to get to know your audience. If you are a B2B business and getting mostly desktop views, this makes sense: your search terms are focused on people who work for a company and, in most cases, use an office PC or laptop. If your users are mostly on mobile, you want to make sure your website is optimised for mobile and has quick user journeys so as to not lose their attention.
This section shows you the dates and the number of clicks to your website over time. It can be a good tool to measure the success of a marketing campaign, e.g. if you released a newspaper article and there were more searches for your business as a result.
This is the best tool in my opinion because I absolutely love seeing a webpage get indexed. If you followed the last document, you will have written yourself a great article optimised for SEO and you want to see it rank. If you have submitted your sitemap, Google will find these pages… eventually. But with this great tool you can also tell the algorithm directly that the webpage is there.
Search the URL of your blog article. This will return crawl data, stats and whether the page has been included in your sitemap. You then have two options: index the page or view the crawled page.
View crawled page: this option appears when the page has been crawled, meaning Google knows it exists. Viewing the crawled data allows you to check that Google has the most up-to-date version of the page. In the Blog Article Doc, I mentioned using your inspect panel to see how your website looks to the search bot; use this to check the info matches up. If it does not, request indexing; if it does, Google will index the page as soon as it can. If the page remains unindexed, make sure there are no errors on the page and that the meta descriptions match the page.
Request indexing: this tells Googlebot to crawl the page and index it if it meets the criteria for indexing. It adds your webpage to the crawl queue immediately, without having to wait for Google's automatic crawl to discover it.
Sitemaps are a roadmap of all the important pages and posts on your website. They make it much easier for the search engine to categorise and rank your website when it knows what it is looking for and which pages you would like it to rank.
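As an illustration (the actual entries depend on your site, and the URLs here are placeholders), a sitemap is simply an XML file in the sitemaps.org format listing the URLs you want found:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/my-article/</loc>
    <lastmod>2024-02-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs a `<loc>` (the page address); `<lastmod>` is optional but helps Google spot updated pages.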
Sitemaps only need to be added to Google Search Console once for the algorithm to know where this structured data is located. Google will recrawl your website periodically; how frequently this occurs depends on your website, traffic, industry and the content itself. However, there are some reasons why you may need to add it more than once:
You have changed the URL where your sitemap is located.
You have made a lot of changes and created many new pages for your website.
You are having technical errors and need to quickly see if it can read your sitemap.
You have changed your robots.txt allow/disallow rules.
What should your sitemap address look like?
There is no set rule for what your sitemap address should look like, but it tends to follow general URL naming principles and most commonly takes this form:

https://www.example.com/sitemap.xml
If you have different sitemaps for different content types, the address might include a category and look like this:

https://www.example.com/post-sitemap.xml
If you are using a development agency, the sitemap will usually be created for you; ask them for the link. If you are using a DIY website builder like WordPress, you can get SEO plugins like Yoast, which generate and automatically update it for you and provide the address through the plugin itself.
Once submitted, you should get a success message showing how many pages Google has discovered. If you get an error, either Google cannot read your sitemap or the URL is incorrect.
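If you want to sanity-check that discovered-pages count against your own file, a quick sketch (assuming a standard sitemaps.org-format sitemap) is to parse the XML and count the `<url>` entries:

```python
import xml.etree.ElementTree as ET

# A tiny sample sitemap; in practice you would read the file your site
# generates (e.g. the output of an SEO plugin such as Yoast).
SITEMAP = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about/</loc></url>
  <url><loc>https://www.example.com/blog/my-article/</loc></url>
</urlset>"""

def count_sitemap_urls(xml_data: bytes) -> int:
    """Count <url> entries in a sitemaps.org-format sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_data)
    return len(root.findall("sm:url", ns))

print(count_sitemap_urls(SITEMAP))  # 3
```

If the number Search Console reports differs wildly from the number of entries in your file, that usually points to a sitemap it could not fully read.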
The Pages section tells you which pages have been crawled, which pages have errors and which pages have been indexed.
It is a common misconception that all of your website's pages need to be indexed. This is not the case; if you have a 100% indexing rate, there is usually something wrong! Only the pages that deliver value to your user and need to be found in search engines should be indexed.
What are things I shouldn’t index?
If you work with children, there are many rules around GDPR. You might hold data about a child, or images of children taking part in activities, online in a password-protected media vault for family members and guardians. You do not want these pages to be indexed by Google. Thus you want your page index table to say that X number of pages have been "blocked by robots.txt".
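That "blocked by robots.txt" status comes from a disallow rule in your site's robots.txt file. A minimal sketch, assuming a hypothetical /media-vault/ path (substitute the real path of your protected area):

```txt
User-agent: *
Disallow: /media-vault/
```

This asks all well-behaved crawlers to skip everything under that path; it does not replace the password protection itself, which is what actually keeps the content private.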
Things to look out for
404 pages are pages that do not exist on the website but are included in your sitemap. This section will tell you which pages are a dead end for your user so you can fix them. The fix depends on the issue: the URL path could be incorrect, the page could have too many redirects, or you may have removed the content without redirecting the link elsewhere. Once you have resolved the issue, you can validate the fix through Google Search Console and request that the page be recrawled and indexed. You do not need to resubmit your sitemap for this.
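For the removed-content case, the usual fix is a permanent (301) redirect from the old URL to its replacement. As one hedged example, if your site happens to run on an Apache server (other servers and CMS redirect plugins have their own mechanisms, and the paths here are hypothetical), a single line in your .htaccess file does it:

```apacheconf
# Permanently redirect the removed page to its replacement
Redirect 301 /old-article /new-article
```

Once the redirect is live, the old URL stops returning a 404 and the validation in Search Console should pass on the next recrawl.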
Crawled but not indexed
You want to keep an eye on this section to make sure the pages you want indexed are not stuck in it. Google can decide not to index a page for many reasons, but the top reason is that it does not deem the content primary or important for the user.
Other things to watch out for include:
If you are adding pages regularly to your website, the number of indexed pages should be increasing.
Watch for sudden drops in the number of pages indexed. This could indicate an indexing error, a broken sitemap or a new algorithm update from Google itself.
This section allows you to tell Google about outdated URLs it is indexing. This submits a removal request, and Google will remove the page from its index. You can check the status of this through your Pages section. This tool is great if you want to remove a page from Google Search quickly.
This section is extremely helpful for pinpointing issues that cause your users frustration on mobile devices. It will tell you if elements are too close together, etc., allowing you to go to your website and make changes to the mobile viewports and design.
For overall website performance, you will want to use a tool like Google Lighthouse or GTmetrix to pinpoint the areas in your website that require improving.