For anyone who owns a website and wants to gain organic traffic from Google Search, Google Search Console is a must-have tool. It not only gives you performance information but also gives you specific recommendations for how to enhance your site. This is reflected in a variety of Analytics Reports, including Mobile Usability, Core Web Vitals, Links Reports, and others. It allows you to submit a newer page on your site for indexing (so that it appears faster in Google’s search results) as well as a sitemap so that Google continues to read newer pages on your site on a regular basis.

In this in-depth article, you will learn everything that Google Search Console offers you. 

What is Google Search Console

Google Search Console is a free Google service that assists businesses with Search Engine Optimization. Google Search Console provides you with a wealth of information, including what keywords your site ranks for, where you rank for those keywords, how often visitors click your result after entering in specific queries, and what other sites have linked to your material.

How to Add Your Site to the Google Search Console

Log in to Google Search Console and select “Add Property” from the drop-down menu.

Then, in the “Domain” section, copy and paste your homepage URL. The next step is to validate your website. 

There are seven different techniques to verify your website. Here are the three simplest methods for getting your website verified:

  • HTML Document: Create a one-of-a-kind HTML file and upload it to your website.
  • CNAME or TXT Record: This is where you specify a unique CNAME or TXT record for your domain.
  • HTML Code Snippet: Simply add a little snippet of code (an HTML tag) to the <head> section of your homepage’s code.
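Before clicking “Verify,” it can help to confirm the tag actually made it into your homepage’s code. Here is a minimal sketch (the helper function and sample HTML are hypothetical, not part of Search Console) that checks for the verification meta tag:

```python
import re

def has_verification_tag(html: str) -> bool:
    """Return True if the HTML contains a google-site-verification meta tag."""
    pattern = r'<meta[^>]+name=["\']google-site-verification["\']'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Example: a snippet like the one Search Console asks you to paste
page = '<head><meta name="google-site-verification" content="abc123"></head>'
print(has_verification_tag(page))  # True
```

If this returns False for your live homepage HTML, the tag probably landed outside the head section or was stripped by your CMS.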

How to Set Your Target Country

Google is pretty good at determining which country your site is aimed at. To do so, they examine data such as:

  • Your Country Code Top-Level Domain (ccTLD) (for example, co.uk for UK sites)
  • Your website’s address (URL).
  • The location of your server.
  • The country from which you receive the most backlinks.
  • The language in which your content is written (English, French, etc.)

However, the more data you can provide Google, the better.

The following step is to define your target country within the (old) Google Search Console.

Step 1: Under “Search Traffic,” click the “International Targeting” link.

Step 2: Select “Country” from the drop-down menu.

Step 3: Select “Target Users in” from the drop-down menu.

Step 4: Choose your desired country from the drop-down menu.

And that’s all there is to it.

How to Link Google Analytics With Google Search Console

Connecting your Google Analytics and Google Search Console accounts is beneficial. For instance, you can reclaim your Keyword data using this connection.

Here’s how to do it:

  • Go to Google Analytics and log in. Then, at the bottom of the left menu, click the “Admin” button.
  • Select “Property Settings” from the drop-down menu.
  • Scroll down to find the “Adjust Search Console” button and click it.
  • Now click the “Add” button.
  • Scroll down to your website, tick the box, and click “Save.”

You’ve completed the task! Google Search Console and Analytics are now integrated.

How to Check for Security Issues

It’s time to check for Security Issues: see whether your site has any security vulnerabilities that could be affecting its SEO.

To do so, go to “Security Issues.”

Google will look for Security Issues. Even if it doesn’t find any, this report is worth a periodic look.

How to Add a Sitemap

You generally don’t need to submit a sitemap to Google if you have a small site. A sitemap is essential for larger sites (such as E-Commerce sites with thousands of pages).

This is how you do it:

  • Step 1: First, you must construct a sitemap. You should already have one if you’re using WordPress and the Yoast plugin.
  • Step 2: Go to Yoast to create a sitemap if you don’t already have one. Then, under “General/Features,” set the XML sitemaps setting to “On“:
  • Step 3: Click on the “See the XML Sitemap” link to access your sitemap:

It’s incredibly simple with the new Google Search Console.

  • Step 1: Take note of your sitemap’s URL. Then select “Sitemaps” from the drop-down menu.
  • Step 2: Click “Submit” after pasting in your URL.
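If you aren’t on WordPress/Yoast, a sitemap is just an XML file following the sitemaps.org protocol. A minimal sketch of generating one with Python’s standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```

Save the output as sitemap.xml at your site’s root, then submit that URL in the “Sitemaps” report as described above.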

That’s all there is to it:

How to Optimize Your Technical SEO With the Google Search Console

Although “Technical SEO” is a popular term these days, SEO began as a purely technical discipline in the early days of the internet, when “webmasters” attempted to manipulate search engine algorithms by tinkering with code and implementing, well, let’s call them questionable strategies, like keyword stuffing.

Google, of course, ultimately caught on to these dubious “black hat SEO” techniques and began issuing regular core updates. Because Google’s search algorithms now detected and penalized black hat SEO tactics, digital marketers were forced to change their approach and focus on driving Organic Search Traffic to websites with effective content marketing strategies.

As you may be aware, fixing these Technical SEO issues usually results in higher ranks and more traffic. Furthermore, the Google Search Console includes a plethora of tools to assist you in identifying and resolving Technical SEO issues.

How to Use The “Index Coverage” Report To Find (And Fix) Problems With Indexing

If your website is properly set up, Google will:

  1. Locate your page; 
  2. Add it to their index as quickly as possible.

However, things go awry from time to time. If you want Google to index all of your pages, there are a few things you’ll need to fix.

This is where the Index Coverage report comes into play.

What is the Index Coverage Report?

The Index Coverage Report shows you which pages from your website are indexed by Google. It also informs you of any technical issues that are preventing pages from being indexed.

It’s part of the new Google Search Console and takes the place of the old Google Search Console’s “Index Status” report.

How to Find Errors With The Index Coverage Report

There are four tabs at the top of the Index Coverage report:

  • Error
  • Valid with warnings
  • Valid 
  • Excluded

For the time being, let’s concentrate on the “Error” tab. As you can see, there are 54 errors on this website. The graph shows how that number has changed over time. When you click on an Error Status, you’ll be taken to a list of pages that have the same issue.

Select a URL from the drop-down menu. This brings up a panel on the side with four options:

But first, let’s use a browser to go to the URL. You can then double-check that the page is indeed down. Then, at the top of the page, paste your URL into the URL inspection form.

Googlebot will then run over to your website.

If you are still getting a “404 Not Found” error on this page, you have two choices:

  • Keep things the way they are. The page will eventually be deindexed by Google. This makes sense if the page is unavailable for a reason (for example, if you no longer sell that product).
  • Redirect the 404 page (via a 301) to a similar Product Page, Category Page, or Blog Post.
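The second option, redirecting retired pages to their closest live equivalent, boils down to a lookup table in your web server or application layer. A sketch of that idea (the URLs and helper are hypothetical):

```python
# Hypothetical mapping of retired product URLs to their closest live pages.
REDIRECTS = {
    "/products/old-widget": "/category/widgets",
    "/blog/outdated-post": "/blog/updated-post",
}

def resolve(path: str):
    """Return (status, location): a 301 to the mapped page if we have one,
    otherwise let the 404 stand so Google eventually deindexes the URL."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

print(resolve("/products/old-widget"))  # (301, '/category/widgets')
```

In practice you would wire this into your framework’s routing or your server’s redirect rules rather than hand-rolling it, but the logic is the same.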

Fixing “Soft 404” Errors

It’s now time to take care of those irritating “Soft 404” issues. Examine the URLs that contain the error once more. Then visit each URL in your browser.

Let’s test whether Google can reach the page without any problems. You’ll use the URL Inspection tool once again, but this time click the “Test Live URL” button. This sends Googlebot to fetch the page live. It also renders the page so you can see exactly how Googlebot perceives it.

After that, go to the More Info tab and look for any page resources that Google couldn’t load properly.

There are occasions when it’s necessary to stop Googlebot from accessing particular resources. However, these blocked resources can sometimes result in soft 404 errors.

In this scenario, however, all five of these resources are meant to be blocked. After you’ve resolved any indexing issues, click the “Request Indexing” button:

This tells Google that the page should be indexed. The page should be indexed the next time Googlebot visits.

Fixing Other Errors

You can solve any error you encounter using the same procedure I used to fix “Soft 404s”:

  • Open your browser and go to the page.
  • Input the URL into the “URL Inspection” tool.
  • Examine the specific issues that Google Search Console mentions.
  • Resolve any issues that arise.

Listed below are a few examples:

  • Errors with redirects.
  • Errors in the crawl.
  • Issues with the server.

You can correct almost any issue you encounter in the Coverage Report with a little effort.

How to Fix “Warnings” In The Index Coverage Report

In the Index Coverage report, click the “Valid with warnings” tab.

There’s only one warning this time: “Indexed, though blocked by robots.txt.”

According to Google Search Console, the page is being blocked by robots.txt. Rather than selecting “Fetch As Google,” select “Test Robots.txt Blocking“:

This takes us to the old Google Search Console’s robots.txt tester. This URL, it turns out, is being blocked by robots.txt.

If you want the page to be indexed, the solution is to unblock it in robots.txt. However, if you do not want it indexed, you have two choices:

  • Add the “noindex,follow” tag to the page and remove its block from robots.txt (Google has to crawl the page in order to see the tag).
  • Using the URL Removal Tool, delete the page.
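You can also test robots.txt rules outside Search Console. Python’s standard library includes a parser that answers the same question the robots.txt tester does; the rules below are hypothetical stand-ins for your own file:

```python
import urllib.robotparser

# Parse a hypothetical robots.txt locally, instead of fetching it,
# so you can test URLs the way the robots.txt tester does.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

Note that this stdlib parser is a simplification: it doesn’t support every wildcard pattern Googlebot understands, so treat it as a first check, not the final word.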

How To Use The URL Removal Tool In Search Console

The URL Removal Tool allows you to remove pages from Google’s index quickly and easily.

Unfortunately, this tool has not yet been updated to work with the new Google Search Console. To utilize it, you’ll need to use the old Google Search Console. In the new Google Search Console sidebar, expand the “Legacy Tools and Reports” tab, then click “Removals,” which will take you to the old Google Search Console.

Finally, copy and paste the URL you want to remove:

After you’ve double-checked that you’ve entered the correct URL, click “Submit Request.”

How to Check Indexed Pages for Possible Issues

Now let’s move on to the “Valid” tab. This tells us how many pages are indexed in Google.

What should you look for here? Two things:

Unexpected drop (or increase) of indexed pages

If you notice a sudden decline in the number of pages that have been indexed, it could indicate that something isn’t right:

  • Perhaps a number of pages are obstructing Googlebot.
  • Perhaps you added a noindex tag by accident.

In either case, you should absolutely check this out unless you deindexed a bunch of pages on purpose.

On the other hand, if you detect a significant increase in the number of indexed pages, this could be another indicator that something is wrong. (For example, you might have unblocked a number of pages that should have been blocked.)

An unexpectedly high number of indexed pages

If your site has, say, 1,000 pages, you’d expect to see around that many pages indexed when you look at the “Valid” report in Index Coverage. A much bigger number usually means duplicate content (for example, URLs with parameters) is being indexed.

Excluding Stuff

There are numerous reasons to prevent a page from being indexed by search engines.

  • It could be a login page.
  • It’s possible that the page has duplicate content.
  • Alternatively, the page could be of poor quality.

You must ensure that Google does not exclude pages that you WANT to be indexed. In this instance, there are a lot of excluded pages.

If you scroll down, you’ll see a list of reasons why each page isn’t in Google’s index.

Here’s what a few of the most common exclusion reasons mean:

“Page with redirect”

The page redirects to a different address, which is perfectly OK. Google will eventually stop trying to index that URL, unless there are backlinks (or internal links) pointing to it.

“Alternate page with proper canonical tag”

Google found an alternative version of this page somewhere else. That’s what a canonical URL is supposed to do. So that’s A-OK.

“Crawl Anomaly”

Google ran into an unspecified error while fetching these pages (often a 4xx- or 5xx-level server response). Try loading the URL in your own browser to see what’s going wrong. (Pages blocked by robots.txt show up under their own status, covered below.)

“Crawled – currently not indexed”

These are pages that Google has crawled but hasn’t indexed for some reason. Google does not specify why the page will not be indexed. However, this status usually indicates that Google doesn’t consider the page good enough to appear in the search results.

The best way to resolve this problem is to improve the quality of the listed pages. For example, if it’s a category page, add some content that describes the category. If the page has a lot of duplicate content, make it unique. If the content is thin, expand it.

Basically, make the page worthy of being indexed by Google.

“Submitted URL not selected as Canonical”

This status means the page contains the same information as a number of others, but Google considers a different URL preferable. As a result, this page has been removed from the index.

If you have duplicate content on several pages, you can apply the noindex meta robots tag to all duplicate pages except the one you want to be indexed.

“Blocked by robots.txt”

It’s worth double-checking these errors to make sure what you’re blocking is meant to be blocked. If it’s all good, then robots.txt is doing its job and there’s nothing to worry about. 

“Duplicate page without canonical tag”

The page is one of a group of duplicate pages that lacks a canonical URL. In this scenario, it’s rather simple to figure out what’s going on.

A number of PDF documents are available on the site, and these PDFs duplicate content from other parts of the website. This isn’t a big deal, to be honest. However, to be safe, you can have your web developer block these PDFs in robots.txt, so that Google only indexes the original content.
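A rule like the following would do it; this is only a sketch using Google’s wildcard syntax, and the path pattern is an assumption you should adapt to where your PDFs actually live:

```
User-agent: Googlebot
Disallow: /*.pdf$
```

Here `*` matches any characters and `$` anchors the rule to the end of the URL, so only URLs ending in .pdf are blocked.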

“Discovered – currently not indexed”

Google has crawled these pages but hasn’t included them in the index yet.

“Excluded by ‘noindex’ tag”

If a page you excluded on purpose shows up here, everything is fine: the noindex tag is doing its job. Just confirm you haven’t noindexed anything you want ranked.

What is the Performance Report?

Google Search Console’s Performance Report displays your site’s overall search performance in Google. This report displays not only how many clicks you receive, but also your CTR and Average Ranking Position.

This new Performance Report replaces the former Search Console’s “Search Analytics” report (and the old Google Webmaster Tools).

In the new Performance Report, much of the information is duplicated from the previous “Search Analytics” report. However, you can now do cool things with the information you obtain (like a filter to only show AMP results).

How to Get More Organic Traffic with the Performance Report

You can increase your Organic Traffic by studying some aspects of the Performance Report:

  • Analyzing Performance Data
  • Studying how User-Focused Content Performs in Discover Report
  • You Can Check Many Important Factors Using “URL Inspection”

How to Supercharge Your CTR With The Performance Report

CTR is undeniably an important Google ranking component, and the Performance Report is the place to work on improving it. So, let’s get started with the Performance Report in the new Google Search Console:

Find Pages With a Low CTR

To begin, select the tabs “Average CTR” and “Average Position“:

You should concentrate on pages that rank #5 or lower and have a low CTR. To do so, open the filter menu and check the “Position” box.

According to Advanced Web Ranking, the CTR for Google position #5 should be roughly 4.35 percent:

You want to filter out everything that is higher than the 4.35 percent predicted CTR. You can then concentrate on the pages that aren’t performing well. So go back to the filter menu and check the “CTR” box.

The CTR filter should then be set to “Smaller than 4.35”.
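If you export the report instead, the same filter is a few lines of Python. The rows below are hypothetical stand-ins for a Performance Report export:

```python
# Hypothetical rows exported from the Performance Report.
rows = [
    {"page": "/guide-a", "position": 5.2, "ctr": 0.021},
    {"page": "/guide-b", "position": 5.8, "ctr": 0.061},
    {"page": "/guide-c", "position": 3.1, "ctr": 0.012},
]

# Advanced Web Ranking's rough benchmark CTR for position #5.
EXPECTED_CTR_AT_5 = 0.0435

# Pages ranking #5 or lower whose CTR falls below the benchmark.
low_ctr = [r["page"] for r in rows
           if r["position"] >= 5 and r["ctr"] < EXPECTED_CTR_AT_5]
print(low_ctr)  # ['/guide-a']
```

These are the pages worth rewriting titles and descriptions for first.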

Find the Page

After that, you’ll want to examine which of your site’s pages ranks for the term you just discovered. Simply click on the query with the low CTR to do so. Then select “Pages” from the drop-down menu:

Take a look at ALL the Keywords this Page ranks for

It’s pointless to improve your CTR for one keyword only to hurt it for ten others. The Performance Report will show you all of the keywords for which your page ranks. Simply select “+ New” from the top bar and then “Page“.

Then type in the URL whose queries you wish to see. You’ll get a list of keywords for which that page ranks:

Optimize your Title and Description to get more Clicks

Power words demonstrate that someone can receive quick and easy results from your material, and they’ve been shown to draw clicks in the SERPs time and time again. Here are some examples of Power Words to include in your title and description:

  • Today
  • Right now
  • Fast
  • Works quickly
  • Step-by-step
  • Easy
  • Best
  • Quick
  • Definitive
  • Simple

So you can add a few of these Power Words to the page’s Title and Description Tag:

Monitor the Results

Finally, allow at least 10 days to pass, then log in again. Google may take a few days to reindex your page, and the updated page must be live for at least a week before you can collect useful data. Comparing CTR between two date ranges is a breeze with the new Google Search Console: simply select the date filter.

Finally, filter the results to only reveal search queries containing the term you discovered in step 1 (in this case, “top helmet brands”).

How to Find “LSI Keywords” With Google Search Console’s Performance Report

An “LSI keyword,” as used here, is a related phrase that ranks between positions 8 and 20 in the search results and receives a significant number of impressions. The benefits of targeting these keywords are:

  • Google already thinks your page is a good match for the keyword (otherwise, you wouldn’t be on page one). You can generally bump your page up to the first page if you give it some TLC.
  • You’re not depending on Third-Party SEO Tools’ shaky keyword volume statistics. The Google Search Console’s impression statistics will inform you EXACTLY how much traffic to expect.
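Finding these keywords in a Performance Report export is a simple filter: position between 8 and 20, impressions above some threshold. A sketch, with hypothetical query rows and an arbitrary impressions cutoff:

```python
# Hypothetical query rows from a Performance Report export.
queries = [
    {"query": "best bike helmets", "position": 12.4, "impressions": 8900},
    {"query": "helmet sizing chart", "position": 9.1, "impressions": 240},
    {"query": "road bike tires", "position": 3.0, "impressions": 15000},
]

def opportunity_keywords(rows, min_impressions=1000):
    """Queries ranking between #8 and #20 with meaningful impressions."""
    return [r["query"] for r in rows
            if 8 <= r["position"] <= 20 and r["impressions"] >= min_impressions]

print(opportunity_keywords(queries))  # ['best bike helmets']
```

Tune `min_impressions` to your site’s traffic; on a small site even a few hundred impressions can be worth chasing.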

How to Optimize for Opportunity Keywords In Google Search Console and Rank for Hundreds Of Longtails

One of your pages can now rank for hundreds, if not thousands, of long-tail keywords. A post about SEO tactics ranks for 4,000 different keywords, according to Ahrefs.

First and foremost, you must make your content really detailed. Because when you create engaging content, you immediately rank for hundreds of long-tail keywords. Simply go to Google Search Console and see what keywords your page is already ranking for.

Find the most frequently asked questions about your subject. Then, through your article, respond to them.

How to Find High-Impression Keywords

It is far better to consider High Impressions, CTR, and Average Position together. Clicks alone are less useful here, because they are simply a product of the CTR and impressions KPIs. If you rank well for some keywords but don’t match search intent, your webpage may receive a large number of impressions and an outstanding average position, which can give you the idea that it is performing really well. However, your CTR will undoubtedly suffer, and sooner or later you will lose those positions, so it is not a long-term win.

By simply searching for your keyword, you will get all the details including High Impressions:

Amazing Google Search Console Features

  • Confirming that Google can find your website. 
  • Fixing indexing problems and requesting re-indexing of new and updated content. 
  • Showing you the sites that link to your website. 
  • Viewing Google Search Traffic data about your site, that is, how often your site shows up in Search Results, the Search Queries that show your site, and more. 
  • Receiving alerts when Google encounters spam, indexing, and other problems on your site. 
  • Troubleshooting issues to do with mobile usability, AMP, and other search features.

Important Pages With Internal Links

Internal links are extremely effective. Unfortunately, most people use internal linking incorrectly. The Search Console provides a fantastic feature that can help: a report showing exactly which pages need internal link love. To get to it, choose “Links” in the Google Search Console sidebar. You’ll receive a report detailing the number of internal links pointing to each page on your site.

You can find the exact pages that link to a specific page internally. Simply hover your mouse over one of the URLs in the “Internal Links” column to see a list of all the internal links that point to that page.
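If you crawl your own site, you can build the same count yourself. A sketch using hypothetical (source, target) link pairs:

```python
from collections import Counter

# Hypothetical (source, target) pairs from a crawl of your own site.
links = [
    ("/blog/a", "/guide"),
    ("/blog/b", "/guide"),
    ("/blog/c", "/contact"),
]

# How many internal links point at each page.
internal_link_counts = Counter(target for _, target in links)

print(internal_link_counts.most_common())  # [('/guide', 2), ('/contact', 1)]
```

Pages at the bottom of this list (or missing from it entirely) are the ones that need internal links the most.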

Supercharge Key Posts With Internal Links From Powerhouse Pages

A “Powerhouse Page” is a page on your website that has a lot of good backlinks. More backlinks mean more authority that can be passed on via internal links.

Powerhouse Pages are easily found in the Google Search Console. Simply press the “Links” button once more. There’s also a section called “Top Linked Pages.” 

For a complete list, click “More.” The report is organized by the total number of backlinks by default. However, I prefer to arrange by the number of sites that link to you: these are your Powerhouse Pages.

All you have to do now is add internal links from those pages to the ones you wish to promote.

Advanced Tips and Strategies

You can use Google Search Console 

  • to increase your mobile CTR, 
  • to optimize your Crawl Budget, and 
  • to fix issues with mobile usability. 

Looking into these pointers can further improve your pages’ rankings.

Mastering Crawl Stats

The Crawl Stats report displays information about Google’s crawling activity on your website: for example, how many requests were made, what your server’s response time was, and whether there were any availability issues. You can use this report to see if Google ran into any server problems while crawling your site.

What is Crawl Budget?

Your Crawl Budget is the number of pages on your site that Google crawls each day. This number can still be found in the old “Crawl Stats” report.

Why is Crawl Budget Important for SEO?

If your website has 200,000 pages and a Crawl Budget of 2,000 pages every day, it’s possible that Google will take 100 days to crawl your website.

So, if you make a modification to one of your pages, it may take months for Google to evaluate the update, or if you add a new page to your site, it will take an eternity for Google to index it.
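The arithmetic behind that claim is simple; a back-of-the-envelope sketch (assuming, unrealistically, that Googlebot crawls evenly and never revisits a page):

```python
def days_to_full_crawl(total_pages: int, crawl_budget_per_day: int) -> float:
    """Rough estimate of how long a full crawl of the site takes."""
    return total_pages / crawl_budget_per_day

print(days_to_full_crawl(200_000, 2_000))  # 100.0
```

The real picture is messier (Google recrawls important pages far more often), but the ratio still shows why wasted Crawl Budget hurts big sites most.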

First, stop wasting Crawl Budget on unnecessary pages

This is a significant issue for E-Commerce websites. Most E-Commerce sites allow customers to search for items and filter across products. This is fantastic for boosting sales. However, if you’re not cautious, you could end up with THOUSANDS of extra pages that look something like this:

yourstore.com/product-category/size=small&orderby=price&color=green…

Google will happily waste your Crawl Budget on these junk pages unless you take action. The fix is URL Parameters: click the “URL Parameters” link in the old Google Search Console to set these up.

Then press the “Add Parameter” button. You can easily tell Google that any URLs with that color option should not be crawled:

This should be done for all parameters that you don’t want Google to crawl. Also, if you’re new to SEO, consult with an SEO expert to ensure that this is done appropriately. It’s simple to cause more harm than good when it comes to parameters!
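Before configuring anything, it helps to audit which of your URLs carry filter/sort parameters at all. A sketch using Python’s standard library (the parameter names are hypothetical; use the ones from your own store):

```python
from urllib.parse import urlparse, parse_qs

# Parameters that only filter or sort, so crawling them wastes budget
# (hypothetical list; yours will differ).
WASTEFUL_PARAMS = {"size", "orderby", "color"}

def wastes_crawl_budget(url: str) -> bool:
    """True if the URL carries any filter/sort parameter."""
    params = parse_qs(urlparse(url).query)
    return any(p in WASTEFUL_PARAMS for p in params)

print(wastes_crawl_budget(
    "https://yourstore.com/widgets?size=small&orderby=price"))  # True
print(wastes_crawl_budget("https://yourstore.com/widgets"))     # False
```

Running this over your server logs or a crawl export tells you which parameters show up most, which is exactly the list to hand to the URL Parameters tool (or to an SEO expert first).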

See how long it takes Google to download your page

The Crawl Report in Google Search Console shows you how long Google takes to download your pages on average:

The graph above shows a spike, which indicates that Google suddenly took a long time to download everything and this has the potential to wipe out your Crawl Budget.

Get more backlinks to your site

As if backlinks weren’t already fantastic enough, they also help you save money on your Crawl Budget. Crawl Budget increases as the number of backlinks increases.

Get new content indexed (in minutes)

The fastest approach to get new pages indexed is to use URL inspection. Simply paste the URL into the box and hit Enter.

Then click “Request Indexing,” and Google should index your page in a matter of minutes.

Use “URL Inspection” to reindex updated content

Updating your content is important to keep it fresh, and it also boosts Organic Traffic (FAST). Use the “URL Inspection” tool’s “Request Indexing” feature to get your updated content reindexed as soon as possible. Otherwise, you’ll have to wait for Google to automatically re-crawl the page.

Identify Problems With Rendering

The “URL Inspection” tool’s “Test Live URL” feature shows you how Google (and users) see your page. All you have to do now is click the “View Tested Page” button.

Then select “Screenshot” from the drop-down menu. You’ll be able to see exactly how Google renders your page.

Conclusion

This detailed guide provided insights into Google Search Console: how it works, its key features and benefits, and best practices.

Curious about Google Search Console vs Google Analytics? Check out our detailed guide to learn the key distinctions and how each tool can benefit your digital marketing strategy.

If you want to move data from Google Search Console into your desired Database/destination, then Hevo Data is the right choice for you! 

Share your experience of learning about Google Search Console! Let us know in the comments section below!

Harsh Varshney
Research Analyst, Hevo Data

Harsh is a data enthusiast with over 2.5 years of experience in research analysis and software development. He is passionate about translating complex technical concepts into clear and engaging content. His expertise in data integration and infrastructure shines through his 100+ published articles, helping data practitioners solve challenges related to data engineering.