Google Search Console, or GSC – the rebranded tool still familiar to old souls in the marketing community as Google Webmaster Tools – is a versatile piece of marketing software. While SEOs tend to be familiar with it, most marketers, conversion specialists, and website designers stand to gain something from understanding the basics.
At its core, GSC is a tool that …
- allows marketers to see what terms people use to get to the website, and
- provides an avenue to tell Google about how to crawl the site best.
Activating Google Search Console
There are five ways to verify ownership in Google Search Console. You can verify by signing into your domain name provider, or by using one of the four methods below:
1. Google Analytics (GA)
If you also manage the Google Analytics account for the site, and the pages use the asynchronous tracking code, you can use GA to verify ownership in GSC.
2. HTML file upload
You can download a file from GSC, place that file on the root of the site, and get Google to check that the file has been placed to verify ownership.
3. HTML tag
You can add a small meta tag inside your page's <head> section and let Google check that it exists, and you can verify ownership that way.
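The verification tag GSC gives you looks something like this – the content value below is a placeholder, and yours will be a unique string generated by GSC:

```html
<head>
  <!-- Placeholder token: GSC generates a unique value for your property -->
  <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
  <title>Example Page</title>
</head>
```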
4. Google Tag Manager (GTM)
If you also manage GTM and use the container snippet, you can use that to verify ownership of GSC.
As soon as GSC verifies that you own the site, it starts to collect and report data about search terms and crawl rates.
Once you verify that it’s your site, you can also add sub-paths as properties without additional verification. So, if you have registered and verified that you own www.example.com, you can create a property for www.example.com/blog without needing to use one of the five methods mentioned above. The www.example.com/blog path will already be pre-verified.
Learning What People Are Searching For
Arguably the most important feature in GSC is the ability to check what search terms people use to get to your website. This, along with on-site search data and survey data, can tell you about user intent.
Within Google Search Console, you can go to Search Traffic > Search Analytics to see the clicks, impressions, click-through rate, and position for your search terms.
If you’re a brand like LG, for instance, you’d see rows of queries along with their performance metrics. Here’s what each column means:
- Queries – the actual search terms people use to search on Google
- Impressions – the number of times a page on your site appears for the given terms
- Clicks – the number of times people click on a result for your website for the given terms
- CTR – the click-through rate, or the percentage of clicks relative to the impressions
- Position – the average Google position of your pages for the given terms
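Since CTR is just clicks divided by impressions, you can sanity-check the report's numbers yourself. A minimal sketch, with made-up queries and figures purely for illustration:

```python
# Hypothetical GSC rows: (query, clicks, impressions). Figures are made up.
rows = [
    ("lg oled tv", 120, 3000),
    ("lg washing machine", 45, 900),
]

def ctr(clicks, impressions):
    """Click-through rate as GSC reports it: clicks divided by impressions."""
    return clicks / impressions

for query, clicks, impressions in rows:
    print(f"{query}: CTR = {ctr(clicks, impressions):.1%}")
```

For the first row, 120 clicks over 3,000 impressions works out to a 4.0% CTR.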
That data is useful for a slew of different functions:
- You can check if your educational pages are driving early stage terms to your site, and you can see how effective your top of the funnel efforts are
- You can see what terms people are typing in when they either know your brand or are looking for very specific things, and are likely to be at the bottom of the funnel
- You can see which terms in your space you don’t have high enough Google positions for, and can target those terms with new pages you create
Within the “main” GSC profile, you can see three months’ worth of data. If you want to see more than that, you can use the new beta version.
In the new version, you can check up to 16 months’ worth of keyword data. You just need to go to Status > Performance and change the date range to 16 months.
Letting Google Know What to Crawl First
Google’s spiders will crawl your site whether or not you have a sitemap. However, you can tell Google which pages you are prioritizing when you submit a sitemap via GSC. Google has a limited amount of crawl time dedicated to each site. And to make the most of that crawl time, you’d want to tell Google which pages you think the spiders should crawl first.
To start that process, go to Crawl > Sitemaps.
Once you’re there, you can click Add/Test Sitemap and plug in the location of the sitemap you want to submit.
If you can build an XML sitemap that updates periodically, great – you can submit that. Even if you can’t, you can still upload a basic text sitemap somewhere on the root of your site and submit that location to GSC.
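A minimal XML sitemap follows the standard sitemaps.org format; the example.com URLs and the date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>
```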
If you’re using a text sitemap, remember to save it in UTF-8 encoding, with the format below:
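The format is simply one absolute URL per line, with nothing else in the file (the URLs below are placeholders):

```
https://www.example.com/
https://www.example.com/blog/
https://www.example.com/products/
```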
Once you’ve submitted the sitemap to Google, you can wait a while and then check how many of those pages Google has crawled – that should give you a pretty good idea about whether or not you are getting your most important pages seen by Google.
Checking What Pages Look Like to Google’s Spiders
Sometimes, for particular pages, you’d want to check if Google has issues viewing and rendering your pages correctly. For those kinds of problems, GSC has a feature called Fetch as Google.
Once you go to Crawl > Fetch as Google, you can plug in a path on your website and see whether Google can see the page, and whether what Google sees has issues.
It might help you identify whether you have pages to fix.
Checking Blocked Pages
There are probably pages on your site that you don’t want Google to waste its crawl time on. Since Google’s crawl time is limited, you’d want it to spend its time crawling your most useful pages, or at least your valid pages.
There are a few page types you probably don’t want crawled:
- Your 404 page
- Search strings with parameters
- Anything behind a login, where users need account access before they can reach the content
For those types of pages, you’d want to exclude all of those via a robots.txt file.
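A robots.txt file covering those page types might look something like this – the paths are hypothetical placeholders, and you’d swap in the sections your own site actually uses:

```
# Hypothetical robots.txt: replace these paths with your site's actual sections
User-agent: *
Disallow: /404
Disallow: /search
Disallow: /*?query=
Disallow: /account/
```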
You can check whether those exclusions are working under Crawl > robots.txt Tester, and see if you need to make any adjustments.
Putting It All Together
Google Search Console is a very powerful tool if you know what you’re looking for.
Once you’ve verified your account, make sure you’re using at least the basic functions that make the tool worth the hassle of verifying:
- Review the search terms people use to get to your site, and learn intent data for the different parts of the funnel
- Submit a sitemap to Google, and check how many of your most important pages are indexed
- Use diagnostic tools like the robots.txt Tester and Fetch as Google to check whether your site has issues
GSC is pretty handy when you know exactly where to dig.