The Ultimate SEO Audit with Google Search Console – Full Series

Google Search Console (GSC) provides powerful tools for website maintenance and optimization. As we have covered over the last few weeks, GSC makes the ultimate SEO audit possible, showing you what is necessary to correct errors and achieve higher rankings in search engines. GSC not only keeps your business or website sharp and running well, but it also helps drive more eyes to your content.

We have provided a handy summary below of our ultimate guide to an SEO audit with Google Search Console, focusing on each of its sections and tools, plus tips on how to best utilize each.


  • Structured Data to mark up your HTML content
  • Rich Cards for mobile
  • Data Highlighter for easier tagging
  • HTML Improvements to optimize post titles and meta descriptions for SEO
  • Site Links to let search engines know about addendum links other than the home page
  • Accelerated Mobile Pages to help web developers write fast-loading HTML and JavaScript
  • Queries Report for your keyword rankings
  • Pages Report for URL popularity
  • Countries, Devices, Search Type, and Dates Reports to provide more details on where your users are and how they are finding you
  • Links to Your Site to see which external links give your site a vote of confidence from the linking site
  • Internal Links to other articles on your site show that you are writing relevant content
  • Manual Actions are like marks on your license if you are not following the Google “speed limit”
  • International Targeting is your tool to communicate the correct language and geographical preferences to users
  • Mobile Usability tests that all the pages on your site are working properly for mobile
  • Index Status lets you know which pages have been indexed by Google
  • Content Keywords tool helps you understand how well your keywords are performing
  • Blocked Resources to see which page resources the Googlebot is blocked from crawling
  • Remove URLs tool helps remove URLs which were created from user searches and do not add value
  • Crawl Errors to correct failed attempts at indexing pages, or pages that returned 404 or 500 errors
  • Crawl Stats to see when the Googlebot was active and how much it crawled
  • Fetch as Google lets you know if Google can access the page
  • Robots.txt lists which pages are being blocked from the bot
  • Sitemaps to help maintain a higher search ranking
  • URL Parameters to see if the Googlebot is having trouble with parameterized URLs
  • Security Issues to see if you’ve been hacked
  • Other Resources to help you become a great Webmaster!

Additionally, there are many other tools on the web that can assist you, such as Screaming Frog, which provides a comprehensive SEO audit tool for beginners. Other options include Check My Links and SEO Report Card, which also offer free services.

Why You Need to Utilize Google Search Console for Your Business

Successful website owners know that the difference between profit and failure often comes down to improving their Search Engine Optimization (SEO) to attract more business and rank higher on Search Engine Results Pages (SERPs). If you are not an SEO professional, the mountain of information you would have to consume to understand and enhance your ranking effectively can be overwhelming (which is why many businesses hire professional SEO companies, like Netsville, to manage their websites). One way to get started is to take advantage of Google Search Console (GSC).

Formerly known as Google Webmaster Tools, GSC provides a whole bevy of tools and reporting metrics to improve SEO for websites. These tools give webmasters information on how well their website pages are performing in search results.

GSC benefits your website by helping to improve how Google views your site, with tools designed to ramp up its functionality, rectify errors, and strategize content towards a more compelling search engine ranking. It checks on site indexing and broken pages, and gives Google more information about the structured data on your site and how it should be served to search engines. Furthermore, there are tools to help mark up rich content so that Google can categorize and index it properly, and an Accelerated Mobile Pages tool to calibrate HTML for mobile devices.

While there are many aspects of GSC that can be used to improve your website, the best starting point is exploring the tools in the Search Appearance section. Used to illustrate how your site may appear in a search, the Search Appearance section includes tools such as Structured Data, Rich Cards, Data Highlighter, HTML Improvements, Site Links, and Accelerated Mobile Pages.

Structured Data

The Structured Data (or rich snippets) tool marks up your HTML content so it can be understood, organized, and displayed by search engines. The report lets you visualize what your search result would look like because the markup describes your webpage to the search engine in a language it can understand.
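
As an illustration, here is a minimal sketch of this kind of markup in JSON-LD format (which Google recommends), embedded in a page's HTML; the headline, publisher, and date shown are hypothetical:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The Ultimate SEO Audit with Google Search Console",
      "author": { "@type": "Organization", "name": "Netsville" },
      "datePublished": "2016-08-01"
    }
    </script>

With markup like this in place, the Structured Data report can show which item types Google detected on your pages and flag any fields it could not parse.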

Rich Cards

Rich Cards are immensely useful for mobile because they have a Pinterest-like look that makes results more visually appealing than text-based rich content. They have minimal text and are mostly images because users are more likely to click on an image than text. Rich Cards are especially useful if you have a media-heavy website.

Data Highlighter

The Data Highlighter tool teaches “Google about the pattern of structured data on your website” to be used in searches. It tags data when you highlight it on your website without the need for a developer to code it. You can use it to markup articles, events, products, restaurants, and movies. You might notice that these types of tags are for items on your website that change periodically. Using the Data Highlighter allows you to bring them to the forefront in searches.

HTML Improvements

If you have a CMS website like WordPress, you might be using a plugin called Yoast SEO, which makes optimizing for SEO easier if you apply it consistently to each post and page. The HTML Improvements tool provides feedback on how well you've followed the SEO rules, such as writing proper post titles and meta descriptions.
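
For context, the titles and descriptions this report evaluates are ordinary HTML tags in each page's head. A minimal sketch, with hypothetical content:

    <head>
      <title>Chocolate Lava Cake Recipe | Example Bakery</title>
      <meta name="description" content="A rich, gooey chocolate lava cake you can bake in under 30 minutes, with step-by-step photos.">
    </head>

As a rough rule of thumb, titles much beyond 60 characters and descriptions much beyond 155 characters tend to get truncated on the SERP, which is exactly the kind of issue this report flags.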

Site Links

The Site Links tool works to optimize addendum links, such as About and Contact Us, that appear below your main website page link in search results (as seen in the example below).

Since Google decides how and when the links are displayed, making your content more relevant and helpful to searchers will encourage Google to add site links. This alone is one of the biggest reasons to ensure your content is relevant and meaningful to Google's search engine: you gain more real estate on the SERP.

[Image: Google search result for Netsville showing site links below the main link]

For instance, there are 6 extra links below the Netsville main link to other pages on our site. Once you have what Google determines to be enough content on your site, it will display the site link categories automatically below your main page link.

Accelerated Mobile Pages

The Accelerated Mobile Pages (AMP) tool provides template code for web developers to write fast-loading HTML and JavaScript. The number of people using mobile devices to access the web has increased exponentially, and mobile pages need to load as fast as possible or risk users losing interest. AMP gives developers open-source specifications for building mobile pages that are lean and efficient, eliminating the slow load times that drive mobile users away.
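
To give a feel for the template code, here is a trimmed sketch of an AMP page (the canonical URL is hypothetical, and the required <style amp-boilerplate> block is omitted for brevity):

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <!-- The AMP runtime handles fast, asynchronous loading -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <!-- Points back to the regular (non-AMP) version of the page -->
        <link rel="canonical" href="https://example.com/article.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1">
        <title>Example AMP Article</title>
      </head>
      <body>
        <h1>Hello, AMP</h1>
      </body>
    </html>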

By applying AMP, your website will become more mobile-friendly and therefore more competitive in search (which, in the aftermath of Google's Mobilegeddon, is a vital component of good SEO).

Overall, the Google Search Console provides direct insight into what needs to be done to optimize your website for better search engine results and helps clean up any hidden errors that might be keeping your site from attracting a larger audience. For more information, we recommend checking out Quick Sprout's ultimate guide by Neil Patel for a comprehensive breakdown of each section of GSC, including Search Traffic, Google Index, Crawl, Security, and Other Resources.

The Ultimate Way to Determine Your Website’s Search Traffic

We have explored the benefits of Google Search Console’s (GSC) Search Appearance tools but that is just a small part of what makes GSC a necessity for your business. The Search Traffic section, for instance, has its own unique set of tools that optimize your site to gain more qualified traffic and conversions.

The first of these tools is the Search Analytics report, which graphically displays how often your site appears in Google's search results. The metrics include the number of clicks per keyword, impressions, click-through rate (CTR), and site position on the Search Engine Results Page (SERP). Filtering by queries, pages, countries, devices, search type, or dates breaks those metrics down by the chosen grouping.


Queries Report

If you sort by the Queries filter, the report will return your keyword rankings listed from most to least popular. This is valuable feedback, especially if you are trying to determine which keywords are working well. Choosing the right set of keywords is critical to matching your site content to your targeted audience. The information reported when you choose the Position checkbox is extremely important because it shows you the position of each set of keywords in the SERP. This tool makes keyword research much less mystifying by identifying where your content falls short of users' interests.

Pages Report

The Pages report sorts your site's URLs by popularity. This data provides easy-to-see results for determining which pages are the best performers, with the most popular pages indicating what is most relevant to your audience. Improving the content on these pages and building internal links between them will improve your SEO.

The Pages category also gives you the ability to compare the results of different pages on your site. You could use this for A/B testing in your marketing, such as comparing impressions with CTR to visualize where content could improve. For example, if you have a high number of impressions but score a low CTR, it could mean that users were not compelled to visit (click) your site compared to the other search results.

Countries, Devices, Search Type, & Dates Reports

The last four filters of the Search Analytics tool generate reports based on Countries, Devices, Search Type, and Dates. The Countries and Devices categories are valuable if you are an international company or depend on mobile traffic. The Search Type filter reports whether users searched for web results, images, or videos. This type of data is good to have if, for instance, you are a photographer and want people to find your site through the images you post; this is where relevant information in your alt tags is key. Sorting by the Dates filter lets you see which dates performed best, and over time a pattern can emerge of the best dates and times to post.

The Search Traffic section has other useful tools, such as International Targeting and Mobile Usability, which we will feature in upcoming articles. As you can see, the Search Analytics tool is extremely useful for gaining a greater understanding of your site's performance and letting search engines know what your site is all about.


How to Eliminate Search Traffic Errors to Gain Favor with Google

The set of tools for eliminating Search Traffic errors in Google Search Console (GSC) is a big one. Last week, we explored the Search Analytics report, but that was only one of the many tools available in this section. Others include Links to Your Site, Internal Links, Manual Actions, International Targeting, and Mobile Usability.

Links to your Site

When another website links to content on your website, Google looks at it as a vote of confidence from the linking site in the quality of your content. Google uses its PageRank measure (which is no longer public) to gauge the importance of the linked page.

On the dark side are spammy or low-quality websites linking to your site through black-hat SEO techniques. When black-hat sites link to content on your site, Google lowers its view of your website, which also lowers your site's PageRank. To disassociate your pages from such websites, check who has linked to your site using the "Links to Your Site" tool in Google Search Console. You can then disavow any sites that look suspicious, which tells Google that you disapprove of these links and that it should not associate them with your site in the future.
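
Disavowing is done by uploading a plain text file through Google's separate Disavow Links tool; a minimal sketch, with hypothetical domains:

    # Comments start with a hash
    # "domain:" disavows every link from that site
    domain:spammy-directory.example
    # A single offending URL can also be listed on its own
    http://link-farm.example/page-linking-to-us.html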

Internal Links

Writing new blog posts and articles for your website keeps your content fresh, and what you write is most likely related to other content you have written. Linking to past articles on your website that are relevant to the current article demonstrates to Google that those pages are still important. The number of internal links pointing to a piece of content demonstrates the value of that content, and this continues to build as you link posts together, showing the consistency of your overall subject. For example, at Netsville, we write about anything and everything to do with Internet Property Management, which is defined as managing websites, social media, analytics, digital marketing, etc. Because all of those topics come under the same umbrella, we can link many of them together.

Manual Actions

Google wants you to follow their quality guidelines. With the Manual Actions report, Google will let you know if anything on your site has been marked as spam or using black-hat techniques. In these cases, Google can demote or remove your content in search results. The best way to avoid these issues is to follow their Quality Guidelines.

Watch this video from Google’s Search Quality Team to understand more about Web Spam Content Violations.

International Targeting

If you are pursuing an international community on your website, the International Targeting tool is a necessity. The purpose of this report is to ensure that your site delivers content to the correct language and geographic preference of the user (i.e. making sure users in France are receiving the content intended for them).

Google recognizes hreflang tags in your content to serve the pages that are relevant to the user's language. The International Targeting report checks the accuracy of the hreflang tags in your content and offers further instructions on how to fix and optimize the code.
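
For reference, hreflang tags are link elements in each page's head; a minimal sketch assuming hypothetical English and French versions of a page:

    <link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
    <!-- Fallback for users whose language has no dedicated page -->
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />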

Mobile Usability

Remember Mobilegeddon from 2015? It was like Y2K all over again, only this time the warnings were for real. The Mobile Usability tool tests the usability of your website on mobile devices. With mobile web traffic growing faster than desktop traffic (having recently overtaken it globally), it is imperative that every metric in this report comes out positive.

The Mobile Usability report checks your pages for mobile friendliness, including readability and responsiveness among other indicators. If your pages do not meet these conditions, it is likely that users will not return. In addition to utilizing this report, you can test your pages on Google Developers' Mobile-Friendly Test page by individually entering the URLs of your web pages.
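
One of the most common problems the report flags is a missing viewport declaration; the standard fix is a single tag in each page's head:

    <!-- Tells mobile browsers to scale the page to the device's width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">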

How to Use Google Index to Optimize your Site’s Performance

After eliminating Search Traffic errors, the next step is to optimize your site's performance in the Google Index section of Google Search Console (GSC). In general, this section provides you with the tools to fine-tune how you want your site to be indexed, check the significance of your content keywords, ensure that resources the Googlebot may need are not blocked, and remove URLs that are of little relevance in searches.

Index Status

When users search the Internet for something, the results are returned quite quickly. The reason for this is quite simple: Google has already indexed billions of web pages, so it already knows what's out there. When a search is performed, the indexed pages that are most relevant to the search query are checked and returned to the user's search results.

As a website owner, you should know which of your website's URLs have been indexed, which have not, and whether there is a problem. The Index Status tool of GSC reports how many of your website's URLs were found and added to Google's index, how many are blocked intentionally by your robots.txt file, and how many URLs you have removed. If the count of indexed pages ever drops unexpectedly, there could be a problem with the page(s) and you need to investigate. Your graph should show an upward trend in the number of indexed pages, which indicates that Google can access the pages of your site. Of course, to get a website indexed promptly, you should submit a sitemap to Google. You can make one at XML-Sitemaps.com or use a sitemap plugin.


Content Keywords

Have you ever wondered how well your content keywords are performing? The Content Keywords tool of GSC will tell you just that. It returns a list of your most significant keywords with a drill-down link to the pages those keywords appear on. Reviewing the keywords alongside the Search Queries report reveals how Google is interpreting the content of your site. Significance indicates how many times a keyword is found on your site (with the most significant listed first) and on which pages it appears.

If you see results for any unexpected words, such as "sex" or others, you should check your site for hacking. You will also want to check for Crawl Errors if any of your keywords are missing from the list. For example, if you are a restaurant known for sweet treats and cake, one of your significant keywords is likely to be "dessert." If that word is missing from your list, it is likely that Google was not able to crawl and index a number of pages, and checking for Crawl Errors can help solve the problem. Again, submitting a sitemap can help reduce such errors.

Blocked Resources

Without access to some of your pages' resources, the Googlebot cannot index the pages of your site correctly. Your site's robots.txt file should not disallow crawling of resources such as JavaScript, CSS, and image files, because pages may not render or index correctly without them. Check that your robots.txt file is correct so it does not affect your pages' rankings.

At the top of the Blocked Resources report will be a list of files that may be impairing the indexing of your web page. Below the graph is a list of all the pages that may not be indexed properly. To correct this, click on the URL and go through the 3 steps in the pop-up window. The first is to see how Google views your page. The second step is to verify ownership of the blocked resource. The third step is to unblock items from the robots.txt file. More details on these steps are listed on the Blocked Resources Report help page.

Remove URLs

Pages that are not considered useful should be included in the robots.txt file to prevent crawling of content that does not add value to users coming from search engines. For example, if you have a search option on your site and someone uses it to find content, a search results page is created. It's possible that Google will index these pages and think your site is larger than it really is. The search results do not provide added value to your site, and they will affect your PageRank. Varvy.com explains this concept really well.
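
For example, assuming a WordPress-style site where internal searches use the ?s= parameter, a couple of robots.txt lines will keep crawlers away from those search results pages (the /search/ path is hypothetical):

    User-agent: *
    # Block internal search results pages from being crawled
    Disallow: /?s=
    Disallow: /search/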

For more information on GSC’s Index section, check out Google’s Search Console Help pages for more details.

UPDATE: At the time this article was written, it was reported by Google that the Index Status report was broken. A user found that their Sitemap report had increased but the Index Status report had not. Since the two reports usually align and shift together, Google suggests relying on your website's Sitemap report for the time being. Google has not yet shared an estimate for when the Index Status report will be fixed.


How to Smash Nasty Crawl Errors

Once you have optimized and adjusted the way Google indexes your site, you need to check the Crawl section of Google Search Console to correct any problems that were found by the Googlebot. This is a vital step because if the Googlebot has issues with or cannot find a lot of pages on your site, it will think that your site is down or may pose a hazard to visitors. This can severely affect your page ranking if Google doubts the relevance of your website.

Crawl Errors

Crawl Errors are a list of URLs on a website that the Googlebot attempted to index but could not. If these errors appear, there is no reason to be alarmed because many can easily be corrected. They might be 404 Errors, meaning the page doesn't exist or the name was changed, or they could be 500 Internal Server Errors, which indicate something has gone wrong on the website's server.

404 errors are common and not difficult to fix. If you are getting 404 errors, you may have deleted a post or changed the name of a URL, linked to a post that no longer exists on your site, or linked to an external page that is no longer there. If an internal link points to a post that no longer exists, link instead to another post on your site that is meaningful to the subject. In the case of a missing external link, find another external page similar to your subject to replace it. For other 404 errors, sometimes you only need to fix a broken link.
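
When a URL has simply moved or been renamed, the usual fix is a 301 (permanent) redirect from the old address to the new one. A minimal sketch for an Apache .htaccess file, with hypothetical paths:

    # Permanently redirect the renamed post to its new URL
    Redirect 301 /old-post-name/ https://example.com/new-post-name/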

A 500 error is also nothing to panic about, but it is a bit more challenging than a 404 Error. For 500 Internal Server Errors, it is possible that the server had a glitch that has since been corrected and the Googlebot simply needs to recrawl the page. The error can show up for a variety of reasons. The video below contains more information on such crawl errors:

Crawl Stats

The Crawl Stats tool chronicles the Googlebot's activity on your website over the last 90 days. It takes into account all of the files it has downloaded from your website, including CSS, JavaScript, PDFs, and image files. The Crawl Stats report records how many pages were crawled per day, how many kilobytes were downloaded, and how long the downloads took. You will see spikes when you have added a lot of new information or have information Google deems to be very useful. At Netsville, for example, we recently had a large spike for this article.

Fetch as Google

Fetch as Google is a tool that checks whether Google can access a particular page on your site; it returns the response code and HTML for the page. Fetch and Render goes a step further: it accesses the page, renders how it will be displayed, and checks whether the Googlebot can access page resources like images or scripts.


Robots.txt

A robots.txt file directs Googlebot and other search engine robots (web crawlers) on how to crawl and index pages on a website. The robots.txt tester tool shows you the last time Google read the robots.txt file and its code. It can be set to block all web crawlers from all content or from a specific folder on a site.

Moz.com has a very good cheat sheet on the many ways it can be set up. You may want to block crawlers from scanning non-public parts of your site or from indexing duplicate content. You can test pages to see if they are blocked, and you can instruct crawlers to allow or disallow certain pages. The tester also enables you to verify that you are not blocking pages that should be crawled.
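
As a small illustration, here is a sketch of a robots.txt file that keeps all crawlers out of a non-public folder while pointing them at the sitemap (the paths are hypothetical):

    # Rules apply to all crawlers
    User-agent: *
    # Keep non-public pages out of the crawl; everything else stays open
    Disallow: /private/

    Sitemap: https://example.com/sitemap.xml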

At the bottom of the page in this section, you can additionally observe how many syntax warnings and logic errors appear in your file. Syntax warnings are characters or strings of code that are incorrectly written and fail to execute a command. Logic errors are bugs that cause the rules to operate incorrectly. Both need to be fixed so that crawlers interpret your robots.txt file as intended.

Sitemaps

Having a sitemap is beneficial because, if you are continuously updating your content, it will help you maintain a higher search ranking. Submitting a sitemap alerts Google that your content has been updated. Ideally, sitemaps should be updated every time you add new content to your site so that it will be indexed and visible in Google's search as soon as possible. This is especially true if you have a very large website, where you will want to update the sitemap frequently.
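
A sitemap itself is a simple XML file following the sitemaps.org protocol; a minimal sketch with one hypothetical URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/blog/new-article/</loc>
        <!-- lastmod tells crawlers when the content last changed -->
        <lastmod>2016-08-01</lastmod>
      </url>
    </urlset>

Most CMS sitemap plugins regenerate this file automatically whenever you publish, which is one reason we recommend using one.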

URL Parameters

URL Parameters are added pieces of text that appear at the end of a permalink, such as a country code if your website sells products globally. They tell Google that you have multiple web pages containing the same content aimed at different countries, defining that there are differences between the pages (hence the unique parameters). The URL Parameters tool will tell you if the Googlebot is experiencing any problems and whether the URL parameters need to be configured.
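
For instance, parameterized URLs for a hypothetical product page served to different countries might look like this:

    https://example.com/products/widget?country=us
    https://example.com/products/widget?country=fr

The tool lets you tell Google whether a parameter like country actually changes the page content or merely tracks the visit, so the Googlebot can crawl accordingly.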

It is not recommended that you do anything with this section unless you are very familiar with how parameters work. If you have reason to believe that problems exist, contacting an IT technician would be the best option to avoid breaking anything on your site.

Security Issues


If your site has been attacked, you will receive a notice in the Security Issues section. In the event you receive a warning, it is possible that your site has been hacked, and users will see a warning about your site on the search engine results page (SERP). This is another problem where the best solution is to get an IT technician involved.

Other Resources

Google is great about providing extra help when you need it. On the Other Resources page you will find information outside of Google Search Console to help you optimize your site even further, plus a link to Webmaster Academy to advance your knowledge of all of the tools we discussed in this series of articles. Here is the list of all the resources Google provides for help and testing:

[Image: list of Google's Other Resources for help and testing]