Google Search Console provides the data you need to track your website’s search performance and improve rankings, and much of that data is available only through the Search Console itself. This makes it indispensable for online businesses and publishers that depend on search traffic, and its free tools and reports make monitoring your search presence easy.
What is the Google Search Console?
Google Search Console is a free web service hosted by Google that allows publishers and search marketers to track the overall health and performance of their site in Google Search.
It offers an overview of metrics related to search performance and user interaction to help publishers improve their sites and increase traffic.
Google Search Console also allows Google to report detected security issues (such as hacking vulnerabilities) and penalties imposed by the search quality team.
- Monitor indexing and crawling.
- Identify and fix bugs.
- Review search performance.
- Request indexing of updated pages.
- Review internal and external links.
You do not need Google Search Console to rank; using it is not a ranking factor.
However, the usefulness of the Search Console makes it indispensable for improving search performance and increasing website traffic.
Where to start?
The first step to using Google Search Console is to verify site ownership.
Google provides several different ways to verify a site, depending on whether you verify a website, a domain, a Google site, or a site hosted on Blogger.
You can do this in two ways:
- Specify a domain. This way, you will get the complete statistics on the resource, including all protocols, subdomains, and directories.
- Specify a particular URL. Choose this option if you want to limit your statistics to a specific address with a particular protocol.
You can use both at the same time for the same site.
To add a domain, copy the TXT record shown in the new window and add it to your domain’s DNS configuration.
If your DNS provider is not in the list that opens (and most likely it won’t be), go to your registrar’s site, log in to your account, and open the DNS management panel. Find the option to add a new record, select TXT from the list, and paste the copied text.
Then go back to Google Search Console, where you copied the TXT record, and click “Verify”. Most likely, a warning (in red) will appear saying that ownership could not be confirmed. If you did everything correctly, do not worry: DNS changes take a while to propagate, after which the domain will become available in Google Search Console.
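For reference, a DNS TXT verification record usually looks something like the following zone-file line. The token here is purely hypothetical; Search Console generates a unique one for your property:

```
example.com.  3600  IN  TXT  "google-site-verification=AbC123dEf456"
```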
The second option, adding a URL-prefix property, is even simpler and offers several ways to confirm ownership.
A list of added properties will open, with an “Add property” button at the bottom.
If you have added a site before and want to add another one, open the drop-down menu next to the existing property’s address.
Domains registered with Google Domains are automatically verified when you add them to Google Search Console.
Most users verify their sites in one of four ways:
- Uploading an HTML file.
- Adding a meta tag.
- Google Analytics tracking code.
- Google Tag Manager.
Some website hosting platforms limit what content can be uploaded, so they require another reliable way to verify site owners.
This is becoming less of an issue because many hosted services now have an easy-to-use verification process, which is discussed below.
How to verify site ownership
There are two standard ways to verify ownership of an ordinary website, such as a standard WordPress site.
- Uploading an HTML file.
- Adding a meta tag.
Whichever method you choose, you’ll go through the URL prefix property process.
Let’s stop there and admit that the phrase “URL prefix property” means little to anyone but the Googler who coined it.
Don’t let it make you feel like you’re entering a maze blindfolded. Verifying a site with Google is easy.
HTML download method
1. Go to Search Console and open the property selection drop-down list in the upper left corner of any Google Search Console page.
2. In the pop-up window labeled “Select property type”, enter the URL of the site and click the “Continue” button.
3. Select the HTML file method and download the verification file.
4. Upload the HTML file to the root of your site.
Root means https://example.com/. So if the downloaded file is called verification.html, the uploaded file should be reachable at https://example.com/verification.html.
5. Complete the verification process by returning to Google Search Console and clicking “Verify”.
Verifying a standard website by its domain on platforms such as Wix and Weebly is similar to the steps above, except that you will add a verification meta tag to your Wix site.
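For reference, the verification meta tag goes inside the page’s <head> element. The `google-site-verification` name is Google’s; the content token below is hypothetical:

```html
<head>
  <meta name="google-site-verification" content="AbC123dEf456" />
</head>
```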
Duda has a simple approach that uses a Search Console app, which verifies the site on behalf of its users and gets them started quickly.
Troubleshooting with the GSC
Ranking in search results depends on Google’s ability to crawl and index web pages.
The Search Console URL Inspection tool alerts you to crawling and indexing issues before they become a severe problem and pages start dropping out of search results.
URL Inspection tool
The URL Inspection tool shows you whether a URL is indexed and can appear in search results.
For each URL sent, the user can:
- Request indexing of a recently updated web page.
- See how Google discovered the web page (sitemaps and referring internal pages).
- View the date the URL was last crawled.
- Check whether Google uses the declared canonical URL or has chosen a different one.
- Check mobile usability status.
- Check enhancements such as breadcrumbs.
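The same inspection data is also exposed programmatically through the Search Console API’s URL Inspection endpoint. As a rough sketch (the field names follow the documented response shape, so treat them as assumptions and adapt them to your client library), a small helper can flatten the response into a report:

```python
def summarize_inspection(result):
    """Flatten the index-status portion of a URL Inspection API
    response into a small report dict.  Field names are assumed
    from the documented response shape."""
    status = result.get("inspectionResult", {}).get("indexStatusResult", {})
    return {
        "coverage": status.get("coverageState"),
        "crawl_allowed": status.get("robotsTxtState") == "ALLOWED",
        "indexing_allowed": status.get("indexingState") == "INDEXING_ALLOWED",
        "last_crawl": status.get("lastCrawlTime"),
        "google_canonical": status.get("googleCanonical"),
        "user_canonical": status.get("userCanonical"),
    }

# Hypothetical response for illustration; a real one comes from the API.
sample = {
    "inspectionResult": {
        "indexStatusResult": {
            "coverageState": "Submitted and indexed",
            "robotsTxtState": "ALLOWED",
            "indexingState": "INDEXING_ALLOWED",
            "lastCrawlTime": "2024-01-15T08:30:00Z",
            "googleCanonical": "https://example.com/page",
            "userCanonical": "https://example.com/page",
        }
    }
}
print(summarize_inspection(sample)["coverage"])  # Submitted and indexed
```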
The coverage section shows Discovery (how Google discovered the URL), Crawl (whether Google successfully crawled the URL and, if not, the reason why), and Enhancements (the status of structured data).
The coverage section can be accessed from the left-hand menu.
Coverage Error Reports
Although these reports are flagged as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved. For example, in the following screenshot, Google shows a 403 Forbidden server response for almost 6,000 URLs. The 403 error response means that the server is telling Googlebot that it’s not allowed to crawl those URLs.
These errors occur because Googlebot cannot crawl web forum members’ pages.
Each forum member has a member page that has a list of their latest posts and other statistics.
The report contains a list of the URLs that are causing the error.
When you click one of the listed URLs, a panel opens on the right where you can inspect the affected URL.
To the right of the URL itself, there is also a magnifying-glass icon that provides the same inspection option.
Inspecting the URL shows how the page was discovered.
It also shows the following data points:
- Last crawl.
- Crawled as.
- Crawl allowed?
- Page fetch (provides a server error code in case of failure).
- Indexing allowed?
There is also information about the canonical used by Google:
- User-declared canonical.
- Google-selected canonical.
For the forum website in the example above, the important diagnostic information is in the Discovery section.
This section tells you which pages exposed the member-profile links to Googlebot.
With this information, the publisher can write a PHP condition that hides the links to member pages when the visitor is a search engine bot.
Another way to solve this problem is to add a new rule to robots.txt so that Google will not try to crawl these pages.
By eliminating this 403 error, we free up crawl resources for the Googlebot to index the rest of the website.
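As a sketch, a robots.txt rule for the forum example above might look like this (the `/members/` path is hypothetical; substitute the actual path pattern of the member-profile pages):

```
# Keep crawlers out of member-profile pages
User-agent: *
Disallow: /members/
```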
The Google Search Console coverage report allows us to diagnose and fix Googlebot crawling problems.
Correcting a 404 error
The coverage report can also alert the publisher to 404 and 500 series errors, which don’t necessarily mean something is wrong.
A 404 server response is called an error only because the requested page does not exist.
It does not mean that your site is broken.
If another site (or internal link) links to a non-existent page, the coverage report will show a 404 response.
By clicking on one of the affected URLs and selecting the “Check URL” tool, you will see which pages (or sitemaps) link to the non-existent page.
You can then choose whether the link needs to be fixed (if it’s an internal link) or redirected to the appropriate page (in the case of an external link from another website).
Or it could be that the web page never existed and whoever is linking to that page made a mistake.
If the page no longer exists, or never existed at all, a 404 response is the correct one to serve.
Using the GSC’s features
The top of the Google Search Console performance report contains a wealth of information about how the site performs in search, including search features such as featured snippets.
There are four types of search that can be explored in the performance report: Web, Image, Video, and News.
Google Search Console displays the Web search type by default.
Change the displayed search type by clicking “Search type”.
A pop-up menu appears, allowing you to choose the type of search you want to view.
A helpful feature is the ability to compare the performance of two search types on the same graph.
Four measures stand out in the performance report’s top section:
- Total clicks.
- Total impressions.
- Average CTR (click-through rate).
- Average position.
The Total clicks and Total impressions metrics are selected by default.
By clicking the tab for each metric, you can toggle whether it is displayed on the chart.
Impressions are the number of times a site appeared in search results. The user does not have to click a link for the appearance to count as an impression.
Even if the URL is at the bottom of the page and the user never scrolls to that section of the search results, it still counts as an impression.
High impressions are good because it means Google shows the site in the search results.
But the value of the impressions metric is made meaningful by the “Clicks” and “Average Position” metrics.
The clicks metric shows how often users click through from search results to a website. A high number of clicks on top of a high number of impressions is a good thing.
A small number of clicks with a high number of impressions is less desirable but still acceptable: it means the site may need tweaking to attract more traffic.
The click metric is more meaningful when considered together with the metrics of average CTR and average position.
Average CTR is a percentage that reflects how often users click from search results on a website. A low CTR means something needs to be improved to increase visits from search results. The site is operating efficiently if its CTR is higher. This metric becomes more important when considered in conjunction with the Average Position metric.
CTR is calculated using the formula: (number of clicks/number of impressions)*100. For example, if you had 100 clicks and 1000 impressions, your CTR is 10%. This figure can be measured for the entire site or for individual pages.
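The formula above can be expressed as a small helper, guarding against the zero-impressions case:

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage: (clicks / impressions) * 100."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return clicks / impressions * 100

print(ctr(100, 1000))  # 10.0
```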
Search Console’s performance report also allows you to compare numbers, filter them, and visualize them so that you can easily compare the information. For example, you can filter data by query and country to see what people are searching for in a particular region. Or you can check which pages are most frequently viewed from a particular device. Such combinations will give you a more global understanding of your site’s effectiveness – because you can see how different metrics relate to each other.
“Average position” shows the average position of the site in search results. An average position between 1 and 10 is excellent. An average position in the 20s (20-29) means the site appears on the second or third page of the search results. That’s not bad; it just means the site needs more work to push it into the top 10.
An average position beyond 30 may (in general) mean the site could benefit from significant improvements. Or it could mean the site ranks for a large number of keyword phrases at low positions and for a few very good keywords at exceptionally high positions.
Either way, it warrants a closer look at the content. It could be a sign of a content gap: the content that ranks for certain keywords is not strong enough, and a dedicated page for that keyword phrase may be needed to rank better.
All four metrics (impressions, clicks, average CTR, and average position), considered together, provide a meaningful overview of a website’s performance.
The main takeaway is that the performance report is a starting point for quickly understanding how a website performs in search.
It’s like a mirror that reflects how well or poorly the site is performing.
If you click one of the page status blocks, you will see the reasons why URLs were assigned that status. The most common problems are server errors, 404 (missing page) errors, and URLs blocked by robots.txt.
You can filter the indexing results via the drop-down filter. By default, the chart and table show “All known pages”, that is, every URL Google has discovered. You can also choose “All submitted pages” to see only the pages in your sitemaps, or select URLs from a particular sitemap.
With the indexing report, you can see which pages Google has indexed and analyze any problems found during crawling. Crawled pages have one of the following statuses: “Error”, “Valid with warnings”, “Valid”, or “Excluded”.
The “Error” status means Google failed to index the page, and “Excluded” means you either intentionally excluded the page from indexing or it duplicates other content. “Valid with warnings” marks indexed pages that have some problems, and the green “Valid” status means the page was successfully indexed.
If your site has more than 500 pages, be sure to check the indexing report. If you have a small site, you can skip this tool and simply type site:yoursite.com into Google, where yoursite.com is the URL of your site’s home page.
Under the page status blocks, you can see a graph of page impressions that overlaps with the columns of indexing statistics. By analyzing the chart, you can find connections between the number of impressions and the number of indexed URLs.
Performance Report Options
Scroll down to the second part of the Performance page, and you’ll see several dimensions of website performance data.
There are six dimensions:
- Queries: shows the most popular search queries and the number of clicks and impressions associated with each keyword phrase.
- Pages: shows the most effective web pages (plus clicks and impressions).
- Countries: top countries (plus clicks and impressions).
- Devices: shows the most popular devices, divided into mobile, desktop, and tablet.
- Search Appearance: shows the different kinds of rich results in which the site was displayed. It also tells you whether Google displayed the site in Web Light results and video results, along with the associated click and impression data. Web Light results are optimized for very slow devices.
- Dates: orders clicks and impressions by date, sortable in descending or ascending order.
Keywords appear under the Queries dimension of the performance report (as above). The query report shows the 1,000 most popular search queries that brought traffic.
Of particular interest are the low-performance queries.
Some of these queries display small amounts of traffic because they are rare, which is known as long tail traffic.
But others point to web pages that may need improvement: perhaps they need more internal links, or the keyword phrase deserves its own web page.
It’s always a good idea to look at low-performing keywords because some of them may be quick wins that can lead to a significant increase in traffic when the problem is solved.
Google Search Console offers a list of all links leading to the site. This report contains information about your site’s internal and external links.
Here you can see which sites link to you most often – instead of site URLs, you’ll see a list of their root domains: https://www.google.com/ will appear as google.com.
The link report is available at the bottom of the left-hand menu. To get there, click Links > Most Referenced Sites > More.
However, it is essential to note that the link report does not indicate which links help the site rank. It helps you make sure that the sites linking to you are relevant and reliable. Check this report regularly, because links from unknown and spammy resources can harm your site’s ranking; Google recommends disavowing such links to avoid problems.
It is also worth getting rid of links from sites that are not relevant to your niche. For example, if you run a travel agency site, it makes more sense to be mentioned in authoritative travel blogs.
The link report also shows which pages link to certain pages of your resource more often than others. For example, if you click on the “Pages Most Referenced” box, under “Pages with Many External Links,” you will see a list of sites that link to a particular page. Next, you can click on the “Inbound Links” column to see a complete list of pages pointing to a particular page on your site. At the top of the list will be the pages with the most links.
To find out what words other sites use when linking to your site, go to Links > Most Common Link Texts.
The Links report consists of two columns: External Links and Internal Links.
External links are links from outside the website that lead to the website.
Internal links are links that originate within the website and lead to other locations within the website.
The “External Links” column contains three reports:
- Pages with the most links.
- Sites with the most links.
- Top linking text.
The Internal Links report lists the pages most frequently linked to within the site.
Each report (pages with the most links, sites with the most links, etc.) has a link to additional results that you can click on to view and expand the report for each type.
For example, the advanced report for the most frequently referenced pages shows the top landing pages – that is, the pages of the site that are most frequently referenced.
Clicking on a URL will modify the report to show all external domains that link to that page.
The report shows the domain of the external site but not the exact page that links to the site.
A sitemap is usually an XML file, which is a list of URLs that help search engines locate web pages and other forms of content on a Web site.
Sitemap files are especially handy for large sites, sites that are difficult to crawl, and sites that frequently add new content.
Scanning and indexing are not guaranteed. Factors such as page quality, overall site quality, and links can affect whether a site is crawled and pages are indexed.
Sitemaps simply make it easier for search engines to find those pages, nothing more.
Creating a sitemap is easy, because sitemap files are generated automatically by the CMS, by plugins, or by the hosted website platform.
Some hosted website platforms generate a sitemap for each site hosted on their service and automatically update the sitemap when the Web site changes.
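To illustrate what those platforms generate, here is a minimal sketch of a sitemap builder (the URL list is made up for the example):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from (loc, lastmod) pairs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical URLs for illustration only.
xml = build_sitemap([("https://example.com/", "2024-01-01")])
print(xml)
```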
Google Search Console offers a sitemap report and lets publishers submit a sitemap.
To access this feature, click the “Sitemaps” link in the left menu. The Sitemaps section will report any errors with the sitemap.
Google Search Console can also be used to remove a sitemap from its reports. However, it is important to remove the sitemap from the website itself as well; otherwise, Google may remember it and visit it again.
Once submitted and processed, the coverage report will populate the sitemap section to help troubleshoot any issues with URLs submitted via sitemaps.
The Experience section of Google Search Console includes the recently added Page Experience report, as well as the Core Web Vitals and Mobile Usability reports.
Google added the Page Experience report to Google Search Console to help publishers create pages that are as user-friendly as possible. Google evaluates experience based on the following criteria: Core Web Vitals (speed and stability of page loading), usability on mobile devices, absence of security issues, use of HTTPS, and ad quality.
The Page Experience report shows the percentage of good URLs. In this report, you can also see the total number of impressions those good URLs received.
The chart under “Page Experience” shows the percentage of good pages for each day.
What if the Page Experience chart shows a small percentage of good URLs?
Admittedly, the rules here are pretty strict. Google marks a URL as good only if all of the following criteria are met:
- The Core Web Vitals report gave the URL a “Good” status.
- The URL has no mobile usability issues.
- The site has no security issues.
- The URL is served over HTTPS, not HTTP.
- The site has no ad implementation problems.
If a URL fails even one of these criteria, it is marked as failing.
Core Web Vitals report
The Core Web Vitals report assesses how well your site’s indexed pages perform in terms of page load speed and stability. It is divided into two reports, “Mobile” and “Desktop”, and assigns pages one of the following statuses: “Poor”, “Needs improvement”, or “Good”.
The report is based on the following metrics:
- LCP, or Largest Contentful Paint, shows how long the browser takes to render the largest visible element in the viewport, usually an image, a video, or a large text block.
- FID, or First Input Delay, is the time between the user’s first interaction with the page (clicking a link, button, etc.) and the browser’s response to that interaction.
- CLS, or Cumulative Layout Shift, measures how much elements shift during page loading and how quickly the layout stabilizes.
Simply put, the page should load quickly, stay stable, and respond immediately to user actions.
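Google publishes thresholds for each of these metrics on web.dev (assumed current at the time of writing; check there for updates). A small sketch of how a URL’s metric values map onto the report’s statuses:

```python
# Google's published Core Web Vitals thresholds (an assumption;
# verify against web.dev).  Values: (good_max, needs_improvement_max)
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "FID": (100, 300),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless layout-shift score
}

def classify(metric, value):
    """Map a metric value to the report's status buckets."""
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value <= ni_max:
        return "Needs improvement"
    return "Poor"

print(classify("LCP", 2.1))  # Good
print(classify("CLS", 0.3))  # Poor
```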
To see how your site’s URLs perform, open the report and click the “Poor”, “Needs improvement”, and “Good” tabs at the top of the overview page.
Mobile Usability Report
Everyone has faced this problem when browsing a site on a phone: the content is wider than the screen, the text is too small to read, and the buttons sit too close together. This is what Google calls mobile-friendliness (or rather, the lack thereof).
The Mobile Usability report helps detect pages that don’t display well on mobile devices. The chart shows the number of pages with “Error” and “Valid” statuses.
The “Views” option provides information about how the site’s pages are displayed on mobile devices.
By clicking on a specific error, you can get more information, as well as information on how to solve a particular problem, and notify Google of the changes you’ve made.
The Mobile Usability report highlights the following errors:
Incompatible plugins
If your page uses plugins that most mobile browsers do not support, you should redesign it using modern technology, such as HTML5.
Content is wider than the screen
If visitors are forced to scroll horizontally to view all of the page content, you should use relative width and position values for CSS elements.
The font is too small
If the font size is so small that users have to zoom in on the page content, you can set the correct font scaling.
The interactive elements are too close
If buttons, navigation links, or other interactive elements are so close together that it is impossible to click one without hitting another, you need to resize them to make them mobile-friendly.
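As a sketch, the three layout errors above are usually addressed with a viewport meta tag plus a few responsive CSS rules (the selectors and sizes here are illustrative, not prescriptive):

```html
<!-- Tell mobile browsers to match the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1" />

<style>
  /* Keep content inside the screen: size media relative to the viewport */
  img, video { max-width: 100%; height: auto; }
  /* A legible base font size on small screens */
  body { font-size: 16px; }
  /* Give tap targets enough room (~48px is a common recommendation) */
  nav a { display: inline-block; min-height: 48px; padding: 12px; }
</style>
```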
Some of the Google Search Console reports can be found in the Enhancements tab. They include:
- Logos: handy for anyone who uses logo markup; it provides detailed information about the performance of, and errors associated with, your logo.
- Sitelinks search box: shows problems and warnings related to the search box.
- Breadcrumbs: gathers your site’s breadcrumb markup problems and tips for fixing them.
Tools from the old GSC and external tools
These tools were taken from the old GSC and do not yet have a counterpart in the new version. Even though not all of these options are fully functional now, you can still benefit from them.
Targeting by country and language
This report helps you check the use of the hreflang tag on the site and find related errors. It also gives you an indication of which countries are most important to you.
On the Language tab, you can see the number of hreflang tags with errors in the <head> section of your pages, in the HTTP header, and in the sitemap. You can also use this report to check how many errors there are in your site’s language code.
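For reference, hreflang annotations in the <head> section look something like this (the URLs are hypothetical):

```html
<head>
  <!-- One link element per language version, plus an x-default fallback -->
  <link rel="alternate" hreflang="en" href="https://example.com/en/" />
  <link rel="alternate" hreflang="de" href="https://example.com/de/" />
  <link rel="alternate" hreflang="x-default" href="https://example.com/" />
</head>
```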
The output results may be different for users from different countries, and if necessary, you can use this report to specify your target country. But, if you have content in English and you want users from all over the world to read it, you may not want to set a specific country as the only geographic target.
The “Messages” tab in the new GSC gives you access to all alerts. This option provides all the important information and helps improve workflow. Webmasters can now quickly check to see if their sites have new issues, learn about updates, and get tips to improve their Google presence without leaving the home page.
Use the URL Parameters tool if you want to prevent Google from crawling URLs with specific parameters, to avoid duplicate-content problems.
This tool is only worth using if you are an experienced SEO professional and your site meets the following requirements:
- It has more than 1000 pages.
- Many duplicates are indexed by the Google robot, and the repeated pages differ only in their URL parameters.
Configure this tool as carefully as possible, because you might accidentally tell Google to ignore important URLs on your site. A misconfigured URL parameter setting can cause important pages to be excluded from search results.
In this section, you’ll find tools to help improve your site’s structure and user experience. They include reports on ad quality, abuse, and incorrect notifications, as well as testing tools and other resources listed below.
Ad quality report
The ad quality report helps you determine if your ads violate the Better Ads Standards. For example, if a page has auto-play video ads or has too many ads that, from Google’s perspective, are very distracting to visitors. You can use this tool to find out what type of ads you’re showing your visitors and how to improve them.
Abuse report
This report shows instances of abuse that mislead site visitors. These could be fake messages, false alerts, or system dialogues that trick users into clicking on them. The abuse report briefly explains each problem and shows the URL of the page where it was detected. Sometimes the console provides images that show the specific problems on your site.
Incorrect Notifications Report
Incorrect notifications also mislead visitors. They appear in the browser and try to trick users into sharing personal information, or they promote malware or unwanted software. The report presents this information in the same form as the abuse report.
This section includes three tools to help you test and manage your structured data. The first is the “Structured Data Validation Tool,” which shows you how Google analyzes your data and displays it in search results. The second is the “Structured Data Markup Wizard,” which lets you add page element markup to your HTML code. The last is the “Markup Checker Tool for Emails” which allows you to test your markup.
This section of the GSC shows you other tools for analyzing and optimizing your site’s performance. These are Google My Business, Merchant Center, PageSpeed Insights, Custom Search, Google Domains, Webmaster Academy, Google Ads, and Google Analytics.
This section of the Google Search Console allows you to view and manage various settings for your site. You can change your ownership status, change your domain address, manage your site status, limit the frequency of crawl requests, and make other adjustments as needed.
Google Search Console Page Experience Report
The Page Experience report offers data on user interaction with the website, based on site speed. Search Console displays information about Core Web Vitals and mobile usability. This is a good starting point for an overall summary of site speed.
Detailed status reports on results
Search Console reports on rich results through the performance report. Search Appearance is one of the dimensions listed under the chart at the top of the page.
Selecting the “Search Appearance” tab displays click and impression data for the different types of rich results shown in search results.
This report tells you how much rich-results traffic matters to the website and can help determine the cause of certain traffic trends.
The Search Appearance report can help diagnose problems related to structured data.
For example, a drop in rich-results traffic may signal that Google has changed its structured data requirements and that the site’s structured data needs to be updated.
It is a starting point for diagnosing changes in rich-results traffic patterns.
When Search Console data is updated
In the upper right corner of the Google Search Console, you will see information about the latest update. If you have just added your site to the console, it may take about a week for the data to display.
Important: The data in the console may be delayed in loading and therefore differ from the data displayed in other tools. This could be because Google has not crawled your site since the last update, or Google Search Console has taken longer to process additional data.
Exporting data from Google Search Console
Google Search Console also allows you to export this data to Google Sheets, Excel, or CSV format. Cells without values are filled with zeros.
Google Search Console has its limitations: you can download only 1,000 URLs and queries at a time. This limit is a problem for large sites; it complicates data analysis because you have to export more than one file. But the problem can be solved.
The first option is to use the application programming interface, or API. This method is really only suitable for users with a technical background; for everyone else, exporting data this way is complicated and time-consuming.
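If you do go the API route, the key idea is paging through results with a start offset and a row limit. The sketch below isolates that paging loop; `query_page` stands in for the real Search Analytics call (in the official Python client that would be something like `service.searchanalytics().query()` with `startRow` and `rowLimit` in the request body, which you should verify against the current API docs):

```python
def fetch_all_rows(query_page, page_size=1000):
    """Collect every row from a paged Search-Analytics-style endpoint.

    `query_page(start_row, row_limit)` is a stand-in for the real API
    call; adapt it to your client library.
    """
    rows, start = [], 0
    while True:
        page = query_page(start, page_size)
        if not page:
            break
        rows.extend(page)
        start += len(page)
        if len(page) < page_size:  # a short page means we hit the end
            break
    return rows

# Stubbed endpoint returning 2,500 fake rows, 1,000 at a time.
DATA = [{"query": f"keyword {i}", "clicks": i} for i in range(2500)]

def fake_query(start, limit):
    return DATA[start:start + limit]

print(len(fetch_all_rows(fake_query)))  # 2500
```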
You can also export data from Google Search Console using a Google Sheets add-on. Once connected, you’ll get information about positions, queries, clicks, and more right in a spreadsheet that can display up to 5,000 rows, the maximum allowed by the API.
If you’re not very tech-savvy or don’t have a developer on your team, you can get around GSC’s limitations by using third-party services.
In SE Ranking, we collect data from the console and provide it in a convenient format to make it easier for you to analyze or visualize the data.
With SE Ranking, you can export up to 10,000 rows in .xls format and an unlimited number of rows in CSV format. That way, you download the data with a single click, and all the information is collected in one place.
Other tools and reports in the Google Search Console
There are other tools and reports in Google Search Console that are not displayed on the Overview page but are very important too: “Removals”, “Ad Quality”, “Security Issues”, “Manual Actions”, and others.
If you need to quickly remove unwanted content, this tool will help. Just keep in mind that it temporarily blocks the pages you want from showing up only in Google search results, not on the web as a whole. This tool cannot remove URLs from search forever, but you can use it as a first step.
With this tool, you can also check the history of removal requests and see all URLs that have been marked as adult content. But keep in mind that you can’t block a page on a site you don’t own.
You can stop a particular URL from being indexed by using the noindex directive (note that robots.txt only blocks crawling, not indexing).
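For reference, the noindex directive is set with a robots meta tag in the page’s <head> (or an equivalent X-Robots-Tag HTTP header). The page must remain crawlable for the directive to be seen:

```html
<!-- Ask all crawlers not to index this URL.  Do not also block the
     page in robots.txt, or Googlebot will never see this directive. -->
<meta name="robots" content="noindex" />
```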
The Manual Actions report lets you find out whether your site has been penalized for violating Google’s webmaster guidelines. Such actions usually result in pages being demoted or excluded from Google search results, so it is worth keeping a close eye on this section of the console.
Penalties can be imposed for various reasons. For example, if spammy links promoting your business are posted obsessively around the web, Google can demote your site. People working in so-called “gray” niches often drop links to a company’s website into forums or user profiles; while their business becomes more visible, your site gets treated like a spammer’s. That’s why you should always monitor any suspicious link activity.
Google will also penalize a site for publishing low-quality or copied content, and for stuffing too many keywords into the text. Make sure that all keywords appear naturally and that your content is valuable to users.
To see a complete list of manual measures and how to fix them, check out this guide.
The Security Issues report displays information about cases where Google has detected that your site was hacked or contains potentially dangerous content. The problems fall into several categories: hacked content, malicious or unwanted software, and social engineering.
Let’s take a closer look at problems with hacked content. Simply put, this is any material that has been placed on your site without your permission. Hackers can inject malicious text into existing posts or add new pages. Sometimes they do this with CSS or HTML, so they are harder to detect. One of the most common schemes is adding code that redirects users to unwanted spam pages. These could be betting sites, adult sites, or other “gray” resources.
You should monitor this report regularly to ensure the security of your site. If any of your pages get hacked, you will see a list of security problems at the top of the report. Otherwise, there will be a “No Problem” message and a green checkmark.
By the end of 2019, Google had switched all webmasters to the new version of its search console. However, some old tools and reports continue to work even after the old version of Google Search Console was shut down, until Google comes up with improved replacements. Here’s a list of them:
- Targeting by country and language.
- Crawl statistics.
- URL settings.
This entire list can be found in the left-hand menu under “Legacy tools and reports”.
Google Search Console is Good for SEO
In addition to the aforementioned benefits, publishers and SEOs can use Search Console to upload link disavow files and to review penalties (manual actions) and security events such as site hacks, all of which contribute to better search performance.
This is a valuable service that every web publisher who cares about search visibility should take advantage of.