We live in a mobile age, and SEOs no longer focus just on raw data; they focus on the issues affecting the website. With this section of Google Search Console, you can see your site the way Google sees it and find its problems.
What is the Crawl Errors report?
The Crawl Errors report lists the invalid pages on your site. It consists of two parts:
- Site Errors
- URL Errors
These errors can be caused by the site's servers, HTTP error codes, and other factors, 404 errors in particular. Site Errors affect the entire website, and your site's report probably looks like this:
URL Errors are found while crawling the site's URLs in their desktop and mobile versions. This part of the Search Console report for your site probably looks like the one below:
How do you fix crawl errors?
Crawl errors include:
- DNS errors: something on the server side, typically a failure to resolve the domain, has interrupted the crawler's connection to the site.
- Server errors: these often occur when your website receives more traffic than its server can handle.
- Soft 404 errors: pages that behave like error pages but return an HTTP 200 response code instead of a 404. To resolve them, 301-redirect the affected pages to relevant live pages (or return a genuine 404).
- 404 errors: the most common type you will see. A 404 occurs when a page is deleted or moved without a redirect. While 404 errors do not necessarily affect site rankings, they do waste backlinks: if 404 pages have backlinks pointing at them, 301-redirect those pages as soon as possible.
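The buckets above can be sketched in code. This is a minimal illustration, not Search Console's own logic: it assumes you have already fetched each page and, for soft-404 detection, made some judgment about whether the body looks like a "not found" page.

```python
# Sketch: bucket crawl results into the error types listed above.
# `looks_like_error_page` stands in for any heuristic that decides
# whether a 200 response is really an error page (a soft 404).
def classify(status, looks_like_error_page=False):
    """Return the crawl-error bucket for one fetched page."""
    if status >= 500:
        return "server error"
    if status == 404:
        return "404"
    if status == 200 and looks_like_error_page:
        return "soft 404"
    return "ok"

print(classify(503))                               # server error
print(classify(404))                               # 404
print(classify(200, looks_like_error_page=True))   # soft 404
```

Running this over an exported list of crawled URLs gives you the same grouping the report shows, which is handy for bulk triage.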
After months spent speeding up the site and optimizing keywords, fixing 404 errors can be a quick, practical win for a website's SEO.
With a single 301 redirect, the backlinks pointing at the old page count for the site again, and you keep the authority they pass.
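One low-effort way to act on this is to keep a mapping of dead URLs to their replacements and generate the redirect rules from it. The sketch below emits Apache-style `Redirect 301` lines; the paths are hypothetical, and other servers (nginx, IIS) use different syntax.

```python
# Sketch: turn a mapping of dead URLs to their replacements into
# Apache-style 301 redirect rules. All paths here are made up.
redirects = {
    "/old-product": "/products/new-product",
    "/blog/deleted-post": "/blog/",
}

def redirect_rules(mapping):
    """Emit one 'Redirect 301' line per dead URL, sorted for stable output."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in redirect_rules(redirects):
    print(rule)
```

Paste the output into the site's `.htaccess` (or translate it for your server) so every backlinked 404 lands on a live page.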
What is the Crawl Stats report?
The Crawl Stats section of Google Search Console displays red, blue, and green charts. These charts give you data for evaluating how well your site is being crawled.
The Crawl Stats report shows you how often Google crawls your site.
How do you use the Crawl Stats report?
Fast crawling means Google can index your site as quickly as possible, and the more often Google visits your site, the sooner your new content can appear in search results.
If these charts show sharp ups and downs, your site probably has a problem. The Crawl Stats report consists of three parts:
- Pages crawled per day
- Kilobytes downloaded per day
- Time spent downloading a page
The chart for a site with a healthy crawl rate looks like this:
If you see a lot of ups and downs here, you need to investigate. If you have recently added a lot of content to the site, or removed a block that kept Google from crawling a large part of it, the charts in this section are likely to rise sharply.
In the example above, the chart rises abruptly because HTTP was not redirected to HTTPS. As a result, Google crawled both versions of the site.
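If you export the pages-crawled-per-day series, spotting suspicious jumps like the HTTP/HTTPS case is easy to automate. A minimal sketch, with made-up numbers and an arbitrary 2x threshold:

```python
# Sketch: flag days where the crawl rate jumps sharply.
# The daily counts below are invented for illustration.
pages_crawled = [410, 395, 402, 398, 1250, 1180, 420]  # pages per day

def crawl_spikes(series, ratio=2.0):
    """Return indices of days whose crawl count is more than
    `ratio` times the previous day's count."""
    return [i for i in range(1, len(series))
            if series[i] > ratio * series[i - 1]]

print(crawl_spikes(pages_crawled))  # [4] -- day 4 jumps from 398 to 1250
```

A flagged day is only a prompt to investigate: check whether new content, a lifted robots.txt block, or a duplicate HTTP/HTTPS version explains the jump.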
URL Inspection Tool (new version) / Fetch as Google (old version)
What is the Fetch as Google / URL Inspection tool?
The URL Inspection tool is one of our favorite tools because it lets you view your site the way Google does, including the rendered version of a page from Google's perspective.
It has a lot in common with the Fetch as Google tool in the old console. Click Fetch and you will see the HTTP response that Google receives; click Fetch and render and you will also see the rendered version of the page as it appears in a browser.
How do you use the Fetch as Google / URL Inspection tool?
To use the URL Inspection tool in the new console, just paste the URL you want into the search bar at the top:
You will then receive a report as follows:
You can also use Fetch as Google to find hidden, dynamically generated content. After submitting the URL with the Fetch and render option, you will receive a result similar to the one below:
This should not be confused with static content. According to Google's John Mueller, content revealed by clicking a button or tab will not harm the site's ranking; in other words, it's okay to use tabs like the ones below to hide content.
Other uses of this tool:
- Updating an old page
- Implementing a new section on the site
- Introducing a new mobile version
- Updating the robots.txt file
- Implementing rel=canonical tags
- Moving from HTTP to HTTPS
Robots.txt Tester
If implemented incorrectly, the robots.txt file can cause disaster for a site. SEO experts have seen the consequences of not understanding this file: some site owners accidentally keep their entire site out of the index through a mistake in it.
The Robots.txt Tester tool lets webmasters know precisely which parts of the site are blocked from Google's crawlers.
How do you make sure Google's crawlers aren't blocked?
To check, open the Robots.txt Tester tool and enter one of the site's URLs. Clicking Test shows whether that path is allowed to be crawled.
If crawling is allowed, you will get the following message:
If the path is blocked, you will see a red Blocked label instead.
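You can run the same check offline with Python's built-in robots.txt parser. The rules and URLs below are hypothetical; in practice you would feed it your site's real robots.txt.

```python
# Sketch: check whether Googlebot may crawl a path, using Python's
# standard-library robots.txt parser instead of the Search Console UI.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

This is handy for regression-testing a robots.txt change against a list of important URLs before deploying it.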
What is the Sitemaps report?
Sitemaps play the part of stagehands working behind the scenes, so whatever you put in a sitemap can affect the site's SEO. Always pay attention to the warnings given about the sitemap. This section is available in both versions of Google Search Console and looks like this in the new console:
The Sitemaps report in Search Console gives you a glimpse of what's happening with your sitemaps: the errors it lists occur within the sitemap itself and can affect how the site is indexed.
How do you fix sitemap errors?
Don't scare yourself. The best defense against these errors is finding their root cause. The following scenario is one we deal with regularly:
A large website hands us its analytics. They have submitted over 10,000 pages across several sitemaps; it looks great. After just five seconds, though, we find that only a fraction of those pages have been indexed by Google. A few minutes of review reveals that they typed the URLs into the sitemaps in uppercase.
Within two weeks of fixing the problem, 83% of the pages were indexed and site traffic grew by 2,357%.
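A mistake like that is easy to catch programmatically. The sketch below scans a sitemap for URLs whose casing is not all-lowercase, the exact problem in the story above; the sitemap XML and URLs are hypothetical.

```python
# Sketch: scan a sitemap for uppercase characters in URLs.
import xml.etree.ElementTree as ET

SITEMAP = """\
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/Products/Blue-Widget</loc></url>
  <url><loc>https://example.com/blog/hello-world</loc></url>
</urlset>
"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def mixed_case_urls(xml_text):
    """Return sitemap URLs that are not entirely lowercase."""
    root = ET.fromstring(xml_text)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return [u for u in locs if u != u.lower()]

print(mixed_case_urls(SITEMAP))  # ['https://example.com/Products/Blue-Widget']
```

Note that URL paths are case-sensitive, so a flagged URL is only wrong if the server actually serves the page at the lowercase path; the check is a prompt to compare the sitemap against the live URLs.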
What is the URL Parameters tool?
One of the worst-case scenarios for a site is carrying a huge amount of duplicate content. This often happens around launching a new product or running holiday and travel promotions.
How do you solve this quickly? The URL Parameters tool is what you are looking for. It displays all the parameters used on the site; for example, you might see something like this:
How do you use the URL Parameters tool?
With the URL Parameters tool, you can create settings that control how Google treats pages with parameters.
The following example shows how duplicate content is created on the site when the comment-section parameters are not configured correctly.
Professionals know that a canonical tag can prevent duplicate content, but it does nothing to conserve crawl budget.
With the URL Parameters tool, you can manage crawl budget and duplicate content at the same time.
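To see how big your parameter problem is, you can group crawled URLs by their bare path: every path reached by several parameterized variants is a duplicate-content candidate. A minimal sketch with hypothetical URLs:

```python
# Sketch: group URLs that differ only in their query parameters --
# the kind of duplication the URL Parameters tool is meant to manage.
from urllib.parse import urlsplit
from collections import defaultdict

urls = [
    "https://example.com/shirt?color=red",
    "https://example.com/shirt?color=blue&sort=price",
    "https://example.com/shirt",
    "https://example.com/pants?sessionid=abc123",
]

def duplicate_groups(url_list):
    """Map each bare path to its parameterized variants,
    keeping only paths reached by more than one URL."""
    groups = defaultdict(list)
    for u in url_list:
        parts = urlsplit(u)
        groups[parts.scheme + "://" + parts.netloc + parts.path].append(u)
    return {path: variants for path, variants in groups.items()
            if len(variants) > 1}

print(duplicate_groups(urls))
```

Here the three `/shirt` URLs collapse into one group, which tells you the `color` and `sort` parameters need either canonical tags or a URL Parameters setting.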
What will the new version of Google Search Console look like?
By now you have probably understood why Google Search Console is a tool for SEO professionals. But what will the tool look like in the future?
Google has just introduced a new version of the tool and is constantly updating its features. It already offers the following:
- An optimized AMP reporting tool
- Better service with a focus on mobile
- A new index coverage report to better understand site errors
- Better system notifications for new problems
And the list will keep growing as Google continues to update Search Console.