Google Webmaster Tools

Google Webmaster Tools is a free service offered by Google that helps you monitor and maintain your site's presence in Google Search. Google Webmaster Tools is now known as Google Search Console.

We don't have to sign up for Search Console for our site to be included in Google's search results, but doing so can help us understand how Google views our site and how it performs on the search results page. Google Webmaster Tools is also the primary mechanism Google uses to communicate with webmasters.

8.1 Adding a website to Search Console:

On the Search Console home page, click "Start now". We are then asked to select a property type: Domain or URL prefix. We select the type that suits our website, enter our URL, and click "Continue". A pop-up then appears with instructions and a code to verify ownership.


Domain verification involves three steps:

1. In the first step, sign in to your domain name provider (e.g., GoDaddy.com, Google Domains, CrazyDomains.in, etc.).

2. Copy the TXT record that Google provides into your DNS configuration.

3. Then click "Verify". There is also a "Verify later" option, which we can use if we are unable to verify at that particular time.
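
As an illustration, the record added at the DNS provider is a TXT record of the following form; the token shown here is a made-up placeholder, and Search Console generates the real value for your property:

Record type: TXT
Host: @
Value: google-site-verification=abc123-EXAMPLE-TOKEN-xyz789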


In URL-prefix verification, we get one recommended verification method and several other verification methods.

8.2 Recommended Verification:

In this process we get an HTML file with a download option. This HTML file must be uploaded to the website through the hosting dashboard. Once the upload is done, click "Verify". After verification, the Search Console dashboard opens.

In Google Webmaster Tools, the recommended verification method consists of four main steps (an illustrative example follows the list):

1. Download HTML file.

2. Upload it to your website.

3. Confirm the successful upload.

4. Click verify.
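
For illustration, suppose the downloaded file is named google1234567890abcdef.html (a hypothetical name; the real one is generated per site). It must be reachable at the root of the property, for example at https://www.example.com/google1234567890abcdef.html, and its entire content is a single verification line:

google-site-verification: google1234567890abcdef.html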


Note: The recommended method is the one most commonly used for verification.

Other verification methods:
In Google Webmaster Tools, ownership can also be verified using four other methods.


HTML file upload: Upload a verification HTML file to a specific location on your website, then click the "Verify" button. Once verified, we have access to Google Webmaster Tools data for the website.
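
Before clicking "Verify", it can help to confirm the file is publicly reachable. A minimal sketch in Python using the requests library (pip install requests); the file name is the hypothetical one from the example above:

import requests

# Hypothetical verification file; replace with the name Google generated.
url = "https://www.example.com/google1234567890abcdef.html"

response = requests.get(url, timeout=10)
# Verification needs a 200 OK and the google-site-verification line in the body.
print(response.status_code)
print(response.text.strip())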

8. Google Web Master Tools 6

Domain name provider: Select your domain name provider from the drop-down list. Google then provides a step-by-step guide for verification, along with a unique security token to use.
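
To check that the security token has propagated before verifying, the domain's TXT records can be queried directly. A minimal sketch in Python using the dnspython package (pip install dnspython), with example.com standing in for your domain:

import dns.resolver

# Print every TXT record on the domain; the google-site-verification
# token should appear among them once DNS has propagated.
for record in dns.resolver.resolve("example.com", "TXT"):
    print(record.to_text())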

Google Analytics: If the same Google account is used for both Google Webmaster Tools (GWT) and Google Analytics, that account is an admin on the GA account, and the site uses the asynchronous tracking code, then the site can be verified through Google Analytics.
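
For reference, the asynchronous analytics.js tracking snippet has the following well-known form and must sit in the <head> of the home page; UA-XXXXX-Y is a placeholder property ID:

<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
</script>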


Google Tag Manager: If the site already has a Google Tag Manager container snippet installed and the account has sufficient permissions on that container, Tag Manager can be used to verify the website.


After clicking on "Verify", we are taken to the domain manager's sign-in page.


Enter the username and password, click the "Sign in" button, and then click the "Verify" button. Once verification is done, the property is added to Search Console.


8.3 Setting a geo-target location:

Geo-targeting, used in marketing and internet marketing, determines the geolocation of a website's visitors and delivers different content to users based on their location.

The process of setting up geo-targeting is the same for Google AdWords, Yahoo! Search Marketing, and Microsoft adCenter.


The AdWords location-targeting options are classified as follows:

Search

This feature is used to search Google's geo-targeting database, or the map, for the location you would like to target.

Browse

It is used to browse through all the available country, state, and city locations.

Bundles

This feature is used to choose pre-set bundles, typically common country groups or continents.

Custom > Map Points:

Use Map Points to target specific locations, entered as zip codes or as longitude and latitude coordinates.

Custom Shape:

This feature is used for finer-grained geo-targeting needs, letting you define a custom shape on the map.

Custom > Bulk:

Use this feature to type in a large number of locations all at once.


8.4 Search queries analysis:

A search query is what a user enters into a web search engine to access specific information.

The search queries are classified into four types:

Informational queries: Queries that cover a broad topic, for which there may be thousands of relevant results.

Navigational queries: Queries that search for a single website or web page.

Transactional queries: Queries that reflect the user's intention to perform a specific task, such as making a purchase.

Connectivity queries: Queries that report on the connectivity of indexed web pages.

Filtering search queries:

Search query filtering can be classified into four types:

1. Filter out existing keywords and negatives: By this process, we can focus on the most preferable keywords.

2. Filter by conversions to find effective queries

Apply a minimum-conversions filter so that we only see queries that had a positive effect on the business: queries that brought visitors who converted on our website through a contact, a purchase, and so on.

3. Filter by impressions to find low click-through-rate queries

Search queries with many impressions but a low CTR represent missed opportunities. First filter the search queries by a minimum number of impressions, then filter by CTR (a sketch of this filter follows the list).

4. Filter by a specific keyword

Back in the query list, do not neglect the ability to filter by a particular word or phrase even when no other filters are applied. This is extremely helpful for deciding whether we would like to adopt a word as a keyword.
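
A minimal sketch of filter 3 in Python with pandas, assuming the query report has been exported to a CSV with Query, Clicks, Impressions and CTR columns (the file name and thresholds are illustrative):

import pandas as pd

# Load the exported search query report (hypothetical file name).
df = pd.read_csv("search_queries.csv")

# CTR is often exported as a percentage string such as "1.2%".
df["CTR"] = df["CTR"].str.rstrip("%").astype(float) / 100

# Keep queries with enough impressions to matter but a poor click-through rate.
low_ctr = df[(df["Impressions"] >= 100) & (df["CTR"] < 0.01)]
print(low_ctr.sort_values("Impressions", ascending=False).head(20))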

External links report: External links are hyperlinks that lead from one domain to another; this report shows the links from other domains that point to your site.

These links are among the most important factors in ranking a webpage.

Crawl stats and errors:

Crawling: Crawling is the process of analyzing a website by following its links. Search engine crawlers do this to work out the meaning of a page and to make sure relevant information can be found for users' queries.

Crawl stats: This report provides information on Googlebot's activity on a website. It takes into consideration all the content types that are downloaded, such as CSS, JavaScript, Flash, PDF files, and images.

Crawl errors: This report provides details of the site URLs that Googlebot failed to crawl within the website.

The report has two main sections:

Site errors: These are high-level errors affecting the website as a whole. The site errors report covers issues, such as those preventing Googlebot from accessing the site, from the past 90 days.

Site errors are classified into three types:

Server errors: These occur when the host takes too long to respond and the request times out. Googlebot will only wait a certain amount of time for the site to load when visiting it; if the server takes too long to respond, Googlebot gives up on the request.

DNS errors: DNS stands for Domain Name System. DNS errors are the first and most prominent errors, because if Googlebot is having DNS issues it cannot connect to your domain at all, whether through a DNS lookup issue or a DNS timeout issue.

Robots.txt failure: This means that Googlebot cannot retrieve the robots.txt file located at the root of our domain. The robots.txt file is used when we do not want Google to crawl particular pages.

URL errors: These are errors that Googlebot encountered when attempting to crawl particular mobile or desktop pages.

URL errors are classified into five types:

1. Soft 404: A soft 404 is a URL that returns a page telling the user the page does not exist, while not returning a real 404 status code. In other cases, it may show a page with no usable content instead of a "not found" page. It can be described simply as a page without content.

2. 404 error: This means Googlebot tried to crawl a page that doesn't exist on our site. Googlebot finds 404 pages when other sites or pages link to those nonexistent pages.

3. Access denied: Access denied simply means that Googlebot is not allowed to crawl that specific page.

4. Not followed: Not to be confused with the "nofollow" link directive, this means Google did not follow that particular URL. These errors are mostly seen where Google has issues with JavaScript, Flash, or redirects.

5. Server errors & DNS errors: Server errors and DNS errors can also surface as URL errors; to avoid them, follow Google's guidance for the site-level errors above.

8.5 Fixing 404 errors:

The 404 Not Found error can appear for many reasons even when no real problem exists, so a simple refresh will often load the page you were looking for.

Check for mistakes in the URL. Sometimes the 404 Not Found error arises simply because the URL was typed incorrectly.

Move up one directory level at a time in the URL until you find something.

Search for the page in a popular search engine. The URL may simply be wrong, in which case a quick Google or Bing search should get you where you need to go. Once you have found the required page, add or update your bookmark to prevent the HTTP 404 error in the future.

For instance, if there is difficulty reaching the link from a tablet but not from a phone, clearing the browser cache on the tablet may help you reach that URL.

Change the DNS servers used by your computer when the whole site is giving you a 404 error, particularly if the site is available to people on other networks.

Lastly, contact the website directly if everything else fails.
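
When many URLs have to be checked, doing it by hand is tedious. A minimal sketch in Python with the requests library; the URLs listed are placeholders for your own pages:

import requests

# Hypothetical URLs to audit; replace with the pages you want to check.
urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in urls:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        # 404 means "not found"; a 200 on a page with no content may be a soft 404.
        print(url, response.status_code)
    except requests.RequestException as error:
        # DNS failures and timeouts surface here rather than as status codes.
        print(url, "request failed:", error)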

Robots.txt

With a robots.txt file, the website owner gives instructions about their site to web robots.

The basic format for robots.txt:

User-agent: [user-agent name]

Disallow: [URL string not to be crawled]
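
For example, a robots.txt that blocks all crawlers from a hypothetical /admin/ folder while allowing everything else, and that also advertises the sitemap, might look like this (the paths are illustrative):

User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml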