Getting Started with Google Search Console: A Guide

Are you looking for better search engine rankings?

Of course, you can spend your way to success on Google. But that comes with some serious downsides—namely, it’s expensive and the traffic evaporates as soon as you stop spending.

Instead, if you’re low on funds, focus your efforts on organic search engine traffic through SEO, or search engine optimization.



Thankfully, Google has given us a simple tool to understand how it sees your site, what issues might be affecting your traffic, and how you can improve the site for better rankings and results.

That tool is known as Google Search Console.

The tool has been around for a while; it used to be known as Google Webmaster Tools, and before that, Google Webmaster Central.

In 2015, Google rebranded it as Google Search Console—so if you’ve seen different terms, don’t worry. They’re all different names for essentially the same thing.

The great thing about Google Search Console, or GSC, is that it’s completely free. And it’s made by Google itself, so the advice comes straight from the source.

Here’s how you can use GSC to maximize your SEO results.

Adding your website

To start things off, you’ll need to set up a free account with GSC. Then you’ll need to verify that you actually own the site you’re going to analyze.

(Unfortunately, there isn’t a way to check up on your competitors’ sites with this tool. You must prove ownership to inspect a site.)

Start by clicking “Add property” in the dropdown on the left-hand side.

From there, just enter your site name. Remember that the entry is exact, meaning the http:// and https:// versions of your site count as different properties.

Next, you’ll need to verify that you own the site. Google provides a few different ways of doing this.

The recommended method is to add an HTML file to your server. But you can also add a meta tag, edit your DNS settings, or connect to your Google Analytics or Google Tag Manager account.
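For example, the meta tag route boils down to pasting a short snippet into the <head> section of your homepage. The tag below is only a sketch: the content value is a placeholder, and Google generates a unique token for your account when you choose this method.

    <meta name="google-site-verification" content="YOUR_UNIQUE_TOKEN" />

Once the tag is live on your homepage, click “Verify” in Search Console and Google will check that it can see it.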

Dashboard

Once your site is verified, you’ll start seeing data about your website. It can take a few hours before any data appears, but it will start rolling in.

Once it does, you can use a few different tools to explore what Google sees: Overview, Performance, and URL Inspection.

Overview gives you a high-level summary of everything from which keywords you’re ranking for to how much traffic you’re getting.

In addition to that, you’ll see whether Googlebot is running into any crawl errors on your website, the number of sites linking to yours, and how many pages Google has indexed.

With the Performance report, you can see a more detailed breakdown of how your site is doing in Google search: the queries it shows up for, clicks, impressions, and average position.

And with URL Inspection, you can explore any single URL. Just type it into the search bar at the top of the screen, and you’ll get a quick report on how Google sees that URL.

Site index

Like everything else, Google isn’t perfect, so configuring your site properly can help it do a better job of crawling and ranking your website.

When configuring your site, there are a few areas you should be familiar with.

Coverage

There will be some pages on your website that you just don’t want Google to index. These could be private login areas, RSS feeds, or sensitive data that you don’t want people accessing.

On the Coverage tab, you can see a basic report of the pages on your site.

It’s broken into a few categories—pages with an error, valid with warnings, valid, and excluded. You should try to have zero pages with errors or warnings.

The number of valid and excluded pages depends on what you’d like Google to index, and what you want to keep private.

By creating a robots.txt file, you can ask not just Google but all well-behaved search engine crawlers to stay away from the pages you don’t want them getting their hands on.
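To make that concrete, here’s a minimal sketch of a robots.txt file, which lives at the root of your domain. The directory names are just placeholders for whatever you actually want kept out of the crawl:

    User-agent: *
    Disallow: /admin/
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml

The first line applies the rules to every crawler, each Disallow line lists a path you don’t want crawled, and the optional Sitemap line points crawlers at your sitemap.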

However, robots.txt is only a request, not a lock, so for highly sensitive areas of your website you may want to consider password-protecting the relevant directories.

With a robots.txt generator and tester, you can not only create a robots.txt file but also check that it’s correct before you upload it to your server.

Here’s a simple generator from SEOBook.

That check is worth doing, because the last thing you want is to make a mistake and accidentally tell search engines not to index your whole website.

And if you do slip up and find Google indexing pages you don’t want indexed, you can request their removal through this section.

Sitemaps

Next up is sitemaps. A sitemap is basically a “table of contents” for your site that helps Google find every page and understand the site’s hierarchy.

Submitting a sitemap helps Google determine what pages you have on your website so it can index them.

If you don’t submit a sitemap, Google may not index all of the pages on your website, which means you won’t get as much traffic.

Sitemaps have to be submitted in XML format, and a single sitemap file can’t contain more than 50,000 URLs or be larger than 50 MB uncompressed.
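For reference, a bare-bones XML sitemap looks something like the sketch below; the URLs and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
      </url>
    </urlset>

Each <url> entry needs a <loc> with the full address of the page; <lastmod> is optional, but it helps Google spot updated content.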

If you exceed either of those limits, you need to split your sitemap into multiple files and then submit them.
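A common way to handle that split is a sitemap index file: one small XML file that lists your individual sitemaps, which you then submit in Search Console. The filenames here are placeholders as well:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-posts.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-pages.xml</loc>
      </sitemap>
    </sitemapindex>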

If you aren’t technical, you can go to XML Sitemaps to create a sitemap. All you have to do is enter the URL of your homepage and click “Start”.

Once your sitemaps have been uploaded, Google will tell you how many of your URLs are being indexed. Don’t worry: it’s common for Google not to index every single page.

But your goal should still be to get as many pages indexed as possible.

Typically, if pages aren’t being indexed, it’s because the content on those pages isn’t unique, the title tags and meta descriptions are generic, or not enough websites are linking to your internal pages.
