
Taking a shortcut to submit your website to Google

Eugene Serbin (Head of SEO) How-To Articles January 22, 2020
The first thing to realize is that you – as an SEO or a business owner – can do nothing to make sure your brand-new website is indexed in the twinkling of an eye. Face the fact: becoming visible is not about instant gratification.
What you can do, though, is to grease the wheels of the process, showing Googlebot that there’s something it hasn’t seen yet.

Submitting your site to Google sets the stage for further search engine optimization. And even though this does NOT make you rank high right away, you can rest easy knowing you’ve finally got things off the ground.

So, how do you make your site appear on the web?

What is the most headache-free way of helping Google get its hands on your website?

Take it easy: I’m about to shed some light on it.

The natural side of the indexing process

Actually, after rolling out a website, you can leave things as they are. The bots of Google, Bing, and Yahoo are smart enough to find and crawl it on their own.
A word of SEO advice: To check whether your website is already indexed, type “site:” followed by your domain (e.g. site:example.com) into the Google search bar. This way, you will see all the pages the bot has crawled so far.
Indexed Forbes.com pages on Google

However, waiting is what beginners usually do, not realizing they’re missing the boat. They sit on their hands until crawlers deign to do their job, which, by the way, can take weeks or even a MONTH!

With something like 400 new websites entering the web every minute, this timing is perfectly understandable. From the perspective of Google, of course.

From yours, it’s not. As a website owner, you want search engine bots to notice you first when they are mercilessly chasing new pages to crawl.

Fortunately, there’s something you can do to speed things up and get your site indexed sooner.

4 ways to submit your site to search engines and get Googlebot to notice you

The natural indexing process is time-consuming, which is why the manual way of submitting sites to search engines is a more effective business decision. So, roll up your sleeves and do the following.

Let your sitemap work for you

The list of URLs in Semalt’s sitemap to add the website to Google

Think of an XML sitemap as a guide for Googlebot to figure out what’s going on inside your website. It’s a list of URLs that the machine reads like you read your morning paper to catch up on the news.

If you’ve just created a website, building an XML sitemap should be the first thing on your to-do list. It is supposed to include all URLs that you want crawlers to find. After scanning the file, they will know which pages are important to you as well as users.
A word of SEO advice: Sitemaps are a great way to get media content indexed. There are sitemap extensions you can use to point Googlebot to images and videos found on your site.
To build a sitemap, get ready to properly structure all high-priority pages and do some coding. If you’re not good at the latter, it’s better to rely on a generator tool to do this for you (find the best of them in my post on TOP-18 free SEO tools for your toolbox).
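If you do feel like coding it yourself, here is a minimal sketch of what a sitemap generator looks like in Python, using only the standard library. The URLs and the output filename are placeholders for illustration:

```python
# Minimal XML sitemap generator (URLs below are placeholders).
from xml.etree import ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    # The xmlns value is required by the sitemaps.org protocol.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/blog/",
])
```

Real sitemaps often also carry optional tags like `<lastmod>`, but a plain list of `<loc>` entries is already a valid file you can submit.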

Once created, an XML sitemap should be added to your site files and submitted using Search Console, which brings us to the second way of this guide.

Be no stranger to Google Search Console

The performance of Semalt’s website in Search Console
Search Console is a 100% free tool that helps you understand how your website shows up on Google. It comes with useful insights into your search performance, index coverage, and more.

When you have a submit-ready sitemap at hand and wonder how to get your website on Google, go to the Index section in Search Console and hit Sitemaps. Here you will see all the sitemaps you’ve added before, their current status, and the number of URLs they’ve helped to find. To submit another one, simply paste its URL in the bar above.

 Adding a sitemap via Google Search Console to submit a website to Google
A word of SEO advice: If your sitemap lists more than 50,000 URLs or exceeds 50MB in size, you’d better split it into several smaller ones. You can then submit each of them in Search Console.
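Splitting a long URL list into sitemap-sized chunks is a one-liner. Here is a small sketch under the assumption that you already have your URLs in a Python list; the domain is a placeholder:

```python
# Split a long URL list into chunks that fit Google's per-file limit
# (50,000 URLs per sitemap). Each chunk then becomes its own sitemap file.
LIMIT = 50_000

def chunk_urls(urls, limit=LIMIT):
    # Slice the list into consecutive runs of at most `limit` URLs.
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

urls = [f"https://www.example.com/page-{n}" for n in range(120_000)]
chunks = chunk_urls(urls)
print(len(chunks))  # 120,000 URLs split into 3 sitemap files
```

You would then write each chunk out as its own sitemap file and list all of them in a sitemap index file, submitting either the index or each file in Search Console.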
Search Console can also be helpful to find out whether a page is indexed or not. Back to the Index section, click on Coverage to see your valid pages and the ones with errors. The tool will tell you if something is wrong with your URLs in the description part below the diagram.
The list of crawling issues shown in Google Search Console

For faster indexing, you can manually request this on your own via Search Console. Use URL Inspection to check the visibility status of any page and request recrawling from Googlebot if needed.

Use robots.txt to be on the same page with Googlebot

Strange as it sounds, you don’t always need every single page to be indexed. Admin pages and “Order successfully confirmed” pages have little to no SEO value, so crawling them is unnecessary.

A robots.txt file helps you prevent search bots from running rampant on your site. It’s a set of directives that lets you come to an understanding with crawlers and make sure they do their job the way you want them to.

Robots.txt is made up of crawling rules that point bots to the pages they should discover and mark the ones they should stay away from. As simple as that.

 Semalt robots.txt
This file sits at your website root and can be accessed by appending /robots.txt to your domain URL in the address bar.

It usually has two types of rules: Allow and Disallow. Obviously, the former points search bots to the pages they can visit, whereas the latter lists the ones they should skip without causing harm to your SEO.
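Before going live, you can sanity-check how crawlers will read your rules. A quick sketch using Python’s standard-library robots.txt parser; the rules and URLs here are illustrative placeholders:

```python
# Verify Allow/Disallow rules behave as intended before publishing robots.txt.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Regular content stays crawlable; the admin section is blocked.
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/"))   # True
print(rp.can_fetch("Googlebot", "https://www.example.com/admin/"))  # False
```

One design note: robots.txt controls crawling, not indexing as such, so for pages that must never appear in results at all, a noindex tag is the safer tool.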

Robots.txt saves your and Googlebot’s time by blocking low-quality or auxiliary pages from being crawled when you submit your site to Google. With roughly 30% of all pages on the web being duplicates, you’d better focus crawlers on what really matters, and you will become a force to be reckoned with.
A word of SEO advice: Accessing robots.txt is the first thing crawlers do when they find your website. That is why it makes sense to add your sitemap to it to improve crawling and indexing.

Make your glass brim with link juice

3 layers of link juice distribution
Building backlinks is one of the surefire ways not only to get your website on Google but also to make it rank high once it’s indexed.

When you have lots of backlinks placed on already indexed pages, your chances of being noticed shoot up.

Why?

Because Google tends to recrawl valuable pages. And if some of them have a trail of breadcrumbs to your website, search bots are likely to follow and eventually discover it.

Besides indexing, links help your newly created website outrank others. Find more on how this works in my all-encompassing SEO guide.

Another excellent thing about backlinks is that they prove to crawlers that your website is worth indexing faster than pages with no link juice. Plus, gone are the days when getting them was a daunting task, as Semalt’s AutoSEO package can easily make your glass half-full.
Putting it all together: For faster indexing, be sure to build an XML sitemap, wrap it up with Search Console functionality, create a clear-cut robots.txt file, and make more links flow to your website.

5 Most common indexing issues to watch out for after submitting your site to Google

What if you’ve built a clean sitemap and added it to robots.txt, but your website still doesn’t show up? No, it’s not a matter of Reality vs. Expectations.

That can only mean one thing: you’ve done something wrong during your Google website submission.

Hold back your anger, as most indexing issues call for a simple fix. Before you can address them, though, it’s important to dig into the cause:
  • It’s disallowed in robots.txt. First, make sure you haven’t inadvertently hung up the “DO NOT ENTER” sign on your website when laying down crawling rules in your robots.txt. If some top-priority pages are disallowed there, change that for the good of your SEO.
  • It’s excluded by a noindex tag. Have you put noindex into the code? Remove it as soon as possible to let crawlers know that you want those pages indexed. If you are not sure whether this tag is on your website, use Search Console’s URL Inspection tool to check.
  • It’s non-canonical. The rel="canonical" tag defines which one of several similar pages (with the same content) should be indexed. If your webpage is labeled as non-canonical, crawlers will pass it by, so you’d better check this tag for misuse.
  • It’s penalized or banned. This may occur if search engine bots think you’ve messed up on meeting their guidelines. Some of the most frequently reported reasons for being removed from the index include content scraping, spammy activities, etc.
  • It lacks link juice. The more links point to your website, the higher its chances of getting indexed fast. Therefore, if you don’t have any, and you’ve been waiting a month or so to appear on the web, going for link-building might help.
Putting it all together: If you face indexing issues, it's crucial to determine all of them to map out the best way out. From detailed inspection to error fixing, Semalt's services can do the trick for your website.

The bottom line

All in all, there are two routes to follow to submit websites to search engines: a do-nothing way and a manual one.

If you don’t feel like waiting until your competitors eat away at your business, take matters into your own hands.

From creating an XML sitemap and robots.txt to requesting indexing in Search Console – this won’t take you long. As for link-building, it can also be set up in a flash should you know who to turn to.

Once everything is ready, submit your website to as many search engines as possible for more coverage opportunities. In addition to Google, you can’t go wrong with Bing, Yahoo, Yandex, Baidu, etc.

I really hope this will help you go online sooner and understand the indexing process better.

If you know other easy-peasy ways to submit sites to search engines, what are they?

Which ones have you used before? And how long did it take for your website to get indexed?

10 Comments
Archie
Jan 24, 2020, 10:56:07
Thank you, Eugene! Help me sort this out - do you put links to get better rankings or faster indexing?
Eugene Serbin
Jan 25, 2020, 19:22:11
The best part about backlinks is that they work both ways. If Google hasn’t indexed your website yet, and you get inbound links built for it, this will greatly speed up the process. And when it’s indexed, your search engine rankings will NOT be at their all-time low. Backlinks improve your online credibility and foster growth.
Lee Yates
Feb 1, 2020, 16:28:22
I did the same things when launching my online business. It worked out for me. Thank goodness. Just wondering if the workaround is the same for Baidu?
Eugene Serbin
Feb 3, 2020, 07:26:46
Yeah, the whole process is much the same, so it’s almost no different:
  • First off, go to your Baidu Webmaster Tools account (it looks similar to Google Search Console, except everything is in Chinese).
  • Find the Website Submission page in the navigation bar.
  • Then paste your website’s URL and submit it.
Jose Young
Feb 7, 2020, 06:29:19
Hi, it seems like google skipped my website for some reason. i created it 11 days ago and its still not there. could you help plz?
Eugene Serbin
Feb 13, 2020, 13:38:35
The thing is, indexing has no time limits. For some, it’s an overnight process; although, for most of us, it takes weeks. If you have requested indexing in Search Console and made all the steps outlined here, do not fret. I would allow for a little more time.
Amy Morris
Feb 20, 2020, 20:22:10
Totally worth my while... One thing only… how do you add a sitemap to robots.txt? I mean… are there any special instructions or anything like that?
Eugene Serbin
Feb 21, 2020, 09:36:46
This shouldn’t be tricky. If you already have a sitemap, just locate it and add its path to your robots.txt file before listing Disallow and Allow URLs. It should look like this: Sitemap: http://www.your-domain.com/sitemap.xml
Mac Kidwell
Jan 29, 2020, 13:47:13
Clearly explained! A huge chunk of information in a pretty concise article!
Eugene Serbin
Feb 1, 2020, 12:42:45
Very kind of you. Thank you!
© 2013 - 2020, Semalt.com. All rights reserved