Can I use robots.txt to optimize Googlebot's crawl?

Today's question comes from Blind Five Year Old in San Francisco, who wants to know: can I use robots.txt to optimize Googlebot's crawl?

For example, can I disallow all but one section of a site for one week to ensure it is crawled, and then revert to a normal robots.txt?
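For concreteness, the kind of temporary robots.txt the question describes would look something like this. The section path is hypothetical; Googlebot does honor the Allow directive, with the most specific matching rule winning:

```
# Hypothetical robots.txt that blocks everything except one section.
# This is the approach being asked about, not a recommendation.
User-agent: Googlebot
Allow: /red-widgets/
Disallow: /
```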

Oh, Blind Five Year Old.

This is another one of those "no" kind of videos.

I swear I had completely brown hair until you asked this question, and then suddenly gray just popped in like that.

That's where the gray came from.

Really?

So no, please don't use robots.txt in an attempt to sort of shunt Googlebot all the way over to one section of a website, but only for a week.

Although we try to fetch robots.txt on a sort of daily basis, or once every few hundred fetches, to make sure that we have an accurate copy of robots.txt, weird things can happen if you're trying to flail around and change your robots.txt really fast.

The other thing is, that's really not the best mechanism to handle it.

Robots.txt is not the best way to do that.

Suppose you want to make sure a section of, say, ten pages gets crawled.

Well, it's much better to take those ten pages and link to them from your root page and say, hey, our featured category this week is red widgets instead of brown widgets or blue widgets, and then just link to all ten of the red widget pages. Most of your PageRank typically comes in on the root page of your site, because most people typically link to the root of your website.

If you put the links to the pages that you care about right up front and center on that root page, the PageRank flows more to those pages than to the rest of the pages on your site, which might be five or six or seven links away from the root page.
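To put a number on that intuition, here is a minimal sketch using a toy, simplified PageRank power iteration over a hypothetical six-page site. It illustrates the general principle only, not Google's actual algorithm, and all page names are made up:

```python
# Toy PageRank power iteration (simplified model, damping factor 0.85).
# Hypothetical site: "featured" is linked straight from the root,
# while "deep" sits four clicks away down a category chain.
links = {
    "root":     ["featured", "level1"],
    "featured": ["root"],
    "level1":   ["level2", "root"],
    "level2":   ["level3", "root"],
    "level3":   ["deep", "root"],
    "deep":     ["root"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page in sorted(rank, key=rank.get, reverse=True):
    print(f"{page:10s} {rank[page]:.3f}")
```

In this toy graph, "featured" ends up with roughly three to four times the score of "deep", purely because it is linked directly from the root rather than buried several clicks down.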

So what I would say is, you could try using robots.txt.

I really don't think it would work.

You would be much more likely to shoot yourself in the foot by trying to jump around and swap out a different robots.txt every week.

What's much better is instead to work on your site architecture: rearchitect things such that the parts of your site that you want to highlight, where you would like more PageRank and more crawling, are linked to more directly or more closely from your root page, and that will lead Googlebot more into that part of your site.

So please don't try to just swap in and out different robots.txt files and sort of say, okay, now you get to crawl this part of the site this week and this part of the site next week.

You're much more likely to just confuse Googlebot, and Googlebot might say, you know what?

Maybe I just won't crawl any of these pages.

This seems very strange to me.

So that's the way that I'd recommend handling it instead.

Change your site architecture and make your site more crawlable that way.