Should I disallow crawling of all of my site's JavaScript files?

Today's question comes from Zurich, in Switzerland.

John Mueller wants to know, "Googlebot keeps crawling my JavaScript and thinking the text in my scripts refers to URLs.

Can I just disallow crawling of my JavaScript files to fix that?"

Well, you can disallow all JavaScript.

But I really, really, really would not recommend that.

If there's perhaps one individual JavaScript file that's the source of the problem, you could disallow that.
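
For instance (the user-agent and path here are just placeholders), a robots.txt rule like this would block a single script while leaving the rest of your JavaScript crawlable:

User-agent: Googlebot
Disallow: /scripts/problem-widget.js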

But in general, it can be really helpful for Google to be able to fetch, process, and execute that JavaScript to learn what other links are on the page.

So in general, I would not block JavaScript, and I would not block CSS, those sorts of things.

It turns out, as we're executing JavaScript, we do look at the attributes.

So you can actually use JavaScript and put, like, a nofollow attribute on individual URLs.

And so it is the sort of thing where you can get link-level granularity there.
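
As a rough sketch (the URL and anchor text are only illustrative), a link that your script writes into the page can carry that attribute directly:

var link = document.createElement('a');
link.href = '/example-page';       // illustrative URL
link.rel = 'nofollow';             // tells Googlebot not to follow this particular link
link.textContent = 'Example page';
document.body.appendChild(link);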

And you can block, for example, an individual JavaScript file.

But in general, I would not recommend blocking all of your JavaScript files.

Those can be really helpful for Google to understand your site and be able to find more pages deeper within your site.

So I wouldn't recommend blocking them.