Have you been getting this error message?

Over the last few days, many WordPress website owners and managers have been receiving this error from Google:

Googlebot cannot access CSS and JS files on http://example.com/

To: Webmaster of http://example.com/,

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.

Example email from Google Webmasters

What does it mean?

The robots.txt file lives in the root of your website directory, and controls what robots (bots or spiders) are able to see on your website.

In these cases, the robots.txt file is blocking resources that Google’s search spider needs to see, and that’s what Google is asking you to fix.
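For illustration (the path here is made up), a robots.txt file pairs a User-agent line, which names a bot, with Disallow lines listing the paths that bot may not crawl. An asterisk means the rule applies to every bot:

User-agent: *
Disallow: /private/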

Here’s an example of what Google sees (on the left) and what the user sees:

What Google’s search spiders see, vs what your readers see

How to fix “Googlebot cannot access CSS and JS files”

The first stage is seeing what’s being blocked.

Go to Google Webmaster Tools and navigate to Crawl > Fetch as Google, then click “Fetch and Render”.

In a few seconds, your homepage will appear in the list below. Click it to open the details page.

Under the screenshots you’ll find a list like this:
Seeing what blocks Google's bots

Each of those ‘blocked’ resources needs to be unblocked. The two most common culprits are the wp-includes and wp-content directories, as in the example below.
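If your report looks like that, your robots.txt probably contains rules along these lines. This was a common recommendation in older WordPress guides, and it’s exactly the kind of blocking that triggers Google’s warning:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/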

Now that we know what to look for, open your website in an SFTP client, or use the file manager in your hosting control panel.

In the ‘root’ or main directory you might see robots.txt. If you do, open it.

If you don’t see it, don’t panic! There are no ghosts in the machine. When no physical file exists, WordPress generates a virtual one on the fly. Create a new file called robots.txt and it will override the WordPress default.

If the file is new or blank, add these lines:

User-agent: *
Disallow: /wp-admin/

If there is existing content, add these lines at the end of the file:

User-agent: Googlebot
Allow: /
Disallow: /wp-admin/

This says: if Googlebot shows up, it may access every directory and file on the public site apart from the wp-admin directory. It mirrors the default WordPress behaviour of blocking all bots from wp-admin, while giving Google explicit permission to fetch everything else, including your CSS and JS files.
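Put together, the finished file might look like this (assuming your original file contained only the default rule):

User-agent: *
Disallow: /wp-admin/

User-agent: Googlebot
Allow: /
Disallow: /wp-admin/

A crawler obeys only the most specific User-agent group that matches it, so once Googlebot finds its own group it ignores the * rules entirely.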

Save your robots.txt file and then test again through the Google Webmaster Tools suite.

Troubleshooting still-blocked resources

If your robots.txt was empty or new, you should be away laughing. However, if it was complex, there might still be problems. Here’s how to check and solve each conflict.

How to diagnose ‘blocking’ lines in your robots.txt file.

In the “Fetch as Google” details screen for your homepage (see instructions above), click the robots.txt Tester link. This opens a new tab or window with a test suite that highlights, in red, the line blocking access. Remove that line and test again, repeating until nothing is flagged, as in the example below.
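For instance (an illustrative file, not necessarily yours), suppose the tester flags the wp-includes line in red:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Deleting just that line, and leaving the rest intact, should turn the test green:

User-agent: *
Disallow: /wp-admin/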

Once you’ve worked your way to a green ‘success’ line, copy and paste the updated rules back into the robots.txt file on your server; the tester doesn’t save changes to the live file for you.

If you have any problems, fill out the form below and perhaps one of our engineers will be able to help.
