Over the last few days, many WordPress website owners and managers have been receiving this error from Google:
Googlebot cannot access CSS and JS files on http://example.com/
To: Webmaster of http://example.com/,
The robots.txt file lives in the root of your website directory, and controls what robots (bots or spiders) are able to see on your website.
In these cases, the robots.txt file is blocking what Google’s search spider is able to see, and that’s what Google is asking you to fix.
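For illustration only (your file will differ), a robots.txt carrying old-style WordPress rules like this hypothetical one will trigger the warning, because theme and plugin CSS and JS files live inside the disallowed directories:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```

Googlebot obeys those Disallow lines, so it can’t download the stylesheets and scripts it needs to render your pages.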
Here’s an example of what Google sees (on the left) and what the user sees (on the right):
The first stage is seeing what’s being blocked.
Go to Google Webmaster Tools and navigate to Crawl > Fetch as Google, then click “Fetch and Render”.
In a few seconds, your homepage will appear in the list below. Click this to go into the details page.
Under the screenshots you’ll find a list like this:
Each of those ‘blocked’ resources needs to be unblocked. On a typical WordPress install, the two most commonly blocked directories are /wp-includes/ and /wp-content/.
Now that we know what to look for, let’s open our website in an SFTP tool, or use the file manager in your hosting control panel.
In the ‘root’ or main directory you might see a file called robots.txt. If you do, open it.
If you don’t see it, don’t panic! There are no ghosts in the machine. If no physical file exists, WordPress automatically serves a virtual one. Let’s create a real file to override the WordPress default: create a new file and call it robots.txt.
If the file is new or blank, add these lines:
User-agent: *
Disallow: /wp-admin/
If there is existing content, add these lines at the end of the file:
User-agent: Googlebot
Allow: /
Disallow: /wp-admin/
This says that if Googlebot shows up, it can access every directory and file in the public directory, apart from the wp-admin directory. This mirrors the default WordPress behaviour of blocking all bots from wp-admin, but gives Google specific permission to access the files.
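If you’d like to sanity-check rules like these before re-testing with Google, Python’s standard-library robots.txt parser can do it locally. (One caveat noted in the comments: urllib.robotparser applies rules first-match-in-order, whereas Google itself uses most-specific-match semantics, so results can differ on overlapping rules.)

```python
# Check robots.txt rules locally with Python's standard library.
# Caveat: urllib.robotparser applies rules in file order (first match
# wins), so the "Allow: /" line also matches /wp-admin/ for Googlebot
# here; Google's own crawler uses most-specific-match instead.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/

User-agent: Googlebot
Allow: /
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot can now reach theme and plugin assets:
print(parser.can_fetch("Googlebot", "/wp-content/themes/mytheme/style.css"))
# Other bots are still covered by the catch-all group:
print(parser.can_fetch("SomeOtherBot", "/wp-admin/install.php"))
```

The path "/wp-content/themes/mytheme/style.css" is just a made-up example asset; substitute any of the resources Google listed as blocked for your site.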
Save your robots.txt file and then test again through the Google Webmaster Tools suite.
If your robots.txt was empty or new, you should be away laughing. However, if it was complex, there might still be problems. Here’s how to check and solve each conflict.
In the “Fetch as Google” details screen for your homepage (see instructions above), click the
robots.txt Tester link. This will open a new tab or window with a test suite and highlight (in red) the line that’s blocking access. Remove this line and test again, until complete.
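A rough local version of what the tester does can be sketched in a few lines of Python: report which Disallow lines match a given path. This is a simplified sketch using plain prefix matching only; it ignores user-agent groups and the wildcard rules that Google’s real tester understands.

```python
# Simplified robots.txt line checker: reports which Disallow lines
# would block a given path (prefix matching only; no user-agent
# grouping or wildcard support).
def blocking_lines(robots_txt, path):
    """Return (line_number, rule) pairs whose Disallow rule matches path."""
    hits = []
    for number, raw in enumerate(robots_txt.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line.lower().startswith("disallow:"):
            continue
        rule = line.split(":", 1)[1].strip()
        if rule and path.startswith(rule):
            hits.append((number, line))
    return hits

robots = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

print(blocking_lines(robots, "/wp-includes/js/jquery/jquery.js"))
# [(3, 'Disallow: /wp-includes/')]
```

Run it against your own robots.txt contents and the blocked resource paths from the Fetch as Google report to see which lines to remove.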
Once you’ve worked your way to a green ‘success’ line, copy and paste the updated code into your robots.txt file.
If you have any problems, fill out the form below and perhaps one of our engineers will be able to help: