Googlebot cannot access CSS and JS files in WordPress

Did you get a message from Google Search Console saying “Googlebot cannot access CSS and JS files”? The message links to instructions on how to fix the issue, but those instructions can look complicated. If you blog on WordPress, it’s important to know how to resolve the “Googlebot cannot access CSS and JS files” warning yourself.

Why does Google access CSS and JS files?

JavaScript and CSS files help Google understand how your website actually looks and behaves.
Google reads these files to evaluate the user experience and speed of a website. If Googlebot cannot access them because of restrictions in your robots.txt file, Google’s algorithms cannot render and index your content properly, and your rankings can suffer. That is exactly what the warning means: Google’s systems have detected an issue with your homepage that affects how well they render and index your content.

By default, WordPress does not block search bots from accessing any CSS or JS files. However, you may accidentally block them while adding extra security measures, or by using a WordPress security plugin that restricts Googlebot from accessing CSS and JS files, which may affect the SEO of your website.

Allow Google to access your WordPress CSS and JS files

Log in to Google Search Console and find out which files Google is unable to access on your website.

Click on Crawl » Fetch as Google. Next, click on the Fetch and Render button (you want to do this for both Desktop and Mobile).

Fetch as Google

After fetching, the result will appear in a row below. Clicking on it will show you what a user sees and what Googlebot sees when it loads your site.
Details of fetch attempt

You can also find a list of these blocked resources under Google Index » Blocked Resources.
Blocked resources

Clicking on each resource will show you links to the actual files that Googlebot cannot access.

Most of the time, these CSS and JS files are added by your WordPress plugins or theme.
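
The URLs of blocked resources typically look something like these (hypothetical examples; your report will list your own domain, plugin, and theme names):

http://example.com/wp-content/plugins/contact-form/css/styles.css
http://example.com/wp-content/themes/my-theme/js/script.js
http://example.com/wp-includes/js/jquery/jquery.js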

Now you will need to edit your site’s robots.txt file, which is how a website provides instructions to web crawling bots.

You can edit it by connecting to your site using an FTP client like FileZilla. The robots.txt file will be in your site’s root directory.

Editing robots.txt file

You will most likely see that your site has disallowed access to some WordPress directories like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

Now you need to remove the lines that block Google’s access to CSS or JS files on your site’s front-end. Typically these files are located in the plugins or themes folders. You may also need to remove the wp-includes line, because many WordPress themes and plugins call scripts located in the wp-includes folder, such as jQuery.
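
For example, after removing the lines that block front-end assets, a cleaned-up robots.txt could look like this (a minimal sketch; keeping wp-admin blocked is a common choice, not a requirement):

User-agent: *
Disallow: /wp-admin/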

Some users may notice that their robots.txt file is either empty or does not exist at all. If Googlebot does not find a robots.txt file, it automatically crawls and indexes all files.
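
An empty or missing robots.txt file is equivalent to explicitly allowing everything, which you can also state in two lines (an empty Disallow rule blocks nothing):

User-agent: *
Disallow: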

Then why are you seeing this warning?

On rare occasions, some WordPress hosting providers proactively block bot access to default WordPress folders. You can override this in robots.txt by explicitly allowing access to the blocked folders.

User-agent: *
Allow: /wp-includes/js/
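
The rule above only unblocks the JavaScript folder inside wp-includes. If the blocked resources report also lists files under your plugins or themes folders, you can add Allow rules for those paths too (a sketch; match the paths to the resources actually shown in your report):

User-agent: *
Allow: /wp-includes/js/
Allow: /wp-content/plugins/
Allow: /wp-content/themes/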

Once you are done, save your robots.txt file and upload it back to your server. Then visit the Fetch as Google tool again and click on the Fetch and Render button. Compare your new fetch results with the earlier ones, and you should see that most of the blocked resources issues have disappeared.
