If you recently received a Google Search Console email saying “Googlebot cannot access CSS and JS files”, don’t worry: it’s easily fixed.
In the past, when search engines weren’t particularly bright, the robots.txt file on the server was used to stop them from crawling areas of your site that would cause issues. For example, duplicate content could easily arise when the same page was reachable through different web addresses, and backend admin areas aren’t particularly useful if they turn up in search results. So the robots.txt file would block crawlers from entering those areas.
As Googlebot has grown, it’s become a lot more WordPress-wise.
Webmasters recently received: “Googlebot cannot access CSS and JS files”
If your site has been blocking Googlebot from accessing those files, then it’s a good thing you know about it so you can deal with the issue.
There’s an easy fix for it, which involves editing your site’s robots.txt file. If you’re comfortable editing that file, then go ahead with this fix. If you are not, then we can easily do it for you.
Simply access your server and locate your robots.txt file. Then simplify the file by removing the following lines of code:
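The exact snippet isn’t reproduced here, but on WordPress sites the directives that trigger this warning typically look like the following (an illustrative example, not your site’s exact file):

```
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
```

Those `Disallow` lines cover the folders where WordPress keeps the CSS and JS files that Googlebot needs.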
That’s what’s blocking Googlebot from crawling the files it needs to render your site as other users can see it.
You should now find that this resolves the issue. You can check that it worked by using the Fetch as Google tool in Google Webmaster Tools: https://www.google.com/webmasters/tools/googlebot-fetch
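Before reaching for Fetch as Google, you can also sanity-check a robots.txt locally. This is a small sketch using Python’s standard `urllib.robotparser`; the site URL and asset path are hypothetical, and the rules are passed in as text rather than fetched from a live server:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules that block a WordPress asset folder
blocking_rules = [
    "User-agent: *",
    "Disallow: /wp-includes/",
]

rp = RobotFileParser()
rp.parse(blocking_rules)
# Googlebot is denied the script it needs to render the page
print(rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.js"))  # False

# The same check after the Disallow line has been removed
rp_fixed = RobotFileParser()
rp_fixed.parse(["User-agent: *", "Allow: /"])
print(rp_fixed.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.js"))  # True
```

This only tests the rules themselves; Fetch as Google remains the authoritative check, since it renders the page exactly as Googlebot sees it.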