Why All SEOs Should Unblock JavaScript & CSS… And Why Google Cares


Recently, Google issued a notice informing webmasters that they should not block their .js and .css files. The reason: Google’s web crawler, GoogleBot, cannot access those files on the many websites that block those particular extensions, and blocking GoogleBot’s access can significantly decrease your Google search ranking. Fortunately, there is an easy fix, and Google has provided the information you need to correct it.

What Are .js and .css?

These are files used in web development; the extensions .js and .css denote specific file types. JavaScript source code files (.js) handle a website’s interactive behavior, such as automatically updating a formatted date, making a page link open in a new window, or causing text and images to change when the mouse rolls over them. Cascading style sheets (.css) are plain text files that control how your website is displayed, determining factors such as fonts, sizes, colors, borders, and spacing. If these files are blocked and Google cannot see them, it is easy to understand how that could seriously affect your ranking.
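For illustration, here is a minimal sketch of the kind of thing each file typically contains (the file names and element IDs are hypothetical, not taken from any particular site):

    /* styles.css: sets the font, color, and spacing of headings */
    h1 { font-family: Georgia, serif; color: #333333; margin-bottom: 12px; }

    // script.js: swaps an image when the mouse rolls over it
    document.getElementById("logo").addEventListener("mouseover", function () {
      this.src = "logo-hover.png";
    });

Small as they are, files like these are what let GoogleBot render your pages the way a visitor’s browser does.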

Why Are They Blocked?

Having a website with blocked .js and .css files is rather common. The Internet is evolving, and being mobile friendly matters more and more; in fact, we discussed this in detail in a previous article. Google may not even be able to determine whether your site is mobile friendly when your .js and .css files are blocked, which can be costly to your SEO efforts. Because of these changes, search engines are no longer focused solely on the text content of your pages, as they were in the past; the more information a search engine has about your site, the better. The issue, and the fix, lie in your robots.txt file. This is the file that tells search engine crawlers which parts of your site they may access. If your robots.txt file blocks these files, the robot cannot see them.
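For example, a robots.txt file containing entries like these (the directory names are hypothetical) hides entire script and stylesheet directories from every crawler, including GoogleBot:

    User-agent: *
    Disallow: /js/
    Disallow: /css/

Rules like these are often added with good intentions, such as keeping "non-content" files out of search results, but they also keep GoogleBot from rendering the page.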

How to Fix It


If you received a notice from Google stating that it is unable to see your files, the notice also includes instructions for fixing the issue. If you have not received the message, the fix is still relatively straightforward. First, use the Fetch as Google Tool to see whether your website is blocking resources, and which ones. Next, edit your robots.txt file and change the blocked resources from “disallow” to “allow”. Blocked files appear as disallow: [path]; change the word disallow to allow for the resources the Fetch as Google Tool flagged. More information on the robots.txt file is available in Google’s documentation.
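As a sketch, assuming the hypothetical blocked directories from the earlier example, the before-and-after entries would look like this:

    Before:
    User-agent: *
    Disallow: /js/
    Disallow: /css/

    After:
    User-agent: *
    Allow: /js/
    Allow: /css/

Alternatively, simply deleting the Disallow lines has the same effect, since crawlers may access anything your robots.txt does not disallow.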

Once your robots.txt file is updated, test it with the Test robots.txt Tool before uploading it to your site. If there are no issues, upload the updated file to your website and run the Fetch as Google Tool once more to validate the changes.

Google has been warning against blocking these files for some time, so the concept is not new. However, the growing weight of mobile-friendliness in Google’s ranking algorithm makes fixing your files more urgent. By unblocking your .js and .css files, you ensure that GoogleBot can see your site and determine whether it is mobile-friendly.

Source : Cohlab

#technology #seo #google #news #share #development