Written By: Adam Dince
Oftentimes, Google will notify webmasters through its Search Console product about issues it experiences crawling (navigating and reading) a website. Typically, Google issues these warnings before it takes action that may affect organic search rankings and performance.
Back in the day, before Google could render JavaScript and CSS, it didn't really matter if you blocked those external files in your robots.txt file. Now that Google is better at crawling and rendering JavaScript and CSS, it's imperative to give its Googlebot crawler access to those important external files.
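As a point of reference, a minimal robots.txt sketch along these lines might look like the following (the rules are illustrative; adjust them to your own site's structure and any existing Disallow rules):

    # Example only: let Googlebot fetch CSS and JavaScript files,
    # even if broader Disallow rules appear elsewhere in the file.
    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$

Googlebot supports the Allow directive along with the * and $ wildcards, so rules like these can override a broader Disallow. You can confirm the change with the robots.txt Tester and Fetch as Google tools in Search Console.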
If you see this warning in Search Console, or a similar message in the email account associated with your Search Console account, please remediate it immediately. We often take for granted the organic search traffic our websites currently receive. Don't put yours at risk.