
Published on 10/28/2014 · Updated on 10/28/2014

Google technical guidelines update: crawl rendering


Jose Emeterio Vicente

Head of R&D and SEO Consultant

Google announced yesterday, through its Google Webmaster Central Blog, the update of some of its technical guidelines. These guidelines affect the way Google crawls website content and, consequently, have an effect on rankings.

If up to now Google crawled a website much like a text browser (e.g. Lynx), now Google crawls web pages in a more human-like way, also taking into account information about the page's design and functionality implemented through JavaScript. In other words, Google's robot has gone from simply crawling the HTML of the page to rendering it using its CSS and JavaScript code.

Explore like Google

To check if your website follows Google’s new technical guidelines you can perform the following checks:

  1. Review the robots.txt file configuration to ensure that search engines are not blocked from accessing cascading style sheet (.css) or JavaScript (.js) files, as Google now needs these resources to render each page correctly.
  2. Optimize the content linked from web pages (cascading style sheet (.css) files, JavaScript (.js) files, etc.) to ensure that it is optimally bundled (i.e., requires fewer calls to the server, which speeds up downloading); that the code is minified (no blank lines, unnecessary spaces, comments, or any other non-essential information); and that Gzip compression is enabled on the server for these file types as well (since the information is transmitted compressed, it travels faster to users).
  3. Check the download speed data in Google Search Console to ensure that Googlebot's new crawling method does not affect content indexing.
  4. Check the website's compatibility with mobile devices, and any render-blocking caused by JavaScript, with Google's PageSpeed Insights tool.
  5. Check how Google renders and crawls content using the "Fetch as Google" functionality (fetch and render option) in Google Search Console, which has been adapted to test the new technical guidelines.
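For the Gzip compression mentioned in point 2, a minimal sketch on an Apache server (assuming mod_deflate is available; adapt the MIME types to your setup) could look like this:

```apache
# Hypothetical Apache example: compress CSS and JavaScript responses
# so these render-critical files travel to users (and Googlebot) faster.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/css application/javascript text/javascript
</IfModule>
```

Equivalent directives exist for other servers (e.g. `gzip on;` plus `gzip_types` in Nginx).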
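The first check in the list above can be sketched programmatically. The snippet below uses Python's standard-library robots.txt parser to test whether a given set of rules would block Googlebot from fetching CSS or JavaScript files; the rules and URLs are hypothetical examples, not taken from any real site.

```python
# Sketch: detect robots.txt rules that block Googlebot from CSS/JS files,
# the kind of blocking Google's updated technical guidelines warn against.
# The robots.txt content and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

# Example robots.txt that disallows an /assets/ folder holding CSS and JS.
robots_txt = """User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check representative stylesheet and script URLs against the rules.
for url in ("https://www.example.com/assets/style.css",
            "https://www.example.com/assets/app.js"):
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "-> allowed" if allowed else "-> BLOCKED for Googlebot")
```

In this example both resources would be reported as blocked, so the fix would be to remove the `Disallow: /assets/` rule (or add explicit `Allow` rules for the .css and .js files) and re-test with Fetch as Google.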

We must keep in mind that failing to meet any of these guidelines will in no case lead to a Google penalty.

Resources blocked from crawling

On many occasions we will not be able to act on resources external to our website, over which we have no control. These guidelines should be reviewed and implemented as far as possible to ensure optimal ranking in Google.

Jose Emeterio Vicente

Head of R&D and SEO Consultant at Human Level. Computer Engineering graduate. SEO and Web Analytics expert, Google Analytics certified. Instructor for the Professional SEO-SEM Master’s program at Kschool.
