How to leverage your content management system to improve SEO

Written by Fernando Maciá

Taking web optimization for search engine positioning into account when implementing a content management system can prevent our investment in content from being devalued by a poor presence in search engines.

Content managers

The generation, publication and archiving of a huge number of pages in large portals and Web sites poses multiple challenges that Web content management systems (CMS) have been trying to solve in recent years:

  • Facilitate the generation and editing of Web content by personnel without specific training in programming.
  • Ensure a homogeneous appearance of all contents and their presentation according to a corporate design and a predefined editorial line.
  • Maintain consistency in the structure of the website, allowing new content to be incorporated in the appropriate sections after prior approval and control by the designated people.
  • Maintain a consistent navigation that allows users to reach each of the contents that are published at all times.
  • Avoid duplicate content (different URLs displaying the same content), orphan content (files left on the server unnecessarily: pages that no link points to anymore, or image and multimedia files that were only displayed on deleted pages) and broken links, which point to pages that no longer exist on the server.

Content Management Systems or CMS are software tools that decentralize the maintenance of a portal's content, so that non-technical staff from different departments of a company can add, edit and manage their own content on a corporate website.

CMS and search engine optimization: an impossible symbiosis?

However, despite their obvious advantages, the traditional focus of this type of tool has been on making content management as easy as possible by simplifying the production, approval and publication processes, rather than on generating web pages that are properly optimized to be competitive in search engines.

Among the problems that, from the point of view of search engine optimization, recurrently appear in portals supported by content management systems, the following stand out.

Dynamic URLs

Search engines sometimes limit the number of dynamic variables present in the URLs they index. Pages generated by many content management systems frequently include a large number of dynamic variables in their URL address.

Unique titles

The title of a page is one of the most important factors when it comes to positioning content well in search engines. However, many content management systems do not allow users to assign a unique relevant title to each page.

Lack of support for meta tags

Many CMS do not have specific fields for the user to specify the contents of the Keywords and Description meta tags. Although they are not as important as the title for achieving a good position in search engines, these tags still play a relevant role in whether the user chooses to click on our site in a search results page.

Absence of keywords in URLs

Dynamic URLs generated by many content management systems are often unfriendly to both the user and search engines, and do not include search terms that contribute to better positioning.

Impossibility of further optimization

The content production process imposed by a CMS makes a posteriori optimization of the generated content extremely difficult and, at best, adds an extra workload that could have been avoided if SEO had been taken into account when the manager was implemented.

Make the content management system the best SEO tool

It is therefore paradoxical that precisely those companies that invest the most resources in maintaining and generating new content for their portals often benefit the least from the traffic that content could attract, due to poor implementation of their content management systems from the point of view of search engine positioning. A failure that, in many cases, is due not to shortcomings of the tool itself, but to the implementing technicians' lack of awareness of how important it is for the generated content to be competitive in search engines.

But, in the same way that a poorly implemented content management system can reduce the return on a portal's investment in content generation, one implemented with the basic aspects of web optimization for search engine positioning in mind can be the most effective ally for generating content that climbs the rankings of the most competitive searches. Let's see how.

Validate the template code

Content managers start from pre-designed templates, which users cannot alter, to generate new pages. If we validate the code of these templates at source, we ensure that the pages generated from them also contain valid code. Valid code ensures that the page displays correctly in different browsers and that search engines can crawl it without problems.
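
As a sketch of how this check can be automated, the following script submits a rendered template to the W3C Nu HTML Checker (its public JSON API) and returns any markup errors it reports; the template path in the commented usage is just an illustrative assumption.

```python
import requests

def template_errors(html: str) -> list:
    """Submit markup to the W3C Nu HTML Checker and return its error messages."""
    response = requests.post(
        "https://validator.w3.org/nu/?out=json",
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8"},
        timeout=30,
    )
    messages = response.json().get("messages", [])
    return [m["message"] for m in messages if m.get("type") == "error"]

# Hypothetical usage: fail the build if a rendered template is invalid.
# errors = template_errors(open("templates/article.html").read())
# if errors:
#     raise SystemExit("Invalid markup:\n" + "\n".join(errors))
```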

Create a site map

Almost all content management systems allow you to create and maintain a site map. Search engines limit the number of links they follow to 100 per page, and these must be normal HTML text links. If we adjust our content manager to generate and maintain a hierarchical map of the website under these premises, we make it easier for search engines to crawl each and every page of our website.
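
As an illustration, here is a minimal sketch of such a generator; the Page structure and the example tree stand in for whatever the CMS actually exposes.

```python
from dataclasses import dataclass, field

# Render the content tree as a hierarchical HTML site map made of plain
# text links, and warn when it exceeds the 100-link guideline.
@dataclass
class Page:
    title: str
    url: str
    children: list = field(default_factory=list)

def render_branch(page: Page, depth: int = 0) -> str:
    indent = "  " * depth
    html = f'{indent}<li><a href="{page.url}">{page.title}</a>'
    if page.children:
        inner = "\n".join(render_branch(c, depth + 1) for c in page.children)
        html += f"\n{indent}<ul>\n{inner}\n{indent}</ul>"
    return html + "</li>"

def count_links(page: Page) -> int:
    return 1 + sum(count_links(c) for c in page.children)

tree = Page("Inicio", "/", [Page("Viajes", "/viajes", [Page("Destinos", "/viajes/destinos")])])
if count_links(tree) > 100:
    print("Over 100 links: split the map into one page per section.")
print("<ul>\n" + render_branch(tree) + "\n</ul>")
```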

Generate pages with friendly URLs

Some content managers generate static files whose URLs contain no dynamic variables, while others use URLs with multiple dynamic parameters, such as: http://www.midominio.org/portal/site/Equipo/menuitem.6dab26af2ec93c76a68e76b1805101ca/?vgnextoid=88c8c55c1edb7010VgnVCM100000081510acRCRD. Among the latter, some allow you to create more user-friendly alias URLs (containing keywords or eliminating dynamic parameters), which the system then replaces internally with the dynamic URL it requires. All other things being equal, choose a CMS that generates keyword-rich URLs or URLs with a reduced number of dynamic parameters: URLs like the one in the example are neither usable nor friendly to search engines.
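
A common building block for these alias URLs is a slugify routine that turns the page title into a keyword-rich address. A minimal sketch:

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a keyword-rich, search-engine-friendly alias."""
    # Strip accents so keywords survive in plain ASCII
    ascii_title = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    # Lowercase, then collapse runs of non-alphanumerics into hyphens
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

assert slugify("Viajes a ciudades europeas") == "viajes-a-ciudades-europeas"
# The CMS can then map this alias internally to the dynamic URL it requires.
```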

Limit the level of subdirectories

Search engines give more importance to a page the closer it is to the home page of the portal. This is why we must limit the number of subdirectories shown in the URL: many content management systems allow content to be organized hierarchically regardless of the physical location of the files on the server, presenting URLs that are much simpler than the actual directory structure. For example, the URL https://www.midominio.com/Viajes/Destinos/Ciudades/MasInfo/DatosGenerales/Congresses/Congresses_business.htm places this content six levels deep below the home page.
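
A simple editorial safeguard is to measure URL depth before publication and warn when it exceeds a threshold; the limit of three levels below is only an illustrative assumption.

```python
from urllib.parse import urlparse

MAX_DEPTH = 3  # illustrative threshold; choose one that fits the portal

def url_depth(url: str) -> int:
    """Directory levels between the home page and the document itself."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return max(len(segments) - 1, 0)  # the last segment is the document

def too_deep(url: str) -> bool:
    return url_depth(url) > MAX_DEPTH

assert url_depth("https://www.midominio.com/viajes/destinos/paris.htm") == 2
assert not too_deep("https://www.midominio.com/viajes/destinos/paris.htm")
```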

Connect the CMS link validation control

Most managers control the publication of broken links pointing to content handled by the manager itself, but few validate that a link pointing to an external website is not broken. If the control exists, be sure to enable it, to prevent a user from publishing a link to a non-existent web page.
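
Where the CMS lacks such a control, a pre-publication check along these lines can fill the gap; this is only a sketch that flags external links answering with an error, or not answering at all.

```python
import requests

def broken_external_links(urls):
    """Flag outbound links that answer with an error, or not at all."""
    broken = []
    for url in urls:
        try:
            # HEAD is cheap; some servers reject it, so fall back to GET
            response = requests.head(url, allow_redirects=True, timeout=10)
            if response.status_code == 405:
                response = requests.get(url, stream=True, timeout=10)
            if response.status_code >= 400:
                broken.append((url, response.status_code))
        except requests.RequestException as exc:
            broken.append((url, str(exc)))
    return broken

# Hypothetical usage with the links found in a piece of content:
# print(broken_external_links(["https://www.example.com/"]))
```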

Leave control of the robots.txt file to the webmaster

Some content managers allow the author of a page to edit the content of the robots.txt file. In general, it is better that only the webmaster controls this file, to prevent a user from unknowingly blocking robots from crawling an important part of the website.

Avoid duplicate URLs

Search engines are extremely strict when it comes to penalizing duplicate content within a website, so we must ensure that each page exists under a single URL. If we still want users to be able to reach the same content from equivalent URLs, it is better to program permanent 301 redirects, which search engines do not penalize.
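
By way of illustration, this is how the equivalent-URL case might look in a small Flask front end (the routes are hypothetical): every alias answers with a permanent 301 redirect to the single canonical address.

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/viajes")
def travel():
    return "Canonical travel page"

# Equivalent or legacy addresses answer with a permanent redirect,
# so search engines consolidate the content under one single URL.
@app.route("/Viajes")
@app.route("/travel")
def travel_alias():
    return redirect("/viajes", code=301)
```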

Avoid session variables in the URL

If our portal performs electronic transactions or some other process that requires maintaining session state, it is preferable to use a session cookie rather than a session variable that appears in the URL. If search engines detect such a variable, they will refrain from crawling the page, to avoid indexing the same content repeatedly under different session identifiers. For example: https://www.midominio.com/index.php?module=Health&action=DefaultIndex&PHPSESSID=98ab41f9acd8c74df4b6e063985bda8d. In this URL there is a session identification parameter (PHPSESSID) that, if detected by search engines, will prevent the page from being crawled, since successive visits by the search engine would otherwise file the same page as many different pages in which only the value of the session identifier changes. In any case, the use of session IDs should be restricted to those areas of the portal where it is absolutely necessary.
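
Where URL session variables cannot be removed entirely, crawlable links can at least be canonicalized before publication. A sketch, with an assumed list of common session parameter names:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Session parameters that should never reach a crawlable URL;
# this list is an assumption, extend it for your own platform.
SESSION_PARAMS = {"phpsessid", "jsessionid", "sessionid", "sid"}

def strip_session_params(url: str) -> str:
    """Return the URL without session identifiers, keeping other parameters."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(query)))

url = ("https://www.midominio.com/index.php?module=Health"
       "&action=DefaultIndex&PHPSESSID=98ab41f9acd8c74df4b6e063985bda8d")
print(strip_session_params(url))
# https://www.midominio.com/index.php?module=Health&action=DefaultIndex
```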

Reduce junk code

Simplify the HTML code used in the templates and opt for Cascading Style Sheets (CSS) instead of tables to lay out the content. The use of CSS makes it easier to update the design of a website, considerably reduces the weight of the files (the layout travels only once from the server to the user, after which it is available in the browser's cache for the following pages visited) and gives the significant content of the page greater weight with respect to the total weight of the page's code.
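
A rough way to monitor this last point is to measure the ratio of visible text to total code in the generated pages; the following sketch uses only the standard library and deliberately skips script and style blocks.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text of a page, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks, self.skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def text_to_code_ratio(html: str) -> float:
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / len(html) if html else 0.0

print(text_to_code_ratio("<h1>Destinos</h1><p>Viajes a ciudades europeas.</p>"))
```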

Select the text option for site navigation

Avoid whenever possible the use of JavaScript or Flash menus, since search engines cannot follow their links. In many cases we can achieve, with CSS, effects similar to those of JavaScript or Flash menus. If the content manager allows you to create a breadcrumb trail, activate it: it improves the usability of the site, helps place the user within the overall structure of the web and is an excellent shortcut for search engines to crawl all the content.
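
As a sketch, a breadcrumb trail can be rendered from the CMS hierarchy as plain text links; the (title, url) pairs below are illustrative stand-ins for what the CMS would supply.

```python
def breadcrumb(ancestors, current_title: str) -> str:
    """Build a breadcrumb trail of plain text links from ancestor pages."""
    trail = [f'<a href="{url}">{title}</a>' for title, url in ancestors]
    trail.append(current_title)  # current page: text only, no self-link
    return " &gt; ".join(trail)

print(breadcrumb([("Inicio", "/"), ("Viajes", "/viajes"), ("Destinos", "/viajes/destinos")],
                 "Ciudades"))
# <a href="/">Inicio</a> &gt; <a href="/viajes">Viajes</a> &gt; ... &gt; Ciudades
```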

Do not forget that there are headings

The use of styles can make us forget the HTML heading tags (H1, H2, H3, etc.), whose final visual appearance can also be modified with styles, but which help search engines better understand the logical structure of the page and see which aspects are most important. It is therefore important to encourage content editors to use headings instead of simply defining larger or smaller text with font sizes and, if possible, to limit pages to a single top-level heading (H1).
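
Such a rule can also be enforced automatically. A minimal sketch of a heading check run on the editor's HTML before publication:

```python
import re

def heading_outline(html: str) -> list:
    """Return the sequence of heading tags found in the page."""
    return [m.lower() for m in re.findall(r"<(h[1-6])\b", html, re.IGNORECASE)]

def heading_problems(html: str) -> list:
    outline = heading_outline(html)
    problems = []
    if not outline:
        problems.append("no headings: the page structure is invisible to crawlers")
    elif outline.count("h1") != 1:
        problems.append(f"expected exactly one H1, found {outline.count('h1')}")
    return problems

assert heading_problems("<h1>Destinos</h1><h2>Ciudades</h2>") == []
assert heading_problems("<h1>A</h1><h1>B</h1>") == ["expected exactly one H1, found 2"]
```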

Require a unique title and the addition of relevant meta tags

Program the CMS so that filling in the title and meta tags is a requirement for publishing the content and, if possible, activate a control that checks the uniqueness of the title.
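
A sketch of such a publish-time gate, assuming the CMS exposes the page's fields and keeps a set of already-used titles (stored lowercased):

```python
from dataclasses import dataclass

# Publication fails unless the page has a title, a Description meta tag,
# and the title is unique site-wide. The Page fields stand in for CMS storage.
@dataclass
class Page:
    title: str
    meta_description: str

def publish_errors(page: Page, existing_titles: set) -> list:
    errors = []
    if not page.title.strip():
        errors.append("missing title")
    elif page.title.strip().lower() in existing_titles:
        errors.append("title already used by another page")
    if not page.meta_description.strip():
        errors.append("missing Description meta tag")
    return errors

print(publish_errors(Page("Viajes a ciudades europeas", ""), {"viajes baratos"}))
# ['missing Description meta tag']
```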

Require the alt attribute to be filled in when adding an image to the content

This attribute allows search engines to better index images, contributes to the relevance of key terms on the page and improves accessibility for visually impaired visitors.
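
A rough pre-publication check along these lines, flagging images without a non-empty alt attribute:

```python
import re

IMG_TAG = re.compile(r"<img\b[^>]*>", re.IGNORECASE)
ALT_ATTR = re.compile(r"""\balt\s*=\s*("[^"]+"|'[^']+')""", re.IGNORECASE)

def images_missing_alt(html: str) -> list:
    """Return the <img> tags that lack a non-empty alt attribute."""
    return [tag for tag in IMG_TAG.findall(html) if not ALT_ATTR.search(tag)]

assert images_missing_alt('<img src="mapa.png">') == ['<img src="mapa.png">']
assert images_missing_alt('<img src="mapa.png" alt="Mapa de destinos">') == []
```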

Implement controls to prevent the publication of duplicate content

Many content managers include ways to prevent duplicate content from being added. Along with thin content control, this is one of the most interesting ways to avoid Google Panda penalties.
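
For verbatim duplicates, a simple fingerprint comparison is often enough; near-duplicates require more elaborate similarity techniques such as shingling. A minimal sketch, assuming the CMS keeps a set of fingerprints of already-published texts:

```python
import hashlib

def fingerprint(text: str) -> str:
    """Hash of the text with whitespace and case normalized away."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def is_duplicate(text: str, published: set) -> bool:
    return fingerprint(text) in published

published = {fingerprint("Viajes a ciudades europeas.")}
assert is_duplicate("Viajes  a ciudades  EUROPEAS.", published)
```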

Encourage the use of descriptive text in links

Instead of “Click here”, use “More information about our 24H customer support”.

The best of both worlds

It is clear, then, that if we take into account web optimization and search engine positioning when implementing a content management system to generate and maintain the content of a large portal, we can use the CMS itself to encourage, or even impose, a discipline that leads content editors to create easily crawlable and indexable pages, capable of competing adequately in search engines.

Large companies have the search engines’ favorite raw material: abundant, original, dynamic and rapidly updated content. Let’s take advantage of the full capacity of content managers to extract the maximum return on the investment they make in their online presence.

Fernando Maciá
Founder and CEO of Human Level. Expert SEO consultant with more than 20 years of experience. He has been a professor at numerous universities and business schools, and director of the Master in Professional SEO and SEM and the Advanced SEO Course at KSchool. Author of a dozen books on SEO and digital marketing.
