Written by Jose Vicente
Structured data allows us to mark up our content unambiguously, helping search engine bots understand the type of entities each page is composed of. If our website is an e-commerce site, we will have to correctly mark up our product pages; if we run a media outlet, we will have to mark up our news articles accordingly; and so on across a very long list of other possible content types.
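As an illustration, structured data is usually added as a JSON-LD block embedded in the page's HTML. The sketch below (with hypothetical values) builds a minimal NewsArticle object in Python for readability; the exact required and recommended properties for each type should always be checked against the official documentation.

```python
import json

# Minimal JSON-LD structured data for a news article.
# All values here are illustrative, not taken from a real page.
news_article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example headline",
    "datePublished": "2021-01-15",
    "author": {"@type": "Person", "name": "Example Author"},
}

# On the page, this JSON would be embedded inside a
# <script type="application/ld+json"> tag in the HTML.
print(json.dumps(news_article, indent=2))
```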
The correct implementation of structured data should result in the following benefits:
- Make our content eligible for the appropriate rich snippet results.
- Reduce the risk of Google interpreting certain pieces of content as thin content.
- Feed information into Google’s Knowledge Graph.
Although Google itself has acknowledged that implementing structured data doesn’t directly benefit a page’s SEO, it does indirectly favour visibility, because crawlers can understand our content more precisely. However, once Google understands our content, better rankings will depend on its quality, regardless of whether we have added structured data or not.
Now that the possibilities of structured data are clear, sitting halfway between technical SEO and content relevance, let’s see how we should work on it.
Where do I start?
This is a common question during the technical phases of a web development or SEO project. Developers may have read about structured data and Schema.org, but there are over 600 different types, far too many to review one by one, and little guidance on which are actually worth using.
Therefore, the most reasonable approach is to concentrate on a first goal: optimising our content to appear with as many relevant rich snippet results in the SERPs as possible. Google provides a search gallery where we can check the current SERP features and, if we filter by website type, it will recommend the most essential features to work on.
For example, if we have an e-commerce website, we should start by optimising Products, FAQs and Reviews. With this, we should see our CTR in the SERPs improve: our goal shouldn’t be so much to get a good ranking as to get a click for each relevant search query our website appears for. On many occasions, a slightly lower position with a more attractive appearance can attract more traffic.
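For instance, a product page’s markup could look like the following sketch, again built in Python for readability. All values are hypothetical, and the property set shown (offers plus aggregateRating) is just one common combination; the authoritative list of properties for the Product type lives in Google’s documentation.

```python
import json

# Sketch of Product structured data for an e-commerce page.
# Illustrative values only; check the search gallery documentation
# for which properties are mandatory vs recommended.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example product",
    "image": "https://www.example.com/product.jpg",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "24",
    },
}

print(json.dumps(product, indent=2))
```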
Analyse your competitors
We also shouldn’t lose sight of our competitors when deciding which structured data to add, or when evaluating how detailed or thorough we want to be. Tools like Sistrix or SEMrush can show us which SERP features our competition is already taking advantage of.
But if we don’t have access to such tools, there are simpler options, such as the search engine itself. We just have to run a handful of searches, combining the site: command with a competitor’s domain, to analyse what type of solution they implemented for each type of content. Once that’s done, we only have to paste the relevant URLs into Google’s structured data testing tool to see what structured data they use in each case.
If we want to be more thorough and inventory all the structured data our competitors have implemented, we can use a crawler such as Screaming Frog. This tool can extract the type of structured data each crawled page implements, including structured data that isn’t directly related to rich snippet results.
Going the extra mile
Nevertheless, if resources allow, we can explore schema.org and add structured data to a wider range of content, beyond the basic types recommended in the search gallery. Although these types won’t give our rich snippets more visibility, and we can’t be sure to what degree they will improve our rankings, it’s a practice that will never harm us; on the contrary, with a bit of luck it will help us stand out from our competition.
Using the structured data testing tool we can make sure that the schema used is correct, but we must also ensure that our marked-up content follows Google’s general guidelines for structured data. In particular, we should make sure that the content carrying structured data is accurate and visible to users.
Implementations and tests
Now that the type of structured data we want on our website is clear, we must check the level of precision we can reach in each case. If we check Google’s search gallery’s documentation, we’ll see that, in general, there are three sections to take into account:
- Structured data and rich snippet guidelines: each section lists a series of criteria that the structured data markup, or the page where it’s going to be included, must follow, covering most cases in which it is valid and cases in which it isn’t. For example, if we want to implement the FAQPage type, we must mark up a list of questions and answers; but if the page only has one question answered by different users, we should choose the QAPage markup type instead.
- Mandatory property implementation: to qualify for rich snippets in the search results, we must implement the properties the documentation marks as mandatory for each type. Thus, we’ll have to ensure that this information is present on each page before adding structured data markup to it.
- Recommended property implementation: besides the mandatory properties, some features need additional data. We must complete these properties whenever possible.
In summary, during implementation we must ensure that we comply with the general guidelines for each rich snippet and that we add the mandatory properties needed to qualify for rich snippets in the results, completing them with the recommended properties whenever possible.
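One way to enforce this during implementation is a pre-publication check that flags JSON-LD objects missing their mandatory properties. The sketch below uses a hypothetical, deliberately incomplete REQUIRED map; the real mandatory properties per type must be taken from Google’s search gallery documentation.

```python
# Rough pre-publication check: verify that each page's structured data
# carries the properties the documentation lists as mandatory.
# REQUIRED is a hypothetical subset for illustration only.
REQUIRED = {
    "Product": ["name"],
    "FAQPage": ["mainEntity"],
    "Recipe": ["name", "image"],
}

def missing_required(data: dict) -> list[str]:
    """Return the mandatory properties absent from a JSON-LD object."""
    required = REQUIRED.get(data.get("@type", ""), [])
    return [prop for prop in required if prop not in data]

faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    # "mainEntity" deliberately left out to trigger the check.
}
print(missing_required(faq))  # prints ['mainEntity']
```

A check like this can run in a CI step or alongside a crawl, so pages missing mandatory fields never reach production in the first place.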
The structured data testing tool allows us to test example pages to detect possible errors in their code. If the pages can’t be published yet, we can copy and paste their code into the tool. Nevertheless, we must keep in mind that the checks carried out on each piece of content and its properties are fairly basic, so a more thorough manual review is still needed.
What if I don’t follow the guidelines?
The answer to this question depends on the type of guideline we fail to comply with. For example, the guidelines for the carousel feature establish that it can appear when structured data of type recipe, course, article, restaurant or film is set. If we add ItemList structured data markup to any other type, the carousel simply won’t be displayed, but we won’t be infringing any requirement that could result in a penalty.
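To make the carousel case concrete, ItemList markup typically looks like the sketch below (the URLs are hypothetical). If the pages it points at don’t use one of the supported types, the markup is simply ignored rather than penalised.

```python
import json

# Sketch of ItemList markup for a carousel pointing at detail pages.
# The carousel only appears when the listed pages use a supported
# type (e.g. recipe, course, article, restaurant, film); otherwise
# the markup is ignored, with no penalty involved.
item_list = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1,
         "url": "https://www.example.com/recipes/paella"},
        {"@type": "ListItem", "position": 2,
         "url": "https://www.example.com/recipes/gazpacho"},
    ],
}

print(json.dumps(item_list, indent=2))
```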
There are other requirements, however, whose breach can result in a manual penalty for structured data issues. Obvious examples are structured data that doesn’t match the page’s content, or structured data markup on content that is not visible to the user. In these cases, Google will see it as an attempt to deceive its algorithms, and that is viewed negatively.
After our structured data is published and we’ve made sure it’s all correctly implemented, we still have some SEO work to do. We must keep monitoring the documentation to keep up to date with the changes published periodically, some of which can be:
- New SERP features: over the last few years, Google has been constantly making changes to its search results. Some of these changes can be very interesting for our website and, in some cases, will require adapting our structured data implementation.
- Structured data changes: even if no changes are made to SERPs, Google can introduce changes to its guidelines or mandatory properties, making our implementation obsolete. When this happens, we’ll need to update our structured data accordingly.
- New recommended properties in structured data: in some structured data types we see how Google goes on increasing the number of recommended properties. This usually goes hand in hand with the creation of new rich snippet features.
- New schema.org structured data: the number of schemas changes constantly. We can usually see on https://pending.schema.org/ which ones are currently being worked on. We recommend checking this page periodically to see whether a type relevant to our website is in the works, and monitoring it until its definitive version becomes part of schema.org.
Review of errors
Google Search Console gives us the opportunity to detect errors in our structured data implementation. Its “Enhancements” section lists the different rich snippet types detected by the tool, and inside each detected type we can see its “Coverage”. URLs are classified into three categories, depending on severity:
- Error: structured data with critical errors preventing our rich snippets from appearing. It’s important to solve them as soon as possible.
- Valid with warnings: the structured data on these pages is missing recommended (non-mandatory) fields. Rich snippets can be displayed for these pages, but they may lack certain features because of the missing fields.
- Valid: rich snippets don’t have any errors, all mandatory and optional fields have been filled, and thus they are valid.
Tools like Sistrix and SEMrush let us evaluate for which terms we are getting rich snippets, but Google Search Console goes further by showing which pages have problems and why our results may not appear with rich snippets. Missing mandatory fields in structured data can keep us from getting rich snippets for a great number of search queries, which means lower visibility. For that reason, it is essential to review each error and its details periodically and implement the suggested fix. As we resolve them, we can ask Google to validate the fix and get feedback on the result.
To conclude, when working with structured data we must start by focusing our efforts on the SERP features that can give our website the most visibility. We must strive for the most thorough implementation possible without breaking the guidelines set by search engines. Finally, we must regularly monitor possible errors that could invalidate our work.