Written by Fernando Maciá
Table of contents
- 1. More Schema and new featured snippets
- 2. From NLP to NLU
- 3. Entities in all result formats
- 4. Increased weight of user behaviour
- 5. More variety of result formats
- 6. Targeting new verticals
- 7. Lower organic CTR
- 8. More local+mobile adaptation
- 9. Advertising in voice results
- 10. Google as the only ecosystem
At Human Level we have got our crystal ball out. And even though we know we might end up being wrong with many of our predictions regarding what will happen in SEO in the following months, here’s our take. These are the Top 10 trends that could set the path for search engine optimization in 2020.
1. More Schema and new featured snippets
Google is moving forward in giant strides towards retrieving information in a completely semantic way. In the meantime, it's still asking us to add Schema.org structured data markup to our HTML code.
The definition of new schemas has grown exponentially since the vocabulary's launch in 2011. Below you can see a comparison between the schemas available at launch (http://web.archive.org/web/20110728002346/https://schema.org/docs/full.html) and the ones available today (https://schema.org/docs/full.html). Today, we can choose from more than three times as many schemas as there were in 2011.
As a direct consequence of schema.org's expansion and its mass adoption across more and more websites, Google is able to show a direct answer to more queries through the many different formats of featured snippets.
While FAQPage and HowTo featured snippets took the spotlight in the second half of this year, for next year we foresee:
- New featured snippet formats.
- Increased control over the coherence between the landing page type and the featured snippet. For example, Google has established in its usage guidelines for FAQPage that it must not be used for advertising purposes. However, several travel industry websites have already discovered that this featured snippet type is a good way to grab more visibility in the SERPs. We predict Google will end up vetoing this structured data markup as the main entity of an obviously transactional landing page. But until it does, keep 'em coming!
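As a reference for the FAQPage markup mentioned above, here is a minimal sketch of how that JSON-LD could be generated programmatically; the question and answer text are placeholders, not content from any real page:

```python
import json

# Minimal FAQPage JSON-LD sketch (question and answer text are placeholders).
faq_markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is a featured snippet?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "A highlighted result that answers the query directly in the SERP.",
            },
        }
    ],
}

# Embed the markup in the <script type="application/ld+json"> tag
# that would go in the page's HTML.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(faq_markup, indent=2)
    + "</script>"
)
print(script_tag)
```

Each additional question/answer pair is just another `Question` object appended to `mainEntity`.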
In other related news, we’re still fighting over whether featured snippets generate more or less CTR than a classic #1 ranking result. Just as with many other controversial topics in SEO, the answer is: it depends.
Indeed, it depends on whether the featured result fully satisfies the user's search intent. When it does, the CTR drops dramatically, although we should stress the branding effect of being featured. When it doesn't, the CTR can actually reach figures above 70%.
We’ll have to continue testing things to find out how to earn this extra ball in the SERPs, while taking care of our CTR at the same time, and making the most of the branding effect.
You can see our presentation (in Spanish) about featured snippets and their effect on voice search. There's also a video here.
2. From NLP to NLU
The implementation of Bidirectional Encoder Representations from Transformers (BERT) is just another step for Google on its way from Natural Language Processing (NLP) to Natural Language Understanding (NLU), that is, something closer to human-like language understanding.
The key to reaching such a precise understanding of natural text resides in understanding the meaning of each word within the context of all the words making up a sentence, instead of processing them in a linear way, following the same order in which they appear. We can see how Google identifies entities in a text, and the relationship between them, with its Natural Language API demo:
As we can see in the picture above, Google knows that Fernando Maciá is a person entity, and that it’s very relevant to the text’s meaning (0.81). It also knows that Human Level Communications is a company. It also understands ‘books’ to be an intellectual work, and that ’10’ is a number.
Through syntactic analysis, Google can determine each word's function and its relationship with the rest of the words within the context of a sentence.
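A toy sketch of working with this kind of entity output: the response shape below mirrors what the Natural Language API demo displays (entity name, type and salience), but the figures are illustrative values echoing the example above, not real API output.

```python
# Illustrative entity-analysis response, shaped like the name/type/salience
# rows the Natural Language API demo displays. Values are made up.
demo_response = {
    "entities": [
        {"name": "Fernando Maciá", "type": "PERSON", "salience": 0.81},
        {"name": "Human Level Communications", "type": "ORGANIZATION", "salience": 0.12},
        {"name": "books", "type": "WORK_OF_ART", "salience": 0.05},
        {"name": "10", "type": "NUMBER", "salience": 0.02},
    ]
}

def main_entities(response, threshold=0.1):
    """Return (name, type) pairs for entities above a salience threshold."""
    return [
        (e["name"], e["type"])
        for e in response["entities"]
        if e["salience"] >= threshold
    ]

print(main_entities(demo_response))
# → [('Fernando Maciá', 'PERSON'), ('Human Level Communications', 'ORGANIZATION')]
```

Salience, the score the demo reports per entity, is what tells us which entities Google considers central to the text's meaning.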
As Google understands natural language better, it will:
- Identify more precisely the user's real need underlying the search intent behind each keyword. At the same time, it will be able to link a specific need to many different search formulations (neural matching).
- Better evaluate which content best responds to a user's needs, regardless of classic on-page and off-page relevance factors, such as the presence of a specific keyword in the page title or in link anchor text.
- Depend less on indirect quality factors or content usefulness, like authority and popularity, calculated based on link quantity and quality.
- Depend less on the content's semantic markup, i.e. structured data, to extract valuable information. This will put Google in an advantageous position to extract more relevant information and transform it into direct answers for users, using the rest of the Web as its own information repository. And this will happen whether we want it to (using schema.org) or not.
All this seems to be laying the groundwork for Google to become a truly indispensable assistant, capable of understanding natural spoken language and answering in the same manner, with the most relevant data for each necessity.
3. Entities in all result formats
Google has already been showing us the entities it links to a specific concept in Google Images results:
The entities displayed help us to refine our image search, in a similar way to how Google Suggest suggestions help us to complete a simple text query. Nevertheless, it’s likely that Google will begin to show other relationships between entities as it expands their variety and recognised relationships.
Our prediction here is that the knowledge graph will appear in a larger percentage of SERPs, and Google will display other entities to which it links the given concept, extending what we already see in US search results:
The recommendation for working on this concept would focus on making it easier for the search engine to identify these relationships, as well as authority sources which would help Google verify them. This means, not just links, but references in general, even if they don’t contain links.
Presenting all these relationships directly on the search result pages not only makes it easier for users to refine their searches but, more importantly, statistically validates the links between the various entities. That is, it will be us, the users, who confirm to the search engine which relationships between entities are most relevant to most people.
4. Increased weight of user behaviour
This year we’ve seen a huge controversy over whether user behaviour could be growing increasingly important as an SEO relevance factor. Juan González from Sistrix had already published his juicy take on the matter in 2016, with examples of changes in search results that can apparently be explained by the influence of user behaviour, and Johannes Beus wondered about the same thing in another post (“Is user experience a ranking factor?“), especially after Gary Illyes dismissed similar hypotheses posed by Rand Fishkin.
The truth is, regardless of how Google is incorporating user behaviour to its algorithm, every SEO has been able to verify how rankings change for different results, while classic on and off-page factors remained unaffected.
At Human Level, we also ventured into posing our own hypothesis in this presentation, the video recording of which you can see below.
Regardless of the medium Google employs to measure user satisfaction for a specific result, we have no doubt that, just as with the development of semantic aspects, they will continue to gain importance at the expense of secondary quality indicators, namely classic on-page relevance factors (keyword density, keyword presence in prominent locations, etc.) and off-page relevance factors (link quality, quantity and anchor texts).
If it can measure the user’s response directly and internally, why would Google keep trusting secondary external factors?
5. More variety of result formats
This is a direct consequence of points 1 to 4. Google not only accesses, stores and orders more information; it also understands it and knows how to extract added value from it. It hasn't been that long since the search engine started showing results like the one below:
Or, for instance, result sliders for queries where two entity categories are linked, namely this one of “cheapest neighborhoods in New York”.
Google shows an increasing variety of featured results, and for a larger percentage of queries (in 2018, featured snippets were shown for 8% of keywords, according to this SEMrush study).
It’s become clear that the search engine is capable of correctly and precisely interpreting more complex search intents and their underlying needs. Its responses are increasingly accurate, making it entirely unnecessary to resort to certain specialised vertical search engines.
Which takes us to point 6…
6. Targeting new verticals
Google doesn’t want to act as a meta search engine. If we give it all our information, why shouldn’t it show it directly in its search results?
Here are some examples of new formats that have been rolling out gradually, to complete the classic block of ten blue links:
6.1. Stock exchange
For a stock market index query, Google shows a widget with direct information as the top result. Bad news for economics, finance, investment and stock market media websites. This information is also readily available through the Google Assistant and voice search.
6.2. Flights
Google already shows flight options directly in its search results for a route query, listing a variety of airlines with prices, specifying whether the flights are direct or with stopovers, as well as information on dates, departure/arrival airports, etc. All this data is part of the Flight schema, together with a specific flight's status.
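The schema.org Flight type mentioned above can carry exactly the data Google displays: airline, airports, times and flight number. A minimal sketch, with every flight detail made up for illustration:

```python
import json

# Minimal schema.org Flight JSON-LD sketch; every value is a made-up example.
flight_markup = {
    "@context": "https://schema.org",
    "@type": "Flight",
    "flightNumber": "XX1234",
    "provider": {"@type": "Airline", "name": "Example Air", "iataCode": "XX"},
    "departureAirport": {"@type": "Airport", "name": "Alicante", "iataCode": "ALC"},
    "arrivalAirport": {"@type": "Airport", "name": "Madrid-Barajas", "iataCode": "MAD"},
    "departureTime": "2020-03-01T09:10:00+01:00",
    "arrivalTime": "2020-03-01T10:15:00+01:00",
}

print(json.dumps(flight_markup, indent=2))
```

With this level of detail in the markup, Google doesn't need to send the user to the airline's site to answer a route query.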
6.3. Hotels
Searching for hotels also returns direct results from Google, pinned on the map, with user reviews, prices and the possibility to check room availability for any date. In short, a hotel search engine without leaving Google's SERPs, which also started showing prices per day in 2019.
6.4. Ingredients and recipes
Searching for ingredients and recipes has been the development field for many niche sites. For this type of query, Google has begun to display direct results combining ingredient sliders and recipe results with a knowledge graph, putting together their most attractive aspects.
Without doubt, in 2020 we will witness a further expansion of new search result formats, which will put at risk the viability of websites that until now have been profitable. Some likely candidates include those related to local tourist information, real estate agencies, bibliographic information, instruction manuals, technical data and spec tables, advanced geographic information… and probably many others we won't even see coming.
7. Lower organic CTR
Throughout 2019, we’ve seen Sparktoro’s 2018 CTR study confirmed by data. Indeed, if you check the CTR of your Top 10 rankings, you have probably noticed a drop in clicks that position alone cannot explain.
For example, comparing data from the last three months of 2019 with the same period in 2018, we can see that although the average position improves by two points, the CTR decreases by 45%.
As much as we may disagree with Google's actions, it seems this is, and will continue to be, the scenario from now on. Alongside organic optimisation strategies, businesses that haven't yet rethought their branding strategy will be forced to do so. It looks like the only way to protect ourselves from fluctuating CTRs will be to build a powerful brand.
8. More local+mobile adaptation
We are no longer living in mobile-first times, but pretty much mobile-only ones. The smartphone has become the go-to device for Internet browsing. While tablet sales have stagnated, mobile phones have risen to become the main device, which has resulted in Google interpreting more search queries as local.
Thus, a local SEO strategy becomes more important, and it's no surprise that Google is in a position to win the war against large local directories, Web 2.0 hotel and restaurant recommendation sites, and local businesses' own websites (and by extension their Facebook pages) as the customer acquisition platform.
By boosting the submission of information to Google MyBusiness, as well as voluntary content contribution (pictures, reviews, etc.) or involuntary content contribution (browsing data, in-person visits, permanence, etc. transferred directly from our mobile phones) the search engine targets the most frequent local queries by: industry (restaurants, hotels, shops, trades), search intent (near me, how to get to…, best … stores) and needs (what to see, where to go, when to go…).
In 2020 we think the Google MyBusiness interface will continue to develop, so local businesses will practically build their site in Google search results, and Google will have all the information it requires to satisfy any need from within its assistant (it has been confirmed that the Speakable markup can be applied not only to news, but to any other type).
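The Speakable markup mentioned above is declared through a SpeakableSpecification that points at the parts of a page suitable for text-to-speech. A minimal sketch, where the page name and CSS selectors are hypothetical:

```python
import json

# Sketch of speakable JSON-LD: the cssSelector values are hypothetical and
# would point at the page sections a voice assistant should read aloud.
speakable_markup = {
    "@context": "https://schema.org",
    "@type": "WebPage",
    "name": "Opening hours and directions",
    "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": [".summary", ".opening-hours"],
    },
}

print(json.dumps(speakable_markup, indent=2))
```

The specification also admits XPath expressions instead of CSS selectors for identifying the speakable sections.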
9. Advertising in voice results
As voice search and voice results gradually evolve, one thing that won't go unnoticed is that it remains unclear –at least for the time being– how organic information will coexist with adverts, as both have been the foundation of Google's success over the last 20 years.
Can we expect Google to delight us with an advertising pitch before returning the featured snippet result? Some sort of featured placement for paid results? Should we try to rank for queries starting with OK, Google?
The latter move is what posts like this one from Search Engine Journal and this one from Effective Spend seem to recommend. Although, if data is anything to go by, Juan González from Sistrix warns us that –at least in Europe– voice search could still take some time to arrive.
However, we cannot ignore that we said the same thing about mobile… until it happened 😉
10. Google as the only ecosystem
The common thread tying together the 10 trends we've just walked you through is Google's increasingly undisguised intention of becoming the only ecosystem most users use.
With tools like the search engine, the Chrome browser, Google Maps, Google MyBusiness, Gmail, Docs, etc., Google aspires to become the closed Web every company would want for itself, one with as many users and as much information as possible.
Microsoft attempted something similar back in the day, and competition authorities put an end to the software giant's monopolistic aspirations. Google has already received severe fines both in the United States and in Europe, and now that its two founders, Sergey Brin and Larry Page, have stepped down, we no longer have anyone to demand that its famous slogan be honoured: Don't be evil.
Google asked us to make our sites faster, to adapt them to mobile phones, to implement structured data markup, to provide our pages in a sitemap… and we did it all. It already has all the information. So, now what?