Search engine optimization: trends for 2005

Written by Fernando Maciá

Anticipating what will happen in the future, even if that future is immediate, is a risky business. If what we are talking about is the Internet, then rather than an exercise in foresight we may be attempting pure divination. Nevertheless, companies must work within the framework of the scenarios we can foresee, so as to establish a strategy that prepares us for changes before they occur. Since it was Bill Gates who once said that 640 KB of RAM should be enough for everyone, I humbly accept the more than likely possibility of being wrong, and in the following lines I propose to outline what we at Human Level Communications anticipate for this year within the increasingly competitive and rapidly changing search engine marketing sector.

All against one and one against all

Concentration of search engines

2005 marks the culmination of a process of search engine concentration that has left three unquestionable protagonists: Google, still the undisputed leader, Yahoo! and MSN Search. These three search engines account for 85% to 95% of all search engine and directory traffic to a website, and their databases feed the results of many other search engines and meta-search engines that have either fallen into Yahoo!'s orbit (AltaVista, AlltheWeb, Overture, Inktomi) or drink from Google's stream (Netscape, AOL, Terra, Lycos…). Meanwhile, the giant Microsoft is fine-tuning its own search engine technology, still in beta, in MSN Search.

The areas where the greatest confrontation between these three giants is foreseen and which could have the greatest impact on search engine marketing are the following:

  • Quality in organic results: only time and the market will decide which search engine produces the best organic (not paid) results. Google has the most extensive experience in this field, while Yahoo! has developed its own website indexing algorithm using the know-how of Inktomi, one of the leaders in the field, which it acquired. The MSN Search crawler has been scouring the web for more than a year now to build Microsoft's own search engine database.
  • Local search: using a search engine to find, for example, the closest pizzeria to our own location introduces a key geographic segmentation factor for the success of the search engine strategy of many online businesses that, despite the universality of the medium, focus on a geographically limited market. Google and Yahoo! have already taken the first steps in this direction and MSN Search will soon follow.
  • Free e-mail: Yahoo! Mail and MSN Hotmail have shared the leadership in free e-mail accounts. Google responded with Gmail, a free 1 GB mail account built on a contextual advertising business model that relies on automated analysis of message content, which hurt the sensibilities of the fiercest defenders of the privacy of communications. Yahoo! Mail reacted quickly by raising its free mail quota to 250 MB and offering a Mail Plus plan which, for less than $20 per year, doubles the space offered by Google. Once again Microsoft is behind in this fight, although it has raised Hotmail's quota to 250 MB, so it will be interesting to see what MSN's response to Google's challenge is.
  • Desktop search: like many other notable advances in computing, the idea of a web search integrated with a search of the user's own local hard disk is not new: once again, Apple has been ahead of the curve with Sherlock, present in its MacOS operating system for several versions. But Google Desktop has been the first similar application of this principle in the Windows universe, and once again Google wins the battle on the Redmond giant's home ground, pre-empting the possibility that Microsoft might integrate its own MSN Search tool into the next version of Windows, as it did before with the Internet Explorer browser (which, in the end, left Netscape Navigator mortally wounded).

New players

Personalized search (Eurekster) or results interlinked by category (KartOO) are examples of the risky bets that new players in the web search arena are developing to challenge the leadership of the three giants. There is still much to experiment with on the Internet, and the previous experience of a company born in a garage, Google, which challenged and, at least for the time being, defeated the giant Yahoo!, encourages many entrepreneurs to take on the role of David against Goliath again.

Optimize, optimize, there is something left

Google's success demonstrated that users would adopt the search engines able to place the most relevant pages for a given query in the top positions. Faced with this laudable mission, webmasters first, and then SEO experts, have tried to place their websites in the highest positions, in the maximum number of categories, in order to obtain the greatest possible flow of traffic. The time of "tricks" (keyword saturation, hidden or invisible text, metatag saturation, link farms, mirror websites, continuous search engine submission requests) belongs to the past. Search engine algorithms have discovered each new "trick" at about the same rate as the "experts" discovered and exploited the weaknesses of the crawlers. This process of mutual confrontation has led to reciprocal improvement: search algorithms are now more difficult to "cheat", so they are progressively able to provide more relevant results, ordered in a hierarchy that benefits the search engine user. Webmasters, for their part, have discovered that it is more important to correctly identify the market niche they are targeting and to generate rich, high-quality, constantly updated content, within website structures built on refined, usable and accessible code, than to abuse absurd language meant to "please" only Googlebot.

In the field of web optimization, the immediate framework for action focuses on the following areas:

  • Common sense: what is the point of appearing first in a search if, when visitors land on our site, they find repetitive, absurd language and confusing link structures? In other words, do we have a website to have many visitors or to have many customers? Appearing first in the search results puts the prospect at the door of the store; until they choose something, pay and become a customer, there is still a long way to go.
  • Professionalization: the time of optimization "tricks" is over. Today, optimization work is the domain of real experts capable of generating coherent navigation structures, language that is meaningful to the search engine and natural for the visitor, inbound links from sufficiently important and relevant websites, and consultancy that can identify each website's market niche and the keywords that will deliver the highest conversion of visitors into customers.
  • Content: generating original and quality content is one of the most efficient investments, in terms of return on investment, that can be undertaken. Writing articles, participating as a moderator in forums or chats, developing e-books or newsletters translates, in the world of networks that is the Internet, into a rapid influx of qualified traffic not only from customers, but also from suppliers, collaborators, potential partners…
  • Inbound links: links from other important and related websites pointing to our website are one of the most effective ways to get qualified traffic to our website. There are several reasons for this.
    • Appearing as a recommended link in important portals positions us as experts or leaders in our sector before our potential customers (and our competitors).
    • The links pointing to our website help increase our PageRank, the measure of popularity with which Google expresses the importance of a website (and one of the parameters it takes into account when ranking our website in its results); a sketch of the idea follows this list.
    • Inbound links from other websites produce more qualified traffic both directly (people who click on those links) and indirectly (people who find us on Google because our site's greater popularity earns it better positions).
  • Usability: for a website to perform well (whether in sales, contacts or quality visits), it must be not only optimized for search engines, i.e. indexable (search engine friendly), but also optimized for the user, i.e. usable (user friendly). It is therefore important to take usability criteria into account when designing the navigation, layout and contents of the website.
  • Feedback: when optimizing, we cannot ignore which search terms and which search engines and other referrers produce the highest conversion rate to customers. Based on the analysis of this data, we will continue to reoptimize our website, not to obtain the highest number of first positions in search engines, but to achieve the highest profitability (which is what it is all about).
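
The intuition behind PageRank is public: each page divides its current score among the pages it links to, and a damping factor keeps the system stable. The following minimal Python sketch only illustrates that idea; the three-page graph, iteration count and damping value are hypothetical examples, not Google's production system:

```python
# Minimal power-iteration sketch of the PageRank idea described above.
# The damping factor d = 0.85 follows the original Brin & Page paper;
# the tiny example graph is hypothetical.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                      # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += d * rank[page] / n
            else:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share     # each outlink passes a share
        rank = new_rank
    return rank

# Hypothetical three-page web: B and C both link to A.
print(pagerank({"A": ["B"], "B": ["A"], "C": ["A"]}))
```

Page A, which receives links from both B and C, ends up with the highest score; that accumulation is precisely the effect that earning inbound links pursues.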

From information to knowledge

Conversion

This is what we will hear most about during 2005. The real goal for this year will be to convert an increasing percentage of visitors into customers. Getting top positions in search engines, although one of the most profitable investments, costs money; facing an AdWords campaign or any other form of pay-per-click is even more expensive. And the higher the level of traffic, the higher the cost of the bandwidth consumed and, sometimes, even the cost of recording traffic statistics. This is why the quality of traffic will be prioritized over quantity, which will require concentrating efforts on two aspects: positioning on the one hand, and measuring results on the other.
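
A toy calculation (all figures below are hypothetical) shows why quality can beat quantity: a generic keyword that brings many low-intent visits may earn less than a niche keyword that brings far fewer, better-qualified ones.

```python
# Toy comparison (hypothetical figures) of traffic quality versus quantity:
# the source with fewer visits can still be the more profitable one.

sources = {
    # name: (monthly visits, conversion rate, average order value in EUR)
    "generic keyword": (10_000, 0.002, 60.0),
    "niche keyword":   (1_500,  0.030, 60.0),
}

for name, (visits, conversion_rate, order_value) in sources.items():
    customers = visits * conversion_rate
    revenue = customers * order_value
    print(f"{name}: {customers:.0f} customers, {revenue:.0f} EUR revenue")

# generic keyword: 20 customers, 1200 EUR revenue
# niche keyword:   45 customers, 2700 EUR revenue
```

The niche keyword delivers fewer visits but more customers and more revenue, while consuming less bandwidth: exactly the trade-off described above.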

Positioning

It will be worth investing more time in identifying and selecting the key concepts for our market niche, as well as the most effective search engines and value-added portals for driving quality traffic to our website. The main efforts will be concentrated on:

  • Optimization will no longer be the preserve of the few. It is foreseeable that the generalization of web optimization procedures will tend to shift the competition between websites in the same sector from the field of optimization “tricks” (metatags, key term saturation, etc.) to that of the richness, usability and genuine interest for users of the contents of the websites, as well as references to them in the form of inbound links.
  • Quality code. More and more portals are using XHTML 1.0 code and CSS 2.0 cascading style sheets for their web pages. The flexibility and customizability of the design and the faster download speed of this type of code bring the added benefit of increased accessibility and indexability.
  • Content localization. Beyond the simple translation of texts, websites targeting a global audience tend to “localize” their content according to the linguistic and idiosyncratic peculiarities of each country. In the United States, people are beginning to consider the possibilities opened up by the juicy Hispanic market, which has now become the country’s largest minority. The European multicultural reality demands the adaptation of websites to an even greater variety of languages and interests. And we cannot ignore the growing interest that Chinese and Arab audiences are beginning to arouse in the West.
  • Usability and accessibility. Two principles that, in addition to making it easier for our users to become customers, reflect a greater social responsibility toward minorities and, in general, increase the indexability of our websites by search engines.

Web traffic analysis

Ranking reports and “raw” traffic statistics are therefore meaningless if they are taken as the sole measure of a website’s success. We will have to concentrate on selecting the most significant measures of our website’s performance (the so-called Key Performance Indicators) and establish a pattern of continuous analysis that will allow us to identify trends and anticipate changes. To be successful, this type of analysis must meet these conditions:

  • Each department establishes which measures are its performance indicators: the percentage of internal product searches that return a satisfactory result, the percentage of queries resolved in the support section, each section's share of overall traffic, the percentage of products added to the shopping cart, the percentage of purchases successfully completed, visitors' ratings of articles, recommendations of the site to other users, contacts made…
  • On the basis of these indicators, which must be identified case by case, it is advisable to plan a schedule for periodic monitoring of the various data. Instead of receiving hundreds of web traffic figures, the head of each department can concentrate on a few measures that he or she can monitor on an ongoing basis.
  • This monitoring allows each manager to propose improvement actions and to identify, in subsequent measurements, whether those actions have been successful: because efforts are concentrated in a single area with specific measures of its performance, corrective measures can be introduced quickly.
  • From these measures, the cycle starts all over again.
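
By way of illustration, the sketch below models one pass of this cycle in Python (departments, indicator names and targets are invented examples): each indicator is computed from raw counts and flagged when it falls below its target, which is the point at which its manager would propose a corrective action.

```python
# Minimal sketch of the KPI-monitoring cycle described above.
# Departments, indicator names, counts and targets are hypothetical examples.

from dataclasses import dataclass

@dataclass
class KPI:
    department: str
    name: str
    numerator: int      # e.g. successful internal searches this period
    denominator: int    # e.g. total internal searches this period
    target: float       # minimum acceptable rate

    @property
    def value(self) -> float:
        return self.numerator / self.denominator if self.denominator else 0.0

kpis = [
    KPI("Search",  "satisfactory internal searches", 870, 1000, 0.90),
    KPI("Support", "queries resolved online",        412,  500, 0.80),
    KPI("Sales",   "carts completing checkout",       95,  300, 0.40),
]

# Each manager reviews only their own few measures each period,
# then proposes a corrective action for anything below target.
for kpi in kpis:
    status = "OK" if kpi.value >= kpi.target else "NEEDS ACTION"
    print(f"{kpi.department}: {kpi.name} = {kpi.value:.1%} "
          f"(target {kpi.target:.0%}) -> {status}")
```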

Conclusion

The concentration of search engines into three main players (Google, Yahoo! and MSN Search) goes some way to simplifying the task of search engine optimization. However, the generalization of this type of strategy is beginning to erase the competitive advantage enjoyed for a time by the first websites to optimize their pages to be "liked" by the search engines. From now on, the positioning battle shifts from the "tricks" of false experts to more essential and complex matters: metatag optimization, yes, but also richness of content, code correctness, content generation for other websites, participation in forums and chats, active search for partners and inbound links, online training, usability, accessibility… In this scenario, increasingly sophisticated traffic analysis methods will let us extract valuable knowledge for our marketing decisions, not only for our online presence but also in the offline world. And, finally, all of this may prove within a year to be perfectly false… But that is still a year away and, since correcting oneself is the mark of the wise, we will start from this basis and keep an eye on the evolution of this exciting sector to avoid being caught unawares by the technological "tsunami".

Fernando Maciá
Founder and CEO of Human Level. Expert SEO consultant with more than 20 years of experience. He has been a professor at numerous universities and business schools, and director of the Master in Professional SEO and SEM and the Advanced SEO Course at KSchool. Author of a dozen books on SEO and digital marketing.
