Maximum visibility on the Internet for 2009

Written by Fernando Maciá

If Web managers could write their own wish list to the Three Wise Men of Cyberspace, maximum visibility for their Web site on the Internet would surely top it. This visibility, as the factor that generates quality traffic, together with convertibility, as the factor that transforms that traffic into a positive impact on the income statement, are the two critical aspects that determine a Web site's performance and, ultimately, its return on investment.

Search engine optimization, traditionally understood as tuning a Web site's code and content and cultivating external links as the determining factors of its relevance, and therefore of its position in search engine results, has in recent years been the fundamental competitive advantage of the Web sites that pioneered the application of search engine marketing strategies.

However, over the last two years three circumstances have emerged that will, in the short term, turn search engine optimization as traditionally understood into a necessary condition to compete, but no longer a sufficient one:

  • Generalization of SEO activity: more and more companies include natural positioning strategies both in the development phase of their Web sites and in their day-to-day management. The principles and recommendations to apply at the code and programming level are now public knowledge rather than the exclusive preserve of a few. Like any other competitive advantage, it ceases to be one as soon as its application becomes widespread.
  • Improvement of search engine algorithms: the margin for influencing the relevance calculation through easily manipulated aspects such as programming, code or link buying narrows every day. Search engines have reached such a level of filtering of optimization strategies that work in this field must be ever “finer” to avoid mistakes such as over-optimization or buying text links, which search engines have begun to penalize more frequently and more quickly. And the tendency to incorporate users’ own criteria into the results will only further reduce the room for maneuver on the programming side.
  • Other traffic sources compete with search engines in lead generation: social networks play an increasingly important role here. In certain market niches, social aggregators, opinion leaders and social bookmarking sites will tend to generate, if not as much traffic, then possibly as many leads as general search engines.

This is why search engine optimization must progressively open up to other fields in order to guarantee its objective, which is none other than finding quality traffic for a Web site wherever it can be found. And it is from this more global perspective that we can identify a series of trends gaining weight in the mix of activities that online marketing companies must develop to achieve maximum visibility on the Internet:

  • Social Media Optimization (SMO): we can understand social media optimization as the company’s online public relations. The Web 2.0 model empowers the individual who, taking advantage of the reach the Web affords, can shape opinion and influence other people’s purchasing decisions. Properly managing the company’s presence on social networks, taking care of its online corporate reputation and stimulating positive opinion toward its brand and its offering of products and services should become one of the priority objectives in the medium term.
  • Domain reputation: concepts such as TrustRank, and the many emerging models whereby search engines weigh a domain’s situation in the Web ecosystem, its links, its history and even users’ post-search behavior alongside the traditional relevance calculation, narrow the room for maneuver of those search engine optimization professionals who focused on exploiting algorithm weaknesses to their advantage. They open a new scenario in which identifying the genuine interests of the audience, and responding to them efficiently, will weigh more when it comes to rewarding with greater visibility the Web sites that really deserve it (a minimal sketch of the TrustRank idea follows this list).
  • Proactivity and content intelligence: precisely along these lines, the companies that demonstrate the greatest proactivity in content generation will win the day: proactivity fed by identifying demand trends and anticipating those changes, so as to generate content that responds to future search trends before the competition does. Companies that rank at the top for these future searches will achieve the best search engine visibility time and again, while those that simply react to changes will always “lag behind”. It is not about what you know now, but the speed at which you are able to keep learning.
  • Traffic analysis intelligence: the focus shifts to identifying key performance indicators. Where do my visitors land? Where do they navigate to? Why do they abandon the purchase process? Why don’t they make more calls? How can I increase conversion rates? How many leads does my Web site generate? More than measuring traffic, this means interpreting Web analytics data (a short KPI sketch also follows this list).
  • Stricter limits on traditional SEO strategies: search engines have learned to identify positioning techniques that focused on exploiting their weaknesses. Recent penalties on Web sites that had engaged in the buying and selling of text links or that showed clear signs of over-optimization of on-page relevance indicate that in the future, anything that is not – and, moreover, appears to be – totally natural for relevance purposes will set off increasingly sensitive alarms in search engines.
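
As a brief aside on the domain reputation point above: TrustRank, as published by Gyöngyi, Garcia-Molina and Pedersen in 2004, propagates trust from a hand-vetted seed set of pages through the link graph, so pages far from any trusted seed accumulate little trust. What follows is a minimal Python sketch of that idea, not a search engine’s actual implementation; the toy link graph, the seed set and the parameter values are all hypothetical.

def trustrank(graph, seeds, damping=0.85, iterations=50):
    """graph maps each page to the pages it links to; seeds are trusted pages."""
    pages = list(graph)
    # Static trust is split evenly among the seed pages, zero elsewhere.
    d = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    t = dict(d)  # start the iteration from the seed distribution
    for _ in range(iterations):
        nxt = {p: (1.0 - damping) * d[p] for p in pages}
        for p, outlinks in graph.items():
            if outlinks:
                # Each page passes a damped share of its trust to its outlinks.
                share = damping * t[p] / len(outlinks)
                for q in outlinks:
                    nxt[q] += share
        t = nxt
    return t

# Hypothetical five-page web where A and B are the editorially vetted seeds.
web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["D"],
    "D": ["E"],
    "E": [],  # far from the seeds, so it ends up with the least trust
}
for page, score in sorted(trustrank(web, {"A", "B"}).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 4))

The intuition matches the article’s point: a page earns trust by being close, in link terms, to pages vetted by humans, which is far harder to manipulate than on-page code.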
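
And on the traffic analysis point: the questions above map directly onto a handful of metrics. Below is a minimal Python sketch, with entirely hypothetical session records, of how raw data turns into the bounce, abandonment and conversion figures a Web manager would actually act on; in practice these would come from a Web analytics tool.

# Each record summarizes one visit; the fields and figures are made up.
sessions = [
    {"landing": "/home",    "pages_viewed": 1, "started_checkout": False, "purchased": False},
    {"landing": "/product", "pages_viewed": 4, "started_checkout": True,  "purchased": True},
    {"landing": "/product", "pages_viewed": 3, "started_checkout": True,  "purchased": False},
    {"landing": "/blog",    "pages_viewed": 2, "started_checkout": False, "purchased": False},
]

total = len(sessions)
bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)  # left after one page
checkouts = [s for s in sessions if s["started_checkout"]]
purchases = sum(1 for s in checkouts if s["purchased"])

print(f"Bounce rate:      {bounces / total:.0%}")
print(f"Cart abandonment: {(1 - purchases / len(checkouts) if checkouts else 0):.0%}")
print(f"Conversion rate:  {purchases / total:.0%}")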

Ultimately, what these trends confirm is that the most easily manipulated aspects, whether programming aimed at influencing on-page relevance or strategies such as the exchange or purchase of links to influence off-page relevance, will carry less and less weight in search engines’ relevance calculations, and their effect on a Web site’s position in the results, and hence on its visibility, will shrink accordingly.

On the other hand, aspects that are more difficult to manipulate, such as domain history; original, high-quality and constantly updated content; a good online corporate reputation; a low bounce rate; and the ability to identify demands, anticipate changes and generate content for them before the competition does, will play a more important role in the future.

The gap between what influences search engines, the programming-related aspects, and what users like, the marketing-related aspects, is narrowing, and this trend will only accelerate in the short term. So if your goal is a sustainable competitive advantage, make sure your Web site meets all indexability requirements at the code level: this will remain a prerequisite (a minimal sketch of such a check appears below). But once those requirements are met, focus on knowing your potential customers well: who they are, how they search, on which social networks they exchange opinions, and what they will be interested in six months from now. The Internet is a race with no finish line: it is all about staying ahead for as long as possible.
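
For reference, here is a minimal Python sketch of what checking two of those baseline indexability requirements might look like: whether robots.txt allows crawling of a URL, and whether the page carries a “noindex” robots meta tag. The URL is a placeholder and the check is deliberately crude; a real indexability audit also covers HTTP status codes, canonical tags, sitemaps and rendering, among other signals.

import re
import urllib.request
import urllib.robotparser
from urllib.parse import urlparse

def check_indexability(url, user_agent="Googlebot"):
    parts = urlparse(url)

    # 1. Does robots.txt allow this user agent to fetch the URL?
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    allowed = rp.can_fetch(user_agent, url)

    # 2. Does the page carry a robots meta tag containing "noindex"?
    # Crude pattern: assumes the name attribute comes before content.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    html = urllib.request.urlopen(req).read().decode("utf-8", errors="replace")
    noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        html, re.IGNORECASE))

    return {"robots_txt_allows": allowed, "meta_noindex": noindex}

# Placeholder URL; substitute the page you want to audit.
print(check_indexability("https://www.example.com/"))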

Fernando Maciá
Founder and CEO of Human Level. Expert SEO consultant with more than 20 years of experience. He has been a professor at numerous universities and business schools, and director of the Master in Professional SEO and SEM and the Advanced SEO Course at KSchool. Author of a dozen books on SEO and digital marketing.
