At Last! A Rock-Solid Strategy to Help You Reach High Search Engine Rankings

Every SEO strategy has its good and bad points

Search engines have long kept their ranking formulas secret in order to give users the most relevant results and to preclude malicious manipulation. Despite all the secrecy, though, it remains extremely difficult for search engine companies to prevent either direct or indirect manipulation.

We can broadly classify search engine optimization (SEO) strategies into two camps: passive and active. Passive optimization happens automatically as search engine robots scour the web, finding and categorizing relevant sites. Most websites listed on the search engine result page (SERP) fall into this category. If it were up to search engine companies like Google and Yahoo, this would be the only type of optimization that existed.

Active optimization takes place when website owners purposefully engage in activities designed to improve their sites’ search engine visibility. This is the kind of optimization we normally mean when we talk about SEO.

Let’s go one step further and classify these deliberate forms of optimization based upon the tactics employed. In general these are: basic principles, exploiting flaws, algorithm chasing, and competitive intelligence.

Basic principles are just that. We know that search engines, just like human beings, intuitively recognize that a website has a lot of information about, say, “classic cars” when that phrase appears many times in its text, when it links to other sites about classic cars, and when similar sites link to it. Using such basic principles, we can optimize our site by creating content-rich web pages that include the targeted keyword phrases. Moreover, we can attract high-quality, relevant links, and we can make our website navigation easy for both the search engine crawler and our visitors. This strategy is primarily used by so-called “white hat” SEOs. The problem is that search engines do not always rank every such relevant site highly. It seems unfair, but it’s true. Here is an example.

[Screenshot: results of an allintitle: search]
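To make these basic principles concrete, here is a minimal sketch (Python, standard library only) of how you might audit a single page for the on-page signals described above: the target phrase in the title, phrase frequency in the body text, and outbound links. The URL and phrase are placeholders, and what counts as a "good" number is left to your judgment.

```python
# A minimal sketch of a basic on-page audit. The URL and phrase below
# are placeholders, not real optimization targets.
import re
import urllib.request
from html.parser import HTMLParser

class PageScanner(HTMLParser):
    """Collects the <title>, visible text, and outbound links of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.text_parts = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "a":
            href = dict(attrs).get("href") or ""
            if href.startswith("http"):
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        else:
            self.text_parts.append(data)

def basic_signals(url, phrase):
    """Returns crude on-page measurements for one page and one phrase."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    scanner = PageScanner()
    scanner.feed(html)
    body = " ".join(scanner.text_parts).lower()
    return {
        "phrase_in_title": phrase.lower() in scanner.title.lower(),
        "body_mentions": len(re.findall(re.escape(phrase.lower()), body)),
        "outbound_links": len(scanner.links),
    }

if __name__ == "__main__":
    # Placeholder URL; substitute a page you want to audit.
    print(basic_signals("http://example.com/", "classic cars"))
```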

Exploiting flaws is a technique that involves altering a web page in such a way that it confuses search engine ranking algorithms and leads them to give a high ranking to an undeserving page. This strategy is primarily used by “black hat” SEOs. Of course, such tricks are often short-lived—search engine companies are on a constant lookout to plug these holes.

Algorithm chasing is an optimization technique based upon serious study of search engine patents, public research papers, and observable data from search engine result pages. The goal is to identify patterns that might give a near approximation of the ranking formulas employed by major search engines, and thus provide a clear path toward search engine optimization. This strategy is primarily used by what we call “gray hat” SEOs. The drawback to this approach is its inherent difficulty, compounded by the fact that search engine companies regularly change their algorithms.

To illustrate how difficult this actually is, take a moment to consider all the ranking factors listed by Seomoz.org (http://www.seomoz.org/article/search-ranking-factors). This is not a scientific deconstruction of search engine algorithms, but rather a poll of expert opinions—it’s what SEO experts believe is going on in those elusive data centers. Most estimations are no doubt based upon empirical evidence, but some are nothing more than educated guesses. Between 2005 and 2007, these perceptions changed many times. Of course, so did the algorithms employed by Google, Yahoo, Microsoft and others.
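To see what the empirical side of algorithm chasing looks like in practice, here is a minimal sketch of the kind of correlation study behind reports like the one above: measure one candidate factor for the page at each SERP position and test whether it tracks rank. The inbound-link counts below are invented for illustration; a correlation of this kind is only suggestive and never proves the factor is in the formula.

```python
# A minimal sketch of a ranking-factor correlation study: compute the
# Spearman rank correlation between SERP position and one measured
# factor. The link counts below are invented for illustration.

def rank(values):
    """Assigns 1-based ranks; tied values share their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Pearson correlation computed on the ranks of xs and ys."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# SERP positions 1..10 and an invented inbound-link count for each result.
positions = list(range(1, 11))
inbound_links = [420, 380, 510, 150, 220, 90, 130, 60, 45, 70]

# A strongly negative value means the factor rises as position improves
# (position 1 is "best"). Correlation is never proof of causation.
print(round(spearman(positions, inbound_links), 3))
```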

Competitive intelligence is based on the idea that optimization can be achieved by carefully examining high-ranking websites, their content and their links. With just a basic understanding of how search engines operate, we can deduce how those sites achieved their high rankings and then apply that knowledge to the sites we want to optimize. This strategy is also used by gray hat SEOs. The technique can be very effective, but it’s worth noting that websites achieve prominent rankings for varying reasons. Some rank high for just a few days or weeks, and others for several years. Choosing the wrong sites for competitive intelligence can lead to unreliable conclusions.
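As a sketch of where competitive intelligence might start, the snippet below pulls the same rough on-page measurements from several top-ranked pages so they can be lined up side by side. The URLs are placeholders for whatever actually ranks for your term; the regex-based parsing is deliberately crude, and a real study would also examine backlinks, which no single page fetch can reveal.

```python
# A minimal sketch of competitive intelligence: collect the same crude
# on-page measurements from several competitor pages for comparison.
# The URLs below are placeholders for the pages that rank for your term.
import re
import urllib.request

def snapshot(url, phrase):
    """Fetches one page and returns a few comparable on-page numbers."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    title = match.group(1).strip() if match else ""
    return {
        "url": url,
        "title": title[:60],
        "phrase_in_title": phrase.lower() in title.lower(),
        "phrase_count": len(re.findall(re.escape(phrase), html, re.I)),
        "outbound_links": len(re.findall(r'href="https?://', html, re.I)),
    }

if __name__ == "__main__":
    competitors = ["http://example.com/", "http://example.org/"]  # placeholders
    for url in competitors:
        print(snapshot(url, "classic cars"))
```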

The need for a more dependable SEO strategy

We know that a search engine’s job, first and foremost, is to rank the most relevant websites highest for each and every search we make. These sites are considered web authorities. For example, you couldn’t imagine a web search for “Apple Computer” not listing Apple Inc.’s homepage. Search engines need to find these web authorities and list them first.

The strategy that I’d like to propose combines competitive intelligence with web authorities. That is, we must devise techniques designed to identify such web authorities for the target terms we want to optimize, and apply competitive intelligence to understand why those sites get to be top-ranked. Only by optimizing our sites based on this knowledge can we truly succeed.
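The tactics for actually finding web authorities are promised for future posts, but one well-known published technique makes a useful baseline: Kleinberg's HITS algorithm, which scores pages as hubs and authorities using nothing but the link graph. The toy graph below is invented; in practice you would build it from the pages ranking for your target term plus the pages linking to and from them.

```python
# A toy sketch of Kleinberg's HITS algorithm, a published technique for
# identifying "authorities" in a link graph. The graph below is invented.

def hits(graph, iterations=50):
    """graph maps each page to the list of pages it links to."""
    pages = set(graph) | {p for targets in graph.values() for p in targets}
    auth = {p: 1.0 for p in pages}
    hub = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority grows with the hub scores of its in-linkers.
        auth = {p: sum(hub[q] for q, targets in graph.items() if p in targets)
                for p in pages}
        norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        auth = {p: v / norm for p, v in auth.items()}
        # A page's hub score grows with the authority of pages it links to.
        hub = {p: sum(auth[q] for q in graph.get(p, ())) for p in pages}
        norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        hub = {p: v / norm for p, v in hub.items()}
    return auth, hub

# Invented link graph: siteD is linked to by most of the hubs, so it
# should emerge as the strongest authority.
graph = {
    "siteA": ["siteD", "siteE"],
    "siteB": ["siteD"],
    "siteC": ["siteD", "siteE"],
    "siteE": ["siteD"],
}
authorities, _ = hits(graph)
print(sorted(authorities.items(), key=lambda kv: -kv[1]))
```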

Let’s call this strategy SEO Intelligence. At first glance, our technique might seem gray or even black hat. In reality, it is neither. We must always keep in mind that, in order to optimize our site and become a web authority, we still need to provide truly useful content. Only then will visitors and the topical community link to our site and grant the authority we need.

A word of caution

One of the most frustrating parts of learning search engine optimization is the large number of often conflicting ideas and suggestions from SEO experts in blogs and forums. Evidence of this is the “Most Controversial Factors” section of the 2007 ranking factors report. The problem is that some ideas and suggestions are “gut feelings”—conclusions based on incomplete experiments, a misunderstanding of basic principles, or observations made obsolete by ever-changing ranking algorithms. As a rule of thumb, it is good practice to follow only advice that is backed by observable evidence or that you can experiment with on your own to confirm or debunk.

In future posts I will discuss the tactics and techniques necessary to identify web authorities.

20 replies
  1. Hamlet Batista says:

    Paul,

    I am glad you like it. My web designer helped me improve the blog a little bit.

    He is currently working on a new and original theme for my blog. It's going to take a few days to be finished, but I think it is going to be worth it.

  2. Paul Montwill says:

    "SEO Intelligence" – SEO became a science and there are so many different factors that need to be taken into consideration that being an SEO expert means – research, analysis, knowledge exchange. It is getting more difficult every day because there are more and more web pages on the Internet but the SERP is still the same – 10 places. I am wondering what the situation will be in 10 years. 3D SERP? More advanced vertical search? I am happy with the trend – you can't just jump into SEO these days and have great results in one month. You need to learn a lot, work hard, stay motivated and keep improving your analysis skills every single day. Getting crowdy, ha? That is why we need more sophisticated methods and ways to improve our websites. Thanks for helping us with this journey, Hamlet.

    • Paul Montwill says:

      I agree, but it wasn't like that a few years ago, when any kid who knew something about it could jump onto the first page. And search engines were not as smart as they are now.

  3. nicknick says:

    I wouldn't say that they are any smarter. Just more complex. I still see a lot of poor quality sites that rank in the top 10. If they were smarter, they'd have improved their quality. They are certainly harder to manipulate than they used to be.

  4. Paul Montwill says:

    For me, smarter and more complex are the same thing.

    "I still see a lot of poor quality sites that rank in the top 10."

    That does not mean that search engines are not getting smarter. They are not perfect.

    "If they were smarter, they'd have improved their quality."

    How do you know they haven't?

    "They are certainly harder to manipulate than they used to be."

    So they are smarter? :-)

  5. Kenric says:

    SEO will just keep evolving as time passes. Search engines are really young when you consider how recently they were created. There will always be a constant battle between SEOs and SE programmers.

  6. Rex says:

    "One of the most frustrating experiences about learning search engine optimization is the large number of often conflicting ideas and suggestions from SEO experts in blogs and forums… The problem is that some ideas and suggestions are “gut feelings”….it is generally good practice to follow advice that is backed with observable evidence …."

    I like this. Reading the SEO forums can be confusing and misleading. What we really need is a set of proven principles. I'm looking forward to your next post on this topic.

    One such example of steps that lead to SEO results is the wonderful story of Julian Paling, a hardworking London SEO, who describes "How I got to Number 1 in Google." I think you'll enjoy it.
    http://dfinitive.com/blog/seo/how-i-got-aerobed-t

  7. Bobby says:

    Making that important distinction between passive and active search engine optimization — and then using the advanced principles you list and describe — is key to seeing a high search rank in an awfully competitive digital world. Thank you for sharing!

