Posts

PageRank: Caught in the paid-link crossfire

Last week the blogosphere was abuzz when Google decided to ‘update’ the PageRank numbers they display on the toolbar. It seems Google has made good on its threat to demote sites engaged in buying and selling links for search rankings. The problem is that some innocent sites were caught in the crossfire. A couple of days later, Google corrected the mistake, and those sites are now back where they were supposed to be.

The incident reveals that there is a lot of misunderstanding about PageRank, both inside and outside the SEO community. For example, Forbes reporter Andy Greenberg writes:

On Thursday, Web site administrators for major sites including the Washingtonpost.com, Techcrunch, and Engadget (as well as Forbes.com) found that their “pagerank”–a number that typically reflects the ranking of a site in Google

He also quotes Barry Schwartz saying:

But Schwartz says he knows better. “Typically what Google shows in the toolbar is not what they use behind the scenes,” he says. “For about two and a half years now this number has had very little to do with search results.”

There are two mistakes in these assertions:

  • The toolbar PageRank does not reflect a site’s ranking in Google. It reflects how ‘important’ Google perceives the site to be.

  • The toolbar PageRank is an approximation of the real PageRank Google uses behind the scenes. Google doesn’t update the toolbar PageRank as often as the real thing, but saying that it has little to do with search results is a little far-fetched (see the sketch below).
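
For context, the score behind the toolbar number is a continuous value computed over the link graph, while the toolbar shows only a coarse 0 to 10 integer (widely assumed to sit on a roughly logarithmic scale). Below is a minimal sketch of the classic PageRank power iteration, with a hypothetical bucketing step to mimic the toolbar display; the graph, damping factor and bucketing are illustrative assumptions, not Google's actual parameters.

    import math

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share
            rank = new_rank
        return rank

    def toolbar_value(score, max_score, buckets=10):
        """Hypothetical 0-10 display value; the real mapping is unpublished."""
        return round(buckets * math.log1p(score) / math.log1p(max_score))

    graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    scores = pagerank(graph)
    top = max(scores.values())
    print({p: (round(s, 3), toolbar_value(s, top)) for p, s in scores.items()})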

Several sites lost PageRank, but they did not experience a drop in search referrals. Link buyers and sellers use toolbar PageRank as a measure of the value of a site’s links. By reducing this perceived value, Google is sending a clear message about paid links: the drop is intended to discourage such deals.

Some ask why Google doesn’t simply remove the toolbar PageRank altogether so that buyers and sellers won’t have a currency to trade with. At first glance it seems like a good idea, but here is the catch: the toolbar PageRank is just a means of enticing users to activate the surveillance component that Google uses to study online behavior. Google probably has several reasons for doing so, but at minimum it helps measure the quality of search results and improve its algorithms. If Google were to remove the toolbar PageRank, users would have no incentive to let Google ‘spy’ on their online activities. Read more

A Google Allegory

When John Chow’s rankings dropped a few months ago, a lot of SEOs believed, and continue to believe, that Google banned him for selling links and wanted to set an example. It seems that many ignored his ‘review for a link back’ campaign, which was clearly designed to game Google and was the main driver of his former top ranking for “make money online.”

Now it seems that something similar happened to graphic designer David Airey, and many started to advise him to remove the paid links from his blog. That said, the site is still listed; it's just on search engine result page (SERP) 6. John Chow is listed too, coincidentally on the same SERP. Some say it might be a duplicate content issue. I have to agree with Jim Boykin, however, that there is no evidence that Google is dropping sites that sell links, and I can say from experience that duplicate content filters tend to keep at least one version of the content; they don't remove all of them!

One of the main problems I’ve seen in the SEO industry is that we formulate theories based on incomplete information. The fact that key information necessary to make our job easier remains closely guarded by the search engines (for obvious reasons) does not help either.

There is still some critical information missing here…

Will Google ever penalize sites for selling links? Read more

A detailed look at what can (and can’t) be automated in SEO

In my post about SEO automation, SEO expert Halfdeck expressed concern about the possibility of customers preferring a sophisticated SEO tool over a highly-trained SEO professional. I am sure many of my peers had the same thought running through their heads when they saw my post on Sphinn. The post even made it to the front page!

The reality is that we should embrace software progress in our field. We should look at software improvements and innovations as tools that can help us scale our own business and be more successful. Is it bad to use SEO software that does all the tedious work and helps us serve 20 clients per month instead of 10?

Man vs. Machine

If we look at the last 100 years, man has achieved an incredible amount of progress. Every invention brings a new level of comfort. You can do many things far more easily, faster and more efficiently than before. There is an irreversible momentum in this direction.

The fear of computers replacing the work we do is not unfounded, of course. Machines are replacing human labor every day. The more repetitive the work, the easier it is to get a machine to do it for you. On the other hand, the more creative and social the task, the harder it is for a computer to replicate.

There is software for almost any professional activity that we do, yet we (the professionals) are still in high demand. Nobody is going to trust his or her business to a computer blindly, after all. If something goes wrong, who would they complain to? ;-)

So the question becomes: do we resist change, or adapt and thrive? Read more

SEO can be automated!

… partially

Loren Baker has asked a thought-provoking question: “Can SEO be automated?” Coincidentally, he asked it just a day after we released a product at TechCrunch40 with exactly that goal.

It seems that the folks at Commerce360 are working to build a product similar to our RankSense. There is a fundamental difference in approach, however. We are not trying to replace the human element; we are trying to make the work humans do far easier and simpler. Truthfully, I don’t even think their goal of fully automated SEO is possible. In many ways search engine optimization is plain old marketing, and marketing is driven by creativity. No machine can quite claim to be creative just yet. Read more

Why you should target the most competitive keywords

Everybody writing about SEO will tell you that it is not a good idea to optimize your site for the most popular keywords in your niche. What are your chances of success if you tried to rank for “internet marketing,” a term with about half a million competing websites and most likely many savvy competitors? I want to tell you why I chose to ignore such advice years ago, and how doing so let me reach heights I couldn't have dreamed of. Of course, it is also clear why the guys at the top are so eager to give such advice: nobody likes to face more competition. ;-)
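
To make “competitive” concrete, the standard advice usually weighed demand against competition with a ratio; Sumantra Roy's KEI, which comes up in the next paragraph, is commonly defined as search popularity squared divided by the number of competing pages. Here is a quick sketch with invented figures:

    # Keyword Effectiveness Index (KEI) as commonly defined:
    # search popularity squared divided by the number of competing pages.
    # Higher values suggest more demand relative to the competition.
    # The keyword figures below are invented for illustration.

    def kei(monthly_searches, competing_pages):
        return (monthly_searches ** 2) / competing_pages

    keywords = {
        "internet marketing": (90000, 500000),  # huge demand, huge competition
        "seo for florists": (1300, 40000),      # modest demand, light competition
    }
    for kw, (searches, pages) in keywords.items():
        print(f"{kw}: KEI = {kei(searches, pages):,.1f}")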

I remember reading such advice five years ago, when Sumantra Roy's KEI was a key ratio for identifying keyword opportunities. I similarly recall an earlier period when I was still working on salary and planning to branch out on my own. I used to ask my friends and colleagues, mostly engineers, whether they thought starting a business was a good idea. Their answer was always that they didn't think so. “Why leave the security and comfort of a paycheck every two weeks?” “Why take unnecessary risks?” After a while I realized that I was asking advice from the wrong people. How could they provide advice about something they had no experience with? I decided to trust my instincts instead and put my confidence in taking calculated risks. Read more

Long tail vs fat head optimization strategies – Part 3

This is the final post in the long tail vs fat head optimization strategies series. The focus of this post is to expand on the optimization strategy for highly competitive keywords. We are going to leverage our insights from the link analysis explained in Part 2 to build better and smarter links.

The purpose of link analysis is to identify the link sources that are providing the ranking boost to your competitor. I explained several principles that are very important when evaluating links. Ideally, you will try to get the same sources that link to your chosen web authority to link to you as well (at least the most important or most authoritative ones). You will also want to make sure the links come with similar anchor text or textual context (the text surrounding the anchor), and complement all this effort with some traditional and out-of-the-box link-building tactics. Unfortunately, this is easier said than done.
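
As a rough illustration of the idea, the small sketch below compares your backlink sources against a competitor's and lists the sources worth pursuing first. The domains and authority scores are invented placeholders; real data would come from whatever link-research tool you use.

    # Which of a competitor's link sources don't yet link to you?
    # Domains and authority scores are invented placeholders.

    competitor_sources = {
        "news.example.org": 8,   # domain -> rough authority score (assumed)
        "dir.example.net": 5,
        "blog.example.com": 6,
        "forum.example.io": 3,
    }
    my_sources = {"blog.example.com", "forum.example.io"}

    gaps = sorted(
        (domain for domain in competitor_sources if domain not in my_sources),
        key=competitor_sources.get,
        reverse=True,
    )
    print("Link sources to pursue, most authoritative first:", gaps)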

Read more

Long tail vs fat head optimization strategies – Part 2

In my previous post, I explored how to assess the competitive level of your keywords and I shared my strategy for optimizing non-competitive keywords. As promised, here is my strategy for optimizing highly competitive ones.

As this is a rather dense topic I will split it in two. This post will explain how to use link analysis to understand your competitor’s rank, and the following post will explain how to leverage that information in your own link-building efforts.

Not all links are created equal

At the moment, we need lots of links to our sites. My strategy is to study the link structure of my chosen web authority carefully, as well as their incoming link text in order to build a similar relevance profile for my site. If I can get similar links and anchor texts, chances are that I will be ranking right next to my competitor.
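
As a rough sketch of what building a relevance profile can look like in practice, the snippet below summarizes a competitor's backlinks by linking domain and anchor text. The backlink entries are invented; in practice they would come from a link-research tool's export.

    # Summarize a competitor's backlink profile by linking domain and anchor text.
    # The backlink list is invented sample data.

    from collections import Counter
    from urllib.parse import urlparse

    backlinks = [
        ("http://example-blog.com/post-1", "internet marketing"),
        ("http://example-blog.com/post-2", "internet marketing tips"),
        ("http://news.example.org/story", "internet marketing"),
        ("http://dir.example.net/listing", "Acme Marketing"),
    ]

    domains = Counter(urlparse(url).netloc for url, _ in backlinks)
    anchors = Counter(anchor.lower() for _, anchor in backlinks)

    print("Top linking domains:", domains.most_common(3))
    print("Top anchor texts:", anchors.most_common(3))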

Unfortunately, just getting links to your site is not enough; you need to look for the right links. No two links are weighted exactly the same. As I explained before, the more pages that match a targeted query, the more the search engine needs to know about those pages to rank them properly. It is very important to understand this concept. It is the single most important reason why on-page optimization is not enough to compete for very popular keywords.

Just like on-page metrics, there are several metrics search engines use to evaluate links. Before you set out to perform link analysis and build links, there are some basic principles you need to learn. Read more

Long tail vs fat head optimization strategies – Part 1

Optimizing for highly competitive keywords requires a completely different strategy than optimizing for non-competitive ones. First, let’s clarify a few points. When I talk about long tail or fat head keywords, I am talking in relation to the search demand for those particular keywords. I am not talking about the supply (the number of sites competing for those keywords). Although demand and competition are generally in direct proportion, there are exceptions, such as unexploited niches.

In this post, let’s just explore the simple case where you are targeting non-competitive keywords. They have decent demand but not a lot of competition. You may be asking yourself how you can tell the competitive and non-competitive keywords apart. Read more

Adolescent Search Engines: They are growing up so fast!

Search engines are just like teenagers. Don’t believe me? Consider this analogy.

Let's say you have a teenage kid with a handful of friends. He knows them very well and even remembers their phone numbers by heart. He's bright, and it doesn't take him long to become very popular at school. Soon he has dozens of friends. While extremely intelligent, he doesn't have the memory to recall all his new friends' contact info, so he starts using a paper address book to keep track of them, looking them up by their initials.

Later, he discovers social media sites on the Internet and gets addicted. He gains hundreds of friends all around the world. He genuinely wants to stay in touch with them but his paper address book is no good and he upgrades to a web-based electronic one. Now he can find any friend by simply typing in the first or last name. After joining several social networks and starting his own blog, he has several thousand friends. Suddenly he is faced with another unforeseen challenge: many friends have the same name! He needs to use a differentiating piece of information, their country or city for example, to tell them apart. But in several cases even this fails; he has three friends in Korea named John Kim, and two of them in Seoul! He has to tell them apart by age.

Now imagine that this kid is a search engine and his friends are our web pages. Instead of a few thousand listings, search engines have to sort through billions to find what is being searched for. This is when things get really interesting. :-) Read more

The Truth About Sitelinks: Site structure is splendid, it seems

There has been a heated debate on Sphinn about a controversial post by Rand Fishkin of Seomoz. There is a lot to learn from that discussion, but instead of focusing on the debate, I want to talk about something that keeps coming up: Google's Sitelinks.


Google doesn't provide a lot of information, but this is what they say about the matter:

  1. Sitelinks are presented if they are found to be useful in some way.

  2. A site’s structure allows Google to find good Sitelinks.

  3. The process of selection, creation and presentation of Sitelinks is fully automated.

Let's forget the technical details for the moment and focus on Google's purpose here: they want to save users some clicks by pointing them to the right page directly in the search results. Sitelinks appear only for the first result, and only for sites with meaningful traffic. (Google uses toolbar data on visitor frequency to make this determination.)
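
Google has never published how Sitelink candidates are chosen, so purely as a toy model of the idea described above (prominent, frequently visited internal pages), the selection might look something like this; the pages, visit counts and cutoff are all invented.

    # Toy model only: Google has not published how Sitelinks are chosen.
    # Rank a site's internal pages by (invented) visit frequency, keeping
    # only pages that are linked from the site's main navigation.

    pages = [
        # (url, visits, linked_from_main_nav)
        ("/about", 1200, True),
        ("/blog", 5400, True),
        ("/contact", 900, True),
        ("/blog/old-post", 3000, False),
    ]

    candidates = [p for p in pages if p[2]]            # prominent in site structure
    candidates.sort(key=lambda p: p[1], reverse=True)  # most visited first
    sitelinks = [url for url, _, _ in candidates[:4]]
    print(sitelinks)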

I decided to dig deeper, study the sources, try some examples of my own and draw my own conclusions. I'd definitely like to have Sitelinks when people search for my blog, and I'm sure many of my readers here would like the same. Here’s what I learned… Read more