John Pushed Google's Limit and got Dropped! Should you push it too?

June 29th, 2007 · 10 Comments

I just learned that John Chow's rankings dropped, even for his own name! He suspects it has something to do with his review-for-a-link-back promotion. I am sure that is the problem.

As of yesterday, I was not ranking for my own name. My problem was that when I moved from wordpress.com to my own server, I had http://hamletb.wordpress.com and http://hamletbatista.com, both with the same content. Fortunately, it was very easy to fix. I made hamletb.wordpress.com redirect to hamletbatista.com, and requested a removal of hamletb.wordpress.com from Google's index via Webmaster Central.
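For anyone making a similar move between self-hosted domains, the fix is a server-level 301 redirect so the old URLs pass visitors (and their link credit) to the new ones. A minimal Apache .htaccess sketch, with the old hostname as a placeholder (note that hosted wordpress.com blogs can't edit .htaccess, so the redirect there has to be configured through WordPress.com itself):

```apache
# Hypothetical example: permanently redirect every request on the
# old host to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^oldblog\.example\.com$ [NC]
RewriteRule ^(.*)$ http://hamletbatista.com/$1 [R=301,L]
```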

Some useful SEO tweaks I made to my blog:

1. I installed the All in One SEO plugin so that I can have unique titles and descriptions. The meta keywords tag is not really useful for SEO.

2. All category pages, etc. have a meta robots noindex tag to avoid duplicate content.

3. I submitted an XML sitemap of my blog to Google.
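The noindex tweak in step 2 comes down to a single tag in the head of each category or archive page. A plugin can emit it for you, but the generic form is just:

```html
<!-- Keep this page out of the search index,
     but still let crawlers follow its links -->
<meta name="robots" content="noindex,follow" />
```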

Would these steps solve John's case? I don't think so.

As compelling as it seems to get a link and traffic from a high-PageRank website such as his, I chose not to do any review for a link back. I have linked to his site several times, but never as a review in exchange for a link. Why?

I understand search engines. I've been there and I have done that. I've learned my lesson. I know very well they don't like any type of manipulative behavior, no matter how creative it is. John was doing great as long as Google was not aware of what he was doing.

He says that he will change the rules, but I don't think it matters whether he changes them or not. Google is already aware of his efforts, as he is doing it in the open.

If you are one of his reviewers or are following the same strategy, I wouldn't be surprised if something similar happened to you.

John says it doesn't matter; he only got a few visitors each day from those rankings. Well, those were free visitors from a term that costs a lot of money in Google AdWords. Is it worth losing such a ranking?

I have to say that I agree with him when he says that it is not a good idea to put all your eggs in one basket. You need to have a comprehensive marketing mix for your blog.  Ignoring Google is definitely not on mine.


Category : Blog

10 Comments → “John Pushed Google's Limit and got Dropped! Should you push it too?”


  1. Hamlet Batista

    6 years ago

    Jez,

    What I am saying is that by following John Chow's links Google can find other bloggers doing the same thing. Sounds like a good time to stop.

    Preventing Googlebot from accessing duplicate content pages is a good way to reduce the number of pages in the supplemental index. I did not change my robots.txt. I used the meta robots noindex tag.

    Reply

  2. Jez

    6 years ago

Sorry, I misread your article regarding the robots method.

Do you believe reducing supplemental pages has a positive effect, though? It seems an open issue from what I have read… I suppose the answer is to try it like you are…

    Reply

    • Jez

      6 years ago

I suppose at the very least it makes analysis of your site easier if there are fewer supplemental pages….

      Reply

      • Hamlet Batista

        6 years ago

        Jez,

        My take on supplemental pages is very different from the majority of SEOs. I plan to write a detailed post about it soon.

        Reply

  3. Scot Smith

    6 years ago

    What do you think about only tagging each post with 1 category to prevent duplicate content?

    Does it make a difference?

    Reply

    • Hamlet Batista

      6 years ago

      Scot,

      Thanks for your comment. I think that would be very unfortunate for your visitors. For me, visitors come first.

      If Google finds duplicate content in your blog, the worst that could happen is Google listing your category page in the results instead of the actual post. To prevent this I simply block access to the category pages via robots.txt or meta robots noindex tag.

      Reply

  4. Jez

    6 years ago

Regarding John Chow, I noticed he has lost all the long-tail phrases I used to be able to find him with. Looking for information on AdSense often used to list him.

    A good example is

    “the internet’s biggest google whores”

That search brings up a lot of stuff related to JC; that was probably his most popular post… but no direct link to JC.

I think this is different from the last penalty he had, which seemed more keyword-specific… this time it seems as though his site has been penalized across the board…

The losses of 150 hits per day were for specific keywords; his total SE traffic was a lot higher… over 60k visitors per month, I believe… I would imagine mostly from Google…

    Reply
  5. [...] John pushed Google’s limit and got dropped! Should you push it too? [...]

    Reply

  6. MLM

    5 years ago

I think Google really is just trying to stop people from buying their way into the natural search engine listings.

If all someone had to do was go out and pay for 1,000s of reviews, they would dominate the search engines.

Just my thought. Great post.

    Reply

  7. Jez

    6 years ago

    Hi Hamlet,

I was thinking the same thing today regarding his reviewers… are you implying that John Chow is becoming a bad neighborhood?

    Review for a linkback was a bit of a scam anyway IMO.

    Another thing I wanted to ask you was your opinion on the supplemental pages. You say you have modified your robots file, do you know whether this has an impact?

    Jez

    Reply
