LongTailMiner v0.1 alpha — find invisible Long Tail keywords

I’m really enjoying this blogging thing! Every comment I get from my readers sparks a new idea that I feel eager to put into practice.

My reader, Andrea, mentioned she parses log files to mine for keywords as well. That is an excellent idea.

I decided to put that idea into code and here is a new tool to mine for long tail keywords.

To make really good use of it, I would set up a PPC campaign in Google with a “head keyword” in broad match, bidding at the minimum possible. Make sure your ads maintain good click-through rates (over 0.5%) to avoid getting disabled. Run it for a week or two (preferably more) and you will have a good number of search referrals and “long tail keywords” that people are actually looking for. You can later create good content pages that include those keywords. In most cases, long tail keywords are really easy to rank for with on-page optimization alone.

I will probably write a Youmoz entry with more detailed instructions on how to take advantage of this. In this way I can get more people to try it and get really valuable feedback.


LinkingHood v0.1 alpha — Improve your PageRank distribution

As I promised to one of my readers, here is the first version of the code to mine log files for linking relationship information.

I named it LinkingHood as the intention is to take link juice from the rich to give to the poor linking sites.

I wrote it in Python for clarity (I love Python :-)). I was working on a more advanced approach involving matrices and linear algebra, but some of the feedback on the article sparked a new idea, and to make it easier to explain I decided to use a simpler approach. The code would definitely need to be rewritten with matrices and linear algebraic operations to scale to sites with 10,000 or more pages (more about that in a later post). As it stands, this is primarily an illustration: it does everything in memory and is extremely inefficient in its current form.

I simply used a dictionary of sets: the keys are the internal pages and the sets contain the pages linking to them. I tested it with my tripscan.com log file and included the results of a test run.
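To make the idea concrete, here is a minimal sketch of that data structure, assuming the log lines have already been parsed into (internal page, referrer) pairs. The page names are made up for illustration; this is not the actual LinkingHood code.

from collections import defaultdict

# Minimal sketch: map each internal page to the set of pages linking to it.
# Assumes `pairs` holds (internal_page, referrer) tuples parsed from the log.
def build_link_map(pairs):
    links = defaultdict(set)
    for page, referrer in pairs:
        if referrer and referrer != "-":   # "-" means no referrer in the log
            links[page].add(referrer)
    return links

pairs = [("/index.html", "http://example.com/travel"),
         ("/index.html", "http://example.org/blog"),
         ("/deals.html", "-")]
link_map = build_link_map(pairs)
print({page: len(refs) for page, refs in link_map.items()})   # {'/index.html': 2}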


Log based link analysis for improved PageRank

While top website analytics packages offer pretty much anything you might need to find actionable data to improve your site, there are situations where we need to dig deeper to identify vital information.

One such situation came to light in a post by randfish of Seomoz.org. He writes about a problem common to most enterprise-size websites: they have many pages with no or very few incoming links, and far fewer pages that get a lot of incoming links. He then discusses some approaches to alleviate the problem, such as manually linking to link-poor pages from link-rich ones, or restructuring the website. I commented that this is a practical situation where one would want to use automation.

Log files are a goldmine of information about your website: links, clicks, search terms, errors, etc. In this case, they can be of great use to identify the pages that are getting a lot of links and the ones that are getting very few. We can later use this information to link from the rich to the poor by manual or automated means.

Here is a brief explanation of how this can be done.

Here is an actual log entry to my site tripscan.com in the extended log format: 64.246.161.30 - - [29/May/2007:13:12:26 -0400] "GET /favicon.ico HTTP/1.1" 206 1406 "http://www.whois.sc/tripscan.com" "SurveyBot/2.3 (Whois Source)" "-"

First we need to parse the entries with a regex to extract the internal page — between GET and HTTP — and the referring page, which appears after the server status code and the page size. In this case, after 206 and 1406.
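As a rough illustration, here is one way that parsing could look in Python. The regex and field names are my own for this sketch, not necessarily the exact expression used in the tools above.

import re

# Sketch of a parser for the extended/combined log format shown above.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

def parse_line(line):
    """Return (internal_path, referrer) or None if the line does not match."""
    match = LOG_PATTERN.search(line)
    if not match:
        return None
    return match.group("path"), match.group("referrer")

sample = ('64.246.161.30 - - [29/May/2007:13:12:26 -0400] '
          '"GET /favicon.ico HTTP/1.1" 206 1406 '
          '"http://www.whois.sc/tripscan.com" "SurveyBot/2.3 (Whois Source)" "-"')
print(parse_line(sample))   # ('/favicon.ico', 'http://www.whois.sc/tripscan.com')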

We then create two maps: one from internal page to page ID, and another from external linking page to page ID. After that we can build a matrix that captures the linking relationships between the pages. For example, matrix[23][15] = 1 means there is a link from external page ID 15 to internal page ID 23. This matrix is commonly known in information retrieval as the adjacency matrix or hyperlink matrix. We want an implementation that can preferably be operated from disk in order to scale to millions of link relationships.
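Here is a small in-memory sketch of those two maps and a sparse adjacency matrix. The example pairs are invented, and a real implementation would use a disk-backed matrix to scale.

from collections import defaultdict

# Sketch: assign IDs to internal and external pages and record links
# as matrix[internal_id][external_id] = 1 (a sparse dict-of-dicts).
def build_adjacency(pairs):
    internal_ids, external_ids = {}, {}
    matrix = defaultdict(dict)
    for path, referrer in pairs:
        i = internal_ids.setdefault(path, len(internal_ids))
        j = external_ids.setdefault(referrer, len(external_ids))
        matrix[i][j] = 1   # link from external page j to internal page i
    return internal_ids, external_ids, matrix

pairs = [("/page-a.html", "http://example.com/blog"),
         ("/page-a.html", "http://example.org/links"),
         ("/page-b.html", "http://example.com/blog")]
internal_ids, external_ids, matrix = build_adjacency(pairs)
print(matrix[internal_ids["/page-a.html"]])   # {0: 1, 1: 1}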

Later we can walk the matrix and create reports identifying the link-rich pages, the ones with many link relationships, and the link-poor pages with few link relationships. We can define the threshold wherever it makes sense (e.g. pages with more or fewer than 10 incoming links).
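The reporting step might look something like the sketch below. The page names, counts, and the threshold of 10 are made up; in practice the counts would come from the adjacency matrix built from the log.

# Sketch of the report: split pages into link-rich and link-poor
# based on how many incoming links each one has.
def link_report(incoming_counts, threshold=10):
    rich = {page: n for page, n in incoming_counts.items() if n >= threshold}
    poor = {page: n for page, n in incoming_counts.items() if n < threshold}
    return rich, poor

incoming_counts = {"/page-a.html": 25, "/page-b.html": 3, "/page-c.html": 0}
rich, poor = link_report(incoming_counts)
print("link-rich:", rich)   # {'/page-a.html': 25}
print("link-poor:", poor)   # {'/page-b.html': 3, '/page-c.html': 0}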

Why is it good to mix your incoming link anchor text?

I’ve been reading John Chow’s blog for a while and it is very interesting how he is getting a lot of reviews with the anchor text “make money online” in exchange for a link from his blog. He is ranking #2 in Google for the phrase “make money online.”

I know a lot of SEOs read John’s blog and are not alerting him to some potential problems with this approach. I like the guy and I think he deserves to know.

It is not a good idea to have most of your incoming links with the same anchor text. Especially if most links are pointing to the home page, and the rest of the pages don’t get any links, or very few of them do. Search engines, notably Google, flag this as an attempt to manipulate their results.

Nobody knows for sure how it works but Google has proven in the past that they can detect this and act accordingly.

My advice is to request variations of the target phrase for the anchor text with each batch. For example: make money online free, making money online, make money at home online, work from home, etc… Use a keyword suggestion tool to get the variations and make sure you include synonyms too.

I would also require reviewers to include a link to their favorite post in the review. This way the rest of the pages will get links too and look more natural.

This is documented on other sites as well. Please check:

http://www.marketingpilgrim.com/2007/01/google-defuses-googlebombs-does-this-change-link-building-practices.html

http://www.linkbuildingblog.com/2007/04/how_not_to_buil.html

http://diagnostics.googlerankings.com/anchor-text-link.html Case #2

http://www.webmasterworld.com/forum30/29269.htm

http://www.seobook.com/archives/000894.shtml

Your competitor is your best friend

As I mentioned earlier, for me success is about the what, the how, and the work. This is my simple formula.

Anywhere my customers or potential customers express their problems and frustrations is a place for me to dig up opportunities: forums, blogs, mailing lists, newsgroups, etc… Your what should be driven by your customers’ needs.

Most critical for success is how we do it. What sets us apart? What is our UVP (unique value proposition)? This is where following your best competitors closely pays off.

Nobody is perfect.  There is always a better way to do things or at least to appeal to another audience.

My approach is not to simply copy what my competitors are doing.  This is the easiest path, but it is very difficult to stand out by just being another XYZ.

I prefer to look at my competitors’ solutions as their prescribed answers to customers’ specific problems. The key here is that what needs solving is the customer’s problem, and there is rarely a single solution. My solution is how I would solve it better by leveraging my strengths.

The harder the link is to get, the more valuable it is

Links that are too easy or relatively easy to get do not help much in getting traffic or authority for search engine rankings.

If your link is placed on a page where there are several hundred links competing for attention, it is less likely that potential visitors will click than if the page only has a few dozen links.

The value of your link source is in direct relation to how selective that source is when placing links on the page and how much traffic the source gets.  The value also declines with the number of links on the page.

Google is understood to use algorithms to measure the importance and quality of each page. PageRank was invented by the Google founders and measures the absolute importance of a page. The TrustRank algorithm describes a technique for identifying trustworthy, high-quality pages. We cannot tell for sure to what extent Google uses these algorithms, if at all, or at least the publicly known versions. What we can say, based on observation, is that they do not treat all links equally and do not pass authority to your page from all of your link sources.
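For readers curious about the mechanics, here is a toy version of the published PageRank idea (a simplified sketch, not Google's actual implementation): a page's importance is spread over the pages it links to, damped by a factor d.

# Toy PageRank: `links` maps each page to the pages it links to.
# Dangling pages (no outlinks) are simply skipped in this simplification.
def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - d) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = d * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

links = {"home": ["about", "post"], "about": ["home"], "post": ["home"]}
print(pagerank(links))   # "home" ends up with the highest score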

Success on a $100 budget?

Patrick Saxon from Seoish.com has asked top names in the SEO industry a very useful question.  What most have missed is that Patrick has actually answered the question himself by writing the article.

First, he created a very useful piece of content, and second, he has received a large number of authority links from his peers.

He recently won a conference pass to SMX in Seattle from Aaron Wall, and he frequently comments and writes posts in the Youmoz section of Seomoz.org. I can only see him moving up. Congratulations Patrick on this cleverly created linkbait!

What would I do with $100-$500 if I had to start over again?  I hope I am allowed to keep my knowledge and experience and at least have the means to support myself for several months.

Give and you shall receive.

I would choose a topic I know a lot about and am passionate about, and invest the money in a domain name and in creating useful content. If I created the content myself, I would pay a professional to make it look better. I would host the content on a hosted blog such as wordpress.com or blogger.com.

After 20 or so posts I would use them as the source for an ebook to be sold from the website.

To build buzz I would leverage social media sites and I would start helping and offering suggestions to others in popular forums and blogs.  Readership will build up.

Patrick has pretty much done most of this. My only suggestion to him is to find or create a useful product for his audience. If he decides to stick with AdSense, I would definitely move those ads above the fold! Check the AdSense guidelines for better placement.

Assessing competitive levels

Critical to success is competing where we know we can excel.  This might sound obvious, but many entrepreneurs fail to identify exploitable opportunities.  Don’t get me wrong; I love competing.  There is no problem with dreaming big.  Even if we want to go after Dell or Microsoft, we have to find a really smart plan to achieve that.

Realistically it is wise to start very small and have a clear and smart plan to grow bigger.

I do this with SEO.  I always target niche keywords first — keywords that no other SEO or few others are targeting.  When I conquer those keywords, I move on to the more competitive ones.  This has the added benefit that my relevance profile looks natural to the search engines.

Here is a tip I use to find such keywords.

Google and other search engines let you search for words in the title, URL, body, and the text of the links pointing to web pages. You can use this information to assess whether there are savvy SEOs targeting that keyword niche.

It’s been well known to SEOs for a while that the text of the links pointing to a page carries enormous weight. You can practically rank on the first page for keywords that are not in the body text if you use the link text effectively. Many websites that rank high do not even contain those keywords on the page; they rank on the strength of their incoming anchor text.

Keyword competitiveness is usually measured by the number of pages listed for the keyword search. For example, a search for “seo” in Google returns 125 million results. Very competitive!

Searching for “allinanchor:seo” returns under 3 million results.  A lot of results but far fewer than the normal search.  A search for “allintitle:seo” returns under 5 million results.

To assess how competitive a search phrase is, I prefer to compare different searches: intitle, inanchor, intext, and inurl. This tells me to what extent websites are being actively optimized for the phrase. That is my real competition!
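As a rough illustration, here is how I might compare those counts once I have read them off the result pages. The numbers below are typed in by hand from the example searches above; this is not an API call, and the ratios are only a heuristic.

# Sketch: compare operator-restricted result counts against the plain search.
# Higher ratios suggest more pages are actively optimized for the phrase.
def competition_profile(counts):
    total = counts["plain"]
    return {op: round(n / total, 3) for op, n in counts.items() if op != "plain"}

counts = {
    "plain": 125_000_000,      # seo
    "allintitle": 5_000_000,   # allintitle:seo
    "allinanchor": 3_000_000,  # allinanchor:seo
}
print(competition_profile(counts))   # {'allintitle': 0.04, 'allinanchor': 0.024}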

Why start SEO and Affiliate Marketing with PPC?

1. Accurate keyword research.  There are numerous keyword research tools that help you identify keywords that people are searching for, their volume of searches, level of competition, etc… Unfortunately, every single tool has a critical problem: the source of the information.

Wordtracker relies on information from meta search engine Dogpile, and similar sources. Yahoo mixes plurals, singulars, and phrases typed in different order; the information reported is from the previous month. Google tries to estimate traffic and fails to provide good predictions most of the time. There are other popular tools that have similar problems.

Running a test PPC campaign for a week or two will provide actual and dependable statistics about the amount and quality of the traffic to be expected for each keyword.

2. High click-through titles and descriptions. Page titles and meta descriptions are what people will normally see in the search results. We need to provide an incentive for the searcher to click-through.

Unfortunately it is very tricky to test changing titles and meta descriptions for SEO. We need to be able to rank first!

PPC management tools are designed so that we can easily split test multiple ads and the system will tell us which ads perform better. When we find the winning PPC ads we can use them to create our titles and meta descriptions.

3. High converting landing pages. Having a high conversion rate and high-converting landing pages is not only important for our bottom line; it’s also very important for retaining top affiliates.

Another advantage of running test PPC campaigns is that we can tweak our landing pages until they give us the desired results.

Top affiliates measure a merchant’s effectiveness by its earnings per click (EPC) — how much they make from every click they send. You can offer large commissions, incentives, etc… What really matters is how well their traffic converts.

Even if you don’t plan to run a PPC campaign, it makes perfect sense to run at least one as a test to help you improve the results you will get with other channels.

Why does Google need a supplemental index?

Search engine researchers use two main concepts to measure the success of search engine algorithms: precision and recall.

Precision measures how accurate the algorithm is in finding the best matches for our searches.  Recall measures how comprehensive the algorithm is in finding as many relevant results as possible.
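To make the two measures concrete, here is a tiny worked example with made-up document sets (my own illustration, not tied to any specific engine).

# Precision = relevant results retrieved / all results retrieved.
# Recall    = relevant results retrieved / all relevant results that exist.
def precision_recall(retrieved, relevant):
    true_positives = len(retrieved & relevant)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

retrieved = {"doc1", "doc2", "doc3", "doc4"}   # what the engine returned
relevant = {"doc2", "doc3", "doc5"}            # what was actually relevant
print(precision_recall(retrieved, relevant))   # precision = 0.5, recall ≈ 0.67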

In their effort to fight spam, Google has filtered out a lot of pages that would otherwise rank. They have created a separate index for such pages, which they call the supplemental index.

Basically this is an effort to improve the comprehensiveness of the search engine — the recall.  If there are no pages in the main index for a particular search, they will at least have the supplemental results.