This post was actually written over a month ago, but the Free Urchin release by Google prompted me to post it now.
I believe that search engines, in general, need to look seriously at what drives users to search and at how to maximize their revenue from search, in order to create a virtuous cycle: the more money you make from search driven by user satisfaction, the more that success tends to perpetuate itself.
In order to compete seriously with Google, emerging search engines (A9, MSN, etc.) need to understand the driving force behind their revenues (or planned revenues), and then apply it to their business in order to reach deeper into the consumer's mind (and wallet, or so it seems). I believe that PPC and natural search (with revenue-sharing links) can potentially merge. Issues like click fraud still plague advertisers; rev share removes this risk. The fact that CPCs keep rising should also tell you that businesses are underpaying for their traffic on the PPC engines, which is what allows affiliates like us to make a lot of money doing the arbitrage work.
Here is an alternative strategy to search:
The three cornerstones of search are generally considered to be (as defined by John Battelle in his book, The Search):
1. The Crawler
2. The Index
3. The User Interface
There is, however, a fourth, in my opinion: the Customer Interaction. Poor-quality results and websites tend to drive more repeat searches per user, and less revenue per user. By monitoring the interaction between a user and a website, you can increase both the quality of the listings and the revenue of the search engine. The major search engines do not spend enough time investigating post-click conversion metrics on natural results, and this is the untapped gold mine of the search engine industry.
(Added: 15 November 2005
Google's recent move into Web Analytics is definitely due to their foresight into this area.)
To understand Customer Interaction, you need to split the traffic intent of the user between Commercial & Non-Commercial. I’m going to focus on Commercial Natural Search Results for the purposes of this post, but the principles of usability would still apply to results that are of a non-commercial nature.
If the quality of websites in an index was monitored with respect to customer interaction, relative to the keyword that was searched on, you would probably find that search engine spam results would be quickly dropped to the bottom of the listings, organically. Search engine spam is almost akin to email spam – high volume, low conversion – high irritation.
Natural traffic should be ranked independently, based on the user experience for a given keyword; independent rankings, however, should not necessarily mean non-monetization of natural traffic. In the past, Inktomi was used to generate independent "Trusted Feed" organic listings, but the downfall was that advertisers could not control the keywords that they paid on. The market moved to PPC, but in my opinion PPC is just a pit stop on the way to Cost Per Action – the metric that most smart PPC marketers already use. Snap is trying to achieve much of what I describe here, although I don't believe their knowledge of the affiliate marketing space is up to scratch to execute on this strategy; building direct relationships with each website would take years. The idea should be to build a platform and leverage that platform to build the business.
At the rate that Google is growing, competing search engines like Yahoo need a solid and ground-breaking strategy in order to reverse this growth trend, and I believe strongly that it lies in "the Customer Interaction".
PageRank is dead. Links are becoming less important, due to the rise of RSS feeds. Websites are no longer pointing to other sites; they are syndicating the content on their own sites. Google is full of search engine spam, mostly as a result of duplicate content feeds and syndication – can link popularity ever be trusted again? The use of inbound links as a method of determining authority is being eroded, and will continue to erode, over time.
The other flaw with links is that by using CSS, which only became popular after PageRank, you can essentially put a link up once on your website and have it propagate throughout the site, seamlessly. Search engine spammers are also notorious for buying links from PR9 and PR10 sites to inflate their own PageRank.
Who can we trust?
How about you let the customer decide what they think is most relevant…
A user searches on the keyword "Star Wars DVD" in Google. The top 3 results are for Amazon, Buy.com and eBay. Currently, you do not know which of the three the user actually purchased from. This is a huge slice of data missing from your databases – can you imagine the potential to optimize the results based on the Customer Interaction? Firstly, you would find the most relevant results moving to the top of the listings, and secondly, you would finally be able to begin monetizing the natural search market – something even Google has not yet achieved. Remember that I'm not saying you should change the results from the index – on the contrary – serve the most relevant results, even if the site does not have a monetization program (i.e. no commerce transaction), but if you're going to send traffic to Amazon, at least use an affiliate link to gather both data and commissions.
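As a rough illustration of how that missing purchase data could feed back into the rankings, here is a sketch in JavaScript. The merchants, click counts and purchase numbers are all hypothetical, invented purely for the example:

```javascript
// Hypothetical post-click interaction data for the query "star wars dvd":
// clicks sent to each result, and purchases reported back via affiliate tracking.
const interactionLog = {
  "amazon.com": { clicks: 1000, purchases: 42 },
  "buy.com":    { clicks: 800,  purchases: 9 },
  "ebay.com":   { clicks: 950,  purchases: 31 },
};

// Re-rank results for a keyword by observed conversion rate.
// Sites with no interaction data simply keep a zero score.
function rerankByInteraction(results, log) {
  const convRate = (site) => {
    const d = log[site];
    return d && d.clicks > 0 ? d.purchases / d.clicks : 0;
  };
  return [...results].sort((a, b) => convRate(b) - convRate(a));
}

console.log(rerankByInteraction(["buy.com", "amazon.com", "ebay.com"], interactionLog));
// Amazon (4.2% conversion) rises above eBay (~3.3%) and Buy.com (~1.1%)
```

In practice the conversion signal would be one input among many, blended with the crawler/index relevance score rather than replacing it, but the principle is the same: let observed Customer Interaction reorder commercially equivalent results.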
There are a number of ways to achieve what is written above, without impacting the quality of the results.
1. Integrate Commercial Natural Search results with Affiliate Programs
This is actually easier done than said. For example, a user searches for "Blink Malcolm Gladwell Book" on Google. The #1 link is (as expected) Amazon. Now, instead of sending the traffic to Amazon for free and trying to figure out if this is what the user was looking for, send it via an affiliate link integrated into their API (our Catalog Server does this, for instance – pretty easy). The user is then cookied, and based on the data that Amazon sends back, you can check to see if there was Customer Interaction.
You can also improve your listings if you realize that users are not buying the book from eBay, but instead from Borders – this way, the search engine becomes like a personalized shopping and searching partner, which is what the ultimate goal of search should be. As we know, Google and eBay work closely together, but with no remuneration on the natural side – imagine if this changed?
The idea is not to build two separate indexes, but instead to allow affiliate redirects to rev share on the basis of a natural-result click through to a particular merchant.
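A minimal sketch of what such an affiliate redirect could look like, assuming the engine routes organic clicks through its own click server. The redirect host, parameter names, product URL and affiliate tag below are all invented for illustration:

```javascript
// Wrap an organic result in an affiliate redirect so the click can earn
// commission and report conversion data back, keyed to the keyword.
function wrapAffiliate(destinationUrl, keyword, affiliateTag) {
  const redirect = new URL("https://redirect.example-search.com/click");
  redirect.searchParams.set("url", destinationUrl); // where the user actually goes
  redirect.searchParams.set("kw", keyword);         // keyword-level attribution
  redirect.searchParams.set("tag", affiliateTag);   // e.g. an affiliate-program ID
  return redirect.toString();
}

const link = wrapAffiliate(
  "https://www.amazon.com/example-product", // hypothetical product page
  "blink malcolm gladwell book",
  "searchco-20"
);
console.log(link);
```

The user still lands on the same page the index chose; the only change is that the hop records which keyword sent them and under which affiliate tag, so any subsequent purchase can be attributed.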
2. Earning Per Click
By monitoring Earnings Per Click at a per-keyword, per-website level, search engines may find that some of their natural results traffic (sent via affiliate programs) has higher customer interaction rates, and therefore higher yields, than their PPC search program. In this way, they can either force their PPC advertisers to pay higher amounts, or display paid listings alongside natural listings.
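The comparison itself is simple arithmetic. Here is a sketch of the EPC check, with the commission totals, click counts and bid figure invented for illustration:

```javascript
// Earnings per click: affiliate commissions earned divided by clicks sent.
function epc(totalCommissions, clicks) {
  return clicks > 0 ? totalCommissions / clicks : 0;
}

const keyword = "star wars dvd";
const naturalEpc = epc(310.5, 4500); // rev-share earnings on natural clicks
const topPpcBid = 0.05;              // current highest paid-listing bid

if (naturalEpc > topPpcBid) {
  // The natural slot out-earns the paid slot for this keyword: raise the
  // PPC floor price, or keep serving the rev-share natural result.
  console.log(`"${keyword}": natural EPC $${naturalEpc.toFixed(3)} beats PPC bid $${topPpcBid}`);
}
```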
3. Partner Database
Search engines can and should start building a search partner database. Some partners who may not be running an affiliate program can simply add snippets of JavaScript code to their site (à la Urchin) and allow the search engine to track the user through the site. This would greatly enhance your understanding of the customer interaction and improve detection of bots/spiders. You could also prioritize the high-volume websites, and perhaps eventually move them onto some type of remuneration basis. This would also add a face to previously anonymous websites, which are typically spam sites, phishing sites, etc.
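To make the Urchin-style snippet concrete, here is a sketch of the partner-side code. The collector endpoint is made up; the beacon-building function is kept pure, with the classic 1x1-image trick shown as a comment:

```javascript
// Build the tracking-beacon URL from page context. The collector host and
// parameter names are hypothetical, but real Urchin-style trackers work on
// the same principle: encode page state into an image request.
function buildBeaconUrl(page, referrer) {
  return "https://track.example-search.com/beacon" +
    "?page=" + encodeURIComponent(page) +
    "&ref=" + encodeURIComponent(referrer);
}

// In the browser, the partner snippet would fire it as a 1x1 image request:
//   new Image(1, 1).src = buildBeaconUrl(location.pathname, document.referrer);

console.log(buildBeaconUrl("/dvds", "https://www.google.com/search?q=star+wars"));
```

Because the referrer is captured, the engine can tie an on-site pageview back to the search keyword that produced the click, which is exactly the Customer Interaction signal described above.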
I have just touched on a few of the endless possibilities engulfing our industry – maybe some are way out, maybe not – but if Snap succeeds, I’d be interested in seeing how close they come to this post.
The overall context of this post is that the search engine needs more insight into the usability of a website in its rankings – but surely there is an opportunity to monetize natural listings whilst still delivering rock-solid search results?