SES notes: Search Engine Algorithm Research

Filed in Conferences/Educ., Link Building, SEM, SEO, Yahoo by Matt McGee on August 16, 2006 1 Comment

Tuesday, 3:30 pm – Search Engine Algorithm Research

Another well-attended session in one of the big rooms. Rand Fishkin opened up with a presentation focused on how the algorithms handle link analysis. Bill Slawski of SEO by the Sea followed with a look at some recent search engine patent applications, suggesting they may offer clues to where the algorithms are headed. He also talked about the ways engines are trying to better understand a searcher’s intent, using as an example an engine’s ability to decide whether a search for “blues” was intended for the style of music, the color, or the hockey team in St. Louis.

But the best single presentation of the week, in my opinion, came from Jon Glick, who spent a handful of years as Yahoo’s Senior Manager for Search. With that background, Jon has inside information about algorithms that very few others have. Here are my (sometimes paraphrased) notes from Jon’s presentation:

  • Search engines keep a history of your site and track how often pages change. Repeated “meaningful changes” can increase the frequency your site is crawled. (Note: This is the concept I discussed in Training the Crawlers back in May.)
  • SEs track the rate of change of a page’s links. A sudden jump in inbound links will lead to scrutiny. But the algorithms make exceptions for this combination: a site/page that gets a lot of buzz (news & blog posts) and a lot of searches. (If that doesn’t sound like a recipe for avoiding the so-called “sandbox”, nothing does.) These sites will get an editorial review to make sure they show up in the SERPs.
  • Quality Score is a new factor being used by Google AdWords, and will also be part of Yahoo’s new PPC platform.
  • SEs evaluate your outbound links, and give a “spamminess score” based on your link profile.
  • Personalization: engines are treading carefully here and are hesitant to rely too heavily on it.
  • Shorter URLs are generally crawled more often, and rank higher, because the algorithm considers them to be more authoritative.
  • Sites with an RSS feed are generally crawled more often because the engine assumes the site offers fresh content.
  • Yahoo! uses more than 80 factors in its search algorithm.
  • The three major factors for ranking across the engines are still
    1. Content – keyword-rich copy
    2. Connectivity – in the form of links
    3. Outside opinion – in the form of anchor text
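To make the three-factor idea above concrete, here is a purely illustrative sketch of how such signals might be combined. This is not any engine’s actual formula (Yahoo! alone uses more than 80 factors, per Jon); the function name, the weights, and the log-dampening are all my own assumptions, chosen only to show the shape of a weighted multi-signal score.

```python
import math

def toy_rank_score(keyword_matches, inbound_links, anchor_text_matches,
                   weights=(0.4, 0.3, 0.3)):
    """Hypothetical score combining the three factors from the session notes:
    content (keyword-rich copy), connectivity (links), and outside opinion
    (anchor text). Weights are made up for illustration."""
    signals = (keyword_matches, inbound_links, anchor_text_matches)
    # log1p dampens each raw count, so piling on more and more of a
    # single signal yields diminishing returns
    dampened = [math.log1p(s) for s in signals]
    return sum(w * d for w, d in zip(weights, dampened))
```

Under this toy model, a page with relevant copy, inbound links, and matching anchor text scores higher than a page leaning on keyword-rich copy alone, which is the spirit of Jon’s point that the factors work together.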

Now, on the surface, none of these bullet items on their own are too earth-shattering. But put them all together, and consider the source, and you have some real solid information on SEO/SEM best practices.


Comments (1)


  1. Faizan says:

    I have more than 2000 inbound links to my website. How can I check the spamminess score of these links? I comment on blogs and forums on a regular basis, so could these links be considered spam or not?
