Top 12 Takeaways from SMX Advanced 2007
… Not including the “Give It Up” session, natch. Here’s my list of the primary takeaways from the organic SEO track at SMX Advanced 2007 in Seattle. This will be heavy on takeaways from Google/Matt Cutts because the opening session was 45 minutes of Matt.
1.) Matt Cutts was really on a mission to drive home the message that Google is not averse to using human intervention in its SERPs. I think I heard him say it in each session in which he spoke. Google is removing the phrase “100% algorithmic” from some of the webmaster guidelines pages. This is a 180-degree turn from the old days when they went out of their way to emphasize that everything they do is automated. The folks who never believed that were on the money.
1a.) On a very related note, Google’s human intervention includes trying to ascertain what type of webmaster or site owner you are. They have tools that make it possible to see other sites that a person owns. Matt C. said “if you have a gambling site and a sweater site, this doesn’t mean the first will hurt the second … (but) if you have 200 spammy sites, shouldn’t that be a signal when we look at your 201st site?” In a separate session, he said small mom and pops will get the benefit of the doubt compared to a power webmaster.
2.) If you’re building links, be careful not to go too fast. Aaron Wall and Greg Boser talked about this: Look at the dominant site in the industry and figure out their link growth rate — how many inbound links over how long? Try to match that without going dramatically over.
3.) Matt Cutts: Pages that are in the supplemental index are not parsed the same as other pages. They have to be compressed. Google may not index and store every word and phrase of pages in the supplemental index. (Me: The Web is getting too big for search engines to store it all.)
4.) The engines have practically given up trying to combat the level of spam you get in the Viagra and similar industries. Peter Linsley from Ask.com said you could have 2,000 people working on it every day and not come close to cleaning it up. More optimistically, Matt Cutts said Google has two initiatives coming soon that they think will impact this type of spam.
5.) Yahoo (Amit Kumar): Mentioned two effects related to having duplicate content: a) Yahoo is less likely to extract links from duplicate pages, and b) less likely to crawl new pages from known duplicate sites. (Me: Does that affect just spammy, scraper-style sites, or would it also affect a site that just happens to have a lot of dupe pages for whatever reason?)
6.) Matt Cutts: Click-thru data is used with personalized search, but he wouldn’t confirm or deny if it affects general search results. He did add, though, that it would be easy to game and “very noisy” if they were to use it.
7.) Spam is measured by intent and extent. I hadn’t thought of it in such succinct terms, but that’ll make a nice explanation to use with clients who don’t understand why some tactics are okay in some situations. Several different speakers said this, from Todd Malicoat to the search engine reps and others.
8.) To turn off personalized search at Google, add this parameter — &pws=0 — at the end of the query string/URL. (Thanks Lisa)
9.) New site to check out: Microsoft’s new keyword tool at adlab.msn.com, suggested by Todd Friesen.
10.) Matt Cutts was about the only person in attendance who believes Wikipedia deserves the high rankings it gets on almost any Google search you do. Seriously, almost every session had at least one Google/Wikipedia joke.
11.) Matt Cutts: Part of the Googlebomb algorithm doesn’t run in a “live” setting; it runs every couple months, or when Google presses the button as needed to deflect a Googlebomb from working.
12.) “Jason” masks aren’t a real good session prop.
Bonus takeaway: The way to a search marketer’s heart is through the stomach. Really, has anyone blogged about SMX Advanced without mentioning how great the hot lunches were? No. Everyone has mentioned it, and rightly so.
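Takeaway #8’s `&pws=0` trick is easy to script if you check rankings a lot. Here’s a minimal sketch (the helper name `depersonalize` is my own, not anything official) that appends the parameter to a Google search URL:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def depersonalize(url):
    """Add pws=0 to a Google search URL to turn off personalized results."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["pws"] = "0"  # per the SMX tip: &pws=0 disables personalization
    return urlunparse(parts._replace(query=urlencode(query)))

print(depersonalize("https://www.google.com/search?q=seo"))
# → https://www.google.com/search?q=seo&pws=0
```

Handy when you want to see roughly what a logged-out searcher sees instead of your own personalized results.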
If you were at SMX, what are your takeaways?
Hi Matt,
This is a really nice summary!
Two things in this stand out to me as being especially thought-provoking.
1) Don’t build links too fast. What about getting Dugg? If I’m getting 1-2 links a day from manual efforts, and I then get 100 links from getting dugg, all in 24 hours, what does Google think? The whole point of SM is that rapid burst of exposure and the long-term link benefits, but if Google thinks bursts of links are suspicious, SM becomes a dubious good.
2) Clickthroughs influencing rankings – this is just the worst idea. What about all of the crummy websites I click onto, either looking for something and not finding it there because the site is crummy, or because someone writes to me and says, “look at this crummy website”? I do NOT want Google to think my click is an endorsement of the high quality of that site. I think Google is making a really weird oversight on this issue.
Thanks for all the posts you’ve done from the frontline on this, Matt. Much appreciated!
Miriam
Hi Miriam, on the DIGG thing, the algorithms take into account when you (or your site) become a “hot topic”. And that would include a sudden burst of links like you’re referring to. What they were talking about in the panel is the stuff you can control — set yourself a target for linkbuilding based on what the space you’re in will support. If you get featured on Oprah suddenly, no problem. The algo can account for that.
I agree with you 100% on #2.
You’re welcome. And congrats on having two stars next to your name now. You deserve more. 🙂
My takeaway? The SEO sector is a lot more boozy than I would have thought….
I must confess, I was all excited when that second star popped up by my name!
Thanks for explaining that about the Digg effect. Still trying to get all that SM stuff straight.
Miriam