XML Sitemaps: The Most Overrated SEO Tactic Ever

Filed in MY BEST POSTS, SEO by Matt McGee on June 19, 2008

Whenever I take part in a Site Review session at some conference, one of my fellow panelists will inevitably tell a webmaster something like, “You don’t have an XML sitemap. Create one and submit it to the search engines as soon as you can.” I’ve yet to have the opportunity to play devil’s advocate on that during a session, but I’m going to do it here on SBS. Because I believe that XML sitemaps are the most overrated SEO tactic ever.


XML Sitemaps: This Week’s Hot Topic

Coincidentally, XML sitemaps are the hot topic this week around the SEO web:

Those are all good and interesting discussions, and it’s worth the time to read them. But I don’t agree with the “sitemaps are a great idea” crowd. So, why am I calling sitemaps the most overrated SEO tactic ever?

XML Sitemaps Don’t Solve Problems

I’ve done SEO on sites as small as 4-5 pages. I’ve done SEO on sites with 15,000,000+ pages. I’ve never once recommended the site owner create an XML sitemap and submit it to the search engines. Sitemaps don’t really solve any problems where indexing and crawlability are concerned. Let’s use a typical 100-page site as an example:

Crawlability Problems

If you have a 100-page site, and the spiders can only access 25 of your pages, fix your crawlability problems. Using a sitemap to solve crawlability issues is never a good idea. It’s like putting a band-aid on your chest after open-heart surgery. You need a lot more help than that. 🙂

Indexing Problems

If you have a 100-page site, and the search engine has decided that only 25 of your pages are strong enough to be indexed, forcing the other 75 pages on them via a sitemap isn’t going to help your cause in any way. It’s not going to improve your overall site strength or make your site profile look any better. The solution to getting those 75 pages indexed isn’t to spoon-feed them to the search engine; the solution is to make those pages better by improving the content, acquiring better/more links to them, and so forth.

No Problems?

If you have a 100-page site, and the spiders are able to crawl all 100 pages, and all 100 pages are indexed, and life is good … maybe you’re thinking a sitemap is a good complement, or something to do “just to be safe.” Why? If life is that good, you don’t need an XML sitemap. Let the spiders keep doing what they’re doing; let them crawl through your pages, let them figure out which pages deserve more frequent crawling, and which don’t. Don’t get in their way, and don’t steer them off track with a sitemap.
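For anyone who hasn’t looked inside one of these files, here’s a minimal sketch of an XML sitemap under the sitemaps.org protocol; the example.com URLs are placeholders, and <loc> is the only required element per URL.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org protocol; example.com URLs are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```

That’s all the file is: a flat list of URLs (optionally annotated with last-modified dates, change-frequency hints, and priorities) handed straight to the engines.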

My attitude towards sitemaps is pretty much in line with what Barry wrote in his SER article above. He said, “I have always been a believer that well on-page optimized sites do not require or even benefit much from Google Sitemaps.” That’s exactly it.

But that’s not all. I’ve seen a case where not only did the sitemap not solve any problems, but it was also the cause of the problem.

XML Sitemaps Can Cause Problems

I’ve been helping a friend recently with a highly-trafficked and popular blog. This is a blog with great authority in its niche. It’s been around for several years, has tens of thousands of inbound links, and has all kinds of trust.

But it also had a problem: Referral traffic, specifically from Google, had dropped off the charts. Google had stopped indexing new posts. The most recent article in the index was a couple months old. We couldn’t figure out why. But the facts were clear: Google was suddenly sending a fraction of its normal referral traffic and not indexing new posts. To be frank, I’m not smart enough to know what was wrong. I still don’t know why all this happened.

My suggestion? Take down the XML sitemap. I figured we had nothing to lose. We deleted the sitemap from Webmaster Central. We pulled it off the site. We removed all links to it. It was gone. Nuked.

Results? Eight days later, here’s what I emailed to my friend:

  • New posts were being indexed quickly by Google, within an hour of posting.
  • Old posts that had fallen out of the index were back in.
  • Total pages indexed in Google were at the highest level since we started working together.
  • Referral traffic from Google was up 136% from the week before.

Granted, my experience is purely anecdotal. What happened in this case may not happen in the next case. Maybe it was pure coincidence! (I doubt it.) We’re all biased by our own experiences, and my experience is that sitemaps are completely overrated when it comes to SEO. They don’t solve problems, and can cause problems. Your mileage may vary. 🙂

Additional Resources

I mentioned that sitemaps are a hot topic this week. In addition to the links above, here are two that also question the need/value of XML sitemaps:

Your turn: Do you use XML sitemaps? Why or why not? Have they ever solved a problem for you? Have they ever caused a problem?

Comments (43)


  1. TheMadHat says:

    I’ve always felt that having one didn’t do much, if anything, but having them actually hurt you doesn’t make any sense to me. It’s certainly possible, and I’ve heard the same view from others, but these examples always leave me wondering if it was something else that caused the drop/jump in indexing & ranking.

    Still on the fence…

  2. Mary Bowling says:

    Hey Matt, great post. I often have to tell folks NOT to do an XML sitemap, or to take down an existing one on a site that is having problems with indexing.

    Natural crawlability is crucial. So, I simply upload the verification code and then let Google crawl the site. I can then use Webmaster Tools to see what Google finds in the crawl and work to correct it.

    Many webmasters think they should only put what they deem to be important pages on an XML sitemap, instead of all of the pages that should be indexed. Sometimes they put absolutely everything on an XML sitemap, like PDFs that are duplicates of web pages. Others throw one up and then ignore it. Then, when pages are added or deleted, we can have a real mess.

    I think an XML sitemap is so easy that everyone wants to do it and charge their clients for it, instead of rolling up their sleeves and doing the hard work.

  3. PPCblogger says:

    Hi Matt,

    Thanks for the mention. I am completely with you on the value of submitting sitemaps.

    In some cases, feeding Google all your crap, low-value content that is not strong enough to get crawled and indexed on its own merit, is showing a bit too much.

  4. I do use a sitemap on my blog and have it automated through a plugin. All my pages have been indexed so far – but it’s still early days.
    Everywhere I see people recommend having a sitemap. This is a good alternative way to look at sitemaps.

  5. Matt McGee says:

    Thanks for the generous comments, folks.

    MadHat – I sat and studied that site for a couple weeks, and indexing had fallen off for a couple months. And then we yank the XML sitemap and all is well. No way you’re gonna convince me it was pure coincidence. 🙂

    Mary – I like that phrase, “natural crawlability.” Well said.

  6. miguel lucas says:

    I can’t believe it! A couple of months ago I had a similar experience. A pretty old website (3 years old) unexpectedly stopped being indexed by Google (for 3 weeks). No big changes on it: just content updates. I decided to remove the XML sitemap and crawling restarted 2 days later. Coincidence? I don’t think so either…

  7. Brian Turner says:

    A quick pointer on the problems – Sitemaps were always billed as a way to help Google find out if content had been updated. If it was updated, that was a signal for Google to crawl it – if not, a signal to Google not to crawl.

    Therefore saving Google money on bandwidth.

    In fact, I think the USP for Sitemaps was always that it saved Google bandwidth, little else.

    2c.
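The update signal Brian describes maps to the optional <lastmod> element in the sitemaps.org protocol; a fragment sketch, with placeholder values:

```xml
<!-- Fragment of one sitemap entry; the optional <lastmod> date is the "this changed" hint -->
<url>
  <loc>http://www.example.com/blog/some-post/</loc>
  <lastmod>2008-06-19</lastmod>
</url>
```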

  8. cochlea says:

    Our sitemap currently has a “warning” basically stating that Googlebot is having issues with 301 redirects on some of our pages. I do wonder if this warning is occurring because of the sitemap, since I didn’t think 301s were an issue for Google.

  9. Matt Ridout says:

    I’m afraid I don’t fully agree with all this XML sitemap smackdown.

    I understand that XML sitemaps should not be used to fix crawling problems but if you have no problems then an XML sitemap is just good practice.

    I have never seen lesser indexing or trouble with new posts – if anything, it’s helped the crawl rate.

    There’s my 2 cents 🙂

  10. Pixielated says:

    @Matt, I’ll join you on this one.

    We always recommend XML sitemaps as good practice; however, they come with the continual responsibility to keep them updated, otherwise they can cause crawl issues.

    Making sure you have a good, naturally crawlable site is important, and an XML sitemap should never make up for this!

  11. Paul Drago says:

    Typically, I would agree with you– however on certain larger sites with international audiences there may be a good use for XML sitemaps.

    A particular client of mine (very large international organization) has a country drop-down menu. Creating a country-specific sitemap for each country folder (/us/, /can/), submitting them to Google Webmaster Central, and specifying the content for a specific country has been a tremendous help in making sure the right pages appear for the right audience (i.e., UK users don’t see US or Australian pages).

    I know there are 100 different ways to deal with this same problem, but the client had other issues at stake (brand, etc.) which kept us from utilizing any other method. XML sitemaps worked.

    Of course, your primary audience is small business owners and this won’t affect them at present.

  12. john andrews says:

    I wish you would have saved the sitemap, Matt. It’s a VERY SPECIFIC directive, and can indeed cause trouble if it is not accurate. But if accurate, it should do no harm. We could have examined it.

    As noted, the sitemap is designed to save Google time and effort. If yours says “don’t bother – most of my stuff is the same…still” it may also be an indicator of value, for trendy content. Google will use what it has available. The key for us is to make sure Google gets what we want it to get, so it makes the deductions we want it to make.

  13. Matt McGee says:

    It never occurred to me to save the sitemap, sorry John. But yeah, I would’ve liked to have someone look at it. Maybe next time? Actually, I hope there is no ‘next time’.

    Thx for all the comments pro and con, folks.

  14. Karl Ribas says:

    Great post Matt. I’ve always suggested to my clients that they / we create an XML sitemap and of course distribute it to all the necessary search engine hot spots. However, you’ve certainly raised several interesting points with this post. You’ve inspired me to treat each client on a case-by-case basis as far as who to suggest a sitemap for… especially when working with well established websites.

  15. Wil Reynolds says:

    Genius post Matt! I have been struggling with a nice way to say THIS IS USELESS to so many clients that ask about them. You have articulated this so very well. Thanks!

  16. Diane Aull says:

    Fix any crawlability problems and make pages more index-worthy… I love it! Of course, those things are often hard work, much harder than slapping up an XML sitemap.

    I agree with some of the commenters that there are situations where a sitemap is the best (or at least a good) alternative. It’s certainly a useful tool to have available.

    But it’s no panacea and should never be presented to a client as one, IMO.

  17. Kieran says:

    Interesting post… it is certainly one side of the sitemap discussion. The actual +/- of sitemaps can be debated, but for me the benefits far outweigh the negatives. The stance I would suggest taking when it comes to sitemaps would be: why not? If it took a lot of resources to build one I would maybe think twice, but with all the free tools out there…

  18. Rajat Garg says:

    I think the bigger problem with a sitemap is the added ability for a competitor to scrape the website.

    Also, if indexing is not directly correlated with sitemap submission, the largest incentive to do it is questionable, except making the search engine’s life easier.

  19. Jerry says:

    When you look at the bigger picture, the XML sitemap was introduced for indexing and regular crawlability. What happens when such pages are not properly structured? So first things first: build a site that can be easily followed and the issue of a sitemap might never arise. But for some sites an XML sitemap can be helpful.

  20. Rahil says:

    Well, what happened to your friend’s blog is very interesting.

    But the topic is debatable. Can the topic of fixing the “crawlability” problem without using a sitemap be discussed here? That would take the discussion towards solving the problem!

  21. WP PL says:

    “Let the spiders keep doing what they’re doing; let them crawl through your pages”

    I agree with that. I never used sitemaps; instead I prefer to optimize internal linking. If necessary I like to use an “onsite sitemap” on a separate page or in the footer, which also serves as an overview for users (even though most users don’t pay attention to footer links).

  22. Macy says:

    There has been so much talk about using a sitemap to help Google help you, so I went with it. There are usually too many variables involved to determine the direct effect of any particular change, but I noticed an increase in indexed pages at the time.

  23. Chicago Blog says:

    Interesting discussion. I was worried that by not having an XML sitemap on my blogs, I was losing out on something special.

    But I see now it is not a big issue. This article has alleviated some concerns. Plus I wouldn’t know what to do with crawl problems anyway. Another blogging technicality to look into, “crawling and indexing problems”. Maybe you can write a how-to post on this, Matt.

    I’d be happy to provide you my GWT crawl results for you to do a case study on. Just let me know.

  24. I totally agree; what Matt said is absolutely right.

    In my experience, which I believe others share as well, even if the sitemap contains 100 pages, Google will not index all of them if those pages are not properly optimized.

    As very well said by Matt: “The solution to getting those 75 pages indexed isn’t to spoon-feed them to the search engine…” and certainly it would NOT do the job by using sitemaps.

    Thanks to Matt for his great and bold advice.

  25. By the way, Matt, I wonder if you would consider writing more articles on controversial subjects like duplicate content (especially since I have read that Google may consider http://www.youdomain.com and http://yourdomain.com to be duplicate content (or sites?)). In that case, it should be standard procedure for every SEO professional to choose between www and no-www and set up some sort of redirect at the very beginning when hosting a site.
    I seldom see it in any recommended SEO practice.

    Thanks in advance.

  26. Donace says:

    As stated earlier, an interesting discussion; though in my opinion, if the SEs don’t find your site and its pages by themselves, it just means they are not worth being found.

    The best way to get found is to write great content and let people know. Marketing is key.

  27. R.A. says:

    I originally thought no SEO professional with logical thinking would disagree with Matt, but in fact a lot of others still said: Why not? It is a recommended SEO practice by the SEs and it would take too much of the time.
    I also found that a few of these kinds of SEO practices are just useless, such as the Revisit tag; Google never follows this tag – it certainly has its own algorithm to determine how frequently it will browse a site for updates!

  28. R.A. says:

    Sorry, my last Reply has a slight typo error in the 2nd sentence, which should read: …and it would NOT take too much of the time.

  29. Alex says:

    I’m calling “context” on this one. In a lot of cases, yes, they’re overrated. Nothing beats “natural crawlability” (love the phrase, btw!). BUT- If you build dynamic websites via a CMS, and your organization is adding / updating pages all the time, having a list of “This one got updated, this one got updated, this one got created” that a spider can look at and go straight to the new stuff means that it won’t waste YOUR bandwidth by crawling your entire 10,000+ page site when only 20 pages have been updated since the last crawl. It also means that those 20 pages WILL get analyzed the next time Google visits your site, in case the spider is set to only crawl, say, 1,000 site pages per visit.
    I’ve noticed an honest net positive effect of XML sitemaps, in that new pages get indexed much faster using them (I have mine set to be automatically generated upon request for all my clients, so any call to the sitemap gets a completely fresh snapshot of the website).

    I’m not discounting your rankings-drop experience, however. While I never experienced that, evidence does point to it being the result of an XML sitemap issue on your part. I do wonder if Google crawls the sitemap instead of the links on the site pages, rather than “in addition to” them… Something that will hopefully be answered in the Google group.

  30. yaniv says:

    Thank you for this article.
    I immediately removed my XML sitemap.

  31. carl says:

    Well said, right on target, I’m not using sitemaps with Google (crosseyed) again!

  32. From my experience, there is plenty of good reason to run an XML sitemap. It’s not a make or break for sure, but it can be advantageous.

    Of course if it is breaking something it’s worth removing it, fixing the problem, and then potentially putting it back, but to tell people NOT to use it for the sake of some issue that arose, and for which you never had the real reasoning/answer, I think is irresponsible.

    With the XML sitemap generator for WordPress I am able to get my content into the Google index in as little as 4 minutes, so that’s not to be sniffed at.

    Natural crawlability is more than just a sitemap, sure, but at the same time it’s just like anything else in SEO. No one part in itself is make or break; SEO is about the sum of all parts together.

  33. Bill says:

    We are doing an experiment. We have 20 sites of various types and size using the XML sitemap protocol in conjunction with Google Webmaster Tools and we have 20 sites not doing so.
    Both groups undergo the same exact SEO tactics and operational scrutiny.
    We hope to determine if the sitemaps are a waste of our time and bandwidth or viable and valuable.
    We are only 11 days into the experiment. I hope to have some quantifiable results in 30-60 days.

  34. john says:

    To be honest, I am of the opinion that this whole post was written up with linkbait in mind. I have worked on many sites in my time, most of them large corporate sites, not once have I ever seen the addition of a sitemap cause a loss in traffic. Quite the opposite.

    The first thing I do is always the creation and submission to Google of an xml sitemap which is automatically updated when new content is added. Why?

    Because it gives an instant traffic and indexing boost. Every single time. Whether the site has 7k pages or 150k pages, the addition of the XML sitemap has always been beneficial.

  35. Jerry Okorie says:

    Our sitemap currently has a “warning” basically stating that Googlebot is having issues with 301 redirects on some of our pages. I do wonder if this warning is occurring because of the sitemap, since I didn’t think 301s were an issue for Google.

  36. Thomas says:

    Hey, I know you wrote this article some time ago, but I believe it is still relevant even today. I was glad to find it because it says what I have been saying for a long time, so I refer my clients to it when they ask about doing an XML sitemap and look at me funny when I tell them an XML sitemap is not going to help them rank better in Google or fix all the issues I see keeping their website from having good “seoness”.

    Also, I see “SEO pros” all the time selling “sitemap submission” (and site submission) services. It’s a good way to tell a bona fide professional from one that maybe read some textbook on “SEO” and hung a shingle.

  37. I’m a beginner in this area and after reading a lot on this topic I still don’t get it, as there seems to be a great divide in opinions. What is the current situation with XML sitemaps and SEO?

    • Matt McGee says:

      Frisor – I’m generally of the opinion that the vast majority of sites don’t need an XML sitemap, but I believe I’m in the minority on this. The only time I’ve ever recommended use of an XML sitemap was a situation involving a website with about 15 million pages that wasn’t being crawled very deeply.

  38. I actually made an XML sitemap after posting that question and it seems Google has been quicker at re-crawling and re-indexing the content on my site. The site is new and has a few static pages which are updated often. I’ve specified daily in the XML – perhaps that is the decisive factor and not the existence of the XML sitemap itself.

  39. Ooops, the XML tags got eaten. I meant that I specified “changefreq” as daily.
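For reference, the tag being described is the optional <changefreq> hint from the sitemaps.org protocol; a fragment sketch with a placeholder URL:

```xml
<!-- Fragment of one sitemap entry; <changefreq> is only a hint, and "daily" is one of its valid values -->
<url>
  <loc>http://www.example.com/</loc>
  <changefreq>daily</changefreq>
</url>
```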
