Earlier this week, I invited readers to submit questions that might become good topics for future articles … and you responded! A few of you submitted very specific questions about local SEO and local search, which would normally be fine except for the fact that I’m not doing much local SEO these days!
So, for this first question, I’ve enlisted the help of Professor Maps — AKA Mike Blumenthal, who I assume/hope you know as a prolific & expert local search blogger and small business owner himself. Mike graciously replied to this question from reader Kris D.
Kris asks: To what degree does Google normalize NAP data? Given the following addresses as a general sampling:
123 US 19 #L
123 U.S. 19 #L
123 US 19 Ste L
123 US Highway 19 Ste. L
Will Google recognize that these are all the same locations for a unique business?
I recently noticed that Google+ Local picked up a review from a directory listing that had the #L in the address, even though I have been using the "Ste L" style of suite spelling. Additionally, some directories such as Localeze and some of the other big players (CityGrid, Google+ Local) use different data sets with slightly different spellings that are not changeable, and some don't recognize the suite info at all. The reason for the question is that I do not want to charge for fixing a problem that does not actually need to be fixed.
I shared this question with Mike via e-mail, and here’s his reply:
Google does an excellent job of normalizing NAP data.
A simple thought experiment demonstrates this fact. Most small businesses are very inconsistent in how they represent their NAP. As you point out, the primary data suppliers also handle specific fields differently. Between those two facts, the vast majority of businesses have some inconsistencies. When Google discovers trusted NAP information that it can't reconcile with an existing listing, it creates a new listing in the index.
If Google were not able to normalize this data correctly, then virtually every listing in Google would show duplicates. They don't. In fact, of the businesses in Google's local index (somewhere north of 100 million), only a small percentage have duplicate listings. So, for the most part, the algo handles those minor discrepancies well.
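To make the idea concrete, the kind of canonicalization Mike describes can be sketched with a few rewrite rules applied to Kris's sample addresses. This is a minimal illustration, not Google's actual pipeline: the rules and the canonical forms they produce are my own assumptions, chosen just to show how spelling variants can collapse to a single key.

```python
import re

# Illustrative normalization rules (an assumption, not Google's real rule set):
# each pattern rewrites one common spelling variant toward a canonical form.
RULES = [
    (r"\bU\.S\.", "US"),        # "U.S. 19"        -> "US 19"
    (r"\bUS Highway\b", "US"),  # "US Highway 19"  -> "US 19"
    (r"\bSte\b\.?", "#"),       # "Ste L" / "Ste. L" -> "# L"
    (r"#\s+", "#"),             # "# L"            -> "#L"
]

def normalize(address: str) -> str:
    """Apply each rewrite rule in order, then collapse extra whitespace."""
    for pattern, replacement in RULES:
        address = re.sub(pattern, replacement, address)
    return re.sub(r"\s+", " ", address).strip()

variants = [
    "123 US 19 #L",
    "123 U.S. 19 #L",
    "123 US 19 Ste L",
    "123 US Highway 19 Ste. L",
]

# All four variants collapse to the same canonical key, so a matcher
# comparing canonical forms would treat them as one location.
print({normalize(v) for v in variants})  # prints {'123 US 19 #L'}
```

A real index would need far more rules (street-type abbreviations, directionals, unit designators), but the principle is the same: match on the canonical form, not the raw string.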
That being said, you don’t want to tempt fate. Whenever possible, seed consistent NAP and pick your NAP based on how Google represents the data in their index (i.e., the address information matches how Google shows the address on their map). This will minimize problems.
The only time I would worry about the situation would be if the business is consistently experiencing duplicate listings flowing into the index. Then you need to commit the time to solve the problem.
Kris, I hope that answers your question. And thanks for sending it in. If any other readers have questions I might be able to turn into future articles, see this post. Just keep in mind that very specific local search questions aren’t in my wheelhouse these days. Best to visit Mike’s Blog and send him a note if need be.