Does Local need to be held to a higher standard? Danny Sullivan and Chris Silver Smith Respond

I have been thinking about Yahoo lately (big surprise that eh?) and have been mulling over in my head what is what in the world of Local.

Thinking that I needed some more voices to complement what is rattling around in my brain, I wanted to hear what others had to say. I sent the following off to a number of individuals I respect, each bringing a different expertise to the world of search, looking for their thoughts.

I asked them if they basically agree or disagree with the premise and if they would answer the following question:


The internet is coming face to face with the communities we live in. Local is at the nexus of this juncture. With the iPhone we now listen to our music, answer our phones, read our email, look at our maps and browse the yellow pages. In the near future we will likely be using our iPhroid (or whatever the device will be called) to replace our wallet, the ATM and who knows what else. In the past we have been satisfied with search providing relevant results, but we are now in a time when we expect the map and business listings to be not just relevant but correct as well.


As we move forward to what I call the age of the iPhroid with who knows what transactional and social capabilities, does Local need to be held to a higher standard to “truly” succeed and play a trusted role in our lives?

What is your opinion?

Danny Sullivan and Chris Silver Smith responded first so they will be first to go:

Danny: Well, fair to say Mike, I don’t think the standards are very high in local. There seems to be a large degree of trust over community contributions and edits, simply because I don’t think the companies want to expend the people power to clean things up. And I think they also feel most people still look more closely at web wide results, which they pay more attention to. But as local gets used a lot more, I think those standards will have to rise, especially if the players want to gain or keep market share.

Chris: I think it’s a great question. Data quality is one of the biggest issues in local search and IYP, and it seems to not be getting as much play as it really should.

We’re all so dazzled by the whiz-bang interfaces brought to us by Google Maps, iPhones, and other systems that we’re not asking the big questions about whether the data behind it all is reliable. A huge percentage of the time, it simply isn’t.

There have been many times when I’ve sent family and friends to a business, only to find it had closed. I’ve also used online maps many times only to find the pinpoints incorrect — the very worst instance was when I made the maps in printed instructions for my brother’s wedding rehearsal dinner — sending dozens of cars full of hungry friends and family to an incorrect location (streets often have both north and south or east and west numbering systems, and interactive maps sometimes pinpoint them wrong when online addresses don’t include the cardinal qualifier).

It’s all the more ironic if you know that I spent the earlier part of my career as a professional cartographer — I’m at an extreme end of expert users of maps and shouldn’t be messed up by charts and directions as frequently as I have been by online maps. Even knowing the high percentage error rates involved in the services doesn’t help me much — other than if I sense a reason to question a map’s accuracy I may call ahead to get verbal confirmation from a business or other info source.

Quite a number of years ago, John C. Dvorak did a little informal survey of yellow pages results from the major IYPs, and on the basis of it he beat up on Superpages quite a bit for incomplete or erroneous info. At the time, I thought it was pretty unfair because I thought he should’ve taken our data suppliers more to task or should’ve done a broader sampling than one or two searches, but his point was pretty salient and our company beefed up data quality improvement efforts. But, here we are five-plus years out and local search and IYPs would still likely fail his informal test. (Dvorak later stated that he was giving up using 411 and using Superpages instead, so I’d guess he eventually forgave us for sometimes having bad data.)

Inaccuracy in local search info is a really big, complex beast, and there’s no quick cure for it. But, it would likely help if the industry had a lot more transparency as to what they’re doing about it — this is an area where we should have them show their cards in the consumer interest. What if each provider were to set up an info page outlining how they deal with: removal of listings for closed businesses; capturing and updating business info that has changed; criteria for choosing which data source trumps another data source if the two have disagreeing info; computation of map pinpoints; and quality improvement of address locations on interactive maps?

From my perspective, it’s time for each of the major players to stop passing the buck on quality, and work on it more intensively than the cute graphic interfaces.

What if we started rating the various local directory providers by how complete/accurate their data is? It would probably start exposing the fact that the local search emperor has no clothes.

Please consider leaving a comment as your input will help me (& everyone else) better understand and learn about local.

18 thoughts on “Does Local need to be held to a higher standard? Danny Sullivan and Chris Silver Smith Respond”

  1. The challenges associated with local data quality are very similar to the issues we face in the shopping center and retail development industry. Data isn’t accurate even though some suppliers send people to physically verify location and attribute information (gross leasing area, tenant lists, etc). Of course this method has its flaws because of the fluid nature of retail businesses. Physical verification typically only happens a few times per year, and it is difficult to perform phone verification much more frequently than that.

    While our business is not focused on the local space, I am happy to share our methodology for gathering shopping center and chain retailer data and to answer Chris’ questions since I believe it may be useful to stimulate conversation and to also start to offer transparency.

    Before I get started, I’ll share a couple of notes on our business so that readers can put my comments in context with the local space. 1) Our focus is not on small businesses. Our customers are more interested in chain retailers with at least 3 or 4 locations and shopping centers. 2) We are a very small company so our resources are very limited.

    1. Removal of listings for closed businesses

    We regularly crawl commercial real estate development sites and retailer sites to determine if a shopping center or retailer/restaurant has closed. This actually isn’t the best method since once a business has closed the web site is often not updated. In fact, many businesses which close leave their web site operating until the domain name needs to be renewed or until the contract with the web site hosting company expires. Even if a retailer/restaurant has more than one location, the web site often doesn’t report closings in a timely fashion.

    We are more likely to find current information about a store closing because the news is published in an industry journal or other news source. Still, we have found news sources are more likely to report multiple store closings for a chain due to a general business downturn as opposed to one store closing because the location was poor.

    We might consider other web sources for store closings but currently we do not rely on blogs, social networking sites, etc. I think this model might work if we combine it with physical verification.

    2. Capturing and updating business info that has changed

    The methodology for capturing and updating business information is similar to what is described in the above point. The good news is that store and shopping center openings are much more likely to be kept up-to-date on web sites. Our technology allows us to crawl web sites very frequently, so our information can be very current, although we try not to inundate web sites with server requests and manage this with a pretty light touch. Information on store and shopping center openings is often in the news, as this is a key part of marketing efforts. In fact, these openings are often reported many months before the actual opening of a location, so we can really monitor store openings.

    We also verify the total number of locations we have for a retailer to make sure we aren’t missing acquisitions or divestitures. We use SEC filings and other respected third-party data sources to verify our overall counts.

    In the future, we would like to develop a methodology to use phone verification as an additional source for capturing information, since we are still finding about 20% of chain retailers do not have a web site. Plus, this methodology can be useful even for businesses with a web site, especially in the case of store closings. It also seems like it would be possible to use the telephone companies as a resource to see which phone numbers are still active.

    3. Criteria for choosing which data source trumps another data source

    Determining the best data source is very challenging and we find this to be especially true with shopping center locations. We gather shopping center information from three primary sources including commercial real estate sites, actual shopping center sites (malls and regional centers tend to have their own web sites but most other shopping centers don’t), and retail web sites.

    The first two sources are certainly the most accurate, but oftentimes we find shopping center information on retailer sites, which can be less accurate in terms of the name of the center and the address, since the retailer includes its own address rather than the general address of the shopping center. If shopping center information is only coming from retailer sites we like to verify the center from another retailer web site or other web source.

    4. Computation of map pinpoints

    The process of creating map pinpoints, or geocoding, is an interesting one. We have used services which provide a latitude and longitude in addition to identifying how precisely a location is geocoded. The good news is we have a better idea of the quality of individual geocodes, but the question remains as to how to improve the precision. More than likely our next step will be to try other geocoding services so we have multiple sources and hopefully more precision.

    5. Quality improvement of address locations on interactive maps

    We rely on Google’s API for mapping, so this question isn’t particularly relevant for us since our control is limited by Google’s technology. I think the only point that matters is that we can be more precise with locations by using our geocoding service rather than solely relying on Google.
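    The precision-ranking idea in points 3 and 4 above can be sketched in a few lines of code. This is purely an illustrative sketch, not Lynn’s actual system: the service names and precision labels are made up, and real geocoding services each report precision in their own vocabulary. The approach is to collect candidate geocodes from several sources and keep the one with the highest stated precision.

    ```python
    # Illustrative sketch: pick the most precise geocode among candidates
    # returned by several (hypothetical) geocoding services.
    from typing import NamedTuple, Optional

    class Geocode(NamedTuple):
        source: str     # which geocoding service produced this result
        lat: float
        lon: float
        precision: str  # the precision level the service reported

    # Rank precision labels from most to least trustworthy (assumed labels).
    PRECISION_RANK = {"rooftop": 3, "interpolated": 2, "zip_centroid": 1}

    def best_geocode(candidates: list) -> Optional[Geocode]:
        """Return the candidate with the highest precision rank, or None."""
        usable = [c for c in candidates if c.precision in PRECISION_RANK]
        if not usable:
            return None
        return max(usable, key=lambda c: PRECISION_RANK[c.precision])

    candidates = [
        Geocode("service_a", 32.7767, -96.7970, "zip_centroid"),
        Geocode("service_b", 32.7791, -96.8009, "rooftop"),
        Geocode("service_c", 32.7780, -96.7995, "interpolated"),
    ]
    print(best_geocode(candidates).source)  # -> service_b
    ```

    The same ranking pattern applies to point 3 (which data source trumps another): replace the precision labels with source types and the selection rule stays identical.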

  2. Wow, not only thoughtful replies from Chris and Danny, but a fascinating reply from Lynn. I really enjoyed reading that. It seems clear that a multi-discipline approach is crucial to shooting for accuracy.

    Great topic, Mike.

  3. Hi Chris/Lynn

    Lynn, great response! You both focus on data quality. Certainly that is crucial and, as Danny & Chris point out, requires greater investments of time & money to get anywhere.

    But is the issue just the quality of the data? For me, Chris’s suggestion of transparency needs to be expanded to include issues of fraud (as separate from basic errors) and transparency about how to get something fixed. I would expand the reporting & transparency requirements beyond data quality.

    The site should define what is spam and how to deal with it. Right now, you need to join a secret cabal, send off a missive in the night and hope that someone takes a look or make a public stink. That seems like a byzantine method of dealing with it particularly as spamming can move into illegal areas and those need to be given particular attention.

    The transparency should apply to the most granular of data suppliers, the business owners themselves. There should be clear and specific paths for them to correct erroneous information. For example, in Google, when the Plusbox and the OneBox data diverge, it is incredibly frustrating to get ANYONE to deal with it.

    There are other specific issues, but in a broad sense the Local providers need to be held accountable not just for the quality of the data but the legality and implications of the data. By that I mean to say that, just as credit card companies take responsibility for fraud beyond a certain point, the Local providers need to take responsibility and make fixing things a priority.


  4. I remember Google asking about Local people interested in “taking care” of Local – as in $10 a listing with photographs and a quick AdWords awareness pitch.

    Whatever happened to that? I was most interested as it would have taken care of the spam problem and had local people on the lookout for spam…

    David Saunders

  5. Hi David

    That program is still alive but at what low level it is hard to tell. Google got their original signups in the markets they wanted and then stopped accepting new folks.

    I occasionally read about new reps being added but it seems to be only to replace someone who dropped out but not to expand the program.

    Only Google knows if it is working to improve quality and what impact they have had.


  6. Mike, thanks for getting these interviews, nice to hear from both of them, especially Chris the former cartographer.

    One point I think is important too is that going forward GPS navigation systems are going to be more “live” and depend on this information as well, making it even more critical to have accuracy and precision not only in maps but in places and businesses. Think of a gas station that you see on your GPS, it’s 2 miles away and you’re out of gas. You walk 2 miles and find out, hey, this gas station (a) is no longer in business or (b) is closed right now, although the information you got didn’t mention either point.

  7. Hi Michael

    I have Bill Slawski, Ahmed Farooq and Miriam still to come…

    Yes, it will get ever more critical to have accurate information and transparency. Certainly from the consumer point of view they should not only be able to update the kind of record you are speaking about but have an easy and responsive way to report fraud. From the business owner’s point of view there should be an easy way to get the search engines (Google in particular) to remove erroneous information. At the moment it’s but a hope and a prayer.


  8. Mike:

    Really interesting comments.

    Having been both a long time retail real estate broker and a retailer I was fascinated by Lynn’s response.

    Maintaining that data on a real-time basis is difficult. Having done it personally and built my own databases of retailers, spaces, contacts, etc., I know it is a difficult, time-consuming job. There is no ONE ultra-reliable source for maintaining fresh, real-time updated data of the type that she described.

    To this day, I see live websites of b&m businesses that have closed. I suppose the websites won’t go dark until it’s time to renew the site. Similarly, businesses might take calls with answering services even after they have closed the doors.

    From the spam or affiliate side it’s a different issue.

    If a competing business thinks they can gain a competitive advantage by creating something akin to mapspam….with many locations….who is to stop them other than the “collective conscience of the web” or, alternatively, competitors? A third alternative is the disgusted customers receiving bad information. In those cases they may either consider it “too much of a hassle” to report the bad information, or ignore it and go to the next piece of information or business, hoping it will be accurate. Not a lot of incentive or time to go around reporting bad information.

    From an affiliate perspective…..should an affiliate get in the middle of a transaction, it’s a shame for the consumer.

    I know from past experience, if I’m driving to a city and I load up with 5 or 6 hotel phone numbers, I can call ahead for reservations for a night or two and normally negotiate favorable rates (assuming all hotels aren’t booked up). Aggressive hotels (or virtually any retailer) would rather make a sale at half price than no sale at all.

    Putting affiliates in the middle of transactions cuts the opportunities for consumers to take advantage of this opportunity.

    Finally, having taken the time to speak with an old-time cartographer/mapper who has seen his hard-copy map business take huge hits versus the businesses that have created maps for mapquest/google/yahoo, he has explained that the detail and work the web map creators put into creating maps is dramatically less than what was done in the past for hard copies (at least by his company). He knows the cartographers for the web mapping companies. They used to work with him. The standards and work efforts have been compressed and are less rigorous.

    Ha….once when we were chasing some mapspam, Mike, I used the old hard copies of a more accurate map to verify that addresses on a web map were inaccurate.

    I suspect if the errors aren’t egregious they will go unnoticed and unreported by most.


  9. Hi Dave-

    The distinction that you make between bad data and fraudulent data seems to me critical to moving forward.

    Everyone I interviewed agreed that accuracy stinks. We can assume that we will need to live with bad quality to some extent, but we don’t need to assume that we need to live with the opaqueness, lack of standards and poor responses that we have seen from Local Search Engines when we do report errors and fraud.


  10. Mike:

    It’s hard to determine the percentage of accuracy. I don’t believe we have a fix on that.

    We see the egregious errors, and you do a masterful job of reporting them. On the other hand you’ve done some research in Olean and Buffalo (I believe) that suggests a majority of accurate data.

    I don’t know what the percentage of accurate to inaccurate data is. The specifics of inaccuracy come out in the anecdotal complaints in Google Groups for business owners.

    I do believe there should be different standards for purposeful inaccuracy and malicious misrepresentation.

    Even with a less severe standard for “mistakes” our big friend Google could benefit from your suggestion for a “correct this information” button for algorithm errors that pop up in things like plusboxes 😀


  11. Dave

    You know besides the lack of transparency and how difficult it is to correct something there is even something more bothersome to me.

    That is the lack of accepting responsibility for the problems. With fraud on Visa or MC, the consumer has a $50 maximum liability. That seems a reasonable and fair way to split the risk. That creates a system of trust and a willingness to switch.

    In Local not only do I see none of that, it is even hard to get them to acknowledge that there are mistakes.


  12. I have found that optimising for local search can really pay off. A lot of people are now searching for local services and retailers. If more companies take this approach, using local keywords and adequately developed sites, I think local search can be quite beneficial to businesses and searchers alike.
