March 8, 2007
Another report of large medical center problems with Google Maps:
== 1 of 1 ==
Date: Thurs, Mar 8 2007 9:22 pm
From: “Michael”
I have the same exact problem. I am the web director for NYU Medical Center.
I have the correct address and phone number listed in over 20,000 pages on the footer.
Google Maps does not have an elegant mechanism for validating changes. With large institutions, mail stops can be very difficult, so the postcard method does not work. Also, our call center is analog, so they can not validate there. How about a validation tag on our web site? That would seem to be the most logical.
This has become a serious problem for us, as we have patients literally showing up in the wrong locations when they are scheduled for surgery.
I have sent several emails to the maps group, to no avail.
Bart, if you find a solution I would love to hear it. I would love to know where the mapbot is getting its data from. An XML document on the root of my server listing the correct addresses and numbers would make the most sense to me.
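To make the poster’s suggestion concrete: no such Google format existed at the time, but a root-level address file along the lines he describes might have looked something like this (every element name and value below is invented for illustration):

```xml
<!-- Hypothetical /business-locations.xml placed at the site root.
     This is NOT a real Google format; it only sketches the idea
     of a machine-readable, site-owner-authoritative address list. -->
<business-locations>
  <location>
    <name>Example Medical Center - Main Campus</name>
    <address>123 Example Avenue, New York, NY 10001</address>
    <phone>+1-212-555-0100</phone>
  </location>
  <location>
    <name>Example Medical Center - Outpatient Clinic</name>
    <address>456 Sample Street, New York, NY 10002</address>
    <phone>+1-212-555-0101</phone>
  </location>
</business-locations>
```

A crawler could then prefer this owner-published data over third-party feeds, which is exactly the validation problem the poster is describing.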
Google has upgraded the Local Business Center with a range of new features. You can now:
*Add photos to your Google Maps listing (within the guidelines)
*Add custom attributes to your business listings
*Correct and adjust your Google map marker location, so if it is slightly off, you can move it to the right spot
*See statistics on how many people viewed and clicked on your local business listings
The ability to correct your map marker has been a frequent request at the Google Maps for Business Owners group and is one more step toward improving data accuracy.
The custom attributes feature holds out the promise of solving one of the vexing problems facing businesses that serve areas larger than the locale in which they are located, and possibly of solving the categorization issues as well.
The other very interesting aspect of the custom attributes is that the attributes differ by industry group. The default values for a physician are different from the default values for a restaurant. It appears that Google is in the race to build the “semantic web”.
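As an illustration only (the attribute names below are my guesses, not Google’s actual defaults), per-industry attribute schemas of the kind described above might be modeled like this:

```python
# Hypothetical sketch of industry-specific default attributes.
# None of these field names are confirmed Google defaults; they
# only illustrate how defaults could vary by business category.
DEFAULT_ATTRIBUTES = {
    "physician": ["specialty", "board_certified",
                  "insurance_accepted", "hospital_affiliation"],
    "restaurant": ["cuisine", "price_range",
                   "reservations", "takeout"],
}

def attributes_for(category: str) -> list:
    """Return the default attribute set for a business category,
    falling back to an empty list for unknown categories."""
    return DEFAULT_ATTRIBUTES.get(category.lower(), [])
```

The point of such a scheme is that a single flat attribute list could not serve both a physician and a restaurant, which is why differing defaults hint at semantic-web-style typing.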
Barry Schwartz at SearchEngineLand has a great summary of the other new features in the Local Business Center.
March 6, 2007
Last week I reported on a Hitwise report that Traffic to Google Maps increased by 26% from Jan. to Feb. due to an increase from upstream Google traffic and I surmised that it was likely due to the OneBox change.
Today, LeeAnn Prescott from Hitwise confirmed that the traffic increase was likely due to the change and that the week-long lag in the data that I had noted was an artifact of the data collection and Google rollout procedures. LeeAnn noted: “If the Local One Box change happened in the middle of the week, there may be a lag. It also may have been rolled out in different markets separately.”
I had noted previously that the Local OneBox change had provided between 10 and 12 new entry points into Maps from the prime territory above the fold on the main search results.
To give a sense of how large this increase is, one needs to realize that the increase alone is roughly equal to all of Microsoft’s Local traffic and greater than the combined traffic of InsiderPages, Judy’s Book, Ask & Yelp.
Given Google’s dominant share in search, even small changes on its main search results page create incredible traffic for any of its secondary products. It demonstrates clearly how difficult it will be to unseat the leader in the local battle on-line.
I love reading the Google Maps for Business Group postings. This one appeared under the heading, Phone Number Troubles. I’ll say.
Recently we have run into a small problem with a woman who lives in Sherman Oaks, California (and who happens to live near one of our locations). Every time someone Google searches to find our clinic location (example: “our business” Sherman Oaks) they get a map and two phone numbers. One is our number, which will directly connect the searcher to our clinic. The other will reach the house of a very nice old woman who lives near our business. This is very frustrating for her, as you can imagine. We explained that our business does not control what information is displayed. It’s very hard for her to comprehend.
Perhaps she needs to start supplementing her income with Windows support.
March 3, 2007
LeeAnn Prescott at Hitwise reports: Traffic to Google Maps increased by 26% from January to February 2007. It appears that this increase was due to an increase in upstream traffic from Google, which occurred on February 7, according to this daily clickstream chart shown here. Did anyone notice a change in how Google drives traffic to Google Maps around this time?
This jump in traffic to Google Maps shown on the chart occurred one week after it was reported that Google upgraded the Local Onebox results on the main search results page.
March 1, 2007
Today’s Wall Street Journal had an article titled: Local Search Sites Draw Users’ Input in which they:
* Extolled the value of Yelp’s reader reviews
* Reported on the trend of user-generated content in local sites like Local.Yahoo.com, Yelp and InsiderPages.com
* Offered up the Kelsey estimate of $6.2 billion in local search advertising by 2010
* Noted the changing plans (difficulties) of Judy’s Book and Citysearch and their need to shore up traffic
* AhmedF points out on Greg Sterling’s site that the article has YellowPages.com generating almost 7,000 reviews a day (200,000 in February) since allowing reviews.
What is of interest to me in the article comes from the Wall Street Journal’s track record of spotting trends just before they reach critical mass across the U.S. Whenever I am looking to impress my kids or nephews, I will read about an artist, toy or movie in the WSJ and usually discover that my kids have yet to fully appreciate the trend (which usually does take on widespread appeal), and I hope that they remember my cool call in 12 months. They never do.
The WSJ has hit on three in this article: local search taking off, user-generated content, and the difficulties facing even some of the bigger players in the market, all of which will be gaining mainstream mindshare over the next year.
February 23, 2007
There has been a recent upsurge in complaints about the accuracy of data that Google uses in Maps. There were recent (false) reports of hijacking, of very old & outdated listings not being removed and of complete bungling of a medical facility’s listings. The increase in complaints is due in large part to the increased exposure of the data in the Local OneBox and the resulting increase of awareness on the part of business owners.
Bill Slawski and I have written about the issue of data accuracy as has Greg Sterling. It was (is) my contention that the data will improve in accuracy over time due to the self interest of the many parties involved. As I noted several months ago, the last step in that process would be getting small businesses directly involved in correcting their own record. That is starting to happen with the increased visibility of the Local OneBox.
There are other accuracy issues that are not addressed by my original post. For example: the problems with Google’s heavy reliance on an algorithmic approach to information, the quality of the data that Google uses to create, verify and ultimately delete records, and the lack of easy end-user corrections of obviously erroneous data.
That all being said, I wanted to test a data set against on-the-ground information to see if it was “accurate enough”. To do so I chose the data generated by the query: “Restaurants Olean, NY”. Why? Three reasons: 1) I know most of them by sight, 2) I had a local Chamber of Commerce list of current restaurants and 3) it presented a small enough set that I could manage the information.
Here is what I found:
*Google identified 71 restaurants with the query, the Chamber list identified 50.
*6 of Google’s 71 were in fact closed, some for as many as 3 (maybe 4) years
*4 of Google’s 71 were either duplicates or not really restaurants
*11 of Google’s 71 were pubs and bars; in this area, they don’t really serve food unless you consider Bud one of the basic food groups
*Google missed 3 coffee shops that the Chamber listed as restaurants and, to its credit, found 3 restaurants that the Chamber did not include
*Google generally ranked the restaurants reasonably by their local popularity in the Maps listing (with the exception of my favorite, which they put at number 10…guess it’s time to stuff the reviews:))
*The ranking and choices for the Local OneBox were very good. The number 1 and number 2 choices are two of the area’s most popular and busiest restaurants. The choice for number 3, Pizza Hut, is arguable but reasonable.
*In the top 10 Maps listings there was only one closed restaurant.
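Tallying the audit above (the counts come from this post; grouping closed listings, duplicates and non-food bars together as “questionable” is my own framing):

```python
# Rough accuracy tally for the "Restaurants Olean, NY" audit above.
google_total = 71       # listings returned by Google
chamber_total = 50      # restaurants on the Chamber of Commerce list
closed = 6              # Google listings for restaurants that had closed
dupes_or_wrong = 4      # duplicates or not really restaurants
bars = 11               # pubs/bars that don't really serve food

questionable = closed + dupes_or_wrong + bars
solid = google_total - questionable
accuracy = solid / google_total

print(f"{questionable} of {google_total} Google listings look questionable")
print(f"roughly {accuracy:.0%} of the Google listings look solid")
```

By this (admittedly rough) yardstick, about 7 in 10 of Google’s listings held up, which is the context for the “accurate enough” question.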
February 21, 2007
Update 02/11/10: These Google LBC categories have now been placed in a searchable database too located on the Google LBC Categories page of my website.
Update 12/20/09: I have a new list of current categories at Google Local Business Center Categories – The Complete List
Update 10/13/2009: If you have found this old post then you are a motivated searcher. I am now developing a searchable database of Google’s current categories. If you think that easy access to Google’s category information would be helpful to you, contact me at firstname.lastname@example.org and let me know that you are interested in testing the beta.
The categories that Google uses in Maps have always been confusing. They have their own very limited list and then they integrate categories from their other providers in a non-transparent way that causes confusion.
In a first step toward making this more transparent, Reuben Yau has created the first full summary of Google’s own Local Business Center categories.
February 16, 2007
This is my current read on the state of the algorithm used within the Maps product. Matt has posted a valuable summary of factors that seem to influence Google Maps standings.
I want to supplement that list and organize it in a slightly different way to clarify my understanding of the situation.
A. Is the business listing considered in the pool of candidates for the search in question? (This data is gathered both off- and on-line.)

1. Address located within the city of search
   (See Bill Slawski’s post on local sensitivity.)
2. Confirmed listing by virtue of an entry in the Local Business Center or via a trusted Google partner
   (Only one seems to be required, and there seems to be no ranking difference between them. The Local Business Center, BBC, Talking Phone Book and SuperPages are examples. However, using the Local Business Center is the preferred route as it avoids any ambiguity, particularly about the authoritative website.)
3. Categories of the business relate to the search phrase
   (Again, from the Google Local Business Center or one of its partners, which means that sometimes the categories don’t come from the Local Business Center. How Google cross-references these is of interest.)
4. Business name relates to the search phrase
   (This works like a title tag in organic search.)
5. Confirmation of the address by the authoritative website and referring websites
   (This is why your website needs your address, and the sites you are listed on do as well.)
6. Link phrase relevance
   (I have not yet tested this, but it stands to reason that it is a factor.)

B. When compared to other businesses in the search pool, what is the relative standing of this business listing across various web resources?

1. Score of the authoritative website
   (This appears to be PageRank related; a brief analysis that I performed bore this out. See Bill Slawski’s patent review.)
2. Number of reviews
   (Quantity seems to trump quality. This speaks directly to Matt’s point about being in local directories and getting reviews in those directories.)
3. Number of web references
4. Quality of web references

Since not all of these are available for all businesses, Google will use whatever is available.
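The two-stage model described above can be sketched in code: stage 1 filters listings into the candidate pool, stage 2 ranks the pool by relative standing. To be clear, every weight and field name below is invented for illustration; Google’s actual factors and weights are unknown.

```python
# Illustrative sketch of the two-stage local ranking model above.
# All weights are hypothetical, chosen only to show the structure.
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    city: str
    categories: tuple
    confirmed: bool      # LBC entry or trusted Google partner
    site_score: float    # proxy for authoritative-site PageRank
    reviews: int
    web_refs: int
    ref_quality: float

def in_pool(l: Listing, city: str, term: str) -> bool:
    """Stage 1: is the listing a candidate for this search?"""
    relevant = term in l.categories or term in l.name.lower()
    return l.city == city and l.confirmed and relevant

def rank(listings, city, term):
    """Stage 2: order the candidate pool by a weighted score."""
    pool = [l for l in listings if in_pool(l, city, term)]
    def score(l):
        return (3.0 * l.site_score     # authoritative-site score
                + 2.0 * l.reviews      # quantity seems to trump quality
                + 1.0 * l.web_refs
                + 1.0 * l.ref_quality)
    return sorted(pool, key=score, reverse=True)
```

The structural point is that pool membership is binary (an unconfirmed or out-of-city listing never ranks, no matter how strong its website), while standing within the pool is a matter of degree.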
The results in the Local OneBox differ from those in Google Maps roughly 50% of the time: half the time they match the Google Maps results exactly and half the time they do not. This has shifted within a single search over the past 3-4 months, which implies that Google is testing additional factors in those results. In the set of results that don’t match the Google Maps results, I have noticed that Google will not list “unverified” results in the OneBox, and it appears that review scores receive additional weighting.
Given the flux in the OneBox, focusing your efforts on the Maps product is the only reasonable thing to do as of today. I assume that over time, as more data is gathered and updated more frequently, the Maps results will improve and that there will be consistency between the two. Once the OneBox algorithm has gelled, it makes sense to revisit it. As I and others have pointed out, it makes little sense for them to be different, but Google may decide otherwise.
There are a number of unanswered questions about the Google Maps rankings as well, particularly on searches that don’t return locally prominent results. So this list should be viewed as a first stab, and as an effort, with the help and cooperation of others, to achieve understanding.
February 14, 2007
Yesterday I wrote of Google now allowing user correction of unverified business listings in Google Maps. It was reported by Barry Schwartz and repeated elsewhere that I played a part in this outcome. Well, it has been a sort of Charlie Brown moment…. You know, when Lucy holds out the football every fall and Charlie, in his trusting way, goes to kick it…
After further looking and some user reports, it turns out that this feature is much less widespread than I previously thought.
I assumed that “unverified” meant any record that had not been claimed by the owner or one of Google’s partners (like Superpages or Talking Phone Book); a record that had no “details” yet associated with it. Google’s actual definition is clearly much more narrow than that.
In fact, while I can replicate the results on a single search (so it wasn’t a temporary test on Google’s part), I have been able to find only a very few “unverified” records in any other industries or locations amongst the many that I have tried.
The issue of accuracy in local data is an important one (see Greg Sterling’s recent post: Data Quality: The Local Achilles Heel) and there is value in allowing the community to correct any errors. In fact as a tactic, the easier it is to correct an erroneous local record, the less Google will be criticized for (even obvious) errors in the data and the way they are assembled.
However, in this case, the essence of Google’s credibility is at stake. The undisputed king of the algorithm has an algorithm that, on the very rare occasion, assigns a competitor’s web site to a business’s Maps record. This error, while uncommon, shatters the perception that Google is infallible in the search arena. An error like this creates an impression of unreliability and distrust amongst users. Google is many users’ single most trusted URL on the web, and there is implicit faith in its accuracy.
The problem, while not widespread, is serious and allowing users to correct the problem could be a step forward. Allowing correction of the few “unverified” records (please report if you find more) that I have stumbled upon out of the twenty five million U.S. business records extant is not an adequate solution.
Perhaps the example that I have found is but a first step in allowing community input. Perhaps an algorithm that fixes the problem is just around the corner. Perhaps I won’t have to feel like Charlie Brown after all.