Was Google’s Promise to Correct GeoSpatial Data in 30 days Too Optimistic?

My test* of Google’s ability to correct and update geospatial data is in. They passed, but just barely. While a passing grade may suffice in French class, I am not sure that it is sufficient in this case.

Google’s long-term business plans are predicated on the ability to deliver local ads to mobile users. For that to happen accurately, Google needs to know not only about the existence of a business but where it is located on the globe. To gain control over this critical underlying information, Google, in early October, replaced TeleAtlas’s road and street data with their own. As part of that upgrade, Google promised that reported Maps errors would be fixed within 30 days.

On October 30th, I reported to Google that the address 201 N Union St. was located roughly 3000′ north of its actual location. As a result, the 30-some-odd offices in the building (with the exception of mine, which had been corrected in the LBC) all showed as being located at the other end of town. On November 3, Google acknowledged the accuracy of my claim with a timely email. Corrections appeared to be moving along as planned.

The good news? Yesterday, at 1:50 pm, Google reported that the issue had been fixed. The bad news? While the street address resolves correctly in Maps, none of the businesses yet do. This is likely an indexing alignment issue and will resolve itself in another week or two. Does it matter to the user getting bad driving directions? Not a bit; they will be angered regardless of the technical explanation.

report-an-error

My grade on their efforts? A C+ or maybe a B-. Much better than TeleAtlas ever did, by a long shot, and with reasonable user feedback. That being said, here is why I am downgrading them to a C.

Firstly, it took 45 days, not the promised 30, to get a fix. My father (a retail animal) always instructed us to under-promise and over-deliver. That is a dictum Google should adopt as its own. If it is going to take 45 days, then say 60, and folks will be surprised at how quickly it was done. Saying 30 and taking 45, on the other hand, just engenders scorn.

Secondly, finish the job before reporting it out. Updating one index is not enough; the whole problem needs to be solved before it is really solved.

Thirdly, and this is why the grade might be on the lower side, getting Maps right is the future of Google. Behave like your competitive lives depend on it, because they do! Say what you mean, mean what you say and execute.

Google bought into a huge maintenance and upkeep problem when it decided to replace TeleAtlas as its provider of underlying geodata. It was obviously perceived as a critical technology to bring in-house, and justly so. That being said, if they are going to do it, do it right. The market is a tough taskmaster, and I do not think anything less than an A+ will suffice to keep Google in a market-leading position going forward. Any lesser result will mean failure.

*My sample size is one, arguably too small to make a judgment. I have more requests in to Google, which I will follow as well. Perhaps larger metro areas have been prioritized and fixed in the time offered.

My response: Even one this late is too many. The user base does not understand sampling, and the stated time to completion should, in any case, be the maximum.

Google LBC: Data Rich Dashboard Fails at Math (amongst other things)

Google’s Data Rich Dashboard has always provided tantalizing glimpses of how your business listing is performing at Google: enough detail to sense whether it is working or not, but not enough to really plan an effective marketing campaign. The biggest complaint has been the lack of detail about searches in the “other” category, with the absence of the actual search phrases used a close second.

Now though it appears that there is additional reason to complain; it can’t count days! Here are several screen shots:

Selecting the “last 30 days” link shows only 25 days in the chart:

Picture 26

—-
Selecting the “last 30 days” link shows only 24 days in the chart:

Picture 24

—-
Selecting the “last 7 days” link shows only 5 days in the chart:

Picture 23

This sort of math, while likely an artifact of the low frequency of updates to the local data, does not instill confidence in Google’s ability to provide reliable or timely information. If they plan on updating the data less often than daily, then the reporting function and interface should reflect that fact rather than produce such odd results.
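For what it’s worth, the reporting-side fix is simple: label the chart by the data actually available rather than by the range the user clicked. Here is a minimal sketch in Python; the `chart_label` helper and its parameters are my own illustration of the idea, not anything Google exposes:

```python
from datetime import date, timedelta

def chart_label(requested_days, last_update, today=None):
    """Label a report window by the data actually available,
    rather than the range the user clicked."""
    today = today or date.today()
    start = today - timedelta(days=requested_days - 1)
    end = min(last_update, today)           # data may lag behind today
    covered = (end - start).days + 1        # days actually chartable
    if covered < requested_days:
        return (f"Last {requested_days} days "
                f"(data through {end.isoformat()}, {covered} days shown)")
    return f"Last {requested_days} days"

# A 30-day window where the local data stops 5 days short:
print(chart_label(30, date(2009, 12, 10), today=date(2009, 12, 15)))
```

An honest label like “data through 2009-12-10, 25 days shown” would turn the same lagging data from an apparent counting bug into a documented limitation.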

Google Places Pages Upgraded – Owner Verified Checkmark and More Sentiment Details Now Show

Update 5:00 pm: The Google Lat Long Blog has officially announced the rollout of their improved sentiment reporting. They also noted that they have “improved [their] ability to find reviews of places, searching more quality sources of information from across the web”. I have heard reports recently of dramatic jumps in some Places’ review counts.

The ever astute and voluble Earlpearl has pointed out that Google has added a new interface twist to the Places page: it now indicates that a record is Owner Verified, showing a checkmark if it has been claimed in the Local Business Center.

Owner-verified

Improved Interface:
If the record is unclaimed, there are now two choices: one to make a community edit and the other allowing the business owner to immediately claim the record into the Local Business Center. Previously, the interface involved the circuitous route of selecting the edit link, being taken to the Maps-view info bubble, and being required to make the choice again. While a minor interface upgrade, it will make clear to both the editing public and the business owner what they need to do to correct a record while on the increasingly visible Places pages.

what-are-people-saying

Additional Sentiment Analysis:
Note in the above example the additional sentiment categorization now showing when there is enough information for the algorithm to break out the details. In addition to the sentiment summary that has been visible at the top of the page, the sentiment is now categorized and summarized, and the user is given the choice to expand it.

I am not sure when this feature was added, but this is the first time I have noticed it. I have only examined several restaurant listings, but the detail appears to be broken out on a content-available basis. That being said, in the examples I looked at, this detail showed only on listings with 13 or more reviews. I assume it is possible for the sentiment detail to show up on restaurants with fewer reviews if the sentiment is consistent across some indeterminate number of them. Update: here is a restaurant with 11 reviews showing sentiment details: Old Library Olean.


Google Maps News of The Weird – RustyBrick’s Favorite Places Poster Miscoded?

Update: Google called Barry. Even though it is on their recommended QR scanners list, they told him “not to use the BeeTagg scanner and try QuickMark for iPhone or Barcode Scanner for Android. Both worked correctly”. My question: what good is a barcoding system that is only accurate 2 out of 3 times?

Barry Schwartz reports at SeRoundtable that the QR code on his Favorite Places Poster recently received from Google leads to the wrong Places page. Rather than take the viewer to Brick Marketing, it takes them to the Citrus Grille. Here is a video of his abortive scan:

Did Google just mess up Barry’s? Are the errors more widespread? Was it hijacked, as Barry asked?

Has anyone else experienced similar results?

NZ Florist Facing 7 Years for Hijacking Local Listings of Competitors in GMaps

Apparently Kendra Drinkwater, a Napier, New Zealand florist, has been charged with “using the Google search engine to dishonestly, and without claim of right, cause loss to seven Hawke’s Bay florists” and could face penalties of up to 7 years in jail.

She is accused of logging into Google Maps under multiple sign-ins and using the community edit feature to alter the critical contact information of her local competitors.

From the Dominion Post article:

The owner of Flowers by Tanya in Hastings, Richie Davies, said it was frightening how easy it was to alter details. It was a matter of simply clicking “edit” on the company’s details on Google Maps.

Mr Davies said he had called Drinkwater once he and other florists had found out it was occurring. They thought Drinkwater may have been the culprit after someone logged on using her first name.

“I asked her to apologise and to stop altering the details. She claimed she’d had her details changed too. That’s when I went to the police.”

According to the article, Google’s spokesperson Annie Baxter said it was the first report of “editing with ill intent” in New Zealand and warned business owners to register as the verified owners of their sites to stop others hacking their details.

What do I think of this whole matter?

Google Maps and Reviews – A reader’s perspective

Earlpearl, a frequent contributor here and elsewhere, recently wrote up this detailed opinion about Google’s use of reviews as a comment on the Plastic Surgery Co. Settles with NYS over False Reviews piece that I wrote this past July. I thought it too full of interesting tidbits to leave buried in the back library.

Even though I have a number of bones to pick with Google’s current review policy I will leave my opinions to another post. The standard caveats about Earlpearl not representing the views of the management apply. 🙂

———–

It is patently clear that reviews are a mixed bag with regard to businesses and the web. The wide distribution and availability of reviews is positive for a business when honest, and destructive when dishonest.

More to the point, honest reviews are a gift to consumers. What better advice is there than word of mouth either extolling or criticising a business?

Regardless, the proliferation of reviews and its usage as a mechanism for evaluating and ranking the importance of businesses within Google Maps opens up a can of worms.

Ultimately, a clever business or local SEO is going to “create reviews” to rank higher in Maps.

I was intrigued when reviewing Maps rankings for dentists in two small adjacent towns.

At the top of the Maps listings for both adjacent towns was a dentist with 49 reviews. There was some overlap among the listed dentists, but of the 15 dentists following the top-ranked one, the next-highest review count was 12.

Huge difference between 49 and 12. Bigger difference between 49 and the average number of reviews per dentist (about 6). It’s statistically not reasonable.

The dentist with the most reviews uses a medical email/communications system for customers, DemandForce, that includes an opportunity for reviews. The vast majority of the 49 came from that source. The dentist pays for the communications system.

A totally independent medical review source is RateMDs. The dentist with the 49 reviews, most coming from DemandForce, had reviews from ratemds.com.

I’m not saying the reviews were faked, as in the example Mike wrote about. I’m simply pointing out that the volume of reviews has an enormous impact on rankings within Maps…and it is incredibly subject to manipulation.

I operate businesses of certain types. In one industry there are virtually no independently generated reviews. Virtually none. In fact, before reviews got popular on the web, I scoured the internet for review commentary on the industry and, in particular, our business.

Two things: most review commentary was critical. Happily, our business didn’t receive any of that negativity for years. And there was relatively little positive public commentary anywhere on the internet.

Now I look at some businesses in the industry in a certain market…and the business ranked first in Google Maps has HUNDREDS of positive reviews. HUNDREDS. I was speaking with one of their competitors, who has just under 100 reviews. He laughed in acknowledgement with me…our customer “types” don’t tend to write reviews.

Most of these businesses generate an “internal” critique review, given to customers after completing the service. The “internal” review was essentially used to see if the business was meeting customer expectations.

None of these reviews historically saw public light.

Artificially generating reviews to rank higher in Google Maps does nothing for consumers, does nothing for generating a “better maps listing”, and simply creates a lot of busy work to “spam”/manipulate Maps.google.com rankings.

Generating reviews as referenced above in the blog piece has been deemed criminal, and justifiably so. Faked reviews are manipulative. If they can be used criminally to manipulate consumers, they can be used to manipulate search engine algos.

I simply think Google should diminish the importance of reviews as an algo element. At the least, it would be simple mathematics to evaluate a relatively large number of reviews (such as 49) relative to the next-highest number (12) or the average (6), determine that there is something inappropriate in that volume, and then recalculate rankings with a somewhat diminished value attached to reviews.
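Earlpearl’s “simple mathematics” could be as crude as capping any review count that towers over the field at a multiple of the median before it enters the ranking signal. A rough sketch in Python; the multiplier of 3 is an arbitrary assumption of mine, not a known Google value:

```python
from statistics import median

def damp_review_counts(counts, cap_multiplier=3):
    """Cap any review count that towers over the field at a multiple
    of the median, so one outlier can't dominate a ranking signal."""
    cap = median(counts) * cap_multiplier
    return [min(c, cap) for c in counts]

# The dentist example from above: 49 reviews against a median around 6.
counts = [49, 12, 8, 6, 6, 5, 4, 3]
print(damp_review_counts(counts))  # the 49 is capped at 18.0
```

The median is the natural anchor here because, unlike the mean, it barely moves when one listing piles up manufactured reviews.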

That would keep the Maps.Google engineers busy for a while and out of trouble 😀

Earlpearl

Google Favorite Places Posters Start Arriving

One of my clients, Barbara Oliver & Co. Jewelry in Buffalo, for whom I do some internet marketing, received a Favorite Places poster and a congratulations letter today. Her initial reaction was muted and somewhat suspicious, but once I explained that she was one of only 100,000 in the US and probably one of only 20 or so in Buffalo, she was quite pleased.

The promotion is an interesting one, essentially recognizing early and successful adopters of the LBC in higher-traffic, retail industries. I was surprised at the skepticism of the client upon first receiving the poster. I guess that SMBs have been trained in cynicism by years of dealing with the Yellow Pages folks and coping with the current onslaught of online offers. In her case, a 7-Pack listing has brought measurable and substantial business.

Here is the letter that she received (click to view larger).

Favorite-Places-Letter

Google Local Business Center Categories – The Complete List

Update 02/11/10: These Google LBC categories have now been placed in a searchable database, located on the Google LBC Categories page of my website.

Picking the right category within the Local Business Center is one of the keys to success in Google Maps. Categories are critical to being considered relevant on any given search and should be chosen carefully to meet your short- and long-tail priorities. Since a business may enter only 5, they become very important.

Given the way they are presented, it is difficult to plan ahead which categories might be the most appropriate for you or your client’s listing.

Here is a complete list of every category and synonym that Google currently provides within the Local Business Center. At some point in the future, I will provide the information in a searchable database to make it more useful for planning purposes:

Google Local Business Center Complete Category and Synonym List

On the list you will find 2,239 categories, which include plurals (e.g., lawyer and lawyers) as well as certain geographic place categories like “estuary” and other geographic phrases. The fact that the categories include both singular and plural forms should reinforce the sense that you need to pick the ones for your listing carefully.

When the Local Business Center was first released, Google offered a choice of only 450 categories and no custom categories. At the time, if you used any of the limited categories offered in the LBC, Google would use those to supplant any that had been provided by Superpages. The upshot during that timeframe was that you had to remove ALL categories from the LBC to get into a given Superpages category in Google Maps results.

In March of 2008, Google went to a more free-form, expanded categorization structure, initially not requiring the use of any of its categories. This summer, Google started requiring that a user select one preset category, while allowing 4 additional preset or custom categories.
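The current rules (at most 5 categories, at least one from Google’s preset list, the rest preset or custom) are easy to encode as a planning sanity check. This sketch is my own illustration, not anything Google provides, and the preset names are stand-ins for the real list:

```python
def validate_categories(categories, preset_categories):
    """Check a proposed LBC category list against the rules described
    above: at most 5 categories, and at least one must come from
    Google's preset list (the rest may be custom)."""
    if not categories:
        return False, "at least one category is required"
    if len(categories) > 5:
        return False, "no more than 5 categories are allowed"
    if not any(c in preset_categories for c in categories):
        return False, "at least one category must be a Google preset"
    return True, "ok"

presets = {"Jeweler", "Lawyer", "Dentist"}  # stand-ins for the real list
print(validate_categories(["Jeweler", "Custom Engraving"], presets))
```

A check like this is handy when mapping a client’s short- and long-tail keyword priorities onto the 5 available slots before touching the LBC itself.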

I would love to hear how you use this list.

Google Maps – New abuses via Wikipedia landing pages?

I have received several reports from the locksmith industry (who else?) and elsewhere of a new “black hat” ranking tactic that seems to push listings to the top of the 7-Pack by using a Wikipedia page as the Maps landing page.

For example, the links on the search for Locksmith Toronto ON head off to Wikipedia location pages (Ontario, Toronto & Scarborough, Ontario) that one presumes have a very strong default Location Prominence ranking.

locksmith-toronto

Google Promoting Maps with 100,000 Favorite Places Posters

favplacesticker

Google, which started its Favorite Places campaign in July by placing large statues in front of high-profile businesses in major cities, is expanding the campaign by sending a Google Maps “We’re a Favorite Place” poster to 100,000 of the “most sought out and researched businesses on Google and Google Maps”. According to Google, this number represents less than 1% of the 28 million U.S. businesses, and “our standards for selecting businesses are as selective or more selective than other companies which have run similar initiatives”.

Interestingly, they are including QR codes on the poster so that a Droid or iPhone user can scan it and find reviews and (from Google’s perspective) coupons for that business.

If you are one of the favored businesses, send me a photo of the sticker! The posters are apparently arriving within the next week or two. Currently the program is US-only (sorry, Canada) and will show up in roughly 5,000 US towns and cities. If you weren’t one of the lucky businesses and are curious as to why, here are the criteria that Google is using.

Developing Knowledge about Local Search