Understanding Google My Business & Local Search
Google Maps: Listing Guidelines – Good enough?
Lisa Barone of Bruce Clay picked up my reporting of the new Google Maps Business Listing Guidelines. She commented:
Blumenthals doesn’t seem completely satisfied with the guidelines set out by Google because they leave too much wiggle room….. He’s right in that there really could be a lot more explanation to what Google will and will not allow, but it’s a start. And really, when has Google ever given you the degree of information you secretly hoped for? Maybe one day that’ll change, but today it’s still Google’s world.
I’m just glad that some semblance of rule has been put into place over there. Watching spammy local search results pop up makes my skin crawl. I need local search to be as spam-free and relevant as it can be, because that’s the search I go to when I don’t have time to play around and I need answers now.
My question to Lisa is why should we accept mediocrity from Google on any level? Google has had ample time to make more than just a start in Maps quality. Here is the answer that I posted on her blog:
Hi Lisa
It isn’t that I am not completely satisfied with Google’s guidelines; it is that I am dissatisfied with Google’s approach to Mapspam specifically and to Local quality standards in general.
These guidelines are a start but the important issue is how they are implemented and whether they are expanded and further clarified.
Google Maps is very unlike Google organic in that the standard for showing results should be truthfulness, not just relevance. Your personal story on your blog about needing to know where you were and how to get someplace shows just how critical truth is when you are lost and need trustworthy directions. If the listing had been hijacked due to your vet’s lack of awareness of the process, then you would have been up the veritable creek without a paddle.
Given that Local listings need to be truthful, the test of these new guidelines is whether Google implements them proactively or reactively. The other test will be whether Google explores all use cases and makes it clear whether Local is about local or about being just a marketing tool for the unscrupulous.
If Google only responds after a spam instance has been reported in Local, it will not work. This reactive response has been Google’s approach in organic, and for the most part it works there because an algo can provide relevant results and truthfulness is less of an issue.
In Local that just isn’t the case. I don’t think that an algo can check for truthfulness. Google needs to proactively ferret out all of the spam that they have allowed into the system and create code and human processes that prevent more from occurring. Changes to records need much more thorough vetting both algorithmically and ultimately by a human to be sure that they are accurate. Google, because of their culture, approaches most problems as computing problems and I am worried that they will persist in that approach in Local.
Google is the one company that appears to be in the driver’s seat in pushing Local data out to the greatest number of people. If Local ultimately succeeds, Google will play a large part in that.
We can only hope that they implement the high technical and listing-review standards that Local really needs to be successful. Local has the chance to be a truly useful resource, but if that opportunity is lost due to inadequate standards, then it will become nothing more than the snake oil salesman of the new millennium.
© Copyright 2024 - MIKE BLUMENTHAL, ALL RIGHTS RESERVED.
Comments
That was a good explanation of some of the real problems you have discerned with Google’s strategy, Mike. Very informative. I commented over there, too, and am awaiting moderation.
One of my fellow Cre8 mods has gotten 2 phone calls from Yahoo this week offering him a premium listing in Yahoo Local. Interesting to see this far more proactive approach from Yahoo as opposed to Google.
Miriam
Yes Miriam, Yahoo has a much more hands-on approach to Local. The comment that struck me from the Eric Enge interview:
[They] “have human and manual moderation that goes on for changes, so … submissions all go through a moderation process where we look for patterns and we actually do validation of data to make sure it is accurate”.
On my last submission, it took them 2 days to look at it, and when they were done, they emailed and said it was approved. So not only was the listing vetted, there was good feedback to me and it felt like customer service… killed three birds with one stone, as it were.
Mike
Mike,
Thanks for taking the time to comment on the Bruce Clay blog, and for bringing the conversation back here.
I agree with you. On all of it. I simply look at these guidelines as the first (small) step in getting Local where it needs to be. Before, there were no standards of any kind; now at least there are.
Don’t take my post as a sign that I’m satisfied with Google’s results. I think they need to be more proactive in maintaining the integrity of the results, not simply hand-editing certain queries when something gets called out, but to be honest, I don’t know that it will ever happen. Google’s not looking at Local with the intensity that it needs to. They’re looking at organic, they’re fighting the paid link battle, they’re paying attention to how many people are buying ads. They’re not focused on Local Search. Which is a mistake, because they’re the only ones with the data to really improve it.
Local results should be held to a higher standard than organic. They’re more important. Like you said, had that query for [simi valley vet] been tainted, I would have been lost with no way to figure out where to go. I got lucky that the results were clean. Google needs to make sure that “luck” doesn’t play a part in whether its users get good results.
Hi Lisa
Thanks for stopping by and clarifying.
The issue for me is that as an industry we should not tolerate either mediocrity or illegality and Google should be called out on both fronts loud and clear.
Google has, on occasion, responded positively to such inputs. One example was their creating an explicit public forum to note spamming.
Again, it is way too little, and better late than never, but it gives me hope that they do occasionally hear the tune even if they are, for the most part, tone deaf.
Mike
I am not sure how they can easily verify and validate a business listing for some small business in some forgotten, way-out place on the other side of the planet.
One recourse would be to find the business website, and verify against that – but the data there might also be faked/exaggerated/not quite the truth.
They could use other business listings as a check, but can those be completely trusted either? In any case, most directories are seeded with a small amount of fake data to catch out content copiers.
It’s a difficult problem to crack I think.