Google Local: Hummingbird Diarrhea Persists 6 Months On

In local, Hummingbird has produced a number of terrible search results. With its obsessive preference for brand as a search result, even when a searcher is obviously looking for a range of businesses, the algo often still returns but a single spammy result. I guess I am lucky that there are so many colorful synonyms for feces.

Today, Conrad Saam of Atticus Marketing noted one such low quality website that Google is highlighting with this type of crappy result.

The search for DUI Attorney Los Angeles led to this search result and the website that Conrad wrote about. As these results have been reported, Google has been manually deleting them.

The problem is that the results still seem to be widespread on high value terms. And they show up even more frequently when the searcher's location is set to a given local market.

The marketing on the website that Conrad mentioned may be horrible and even deceptive. It still must be generating a fair number of calls for the lawyer, given Google's willingness to show this type of result front and center on their front page for high value search terms.

[Screenshot: the search result in question, 2014-02-12]

Please consider leaving a comment as your input will help me (& everyone else) better understand and learn about local.

18 thoughts on “Google Local: Hummingbird Diarrhea Persists 6 Months On”

  1. I understand the listing is deceptive and is a violation, but why are we upset that Google returns the result for the given query. Yes, Google is interpreting this as a branded search because the listing name matches word-for-word.

    Good on them for figuring out the algorithm and making a listing to capitalize on it. That's literally the definition of SEO.

    Are you suggesting that someone else shouldn’t be allowed to name a business Plumbers Boston MA so they can (legitimately) create a listing with that name and capitalize on the search results?

  2. @Mark
    Please read my previous posts about Hummingbird. I am suggesting that Hummingbird
    1) Has surfaced old, moldy spam that the local team had buried long ago
    2) In giving preference to a single result as opposed to a pack, effectively limits the opportunities for all businesses
    3) Will frequently return this result on semantically similar phrases. For example, this same result shows frequently for DUI Lawyer Los Angeles
    4) Has highlighted a website whose deceptive nature is a clear and present danger to consumers. They out-and-out promote falsehoods, and for all the talk of Google wanting quality, this is a good example of how they failed.

    A business can do whatever they want in terms of ranking. They can do whatever they want in terms of naming.

    However, I expect of Google a modicum of quality in local.

  3. We are getting pretty deep into some research with lawyers across the country, and everything is basically the opposite of what we have seen Google preach, especially organic rankings. Spam is holding up almost all rankings across the markets we are looking at. So a lawyer can follow guidelines and hope that Google's algo catches up, or get sucked into doing what most are: breaking the guidelines to do things that push rank fast.

    It's beyond frustrating, especially when punishments aren't equal across the board and Google is at fault. Here are some examples:

    1. Having a business name be part of a ranking factor in local or organic, but then telling businesses not to spam it. How in the world does a business name relate to ranking? Of course people are going to spam it. Exact match domains or exact match company names should be completely dead as a factor…but they're not.

    2. Not allowing links to be built, only "earned," and then still rewarding built links with super rich anchor text relating to terms you want to rank for. Nobody links with true money phrases unless they are an SEO linking to a friend or an SEO linking to themselves. Yet it still makes a HUGE difference in rankings for many sites while others get penalized for it (manually or by filter).

    Unless things like this are seriously reexamined in Google's current approach, crappy spam will always win. Penalizing some people who do these things while still rewarding others just doesn't make for good search results. My hope is that people begin to find apps and other services and websites that bring better results than Google, because Google has clearly chosen to talk the talk but not walk the walk.

  4. In Austin one of my favorites is a search for "Austin Plumber": the business name is not real, the domain does not open, and the choice of several credible businesses for a searcher is gone. This is hurting legitimate businesses and hurting legitimate SEO companies that are playing by the rules. As mentioned before, this has the potential to shift the focus from real brand building work to simply spamming a listing just to get calls.

  5. Mike –
    That keywords in a business name are such a critical ranking factor is frustrating to no end, especially when you are trying to work with clients to better their branding/marketing. UGH!

  6. @Mike @Dan – I find it really frustrating as well. I see a lot of businesses (and SEOs for businesses) buying EMDs or PMDs and throwing up some variation of their existing site (enough to not be dup content), and ranking with little to no quality link equity coming in. I shake my head and try to explain to clients that "in time" the good work climbs above the 'suspect' work.

  7. In Italy there's a seaside town called "Milano Marittima". "Marittima" in Italian means "on the sea," but in this case it isn't the seaside part of the city of Milano, as you can see here:

    Sometimes searching "hotel milano marittima" returns a SERP that doesn't match the location, and sometimes a "place info box" appears with a hotel that keyword stuffs "hotel milano marittima," as you can see here:

    P.S.: For about a year now we have been sending weekly feedback on this to Google, and to John Mueller directly as well.

  8. Once Google effectively figures it out and the culprits lose rank I wonder how many of them will then attempt to rebrand or rename again in hopes of staying on top of ranking or if those types will then turn to blackhat strategies to regain their position.

  9. This wouldn't be as much of an issue if any of the Goog's reporting mechanisms worked effectively. I've used at least three reporting mechanisms over a three year period for reporting local spam, including Maps Report a problem, Map Maker Edit, and Google+ Local Edit details, and they're broken (literally), or they fall into a black hole, or they remain in purgatory. The process of removing spam has become more difficult, tedious, and complex with the passage of time. It doesn't help that with each iteration of Maps, the different product groups are effectively hampering the fight against spam with divergent and sometimes opposing product goals. Currently, on Map Maker, there are two different groups reviewing edits, and each group has different tools, different masters, and different guidelines, which has led to a lot of problems.

    The Local Spam Team seems preoccupied with magical incantations, aka spam algorithms, that don’t work, and is unwilling to ensure that the process of identifying and removing spam occurs harmoniously across all the products. There’s been little to no investment in training to ensure that every product group in Geo is on the same page when it comes to spam. Instead of being the referee by resolving disputes between different teams and groups, and taking the lead on their “product”, spam, they’ve been content to basically do nothing, and allow the default Google mode of gobbling up and hoarding as much data as Google can, and hope that the algorithms will sort it out, someday.

    As it stands now, in certain categories, especially service oriented businesses, if you’re not greyhat, or even blackhat, you’re at a competitive disadvantage. It only takes one person cheating, and remaining undetected and unpunished, before everyone is cheating, and since Google has a laissez faire attitude about enforcing their own guidelines and TOS, it’s everyone for themselves.

  10. @Dan
    Google has seemed to reduce their efforts in local spam reduction.

    Whether that is a function of them undergoing so many changes and not being able to create cohesion internally, and is temporary, or is a more permanent laissez-faire approach and reliance on algos, I am not sure.

    I do agree that it is worse now than it has been in a number of years.

  11. @Mike Blumenthal: of course… too much time and too many feedback reports. We are helpless!
    And this is only one example: I can show you some adult-site examples, untouched after spam feedback (keyword stuffing, link networks, bad link profiles, duplicated content, …)

  12. Hummingbird is not Google's finest hour! I've seen some terrible results with it. It's like we're back in 1999, lol, when queries just do NOT match the content at all. I guess it varies, but overall it needs work. Even the YouTube portal is buggy!

    Thanks for letting me add my input 🙂

  13. @Dan,
    Hey Dan, I work with hundreds of SABs, and I can only assume that your frustration with 'map spam' is all the fake addresses being used to bring businesses up in cities they really don't have a physical location in. I gave up reporting bad locations and now just convince the cheating businesses to do the right thing and take their own listings down by removing the value of them. Simply write them a Google review with 1 star and say:

    "This company does not exist at the location they are claiming here that they do. If they choose to be deceptive and not tell the truth in their Google listing, how could you trust them in your home?"

    It's not like they can complain to Google to get them to remove the above review without outing themselves. So, since the fake listing now hurts more than it helps, they 'voluntarily' take the listing down. Quicker, easier, and a lot less frustrating than trying to do it through Map Maker's reporting tool. When the sheriff's not doing his job, there's nothing like a little 'vigilantism' to 'clean up' the streets. ; )

  14. @David:

    How do you do that when the market is completely flooded with spam?

    Case in point: Denver, CO. 1005 spam locksmith listings. There are fewer than 20 legit locksmiths in the Denver metro region. Even if you flood the spammers' listings with negative reviews, legit locksmiths risk retaliation (even to the point of copy/pasting the same review that was posted previously on the spammers' listings) by the spammers, who can afford to pay for not only spam, but spam reviews.

    Google needs to step up to the plate and fix the Report a problem process. It's really easy to detect spam, and it's easy for Google to remove. In any given business category, it takes me about a minute or less to evaluate whether a business listing is spam (that includes a reverse address search, checking for keyword stuffing in the name, and checking adherence to quality guidelines), and on average it takes me about 30 minutes to teach someone how to detect spam in their area. Is Google incapable of doing the same? Their algos don't work, and they're dependent on crowdsourcing to detect the spam. So why aren't they removing it? Why aren't they slowing the rate of spammers adding their listings to Google Maps? Why don't their reporting mechanisms work? What is the spam team doing, other than twiddling their thumbs?

    Here’s a “small” sampling of the spam on Maps:,%20CO/s/locksmith?hl=en&gl=us

    Anyone can clearly see that it’s all spam once you get past the first 20 or so.

    It's hard to take Google's efforts very seriously and expect local SMBs to compete with the spammers' blackhat SEO efforts. It's David vs. Goliath & Goliath.
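    The manual check described above (a service keyword plus a location term masquerading as a business name) can be sketched as a naive heuristic. This is purely illustrative and not anything Google's spam team actually runs; the keyword list, city list, and flagging rule are assumptions made up for the example:

    ```python
    # Naive "query as a business name" check, a sketch of the manual
    # evaluation described in the comment above. Lists are illustrative
    # assumptions, not anything Google actually uses.
    SERVICE_KEYWORDS = {"locksmith", "plumber", "dui", "attorney", "lawyer"}
    CITY_TOKENS = {"denver", "austin", "boston", "los", "angeles"}

    def looks_keyword_stuffed(business_name: str) -> bool:
        """Flag names that read like a search query rather than a brand,
        e.g. "DUI Attorney Los Angeles" or "Plumbers Boston MA"."""
        tokens = [t.strip(".,") for t in business_name.lower().split()]
        # Allow simple plurals ("Plumbers") to match the keyword list.
        has_service = any(t in SERVICE_KEYWORDS or t.rstrip("s") in SERVICE_KEYWORDS
                          for t in tokens)
        has_city = any(t in CITY_TOKENS for t in tokens)
        # Service keyword + location term is the classic pattern the post discusses.
        return has_service and has_city

    print(looks_keyword_stuffed("DUI Attorney Los Angeles"))  # True
    print(looks_keyword_stuffed("Smith & Jones, LLP"))        # False
    ```

    A real evaluation would layer on the other checks mentioned (reverse address search, guideline adherence), but even a crude rule like this shows how mechanical the pattern is to spot.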
