The study details released today, while an interesting and necessary step toward cleaning up Maps, leave many questions unanswered.
> Our study shows that fewer than 0.5% of local searches lead to fake listings. We’ve also improved how we verify new businesses, which has reduced the number of fake listings by 70% from its all-time peak back in June 2015.
Previous indications from Google were that they had 16.8 million business listings in the local index. If that number is roughly accurate, then they are currently seeing roughly 84,000 fake listings in the index today. And since that is a 70% reduction from the peak, as of June 2015 they had roughly 280,000 fake listings in the index.
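The arithmetic behind those estimates can be sketched as follows. The 16.8 million, 0.5%, and 70% figures are Google's; treating "fewer than 0.5% of local searches" as roughly 0.5% of listings being fake is a simplifying assumption for this back-of-envelope calculation:

```python
# Back-of-envelope estimate of fake listings in Google's local index.
# Assumption: the ~0.5% search rate maps roughly onto the listing count.
total_listings = 16_800_000   # Google's reported local index size
fake_rate = 0.005             # "fewer than 0.5%"
reduction_from_peak = 0.70    # Google's claimed 70% drop since June 2015

current_fakes = total_listings * fake_rate              # ~84,000 today
peak_fakes = current_fakes / (1 - reduction_from_peak)  # ~280,000 at peak

print(f"current: {current_fakes:,.0f}, peak: {peak_fakes:,.0f}")
```

If the real rate is meaningfully below 0.5%, both numbers shrink proportionally, but the 1:3.33 ratio between current and peak counts holds regardless.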
Bad actors posing as locksmiths, plumbers, electricians, and other contractors were the most common source of abuse—roughly 2 out of 5 fake listings.
Another 1 in 10 fake listings belonged to real businesses that bad actors had improperly claimed ownership over, such as hotels and restaurants.
OK, that accounts for half of the fake listings they studied (2 in 5 plus 1 in 10). What was the other half made up of?
And exactly what constitutes a fake listing? Does it mean anything that violates the Guidelines? Or only a listing placed at a wrong address? Those are different things.
The criteria must exclude fake names at real businesses; otherwise the number would have to be higher.
They note that they have improved the verification process and are testing even more rigorous processes, in the form of advanced verification for locksmiths and plumbers.
> Combined, here’s how these defenses stack up:
>
> - We detect and disable 85% of fake listings before they even appear on Google Maps.
> - We’ve reduced the number of abusive listings by 70% from its peak back in June 2015.
> - We’ve also reduced the number of impressions to abusive listings by 70%.
It’s interesting that Google is publicly releasing data about the quality issues in Google Maps. The data provides some reason for hope but goes nowhere near far enough in helping the industry or public understand the scope of the problem.
As local marketers focused on high-value industries where spam is more likely to be seen, we see the many abuses first hand. That point of view may jaundice our perspective as to the overall quality of the index, but if these 84,000 fake listings are concentrated in 5 or 10 key markets, then the averages don’t really mean much. And there is in fact still a quality problem in local.
And if the definition of fake listings doesn’t include everything that degrades them, fake names as well as fake addresses, then Google has undercounted the problem.
Then again, because of our unique point of view, maybe we overcount.
In a sense it doesn’t matter. Google has a perception issue (see Danny Sullivan’s “A deep look at Google’s biggest-ever search quality crisis”) and local has long been at the forefront of that problem.
It’s great that Google is studying the issue, and it’s even better that they are being somewhat transparent and sharing their results.
But that still isn’t enough. The quality, at least in those heavily impacted industries, needs to improve. The definition of fake needs to be expanded to include bogus naming, if it hasn’t been already.
And there needs to be increased transparency about Google’s efforts in this arena.
Only then can Google overcome the current crisis of confidence that they are experiencing. Google has become, by hook or by crook, the utility that provides the bulk of driving directions and business discovery.
The public needs to know that they can trust Google 100%. Given that in some markets the problem is likely much more widespread, it seems that in this context even 99.5% isn’t quite enough.