We live in a big country. Behaviors differ between men and women, between rural and urban residents, and from region to region. This is as true online as it is offline.
Those who leave reviews are not a uniform lot, nor are their preferred review sites. In my recent research into which sites US internet users prefer for leaving reviews, it was a 1-2-3 finish for Google, Facebook and Yelp. But there were interesting differences by gender, urbanicity and, to an extent, income as to which sites reviewers preferred. There are likely other differences as well, but the sample size was not large enough to draw conclusions.
There were few gender differences among those who left reviews at Google, TripAdvisor, Angie’s List or YP.com, with each site preferred by roughly equal numbers of males and females. Perhaps it is self-evident, but women comprised a significant majority of those who left reviews at Facebook. Yelp had a similar tilt towards men.
There was little difference in preference among those living in suburban, rural or urban environs on Google, TripAdvisor, Angie’s List, YP.com and Citysearch. But there was a distinct urban bent among Yelp’s users and a definite tilt towards suburban and rural users among those preferring to review on Facebook.
Cumulative reviews grew 44% year over year to approximately 61 million, and approximately 40% of new reviews were contributed through mobile devices.
Average monthly unique visitors grew 27% year over year to approximately 138 million,* and average monthly mobile unique visitors grew 51% year over year to approximately 68 million.**
Active local business accounts grew 55% year over year to approximately 79.9 thousand
It is interesting to note the traffic growth and the fact that Yelp didn’t seem to say whether it came from desktop/mobile search or from their mobile app. Given Yelp’s incredible performance in the Google desktop SERPs, one has to assume that they are getting an increasing share of that traffic from Google as opposed to their app.
TL;DR: Amongst consumers that leave reviews more than once per year, which sites do they prefer for leaving reviews? The answer might surprise you. Google is number one overall, but Facebook made a strong showing and outpaced Yelp for the number two spot as a preferred site to leave reviews.
Reviews have two sides:
Where do people read them?
Where do people like to leave them?
I suspected that the answers to these two questions might not be the same.
Facebook reviews received more of my attention with the Big Earl’s controversy in early June. It elevated Facebook on my radar and I started gathering anecdotal evidence that Facebook was making inroads into the local review space despite the fact that they are not highlighting reviews in any significant way.
I also saw the phenomenon on Barbara Oliver’s FB page: despite her making no specific effort to get reviews there, they were piling up at a steady rate. I was even seeing Facebook ratings and reviews in industries, like insurance, that are notoriously hard to get reviews in.
To that end I created a large-scale consumer survey at Google of US adult internet users, first to figure out who left reviews for local businesses regularly and then, amongst those users, which sites they preferred for leaving reviews.
Using Google Surveys, I created a filter question to identify users (self-reported) who left reviews at least once per year, and eliminated from further study those who rarely if ever left reviews.
We asked 2,671 respondents the following, with a choice of 5 possible answers: After purchasing from a local business, I will take the time to leave an online review for that business (% response in parentheses):
-Almost never – less than 1 review per year (19.6%)
-Occasionally – 1 to 5 reviews per year (15.7%)
-Somewhat frequently – 6 to 11 reviews per year (4.2%)
-Very frequently – 12 reviews or more a year (2.4%)
The vast majority of respondents noted they never or almost never leave reviews (77.8%). Is it any wonder that getting reviews is hard?
The 703 respondents (22.2% of the total) who answered occasionally, somewhat frequently or very frequently were then asked a follow-up question to indicate their preferred site:
When you leave a review online for a local business which site are you most likely to use?
The margin of error in the survey is such that Google’s “victory” is statistically significant. One could argue, though, that the difference between Facebook and Yelp is small enough that we can’t really tell which is actually in second place.
But this finding is corroborated by a second survey I conducted in which users were allowed to pick ALL the sites they are likely to use (1,002 responses).
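For context, the significance claim can be sanity-checked with a standard margin-of-error calculation for a survey proportion. This is a minimal sketch; the 30% share used below is an assumed illustrative value, not the survey’s actual figure:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a ~95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# 703 respondents answered the follow-up question; the 30% share
# below is an assumed value for illustration only.
moe = margin_of_error(0.30, 703)
print(f"±{moe * 100:.1f} percentage points")  # ±3.4 percentage points
```

With roughly 700 responses, any gap smaller than about 3–4 points between two sites sits inside the noise, which is consistent with the Facebook/Yelp caveat above.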
Factual stopped accepting individual manual contributions through their website (see here when you choose “Update/Add Business Data”). Instead, they now urge business owners to contact one of the “Trusted Data Contributors” to get listings updated (obviously in exchange for a fee, though that is not mentioned anywhere on Factual). It is also interesting that they don’t make it very obvious that they still accept contributions through their API.
Not that the manual approach ever worked properly – it could take anywhere from 1 day to 3 months for an addition or update to get approved.
Bottom line is that if you want to be sure that Factual (which feeds Apple) has your data, you will now have to pay UNLESS you use their API. Here is a complete list of their trusted data contributors:
MozCast has now updated their query set to better reflect what searchers are seeing. Even though their methodology was different from Whitespark’s, the new results showing a decline in 7-packs due to the Google Local algo update are much the same: a 23.4% drop.
It is interesting to note one of their observations, which correlates with what I am seeing: a number of “these queries now have authoritative one-boxes instead of packs”. That is consistent with Google’s statement that it is using more web signals, in this case demonstrating a predilection for brands and one-boxes à la Hummingbird. This brand preference might also lead to the additional 3-packs often seen on brand queries.
Here is the communication from Cyrus regarding the MozCast update:
So, the fix to MozCast seems to have worked, and it’s as we expected – there was a drop, but less than originally reported. On July 23, before the decline started, we measured local packs on 12.06% of localized results. Today, we’re seeing 9.24%.
Interestingly, this is a 23.4% drop, almost exactly what Darren saw in his data (just read that this morning). Could be a coincidence, but since we used different methods, different data sets, and had no idea what each other were doing, I’d say that 24% number is pretty close to the truth.
Here are some queries that seem to have legitimately lost local packs (at least in the regions I’m checking them in):
money gram (misspelled – interestingly, “moneygram” returns a pack)
subway store locator
jeeps (“jeep dealership” does get a pack)
bed and breakfast
In a few cases, these queries now have authoritative one-boxes instead of packs. In a few other cases, I’m still seeing packs on manual inspection, and I can’t account for the mismatch. Our code shows no pack for “used car” in Hartford, CT, for example, but manually setting location in Google does. So, this could be volatile.
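The 23.4% figure above is a relative drop, worked out from the two pack-display rates Cyrus reports:

```python
before = 12.06  # % of localized queries showing a local pack on July 23
after = 9.24    # % showing a local pack after the update

relative_drop = (before - after) / before * 100
print(f"{relative_drop:.1f}% relative decline")  # 23.4% relative decline
```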
Clearly the Local algo update (note to Matt McGee: can’t we do better than naming it after a pigeon?) has had an impact and a large one.
The more important questions, though, revolve around the real-world impact on local businesses. Is there a decline in calls? Is there a decline in driving directions? Are there fewer web visits? Over the next few weeks, as we learn more about these real-world impacts, we can hopefully better understand how to advise clients.
Last week I reported that MozCast was showing a decline of over 60% in display of the Local Pack on Google after the recent local algo update. Moz was gracious enough to share their data, and it was determined that their search queries had been obsoleted by the update. So while their data were internally consistent, they were likely overstating the drop.
I reached out to Darren Shaw at Whitespark, and he agreed to analyze their historical ranking data for any decrease in display of the Local Pack, as (1) they have a larger data set and (2) they set location differently (not using the near parameter).
Whitespark’s results? A 24% decline in display of the pack during the two-day peak drop (using the same date range as Moz). Not as large a drop as indicated by Moz, but a significant drop nonetheless.
Like Moz, their data show a small recovery subsequent to the initial multi-day drop.
Terms that appear to have been dropped:
From Darren’s post: Terms that appear to no longer be triggering local packs (based on our rank tracker data and some manual testing):
commercial * (painting, construction, remodeling, etc.) – anything with “commercial” preceding it seems to have stopped returning a local pack.
Comments and notes.
What is reality? We won’t ever know exactly how many Local Packs Google has stopped showing nor do we have any way to easily validate any of the methods used.
We have determined that the Moz methodology, while internally consistent, is likely over-reporting the drop. Whitespark sets location differently and is thus able to overcome the limits of the Moz report. Google, though, has a great many tools at their disposal, and we have no way of knowing how either data set measures up against searcher realities.
Moz’s data are meant as a real time directional view of the data and in that sense served their purpose. Whitespark’s data, on the other hand, is a retrospective review of actual ranking reports.
Whitespark used a search parameter that did not change as much and has a larger data set than Moz. While Whitespark’s sample is larger, it too could deviate from reality, as the phrases used are keywords chosen by businesses as “money terms” worth tracking and don’t necessarily reflect the full reality of search.
Moz’s data served its intended function of validating observed changes in real time.
That all being said, Whitespark’s number is probably closer to reality than the Moz data, as both anecdote and methodology seem more consistent in their results.
There was a big drop. Phrases that previously returned a pack do so no more.
We will never know the exact size of the drop, but it was likely not as large as originally reported. It is still big.
A number of businesses will be affected.
The changes are probably still occurring although at a much slower pace.
Yelp is obviously very, very good with their SEO. They apparently have the ability to sculpt their internal link values to highlight what appear to be the most popular local businesses in the Google local results.
Apparently their ability to do that in their strongest markets is even greater than elsewhere.
These results were first highlighted by Matt Storms on G+ (h/t to Max Minzer) well before the current local algo update, and they are still seen in the SERPs. They reflect Yelp’s ability to manipulate the search results and reflect poorly on Google’s acceptance of those practices. Yelp, though, needs to be careful of soiling the bed in which they sleep. Although I suppose they could fall back on their all-too-successful (but BS) cry-wolf strategy if Google were to clamp down.
Look at these searches (I am sure you can find more):
Last week, in the wake of Google’s Local algo update, MozCast was showing a precipitous decline in their tool that measures visibility of the pack. With access to the actual queries (thanks to Moz for that transparency), Linda (and to a lesser extent I) noticed anomalies, with totally unpredictable results based on previous searching techniques.
The other reality is that the search results appear to have been changing on a regular basis over the past 72 hours and appear to have not yet “settled” in. See today’s chart captured below.
What Moz was tracking did decline precipitously. It appears, however, that the way Moz was tracking, using the “near” parameter, has been severely affected by this update. The bottom line seems to be that while there was a drop in 7-Pack displays in the SERPs, MozCast is probably overstating what the “average user” (which, as Cyrus points out below, is a mythical baseline) is seeing. I am embedding below the best discussion of this from G+, started by Enrico Altavilla, and highlighting what appear to be the best comments about what is known so far:
+Moz – thanks to +Enrico Altavilla and +Mike Blumenthal, the crawling issue was probably identified.
When crawling Google without using a real local IP address but only the modified URL with the near tag included – for Chicago, for example – Google SERPs are delivering organic results for Chicago but the maps pack for the IP location used for crawling. In the next example I have run the search query with an IP from Philadelphia using your custom URL for Chicago.
I am very happy to have people validate #MozCast data – this is a real-time system designed to detect changes on the fly, and that can be tricky. If a change is big enough, the system may not be flexible enough to adapt.
In this case, the situation is complicated. Here’s what I know so far:
(1) This is not a system glitch, in the usual sense. MozCast is collecting data normally, and the numbers accurately measure what the system is seeing (more on that in a moment).
(2) The drop coincides almost exactly with the “Pigeon” roll-out, so we know something is happening. People have verified pack drops, although others have verified packs on queries that previously had no packs. All of this information is anecdotal, so it’s hard to sum it up.
(3) I have been able to manually verify some of the pack drops. However, I have also seen queries where I’m still seeing packs, even though MozCast indicates a drop. In other words, the system doesn’t seem to be either completely wrong or completely right.
(4) I have manually verified that our geo-location methods do not seem to be working the way they did previously. In other words, the system isn’t “seeing” what we expect it to see. This change seems to have happened with the Pigeon update. So, I suspect that Google has made some changes to how they handle and support geo-location (which their public comments suggest as well).
As of today (and, unfortunately, this change happened close to a busy weekend), my best guess is that (a) something did happen, but (b) the change is being exaggerated by MozCast. The question is – how much is it being exaggerated? I don’t have that answer yet.
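A rough sketch of the crawl mismatch described in the thread: rank trackers pinned location with the `near` URL parameter, while a real user’s local pack follows their IP geo-location. The exact URL shape below is an assumption for illustration:

```python
from urllib.parse import urlencode

def crawl_url(query, near=None):
    """Build a Google search URL, optionally hinting location via 'near'."""
    params = {"q": query}
    if near:
        # Location hint embedded in the URL; post-update, the local pack
        # appears to ignore this and follow the crawler's IP instead.
        params["near"] = near
    return "https://www.google.com/search?" + urlencode(params)

# The crawler requests Chicago results via the URL...
print(crawl_url("personal injury lawyer", near="Chicago, IL"))
# ...but if the crawling IP sits in Philadelphia, the pack may be
# localized to Philadelphia while the organic results match Chicago.
```

That split between organic results (following `near`) and the pack (following the IP) is exactly the anomaly Enrico reports above.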
Anecdotal reports were highlighting changes but not as severe as MozCast was reporting.
Note that the MozCast chart is continuing to show changes:
Update: Moz has provided me with a list of local searches that were returning packs that no longer are. I am sharing this here as a Google Doc. If you draw any conclusions from the data please reshare it.
There has been some discussion on Plus and at Search Engine Land about the impact of the recent Local Search algo update on directories and Local Pack results. While the article at SEL was anecdotal, this recent data from Moz is less so.
Out of the 10K keywords MozCast tracks, 5K are localized (to 5 metro areas). On the morning of 7/24, 560/5000 (11.2%) were showing pack results. This morning (7/25), only 212/5000 (4.2%) were showing pack results. We saw a 60%+ drop day-over-day.
Local carousels were also down, and one-boxes seem to be up.
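The day-over-day numbers in the Moz quote work out as follows:

```python
tracked = 5000       # localized keywords MozCast tracks
packs_before = 560   # keywords showing a pack on 7/24
packs_after = 212    # keywords showing a pack on 7/25

rate_before = packs_before / tracked * 100   # 11.2%
rate_after = packs_after / tracked * 100     # 4.2%
drop = (packs_before - packs_after) / packs_before * 100
print(f"{rate_before:.1f}% -> {rate_after:.1f}%: a {drop:.0f}% drop day-over-day")
```

That 62% relative decline is where the “60%+ drop” figure comes from.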
When viewed in the context of the Local MozCast, the apparent drop in 7-Pack results appears significant. I suppose it is conceivable that they are showing more packs on searches that Moz isn’t tracking, but the Moz sample is large and varied, and this is the best overall view so far.
Update: Great tip from Joy Hawkins to get a sense of the changes: In fact, if you search Google.ca for anything you’ll see the results that USED to show in the US yesterday (old algorithm I’m guessing)
Last night, just before going to bed, reports (h/t to Brian Mayo) started drifting in about missing 7-packs in the real estate results. Map results that had been showing on almost all real estate related searches had disappeared, as had those for DUI lawyers. Around that same time, Search Engine Land reported that Google had announced a major update to the local ranking algo:
Google has released a new algorithm to provide more useful, relevant and accurate local search results that are tied more closely to traditional web search ranking signals. The changes will be visible within the Google Maps search results and Google Web search results.
The core changes are behind the scenes, but it does impact local search results rankings and some local businesses may notice an increase or decrease in web site referrals, leads and business from the change.
Google told us that the new local search algorithm ties deeper into their web search capabilities, including the hundreds of ranking signals they use in web search along with search features such as Knowledge Graph, spelling correction, synonyms and more.
In addition, Google said that this new algorithm improves their distance and location ranking parameters.
The Local Search Weather Report is showing higher volatility today and MozCast Feature Graph for local seems to have captured the loss of the 7 pack on a number of searches where it was previously present (although they are showing no decline in the carousel):
In searches I follow there have been ups, downs and the disappearance of the pack where it was previously prominent. In one case a detached listing, which had been doing well both organically and locally but wasn’t in the pack, returned to the pack.
There also appears to be less duplication between the 7-Pack and the organic results, where previously the order of the organic and local results mirrored each other. And in this search at least, the radius of the search has been reduced significantly. The three top organic results were all located in the suburb to the east of the city.
Google noted in the SEL article that the changes were rolling out in the US. I am curious whether Canada or Europe is seeing similar turmoil. Your observations would be welcome.