The scuttlebutt in the world of Local SEO has long been that Google Insights was inaccurate and unhelpful as a guide to consumer behaviors. But is the newish Insights a better guide? How accurate is it and how can we test it?
Prior to the rollout of the Google My Business dashboard in June of 2014, the original version of Insights was most certainly a piece of crap. Data would disappear or change, the product would not report for weeks on end, and it would display spurious, unbelievable spikes. In fact, at one point years ago, when I asked Google how a given data point in the old Insights was measured, I was told that the person who had coded it had left and they had no idea.
The Insights that rolled out in 2014 seemed to me, at least anecdotally, more robust and reliable if not perfect. I had gained at least enough confidence to use the information as a directional guide and felt comfortable that it was accurate enough for client consumption and for decision making.
But was it accurate? Many in the industry continued to diss the product, but no one had really bothered to test it. Most of the data shown in the dashboard is captive to Google's environment and largely unknowable. Only Google really knows what goes on with users' interactions with search, the Knowledge Panel, the Local Finder, and Maps. If they don't share it, we can't really know.
How could we test Insights?

Last week I asked myself the question: was there a data point within the dashboard that could be measured in some other way?
The answer is yes. There is one (and, as far as I can tell, only one) Insights metric that can be externally measured: Customer Actions: Visit your website. With a campaign tracking code appended to the website URL on the listing, you can at least compare Google's value for that metric in Insights with their value in Google Analytics[1].
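As a rough sketch of what such a tracking code looks like, here is how one might append UTM parameters to the website URL used on a listing. The parameter values (source, medium, campaign name) are my own illustrative choices, not anything Google or GMB prescribes; use whatever scheme your Analytics reports are set up to recognize.

```python
from urllib.parse import urlencode, urlparse, urlunparse

def tag_gmb_url(base_url, source="google", medium="organic", campaign="gmb-listing"):
    """Append UTM campaign parameters to a listing's website URL.

    The default values here are illustrative assumptions, not a
    Google-mandated naming scheme.
    """
    parts = urlparse(base_url)
    utm = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string already on the URL.
    query = f"{parts.query}&{utm}" if parts.query else utm
    return urlunparse(parts._replace(query=query))

print(tag_gmb_url("https://example.com/"))
# -> https://example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```

Once that tagged URL is in place on the listing, sessions arriving with those parameters show up in Analytics under the campaign, giving you an external number to hold up against the Insights figure.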
What did I find? That the accuracy of Google Insights seems to be quite good. The number from the GMB dashboard for Visits is very close to the Analytics number for that campaign.
The GMB dashboard showed a figure of 1.14K, while Analytics for the same period showed 1,147 sessions from that campaign source.
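For what it's worth, the arithmetic on that gap: the dashboard abbreviates its display, so "1.14K" hides the exact underlying count. Treating it as 1,140 (an assumption, chosen as the worst case consistent with the display) bounds the discrepancy:

```python
insights_visits = 1140     # assumed worst-case count behind the "1.14K" display
analytics_sessions = 1147  # sessions Analytics reported for the campaign

pct = abs(analytics_sessions - insights_visits) / analytics_sessions * 100
print(f"Worst-case discrepancy: {pct:.1f}%")  # Worst-case discrepancy: 0.6%
```

Under one percent either way, which is about as close as two independently reported metrics ever get.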
So I have several questions for you.
Firstly, are you seeing the same thing, with Insights web visits matching your Analytics campaign numbers? Now that we have 18 months of Insights data, you should be able to go back and look over a long period[2].
Secondly, can you think of another way to externally measure the accuracy of Insights?
Thirdly, is your faith in GMB Insights increasing, or do you still distrust the product?
Does a strong showing in one area (web visits) mean that the other areas, like clicks to call and driving directions, can be trusted?
1 – Certainly there are issues with using this as the benchmark. If Google were cooking the books, they would have to do so across two products. While not impossible, that is much harder. It is also conceivable that Insights uses the exact same data as Analytics.
2 – Unfortunately, I do not have a longer data set to share.