Infographic: Citations – Time To Live
The local ecosystem is a complex web of interrelations, with Google having positioned itself at its center. Given this complexity, just how long does it take for data to move through the various parts before it makes it into Google's index? And from the main index into their local index and the cluster of data they have about your business? Just why does fixing an error or changing a listing detail at InfoUSA take so long to impact your Google listing?
David Mihm and I have been working on detailing the time it takes for any given citation creation to impact the Google cluster for your business.
Chart Explanation
Our goal is to provide a broad-stroke view of the range of times it might take for citation data to show up in a desktop Google search. The ranges are estimates only, based on our experience, and do not reflect comprehensive empirical data. As such, you might find discrepancies with our assessment of any given citation tactic. That being said, we think the information is broadly accurate and provides insight into the delays at various points in the local ecosystem.
Depending on where the data enters the system, it can take more or less time to finally make it into Google's cluster of data in their local index, and depending on where it hits in any given cycle along the way, it can get there more or less quickly.
For example, in the case of Infogroup, they might take a two-month cycle to vet a new listing and another month before the data is fed to one of their customers for display in a local directory. Thus the range of times, depending on when the data hits their cycle, could be as long as 180 days (blue) before it first appears live on the web. Depending on the importance of the page and the visibility of where that data is shown, it might take anywhere from a day to sixty days for Google (orange) to include the data in their main search engine. From there, Google then needs to rebuild their local index and include the new citation data in the Google+ Local cluster (green), which occurs every 4 to 6 weeks.
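For readers who want to play with the math, here is a minimal sketch of how those per-stage ranges stack into an end-to-end range. The stage names are our own labels, the day values are taken from the example above, and the midpoint "average" is a simplification of ours, not measured data.

```python
# A minimal sketch (not the chart's actual methodology) of how the per-stage
# day ranges described above stack into an end-to-end range.

STAGES = {
    "aggregator_to_web": (0, 180),    # blue: vetting + feed to an IYP site
    "web_to_main_index": (1, 60),     # orange: Google crawling the IYP page
    "main_to_local_index": (28, 42),  # green: local index rebuild, every 4-6 weeks
}

def total_delay(stages):
    """Sum per-stage (min, max) day ranges into an end-to-end (min, max)."""
    low = sum(rng[0] for rng in stages.values())
    high = sum(rng[1] for rng in stages.values())
    return low, high

low, high = total_delay(STAGES)
print(f"End to end: {low}-{high} days, midpoint ~{(low + high) / 2:.0f} days")
```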
The circle thus represents an educated guess as to the average time to inclusion in the Google+ Local cluster for data that started at any given point.
Discussion
Historically, as I have noted previously, a listing that went through a list broker, on to a primary list supplier like InfoUSA, and then off to Google faced a number of time delays before it would hit paydirt in the business cluster in the Google local index. This data could, if it hit every cycle just wrong, take as long as 9 months from beginning to end.
Recent developments in the use of APIs by CityGrid (2010) for their ad network, Yext (2011), and UBL Direct (2012) now remove one level of the delay and push data in near real time from aggregator to display on an IYP site.
There is still, however, the issue of the time it takes for Google to bring third-party data into their main index and then to do a data build to push it into their local index. Google has dramatically sped up the movement of new data into their main index over the past few years, but given that the local index is rebuilt only every 4-8 weeks, there is still, on average, a 5- or 6-week delay for data to show in their local index AFTER it hits the IYP page.
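To see why a periodic rebuild translates into a multi-week average delay, here is a quick back-of-the-envelope simulation. The uniform-arrival assumption is ours, not anything Google has confirmed.

```python
import random

# Assumption (ours): new citation data lands at a uniformly random point in
# Google's local-index rebuild cycle and must wait until the next rebuild.

def average_wait(cycle_weeks, trials=100_000):
    """Mean weeks until the next rebuild, for uniformly random arrivals."""
    waits = (random.uniform(0, cycle_weeks) for _ in range(trials))
    return sum(waits) / trials

for cycle in (4, 8):
    print(f"{cycle}-week cycle: ~{average_wait(cycle):.1f} weeks average wait")

# The wait alone averages half the cycle (2-4 weeks); add the time for Google
# to crawl the IYP page itself and the 5-6 week figure above looks plausible.
```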
Google, too, has made improvements in their internal data pipeline, although these improvements affect only their own products. An edit in MapMaker or on a G+ Page for local (one that has merged with the G+ Local page) can now flow to display within several hours once it has been approved, as opposed to an edit in the Places for Business dashboard, which can take as long as a month to make its way to the displayed listing.
So while there have been significant improvements in parts of the local ecosystem, the dream of real-time updates of local data like events, sales, and specials across the whole web is still something that lies in the future. But there will be a day when a business can make two or three edits and have fresh data display everywhere, and the process of changing hours or announcing sales will become trivial rather than the trial it is today.
Please note: This infographic is approved for non-commercial re-use with full attribution. It is not to be re-used for commercial purposes without the explicit permission of the copyright holders.
Comments
Um….judging by what David and you have included here, Mike….I’ll now begin to check my own ss’s to see when we made changes and how soon our changes showed up against your own numbers here…
Oh, wish that the getlisted.ca site was up and running too…so that I could ask you two to work on some specific Canuck numbers too…
🙂
This is awesome research Mike and David. Thanks for putting this together.
It’s very interesting to see how long it takes for UBL data to get included in the cluster. It makes me question Google’s trust in UBL’s data.
I had no idea that CityGrid data was so quick to update. Surprised to see it beating Yext.
It’s also great to know that a merged G+Local page provides near instant updates into the cluster. Good incentive to merge now if you’re a single location business.
Wow, props for making this graph Mike & David. It will be nice to have a clearer idea of when to expect indexing.
I am curious how you came up with the data for the chart. I know some companies openly admit their data refresh rates. Some even have scheduled data refresh dates. Did you use the search engines for data or track your own experiences or maybe even some other super-secret method?
Cheers!
AQ
@Adam
This chart includes a fair bit of Kentucky windage. We used a combination of our knowledge of the industry, stated feed times, known relationships, personal experience, a LOT of back and forth between David and myself, and a dart board to resolve any number still left in doubt.
It is not meant as a bible but as a broad way for SEOs and SMBs to properly set their expectations.
@Darren
Good points. UBL is a broker and doesn't feed data directly to Google. They collect data and pass it along on their own cycle and the cycle of the folks they submit data to (Infogroup, Acxiom, D&B). This can add from 2 to 4 weeks to the process. Depending on the cycles within those organizations (Infogroup, say, might take 2 months to fully vet a listing), you can see that in the worst case it takes a long time. It has nothing to do with whether Google trusts their data. They are slaves to many of the same antiquated systems as everyone else.
As to CityGrid, this data refers to the ads and the attendant citation data they generate. Because they are ads, the data is pushed to sites that subscribe to their API almost instantly. They are also one of the few companies with a direct feed to Google. Thus they shorten two very long segments of the cycle.
The reason that they beat Yext is interesting. It isn't really Yext's fault. CityGrid was one of the slowest companies to accept Yext data in my test (this may have changed), thus dragging down Yext's average. Yext also does not have a direct feed to Google.
Yes, the speed of the new internal data pipeline at Google is impressive, particularly compared to the Places for Business dashboard. Videos, photos, changes to hours all pop in almost instantly. Now if they could just reduce the cycle for rebuilding their complete local index, the whole industry would see the benefit.
@Mike
Thanks for clarifying the trust situation with UBL.
Super interesting that CityGrid has a direct feed to Google. Why does Google value their data so much? Do you have a complete list of direct feeds to Google?
Agreed about the local index rebuild time. I've heard they're working on it. So we'll probably see that area improve in a couple of years 🙂
Really insightful and important info Mike and David. Thanks so much for doing all the research. Will definitely help spread the word on this.
It looks like if you want to be a local SEO'er, it's essential to have a trusted MapMaker account that can make edits immediately.
Great chart you and David put together. This will for sure go to my guys for training.
@Linda
Thanks
@Mathew
There are some things that you can do in MM that you just can't do anyplace else. It's like having a direct feed to Google, but for anybody's business, not just your own.
@Darren
Very few companies that have websites have direct feeds to Google Local. They need to be trusted and have huge (even bigger than huge) amounts of data that Google needs. The only two that I know of are CityGrid (they were an early partner with Google in local) and, I think, Thumbtack, because they have such a great collection of service-type businesses. There are probably more, but I don't know who they are.
Thanks very much for sharing
Infographics always put things into faster perspective for me.
Thank you for sharing the hard work, it'll be nice to point to an easily understandable graphical depiction of update delays rather than relying on verbal skills or other esoteric explanations.
Kentucky windage… ha! Got a good laugh outta that one.
@Chris
That was the intent. Thanks for noticing the Kentucky Windage reference. It’s definition number 3 btw in this Urban Dictionary entry.
Thanks for sharing the viewpoint. Interesting choice of destinations; it implies a simplified local SEO strategy. A few of these probably don't even have a direct impact on Google but should be bunched into "Yellow Page sites".
A couple of other ideas:
Equally important is the challenge of removing duplicate records, which can take many of these providers considerable time.
Measuring other Maps/G+ data insertion speeds: business photos, Panoramio photos, and Offers (both direct and from 3rd parties). On the last point, we've seen it vary quite a bit; from hours to nearly a month before the G+ data pipe associates business deal data.
Great information Mike and David. Thank you as always! This is getting printed out and presented to the partners at my law firm first thing tomorrow. We recently had discussions about the citation cleanup and aggregation process. My previous local SEO experience has always told me to project 3-6 months for this type of work to bear fruit. Good to see that my estimates were not that far off.
First, very useful information and a really great infographic.
I just wanted to make a few notes based on my observation:
1) Google seems to be partnering with a number of vertical websites for business data, so it might be useful to have one (or two) more lines in the infographic specific to these. Unfortunately, the times for these feeds seem to vary greatly, with some feeding data almost as fast as CityGrid (for instance, Urbanspoon and OpenTable in the restaurant industry seem to be such), while others do it much more rarely – just 2-3 times per year (Angieslist is the most prominent example).
2) Google has different data partners in different countries, so it could be useful to note that the infographic relates only to the US ecosystem. On a side note here, Google seems to partner with Qype for data internationally, i.e. not just in the UK, but in every country Qype serves.
3) I believe Merchant Circle has also recently become a Google partner. Unfortunately, by my observation they also have the highest number of duplicate listings of all the major directories, which I see as one of the main reasons for the great outburst of duplicates that happened a few months ago.
I’d be happy to hear your opinion on these.
@Nyagoslav
Good point about this being US-centric. I have a tendency to be provincial in that regard. The logic, though, can be applied to any country… find the source that most closely matches the one above and use that as a reference. Also, most other countries have much simpler data flows. The one takeaway on that point might be that Google is always looking for additional and cheaper data sources, so they might achieve this complexity someday.
Google will look to large-scale partners for information that they don't have. The question for me, though, is whether they are scraping or have a feed. One of the reasons that Google encouraged implementation of Rich Snippets was for the very purpose of more frequent and accurate scraping of these trusted sites.
This is great insight into each of the service offerings. I feel like I’ve been lied to for years (won’t name names, but I think it’s obvious).
Thanks for providing the data and helping us to understand which options for adding/correcting citations may be most effective.
@cory
Speed and reach are two different issues. If someone oversold their speed, it can't be assumed a priori that they oversold their reach. Some services are good for fast coverage and some are good for broad coverage.
Mike, regarding scraping vs. direct feeding, there is a relatively easy way for this to be discovered (I believe). If a new listing is created based on data from just one matching source, this is most probably a direct feed. In any other case, we can’t be sure which one of the two cases it might be. I’ve definitely seen Places listings created solely on the basis of data coming from one source and this source was Merchant Circle (on a couple of completely separate occasions).
Another thing that I just remembered, which I haven't personally observed, would be a probably slightly shorter cycle for LocalEze data to be included in the cluster, as LocalEze "might be" a direct partner of Google (according to David Mihm).
@Nyagoslav
I do not think it is so easy to tell a scrape from a feed. I know of historical cases with Superpages where Google was not allowed to get a feed but was allowed to scrape the site for business listings. For Google it is an issue of scale. There are not many feed relationships because there is either not enough data or not enough unique data.
As always, yours and David's hard work reaps more useful data for your followers – thanks guys – Andy 🙂
Really interesting topic and nice research. I've had experience with CityGrid. They are fast in getting data into Google's network, so it is a helpful opportunity. They can distribute widely.
Also had some problems crop up with them of various types….so I’d use them, just be darned careful, at least in my experience.
@Earl
"Also had some problems crop up with them of various types… so I'd use them, just be darned careful, at least in my experience."
As I recall, weren't the problems caused by a 3rd party not being conscientious about your NAP?
I'm assuming that when you say "average time to inclusion in the Google+ Local cluster" you mean that this is business data Google picks up from these sources and includes in their local index.
Am I right by assuming this?
@todd
That’s correct. And as a corollary it would be included in the cluster for that business.
Thanks Mike and David… as always, GREAT stuff. Love the "Kentucky windage… dart board". The dart board should not be discarded as an inaccurate method… I recall a study, possibly published in the WSJ years ago, that demonstrated throwing darts at stock choices was a more profitable venture than using a broker!! LOL
This info helps to manage expectations.
You both are GREATLY appreciated!
Thanks again as always. Just did some citation mining for a local client of ours, and this is always a great reference to use!