Siri has been making lots of stupid mistakes lately in local search. Mistakes that its younger sibling, Apple Maps, is not making. These are simple mistakes that were not there when Siri was the only game in town. Is she just looking for some more attention? Is she thinking of heading in some new direction, or is Apple Maps just sending her the wrong way?
Here are two example searches that return totally crazy results that Apple Maps gets essentially correct.
Note that Siri cannot find even one jeweler in or near Williamsville and returns results that are 26 miles away, while Apple Maps returns relevant results:
I found these types of results to be fairly widespread while I was traveling last week. Here is another example (Hotels Allegany NY):
I did an interview in GPSBites where I was asked to muse on my background, the current state of mapping, the fate of PNDs & mapping companies and the near future of the intersection of mapping and commerce. Here is a snippet of a much longer interview:
2. You recently published your own survey, designed to gather feedback from users of the new iPhone map application for iOS6. You stated that you did not believe the recent Apple Maps issue was going to affect Apple sales, and in our view Apple must share a similar view, as they went so far as publishing an apology on their home page which even recommended customers use rival solutions in the interim.
However, they certainly are facing some challenges. If you were heading up Apple’s cartography division, what recommendations would you make to the company on how they could improve the experience and application moving forwards?
I am not sure that I agree with the premise of the question. It assumes that Apple does not understand cartography and mapping, and that I have significant insights to provide to them.
Assuming that Apple is stupid or just uneducated is a dangerous assumption. Funky maps make for easy potshots. Remember, it was not that long ago that Google was losing whole towns, repeatedly.
Mapping is hard. Apple knows full well how hard mapping is, and they knew full well that they were going to have difficulties coming into this. When they announced Apple Maps in June 2012, I cannot imagine that they would blow that marketing opportunity by announcing that their new Maps product had a “few” problems. No, they went ahead and presented it as the most innovative mapping product ever. Whether it is remains to be seen, but in some ways they are where Google was with mapping in 2008, in some ways ahead of that and in some ways behind it. But to assume that they need my advice is to ignore a lot.
Apple has been a late starter in several industries in which they ultimately came to lead or develop a very strong market position. They came to the already existing MP3 player market with one device. Over the years, as they developed the necessary skills, they came to dominate that market. When they entered the phone business NO ONE thought that they could succeed. But their smartphone still sets the standard and has significant market share. They continue to grow their PC market share to a healthy position after being at death’s door in that market. So they know how to succeed as an underdog, how to build out the capacity AND knowledge, and how to plan for the appropriate growth when they enter an existing and competitive market.
Mapping is in some ways different but in many ways the same. It takes time to build up the institutional knowledge and the people necessary to compete head to head with the likes of Nokia/Navteq and Google. This knowledge cannot be built overnight, and you can’t ramp up all the necessary efforts or staffing in one fell swoop.
Apple could have taken an easier way out of the mapping dilemma and their conflict with Google if they had partnered with TomTom or Mapquest, both of whom already had turn-by-turn apps working well on the iOS platform. They didn’t. Apple chose to go it alone. The real question that we need to ask (of Apple) is how much of the stack they intend to own, and of the parts that they don’t own, how they are going to get them up to the world-class standards that they surely know they need.
In choosing TeleAtlas, they chose a company with incredible underlying technology but limited resources. Apple has a history of making significant investments in their partners to gain a competitive advantage. By giving TeleAtlas access to the massive amounts of geo data generated by the iOS6 crowd Apple may just have provided TeleAtlas the information that both TeleAtlas and Apple need to compete.
We live in interesting times and Apple’s foray into Mapping promises to make it even more so.
Update: Google not only changed the output of the review content but they changed the interface at the time of review creation to have users select from the descriptive phrases as well. See photo below.
Last week at Getlisted Local U Advanced in NYC, Googler Joel Headley noted that “descriptive terms (poor, good, very good, excellent) are going to be integrated into Zagat review interface more going forward”.
Reader Kerry Fager just pointed out to me that they are now doing just that on the overall annotation on each review on the G+ Local page.
Will the descriptive terms make it to the front page? Certainly the descriptors are more meaningful and if we take Joel’s comment at face value, then we might see this elsewhere.
Why the change? One assumes that “it improves the search experience”. It makes the otherwise obtuse Zagat numbering system into something understandable by mere mortals.
Note: As noted in the comments, there appears to be a concurrent problem with displaying owner comments on the reviews. Most, perhaps all, owner comments are missing in action. Search teams are being dispatched.
Update (10/12/2012 9:00AM): Reports of missing owner responses came in via Twitter within minutes of the release of the product on 10/10. These reports were funneled to Google who fixed this bug by mid afternoon yesterday (10/11/12).
When more granular detail is available (i.e. Quality, Appeal & Service or Food, Service, Decor), Google is now breaking those out individually as well:
Much has been reported in the media about the iPhone Map App that Apple rolled out with the release of iOS6. The early reporting was anecdotal, apocalyptic or often just plain wrong but very little of it addressed the question of what consumers thought of Apple’s new Map app.
What do they think? It would seem for most of them the iPhone Map App is a non issue.
To answer this question I created a survey with the Google Consumer Surveys tool that ran from Friday, October 5th through Sunday, October 7th. The survey indicates that over half of current users of the new app were not affected at all by it, and over 91% fell in the “not going to jump off the Empire State Building” cohort. 74% were satisfied, or perhaps just didn’t notice. Only 3.2% indicated that the Map App would definitely prevent them from buying another iPhone in the future.
When you dig into the numbers a bit you can see some other interesting tidbits: females were less disaffected by the product than males, folks under 55 were more likely to indicate that they were happy, and those over 55 were more likely to indicate that they were never going to buy another iPhone (ouch). Urban users were happier than rural ones.
Earlier this summer, Google removed a large number of residentially located service area businesses (SAB) from the index for not hiding their address. While Google was trying to clean up the index, a number of these SABs were removed in error. It turned out that Google was unable to restore many of those erroneously removed to the index. Some business listings have been restored but others have been waiting now for a number of months.
Google updated their guidance on this issue last night:
Here’s the state of these listings now (October 8):
Service-area businesses who are experiencing the “We currently do not support this location” message should –
1.) Check to make sure you comply with the quality guidelines, particularly hiding your address, if appropriate.
2.) Once you’re sure you comply, contact the support team (select the last option).
3.) If possible, the team will reinstate listings that are OK.
4.) Sometimes, the support team cannot reinstate a listing, even if it’s OK. These listings cannot be brought back because of an issue that we’re still working on fixing. The support team will send an email back saying the listing is down due to a technical glitch. When we have an update, we will follow up with all of the people who got the message about the technical glitch.
What’s the status of listings in #4?
For listings in #4, there isn’t much course of action other than waiting. Please know that our team’s doing everything we can to get them reinstated when possible.
Good news — we’ve been able to bring back some of the listings that incorrectly had the “We currently do not support this location” error. Many previously deleted service area businesses that had their addresses correctly hidden a few weeks ago are back.
If your listing’s not back yet, please know that we are still working on it. In the meantime, please review the quality guidelines and this article on service area businesses. Make sure your listing complies.
For those of you still experiencing this problem, there is only one option: file your request for reinclusion via the Google for Business help files and wait. If Google is unable to recover your listing quickly, you have no choice but to wait for their engineering fix. Businesses that followed Google’s original advice to recreate their listing have not had any success.
This recent email from Google support, sent to me by Kane Jamison of Hood Web Management, clearly indicates that Google is working on these listings on a first-come, first-served basis:
When Google rolled out G+ Local with Zagat reviews they changed the ordering of review content from time based to most helpful. As part of that ordering they added a new category of reviewer known as Top Reviewers. These were folks that had reviewed a large number of locations. Google also added the ability for a business (and I presume its many managers) to leave reviews of other businesses.
Like all things Google, the Most Helpful ordering of reviews is algorithm based and includes elements like the quality of the reviewer (in terms of followers on G+ and number of reviews), the language of the review, the recency of the review and who knows what else. One of the attributes of reviewer quality is the Top Reviewer designation. According to this post, to become a Top Reviewer one needs lots of reviews, a significant number of followers and reviews that have been found helpful by others. It is not clear whether being reviewed by a Top Reviewer increases rank, but there is every reason to assume that a review from a Top Reviewer is carefully watched by Google for other signals and content.
What never occurred to me until this morning was that one way to become a Top Reviewer was to do so as a Google+ Page for your business rather than an individual. A business page can have as many as 50 managers so reviews would aggregate more quickly and ease the burden of any individual reviewer. Obviously this business recognized the opportunity and has leveraged it.
0:16 Listings take a week to go live, a few weeks for link from Google Places dashboard to work
It might take longer than a week depending on their internal build cycles.
0:40 Verified social pages now showing message if edit not accepted
This message appears:
0:59 Fewer categories displaying because uncommon categories no longer appearing
Choosing from the list of auto-generated categories increases the likelihood that a category or two will show. Maybe the speculation in Linda Buquet’s forum about categories changing dramatically is in fact the case? Clearly the missing categories are NOT a bug but an intentional decision on the part of Google.
1:18 International phone number formatting issue with verified social pages
1:28 Formatting not appearing on owner descriptions
HTML tags are no longer showing; some rich text formatting displays while other formatting does not. Google has had problems showing rich text on local listings in the past and they finally seem to be fixing this issue. See above image.
1:45 Google+ Local best practice: edit verified social pages via Google+
What happens if a page is edited via the Dashboard? Not sure but I am sure it isn’t pretty.
We are nearing capacity for the Local U Advanced in New York next Monday but there are still a few seats left if you want to join us. Would love to see you there. Be sure if you are coming to introduce yourself so that I can put a face to your name.
This afternoon Rocky Agrawal tweeted out about this plaque he had noticed hanging in a restaurant. He and I were both completely fooled by the plaque and convinced that it was really from Google. I even thought that perhaps it was an experiment on Google’s part to migrate away from Zagat signage.
I am not sure whom I think less of in this situation: the restaurant that was trying to appear to be more than it really is by leveraging Google’s name and their review product, or the company that soaked them $300 for the “privilege”. A restaurant or hotel can order a sign that touts their good standing with just about any review company including Yelp, TripAdvisor, Zagat, Frommers and many more.
When businesses that are looking for a quick fix deal with companies that are willing to accommodate them, the customer inevitably loses. And in this case so does Google, Yelp, all the other companies whose name can be put on the plaque and every one else in the local space.
The local ecosystem is a complex web of interrelations, with Google having positioned themselves at its center. Given this complexity, just how long does it take for data to move through the various parts before it makes it into Google’s index? And from the main index into their local index and the cluster of data they have about your business? Just why does fixing an error or changing a listing detail at InfoUSA take so long to impact your Google listing?
David Mihm and I have been working on detailing the time it takes for any given citation creation to impact the Google cluster for your business.
Our goal is to provide a broad-stroke estimate of the range of times it might take for citation data to show up in a desktop Google search. The ranges are estimates only, based on our experience, and do not reflect comprehensive empirical data. As such, you might find discrepancies with our assessment of any given citation tactic. That being said, we think that the information is broadly accurate and provides insights into the delays at various points in the local ecosystem.
Depending on where the data enters the system it can take more or less time to finally make it into Google’s cluster of data in their local index and depending on where it hits in any given cycle along the way it can make it there more or less quickly.
For example, Infogroup might take a two-month cycle to vet a new listing and another month before the data is fed to one of their customers for display in a local directory. Thus, depending on when the data hits their cycle, it could take as long as 180 days (blue) before the data first appears live on the web. Depending on the importance and visibility of the page where that data is shown, it might take anywhere from a day to sixty days for Google (orange) to include the data in their main search engine. From there Google needs to rebuild their local index and include the new citation data in the Google+ Local cluster (green), which occurs every 4 to 6 weeks.
The circle thus represents an educated guess as to the average time to inclusion in the Google+ Local cluster for data that started at any given point.
Historically, as I have noted previously, a listing that went through a list broker, onto a primary list supplier like InfoUSA and then off to Google had a number of time delays before it would hit paydirt in the business cluster in the Google local index. This data could, if it hit every cycle just wrong, take as long as 9 months from beginning to end.
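As a back-of-the-envelope check, the stage-by-stage estimates above can be summed to see how the worst case approaches that 9-month figure. The stage names and day ranges below are my own rough encoding of the estimates in this post, not measured data:

```python
# Rough sketch of cumulative citation-propagation delay, using the
# illustrative ranges from the discussion above (all values in days).
STAGES = [
    ("aggregator vetting + feed to directory customers", 0, 180),  # e.g. Infogroup cycles
    ("Google main-index crawl of the directory page", 1, 60),
    ("Google+ Local cluster rebuild", 28, 42),                     # every 4 to 6 weeks
]

def total_delay(stages):
    """Return (best-case, worst-case) total days across all stages."""
    best = sum(lo for _, lo, _ in stages)
    worst = sum(hi for _, _, hi in stages)
    return best, worst

best, worst = total_delay(STAGES)
print(f"Best case: {best} days; worst case: {worst} days (~{worst / 30:.1f} months)")
```

With these assumed ranges, the worst case works out to 282 days, or a bit over nine months, which is consistent with the historical end-to-end delay described above.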