Study: Google's "Local OneBox" Skews Search In Company's Own Favor


With the knowledge box and the pull of Google's usual suspects, it's hard enough for anyone else to get anywhere near the top of the results. A new study from former FTC advisor Tim Wu, Michael Luca, and "the Yelp Data Science Team" claims that Google is giving preference to its own services through its knowledge box and OneBox features, thereby degrading the quality of search results. Perhaps that should be taken with a grain of salt, though: Yelp is currently one of the complainants in an EU antitrust case against Google. And that's probably not the only reason they have for drumming on the anti-Google war bongos.

A piece in Wired describes the study in more detail.

Wired wrote:
researchers created two separate versions of local search results. The first version showed results that included Google’s so-called Local OneBox, which shows a select list of businesses at the top of the search page generated from Google+. Google’s argument for OneBox is that it shows searchers exactly what they’re looking for at the top of the page, without them needing to scroll through 10 blue links. That means, among other things, that OneBox results show up above Yelp results, but appear objective to the end user. Google argues that the immediacy of OneBox improves search. But the researchers tested this theory by creating a second version of the same search results—only this time without OneBox. This version surfaced results that were generated using Google’s organic search algorithm. The team then randomly displayed one of the two sets of results to 2,690 subjects.

What they discovered was that users in the test were 45% more likely to click on the organic results. Therefore, they concluded, the OneBox shows "less useful search results" and creates a negative user experience. It's a direct attack on the omniscience and power of Google's universal search. The piece further emphasizes that proving this kind of harm is crucial to building an antitrust case. One such case was pursued in the US, but eventually dropped in 2013, though the WSJ later discovered that the case's internal documents were stacked heavily against Google. Even though the company escaped official action, FTC staff wrote that Google was doing "real harm to consumers and to innovation in the online search and advertising markets."
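To make the arithmetic behind that figure concrete, here's a minimal sketch (in Python) of how a relative click-through lift between the two result pages could be computed. The click counts are invented purely for illustration; only the 2,690-subject total comes from the study, and this is not the researchers' actual code or data.

# Hypothetical two-arm click-through comparison; the counts are made up,
# chosen only so the relative lift works out to 45%.

def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of sessions in which the user clicked a result."""
    return clicks / impressions

# Each of the 2,690 subjects saw exactly one of the two result pages.
onebox_clicks, onebox_shown = 400, 1345      # page with the Local OneBox
organic_clicks, organic_shown = 580, 1345    # page with purely organic results

ctr_onebox = click_through_rate(onebox_clicks, onebox_shown)
ctr_organic = click_through_rate(organic_clicks, organic_shown)

# "45% more likely to click" is a relative lift: (CTR_organic - CTR_onebox) / CTR_onebox
lift = (ctr_organic - ctr_onebox) / ctr_onebox
print(f"OneBox CTR: {ctr_onebox:.1%}, organic CTR: {ctr_organic:.1%}, lift: {lift:.0%}")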

A piece in The Verge addresses the authors' ties to Yelp.

The Verge wrote:
Both Wu and his co-author Luca have been paid for their time by Yelp, a major adversary to Google in the sphere of presenting local information, but Wu is adamant that it's the data that has compelled his participation in the study. "I wouldn’t be doing this if I didn’t think this new evidence was a game-changer," he tells Recode, adding that "this is the closest I’ve seen Google come to the Microsoft case." The case he's referring to is Microsoft's infamous bundling of the Internet Explorer browser with Windows, which artificially promoted IE's use despite the existence of far superior alternatives. Tim Wu now sees Google doing the same thing with search: "it’s presenting a version of the product that’s degraded and intentionally worse for consumers."

Finally, there's a piece on Search Engine Land with some interesting comments worth digging through.

DavieLand wrote:
An odd position for Yelp to take. Yelp uses its massively powerful SEO to show its results above the websites of small businesses, then shows the consumer its own version of reality. If they don't like it, businesses are told they can always buy advertising from Yelp. A classic case of the pot calling the kettle black.

Jason Bauman wrote:

A couple things I found interesting from their data:

1) the "unbiased" reviews they offered gave priority to number of ratings and overall star ratings. While this is arguably good for the user, it would mean that their "FOTUL" plugin is ranking according to structured markup, which Google's Algorithm doesn't do.

2) The doctor example has basically all of its top results dominated by ZocDoc, which looks like a great platform, but it's also pay-to-play and looks like a scheduling service doctors have to opt into using. It would be interesting to see whether the doctors Google listed had ZocDoc profiles, and whether their star ratings were a lot lower.

3) The writing in the report itself is clearly one-sided, reading more like a piece from Yelp directly than a "here's our findings" third party. That makes it harder to pick out useful data.

4) It's interesting that they call out the number of reviews, since one of Yelp's big complaints was that Google scraped their reviews. So with those stripped out, Google's review counts dropped.

5) The report implies that all results are from G+, but several results in their "Google+ control" are actually the auto-generated "Google My Business" map locations, not things created by a dedicated team.

I'm not saying there isn't a problem with the local box. I think it's a discussion we should have, but I think pieces like this report aren't the way to do it. It's easy to call out Google as being evil, but Yelp isn't "David" to their "Goliath" either. I'll have to do a deeper read of the report later, but I saw a couple pretty big flaws in their methodology and language.

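On Bauman's first point: a ranking driven mainly by review count and star rating is easy to picture. The sketch below is only a hypothetical illustration of that kind of ordering, not how Yelp's plugin or Google's local algorithm actually works, and the listings are made up.

# Hypothetical ordering of local listings by number of ratings, with the
# overall star rating as a tie-breaker. Invented data, illustrative only.

from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    avg_rating: float   # overall star rating, 0.0-5.0
    review_count: int   # number of ratings

listings = [
    Listing("Dr. A", 4.8, 12),
    Listing("Dr. B", 4.5, 230),
    Listing("Dr. C", 4.8, 95),
]

# Best first: more reviews wins, higher star rating breaks ties.
ranked = sorted(listings, key=lambda l: (l.review_count, l.avg_rating), reverse=True)
for rank, listing in enumerate(ranked, start=1):
    print(f"{rank}. {listing.name} ({listing.avg_rating} stars, {listing.review_count} reviews)")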

Anyway... I think there are some people who will take a lot of joy in seeing Google kicked in the pants over something like this. But is Yelp really the boot that should be doing the kicking? If you take a look through their data, you'll find, probably to no one's surprise, that many of the "organic" results shown to have higher click-throughs came from services like TripAdvisor and Yelp. So basically what they seem to be saying is "Google is putting its own results before ours! We want our results put forward instead!" Business as usual, eh?