“Neutrality”—if it’s good enough for the core of the Internet, isn’t it good enough for the edge? The biggest Internet providers say it is, and they would love to have the government slap a few neutrality rules on Google, just to see how the advertising giant likes the taste of the regulatory bridle.
In 2010, while the FCC was debating net neutrality rules, ISPs like Time Warner Cable settled on a “they’re gatekeepers, too!” strategy.
“Google has led the charge to adopt regulation to ensure Internet openness, yet it has the ability and incentive to engage in a range of decidedly non-neutral conduct due to its control over so many aspects of the Internet experience,” said one representative filing. “Google’s core search application relies on a pay-for-priority scheme that is squarely at odds with its proposed neutrality requirements for broadband Internet access service providers.”
Comcast agreed, telling the FCC, “If the Commission were to conclude that an interventionist regulatory regime is needed to preserve the ‘neutrality’ of the Internet, it could not defensibly apply that regime to broadband providers but not to Google.”
And AT&T blasted Saint Google for its sinful practices: “They ‘determine the information… that customers access online’ through algorithms that highlight some information, favor certain websites, and even omit some sites altogether.”
The answer: search neutrality. Somehow.
It’s hard to tell if this was ever a serious proposal, since it was most often deployed by ISPs as a sort of reductio ad absurdum against network neutrality proposals. (“See, if you go down this path, you’ll have to regulate everything!”)
But outside the den of self-interest that is an FCC docket, academics were also pondering the question. In 2009, for instance, well-respected University of Minnesota scholar Andrew Odlyzko suggested that net neutrality (which he favored) might then “open the way for other players, such as Google, that emerge from that open and competitive arena as big winners, to become choke points. So it would be wise to prepare to monitor what happens, and be ready to intervene by imposing neutrality rules on them when necessary.”
But what does it even mean when we talk about applying “neutrality” to search—which is all about subjective rankings of relevance?
“Telling a search engine to be more relevant is like telling a boxer to punch harder”
James Grimmelmann, an associate professor at New York Law School, ran through eight main principles that underlie various “search neutrality” arguments. He found every one of them “incoherent.”
Grimmelmann’s resulting paper, “Some Skepticism About Search Neutrality” (PDF), has just appeared as a book chapter in The Next Digital Decade, and it’s an intriguing look at the foundations of search neutrality. At bottom, the paper understands search as an inherently subjective enterprise that makes a mockery of attempts to regulate it into some sort of neutral form. Indeed, trying to do so is almost a category mistake.
Here are the eight possible bases for search neutrality regulation:
- Equality: Search engines shouldn’t differentiate at all among websites.
- Objectivity: There are correct search results and incorrect ones, so search engines should return only the correct ones.
- Bias: Search engines should not distort the information landscape.
- Traffic: Websites that depend on a flow of visitors shouldn’t be cut off by search engines.
- Relevance: Search engines should maximize users’ satisfaction with search results.
- Self-interest: Search engines shouldn’t trade on their own account.
- Transparency: Search engines should disclose the algorithms they use to rank webpages.
- Manipulation: Search engines should rank sites only according to general rules, rather than promoting and demoting sites on an individual basis.
Most of these are dealt with by the simple (and obvious) objection that “systematically favoring certain types of content over others isn’t a defect for a search engine—it’s the point. If I search for ‘Machu Picchu pictures,’ I want to see llamas in a ruined city on a cloud-forest mountaintop, not horny housewives who whiten your teeth while you wait for them to refinance your mortgage. Search inevitably requires some form of editorial control.”
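Grimmelmann’s point is easy to make concrete: any ranking algorithm has to weight some signals over others, and choosing those weights is itself an editorial judgment. The sketch below is purely illustrative; the signals, weights, and pages are all invented, and nothing here reflects how Google actually ranks results.

```python
# Hypothetical toy ranker: every coefficient is a subjective, editorial choice.
def score(page, query_terms):
    text_match = sum(page["text"].lower().count(t) for t in query_terms)
    return (
        2.0 * text_match                      # reward query-term matches...
        + 1.5 * page["inbound_links"] ** 0.5  # ...and well-linked pages,
        - 5.0 * page["spam_signals"]          # while demoting likely spam.
    )

pages = [
    {"text": "Machu Picchu pictures: llamas on a cloud-forest mountaintop",
     "inbound_links": 900, "spam_signals": 0},
    {"text": "machu picchu pictures cheap mortgage refinance teeth whitening",
     "inbound_links": 4, "spam_signals": 3},
]

query = ["machu", "picchu", "pictures"]
for p in sorted(pages, key=lambda p: score(p, query), reverse=True):
    print(round(score(p, query), 1), p["text"])
```

Tweak any of those coefficients and the “winner” changes; there is no weight-free baseline to call neutral.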
As for transparency, which usually involves revealing the algorithmic underpinnings of a search engine, Grimmelmann argues that it’s simply a recipe for competitors to copy and for website operators to game.
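The gaming worry is mechanical: once the exact formula is public, site operators can optimize pages against it directly rather than for users. A deliberately simple, invented example (no real search engine scores this crudely):

```python
# If the disclosed rule is "rank by query-term frequency," keyword stuffing
# beats honest content mechanically. Purely illustrative; invented rule.
def disclosed_score(text, query_terms):
    words = text.lower().split()
    return sum(words.count(t) for t in query_terms)

honest = "Photos of Machu Picchu with llamas grazing among Inca ruins"
stuffed = " ".join(["machu picchu pictures"] * 50)  # trivial stuffing

q = ["machu", "picchu", "pictures"]
print(disclosed_score(honest, q))   # 2
print(disclosed_score(stuffed, q))  # 150
```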
All eight principles are weighed in the balance and found wanting. (The piece is quite a good read; do check it out in full if the issue is of interest.)
All of this is fine as far as it goes, and the arguments make a great deal of sense, but they open Grimmelmann up to some obvious charges of supporting a behemoth like Google—and if you can’t imagine any way in which the company could “be evil,” you’re severely lacking in imagination.
To deal with real problems that can be caused by search engines, Grimmelmann remains open to traditional methods of oversight. “It doesn’t follow that search engines deserve a free pass under antitrust, intellectual property, privacy, or other well-established bodies of law,” he notes. “Nor is search-specific legal oversight out of the question.” Should Google start extorting websites into a “pay for placement” scheme that is not disclosed to end users, the government should step in.
But when it comes to the question of applying “neutrality” principles to an inherently subjective enterprise, Grimmelmann has nothing but skepticism.