Tool Without A Handle: Taking Space Seriously

As I’ve been writing about networked information technologies as “tools,” it’s worth reiterating that metaphors of space are not without value, including in areas of the law that derive from the law of real property.  Having noted the weaknesses of spatial metaphors in multiple prior posts, here I discuss some of their productive applications.

I also draw a finer distinction:  between thinking about the Internet in its entirety as a “place you go” and thinking of elements of the Internet as places.  That is, while “cyberspace” as a term for the entirety of the Internet is more valuable for fiction and poetry than for policy, there is some utility in thinking about platforms and services as discrete spaces.  In particular, it is useful to think of them as spaces equipped with access rights and access management tools:  digital gates, fences, doors, locks, and passcodes.

Two applications of the physical space metaphor to online platforms and services are particularly interesting:

1) Content moderation questions - including vetting of advertisers and user-submitted content;

2) Data rights questions - if content is posted, is it free to copy?  To commercialize?

Content Moderation Questions

I was privy to a discussion at Google in the spring of 2004 regarding the acceptance of ads from online pharmacies.  The discussion arose from public concerns that Google may have been accepting ads from pharmacies that were not properly licensed to sell into the U.S. market, or that were not trustworthy for anyone to use at all.  As a result, Google adopted a third-party verification process.[1]

This is a form of access control:  a test that must be passed to enter a space (here, entry meaning placing ads via the Google publisher network).  It is, in effect, akin to a CAPTCHA - though because the qualifications for entry are more complex than “I’m not a robot,” the process is necessarily more involved.  In both cases, a test must be completed before passage is allowed.
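The “test before entry” structure reduces to a simple predicate.  As a minimal sketch - the class and function names are mine, and the criteria merely paraphrase the verification requirements summarized in footnote [1], not any actual ad network implementation:

```python
# Hypothetical sketch of a "test before entry" gate.  The names are
# illustrative; the criteria paraphrase the third-party verification
# requirements described in footnote [1].

from dataclasses import dataclass

@dataclass
class PharmacyApplicant:
    pharmacy_licensed: bool      # the pharmacy holds a proper license
    pharmacist_licensed: bool    # so does its pharmacist
    owns_ad_website: bool        # the advertised site belongs to the pharmacy
    requires_prescription: bool  # dispenses only against a valid prescription

def may_enter(applicant: PharmacyApplicant) -> bool:
    """The gate: every qualification must hold before ads are accepted."""
    return all([
        applicant.pharmacy_licensed,
        applicant.pharmacist_licensed,
        applicant.owns_ad_website,
        applicant.requires_prescription,
    ])

print(may_enter(PharmacyApplicant(True, True, True, True)))   # True
print(may_enter(PharmacyApplicant(True, False, True, True)))  # False
```

The point of the sketch is the shape, not the content:  like a CAPTCHA, the gate is a boolean checkpoint, just with richer qualifications behind it.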

Indeed, the process of ensuring that unqualified pharmaceutical vendors did not end up accessible to Google users turned out to be “an ongoing, escalating cat-and-mouse game,” according to a Google litigation counsel.[2]  As a result, Google would ultimately settle a substantial regulatory case about this practice,[3] notwithstanding its efforts and its policies.[4]  This aspect of access control is relevant to myriad other areas of information technology policy, in particular cybersecurity.  The battle against unauthorized access to data, unauthorized or misleading use of platforms by technical “bots,” and other misuses of services is often a “whack-a-mole” exercise.[5]

In this context, there is a natural harmony between the metaphor of tools (gates, tests, locks) and metaphors of spaces; after all, gates are hung on fences around areas of land, tests are administered at checkpoints, and locks and doors protect interior spaces.  When spatial metaphors are used for discrete components, rather than the entirety of the Internet, some useful mental images and reference points emerge.

A similar utility arises from the use of spatial metaphors for Internet access services, often referred to as the “on ramps” to the Internet or with various other road, highway, or passage-related metaphors.  And, germane to “net neutrality” issues, tool metaphors for features associated with such “on ramps” also abound:  proponents of more net neutrality regulation are concerned with “gatekeepers,” “slow lanes,” and “toll booths,” whereas proponents of less regulation warn of “traffic congestion” or discuss the utility of “meters.”

A pattern emerges, then:  in areas relating to content moderation and access control, a spatial metaphor for a platform or service can serve as a foundation for various tool metaphors.  Ad networks are “spaces” to be guarded by sentinels who check for proper credentials (or, if you will, enormous bouncers checking IDs).  Databases are “spaces” to be protected by (fire)walls, locks and keys, and perimeter defenses.  Internet access services are “roads” that should (or should not) have “toll lanes” applied.

In an older, weirder, description, websites are “desert islands” occupied by a person with a megaphone.[6]

Data Rights Questions

A similar dynamic of thinking of platforms and services as “spaces” equipped with various tools plays out in Internet policy questions regarding usage rights to published data.  A recent decision from a federal district court in California dealt with whether, and to what extent, the implementation of such tools suffices to limit uses of otherwise publicly accessible data.

A company called “hiQ” ran a data analytics business based on data scraped from public LinkedIn profiles.  LinkedIn eventually objected to hiQ’s practices and demanded that the scraping stop, citing the Computer Fraud and Abuse Act (“CFAA”).  In response, hiQ sought injunctive relief, claiming it was entitled to such access under state unfair competition law.

The hiQ v. LinkedIn opinion[7] suggests that the CFAA, which protects against unauthorized access to certain computers, can’t be interposed to restrict such access where the site’s policies and practices don’t clearly post sufficient “no trespassing” signs.

This is an important precedent for other types of commercial uses, as well as for researchers who collect and analyze publicly available data.  There are a variety of services - in the customer relations management (“CRM”) space, for example - that “scrape” data from the Internet and then re-monetize it as a directory, as data analytics for sale, or as additional CRM features.

I think the hiQ opinion is close to being wrong, but not quite.  The court’s analysis questions whether the CFAA can apply where the barrier to entry is legal (terms of service) rather than technical (a password).  I think that’s right in the sense that the CFAA was not drafted with that use case in mind, and the court cites Orin Kerr’s article on trespass[8] extensively in support of that reasoning (a point I return to below).  But the analysis also rests on the facts:  LinkedIn was inconsistent in signaling allowed uses, and it allowed similar uses by other parties.

If a party had originally and consistently published terms of service, used technical countermeasures to deter data scraping, and then (for purposes and with effects other than restraining competition) sent cease-and-desist letters to violators who nevertheless persisted in scraping, it seems odd to say the scraping isn’t “unauthorized.”  Or if an otherwise public site disables right-click copy-and-paste and a user somehow finds a way around that to copy and re-publish, that sure feels “unauthorized.”

The hiQ opinion even goes on to say that “a user does not access a computer ‘without authorization’ by using bots, even in the face of technical countermeasures, when the data it accesses is otherwise open to the public.”  That seems to miss the point:  the data isn’t simply “public,” it’s “public to anyone not a bot.”  The technical countermeasure is akin to a “no dogs allowed” sign in the window of a public store, and store owners should retain the right to refuse service if a customer rides in on the back of a dog.
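The everyday version of a “no bots” sign is the robots.txt convention:  a plain-text notice, voluntarily honored by well-behaved crawlers, telling automated visitors which parts of a site they may not enter.  A minimal sketch using Python’s standard-library parser - the robots.txt content, paths, and bot name here are invented for illustration:

```python
# Reading a site's "no bots allowed" sign with the standard library's
# robots.txt parser.  The rules, URLs, and user-agent are illustrative.

from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /profiles/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A scraper honoring the sign would skip the disallowed paths:
print(parser.can_fetch("mybot", "https://example.com/profiles/jane"))  # False
print(parser.can_fetch("mybot", "https://example.com/about"))          # True
```

Nothing technically stops a bot from ignoring the file, which is precisely the legal question:  whether walking past the posted sign makes the access “unauthorized.”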

As noted above, the law of trespass is key to applying the CFAA, and that law is of course rooted in concepts of physical space.  Even the concept of “access” itself contemplates a physical space into which one is (or is not) authorized to enter.  In his article, Professor Kerr goes on to argue that the best way to distinguish permitted from unpermitted entry is through a principle of authentication:  a process of verifying that the user is indeed the person who has access rights to the information accessed.  Identity is authenticated first, and authorization rights are then extended (to whatever degree) to that identified person.
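That two-step sequence - authenticate first, then extend authorization to the authenticated identity - can be sketched as two distinct checks.  The users, passwords, and permissions below are invented for illustration, and real systems would store salted password hashes rather than plain strings:

```python
# A minimal sketch of the authenticate-then-authorize sequence.
# Credentials and permissions are invented for illustration; plain-text
# password storage is used only for brevity.

import hmac

CREDENTIALS = {"alice": "correct horse battery staple"}   # who can prove identity
PERMISSIONS = {"alice": {"read_profiles"}}                # what that identity may do

def authenticate(user: str, password: str) -> bool:
    """Step 1: verify the visitor is who they claim to be."""
    expected = CREDENTIALS.get(user)
    return expected is not None and hmac.compare_digest(expected, password)

def authorize(user: str, action: str) -> bool:
    """Step 2: extend access rights only to an authenticated identity."""
    return action in PERMISSIONS.get(user, set())

def access(user: str, password: str, action: str) -> bool:
    # Authorization is never reached unless authentication succeeds.
    return authenticate(user, password) and authorize(user, action)

print(access("alice", "correct horse battery staple", "read_profiles"))  # True
print(access("alice", "wrong password", "read_profiles"))                # False
print(access("alice", "correct horse battery staple", "edit_profiles"))  # False
```

The design point mirrors Kerr’s principle:  rights attach to an identified person, not to whoever happens to reach the door.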

In other words, a tool is used to manage rights to the service (conceptualized as a “space”).  This follows the same construction noted above:  a service or platform “space” is enhanced or managed by tools.  In both content moderation and data access and use cases, thinking of the service or platform as a “space you go” helps create coherent and predictable mental models for analyzing Internet policy issues, and affords the additional reference point of property rights and trespass doctrines to illuminate modern statutory law.

[1] See, e.g., House Permanent Subcommittee on Investigations, “Buyer Beware: The Danger of Purchasing Pharmaceuticals over the Internet,” Testimony of Sheryl Sandberg, Google.  In order for an online pharmacy to advertise with Google, it must establish, to the satisfaction of a trusted third-party verification service, that both the pharmacy and its pharmacist are properly licensed; that the Internet website associated with the ad is owned by the licensed pharmacy; that it will not dispense prescription drugs without receiving and verifying a lawful and valid prescription from a personal practitioner; and that it will perform age verification for all prescriptions, among other requirements.

[2] Claire Cain Miller, “Google Reaches $500 Million Settlement With Government,” New York Times, August 24, 2011.

[3] “Google Forfeits $500 Million Generated by Online Ads & Prescription Drug Sales by Canadian Online Pharmacies,” DOJ Press Release, August 24, 2011.

[4] For current Google policies on AdWords use for healthcare and medicines, see

[5] See, e.g., “Playing Whack-a-Mole: Results of the 2017 SANS Threat Landscape Survey.”

[6] “Why the Internet is Like Joey Bishop” (“What is a world made up of millions of desert islands, each with but a single inhabitant, a self-centered maniac who holds a megaphone and shouts ‘I like green tapioca pudding!’ all day long?”).

[7] hiQ Labs, Inc. v. LinkedIn Corporation, Case No. 17-CV-03301-EMC, United States District Court, N.D. California, San Francisco Division (September 18, 2017) (opinion by Judge Edward Chen).

[8] Orin Kerr, “Norms of Computer Trespass,” 116 Colum. L. Rev. 1143 (2016).
