The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
It’s difficult to recall an internal memo gone viral that has sparked as much commentary as James Damore’s statement on gender and engineering at Google. This post is not about that memo, although the volume of commentary on it did prompt the thoughts that follow. Nor is this post about workplace diversity, at least not directly. Instead, like many other “Tool Without a Handle” posts, it is about metaphor.
In particular, I wanted to test whether, in preferring the metaphor of “a tool you use” as distinct from “a place you go,” I’d unduly limited my thinking to an “androcentric” view of networked information technologies. In other words, is “tool” a masculine metaphor, implying a gendered orientation towards my preferred approach to thinking about technology?
I conclude the answer is “no,” in part because metaphor differs from gender, and in part because metaphor is a feature of language, while gender is a feature of persons. Moreover, I identify a general objection to dichotomizing and to gender metaphors.
In today's highly digitized world, copyright infringement actions, among others, are often brought against alleged infringers using information culled from Internet Protocol (IP) addresses obtained from service providers. While fair use defenses may exist against such suits, particularly when one is creating a music mash-up, a preliminary question is whether the initial source evidence is accurate.
Most people I talk to think that Facebook, Twitter, and other social media companies should take down ugly-but-legal user speech. Platforms are generally applauded for taking down racist posts from the White Nationalist demonstrators in Charlottesville, for example. I see plenty of disagreement about exactly what user-generated content should come down -- breastfeeding images? Passages from Lolita? Passages from Mein Kampf? But few really oppose the basic predicate of these removals: that private companies can and should be arbiters of permissible speech on their platforms.*
If your cell phone is on, your location is known, tracked, and recorded, whether you are in your home or in public. As you move around, your location history is created and stored by the carrier, numerous applications on the device, and potentially even the device manufacturer or operating system provider. Your consent to the capture of this information, whether rough or very granular, may be tacit, inherent in an application's usage, or freely given when you activate, install, or operate the device.
Alarm bells are sounding around the Internet about proposed changes to one of the US’s core Intermediary Liability laws, Communications Decency Act Section 230 (CDA 230). CDA 230 broadly immunizes Internet platforms against legal claims based on speech posted by their users. It has been credited as a key protection for both online expression and Internet innovation in the US. CDA 230 immunities have limits, though. Platforms are not protected from intellectual property claims (mostly handled under the DMCA) or federal criminal claims.
As part of its 50th anniversary celebrations, the Australian university where I did graduate work recently interviewed me on a range of cybersecurity topics. At the time of our chat, Australian Prime Minister Turnbull had just proclaimed that "the laws of Australia prevail in Australia, I can assure you of that."
In its Equustek ruling in June, the Canadian Supreme Court held that Google must delete search results for users everywhere in the world, based on Canadian law. Google has now filed suit in the US, asking the court to confirm that the order can’t be enforced here. Here’s my take on that claim.