The Center for Internet and Society at Stanford Law School is a leader in the study of the law and policy around the Internet and other emerging technologies.
Blockchain technology is taking the world by storm. From banking to health care, many tout blockchain and the Bitcoin it enables as a cure-all. Others think Bitcoin is headed over the edge. In between are those who see practical applications for blockchain but caution against overreliance on Bitcoin. On February 26th at the University of Copenhagen, I gave a presentation entitled "Blockchain technology -- good, bad, or somewhere in between?" This entry gives you a sneak preview of that talk.
Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)
The Fourth Circuit has issued its decision in BMG v. Cox. In case you haven’t been following the ins and outs of the suit, BMG sued Cox in 2014 alleging that the broadband provider was secondarily liable for its subscribers’ infringing file-sharing activity. In 2015, the trial court held that Cox was ineligible as a matter of law for the safe harbor in section 512(a) of the DMCA because it had failed to reasonably implement a policy for terminating the accounts of repeat infringers, as required by section 512(i). In 2016, a jury returned a $25M verdict for BMG, finding Cox liable for willful contributory infringement but not for vicarious infringement. Following the trial, Cox appealed both the safe harbor eligibility determination and the court’s jury instructions concerning the elements of contributory infringement. In a mixed result for Cox, the Fourth Circuit last week affirmed the court’s holding that Cox was ineligible for safe harbor, but remanded the case for retrial because the judge’s instructions to the jury understated the intent requirement for contributory infringement in a way that could have affected the jury’s verdict.
This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for intermediary liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources and is Creative Commons licensed.
This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.
Popularity doesn't equal truth. And yet Facebook's recent proposal to rank the trustworthiness of news sources based on popularity loosely equates truth with popularity. In so doing, Facebook may be putting form over function.
The topic of how well the tool of black letter law works in the Internet law setting is of course huge, and comes with obvious definitional challenges. To point to but one: how ought we define “black letter law” in our present legal culture, where legal rules necessarily must take account of the technical reality in which they operate? Indeed, given Wikipedia’s definition of “black letter laws” as “the well-established technical legal rules that are no longer subject to reasonable dispute,” one may legitimately question whether we can speak of any real black letter law within our field of enquiry. Fortunately, however, the panel was asked to approach only the more concrete topic identified in the description above.
Without a doubt, human rights law provides an important framework for the discussion of cross-border speech regulation. The International Covenant on Civil and Political Rights (ICCPR) in Article 19 clearly affirms the right to express opinions and ideas “regardless of frontiers,” and the Internet is a particularly relevant tool and platform for the exercise of this right, in both its individual and social dimensions. The panelists broadly agreed on the need to include a human rights perspective in content removal discussions, whether judicial, regulatory, or legislative.