Blog

Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations from European Academics

The implementation of Art. 17 of the Copyright in the Digital Single Market (C-DSM) Directive is ongoing. In particular, the multi-stakeholder dialogue under Art. 17(10) of the C-DSM Directive is happening as I write. To promote the public interest in the implementation process, a group of European academics (including João Quintais, Stef van Gompel, P. Bernt Hugenholtz, Martin Husovec, Bernd Justin Jütte, Martin Senftleben and myself) has drafted a document with recommendations on the user freedoms and safeguards included in Art. 17 of the C-DSM Directive.

US and UK CLOUD Act Wiretapping in Third Countries: It Is a Real Problem

My blog post on the big interception flaw in the CLOUD Act and US-UK Agreement generated some interesting responses, mostly offline, arguing that it is legal for the US or UK to use providers in their countries to wiretap users in third countries without the consent or knowledge of the third country.

Court Decision Clears Way for State Net Neutrality Laws

On Tuesday, the D.C. Circuit Court of Appeals issued a ruling on the challenge to the FCC’s 2017 net neutrality repeal. The ruling largely upheld the repeal, but remanded it to the FCC for failing to address public safety and for deficiencies related to Lifeline subsidies and access to utility poles by broadband-only providers.

Deepfakes Article in the Washington State Bar Association Magazine

I'm pleased to have written the cover story for the latest issue of NWLawyer, the magazine of the Washington State Bar Association. The article, available here, discusses the impact that so-called "deepfake" videos may have in the context of the courtroom. Are existing authentication standards for admission of evidence sufficient, or should the rules be changed? What ethical challenges will deepfakes pose for attorneys? How will deepfakes affect juries?

Filtering Facebook: Introducing Dolphins in the Net, a New Stanford CIS White Paper - OR - Why Internet Users and EU Policymakers Should Worry about the Advocate General’s Opinion in Glawischnig-Piesczek

White Paper: Dolphins in the Net: Internet Content Filters and the Advocate General’s Glawischnig-Piesczek v. Facebook Ireland Opinion

Tool Without A Handle: A Duty of Candor

The law and the ethics of the legal profession impose on counsel a duty of candor in the practice of law. This includes a duty not to knowingly make false statements of fact and not to offer evidence the lawyer knows to be false. These principles are considered essential to maintaining both substantive fairness for participants in the process and trust in the integrity of the process for those outside of it.

Users of information tools in public contexts are not, of course, subject to the same duties. And publication of false information is generally protected by the First Amendment unless it falls into one of the defined exceptions. I'm doubtful that a general law against publication of false information would be sustained.

It is, however, perfectly acceptable for most information technology platforms to adopt such a policy and seek to enforce it as best they can. That is, platforms could create and enforce rules against publication of information known to be false. A recent publication from the NYU Stern Center for Business and Human Rights contends that platforms should do so. This post concurs: subject to some limitations, private platforms can and should take the position that use of their services to intentionally or carelessly spread false information violates their terms of service.
