Tool Without A Handle: “Book Review: Tools and Weapons”


Since the dawn of time, any tool can be used for good or ill.  Even a broom can be used to sweep the floor or hit someone over the head.  The more powerful the tool, the greater the benefit or damage it can cause.  “While sweeping digital transformation holds great promise, the world has turned information technology into both a powerful tool and a formidable weapon.”  --- from Tools and Weapons[1]

In 2006, US Internet companies’ expansion of online business in China brought to the fore questions regarding the scope of their civic responsibility.  The concept of “corporate social responsibility”[2] – the belief that businesses had responsibilities to the public as well as to shareholders, customers and employees – was not new.  But the tough human rights issues were largely the concern of extractive industries, defense contractors, and firms employing factory labor outside of the U.S.  And an international framework for how technology businesses should address human rights impacts on privacy and free expression would not arrive until 2008.[3]

Technology companies did have several years’ experience with largely domestic issues involving the scope of their responsibility for the impacts of their tools, such as the tradeoff between encryption and law enforcement access, and the constitutionality of various approaches to online pornography.  But when Google, Yahoo!, Cisco and Microsoft faced questions over their responses to Chinese political censorship,[4] questions of responsibility for the implications of technology took a great leap forward.

A New York Times columnist observed at the time that Microsoft had been “cowardly.”[5]  I worked at Microsoft then, on those very issues, and the accusation was unfair.  The better word would be “cautious.”  This was new policy territory for technology companies, and to a great extent new territory for legislators, academics, investors and human rights advocates.

Some critics had reasonable recommendations,[6] such as that Internet firms should attempt to influence change in government policy and insist on adherence to local law (practices Microsoft now undertakes across the globe, as detailed in Tools and Weapons).  Other critics' recommendations were not reasonable, and assumed an outsized ability of firms to influence (or defy) Chinese law and policy.  Recommendations needed to be bold, yes, but rooted in reality.

A widely held belief within Microsoft at the time was that issues involving the impact of products and services needed to be approached with humility.  Elements of this analysis of corporate responsibility included balancing the benefits the tools offered against the costs of offering them in markets like China, incorporating perspectives from a variety of stakeholders, and recognizing the need for governments to lead within their sphere (rather than leave political issues to be sorted out by private firms).  Microsoft, more so than its peers, also held an acute awareness of the problems arrogance and self-satisfaction could create for a technology giant.

As Microsoft testified at the time, it was “actively exploring how best to protect the interests of our users under these circumstances.”[7]  Now, more than a decade later, comes Tools and Weapons from Microsoft’s President Brad Smith and senior communications director Carol Ann Browne.  The book illustrates the fruit of more than a decade’s worth of such exploration, not only with respect to Chinese censorship but also to a host of issues that similarly create difficult trade-offs and pose ethical challenges.

It’s not uncommon, of course, for political candidates to put out a book (also often with professional writing assistance) to tell their life story, share their interior thoughts, outline their policy positions, and illustrate their capacity to lead.  Tools and Weapons is essentially that, but for a corporation.  And not just any corporation:  it’s a product of Microsoft, with a market value of more than $1 trillion, and a deeper and longer perspective on technology issues than any of its peers.

Indeed, this is not the first extended treatment of technology and social issues Brad Smith has written; in my library is a detailed law review article from 2000 entitled “The Third Industrial Revolution: Law and Policy for the Internet,”[8] which is a worthy predecessor to Tools and Weapons.  “The Third Industrial Revolution” reviews a cascade of policy issues for the technology sector, ranging from telecommunications to intellectual property to individual rights, treating each briefly but with insight.

So too, Tools and Weapons covers the waterfront:  surveillance, privacy, cybersecurity, and broadband connectivity, as well as topics that were very much not top of mind in 2000:  geopolitical diplomacy and the ethics of artificial intelligence, to name two.  And it retains the occasional insight, the careful weighing of positions and objections, and Smith’s measured perspective, which is positive but free of techno-utopianism.  Call it ‘upbeat realism.’

For purposes of this blog, focused as we are on the tool metaphor, I highlight three areas in which Smith and Browne describe tools and the issues with which Microsoft wrestles concerning their use:

  • Facial recognition tools
  • Social media tools
  • "Open data" tools

Microsoft’s 2006 testimony on the Internet in China made essentially three points:  1) cogent criticism of particular uses of tools should acknowledge the benefits of such tools as well; 2) multi-stakeholder and consensus-building approaches to establishing operating principles are preferable to draconian legislation; and 3) there is no substitute for the role of governments in setting the rule of law and answering these tough questions on behalf of the people they represent.[9]  The discussion of facial recognition in Tools and Weapons bears these same hallmarks.

That discussion identifies how facial recognition is being used to help people bank more securely and to help physicians diagnose DiGeorge syndrome, a disease primarily afflicting people of color.[10]  And it describes how the prospect of Microsoft facial recognition tools being deployed by the Trump administration’s immigration authorities prompted not only executive consideration of the matter but also employee activism.

This affords the authors a moment to reiterate Microsoft’s interest in the proper role of government.  As Smith notes, “no one elected us.  It seemed not just odd but undemocratic to want tech companies to police the government.”[11]  As a lawyer, and as I’ve noted in this blog, I have deep sympathy for the proposition that government should play its proper role, and I have argued for resisting the temptation to pull policy decisions away from democratic processes for forming consensus on acceptable areas of regulation.[12]

As Tools and Weapons elaborates, though, the work of defining boundaries on the use of facial recognition tools did largely fall to the private sector, including Microsoft, which engaged with employees, experts, civil liberties groups and governments around the globe to gather perspective and identify areas of consensus.  Tools and Weapons sets out some thoughtful responses, including protections for consumer privacy, boundaries for government surveillance, and conspicuous notice when facial recognition is deployed.  These are codified in principles Microsoft published both as legislative proposals and as its own operating guidance.[13]  To date, however, legislation from governments codifying them does not appear forthcoming.

Similarly, the discussion of social media tools and their impact on democracy ably describes both their benefits and their challenges, noting that other technologies, such as cars and telephones, had similarly created externalities from otherwise valuable uses.  It discusses several contemporary challenges, including fragmenting community and the use of social media tools to spread disinformation.[14]

I was a bit disappointed the book did not address the use of social media tools during the “Arab Spring,”[15] which brought to the fore the ability of such tools to help people organize for democratic change, or the US government’s promotion of “21st century statecraft,”[16] which emphasized the use of social media tools to foster U.S. foreign policy goals, including the promotion of democracy and accountability.

I say only "a bit," as it’s understandable that these trends and beliefs were omitted, given their current state of disrepair and the book’s clear focus on contemporary and recent issues.  Nonetheless, drawing the contrast between the optimism then and the anger now in response to disinformation and terrorist propaganda would have been even more revealing of the dual-use nature of social media tools.

Of course, Hillary Clinton, proponent of “21st century statecraft” as Secretary of State in 2010, would have a profound reckoning with social media and disinformation just six years later.  It is these events that Tools and Weapons touches on to illustrate, again, the value of a multi-stakeholder process for generating rough consensus on appropriate responses to concerns, with both the public and private sectors needing to play complementary roles.[17]

The book does illustrate that past practice offers lessons, but instead of drawing anecdotes from the beginning of the Internet age, it reaches back to the beginning of U.S. democracy:  the foreign influence campaign of Edmond Charles Genet.  As the book elaborates, Genet arrived in the United States in April 1793 to, among other things, spread disinformation and tip U.S. policy towards supporting France in its war with Great Britain.

Tools and Weapons observes that Genet’s activities were restrained in part only after a show of unity between Hamilton and Jefferson – who were otherwise bitter political rivals – the type of unified approach among democratic stakeholders that is needed today.  As to whether this type of response would be effective in contemporary society, the ‘upbeat realism’ of Tools and Weapons replies cautiously, with a “probably yes.”

Finally, the chapter on “Open Data” ends on a similar note of conditional optimism, with similar prescriptions for balancing the beneficial and harmful uses of data:  public-private partnerships, generating consensus on principles, and an unabashed belief that, despite the harms, we should move forward with harnessing the valuable uses.  It also introduces, somewhat indirectly, two other important observations.

The first is the importance of scale.  Effective artificial intelligence (including facial recognition), social media and other applications rely on massive data sets and some degree of coordination among a large number of participants, across organizational, legal, social and cultural vectors.[18]  Yes, as Tools and Weapons correctly notes, smaller players in the open data space can take comfort in the fact that data is non-rivalrous and created everywhere, by every human actor.[19]  But the case studies illustrating the power of data aggregation and analysis tools, including cancer research and political campaigns, still happen at massive scale.  Many of the important data sets are created by governments, and the book proposes building organizational scale through data “easements” and similar arrangements.[20]

This, perhaps unintentionally, presents a second observation that emphasizes why this book is important.  There are thousands of technology firms that wrestle with the same issues of facial recognition, disinformation, and open data, as well as the more fundamental issues discussed in the book:  privacy, cybersecurity, and economic opportunity.  Many, if not most, lack the resources and influence to broker international deals, file lawsuits and risk fines (or, perhaps worse, an antagonistic tweet from the U.S. President) in order to stand on principle,[21] meet with hundreds of stakeholders, and obtain audiences with heads of state and other world leaders, including the Pope.[22]  Whether or not you agree with the analysis Microsoft presents here, or the path it is traveling, the company is owed credit for tackling the issues, leading a path through the thicket, and being open with the public about what it has learned and, in its best moments, what it is still wrestling with.

Thirteen years ago, Brad Smith addressed the American Society of International Law, opening with the rhetorical question: “why the heck did you invite someone from Seattle to talk about the next century of international law?”[23]  In partial answer, Smith went on to note that Microsoft, at any given time, is involved in thousands of lawsuits in over 100 countries.  A company of this outsize influence, whose software is (still) part of the IT system of virtually every business, non-profit, educational and government institution, certainly should be giving this much consideration to ethics, and in particular to the implications of the tools it builds, sells and maintains.

That the Holy Father granted an audience to Microsoft to discuss ethics is a testament to the ground Microsoft has covered, with the help of many others, since it began in earnest its exploration of ethics and the impact of technology on society.  It’s a worthwhile example.  Even though, in places, Tools and Weapons exhibits a bit too much corporate communications polish, Microsoft’s leadership is willing to speak out, take a stand, take us ‘inside the cockpit’ and risk criticism, and to acknowledge the lessons it has learned in hindsight.

What Tools and Weapons tells us, furthermore, is that the road ahead is not paved with techno-optimism.  The issues it confronts are often difficult, and some of its solutions still feel incomplete.  For example, the book’s discussions of content regulation and cybersecurity overlook the role of identity management – the fact that “the Internet was built without a way to know who or what you are connecting to.”[24]  And some of the prescriptions feel a bit anodyne, as if coming together to work on a problem is itself a solution rather than a step on the journey.

But overall, the approach Microsoft adopted in 2006 continues to be the right one.  The duality of tools, as this space has noted frequently, is a reality to be managed, not a problem that can be made to disappear.  Approaching it by promoting beneficial uses, synthesizing stakeholder input on limiting principles, and defining roles for government continues to bear fruit, even if perfect solutions remain elusive.  At least, like the authors of Tools and Weapons, it’s an approach I endorse with upbeat realism.

[1]Brad Smith and Carol Ann Browne, Tools and Weapons (Penguin Press, New York, 2019).

[3]This would be the “Protect, Respect and Remedy” framework, also known as the “Ruggie Framework” after John Ruggie, the academic who orchestrated its creation at the request of the U.N. Secretary General’s office.  See https://www.undocs.org/A/HRC/8/5; see generally Mariarosaria Taddeo & Luciano Floridi, New Civic Responsibilities for Online Service Providers (Springer, February 2017), online at: https://bit.ly/2CxXV82

[4]See, e.g., NY Times, Feb 15, 2006, “Online Firms Facing Questions About Censoring Internet Searches in China,”  https://nyti.ms/2q2LTRE. Notably, Rep. Tom Lantos criticized the Internet “giants” as “moral pygmies,” and urged them to defy repressive Chinese law and regulation.  https://www.c-span.org/video/?c4749525/user-clip-lantos

[5]Nicholas Kristof, “China’s Cyberdissidents and the Yahoos at Yahoo,” NY Times, Feb 19, 2006, https://www.nytimes.com/2006/02/19/opinion/chinas-cyberdissidents-and-the-yahoos-at-yahoo.html

[6]See, e.g., Rebecca MacKinnon (for Human Rights Watch), “Race to the Bottom: Corporate Complicity in Chinese Internet Censorship,” https://www.hrw.org/reports/2006/china0806/7.htm#_Toc142395835

[7]Jack Krumholtz, Congressional Testimony (Oral Version): “The Internet in China: A Tool for Freedom or Suppression?” online at: https://news.microsoft.com/2006/02/15/congressional-testimony-oral-version-the-internet-in-china-a-tool-for-freedom-or-suppression/

[8]Smith, Bradford L., “The Third Industrial Revolution: Law and Policy for the Internet,” in: Collected Courses of the Hague Academy of International Law, Volume 282 (Martinus Nijhoff 2000), online at: http://dx.doi.org/10.1163/1875-8096_pplrdc_A9789041114891_03

[9]See Krumholtz Testimony, supra n.7.

[10]Tools and Weapons, p.213.

[11]Id., p.219.

[13]Rich Sauer, “Six Principles to Guide Microsoft’s Facial Recognition Work,” Microsoft on the Issues, (December 17, 2018), online at: https://blogs.microsoft.com/on-the-issues/2018/12/17/six-principles-to-guide-microsofts-facial-recognition-work/

[14]Tools and Weapons, pp.94-95.

[15]For some examples of the literature on how and to what extent social media influenced democratic movements in North Africa in 2010-2012, see Pew Research Center, “Role of Social Media in Arab Uprisings,” online at: https://www.journalism.org/2012/11/28/role-social-media-arab-uprisings/; U.S. Institute of Peace, “New Media and Conflict After the Arab Spring,” online at: https://www.usip.org/sites/default/files/resources/PW80.pdf; Stanford Public Policy Program, “The Impact of Social Media on Social Unrest in the Arab Spring” (2012), online at: https://publicpolicy.stanford.edu/publications/impact-social-media-social-unrest-arab-spring.  This last Stanford report is notable as it, aptly, ends on a cautionary note, warning of the increased use of social media by authoritarian governments to repress opposition movements and stymie democratization.

[16]U.S. Department of State, “21st Century Statecraft,” online at: https://2009-2017.state.gov/statecraft/overview/index.htm (“The emergence of new kinds of information systems catalyzes change in national politics and international relations. During the Arab Spring, digital networks distributed revolutionary mass media produced by thousands of individuals in the streets of Tunis, Cairo, and Tripoli. These networks also enabled rapid movement building with extraordinary impact. But these technologies are not ideological, democratic or progressive by nature. They enable the desires of users and amplify existing social and political forces. The dynamism of networked societies delivers both positive and negative outcomes to which we must respond.”); see also Craig Mundie, “Internet Freedom,” Microsoft on the Issues (Jan 21, 2010), online at: https://blogs.microsoft.com/on-the-issues/2010/01/21/internet-freedom/ (welcoming Secretary Clinton’s remarks on ‘21st Century Statecraft’ and observing that Microsoft “applaud[s] the heightened attention she’s brought to the important issues of free expression and privacy. These issues are at the heart of what we do to help people and organizations use technology to reach their full potential. In particular, we agree with Secretary Clinton that both governments and the private sector have important roles to play…”).

[17]Tools and Weapons, pp.95-99.

[18]Id., p.276.

[19]Id., pp.274-75.

[20]Id., p.283.

[21]Brad Smith and Christopher Eisgruber, “Why We Took Our Fight for DACA Recipients All the Way to the Supreme Court,” TIME, November 10, 2019, online at: https://time.com/5723512/daca-supreme-court-hearing/

[22]Carol Glatz, Catholic News Service, “Pope meets head of Microsoft to discuss ethics in technology, AI,” online at: https://cruxnow.com/vatican/2019/02/14/pope-meets-head-of-microsoft-to-discuss-ethics-in-technology-ai/

[23]“Brad Smith: American Society of International Law Second Century Dinner,” November 3, 2006, online at: https://news.microsoft.com/speeches/brad-smith-american-society-of-international-law-second-century-dinner/ (January 31, 2007) (“it’s quite possible that we have more cases in more courts in more countries than any institution on the planet”).

 
