
Chuck Cosson is Director, Legal Affairs, Privacy & Security, at T-Mobile US, based in Bellevue, WA. At T-Mobile, Chuck oversees privacy compliance programs and provides legal guidance on mobile Internet, location services, incident response, and other privacy, security, and business issues. Chuck spent 7 years at Microsoft leading that company’s public policy work on human rights, free expression, and child online safety. He has also worked in Washington, D.C. on telecommunications policy and regulation. His engagement with Stanford focuses on the role of metaphor as a guide for contemporary technology law and policy - a conception of the Internet not as a “place you go” but as a “tool you use.”
“Tool Without A Handle: Spirituality, Virtue, and Technology Ethics - Part 2”
By Chuck Cosson on January 17, 2021 at 3:31 pm
This post visits some additional concepts of virtue found in Christian teaching, supplementing concepts drawn from other traditions such as Aristotle (the natural law tradition), Buddhism, and Confucianism, namely:
Consumer preferences are not always the same as consumer interests;
Winning is not the most important thing;
Solitude matters as much as engagement.
If the only values applied to Internet services are to “give people what they want,” “win followers and ads at all costs,” and “maximize reach and engagement,” we will be vastly underequipped to deal with the problems those services – and the people who use them – create, both now and in the future. And we will fail to respond to our present moment, one characterized by trauma, wounding, and loss that should indeed motivate us to pursue new thinking and new approaches.
“Tool Without A Handle: Spirituality, Virtue, and Technology Ethics”
By Chuck Cosson on October 24, 2020 at 3:42 pm
A review of Shannon Vallor’s excellent book Technology and the Virtues, which details Aristotelian, Confucian, and Buddhist perspectives on virtue, suggests the inquiry would benefit from engagement with Christian Neo-Platonic and derivative perspectives. I agree, though here I extend the engagement to a more general set of Christian perspectives on virtue.
For this purpose, a Christianity emphasizing humility is preferable to one emphasizing difference and retribution. The goal is to be a candle, not a torch. This Christianity is well aware that humans are often guided more by mental shortcuts than by objective analysis and rational choice. The “ego is the enemy,” as one author put it. Which is to say, importantly, that the person is not the enemy; the person is not the problem.
Within each person, of any status, race, sexual or gender identity, age, or religious practice, is the divine and the good. I think it’s a mistake to place blame on what technology is “doing to us.” In the “software” of our DNA is a superior human capacity, one that can hear divine goodness. Rather than ignore it and treat humans as inexorably enslaved to our prejudices, a principle of virtue should aim not only at changes in technology design but also at defining a social consensus of personal accountability for emotional growth.
Tool Without A Handle: Tools, Trends, Technology
By Chuck Cosson on May 25, 2020 at 3:39 pm
For further insights on managing misinformation, we should look to the ways in which humans form identity through imitation, purge enmity through scapegoating, and often lack the ability to internally generate a clear sense of preferences or to make choices that align with them.
One of the mechanisms worth analyzing is the human tendency to assign trajectories to immediate observations and, similarly, to be drawn to "trend stories" that wager on predictions. This tendency contributes to misinformation problems because it assigns undue weight both to the ability of the predictor and to the probability that the prediction will come to pass.
I prefer to think, though, that rightness demands we protect the right of humans to so choose, even if it means they reject truth for fantasy. And even if free choice is inhabited by a bit of illusion, one created by subconscious beliefs that control our thinking, and thus our actions, without our immediate awareness.
Generating shared perspectives is an important component of this response. Misinformation flourishes in environments where shared perspectives are weak. Art can help illustrate, in ways that argument and evidence cannot, shared qualities of experience and perspective.
Tool Without A Handle: “Book Review: Tools and Weapons”
By Chuck Cosson on November 17, 2019 at 2:44 pm
“Since the dawn of time, any tool can be used for good or ill. Even a broom can be used to sweep the floor or hit someone over the head. The more powerful the tool, the greater the benefit or damage it can cause. While sweeping digital transformation holds great promise, the world has turned information technology into both a powerful tool and a formidable weapon” --- from Tools and Weapons
Tool Without A Handle: A Duty of Candor
By Chuck Cosson on September 3, 2019 at 11:00 pm
The law and legal professional ethics require of counsel a duty of candor in the practice of law. This includes duties not to knowingly make false statements of fact and not to offer evidence the lawyer knows to be false. These principles are considered essential to maintaining both substantive fairness for participants in the process, and trust in the integrity of the process for those outside of it.
Users of information tools in public contexts are not, of course, subject to the same duties. And publication of false information is generally protected by the First Amendment, unless it falls into one of the defined exceptions. I’m doubtful a law against publication of false information would be sustained.
It is, however, perfectly acceptable for most information technology platforms to adopt such a policy and seek to enforce it as best they can. That is, platforms could create and enforce rules against publication of information known to be false. A recent publication from the NYU Stern Center for Business and Human Rights contends platforms should do so. This post concurs: subject to some limitations, private platforms can and should take a position that use of their services to intentionally or carelessly spread false information violates terms of service.