“Tool Without a Handle”: Tools and the Search for Meaning
In a New York Times review of Edward Tenner’s book The Efficiency Paradox, Gal Beckerman observes that a key point is not simply to watch how much time we spend using technology, but to remember that “the tools we’ve invented to improve our lives are just that, tools, to be picked up and put down. We wield them.”[1]
Which pretty succinctly states the main point of this entire series of blog posts: that human agency matters. Or, perhaps more directly, that while we all know human agency matters, we all too frequently overlook that point. This post (published on Labor Day, 2018) thus asks: what is the real value of human agency? By identifying that value, I hope to set the stage for future posts on the urgency of fostering greater awareness of it.
Tenner’s book points to one argument for the value of human agency: algorithms cannot replace all human qualities, in particular intuition, experience, and skill. Algorithms can exceed human capacity in many areas, including speed and predictability of outcomes. But algorithms are notoriously imperfect. In content moderation, for example, most programs still require human intuition and sensitivity to context (and thus firms rely on large staffs of trained moderators).
Fictional robots and androids, such as Star Trek’s Commander Data,[2] illustrate this well. As an android, Data is capable of calculations and feats that far exceed human abilities, yet he falls short of humans in areas such as processing emotions or understanding humor. This contrast between the power of machines and their limitations is a staple of science fiction, as well as of classic myths and folk tales.
One of my favorite childhood stories (and associated folk songs) was that of John Henry, who defeated the steam drilling machine. The story of John Henry illustrates not only the triumph of man over machine, but also a victory of labor over capital (the money and research needed to create the machine). For stories that tell of the ‘rise of the machines’ with foreboding are not about man versus machine at all, but about conflict within humanity: whether it possesses the collective self-love needed for its own self-preservation, or whether shame, greed, and ignorance will produce division and ruin.[3]
And so, making space for human agency requires that we supplement the goal of efficiency (at which machines often excel) with other goals. I say “supplement” because, as Tenner notes, we do not need to make this an either/or; both intuition and algorithms are needed. The paradox of efficiency is not that efficiency is bad, but that pursuing greater efficiency in every instance and operation is wasteful. Greater efficiency often comes from innovation, and innovation is necessarily an inefficient process.[4]
Hence major technology companies have “innovation labs,” “research centers,” or various other teams working on ambitious technologies such as airborne Internet connectivity or self-driving cars. These centers are by design insulated from the normal profit-and-loss pressures of product development, marketing, and sales, and are often partnered with academic research institutions. These structures illustrate the value – to consumers, investors, and companies – of cultivating intuition, skill, and experience even at the expense of efficiency.
Intuition, skill, and experience are somewhat imprecise terms, so I will attempt a short description, starting with intuition (which I believe is the force from which skill and experience arise). Let us call intuition the force that compels us to a positive certainty when nothing else can. When there is no logical proof, nor even data to allow probabilistic conclusions, one can still arrive at correct conclusions and make sound decisions. The resource humans use to do so is intuition.
Intuition can be illustrated in sports: a batter in baseball, or a kicker in rugby, will at some point make contact with the ball and know, with 100% certainty, whether the ball will travel in its intended direction or reach its intended distance. More prosaically, any one of us who has tossed a paper wad at a trash can has, in a fashion, used intuition to predict the future. The certainty that the wad will land inside the can (or miss) arrives before the wad does. Sensory impressions are not the source of this certainty, and yet it is no less real to the person experiencing it.
In biblical literature, intuition was the voice of God that led many figures (as disparate as an unclean woman and a Roman centurion) to show great faith, notwithstanding the fact that such actions would be considered rash if measured against social codes or contemporary views on the laws of nature.
Intuition has, similarly, kept great pioneers on their path notwithstanding the forces arrayed against them, including skepticism, resistance, and contemporary views about right and wise behavior. Steve Jobs’ vision of an Apple computer, for example, was dismissed by one Stanford-educated scientist with “you’ve got to be joking.”[5]
Some philosophers have gone so far as to create theories of pre-knowledge by which evidence is understood.[6] One need not embrace this view, though, to see that intuition is a necessary precursor to experience.
Experience is activity linked to us in some way. It is the very stuff of life itself. We are, in a sense, fish swimming in experience; it is so much a part of our existence that we struggle to envision its nature. Heidegger’s concept of “thrownness”[7] is an attempt to illustrate, in part, the arbitrary quality of experience. It happens (or did happen) to us seemingly from nowhere, without intent or design (though one can certainly later ascribe intent to it).
Experience, in turn, promotes skill. Experience is not a guarantee of skill: memories can be forgotten or suppressed, for example. But it does strengthen neural pathways and lead to associations and predictions. Skill comes, most directly, from working through problems with a given tool, and not from theoretical or intuitive knowledge of how the tool should work.[8]
Finally, skill is one of the few things that gives us meaning. Human flourishing depends on a sense of meaning. As Viktor Frankl’s Man’s Search for Meaning illustrates, this sense of meaning is essential for life even amidst (in fact, especially amidst) suffering, physical hardship, abuse, and disease.[9]
Here in Seattle, the famous Artis the Spoonman has experienced poverty and illness, but his unique skills with his tools have given him identity and a sense of meaning[10] (as well as inspired a song by Soundgarden).
This, then, is the answer to the question of the value of human agency in thinking about information and communications technology: incorporating human agency affords an opportunity for those using the tools to find meaning and identity in that use.
Just as astronauts wanted to be known as pilots, not simply very brave passengers,[11] all of us, at our best, want to feel that our engagement with information technology makes a difference.[12] Feelings of helplessness, in contrast, lead to anxiety and despair. Optimal policy thinking, then, involves strategies that give people sufficient control over technologies so that they may, in turn, follow their intuition and leverage their experience to build skills, and thereby find meaning and emotional equilibrium.
In the next installment, I’ll discuss why this focus on greater user control does not, in my view, dominate contemporary discussions and why I see signs of a social crisis associated with the absence of meaning.
[1]Gal Beckerman, “What Silicon Valley Could Use More Of: Inefficiency,” The New York Times (June 4, 2018), https://www.nytimes.com/2018/06/04/books/review/edward-tenner-the-efficiency-paradox.html
[2]The odds that anyone reading this blog is unaware of Commander Data or this show are remote. Just in case, though, I'll note that Commander Data is a fictional character on the television program “Star Trek: The Next Generation,” an android programmed to aspire, heuristically, to become more human.
[3]Mary Shelley’s Frankenstein, published in 1818, is perhaps one of the earliest examples of this theme; though its subtitle, “The Modern Prometheus,” shows that the concerns go back further still. Myths such as that of Prometheus (or of Icarus) are classical examples of the challenge of human self-mastery when presented with new and powerful technologies. These two myths also illustrate the question at the core of such stories: are humans so ashamed of our inability to perform as powerfully and efficiently as machines that we will, in effect, commit suicide by ceding control to what we build?
[4]Edward Tenner, The Efficiency Paradox: What Big Data Can’t Do (Alfred A. Knopf, 2018), at pp. 41-42. Online at: https://amzn.to/2wDSyRp
[5]https://www.mercurynews.com/2013/09/27/steve-jobs-old-garage-about-to-become-a-piece-of-history/
[6]See, e.g., Plato, Meno 80d and Phaedo, 66b-d.
[7]“Thrownness” is a translation of the German word Geworfenheit.
[8]Tenner points out that for experience to bear fruit, the effort required needs to be difficult enough to foster retention and muscle memory, similar to physical exercise. Tenner, supra n.4, at p.118.
[9]Viktor Frankl, Man's Search for Meaning (Pocket Books, 1984), pp. 94-95 (Frankl reports finding relief from despair through an emotional connection with an image of himself delivering an enlightening professional lecture on psychology and the concentration camps, i.e., by seeing his suffering through an objective lens and connecting it with purpose and with his own skill as an academic psychologist).
[10]In his own words, “My economic situation [being in debt] is queer as a sea star. I’ve been giving the Spoonman away since 1974. I have 4 grandchildren and 2 great grandchildren. I have the most illustrious Spoonman life one could imagine.” https://busk.co/blog/stories-from-the-pitch/a-long-interview-with-artis-the-spoonman/
[11]See generally Tom Wolfe, The Right Stuff (Farrar, Straus and Giroux, 1979).
[12]See “Tool Without a Handle: Mutual Transparency in Social Media” (affording social media users greater control over their news feed could help address concerns with the impact of social media user activity on mental health). https://cyberlaw.stanford.edu/blog/2017/06/tool-without-handle-mutual-transparency-social-media