Tool Without A Handle: “A Mere Gallimaufry”

This blog has spent a good deal of real estate discussing networked information technologies as tools, but has not yet dealt thoroughly with the qualifier in its title: tools “without handles.” The expression is certainly not my original coinage, though I have yet to find a clear attribution for it. The metaphor has been used by other writers in a variety of ways: to describe a character lacking self-restraint,[1] a “sailor without a grievance,”[2] and a “mere gallimaufry of inconsistencies.”[3]

The addition of “without a handle” is intended to indicate that my primary metaphor of a tool in the control of a user – and thus my generally preferred approach to Internet policy and regulation, which favors individual control and accountability for uses of tools – needs to be leavened a bit. The leavening takes into account that personal autonomy presumes a degree of personal control, i.e., a handle for the tool that a person can use to direct its actions to align with intentions. And there is perhaps no more familiar quality of networked information technologies than that they rather often do not work as intended.

Nearly everyone has had experiences with networked information technologies that range from mild frustration to complete loss of data and functionality. The list is painful and familiar: passwords that have expired and can’t be reset because the original password has long been forgotten. Documents that save to unknown file locations. User interfaces that are impossible to intuit, or that offer no simple way to perform common functions (yes, I’m speaking to you, iTunes). Dropped connections, certificate errors, frozen keyboards, the “blue screen of death,”[4] spinning pinwheels – I’ll just stop there.

What this illustrates, and what is important for policymakers, is that we fall short if we expect the user experience to always involve user intentions being carried out seamlessly and directly by the tools. There is merit in various approaches to improving user control, including features such as transparency, notice, additional user controls, data portability, and improved user interfaces. And there will be both market-driven and, potentially, legally driven improvements in these features. So I believe technology will continue to improve the extent to which user experiences align with user intentions.

But it is part of the essence of technology that there will always be some daylight between the user’s intention and the path or consequences that follow from using an information tool to carry it out. Moreover, in this blog I argue that addressing this fully requires more than technical innovation or regulatory controls. More information or features for users won’t close the gap unless users also come to understand that they are not simply users, but also makers.

As any technology maker could tell you, the tools used to make consumer networked information tools often lack handles themselves. That software programs need to be designed and re-designed, built and rebuilt, tested and debugged, and occasionally repaired is the nature of technology itself.[5] For technology and its use are activities that aim at control, definition, and moving that which is toward that which it is not. Speaking anthropomorphically, creation encounters resistance; that which is prefers to be as it is.

And this fact is widely accepted: we expect the creative experience to be challenging, and we understand that the technology is necessarily complex to build. No one has mastered playing Chopin without several years of practice, with dozens of flubbed notes along the way. Few of us mastered riding a bicycle – a far simpler form of technology – without some form of training wheels (and perhaps a mishap or two, hopefully attended to by a loving parent).

Heidegger’s “Question Concerning Technology”[6] refers to poiesis – a bringing-forth, the movement of something from the unknown to the revealed.[7] The technology-making process – coding software, building radio networks, connecting servers, even designing a business model to support the costs of delivering the technology – is one in which makers understand, even celebrate, that there must be some road of trial and error between their intended visions and the actual activity of the technology.

That’s not to say makers don’t have setbacks or frustrations. Robert Pirsig called these “gumption traps.” A “gumption trap” is a problem in working with technology that, rather than teaching what does not work, deters the mechanic from the enthusiasm needed for success. To be trapped is to lose the proper emotional and intellectual relationship to the work. Gumption traps can result from external sources or internal anxiety, and the antidote to them is “gumption,” or what might today be called “resilience.”[8]

The difference between a technology maker and a technology user is that, for the maker, the tool itself is the end objective.[9]  Setbacks naturally need to be overcome in order to fulfill the objective.  For a technology user, the technology is a means to another end (I’m using a laptop, Word software, and the Internet to write this blog, for example).  What I am “making” is the blog, not the software.  I don’t particularly care how the software is doing anything, so long as my words get typed.

So, a setback in the technology isn’t as easily seen as part of the objective. It’s just an interruption and a detour from the intended path.  Gumption traps are thus more easily spawned.  Moreover, like most users, my specialized knowledge isn’t in tool building.  If something does go wrong, I’m likely at a loss as to how to fix it. 

Over time, I’ve learned a few common tricks (rebooting, ctrl-alt-delete) and how to find additional help (support pages, or if that fails, YouTube).  But what I possess is rudimentary knowledge.  Mastery of how to make and use networked information technologies is not necessarily within everyone’s grasp or on everyone’s agenda, nor am I saying it needs to be.  It’s OK for users to just want the stuff to do what it is they want it to do.  Even a diligent person may want to commit her capacity and energies to something other than mastering coding.

What I am saying, though, is that users can and should adopt a different conception of what they are doing when they use networked information technologies. While it may feel like I, as a user, am a puppet at the mercy of the blue screen of death, or the victim of the lost file, or the neglected observer of the spinning wheel, in a very real sense I am also the maker of those experiences – or at least of my emotional reaction to them. One way to get a “handle” on networked information tools is to realize that the meaning of events is something in which human users have a role, perhaps the only role.[10]

One may not want, need, or be able to master troubleshooting a dropped network connection, but everyone is able to master their reaction to it. Yes, that practice is, as anyone who has mastered anything knows, a matter of self-mastery. But there is no more reason to expect networked information technologies to offer 100% wish fulfillment with zero effort than there is reason (or reasonable expectation) to expect it of anything else.

A user can become a maker of experiences in a variety of ways. For all the gallimaufry that is technology, there is an equal hodgepodge of possible responses, none of which requires a user to become a master programmer. One logical response to the absence of reliable handles is simply to avoid a given technology in certain cases. Everyone who has hunted down a landline phone for an important job interview has anticipated that the technology fails to do as intended with sufficient regularity to warrant avoiding it.

Another, more practical, way is to understand the imperfections of the tool and plan accordingly. Everyone who has (wisely) backed up files to a thumb drive or cloud storage has understood the nature of the tools with which they work. Everyone who uses a firewall does so prudently, and everyone who uses a password manager (rather than reusing the same password repeatedly) does so extra prudently. Oftentimes, frustration with technology’s failures is a bit of projecting our own failures onto the tool.

Sometimes the best response to a technology failure is to do nothing. Tara Brach relates an anecdote about how the pilot Chuck Yeager inadvertently hit on a solution to a technology failure when, on a test flight, he was knocked unconscious. Other, less lucky, pilots in similar spins had frantically, and ultimately unsuccessfully, worked the controls attempting to right a tumbling airplane. Because Yeager had done nothing, the plane dropped back into denser atmosphere, at which point Yeager awoke and the controls became responsive again.[11] What may seem like gumption is often actually grace.

Pascal wrote that all human misfortune comes from one thing: not knowing how to remain quietly in one room.[12] So too, we can become makers of our experiences with information tools when we remain quietly with them, being with the tools and acknowledging their inadequacies. That’s the handle afforded to us. If my Internet connection drops, perhaps I can make progress in writing by finding a book. Good thing my Kindle battery is fully charged…

[1]Madison Smartt Bell, Doctor Sleep, (Grove Press, 2003), p. 228, online at: http://bit.ly/2DqDMlW

[2]William Clark Russell, Abandoned, (Methuen, 1904), p. 142, online at: http://bit.ly/2FEACNz

[3]John Collier, The Works of Tim Bobbin, Esq., (John Heywood, Simpkin, 1862), p. 217, online at: http://bit.ly/2W8kN6L (“Gallimaufry” is a word of French origin meaning a varied collection, as in a stew.)

[5]And of course, the initial invention of a technology is a process of trial and error. Thomas Edison’s lightbulb is a popular example: every failure was, for Edison, simply part of the learning process, and bringing forth a successful and useful lightbulb involved thousands of experiments, each of which yielded information as to what would not work. See https://www.fi.edu/history-resources/edisons-lightbulb

[6]Martin Heidegger, “The Question Concerning Technology,” lecture (1954), translated by William Lovitt (Harper, 1977), online at: https://monoskop.org/images/4/44/Heidegger_Martin_The_Question_Concernin...

[7]See Mark Blitz, “Understanding Heidegger on Technology,” The New Atlantis (Winter 2014), online at: https://www.thenewatlantis.com/publications/understanding-heidegger-on-technology

[8]See, e.g., Sheryl Sandberg and Adam Grant, Option B: Facing Adversity, Building Resilience, and Finding Joy, (Knopf, 2017); https://optionb.org/build-resilience

[9]That’s not to say that technology involves only makers and that users fall outside of it. As Heidegger put it, “The manufacture and utilization of equipment, tools, and machines, the manufactured and used things themselves, and the needs and ends that they serve, all belong to what technology is.” Martin Heidegger, The Question Concerning Technology and Other Essays; see Don Ihde, “Heidegger’s Philosophy of Technology,” in Technics and Praxis, Boston Studies in the Philosophy of Science, vol. 24 (Springer, Dordrecht, 1979).

[10]See “Tool Without A Handle: Tools for Meaning, Part 2,” online at: https://cyberlaw.stanford.edu/blog/2018/11/tool-without-handle-tools-meaning-part-2. This is certainly true of technology. As one expert put it, “technology is already the outcome of a technological way of looking and relating ourselves to the world.” See Lucas Introna, “Phenomenological Approaches to Ethics and Information Technology,” The Stanford Encyclopedia of Philosophy (Fall 2017 Edition), Edward N. Zalta (ed.), online at: https://plato.stanford.edu/archives/fall2017/entries/ethics-it-phenomenology

[12]Blaise Pascal, Pensées, translated by Gertrude Rawlings (Peter Pauper Press, 1900), p. 65, online at: https://archive.org/stream/pascalspenseesor00pasc#page/64/mode/2up
