The Glass Cage is about automation’s human consequences. It is not your typical book about robots taking our jobs, for better or for worse.
Carr nonetheless gives an intriguing account of the history of automation and robotics – from the Luddites to Google’s self-driving cars. What we have known intuitively is backed up by research: we cannot all fund robotics startups, and the number of new jobs created through automation has always been low. In spite of success stories of people ‘making money online’, it is the providers of infrastructure (the ones Jaron Lanier calls Siren Servers) who actually make money. Technology changes faster than humans do, riding on Moore’s law – but Carr is not a believer in technology that will automagically serve all humankind:
It strains credulity to imagine today’s technology moguls, with their libertarian leanings and impatience with government, agreeing to the kind of vast wealth-redistribution scheme that would be necessary to fund the self-actualizing leisure-time pursuits of the jobless multitudes.
He wonders why Google has mastered building a self-driving car – a task once considered too difficult for any computer to automate – yet has not developed software that stops people from texting while driving. Perhaps because stopping distractions would run counter to its business agenda? More disturbing than the effect on employment is the way automation may impact our skills, illustrated by the history of avionics. We have come a long way since …
… the deep entanglement between human and mechanism was an elemental source of flying’s thrill,
… and pilots felt physical feedback from the machine. The book starts with a personal anecdote about Carr missing the sense of control and involvement when driving an automatic. The Glass Cage is a poetic metaphor for the pilot’s cockpit. Carr returns to a topic he had dwelt upon in The Shallows: the role of maps and clocks as an essential layer put between us and space, or the flow of time. In glass-cage-like workplaces, former machine operators or soldiers turn into technicians reading and manipulating representations of the world. Automation and tools done right would still give us the feeling of being in control: electronic airplane controls should resemble the older mechanical ones. Clunky yokes that provide sensory information let the pilot feel physical resistance – and are superior to sci-fi-style joysticks. Carr distinguishes between tools that work like mechanical extensions of our body – using the scythe as a prime example – and software-based technology that is experienced as a kind of implacable, alien force that lies beyond our control and influence. Quoting from a 1910 book on aeronautics, designing a plane to be operated is
… a trade-off between stability and maneuverability. The greater a plane’s stability, the harder it becomes for the pilot to exert control over it.
Pioneers like the Wright Brothers opted for a plane as unstable as a bicycle, giving the pilot the utmost freedom. Carr tries to do technology optimists justice – he is never sarcastic or derisive. He traces the hopes placed in ‘software’ back to the philosopher Alfred North Whitehead:
“Civilization advances by extending the number of important operations which we can perform without thinking about them.” Whitehead wasn’t writing about machinery. He was writing about the use of mathematical symbols to represent ideas or logical processes— an early example of how intellectual work can be encapsulated in code. But he intended his observation to be taken generally.
‘Automation’ can thus be understood in a very broad sense. I have written about Newton’s geometrical proofs, which even Richard Feynman found very hard to reproduce; now we have been spoilt by the elegant code-like symbols of calculus. Do we really miss out if we haven’t acquired such ancient skills? Carr believes so, as we are human beings made to interact with the world directly, not via a cascade of devices and abstractions. A physics professor who embarked on “a self-imposed program to learn navigation through environmental clues” finally concluded that the way he viewed the world had palpably changed. Architects felt that they needed to stay away from electronic help, or bring in the computer late, so that the creative process is not (mis-)guided too early. A photographer tells his story of returning to the darkroom, as he felt that the painful manual process forced him to make more conscious and deliberate choices – with a deep, physical sense of presence. The main point here is that these are not sentimental crusaders but people who simply wanted to do their jobs well.
… the real sentimental fallacy is the assumption that the new thing is always better suited to our purposes and intentions than the old thing.
Skills that come easily to an expert are learned the hard way: pilots’ skills correlate with the time they have spent flying without the aid of automation. Neuroscience provides evidence of dedicated assemblies of neurons developed by such deliberate practice. Automation removes complexity from jobs and thus opportunities to hone our skills. A recurring theme of the book is how automation erodes what makes us human in the best way – even if we might object. Carr quotes surprising findings by Csikszentmihalyi (of Flow fame): when people were polled about their current mood at various times, they …
… were happier, felt more fulfilled by what they were doing, while they were at work than during their leisure hours. In their free time, they tended to feel bored and anxious. And yet they didn’t like to be at work.
Psychologists call this unfortunate desire for what you ‘actually’ don’t want miswanting. One explanation is that people might pretend to prefer leisure over work because this is the socially acceptable attitude. An ethnographer confirmed Csikszentmihalyi’s theory with an account of an ancient tribe:
The Shushwaps did not have to wander to survive. They built villages and developed “elaborate technologies for very effectively using the resources in the environment.” They viewed their lives as good and rich. But the tribe’s elders saw that in such comfortable circumstances lay danger. “The world became too predictable and the challenge began to go out of life. Without challenge, life had no meaning.” And so, every thirty years or so, the Shushwaps, led by their elders, would uproot themselves.
If I had to pick the main virtue venerated in this book, it would be accountability. The soldier dropping a bomb with a mouse click feels less responsible.
The congeniality of hand tools encourages us to take responsibility for their use.
The outlook on future wars is gloomy: automated weapons may save lives, but for that very reason may also increase the likelihood of wars. Machines effectively make moral decisions in everyday life already: robotic lawn mowers do so when they fail to spare small animals a human operator might have spotted.
Who determines what the “optimal” or “rational” choice is in a morally ambiguous situation? Who gets to program the robot’s conscience? Is it the robot’s manufacturer? The robot’s owner? The software coders? Politicians? Government regulators? Philosophers? An insurance underwriter?
I believe that ‘futurists’ might not be convinced though. What Nicholas Carr considers specifically human and worth being protected might strike tech enthusiasts as a shortcoming to be fixed by extending and transforming our bodies and minds. Critics might say Carr resorts to poetry in the last chapter in order to circumvent these questions elegantly. The physicist turned stone-age pathfinder said that …
… “primal empiricism,” struck him as being “akin to what people describe as spiritual awakenings.”
This is something you can either relate to immediately and intuitively, or dissect analytically. It strikes a chord with me, but trying to explain it any further leads to Wittgenstein-y struggling with reality:
Only through work that brings us into the world do we approach a true understanding of existence, of “the fact.” It’s not an understanding that can be put into words.
Google’s self-driving cars challenge the distinction between explicit knowledge – which can be cast into code (or words) – and tacit, intuitive knowledge of processes. It seems that this artificial boundary is pushed further and further into the realm of the so-called genuinely human. Carr uses a sonnet by Robert Frost called ‘Mowing’ to demonstrate that
a poet’s scrutiny of the world can be more subtle and discerning than a scientist’s.
As a scythe enthusiast I am biased but he really couldn’t have chosen a better example:
It was no dream of the gift of idle hours,
Or easy gold at the hand of fay or elf:
Anything more than the truth would have seemed too weak
To the earnest love that laid the swale in rows
Again, I think these lines will perhaps not speak to modern life hackers. Domestic automation would turn our homes into workplaces – programmed and dominated by metrics. We apply
the bureaucratic ideals of speed, productivity, and standardization to our relations with others.
Algorithms collect data that lend themselves to quantitative analysis. Our formerly ‘continuous’ selves are turned into a collection of disjointed chunks presented on social media timelines, which deprives us of options for changing our minds and thus for personal growth. Again I am reminded of the proverbial clock from The Shallows, discretizing time. Making technology invisible and unobtrusive is not a solution but just the final stage of a gradual development:
It obscures the way we’ve refashioned ourselves to accommodate the technology.
I have adopted technology as a professional, but sometimes also in response to changes in the way we socialize today, with everyone expected to manage their lives through screens. Technology, especially networked technology, fundamentally changes society. Even the power grid had a subtle impact on engineering culture, business culture, production, and finally on living itself. You cannot fool yourself and remain independent and self-sufficient in your spare time, using technology only when you have to. Carr states that self-reliance was once considered the mainstay of character. He advocates getting lost sometimes, in contrast to Google Maps’ vision:
“No human ever has to feel lost again.” That certainly sounds appealing, as if some basic problem in our existence had been solved forever. And it fits the Silicon Valley obsession with using software to rid people’s lives of “friction.” But the more you think about it, the more you realize that to never confront the possibility of getting lost is to live in a state of perpetual dislocation. If you never have to worry about not knowing where you are, then you never have to know where you are. It is also to live in a state of dependency, a ward of your phone and its apps.
I read Walden at about the same time as Carr’s book – and I am reminded of this quote by Thoreau:
It is a surprising and memorable, as well as valuable experience, to be lost in the woods any time. … In our most trivial walks, we are constantly, though unconsciously, steering like pilots by certain well-known beacons and headlands, and if we go beyond our usual course we still carry in our minds the bearing of some neighboring cape; and not till we are completely lost, or turned round—for a man needs only to be turned round once with his eyes shut in this world to be lost—do we appreciate the vastness and strangeness of nature. … Not till we are lost, in other words not till we have lost the world, do we begin to find ourselves, and realize where we are and the infinite extent of our relations.