In 1870, W. A. Rogers, a British bureaucrat in the Bombay Civil Service, wrote of the fortifying effect that modern transport systems were having on the character of the local populace:
Railways are opening the eyes of the people who are within reach of them in a variety of ways. They teach them that time is worth money, and induce them to economise that which they had been in the habit of slighting and wasting; they teach them that speed attained is time, and therefore money, saved or made. . . . Above all, they induce in them habits of self-dependence, causing them to act for themselves promptly and not lean on others.
The locomotive was a moral engine as well as a mechanical one. It carried people horizontally, across the land, but also vertically, up the ladder of enlightenment. As Russell Hittinger notes:
What is most striking about [Rogers’s] statement is that the machine is regarded as the proximate cause of the liberal virtues; habits of self-dependence are the effect of the application of a technology. The benighted peoples of the sub-continent are to be civilized, not by reading Cicero, not by conversion to the Church of England, not even by adopting the liberal faith, but by receiving the discipline of trains and clocks. The machine is both the exemplar and the proximate cause of individual and cultural perfection.
Tools are, whether by design or by accident, imbued with a certain moral character — they instruct us in how to act — and that in-built, artificial morality offers a readymade substitute for our own. The technology becomes an ethical guide. We embrace its morality as our own. An earlier and more dramatic example of such ethical transference came, as Hittinger suggests, in the form of the mechanical clock. Before the arrival of the time-keeping machine, life was largely “free of haste, careless of exactitude, unconcerned by productivity,” the historian Jacques Le Goff has written. With every tick, the new clock in the town square issued an indictment of such idleness and imprecision. It taught people that time was as measurable as money, something precious that could be wasted or lost. The clock became, to quote David Landes, “prod and key to personal achievement and productivity.”
Just as the clock and the railroad gave our forebears lessons in, and indeed models for, industriousness, thrift, and punctuality, so the computer today offers us its own character instruction. Its technical features are taken for ethical traits. Consider how the protocols of networking, the arcane codes that allow computers to exchange data and share resources, have become imbued with moral weight. The computer fulfills its potential, becomes a whole being, so to speak, only when it is connected to and actively communicating with other computers. An isolated computer is as bad as an idle computer. And the same goes for people. The sense of the computer network as a model for a moral society runs, with different emphases, through the work of such prominent and diverse thinkers as Yochai Benkler, David Weinberger, Clay Shirky, Steven Johnson, and Kevin Kelly. We, too, become whole beings only when we are connected. And if being connected is the ideal, then being disconnected becomes morally suspect. The loner, the recluse, the outsider, the solitary thinker, the romantic quester: all such individuals carry an ethical stain today that goes beyond mere unsociability — they are letting the rest of us down by not sharing, by not connecting. To be inaccessible to the network is to waste one’s social capital, a deadly sin.
But the computer goes even further than mechanical tools and systems in shaping our conception of virtue. It provides more than just a model. It offers us a means for “outsourcing” our ethical sense, as Evan Selinger and Thomas Seager put it. With the personal computer, we have an intimate machine, a technological companion and guru, that can automate the making of moral choices, that through its programming can prod us, nudge us, and otherwise lead us down the righteous path. Arianna Huffington celebrates the potential of the smartphone to provide a “GPS for the soul,” offering ethical “course corrections” as we go through the day.
In discussing the automation of moral choice, Hittinger draws a connection with the work of the historian Christopher Dawson, who in a 1960 lecture argued that modern technology, and the social order it both represents and underpins, has become “the real basis of secular culture”:
Modern technologies are not only “labor saving” devices. A labor saving device, like an automated farm implement or a piston, replaces repetitive human acts. But most distinctive of contemporary technology is the replacement of the human act; or, of what the scholastic philosophers called the actus humanus. The machine reorganizes and to some extent supplants the world of human action, in the moral sense of the term. … It is important to understand that Dawson’s criticism of technology is not aimed at the tool per se. His criticism has nothing to do with the older, and in our context, misleading notion of “labor saving” devices. Rather, it is aimed at a new cultural pattern in which tools are either deliberately designed to replace the human act, or at least have the unintended effect of making the human act unnecessary or subordinate to the machine.
Philosophy professor Joshua Hochschild goes further: “Automation makes us forget that we are moral agents.” When software code becomes moral code, moral code becomes meaningless.
I’m positive that the millions in the United States with no home access to a computer (figures that have been deliberately obscured and ignored for decades now) have a far finer sense of morality than the amoral [Billionaire] Thought Leader [Sociopaths]™ who, for their immense profit, hubris, and vanity, have created the chillingly inhuman and ghastly environment that has prompted you to make your point about ceding morality to computers/Soft Ware (computers/Soft Ware which can only possess the amorality of those HUMAN BEINGS who oversaw the designing of them).
Those millions – “with no home access to a computer” – include: those whose once livable-wage occupations never required the internet; those who sniffed out the vast, unlimited ‘potential’ of the underbelly of digital versus analog technology; millions of elderly; millions (in all age brackets) of the historically, or just recently, barely sheltered and poverty-ridden; and the exponentially increasing homeless.
It does not bode well that those millions have no podium for their full thoughts – desperate 140-character [tracked] tweets™ from a “public” library (which now tracks anyone who uses its computers/Soft Ware) do not count as full thoughts. They are absolutely ignored, left to die in agonizingly slow-motion misery.
Feeling, as I do (in large part due to my ghastly ‘profession’ experiences, the only reason I felt it necessary to basically spin my wheels online, to no good outcome at all: in order to stay employed, in order to have shelter and eat), that ‘Soft Ware’ was overtaken from its beginnings by the Financiers of the world and their War Hawk Military Grunts, I can’t help but wonder whether the word Soft™, preceding the Ware, might not have been the deliberate, ‘gentle’d-down and venal choice of someone the likes of one of those depraved Nazis whom the “US” ‘saved’ and placed … in top positions during Operation Paperclip.
Siiiiggggghhhh … is an extreme understatement.
(Also, Joan Didion had some interesting commentary on the Robber Barons and those Rail Roads. I hardly agree that railroads had anything to do with imbuing morality. Not that certain technologies cannot be of some assistance, but that is not at all what has come to be.)
Just tossing a simple & momentary ‘thank you’ into the void. Lately I am wondering more and more: what is the POINT of the internet? And I get it, of course there is no point. This is probably just another overflow of Murphy’s law: we can, so we did. I realize that a part of me has grown impatient; I’ve heard my own thoughts shriek, “When will the internet just be over!?” But like the clock, it looks like we still have plenty far to run with it. In that case – a reiteration – thank you for going into it with a fine comb and providing some context and comparison. Funny how alone we all feel amidst the digital trillions.
I would put the dogma of “everything free, everything ad-based” quite high on the list of what the Internet has become.
And in fact the homonymy of the adjective “free” in English (gratis and freedom-related) has quite a lot to do with it.
Your post points to the idea that technology within the social construct drives cultural change. However, it is only when individuals decide how and where its uses might be applicable that it is defined or given meaning.
To that end, the conversation could be about conscious participation versus the larger social context. The larger context, as your post suggests, is defined more easily because it paints broad strokes that resist clarification unless applied to the advent of changing cultures.
The use of words like morality, ethical traits, moral character, “GPS for the soul,” and the like in relation to technology/tools assumes an inherent force for good. I would suggest the opposite. Technology, unless used in the service of humankind, is closer to darkness than light.
Can one really embrace or exchange the so-called morals and ethics of tools and technology for our own? I would rather point to the evolution of human consciousness as the catalyst for change, rather than assuming machinery/tools/technology has some inherent life force as such.
Regards, Alan
Mr. Carr
Thank you for all your work, and for linking to the post at our blog. I just wanted to note that the quote should be attributed to Joshua Hochschild, who is the author of the piece you cite. As editor of the site, I posted his piece, and I wanted to be sure to credit him properly.
David, Thanks, and apologies. I have made the correction. Nick
Interesting article. While I admit that “new technology” always has a profound impact on prevailing social codes and behavior, I think it is equally true that somehow “we” always seem able to adapt far past our earliest experience of it. Witness, for example, the many (very similar) warnings that have been made about previous tools. The telephone would kill face-to-face conversation; TV would end the need to leave the house for entertainment; cassette tapes would spell the end of the music industry, and VHS tapes the end of Hollywood; video games would cause violent behavior; and countless more. IIRC there was even a concern that the modern (18th-century) fiction novel was going to lead men into lives of romantic isolation, and that the waltz would lead to a breakdown of social morals (all that touchy dancing).
What I think needs to be kept in mind through all of this is that humankind is an ever-changing, evolving, and adapting social animal. You ask: will software code really replace moral code? But isn’t it moral humans who write that code, decide where and how it may be used, and ultimately make the decision to turn “on” the device that runs the code? In this light, tools don’t replace moral code – they simply provide us with other opportunities to express our morality. Sometimes that expression of morality is bad (Hitler used trains very effectively); often it is meaningful and purposeful and enhances our moral position (using a GPS-enabled, geo-coded 911 call to help a neighbor).
In closing, may I take a page from your title and turn another phrase: “Automation doesn’t kill people: People kill people.”
@Jack,
Your point is well noted, but at the same time I would suggest that this time around it is different.
We are at, perhaps, a unique point in the history of our culture where technological progress is not only accepted, but desired. Previously, new technology was likely to be treated with indifference (“What’s it good for, then?”) or outright hostility. These days we’re likely to cheer any kind of “innovation” unquestioningly.
This is reflected in both mainstream media and concrete political decisions. I am hard pressed to think of any technology, prior to the internet, that received such strong endorsement from lawmakers from the start. Consider the wide range of immunities online businesses received pretty much as soon as the decision to commercialise the internet was made (everything from the DMCA to privacy), in an attempt to prevent any difficulties that might arise in exploring this technology fully – long before it was clear what rights and responsibilities should actually apply to internet services.
It doesn’t help that the internet leverages already existing infrastructure and technology, meaning that adoption does not require large up-front investment. Online economics don’t help either – use of the internet, once you have a connection, is largely seen as free. All of this has resulted in a rapid uptake of everything online, good or ill, without giving our society time to consider the changes and decide what parts we want.
As things stand, we are told to “get online, or else” (through mandates that certain required official actions – like filing tax returns – must be conducted electronically; this from Poland, as it happens), whilst attempts to regulate the online sphere are continually met with resistance (Google is probably the biggest lobbying spender at the moment). If you consider just the simple fact that people who question the technological gradient we’re on, like Nick, are now at the margins of public discourse, you’ll realise that comparisons to concerns about earlier technologies – like the telephone or video games – are not really valid. The way we think about technology has changed too much in the interim.
Like I said, this point in our history may be unique. It may be that we will learn from our mistakes and that our enthusiasm for “innovation” may be tempered by the realisation that “new” and “good” are not synonyms (just like we learned that “new” does not mean “bad” in the past).
“Marshall McLuhan, what are you doin’?”
– Henry Gibson on Rowan and Martin’s Laugh-In, 1968
“Nick Carr, what are you doin’? Reverse-engineering Marshall McLuhan?”
— LG RTB 2015
Forget Myspace, Friendster, Facebook, and LinkedIn, check out the new killer social media phenomenon: http://www.squarespace.com/. I guess it’s where Kevin Flynn has been living since he moved into the virtual world in 1982. I guess it took him 30 years to get the MCP to cough up the VC for the startup. ;) Hurry – I’ve already claimed Ed Dillinger. LOL
Nick, regarding morality being ceded to Computers/Soft Ware[“ROBOTS”], I’m curious as to whether you’ve read:
Paulina Borsook’s 2000 – or 2001? – copyright (the year rather difficult to determine easily, despite all the nanosecond search function[!] [ability]™, since that book was rapidly proclaimed OUT OF PRINT™ – and, oh my goodness, that horrid and near-immediate shutdown of her main points, particularly in the “US,” on Bezos’ sickeningly ironically named “Amazon[!]™ Book Reviews”; her book was immediately shut down as to any timely second, or further, updated edition[s]):
Cyberselfish: A Critical Romp Through the Terribly Libertarian Culture of High-Tech
And
Steve Talbott’s 2007:
Devices of the Soul: Battling for Our Selves in an Age of Machines
Both of these questioning books precede your questioning books (I believe; at least I know Paulina’s book does) in addressing both the morality of, and the unquestioning adoration of, forced reliance on “[elite, human-created and -endorsed] algorithms” in “the 21st century.”
If so, can you share your thoughts?
Thanks!