The 20th-Century Computer Science Debate That Foretold Our 21st-Century Technological Fears

“To see a World in a Grain of Sand
And a Heaven in a Wild Flower,
Hold Infinity in the palm of your hand
And Eternity in an hour.”
—William Blake, Auguries of Innocence
*

I’ve been thinking about what constitutes a great programmer. Not “great” in our standard US or British English colloquial definition of (adj.) quite good, or at least not terrible, but great as in canonically Great; part of the story of computing.

The question arose the other night when, plagued by post-PyCon jetlag, I took advice from an algorithmically delivered New York Times piece on sleeplessness, left the bedroom and settled in an armchair with a book I presumed no insomnia could defy. In the event, Selected Writings on Computing: A Personal Perspective by Edsger Dijkstra proved disappointing in that regard. Chagrined, I stayed up reading till dawn.

Most of the essays in Selected Writings are from the 1970s, by which time Dijkstra was already established as a giant in the field, and they provide a colorful window not just onto changes in our notion of programming, but onto the world programmers are called upon to program.

Who was Dijkstra? A background in theoretical physics may have explained the Dutchman’s apparent mistrust of pure mathematicians. His marked resemblance to the actor Bryan Cranston as Walter White in Breaking Bad would have unnerved those pure mathematicians had he lived a little beyond his allotted seventy-two years, from 1930 to 2002. As expected, many of the essays in Writings are technical timepieces that mean little to me (and yet some tempt with titles like “A New Elephant Built from Mosquitoes Humming in Harmony”). His colorful reports on conferences and events from the time, on the other hand, are priceless, jeweled with an unusual combination of playfulness, style and an unsparing honesty that prefigures the blurty hacker culture of today. Like most great performers, Dijkstra was also a fan, but if he was in your audience you needed to be on your mettle, because he didn’t suffer fools, even very clever ones.

In Writings there is a constant sense that the words were penned at a time when everything was still up for grabs—before the PC, software microverse or social media fray were clearly defined or their trajectories set, a time when it felt natural for pioneers like him to describe algorithms not as being “written,” but discovered, as though they’re out there waiting for us like a new star or particle. He submitted to these pieces being published, he writes, partly to further his fluency in written English (being Dutch, we assume he already speaks it better than most native Anglophones), and you can tell he loves language, not least from his joy in learning the phrase “utterly preposterous” from a German colleague, which he converts to the acronym “UP” and finds great use for at technical plenums. Dijkstra thinks nothing of quoting the visionary English artist and poet William Blake, whom he reveres, in the company of computing peers, nor of clashing with received thinking or reputation.

The current generation of coding doyens tends to be known for significant programs, languages or tools they’ve written or conceived, but Edsger Dijkstra’s Greatness consisted in something else entirely: an ability to identify and dismantle many small but critical roadblocks to hardware and software creation, without which none who followed could have made their showier, more user-focused contributions. How would we persuade two or more computers to talk to each other when they are asynchronous—meaning their processor clocks are out of step with each other? How do you create a distributed system, where there is by definition no centralized control, that is also self-stabilizing, able to settle back into a legal state from any starting condition (because if you can’t, there’s no point in even thinking about an application like the World Wide Web)? These were among Dijkstra’s questions, and his heroic status rests on having found artful algorithmic answers to them, elegant as haikus.
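
For a flavor of that elegance, consider his 1974 paper “Self-stabilizing Systems in Spite of Distributed Control,” which answers the second question with a ring of machines that heals itself. What follows is a minimal Python simulation of its simplest variant, the K-state token ring; the privilege rules are Dijkstra’s, while the ring size, variable names and randomized starting state are my own illustrative choices.

import random

N = 5          # machines arranged in a ring
K = N + 1      # Dijkstra requires more states per machine than machines

# Begin in an arbitrary, possibly illegal, configuration.
state = [random.randrange(K) for _ in range(N)]

def privileged(i):
    # Machine 0 holds a privilege when its state matches its left
    # neighbor's; every other machine holds one when its state
    # differs from its left neighbor's.
    if i == 0:
        return state[0] == state[N - 1]
    return state[i] != state[i - 1]

def fire(i):
    # A privileged machine "fires," surrendering its privilege.
    if i == 0:
        state[0] = (state[0] + 1) % K
    else:
        state[i] = state[i - 1]

for step in range(30):
    holders = [i for i in range(N) if privileged(i)]
    print(f"step {step:2d}  states={state}  privileged={holders}")
    fire(random.choice(holders))  # at least one machine is always privileged

Run it from any scrambled starting state and, within a bounded number of steps, exactly one privilege remains, circulating around the ring forever: order restored with no central referee.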

Perhaps because digital worldviews were at stake in these early days, hackles frequently rose. Most fascinating is Dijkstra’s report on a transatlantic “Communication and Computers” gathering at Newcastle upon Tyne in the northeast of England in 1973. Though characteristically effusive in praise of speakers he feels have approached their subjects with wit and (most crucially) depth, the Dutch code master’s patience is also tested in ways that make for an acute cultural study. Where computing in Europe took place primarily within the academy, the starry US contingent in Newcastle were part of a fast-moving industry driven by a distinctively American mix of idealism and trust in the whims of markets—both of which Dijkstra regards with suspicion.

How did this divergence occur? Historians claim Britain forfeited its (and effectively Europe’s) early leadership in computing when paternalistic mandarins chose to classify the work of pioneers like Alan Turing after World War II. By contrast, the United States, big and openhearted at its best, invited the world in and thereby took the lead. But now Dijkstra sensed danger, believing he saw scientific truth being skewed to the requirements of large corporate interests. IBM, Bell Labs and ARPA had all sent large contingents to Newcastle, and the incisive mind of Dijkstra was troubled to see good scientists dancing around them, even censoring their own research to suit commercial investment priorities. This resulted in compromised systems that, once built, couldn’t easily be unbuilt and would constrain potentialities into the future. The ARPANET, the 1969 forerunner of the modern internet, was one such avoidably flawed project—and this at a time when ARPA was trying to offload it onto a private sector where companies like AT&T saw no future in it and passed.

Sparks flew when Dijkstra attended a pair of long talks by one of the totemic figures of American computing. His disinclination to name the speaker might have passed for gallantry had we been left in any doubt that the man in question was none other than Douglas Engelbart, head of the Augmentation Research Center (ARC) at the Stanford Research Institute, whose team of young Californians preached the utopian vision of a networked world in which knowledge—and therefore wisdom—would be distributed; in which our electronically connected species would be held in a new embrace of fellow feeling. Wouldn’t empathy arise spontaneously from such connection? How could prejudice and xenophobia resist withering away once everything was shared?

What follows is a little shocking to me. Five years prior to the Newcastle summit, Engelbart had dazzled an audience of US computing peers with a vision for the digital future. His feature-length presentation was entitled “A Research Center for Augmenting Human Intellect,” and demonstrated the first hypertext links, wooden “mouse” and graphical user interface (or GUI, a screen navigated by pointing at graphical symbols rather than by typing commands). By all accounts, most who were present in San Francisco’s Civic Auditorium that day went away with their view of computing and its place in the world changed. Now Engelbart repeated his ideas for a European computing audience in Newcastle.

To those of us who saw and were excited by the democratizing, decentralizing potential of the web in the 1990s, Engelbart has always been a hero. His generous view of humanity and the “global wisdom society” was tied to a countercultural faith that people were innately collaborative and decent; that on occasion society bent this inheritance toward narcissism as wind bends a Monterey cypress—but that knowledge and connection could remedy such distortions. How much harder it should be to drop bombs on people you can see and communicate with.

Except that, because the revolution faded into the recession of the 1970s, society never came to a clear verdict on whether people enter the world good and requiring only encouragement, or with chaotic hearts in need of taming. Most of our big policy debates on issues like education, the penal system and welfare continue to revolve around this irresolution. But in 1968 in San Francisco and 1973 in Newcastle upon Tyne, Engelbart’s vision was radical and optimistic and most were willing to give it a try. Why wouldn’t they?

Not Dijkstra. In Engelbart’s three separate addresses to the conference, the Dutch master saw something not just “terribly bad,” but also “dangerous.” The first of these verdicts is the least shocking, because no one ever accused Douglas Engelbart of being a magnetic speaker. Neither was the second to do with substance, because “the product he [Engelbart] was selling” was “a sophisticated text editor that could be quite useful.” Of graver concern was what Dijkstra saw as “the way he appealed to mankind’s lower instincts while selling…an undisguised appeal to anti-intellectualism [that] was frightening.”

Now Dijkstra rehearses a series of complaints whose prophetic nature is only just becoming evident. Talk of “augmented knowledge workshops” reminds him of the US education system’s extremely “knowledge-oriented” ethos, in opposition to his view that “one of the main objects of education is the insight that makes a lot of knowledge superfluous.” A thought that puts me in mind of the coding nostrum that a good programmer distills a function down to the fewest lines, where a great one finds a way to make it redundant altogether—an approach from which most of us civilians could learn.
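
By way of a toy Python illustration (my example, not one of Dijkstra’s): the good programmer’s duplicate check is already distilled to a line, while the great programmer chooses a representation under which the question never arises and the function evaporates.

# The "good" version: the check, distilled to one line.
def has_duplicates(items):
    return len(items) != len(set(items))

# The "great" version: keep the data in a set from the start, and
# the function above becomes superfluous -- duplicates simply cannot
# accumulate, so there is nothing left to check.
seen = set()
for tag in ("a", "b", "a"):
    seen.add(tag)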

More striking yet was Dijkstra’s instinctual (and at the time counterintuitive) dismay at Engelbart’s overarching project, which was about connecting people on a global network, expressed in language foreshadowing Mark Zuckerberg’s four decades later. In essence what we glimpse here is the clash of two visionary programmers of the same generation, reacting in divergent ways to the legacy of the Second World War. Where the American was just old enough to have served in the Navy for two years—but for a country that had never been invaded—the Dutchman’s childhood was defined by Nazi occupation and its assault on the individual. All the same, it is hard not to feel a little awed by the acuity of Dijkstra’s projection into the future; by the speed and accuracy of his ability to unspool the logic of a proposition when he says of Engelbart, fairly or not: “His anti-individualism surfaced when he recommended his gadget as a tool for easing the cooperation between persons and groups possibly miles apart, more or less suggesting that only then are you really ‘participating’: no place for the solitary thinker.”

Despite some contrastingly enjoyable talks, Dijkstra went away frustrated. Newcastle’s highlight for him was a visit, with “a true archaeologist!”, to Hadrian’s Wall, the barrier the eponymous Roman emperor began building across the north of England in AD 122.

The point of all this for me, however, is to explain why the definition of programming Dijkstra provides in Selected Writings carries such weight and seems so interesting, not least because most of us feel we know what programming is by now, so never ask the question or imagine what an answer might be beyond “telling a computer what to do.” But in Dijkstra’s pomp, with first principles still subject to debate, the question still seems necessary. In accepting a 1974 award in the name of all the colleagues who had contributed to “the cause,” he says:

The cause in case is the conviction that the potentialities of automatic computing equipment will only bear the fruits we look for, provided that we take the challenge of the programming task and provided that we realize that what we are called to design will get so sophisticated that Elegance is no longer a luxury, but a matter of life and death. It is in this light that we must appreciate the view of programming as a practical exercise in the effective exploitation of one’s powers of Abstraction.

Powers of Abstraction? A word I’ve noticed computerists using like a key to the digital door but have not yet begun to grasp myself, and whose pursuit will draw me far beyond my comfort zone, deep into the microcosmos. Dijkstra ends this speech with a quote from his beloved Blake.

He who would do good to another must do it in minute particulars

General Good is the plea of the scoundrel, hypocrite and flatterer

For Art and Science cannot exist but in minutely organised particulars

One of Blake’s most candescent prints depicts an idealized Sir Isaac Newton, father of modern science, sitting naked on a lichen-covered outcrop of rock as he leans forward in intense concentration, performing some geometric calculation with a pair of compasses. The Scottish sculptor Eduardo Paolozzi’s three-dimensional, modernist rendering of Blake’s image greets visitors to the British Library at the entrance to the courtyard every morning. But as Paolozzi knew, Blake did not intend to celebrate Newton with a masterpiece whose full title is Newton: Personification of Man Limited by Reason. Rather, Blake regarded the scientist’s mechanistic reading of the world with disdain bordering on alarm, and as the sun rises I spend a long time trying to unpack Dijkstra’s use of Blake’s quote here, then deciding whether I agree with it or not. I get to wondering whether Blake, who famously saw “a world in a grain of sand” and toward the end of his life painted the soul of a flea as seen in a vision, would have liked the uncanniness of Einsteinian physics better…which leads me naturally to wondering how he would have viewed the digital realm. Perverse as it might seem and for reasons that will become clear, this matters more than you might think going forward.

__________________________________

Excerpted from Devil in the Stack: A Code Odyssey by Andrew Smith. Copyright © 2024. Reprinted with permission of the publisher, Atlantic Monthly Press, an imprint of Grove Atlantic, Inc. All rights reserved. 


