3. BUGS NEVER DISAPPEAR.
They only hibernate. Our worst fights, and in particular our most trivial fights, always came after nine p.m., though it took us ages to figure that out. One of us—and if I’m being honest, it was me far more often than it was Nina—would get bent out of shape over an unkind word, a logistical screwup, or some other domestic misdemeanor. Some friend was annoying, some bill needed paying, or one or the other of us had failed in one of those very specific ways that only have meaning within the long-established habits of a relationship. These fights, which always began with some tiny offense and inflated into competing indictments of how the original dispute stood for some bigger problem, were dumb. Raw emotion kept the momentum going past any sense of rationality and perspective. And they always started after nine p.m., running on the fumes of fatigue and confusion. After each one was settled, we bemoaned the waste of our voices and our nerves. Nina figured out one solution, which was just to walk out. Then I figured out the other, which was not to discuss anything too heavy, or even make a pointed criticism, after nine p.m. Over a decade down the line, that little discovery has probably saved us hours of wear and tear on our cortisol levels and amygdalae.
Bugs can seem evanescent. I saw server crashes that appeared out of nowhere and, just as mysteriously, seemed to disappear. But bugs never disappear. If you haven’t fixed one, it’s a dead certainty that the enigmatic bug will return. Bad fights may abruptly give way to calmer times, but underlying issues fester, only to explode later if they aren’t excavated.
4. FOLLOW THE 90/10 RULE.
This is an old law of software optimization: “A program spends 90 percent of its time in 10 percent of its code. Humans are very bad at guessing where that 10 percent lies.” Early on, I thought that the strength of our relationship rested on our shared interests (literature, software) and our physical attraction to each other. Looking back, those turned out to contribute a lot less to relationship stability than agreeing on finances, sharing the same lifestyle, and appreciating sarcastic humor. Daily expressions of appreciation matter precisely because they are daily. Remembering not to wear my shoes indoors has done more to maintain my marriage than did our honeymoon. Nina’s smartwatch, which vibrates against her wrist even when her phone is buried in her bag, solved a prosaic but chronic source of stress. Practical matters were crucial, as was breathing space. Each of us needed a room of our own where we could be alone.
A person has only a limited number of resources to devote to a task, even to a relationship. We tried not to spend them arguing over television shows or each other’s eccentric habits. This is unnecessary optimization, and as comp-sci demigod Donald Knuth once said, “Premature optimization is the root of all evil.”
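Because humans are so bad at guessing where that 10 percent lies, programmers measure instead of guessing. A minimal Python sketch of the habit, with illustrative function names rather than anything from a real program:

    # Profile instead of guessing: let the measurements reveal which
    # 10 percent of the code eats 90 percent of the time.
    import cProfile
    import pstats

    def hot_loop():
        # The unglamorous 10 percent where the time actually goes.
        total = 0
        for i in range(2_000_000):
            total += i * i
        return total

    def ostensible_purpose():
        # The part we might guess is expensive, but isn't.
        return sum(range(1_000))

    def main():
        ostensible_purpose()
        hot_loop()

    cProfile.run("main()", "profile.out")
    pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)

The report, not intuition, says where the optimization effort belongs; spending it anywhere else is premature.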
5. DO THE GRUNT WORK.
The 9:1 proportion recurs in computer science. Computer scientist Mary Shaw wrote, “Less than 10% of the code has to do with the ostensible purpose of the system; the rest deals with input-output, data validation, data structure maintenance, and other housekeeping.” That is to say, what seems like the core infrastructure of a relationship—love, sex, children—is made possible only through the support of auxiliary but equally essential pieces. Most of a relationship’s code will not be about love, but about supporting the possibility of love, and that has to do with working out the piddly things such as scheduling and finances and chores. Without the necessary support infrastructure, a relationship will crash like early Windows. Beware software engineers and lovers who want to be there only for the sexy stuff.
6. HYSTERESIS ENABLES HOMEOSTASIS.
Hysteresis is the dependency of a system not just on its current state but also on its past states. Hysteresis is incorporated into an algorithm to prevent changes from having too many drastic effects too quickly. For example, a thermostat set to 68 degrees might not automatically heat at 67 degrees and cool at 69 degrees; it tracks recent activity to prevent itself from oscillating wildly.
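A minimal sketch of that logic in Python; the two-degree band and the stream of readings are illustrative assumptions, not the workings of any real thermostat:

    # A thermostat with hysteresis: a deadband around the 68-degree
    # setpoint keeps it from flipping modes on every one-degree wobble.
    SETPOINT = 68
    BAND = 2

    def next_mode(current_mode, temperature):
        """Return 'heat', 'cool', or 'off' given the last mode and a reading."""
        if temperature < SETPOINT - BAND:
            return "heat"
        if temperature > SETPOINT + BAND:
            return "cool"
        # Inside the band, past state decides: keep heating (or cooling)
        # until the reading actually crosses the setpoint.
        if current_mode == "heat" and temperature < SETPOINT:
            return "heat"
        if current_mode == "cool" and temperature > SETPOINT:
            return "cool"
        return "off"

    mode = "off"
    for reading in [67, 66, 65, 67, 68, 69, 70, 71, 69, 68]:
        mode = next_mode(mode, reading)
        print(reading, mode)

The mode changes only when the temperature strays well past the setpoint; the system’s memory of its recent state damps its reaction to the present.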
In a marriage, too, it can be disastrous to overreact to sudden changes. The human mind has a great knack for rewriting the past based on the caprices of present emotions. In my worst moods, calm nights at home can abruptly look like the wastage of many years—every book read or movie watched just another distraction from the empty spectacle of two vacant souls enduring a pointlessly fortunate life. In our worst weeks—I count perhaps half a dozen of them—neither of us could see a way to continuing our relationship, and whatever was positive about our previous years together vanished from our memory’s landscape. Those moods and memory lapses are also bugs. “With the perturbations of memory are linked the intermittencies of the heart,” wrote the great computer scientist Marcel Proust. And with the perturbations of code are linked the intermittencies of software.
7. FAULT TOLERANCE.
Hidden bugs can come out of the woodwork years or decades after the code was written. (The Y2K bug, despite its lack of apocalyptic consequences, lay dormant for decades in systems designed in the 1970s and 1980s.) I once found a bug that would crash an entire chat server, kicking thousands of users offline, if a single user happened to send exactly sixty messages in one minute. Uncommon, unexpected inputs can produce catastrophic results, and the causes are rarely clear.
Good programmers realize that their knowledge of their programs is incomplete. They ensure that their programs are fault tolerant to the best of their abilities. After twenty years together, I know that my understanding of my wife is only an approximation of a complex individual. I have a chance of fully understanding a small enough segment of code to speak for its perfection; I could never say the same of a relationship. The formation of a relationship welds together disparate and incompatible pieces, each understood only approximately, into an indistinct structure requiring constant upkeep. We prepare not only for expected failures, but for unexpected ones. The relationship must be fault tolerant.
In 2000, back in the Triassic age of cellphone technology, my wife and I went on a disastrous kayaking trip organized by a friend indifferent to safety concerns. (I later found out his “expeditions” were called “death marches” by his other friends.) After paddling through the choppy and polluted waters of Puget Sound, just outside Seattle, we ended up stranded in darkness on a tiny spit of land with the tide encroaching. My kayak apron had not covered me correctly, and my clothes had gotten soaked. I shivered with incipient hypothermia while our friends bickered over which island on the map we were actually standing on. My wife quietly turned on the mobile phone that she’d presciently removed from its cobwebbed home in our car’s glove compartment. She whispered to me, “We have cellphone reception. We can call the Coast Guard.” I wouldn’t have expected her to be able to reassure me in such a dire situation, but something in our shared history made us more robust, and more able to endure such an unexpected and bizarre experience, than either of us had anticipated.
Such moments mean more than opportunities for gratitude; they reassure me that the two of us have built a foundation that could endure many stresses both known and unknown. All we have, finally, is the wonder of the program itself—everyday life and love—and the unceasing task of having to debug and maintain our own code.
The Work
Quisque suos patimur Manis [Each of us suffers his own peculiar ghost]
—VIRGIL
Discovering compilers and the greater depths of algorithms, I renewed my enthusiasm for computer science in college. But that was not what caused me to leave the humanities. Rather, I became disenchanted with literature as a discipline of study as computers took on a brighter shine.
Upon entering college, my relationship with the humanities—founded in the rewarding study of James Joyce, Virginia Woolf, Ralph Ellison, and Herman Melville during high school—went sour quickly for me. My teenage fantasy of academia as refuge—a fantasy considerably more fixed than any ideal I had of a human relationship—was badly squelched. I had a handful of inspiring teachers who did open up the world of the humanities in new and profound ways.*16 But they only made more apparent that much humanities work, while serious in its way, was not serious about literature. The academic world was professionalized, competitive, and sterile. The evisceration of that scholarly dream was best put by John Williams in one of the greatest of all “academic” novels, Stoner.
Stoner looked across the room, out of the window, trying to remember. “The three of us were together, and Dave said—something about the University being an asylum, a refuge from the world, for the dispossessed, the crippled. But he didn’t mean [department chair] Walker. Dave would have thought of Walker as—as the world. And we can’t let him in. For if we do, we become like the world, just as unreal, just as…The only hope we have is to keep him out.”
Dreams stick as stubbornly as barnacles to our memories and feelings, so that they can only be removed once our minds have been rubbed raw. Such pain is the price, still a worthwhile one, for thinking to remain a serious task.
Computer science, while it became my undergraduate major and then the focus of my career, was not serious. I say that without irony: it was a free realm of play where symbols devoid of meaning were electronically juggled at superhuman speeds. Computer science was stable, secure, seemingly immune to trends, and most certainly on the rise. During college, I did an NSF fellowship in compiler research one summer, then interned at Microsoft the next. The promise of financial viability, work with real-world impact, and simple recognition and collaboration was far more appealing than the rat race of academia in which the slices of cheese were growing slimmer and slimmer. Where the humanities had been poisoned by becoming an occupation rather than a passion, software engineering was perfectly suited to disinterested professionalism and playful enthusiasm.
Ironically, the workaday world of software engineering, where I was forced to confront the perplexities of treating the world as data, gave me more occasion for philosophical reflection than the humanities did. It turned out I was ahead of the curve. There wasn’t much data to be had in the late 1990s, but as I worked on internet services at Microsoft and then the far larger data silos of Google, I was confronted each day with the sheer strangeness of how software companies and software engineers were looking at the world: not quite as machines, but certainly not as most humans do. In this new realm, human language and human life were subjugated to the order of the binary.
*1 Aristotle’s answers, particularly in the natural sciences, were usually wrong. His Physics is an ambitious compendium of mistakes and misapprehensions, drawn from intuition rather than science. But Aristotle’s genius lay in claiming new territory—not in mapping it accurately. If he had a fault, it was being far too easily satisfied. Perhaps if he had entertained more doubts about accuracy, he wouldn’t have treated as many subjects as he did.
*2 The turtle myth occurs most prominently in Northeastern Native American mythology, but the source of the philosophical fable, made most famous by Stephen Hawking in A Brief History of Time, is less certain. The legal scholar Roger Cramton attributes it to an anonymous commentator on William James, who told the fable with rocks instead of turtles.
*3 The ranks of hobbyists notably included literary critic and polymath Hugh Kenner, who, while writing eclectic studies like The Poetry of Ezra Pound, The Stoic Comedians: Flaubert, Joyce, and Beckett, Geodesic Math and How to Use It, and Chuck Jones: A Flurry of Drawings, found time to author the Heath/Zenith Z-100 User’s Guide (1984) and convince his friend William F. Buckley Jr. to switch to a word processing program in 1982.
*4 At the time, I and many other computer enthusiasts thought of Wozniak as the visionary star of Apple. Steve Jobs was seen as little more than Wozniak’s handler. Back then, Wozniak was the more approachable face for a company like Apple than the comparatively uptight and slick Jobs. How times change.
*5 Musician Julian Cope describes the mystique of the Velvet Underground’s bootleg “Foggy Notion” record: “With the accessibility of music via Napster and gemm.com, it’s difficult for all you young’uns to understand how mysterious unreleased and bootlegged material was back then.”
*6 The “Enter” key was exclusive to IBM computers. There is probably an interesting history to the concurrent evolution of “Enter” vs. “Return,” paralleling the paired control characters “line feed” and “carriage return.” Briefly, a carriage return moves the cursor position (where the next character is to be printed) back to the beginning of the line (the left, unless you’re typing in Hebrew, Arabic, Thaana, N’Ko, Mende Kikakui, or a couple of other scripts), while a line feed moves the cursor to a new line. On a typewriter, line feed meant “down” and carriage return meant “go to the beginning of the line,” but once computer screens replaced printing typewriters, the two coalesced to carry the same meaning: go down a line and back to the beginning (left or right, depending on the script).
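A quick Python illustration of the two control characters, assuming a terminal that honors them raw:

    # '\r' (carriage return) sends the cursor back to the start of the
    # current line; '\n' (line feed) moves down to a new one.
    import time

    for pct in range(0, 101, 25):
        # Each '\r' overwrites the previous progress report in place.
        print(f"\rprogress: {pct}%", end="", flush=True)
        time.sleep(0.2)
    print("\ndone")  # the '\n' finally moves down to a fresh line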
*7 The 6502 was a 1 megahertz processor: it operated at one million clock cycles per second, enough to execute a bit less than half a million assembly instructions in that second. Sometime after the year 2000, increasingly sophisticated processor design, as well as the advent of multiprocessor machines and dedicated graphics processing units, made clock cycles cease to be a meaningful indicator of performance, so a term like “3 gigahertz processor” means less today: a processor’s speed at executing instructions (as well as the content of those instructions) can vary wildly. In 2003, the classic 3 gigahertz Pentium 4 processor could execute about ten billion instructions per second. By 2012, a 3.2 gigahertz Intel Ivy Bridge chip with four CPUs (or cores) could theoretically peak at executing 130 billion instructions per second, over 30 billion per core, though avoiding the many other possible bottlenecks to reach that speed was not trivial. It’s probably still fair to say that that 2012 Ivy Bridge chip is about 100,000 times as powerful as the 6502 inside my old Apple IIe. What has been utterly lost is the linear nature of the 6502, which executed one instruction at a time in strict sequence. Processor development has gradually relaxed the idea that a computer is a calculator performing strictly ordered operations at increasing speeds.
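For the curious, the back-of-the-envelope arithmetic behind those comparisons, as a Python sketch; the instructions-per-second figures are the rough estimates quoted above, not measurements:

    # Rough instructions-per-second estimates from the text above.
    mos_6502 = 0.5e6         # a bit under half a million, at 1 megahertz
    pentium4_2003 = 10e9     # about ten billion
    ivy_bridge_2012 = 130e9  # theoretical peak across four cores

    print(ivy_bridge_2012 / 4)         # ~32.5 billion per core
    print(ivy_bridge_2012 / mos_6502)  # ~260,000x the 6502 at peak
    # Peak is rarely sustained, which is why "about 100,000 times as
    # powerful" is the conservative, order-of-magnitude claim.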
*8 John von Neumann, one of the greatest geniuses of the twentieth century and the inventor of the standard architecture that forms the basis of nearly all modern computers, was so fluent with assembly that he saw no need for higher-level languages whatsoever. In 1954, thinking the whole idea a waste of time, he said, “Why would you want more than machine language?” He did not foresee, as almost no one did, a time when programs would grow so large even in these higher-level languages that they would defy full comprehension by a single person.
*9 The teenage definition of “deep” being “full of angst and weltschmerz.” The three big authors then for budding, alienated youth were J. D. Salinger, Albert Camus, and Sylvia Plath. I dutifully read them all.
*10 Gardner deserves a great deal of credit for being one of the great popularizers and connectors of mathematically inflected art and recreations in the twentieth century. He was a one-man archive of nerd high culture, and possibly helped create more young mathematicians and computer scientists than any other single factor.
*11 Perec’s path through the apartment building is called a Knight’s Tour. It is commonly assigned in introductory computer science classes: program a computer to find such a Knight’s Tour, if one exists, given a board of some size. Perec had to find his path by hand.
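The classroom version is a small backtracking search. A minimal Python sketch on a five-by-five board, which keeps brute force tractable (Perec’s ten-by-ten building would want a smarter heuristic, such as Warnsdorff’s rule):

    # Knight's Tour by brute-force backtracking: visit every square
    # exactly once using only knight moves.
    N = 5
    MOVES = [(1, 2), (2, 1), (2, -1), (1, -2),
             (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

    def tour(row, col, step, board):
        board[row][col] = step
        if step == N * N:
            return True
        for dr, dc in MOVES:
            r, c = row + dr, col + dc
            if 0 <= r < N and 0 <= c < N and board[r][c] == 0:
                if tour(r, c, step + 1, board):
                    return True
        board[row][col] = 0  # dead end: undo the move and backtrack
        return False

    board = [[0] * N for _ in range(N)]
    if tour(0, 0, 1, board):
        for line in board:
            print(" ".join(f"{v:2d}" for v in line))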
*12 I was not, in fact, a snob or a would-be snob. I was simultaneously consuming gargantuan helpings of trash in the form of comic books, movies, television, and middling fiction. Most of this trash hasn’t stayed with me (but neither has most of the highbrow stuff), which I think is the real marker between great works and mediocre ones.
*13 There are now several online guides to finding the small and unobtrusive plaque, none of which existed then. It reads “Quentin Compson III. June 2, 1910. Drowned in the fading of honeysuckle.”
*14 I read Proust’s In Search of Lost Time almost immediately after getting engaged. It was fortunate I didn’t read it earlier, as one of the main themes is that there is no continuity in our feelings toward another. We can snap in and out of love at a moment’s notice. Had I read it before the engagement, it might have spooked me off marriage for another ten years.
*15 Political scientist Raymond Wolfinger coined the inverse of this phrase, “The plural of anecdote is data,” around 1970. The shift, I believe, brings out the underlying meaning better: while standardized anecdotes in sufficient number can constitute good data, it’s rare for a random assortment of anecdotes to rise to that level.
*16 In particular, literary scholar William Flesch, philosopher David Pears, the writer John Crowley, and the classicist Heinrich von Staden. After college, I found two more in the Joyce scholar Edmund Epstein and philosopher Galen Strawson.
2
CHAT WARS
Interop
It’s easier to ask forgiveness than it is to get permission.
—GRACE HOPPER
AT TWENTY-TWO, when I was just out of college and still a green software engineer, I fought America Online, and AOL won. The battle made the front page of the New York Times. The public was beginning to care about code. This wasn’t the code that would crack the secrets of mathematics or the nature of the universe. This code was the lifeblood of our economy and society. It was the summer of 1999, and people were starting to realize that the internet and the web were becoming a new, dynamic circulatory system for information, coordination, and life itself.