  This “substitute world” that we see is, in short, a lie. Our brains take sense data and inaccurately analogize it into forms that are already familiar to us. But for children growing up, this substitute world works quite well. It is manageable and legible to us, so that we can engage with the world in a functional and effective fashion. The world as it is grants a child only flashes of strangeness to suggest that reality might be quite different. What really goes on inside our bodies? How did this world come to be? What is death? These questions don’t often present themselves, because we know to be productive with our time rather than diving into what David Hume called the “deepest darkness” of paralyzing skepticism. Yet much joy and satisfaction can be found in chasing after the secrets and puzzles of the world. I felt that joy first with computers. In them I found a world strictly divided between the program and the output. The instructions and the execution. The simple code and the beautiful tree.

  I remain a terrible artist, barely able to draw a human figure. But I fell in love with the concepts of algorithmic programming: instructions, branches, variables, functions. I saw how a program could generate the simulated world of the output. Recursion was too tricky for my seven-year-old self to wrap my head around, but I wanted badly to understand it, and I was convinced that I could. My wonder at the power of programming, the ability to create merely through simple lines of text and numbers, drove me.

  The complexity of life is all around us, but we grow numb to what we see of it, even while so much lies outside our immediate experience: microworlds of cells, atoms, particles, as well as the macrocosmos of our universe containing far more galaxies than the Earth has people (approximately two trillion galaxies by NASA’s 2016 estimate). Programming abstracted away the uncertainty of the world and laid its principles out before me. Notions of elegance and beauty drive programmers just as much as they do mathematicians and poets. What mattered was that I felt the jump from the programmatically simple to the aesthetically complex.

  On a computer, that jump is clean, elegant, and definitive. One popular philosophical fable concerns the myth that the Earth is supported on the shell of a gigantic turtle.*2 “What is supporting the turtle?” asks the philosopher. It’s “turtles all the way down,” comes the reply. There is no final answer available to us, only more questions. Programming offered a stopping point with its artificial world, a final answer. In Logo, there was just the turtle, just the one.

  The first turtle I worked with was a simple triangle, not the waddling shape you see in the pictures above. Later, the program LogoWriter made an appearance at my school. It was a frillier version of Logo, which replaced the triangle with a turtle shape closer to what I’ve used here. I disliked the turtle-shaped turtle (and I still do). LogoWriter added bells and whistles, but the representation of the turtle as a turtle had no functional impact whatsoever on the workings of Logo. It was a superfluous cosmetic change that drew attention away from what was truly remarkable about Logo: the relationship between the program and its execution. The turtle, whether triangle-shaped or turtle-shaped, was already abstracted away in my mind, just a point to designate where drawing would next originate.

  Even as I coded on LogoWriter, that tree still puzzled me. I could not understand the concept behind recursion, the powerful technique that allowed the tree program to draw such a complicated pattern with so few lines of code. I wouldn’t figure it out until my teens, when I would also learn what a powerful role it played in all of computer science and indeed in conceptual thinking in general. Recursion, in a nutshell, is the use of a single piece of code to tackle a problem by breaking it down into subproblems of the same form—like drawing a branch of a tree that is itself a smaller tree. It is envisioning the world as an ornate yet fundamentally elegant fractal. Recursion reflects the efficient, parsimonious instinct of computer programming, which is to get a lot out of a little.
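
  To make the idea concrete, here is a rough sketch of such a tree in modern Python, whose standard turtle module is a descendant of Logo’s turtle. The particular angles, lengths, and shrink factor are my own choices for illustration, not those of the original Logo program.

import turtle

def tree(t, length, depth):
    # Base case: a branch too small or too deep to bother with.
    if depth == 0 or length < 2:
        return
    t.forward(length)                 # draw this branch
    t.left(30)
    tree(t, length * 0.7, depth - 1)  # the left sub-branch is a smaller tree
    t.right(60)
    tree(t, length * 0.7, depth - 1)  # so is the right sub-branch
    t.left(30)                        # restore the original heading
    t.backward(length)                # walk back to the base of the branch

t = turtle.Turtle()
t.speed(0)
t.left(90)                            # point the turtle upward
tree(t, 100, 8)
turtle.done()

  The whole trunk-and-branches pattern comes from one short function that keeps handing smaller versions of the same problem back to itself.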

  The Assembly

  Why would you want more than machine language?

  —JOHN VON NEUMANN

  Programming isn’t a wholly abstract exercise. It requires hardware, which only became available to the average home in the 1980s, and that hardware had a subculture of its own. In the pre-internet days, there was a secret lore surrounding computers, and much of it revolved around the Apple II.

  My first computer was an Apple IIe. It was, by far, the most popular home computer of its age, thanks not only to Apple’s partnerships with educators but also to Apple’s focus on making a general-purpose computer for consumers and hobbyists. Other consumer-oriented computers, like the Commodore 64 or the Atari 400, were dedicated simply to running the primitive software of the time. Apple’s computers used Chuck Peddle’s ubiquitous (and cheap) 6502 processor and sat somewhere in between those casual machines and professional PCs like IBM’s. This owed primarily to Apple creator Steve Wozniak’s background in the hobbyist and computer club community and his dedication to building a computer that could be both accessible and powerful.*3 Wozniak was not trained in academia or research labs. He came out of vaguely countercultural groups whose members got into PCs with the same fervor that others devote to coin collecting, cars, or Dungeons & Dragons.*4

  There was a totality to the Apple IIe that no longer exists on computers today, or even mobile devices. It offered the sense of being close to the fundamental machinery of the system. The Apple IIe did not have a hard drive. Turn it on without a floppy in the drive and you’d just see “Apple ][” frozen at the top of the monitor. I had to boot a floppy disk containing Apple DOS, the disk operating system, where I could program in Applesoft BASIC, as did many others around that time.

  I remember the first programs I tinkered with on the Apple IIe. There was Lemonade Stand, a multiplayer accounting game originally created by Bob Jamison back in 1973, then ported to the Apple II in 1979 by Charlie Kellner. Playing it, you set prices and budgeted for advertising, depending on the weather. If prices were too high, people wouldn’t buy your lemonade. If prices were too low, you wouldn’t make a profit. After a few days of play, your mother stopped giving you free sugar; a few days after that, the price of lemonade mix went up. If it rained, everything was destroyed for that day and you took a total loss. If construction crews were present on the street, they would pay any price for your lemonade. I changed the code so everyone would pay whatever price you set. I made a killing because I could change the rules. Then I changed the code so that for the second player only, people would never buy lemonade at any price, and I asked my mother to play it against me. I won. She was baffled, then simultaneously impressed and annoyed (a reaction that is every child’s dream) when I told her I’d changed the code.

  Profiting from a heavy markup in Lemonade Stand

  BASIC was a less elegant language than Logo. But it was native to the Apple IIe. With a few mysterious commands named PEEK, POKE, and CALL, I could tinker directly with the guts of the Apple IIe. These commands let you access the physical Random Access Memory (RAM) of the machine, the immediate, transient short-term storage of the computer. PEEK(49200) would make the speaker click, a thrilling sound when the single loud beep was the only sound easily available to a BASIC programmer. POKE 49384,0 would start the disk drive motor spinning, good for scaring someone into thinking their disk was being formatted. Other PEEKs and POKEs allowed for manipulation of text to make characters disappear and reappear and move around—things that weren’t easy to do in BASIC proper. You could also crash and reboot your machine, which was otherwise nearly impossible in BASIC. POKEs and CALLs were powerful stuff. PEEK was (mostly) safe.

  These numbers were arbitrary to outsiders, and they were not particularly publicized. Before the internet, programmers had to learn this kind of esoteric knowledge haphazardly from books, magazines, and other enthusiasts. There was a thrill of discovery that can’t be re-created now that most information can be found with a simple web search.*5 I would find a particular piece of Apple lore, then think about it until the next time I got on the computer to try it out. I discovered much, as many did then, through the charts produced by Bert Kersey’s Beagle Bros software company. There is a certain set of people, myself included, for whom this chart will inspire an overpowering nostalgia.

  Part of this nostalgia owes to Kersey’s signature design and clip art, which distinguished Beagle Bros from other vendors. Part of it owes to the sheer intrigue around the secret details contained on the chart: this was hidden knowledge! Kersey captured the mystique:

  Pokes are often used to write machine-language routines that may be activated with the CALL command—the possibilities are infinite.

  Even novelties were fascinating because they revealed unsuspected capabilities of the Apple IIe. This two-line program, which appeared in a Beagle Bros catalogue, was pretty much impossible to parse just by reading it:

  1 HOME: LIST: BUZZ=49200
  2 A$="!/-"+CHR$(92): FOR A=1 TO 48: B=PEEK(BUZZ):FOR C=1 TO A: NEXT: X$=MID$(A$,A-INT(A/4)*4+1,1): VTAB 3: HTAB 10: PRINT X$X$X$: NEXT: GOTO 2

  If typed in and executed, it would print itself and then make a varispeed buzz as the characters in the first line appeared to spin around in time with the buzzing. Who would think of such a thing? My nostalgia for this ephemera also owes to the tangibility and simplicity of these details. The Apple IIe was a very limited machine next to the Macintosh, which arrived only a few years later. With its text screen and minimal graphics, the Apple IIe offered a mechanical transparency that people these days obtain only from using Arduino circuit boards and working with firmware.

  PCs at that time did not have multitasking, something that is so taken for granted today that imagining a computer without it seems absurd. Multitasking is the ability of a computer to run multiple programs at the same time. If you look at the Task Manager on Windows or the process table on Linux or OS X, you will see that your computer is running dozens if not hundreds of programs (or tasks, or processes) simultaneously, with the operating system’s core “kernel” (the central controller of the entire operating system) allocating work to CPU cores in very intricate fashion. This scheduling is entirely opaque to users and to most programmers. But the Apple IIe, like most personal computers of the early to mid-eighties, did one thing at a time. If I asked Applesoft BASIC to PRINT "HELLO" and hit return,*6 the CPU would devote itself exclusively to printing HELLO on the screen until it was finished, at which point it would wait for further input from the user. Even the original Macintosh, with its graphical user interface, could not run a second program without quitting the one already running (though it and MS-DOS both had mechanisms for tiny programs like a clock or a device driver—or a virus—to remain semi-present even if they weren’t technically running). So at any time, only one program could run, and it did so on top of the operating system, which ran on top of the hardware CPU, the central processing unit.
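
  To see what multitasking looks like from the programmer’s side today, here is a small sketch in Python (my own illustration, nothing from the Apple era): two tasks whose output interleaves because the operating system keeps switching between them.

import threading
import time

def task(name):
    # Each task pretends to do three chunks of work.
    for step in range(3):
        print(f"{name}: step {step}")
        time.sleep(0.1)  # pause, letting the scheduler run the other task

a = threading.Thread(target=task, args=("task A",))
b = threading.Thread(target=task, args=("task B",))
a.start()
b.start()  # both tasks are now running "at the same time"
a.join()
b.join()

  On the Apple IIe there was no scheduler to do this juggling; one program ran, and everything else waited.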

  Computers are best understood as a series of abstraction layers, one on top of the other. Each new top layer assembles the previous layer’s pieces into more complex, high-level structures. On the Apple, a BASIC program can be the top level, which executes on top of the DOS operating system, which executes on the hardware. The bottommost level is the hardware: the CPU. The CPU consists of transistors (a few thousand in the Apple’s 6502, billions in a modern chip) arranged so that they can physically execute an “assembly” language (or “machine language”) that is native to that CPU. Assembly is the deepest layer of code, where one can directly give the CPU instructions. And what one can tell it to do is often pretty limited: store this number here, retrieve this number from there, add or subtract these two numbers, and branch to different bits of code depending on some condition or other. In different contexts, these operations can take on different meanings, such as printing text onto a screen or sending something across a network, but the overall level of structure is very primitive. Assembly can be tedious and even painful to program in—but because it is the language of the CPU, it is fast.
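
  The flavor of that bottom layer can be suggested with a toy interpreter, sketched here in Python. The instruction set is invented for illustration, not the 6502’s or any real CPU’s: all it can do is store numbers, add them, and branch.

def run(program, memory):
    # pc is the program counter: the index of the instruction being executed.
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "STORE":            # STORE value, cell
            memory[args[1]] = args[0]
        elif op == "ADD":            # ADD cell_a, cell_b, cell_dest
            memory[args[2]] = memory[args[0]] + memory[args[1]]
        elif op == "JUMP_IF_ZERO":   # branch if a memory cell holds zero
            if memory[args[0]] == 0:
                pc = args[1]
                continue
        pc += 1
    return memory

# "Add 2 and 3," spelled out as moves between numbered memory cells.
memory = run([("STORE", 2, 0), ("STORE", 3, 1), ("ADD", 0, 1, 2)], [0] * 8)
print(memory[2])  # prints 5

  Everything higher up, from printing text to drawing a recursive tree, ultimately reduces to long sequences of operations this primitive.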

  It’s often said that an algorithm is a recipe, but let’s extend that analogy to computing abstraction layers, to illustrate how assembly connects to the hardware beneath it and the high-level language (BASIC, Logo, or something else) above it. Imagine that we’re in a restaurant. A BASIC program is a diner. Diners read the menu and know just enough about the dishes to decide what they want to eat. They don’t have to worry about how much salt to put in the soup, how long dishes need to cook, or how to lay out the food on the plate. They have only high-level control over the end result, their meal. Without knowledge of what goes on in the kitchen, diners take into account their taste preferences, allergies, recommendations, and such, and order a meal off the menu. The chef, who knows how to translate the dishes on the menu into actual recipes, is the compiler or interpreter, who translates high-level instructions (the diner’s order) into far more specific low-level instructions (the exact ingredients list and instructions for cooking). The kitchen cooks are the actual computing hardware, who can carry out a variety of precise cooking tasks reliably and without error. They make the dishes based on the exact instructions given to them by the chef. The diners remain ignorant of the details.

  High-level BASIC programs were translated (or interpreted) into a language called 6502 assembly for the Apple IIe’s CPU. The clock speed of a processor, given in cycles per second, or hertz, dictates just how fast the chip can execute individual assembly instructions.*7 Vastly more daunting than BASIC, 6502 assembly was something I didn’t dare touch as a kid. Assembly language grants access to the physical memory of the computer and allows one to specify numerical operation codes (opcodes) that are actually understood by the hardware in the CPU. In assembly, there is almost no distance between the programmer and the hardware.

  Here’s a “Hello world!” program (one that just displays “Hello world!” and exits) in Apple II 6502 assembly:

  And here it is in C:

  #include <stdio.h>

  int main() {
      printf("Hello world!\n");
      return 0;
  }

  And here it is in Applesoft BASIC:

  10 PRINT "HELLO WORLD!"

  In the eighties, many programmers coded directly in assembly. Programs were simpler and performance was critical. But as computers got larger and more complex, it became infeasible to code in assembly.*8 Programmers needed to learn a different assembly language for each processor (as with the Apple II’s 6502, the Macintosh’s 68000, and the PC’s 8086), which was horrendously inefficient. It was far more efficient to use a CPU-independent higher-level language. All the languages we hear about today, from C++ to Java to Ruby to Python, are higher-level languages. A compiler takes the code written in these languages and translates it into the assembly code for a particular processor.
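
  You can get a feel for this translation without leaving a modern language. Python compiles source code into bytecode for its own virtual machine rather than into a CPU’s assembly, but the shape of the idea is the same, and the standard dis module will show you that lower-level form:

import dis

def greet():
    print("Hello world!")

# Each line of output is one low-level instruction (load a name, load a
# constant, call it, return), the bytecode analogue of the assembly a C
# compiler would emit for a particular processor.
dis.dis(greet)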

  Until I learned assembly in college, and how compilers translated higher-level programming languages into assembly, computers remained partly opaque to me. That gap in my knowledge bothered me, because those lower layers offered far more direct control over the machine, yet I couldn’t understand them. When I took a compilers class in college, the infrastructure of the computer opened up to me. There was no longer a miracle in between my code and its execution. I could see the whole picture, finally, and it was beautiful.

  The Split

  I renounce any systematic approach and the demand for exact proof. I will only say what I think, and make clear why I think it. I comfort myself with the thought that even significant works of science were born of similar distress.

  I want to develop an image of the world, the real background, in order to be able to unfold my unreality before it.

  —ROBERT MUSIL

  When I was a teenager, programming lost its allure. The “real world,” such as it was, had drawn my attention away from what now looked to be the sterile, hermetic world of computers. It was the late eighties. The web did not exist in any accessible form, nor were computers part of most people’s daily lives. I was part of the very last generation to grow up in such a world. People only a few years younger than me would have the nascent public internet and the web to dig into and explore. I had online bulletin board systems (BBSs) and such, but they were strictly cordoned off from my everyday existence, the exclusive preserve of hobbyists, eccentrics, and freaks. And I was miserable in my small suburban enclave. For many programmers, computers held the answer to such misery. They continue to provide a mesmeric escape from the dreary everyday routines of teenage and adult years. I don’t have a clear explanation as to why computers failed to offer me solace as they did for many others. Something kept me from locking completely into the brain-screen bond that kept many teen programmers up all night coding games or hacking copy protection. Literature became my refuge instead.

  My parents had raised me on science fiction, the standard literary junk food of computer geeks, but I felt increasingly drawn to explorations of human emotion and existential crisis. At a point of typical thirteen-year-old despair, I devoured the complete works of Kurt Vonnegut over the course of two weeks. They touched me. Vonnegut led me to explore increasingly “deep” fiction.*9