Broad Band
by Claire L. Evans

Highlights
- Female mental labor was the original information technology, and women elevated the rudimentary operation of computing machines into an art called programming.
- To live with a box that connects the world to itself is expansive, life altering, and even a little magic. But the box itself is still only an object. If not taken to pieces and recycled, it’ll poison Earth for millennia, a permanence justifiable only if we believe what happens before the landfill is worthwhile. Spiritual, even. Computers are built to be turned on, cables are meant to be patched in, and links are made to be clicked. Without the human touch, current may run, but the signal stops. We animate the thing. We give it meaning, and in that meaning lies its worth. History books celebrate the makers of machines, but it’s the users—and those who design for the users—who really change the world.
- By the mid-twentieth century, computing was so much considered a woman’s job that when computing machines came along, evolving alongside and largely independently from their human counterparts, mathematicians would guesstimate their horsepower by invoking “girl-years,” and describe units of machine labor as equivalent to one “kilogirl.”
- “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; (& Metaphysician),” she wrote to Charles Babbage later in life, “for with me the two go together indissolubly.”
- But Ada didn’t only explain the technical workings of the Analytical Engine. She imagined the impact it could have on the world, teasing out the implications of general-purpose computing to anticipate the transformative power of software.
- Like her father’s, Ada’s work outlived her, although it would be nearly a century before it was properly recognized. It took until the beginning of the computer age, when the magnitude of their prescience became undeniable, for her Notes to be republished, in a British computing symposium; its editor marveled, in 1953, that “her ideas are so modern that they have become of great topical interest once again.” Ada was lucky to have been born wealthy, noble, and relatively idle. Even without a professional path, she was able to educate herself, and she had time to privately follow her passions. Still, she could have done so much more, and it’s evident that she wanted to. Many brilliant women—born in the wrong centuries, the wrong places, or hoping to make an impact on the wrong field—have suffered similar fates, and far worse.
- back in the days “when the computer wore a skirt.”
- Human computing thrived as a stopgap between the emergence of large-scale scientific research and the capacity of hardware to carry out its calculations; eventually, the tireless machines that emerged from the spike in computer science research during the Second World War wore down their competition. After that war, the machines took over, decisively and permanently, shifting the definition of the word “computer” for the first and last time.
- Alone, women were the first computers; together, they formed the first information networks. The computer as we know it today is named for the people it replaced, and long before we came to understand the network as an extension of ourselves, our great-grandmothers were performing the functions that brought about its existence.
- Coaxing information into and out of the new machines was considered a woman’s job, too, on the level with typing, filing documents, and patching phone calls from place to place. Not that it was easy. Dealing with early mechanical computers required a keen analytical mind and limitless patience. Just like the women whose math moved mountains, early computer programmers and operators were tasked with enormous, intractable problems. Their creative solutions often meant the difference between life and death.
- Years later, when Grace was an established figure in the new field of computer programming, she’d always assign the hardest jobs to the youngest and least experienced members of her team. She figured they didn’t have the sense to know what was impossible.
- After the moth incident, she bought a box of plastic bedbugs in town and scattered them around the back of the computer on a lark, causing a two-day panic.
- Note: Grace Hopper
- The partial differential equation turned out to be a mathematical model for the central implosion of the atomic bomb. Grace never knew, until the bombs fell on Nagasaki and Hiroshima, precisely what she had helped to calculate.
- Note: Grace Hopper, pioneer of computing, performed calculations on the Mark I with von Neumann to help work on the atomic bomb without knowing it.
- To save on processing time, Grace and Richard invented coding syntax and workarounds that set the groundwork for the way code is written to this day. As early as 1944, Grace realized she could save herself from rewriting code from scratch for each problem by holding onto reusable scraps, which came to be known as subroutines.
- Note: Grace Hopper and Richard Bloch
- When Grace’s code got thorny, she made a habit of annotating the master code sheets with comments, context, and equations, making it easier for colleagues to unravel her handiwork later. This system of documentation became standard practice for programmers, and it still is: good code is always documented.
- Note: Grace Hopper
- Grace’s most lasting contributions to the emerging field of computer programming all have to do with democratizing it: she pushed for programming advances that would radically change the way people talk to computers. With her help, they wouldn’t need advanced mathematical terms, or even zeros and ones. All they’d need is words.
- Note: Grace Hopper
- They hired engineers and former telephone company workers, who were good with relays, but most of the people who actually wired the ENIAC were women, part-time housewives with soldering irons on an assembly line.
- The ENIAC Six were an odd mix, thrown together by the circumstances of war. Betty Jean Jennings grew up barefoot on a teetotaling farm in Missouri, the sixth of seven children, and had never so much as visited a city before pulling into the North Philadelphia train station. Kay McNulty was Irish, her father a stonemason and ex-IRA; Ruth Lichterman, a native New Yorker from a prominent family of Jewish scholars; Betty Snyder, from Philadelphia, her father and grandfather both astronomers. Marlyn Wescoff, also a Philly native, had been hand calculating since before the war, and she was so adept that John Mauchly said she was “like an automaton.” They all met for the first time on a railroad platform in Philadelphia, on their way to the Aberdeen Proving Ground, a marshy plot in Maryland the army had converted into a weapons testing facility. Bunked together, they became fast friends. Even after long days training on the IBM equipment they would be using to tabulate and sort ENIAC data, they stayed up late talking about religion, their vastly different family backgrounds, and news of the secret computer.
- There were no instructions to read, no courses to take. The only manual for the ENIAC would be written years later, long after the women had reverse engineered it from the machine itself.
- Betty Snyder borrowed maintenance books for the machine’s punch card tabulator from a “little IBM maintenance man by the name of Smitty,” who told her he wasn’t allowed to lend them out but did anyway, just for a weekend, so she could figure out how the ENIAC’s input and output worked.
- They found a sympathetic man to let them take a plugboard apart and make their own diagram for reference, even though his supervisor wasn’t sure they’d be able to put it back together again (they were).
- As it turned out, Mauchly found other people to worry about those things—six people, in fact, in wool skirts and thrilled by the challenge. “How do you write down a program? How do you program? How do you visualize it? How do you get it on the machine? How do you do all these things?” wondered Betty Jean. It would be up to the ENIAC Six to figure it out.
- Betty Jean Jennings was more blunt. “It was a son of a bitch to program,” she wrote.
- The next morning—February 15, 1946—Betty arrived at the lab early and made a beeline to the ENIAC. She’d dreamed the answer, and knew precisely which switch out of three thousand to reset, and which of the ten possible positions it should take. She flipped the switch over one position, solving the problem instantly. Betty could “do more logical reasoning while she was asleep than most people can do awake,” marveled Betty Jean.
- The event made headlines. The women were photographed alongside their male colleagues—they remember flashbulbs—but the photos published in newspapers showed only men in suits and military decorations posing with the famous machine.
- “If the ENIAC’s administrators had known how crucial programming would be to the functioning of the electronic computer and how complex it would prove to be,” Betty Jean Jennings eventually determined, “they might have been more hesitant to give such an important role to women.”
- Men may have dropped bombs, but it was women who told them where to do it.
- Later in her career, she’d insist on not being called in until after the engineers had been stuck for at least four hours. Otherwise, it was a waste of her time.
- Note: Betty Holberton
- Grace Hopper was floored by Betty’s Sort-Merge Generator. According to Grace, it marked the first time a computer was ever used to write a program that wrote a program.
- John Backus, a computer scientist at IBM and a contemporary of Grace Hopper’s, famously characterized programmers in the 1950s as a priesthood, “guarding skills and mysteries far too complex for ordinary mortals.”
- Grace wanted out of the priesthood. She strongly believed that computer programming should be widely known and available to nonexperts.
- Grace knew that would only happen when two things occurred: users could command their own computers in natural language, and that language was machine independent.
- Grace Hopper finished the first compiler, A-0, in the winter of 1951, during the peak of her personnel crisis at Remington Rand. The following May, she presented a paper on the subject, “The Education of a Computer,” at a meeting of the Association for Computing Machinery in Pittsburgh. In the paper, she explained something counterintuitive: that adding this extra step, a layer between the programmer and the computer, would increase efficiency.
- A-2 introduced what Grace called “pseudo-code,” a kind of in-between language more human than computer. It wouldn’t seem particularly friendly to a modern programmer, but this shorthand was the first step toward programming languages nonexperts could use. That would be Grace Hopper’s legacy.
- Grace flexed some navy connections and approached the Department of Defense, which at that time was running 225 computer installations with plans for many more. Only a month after the meeting at Penn, the DoD hosted the first organizational meeting of the Conference on Data Systems Languages, or CODASYL, at the Pentagon. Every major computer manufacturer sent diplomats to rub elbows with government brass and representatives from private industry. The common cause was a forward-thinking, easy-to-use language, preferably in simple English, that would be independent of any specific machine.
- For this, Grace is remembered as the grandmother of COBOL. Like a grandmother, she was responsible for the child but did not deliver it. Her diplomatic skills brought competitors, programmers, professional organizations, the military, and clients together. Her insistence that this milestone be reached collaboratively, rather than through competition for market share, was thirty years ahead of its time: the next generation of programmers to come of age might have sneered at COBOL’s unwieldy syntax, but many would employ a similar model of distributed innovation. As her biographer points out, Grace’s emphasis on collaborative development, and the network of volunteer programmers she mobilized, predated the open-source software movement by four decades. Further, building common languages that remained consistent even as hardware came and went would prove essential to the evolution of computing.
- In 1967, the April issue of Cosmopolitan ran an article, “The Computer Girls,” about programming. “The Computer Girls,” the magazine reported, were doing “a whole new kind of work for women” in the age of “big, dazzling computers,” teaching the “miracle machines what to do and how to do it.” Just as a woman twenty years previous might have chosen a career in education, nursing, or secretarial work, today, its author implied, she might consider computer programming.
- The shift from programmer to software engineer was an easy enough signal for female programmers to interpret. The new paradigm, subtle as it may seem, “brought with it unspoken ideas about which gender could best elevate the practice and status of programming,” historian Janet Abbate writes. She argues that this symbolic exclusion, in concert with the more concrete factors at play—wage discrimination, lack of childcare, lack of adequate mentoring and support—signaled to women to avoid computing just as it was suffering from an industry-wide shortage of talent.
- During the software crisis, aspects of software design that rely on “stereotypically feminine skills of communication and personal interaction” were “devalued and neglected,” ignored by male programmers and skipped over in software engineering curricula. As a result, the industry suffered, and suffers still.
- The novelist Richard Powers once wrote that “software is the final victory of description over thing.”
- every technological object, be it a map or a computer game, is also a human artifact.
- Even when women were invisible, it never meant they weren’t there.
- She convinced them by speaking the language of their common interest: the computer was worth more as a tax-deductible donation than it was obsolescing in storage. That’s how, in April 1972, on the bed of a semitruck, the People’s Computer came to be delivered to Project One.
- That Pam managed to procure the SDS-940 from TransAmerica is still awe-inspiring. A 1972 Rolling Stone profile called it “one of the great hustles of modern times,” citing a fellow Resource One communard who claimed that Pam, soft-spoken as she seemed, could draw blood from a turnip.
- Ironically, much of Resource One’s funding came from the establishment: Bank of America supported the project, hoping to make good with—or monitor, depending on who you ask—the young people then so earnestly upending the status quo.
- She imagined Teletype terminals at every Switchboard phone room, in bookshops and libraries citywide, and within Project One itself, all daisy-chained into a decentralized network of shared resources and vernacular information. “If people needed something,” she says, “they could type it in and get it. If they needed help, if they wanted to share a car, or needed resources, they could get it.” Basically, she imagined the Internet.
- With language like this, a new archetypal image of the computer user was introduced to the world. Not the studious woman programmer, like Grace, or for that matter the software engineer in suit and tie, but the wild-eyed, wild-haired hacker, who was always a man.
- The cultural work of making Community Memory approachable to the people fell largely to Jude Milhon, a notorious female hacker and writer who would later come to be known as St. Jude, patroness of the “cypherpunks,” a computer subculture devoted to matters of encryption and copyright.
- Jude seeded the Community Memory database with provocations designed to lure users to the screen. She’d post proto-crowdsourcing questions, like: WHERE CAN I GET A DECENT BAGEL IN THE BAY AREA (BERKELEY PARTICULARLY) / IF YOU KNOW, LEAVE THE INFORMATION HERE IN THE COMPUTER.
- Community Memory demonstrated, long before the Web, how networked computing can strengthen local bonds and create a culture of its own. By connecting people, bagels, and jokes, it presaged the quirks of online community by a decade. “We opened the door to cyberspace and saw that it was hospitable territory,” Lee proclaimed.
- it might have been more than just a sense of completion that led to her move out; as the de facto woman in the original Resource One group, “Pam found herself unwittingly cast into a ‘queen bee’ role, with others trying to unload their emotional work onto her,” Lee Felsenstein tells me in an e-mail. “This may have been the major factor in her abruptly leaving the group—the burden of the accumulated desires of so many of us to act like our mother.”
- Their Social Services Referral Directory succeeded where efforts to interlink Bay Area Switchboards had failed, and for a simple reason: it actually considered its users.
- The Social Services Referral Directory represents one of the earliest efforts to apply computing to social good, and it reveals what happens when the process of technological design and implementation is opened up to more diverse groups of people. When the women of Resource One—radicals, feminists, and organizers all—brought their shared values to the machine, the result was a product more beneficial to their community.
- Her name was Elizabeth Feinler, but everyone back home in West Virginia called her Jake.
- At Stanford, Jake worked in a basement lab. It wasn’t long before an upstairs neighbor, Douglas Engelbart, began popping down to her office for organizational advice. Engelbart had invented a computer system called NLS (oNLine System) in the late 1960s, a predecessor to the modern personal computer in both form and philosophy, and the first system to incorporate a mouse and a keyboard into its design. NLS was so visionary that the first time Engelbart presented it in public is generally known in tech history as “the Mother of all Demos.”
- Putting the Resource Handbook together made Jake an instant authority on the ARPANET, and she eventually built the Network Information Center from a two-person operation to an eleven-million-dollar project, taking on all the major organizational responsibilities of the growing network. Working with a largely female staff, she created the ARPANET Directory, comprising, along with the Resource Handbook, the “electronic yellow and white pages” of the early Internet. In addition, she managed the registry for all new hosts, indexed the most important conversations on the network, ran the NIC’s Reference Desk—a hotline for the Internet that rang day in and day out—and suggested protocols that remain core utilities of the Internet to this day.
- “The Internet was more fun than a barrel of monkeys,” she said. “Having fallen in at an early stage, I had more fun than I ever thought I would ever have.” To celebrate network milestones—the first hundred hosts, the successful switch from one protocol to another—they’d throw parties in the conference room. Once, at the height of the spring season, Jake brought crabs and fresh asparagus for everyone at the NIC. “She wanted to have a crab feast,” says Mary Stahl with a laugh.
- Once the paper directory grew too large to update, Jake decided to build a people finder into the network itself. She established a new server at the NIC called WHOIS. “WHOIS was probably one of our biggest servers,” she explained. “We stopped putting out the directory, which was essentially the network phone book, and we put all that information under WHOIS. So you could say ‘WHOIS Jake Feinler,’ and it would come back and give you my name, address, e-mail address, affiliation on the net, that kind of thing.”
- But what would these domains be called? Jake suggested dividing them into generic categories, based on where the computers were kept: military hosts could have .mil, educational hosts .edu, government hosts .gov, organizations .org, and so forth. Commercial entities weren’t part of the Internet yet, but to fill it out, Jake and her colleagues debated between .bus, for business, and .com for commercial. Jake favored .bus, but there were some hardware components that used the word. They settled on .com. That we use this domain most of all today should say something about what the network has become.
- Jake spent her entire career keeping the young Internet tidy, labeled, and in check; without the NIC, it very well may not have worked.
- “The main purpose of the Internet was to push information across it,” Jake says. “So there had to be somebody who was organizing the information.” Who else but the women who were already there, answering the one phone number everybody knew by heart?
- She broke into the field when she dropped out of graduate school to take a job at Bolt, Beranek and Newman, where she fell in love with networks but was so consistently ignored by her coworkers that she once gave an entire presentation about the solution to a difficult, unsolved routing problem, only for the man running the meeting to announce that there was a difficult, unsolved routing problem he wanted everyone to solve—the very problem to which she’d just presented the solution.
- Ethernet could properly support only some hundred-odd computers before the packets of information traveling around the network started to collide and interrupt one another, like a bad conference call. This meant it could never really scale. Radia’s manager at the time tasked her to “invent a magic box” that would fix Ethernet’s limitations without taking up an additional iota of memory, no matter how large the network was. He issued this decree on a Friday, right before he was to leave on a weeklong vacation. “He thought that was going to be hard,” Radia says. That very night, Radia woke up with a start and a solution. “I realized, oh wow—it’s trivial,” she says. “I know exactly how to do it, and I can prove that it works.”
- This is Radia’s signature touch. She designs systems that run with minimal intervention, through self-configuring and self-stabilizing behavior. This approach makes a large computer network like the Internet possible. As she said in 2014, “Without me, if you just blew on the Internet, it would fall over and die.”
- The WELL had the leggy freshness of a booming frontier or a nation determining its constitution in the afterglow of a revolution. The closest thing it had to law was an axiom handed down by Stewart Brand: You Own Your Own Words. Or, as the WELLbeings say, YOYOW.
- When Stacy took her business proposal to the bank, “people just openly made fun of me. And looked at me like I was the biggest loser in the world to ever think that people would want to socialize via their computers.”
- At the time Stacy founded Echo, the entire Internet was only about 10 to 15 percent female. But women made up nearly half of Echo’s user base. “My success was due in part to the fact that I was the only one trying,” she explains.
- If nothing else, real-world accountability made Echo civil: it’s harder to be cruel to someone online when you might run into them on Monday night at the Art Bar.
- “When the world that you’re in is made purely of speech,” posting offensive material just for the sake of it is “like you’re bombing the buildings.”
- Many of her colleagues didn’t consider interactive multimedia to be real computer science—it was seen as something fluffy, less serious, far closer to the humanities than to classical programming. But Wendy couldn’t shake the glimpse of the future she had seen: a future where images, texts, and ideas were connected through intuitive screen-based links, and computer screens were approachable to all.
- Nineteen eighty-seven was a banner year for hypertext, as it happens. Beyond the release of HyperCard, it marked the first academic hypertext conference, Hypertext ’87, in Chapel Hill, North Carolina.
- If my documents, strewn on my desk or clustered as icons on a screen, appear inscrutable to an outside observer, that’s no flaw in my system. They should be meaningless, because they’re only the remnants of a transformation process, like a sheaf of molted skin. The real technology is the user. That means me. And you.
- academic Internet. They called it the World Wide Web. To demonstrate the World Wide Web, Berners-Lee and Caillau brought their own computer with them on the plane from Geneva: a ten-thousand-dollar jet-black NeXT cube, at the time the only machine capable of running Berners-Lee’s graphical World Wide Web browser.
- Instead of employing a linkbase that could update documents automatically when links were moved or deleted, the Web embedded links in documents themselves. “That was all considered counter to what we were doing at the time,” Cathy adds. “It was kind of like, well: we know better than to do that.”
- The World Wide Web may not have been powerful enough for academics, but a lightweight, user-friendly tool is often more likely to take off than a vastly more powerful one. And while linkbases and constructive hypertext were easily maintained in relatively contained research and classroom environments, or on small networks of computers all running the same operating system, they would have quickly become unmanageable on a global scale.
- In a 1997 lecture at Southampton, she minced no words. “The Web has shown us that global hypertext is possible, but it has also shown us that it is easier to put rubbish on the Net than anything of real and lasting value,” she said.
- As the story of hypertext shows, technology alone isn’t enough to change the world—it has to be implemented in an accessible way and adopted by a community of users who feel enough ownership over it to invent new applications far beyond the imagination of its architects.
- “Jaime really knew how to present herself,” remembers Marisa Bowe. “She would go to Mac conventions and diss them in that little zine voice. Zine kids are always saying, ‘We hate these bands and we hate those bands.’ She was doing the same thing, but about computers, and on a floppy disk with her own music, and I thought it was just brilliant.”
- “What was fundamentally most fascinating and different about the Internet,” Marisa tells me, “was that people are the medium that you’re working with.”
- When clients came in, they’d see twenty or so weirdos in the conference room: friends who rented desks and threw down programming and design help whenever Jaime needed an extra hand. The office was a fake-out, but it worked.
- After all, the Internet’s only job is to shuttle packets of information from one place to the next without privileging one over the other. Our only job is to make the best packets we can. To make them worthy of the technology.
- “You wanted community because that kept people engaged.” It’s worth noting that modern-day social media giants have come around to this perspective, albeit on a much larger scale; when Facebook introduced private groups in 2010, it was to capitalize on the deeper ties forged in interest-specific online communities, which have become engines of user engagement for the platform.
- The same thing was happening on women’s sites across the Web. From iVillage to Oxygen, feminism ceded to fluff and political commentary gave way to fad diets, horoscopes, and compatibility calculators.
- In the nineteenth century, computers were actually women, and in the 1950s, they were a woman’s game, until programming was professionalized and masculinized. The ARPANET, built on a technical backbone radiating from military and academic centers, was male dominant because the people designing that infrastructure came from environments that favored men. Early hypertext was designed by women on the peripheries of computer science, but it took a man to popularize their ideas—until finally, in the early years of the Web, access to personal computers let women back in, and a generation of female culture workers and entrepreneurs made their impact yet again.
- Instead of trying again and again to beat a level, a baddie, or the clock, the girls Brenda interviewed preferred to wander around, exploring a virtual world and learning the relationships between its characters and places.
- In Life on the Screen, published only a few years after Brenda began her research, the sociologist Sherry Turkle argued that while men generally see computers as a challenge—something to master and dominate—women see computers as tools, objects to be collaborated with. This “soft mastery,” she explained, requires a closeness, a connection to computers that’s more like the relationship a musician has with her instrument: intimate, dialogue-driven, relational.
- And unlike the greater Web, the Purple Moon site was safe, because girls registered their accounts through their parents, and the site had a built-in panic button. “Simple,” Brenda explains, with characteristic brio. “Some shit’s going on, you push the panic button, you get a screen capture, everything goes to us, we see who was an asshole, we call their parents, we give them a warning.”
- Cyberfeminism conjures, in many ways, the countercultural, techno-utopian feeling of early Internet culture, and inherits the spirit of those West Coast cyberhippies who believed that computer-mediated communication would create a free civilization of the mind.
- The incorporeal newness that so intoxicated the earliest women online has morphed; it has become what the games critic Katherine Cross aptly calls a “Möbius strip of reality and unreality,” in which Internet culture “becomes real when it is convenient and unreal when it is not; real enough to hurt people in, unreal enough to justify doing so.”
- It’s not that the cyberfeminists, or any of their predecessors, have failed. It’s that as digital and real life edge into near-complete overlap, the digital world inherits the problems of the real.
- collective experience of what is right, real, and true. When we create technologies, we don’t just mirror the world. We actually make it. And we can remake it, so long as we understand the awesome nature of the responsibility.
- There’s no right kind of engineer, no special plane of thought that must be reached to make a worthwhile contribution. There’s no right education, no right career path. Sometimes there isn’t even a plan. The Internet is made of people, as it was made for people, and it does what we tell it to do. We can remake the world. The first step is to see it clearly, seeing who was really there at the most pivotal points in our technological history, without taking for granted the prevailing myths of garages and riches, of alpha nerds and brogrammers. The second step is to learn all the strategies of triumph and survival we can from our forebears, and I hope this book has unearthed a few: Ada Lovelace’s refusal of propriety, Grace Hopper’s forward-thinking tenacity, and the support the women of Resource One gave one another. Jake Feinler’s clarity of vision in the chaos of a changing network. A draft of Jaime Levy’s punk rock spirit for courage, and a healthy helping of VNS Matrix’s bodily self-assurance that the Internet is our place, wild and weird and mind-bending, as it has always been. The final step is the hardest: we get to work.