Where Were You When Apple II?

The iPhone, some of its earliest reviewers said, was so unlike anything that had previously been called a telephone that it instantaneously un-defined everything that had heretofore gone by that name, making itself the singular specimen. No other device, I’ve read a hundred times now, had a similar effect on technology and design.

Rewind the clock just 34 years. Before the iPhone, there was an Apple that deserved an even greater superlative.

The seed is sown

In the summer of 1976, the first “homebrew computer shows” consisted of weekend hobbyists who had constructed kit computers out of the first publicly available microprocessors and memory chips. Up until that time, “minicomputers” had been built from hardwired logic circuits and soldered transistors. Now there were microcomputers, whose logic was represented digitally in a kind of chip we hadn’t seen before: ROM (read-only memory). The fellows making these machines were the ones who were good with soldering irons.

The following summer, the volume producers had joined the party. For most, their time of glory would be short-lived: Ohio Scientific (OSI), Cromemco, Processor Technology. There was a hobbyist electronics company that realized just in time that if it didn’t jump in, history would run right over it: Radio Shack. And then there was a calculator company whose name we would hear for a while: Commodore.

There was this maverick designer fresh out of Motorola named Chuck Peddle, who had a deal going with Commodore. He’d show up at computer shows with a wooden barrel full of processors that worked much like Motorola’s more expensive ones, but had different pinouts to avoid lawsuits. While he partnered with Commodore on full-scale designs, he’d sell processors like candy bars for $25 a pop. (A friend of mine would buy his defective ones and make them into jewelry.) One hobbyist, a guy from HP, realized he could build a design around Peddle’s 6502 that looked from a distance like an OSI, but could run programs he’d conceived for HP’s design with the Motorola processor, the 6800.

Sure, Steve Wozniak could solder chips to a board, but frankly that wasn’t his great talent. He was perhaps the greatest machine code programmer of all time, blowing Bill Gates clean out of the water. What Woz wanted wasn’t to thrill hobbyists with his soldering skills. He had ingenious ideas that would enable high-end language interpreters conceived for much bigger systems to run on a $25 8-bit processor, making all kinds of scientific applications available to the everyman. Which is why he named his system for Sir Isaac Newton, not the Beatles. His buddy was a born salesman, a guy who seemed to have the knack for making folks want something they couldn’t quite understand.

Apple comes home

“I was in high school,” my friend and fellow technologist Carmi Levy tells me. “We had two Apple IIs in the biology lab. They’d always seem to stick them in the back of the lab because that was the only space for them. They cleared out all the science equipment from two stations at the back of the lab, and they stuck them there. They had power and they had room, so off they went. And there was no rhyme or reason, no plan. They just plugged them in and let us figure out for ourselves how they worked.”


See Carmi’s discussion with CTV News Channel anchors on the passing of Steve Jobs.

Today, Carmi is one of Canada’s most recognized faces and voices on technology issues; when CTV News Channel wants context on a major event in tech, they go to Carmi.

In the late 1970s, Carmi’s high school was about ten miles from his house. That meant every day after dismissal, he’d have about 90 minutes to kill. He spent that time getting a real education, manning the computers in the biology lab. Correction: not all the computers. There was a Commodore PET there, but it didn’t have color and, besides, it looked like the robot sidekick from some cheesy ’70s sci-fi show. Carmi and his first set of colleagues manned the Apple II.

“We’d all gather around the Apple II and teach ourselves how to program, and we’d all take turns, it was very democratic. No one took more time than they were absolutely allowed, and it was magical… I loved that you booted right into BASIC. I realize that the world has moved on, but I’m sad that the development tools today are not as visible as they were back then, when they were an integral part of the experience, and in order to get the most out of the machine, you had to program it. Today, that’s not the case, and I kind of miss that.”

Apple goes to work

The first great consumer software industry evolved around the Apple II. By the summer of 1979, its developers were no longer hobbyists. While the other major brands were looking for ways to control this emerging channel, Apple let the community grow and flourish – a move that would prove uncharacteristic of the company in later years, under different models.

Radio Shack’s TRS-80 was the faster machine. It didn’t need an expansion card to display more than 40 columns of text, and it had a complete set of Tandy-branded software available in-store. But the Apple II had color, and the fact that it was expandable with plug-in cards gave a boost to software companies that would otherwise get lost in a sea of floppy disks sold in zip-lock bags. By bundling the CP/M operating system with an expansion card for the Apple II that included not only memory but a stand-alone CPU (a Zilog Z-80), Microsoft sold its first consumer product under its own brand.

The race was on to legitimize the home computer as a business machine. David Strom, ReadWriteWeb’s channels editor, was already in the software business.

“I was the development manager of a small vertical market software company,” David tells us. “It was selling a $6,000 package to electric utilities to help with their generation capacity planning. That job was previously done with six-figure mainframe simulations. We had written it for the Apple II, and my job was to port it over to the IBM PC.”

The most commonly sold Apple IIs at that time had 48K of memory. The PC had 64K and 80-column color graphics, plus the IBM brand name. Still, it had a viability gap to close. In an era before software was taken seriously as a legitimate product (businesses didn’t understand why you couldn’t just copy it), PC software didn’t appear to stack up against Apple II software bundled with expansion cards you could hold in your hand.

“Our program required additional memory,” says David. “At the time, the IBM PC came with 64K and we needed like 200K. We were worried that no one was going to buy our product because of this, and at one point we considered including a free RAM expansion card with the software to induce sales. Of course, all this software was on a single 5 1/4-inch floppy disk.”

No, Dad, I won’t go all sad-eyes on you

My mother was a professional artist, and one of her best friends in the 1970s was an art collector. One night at his house, she had me go play in his son’s room. He was older than me, was interested in math and science, and didn’t have a little brother. She probably hoped some of his chutzpah would rub off on me.

I found myself looking through the magazines on his floor. I was probably hoping for something with a centerfold. What I found was something that looked like a brochure sent to teachers about innovations in higher education, something written by people paid to fill space, back before the invention of blogs. In desperation, I opened it in search of coolness. I found it in a heartbeat.

The magazine was Creative Computing, published by a fellow I’d have the good fortune to briefly work for a decade later. It told me the entire story: the guys at the first conferences, building working computers that ran programs, some of which played games. The idea of writing formulas that played chess or Go or Stratego or even Monopoly fascinated me, and here were guys with long hair and blue jeans making all that happen in their garages.

Two months later, I was back at the art collector’s house, playing in his son Gil’s room, where Dad hoped some of that chutzpah would rub off on me. A delivery had come by UPS earlier that day. It did not look like a cheap computer. It had a hard plastic case all around with a typewriter keyboard. It connected to the TV, just like an Atari machine. It had a joystick and paddles like an Atari (in fact, the paddles were exactly the same). But unlike an Atari, you could open the hatch and look at the circuit board. In fact, Dad would call it an Atari for the next few years, and I’d correct him each time.

Gil didn’t want me touching it. He knew this was the key to his future; his plan was to learn to write programs (which he did, in about three days’ time) and to sell them to businesses (which he also did, maybe three days later). Who knows where my hands had been, or how much electricity I’d conduct. If I had too much to do with it, it might stop looking professional and start looking like a toy. So he kept me quiet by letting me read the manual.

The Applesoft BASIC manual was unlike anything I’d read. It made perfect sense. Although it was spiral-bound, it was printed on slick paper with a magazine layout and superb color printing.

I made Gil pay for my being quarantined by shouting suggestions to him for things to try typing at the READY prompt. He was trying to write a program that plotted a circle. Back then, you couldn’t have text and graphics share the same portion of the screen because of the way memory was mapped, so there were only four lines of 40-column text on the bottom of the TV to do the job. He knew the formula, but after typing 3-dot-1-4-1-5-9-2-7, he’d type an “X” for “times.”

I had just read the section on arithmetic operators, and I knew multiplication was done with an asterisk. I told him, but he wouldn’t believe me, even after several dozen “SYNTAX ERROR” messages. I sat for a while with the smug satisfaction of knowing something Gil didn’t, and I loved the feeling.
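
For anyone curious what Gil was fighting with, here’s a minimal sketch, in Applesoft BASIC, of that kind of circle-plotting program. The center, radius, and color here are my own illustrative choices, not Gil’s; what matters is the low-resolution GR mode, which leaves those four lines of 40-column text at the bottom of the screen, and the asterisk doing the multiplying:

    10 GR : COLOR= 9 : REM  LOW-RES GRAPHICS; 4 TEXT LINES STAY AT BOTTOM
    20 PI = 3.1415927
    30 FOR A = 0 TO 2 * PI STEP 0.1 : REM  THE ASTERISK MEANS "TIMES"
    40 X = 20 + 14 * COS(A) : Y = 20 + 14 * SIN(A)
    50 PLOT X,Y : REM  PLOT TRUNCATES TO WHOLE-NUMBER GRID COORDINATES
    60 NEXT A
    70 PRINT "A CIRCLE" : REM  PRINTS IN THE TEXT WINDOW BELOW THE GRAPHICS

Type an X where those asterisks are, and you earn exactly the SYNTAX ERROR Gil kept getting.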

Dad knew I wanted an Apple II. I could tell because he wouldn’t look me straight in the eye, afraid I would give him some kind of puppy-dog look that meant, oh please, could I have sixteen hundred bucks? Dad didn’t have $1,600; he didn’t have $400, and we both knew it.

On the ride home, he started giving me one of his “life lessons” speeches, which were recognizable by the way he’d change his voice from his natural, melodic, almost sing-song tone to that of the narrator of some campy ’50s natural science film.

“Son, life is the sort of thing that deals us different decks of cards at different times,” Dad began, doing the one thing we both acknowledged later he was never particularly good at. I stopped him short and made him a promise: If Gil could earn his own way to a computer, so could I. We knew I couldn’t wait another three years to attend a college where I’d take another two years to perhaps, maybe, get enough experience to grab an internship learning to do something important in another four years. By that time, the new world would have already passed me by. What Creative Computing showed me looked like a fast-track program, and what my little exercise with Gil showed me was that I didn’t need to even touch a computer to figure it out.

My life is in three parts. Before that day in the winter of 1978 was Part 1. The day I met my wife in 1992 at a COMDEX convention was Part 3. Part 2 was made possible in large part by a company called Apple Computer. In high school biology labs and garages and art studios and school kids’ bedrooms all over the world, Apple was present for the transitions in our lives.

Some blog whose name I forget recently asked its readers, “When did the iPhone change your life?” Indeed, the iPhone is an industry landmark, a milestone in American business. But no phone could ever do for people’s lives what the Apple II did for so many thousands of us – high-schoolers and younger who were smarter and faster than our times, and wanted more from life than we were being offered. For those of us who had seen the beginning of this era, the iPhone did not change our lives at all. But it reminded us that Apple did.
