Becoming Steve Jobs Page 8
The personal computer industry was in its infancy, and everyone was flying blind, including Steve. One important thing he didn’t yet understand was that most breakthrough products result from a long cycle of hit-and-miss prototypes, the steady accumulation of features, and a timely synthesis of existing technologies. He and Woz, by contrast, had put their heads down, worked hard, and on their very first try created something brilliant that the industry had never seen before. That was Steve’s idea of product development. But he was about to discover that that wasn’t the way it worked inside a corporation.
The company had concrete and meaningful goals for its new machine, to be called the Apple III. It would be aimed at the office as well as the home, and would support a color monitor that displayed 80 characters of text on each line, twice as many as the original Apple II. Since a typical typewritten document can accommodate 80 characters per line, the Apple III would compete head-on with specialized word-processing computers from Wang Laboratories that over the past two years had cascaded into offices across the United States and Europe in even greater numbers than the Apple II had landed in people’s homes. Done right, the new model would have been Apple’s entrée into the corporate market for personal computers.
The success of the Apple I and the Apple II had given Steve a little too much confidence in his own technical judgment. He made a series of bad decisions that would be hard to undo later, the most important being his edict that the Apple III, whose footprint had to be small enough to leave lots of open room on an office desk, would be absolutely silent, which meant no internal cooling fan. This slowed the development process to a crawl, because engineers had to figure out how to create convection currents to draw heat away from the motherboard, which held all the semiconductor chips, as well as from the power supply. Without a fan, those components could make the innards of a small computer hot as a pizza oven. The solution the engineers finally came up with was to make the cabinet itself act as a heat sink to help draw out and dissipate the heat; however, that meant making it out of cast aluminum, a good heat conductor but a material that added considerably to the cost and complexity of manufacture.
It wasn’t just Steve’s demands that slowed the Apple III. Since Apple would be wooing customers who might have purchased an Apple II, the company had to make sure that software created for the II would also run on the III. This “backwards-compatibility” was an annoying requirement that was far more complicated than Steve imagined, and the time his engineers spent learning to accomplish it slowed the project almost as much as his fussy hardware demands. Steve pushed the Apple III engineers relentlessly to solve these problems quickly. It didn’t matter to him that these were gnarly problems to solve. Accustomed to Woz’s magical ability to defy old boundaries and technical obstacles, he expected these new hardware and software engineers to do the same. They couldn’t.
STEVE’S IMPATIENCE WITH the nuts and bolts of corporate life was understandable. Steve was a visionary. It’s a word that is loosely tossed around these days, especially in Silicon Valley, but it legitimately applied to Steve even from very early in his life. He had the ability to see around corners, to envision how the seeds of existing ideas could be combined to create something unimaginable to others. The challenge he faced was to become an effective visionary—that’s what turns a dreamer into someone who changes the world.
A few weeks before he made that drive up to the Garden of Allah in late 1979, Steve had decided, at the urging of Bill Atkinson, Jef Raskin, and several other Apple technical employees, to check out some work being done by a well-known computer scientist named Alan Kay and some other engineers at Xerox Corporation’s Palo Alto Research Center, just a ten-minute drive up the peninsula from Cupertino. PARC, as it was known, would become famous for developing the concepts behind any number of important technologies, including Ethernet local area networking, high-resolution video monitors, laser printing, and object-oriented programming. That summer, Xerox had joined a number of venture capital firms in a $7 million secondary investment round in Apple (as part of the deal, Steve sold $1 million worth of his own stock to the investors), and in return had agreed to give Apple a peek at its most advanced technologies, or, as people in Silicon Valley like to say, to “open the kimono.” These visits were nothing short of an epiphany for Steve, because the technology at PARC was the visual expression of everything he believed computers could and should be.
It was at PARC that Steve and his group from Apple first saw the nascent technologies that would later become the distinguishing features of the Lisa and the Macintosh, and eventually all personal computers. They were shown a computer that featured a screen that was white like a sheet of paper, not black. Moreover, that screen had the exact dimensions of standard typing paper—8½ inches wide by 11 inches tall. On it were projected black characters so sharp and shapely that they looked as if they had been printed on that sheet of paper. The characters had been “bitmapped,” meaning that each pixel of the screen followed individual instructions from the computer, a radical new technology that gave developers full graphic control of the monitor screen. (Previously they had only been able to place rudimentary white, green, or orange characters on a black screen. Graphical images, other than meticulously arranged agglomerations of standard characters, were out of the question.) These bitmapped characters were really just the beginning of the changes: that screen also could display and organize the contents of the machine’s digital data storage by means of graphical “icons”—little symbolic figures a tad smaller than a postage stamp—representing documents that could be placed “inside” chimerical folder icons by means of a strange pointing device the geeks at PARC called a “mouse.” This mouse could also be used to directly move a cursor around a document on the screen when writing or editing. To delete a file or a folder, you would throw it in the trash by using the mouse to “drag” the icon representing a document over to a special icon on the screen that looked like a garbage can, and “drop” it in. Compared to the black screen with eerie green characters that had preceded it, this “graphical user interface”—or GUI (pronounced “gooey”), as it came to be known—represented at least as radical a break as when silent movies shifted to talkies.
The PARC researchers understood full well how significant a development this was, and were dismayed that Xerox had, in effect, paid for the privilege of giving Steve and the other Apple visitors access to technology this radical and new. They believed, correctly, that Xerox senior management back east was not that interested in building a full-blown computer; rather, they wanted to create better photocopiers, and perhaps a dedicated word processor to compete with Wang’s. Xerox did not come out with a computer using the PARC technology until 1981. Called the STAR, it was an intriguing device that was sold not to individual consumers but to businesses, as part of a networked system of at least three desktop units that sold for about $16,000 each. While it was a capable microcomputer for its time, customers balked at having to shell out $50,000 or so just to get the minimum setup for an office. It had little impact in the marketplace.
Steve realized that the Xerox GUI could be the foundation of something very ambitious and very personal. Visual iconography on a screen could make computing almost intuitive for just about everyone. Existing computer interfaces put a wall of arcane commands and typographical symbols that looked like expletives between the user and the results spewed back by the computer. If you replaced those commands with visual icons that could be easily manipulated via a mouse, harnessing the data-processing power of a computer might feel more like going to the library and pulling a book off the shelf, or like engaging in a discussion with a really smart friend or teacher. This interaction, this feeling of comfort with the back-and-forth with a computer, could lead to the realization of Steve’s overarching goal, the creation of a truly personal computer for ordinary people. Steve even had a metaphor for what that computer could be—a bicycle for the mind. After visiting PARC he was a changed man; these were technologies he wanted to bring to everyone in the world.
NOW STEVE FACED the challenge of delivering on this promise within the gnawing confines of Apple. It would be a staggeringly ambitious project—one that no one at Apple but Steve could have imagined, and one that no one but he could have made so maddeningly complicated. The long road had many detours and would be pockmarked with collateral damage, but it would eventually lead to the introduction of the Macintosh computer in 1984.
After that visit to Xerox PARC, Steve completed what had been a slow abandonment of the Apple III development. The more he realized that the machine was simply a modest renovation of the Apple II, the more his attention wandered. Now he turned away completely, with the intention of applying what he’d learned at PARC to another computer already under development at Apple. This machine was specifically designed for Fortune 500 companies that required heavy-duty networked computing to accomplish tasks that were significantly more data-intensive than anything that could be handled by the Apple II or even the Apple III.
The Lisa, as this machine was dubbed, had been gestating since mid-1978, without making much progress. So when Steve assumed full control of the project in early 1980, the team felt a brief spurt of optimism. Steve told them that he fully intended to have the Lisa be the first computer to feature a graphical user interface and a mouse. They had, he told them, a chance to make history. He asked Bill Atkinson, the project’s lead software architect, how long it would take to translate what they’d seen at PARC into software that could be run on the Lisa. Atkinson predicted that he could do it in a mere half a year—missing the mark by some two and a half years. Clearly, Steve wasn’t the only person at Apple who could mistake a clear vision for a short path.
Steve’s brief management of the Lisa project revealed all his weaknesses. Once again, he couldn’t resolve the disparity between the corporate demands for this computer and his own ambitions. The Lisa was supposed to be for businesses, but Steve focused almost exclusively on what would make the machine accessible and friendly for an individual. Once again, he had the right idea for the long run—years later, easy-to-use computers would make personal computing ubiquitous across businesses both small and large—without the perspective needed to succeed in the short term. He paid lip service to the special needs of corporations and institutions, but what really fascinated him were the rounded edges of the icons on the Lisa’s “desktop” interface.
Atkinson and his programmers did create significant improvements on what they’d seen at PARC; the Lisa project is where the modern user-interface concepts of overlapping windows, seamless scrolling, and the mouse came into their own. But Steve was a failure at managing this group, in large part because he didn’t offer them a unified vision to rally around, given that his own interests were not those of the target audience—business users. When the project stalled, as was inevitable, Steve lashed out, excoriating the team and threatening to bring in Woz, who could surely get things done better and faster. Scotty tried to help Steve out by recruiting Larry Tesler, one of Xerox PARC’s own top researchers, and a group of his associates to try to bring a little more discipline and focus to the project. But two months after Tesler’s hire, Scotty looked at where the Lisa was headed and saw that the computer he had been counting on to represent Apple in the all-important business market was going to be too late, too expensive, and most likely a muddled mess of a machine.
In the fall of 1980, Scotty kicked Steve off the team after only nine months of being in charge, and handed its management over to a former Hewlett-Packard senior engineering exec named John Couch. Twice now, in rapid succession, Steve had failed when trying to lead a team to create a computer for the business market. While more and more people in the computer industry were keenly in tune with the needs of enterprise customers, Steve wasn’t one of them.
STEVE’S INSTINCTIVE PROBLEMS with authority—his inability to successfully manage large teams, his failure to adapt his strengths to the needs of his admittedly weak bosses—were wreaking havoc. The launch of the Apple III was a disaster. The computer didn’t ship until May 1980—a year later than planned—and sold for a base price of $4,340, more than double its target price. Within weeks, many buyers were returning their machines and demanding refunds, after an alarming number of Apple IIIs started suffering catastrophic failures due to overheating. In some cases the motherboard got so hot that the solder softened and chips popped out of their sockets. All told, 14,000 had to be replaced. (The aluminum chassis worked fine for dissipating heat; the problem turned out to be that the parts had been nestled too closely to one another on the circuit board.) Moreover, backwards-compatibility had proved so thorny that very few programs were available out of the gate, so the computers that did happen to work weren’t particularly useful. The Apple III was an unmitigated commercial failure, selling only 120,000 units before it was discontinued in 1984. During that same time, the company sold nearly two million Apple IIs.
The pressure to get things right, which was amped up considerably now that the company’s stock was widely held, fell mainly on Scotty. But Steve undercut his boss again and again. He humiliated suppliers that Scotty had wooed; he complained endlessly and publicly about little things like the color of laboratory benches; and he repeatedly interfered with Scotty’s production schedules by insisting on the completion of inconsequential details. Steve’s own failures did nothing to chasten him. In fact, they only heightened his antipathy for Scotty, who was trying to balance the competing needs of thousands of employees. Steve didn’t accept the idea of compromise when it came to his own ideas, and so he turned every decision that didn’t go his way into a confrontation. The battles between the two became known within the company as the Scotty Wars.
Having such a difficult partner was just one of Scotty’s many frustrations, and finally it all became too much. Bit by bit, he disintegrated into an unreliable leader of a company that needed a firm hand. He started to develop some physical ailments, which seemed clearly related to stress. When he finally decided to repair some of the damage resulting from the bozo period by laying off some staff, he did so at a company-wide meeting in March 1981, at which he admitted to all present that he didn’t find running Apple to be all that much fun anymore.
Soon after that the board agreed that Scotty had to go. He departed after taking one last shot with a letter that attacked what he saw as a culture of hypocrites, yes-men, and “empire builders.” Of course, the divided culture was as much his fault as anyone else’s. But Steve knew that Scotty had borne the heaviest load as Apple morphed from a startup into a real operation. After his departure, Steve reportedly experienced a sudden bout of guilt; he was quoted as saying, “I was always afraid that I’d get a call to say that Scotty had committed suicide.”
ON A SEPTEMBER day in 1981, just a couple of months after Scotty’s departure, Bill Gates visited the Apple campus in Cupertino. The twenty-six-year-old CEO of Microsoft made the trip fairly often, since his company worked closely with Apple on programming languages for software developers. At the time, Steve was far richer and far better known. Gates, however, was the far more precocious and astute businessman.
After dropping out of Harvard, Gates had started Microsoft in 1975 in Albuquerque, New Mexico, with his prep school programming buddy, Paul Allen. Albuquerque was home to MITS, the maker of the Altair computer that had so excited the Homebrew hobbyists. Gates and Allen wrote a piece of software called an “interpreter” that made it possible for hobbyists to write their own programs for the Altair in the simple but popular BASIC programming language. MITS opted to bundle the program with each Altair it sold, and Micro-soft was in business.
The fortunate son of a prominent Seattle attorney and an accomplished, civic-minded mother, Gates came to business naturally. After he and Allen discovered that hobbyists were giving away pirated copies of their Altair BASIC interpreter, Gates wrote a kind of manifesto, asserting that developers of software for microcomputers should be paid for their programs. If that happened, Gates predicted, an entirely new kind of software industry would arise that would benefit software developers, microcomputer makers, and users alike. This would represent a huge change: at that point, software development was mostly in the hands of the makers of computer hardware, who buried the development costs in the final price of the devices they sold. The prospect of making money by building software, Gates believed, would spur innovation and help the new microcomputer manufacturers take better advantage of the breakneck pace of improvement in semiconductor technology promised by Moore’s law.
Gates’s manifesto was every bit as significant as Moore’s law in the explosion of personal computing. Software development requires very little capital investment, since it is basically intellectual capital, pure thoughtstuff, expressed in a set of detailed instructions written in a language that machines can understand. The main cost is in the labor required to design and test it. There is no need for expensive factories or for the invention of fabrication equipment and processes. It can be replicated endlessly for practically nothing. And the prospect of hundreds of thousands of potential customers, or even more, means that developers don’t have to charge enormous prices.
Gates was right. Accepting the fact that software was worth paying for led to the emergence of a dynamic new industry. One could argue that Gates’s greatest contribution to the world was not Microsoft, or the MS-DOS or Windows operating systems, or the Office productivity applications that hundreds of millions of people use. It was his role as the first champion of the concept that software itself had value. The mind that could envision all that was a mind suited for the organizational matrixes of the corporate world. In those early days, Microsoft never lacked for enlightened leadership, unlike Apple.