Title: Accidental Empires: How the Boys of Silicon Valley Make Their Millions
Author: Robert X. Cringely
Scope: 3 stars
Readability: 4 stars
My personal rating: 4.5 stars
See more on my book rating system.
If you enjoy this summary, please support the author by buying the book.
Topic of Book
Cringely describes the computer industry during the 1970s and 1980s.
If you would like to learn more about the history of technology, read my book From Poverty to Progress: How Humans Invented Progress, and How We Can Keep It Going.
Key Take-aways
- What we now think of as Silicon Valley started in 1957 when Bob Noyce and seven other engineers left Shockley Semiconductor to found Fairchild Semiconductor. Employees from that one firm later formed many of the most important companies in the region.
- Key milestones in the early computer industry were:
  - 1969: Formation of Xerox Palo Alto Research Center (PARC). PARC did most of the basic research that made personal computers possible.
  - 1973: PARC builds the Alto, the prototype for the first personal computer with a GUI and mouse.
  - 1975: First programmable personal computer: the Altair 8800. Bill Gates and Paul Allen later write a version of BASIC to run on it.
  - 1977: First commercially successful personal computer: the Apple II. VisiCalc, the first spreadsheet program, was key to its success.
  - 1981: IBM releases the PC, which quickly dominates the market. Microsoft licenses the operating system (MS-DOS) to IBM.
  - 1982: Compaq releases the first commercially successful PC clone. The clones eventually overtake IBM.
  - 1983: First commercial personal computer with a GUI and mouse: the Apple Lisa. It was priced too high to sell well.
  - 1984: First commercially successful personal computer with a GUI and mouse: the Apple Macintosh.
  - 1985: Adobe releases PostScript, which makes it possible for Macs to use laser printers.
Important Quotes from Book
The PC business actually grew up from the semiconductor industry. Instead of being a little mainframe, the PC is, in fact, more like an incredibly big chip. Remember, they don’t call it Computer Valley. They call it Silicon Valley, and it’s a place that was invented one afternoon in 1957 when Bob Noyce and seven other engineers quit en masse from Shockley Semiconductor.
Noyce and the others started Fairchild Semiconductor, the archetype for every Silicon Valley start-up that has followed. They got the money to start Fairchild from a young investment banker named Arthur Rock, who found venture capital for the firm. This is the pattern that has been followed ever since as groups of technical types split from their old companies, pick up venture capital to support their new idea, and move on to the next start-up. More than fifty new semiconductor companies eventually split off in this way from Fairchild alone.
At the heart of every start-up is an argument. A splinter group inside a successful company wants to abandon the current product line and bet the company on some radical new technology. The boss, usually the guy who invented the current technology, thinks this idea is crazy and says so, wishing the splinter group well on their new adventure. If he’s smart, the old boss even helps his employees to leave by making a minority investment in their new company, just in case they are among the 5 percent of start-ups that are successful.
We can date the birth of the personal computer somewhere between the invention of the microprocessor in 1971 and the introduction of the Altair hobbyist computer in 1975.
It takes new ideas a long time to catch on—time that is mainly devoted to evolving the idea into something useful. This fact alone dumps most of the responsibility for early technical innovation in the laps of amateurs, who can afford to take the time. Only those who aren’t trying to make money can afford to advance a technology that doesn’t pay.
These folks were pursuing adventure, not business. They were the computer equivalents of the barnstorming pilots who flew around America during the 1920s.
Intel begat the microprocessor and the dynamic random access memory chip, which made possible MITS, the first of many personal computer companies with a stupid name. And MITS, in turn, made possible Microsoft.
[Bill] Gates was a businessman from the start; otherwise, why would he have been worried about being passed by? Gates knew that the first language—the one resold by MITS, maker of the Altair—would become the standard for the whole industry. Those who seek to establish such de facto standards in any industry do so for business reasons. While Allen and Gates deliberately went about creating an industry and then controlling it, they were important exceptions to the general trend of PC entrepreneurism. Most of their eventual competitors were people who managed to be in just the right place at the right time and more or less fell into business. These people were mainly enthusiasts who at first developed computer languages and operating systems for their own use.
The Altair 8800 may have been the first microcomputer, but it was not a commercial success.
The first microcomputer that was a major commercial success was the Apple II. It succeeded because it was the first microcomputer that looked like a consumer electronic product. You could buy the Apple from a dealer who would fix it if it broke and would give you at least a little help in learning to operate the beast. The Apple II had a floppy disk drive for data storage, did not require a separate Teletype or video terminal, and offered color graphics in addition to text. Most important, you could buy software written by others that would run on the Apple and with which a novice could do real work.
The Apple II still defines what a low-end computer is like.
The Apple II was guided by three spirits. Steve Wozniak invented the earlier Apple I to show it off to his friends in the Homebrew Computer Club. Steve Jobs was Wozniak’s younger sidekick who came up with the idea of building computers for sale and generally nagged Woz and others until the Apple II was working to his satisfaction. Mike Markkula was the semiretired Intel veteran (and one of Noyce’s boys) who brought the money and status required for the other two to be taken at all seriously.
More so than probably any other microcomputer, the Apple II was the invention of a single person; even Apple’s original BASIC interpreter, which was always available in read only memory, had been written by Woz.
The Apple II found its eventual home in business, answering the prayers of all those middle managers who had not been able to gain access to the company’s mainframe or who were tired of waiting the six weeks it took for the computer department to prepare a report, dragging the answers to simple business questions from corporate data. Instead, they quickly learned to use a spreadsheet program called VisiCalc, which was available at first only on the Apple II.
VisiCalc was a compelling application—an application so important that it alone justified the computer purchase. Such an application was the last element required to turn the microcomputer from a hobbyist’s toy into a business machine.
The Apple II was a VisiCalc machine.
The true market for the Apple II turned out to be big business, and it was through the efforts of enthusiast employees, not Apple marketers, that the Apple II invaded industry.
Nearly everything in computing, both inside and outside the box, is derived from earlier work. In the days of mainframes and minicomputers and early personal computers like the Apple II and the Tandy TRS-80, user interfaces were based on the mainframe model of typing out commands to the computer one 80-character line at a time—the same line length used by punch cards.
But with the introduction of 16-bit microprocessors in 1981 and 1982, the mainframe role model was scrapped altogether. This second era of microcomputing required a new role model and new ideas to copy. And this time around, the ideas were much more powerful—so powerful that they were worth protecting, which has led us to this look-and-feel fiasco. Most of these new ideas came from the Xerox Palo Alto Research Center (PARC). They still do.
Nearly all companies do research and development, but only a few do basic research. The companies that can afford to do basic research (and can’t afford not to) are ones that dominate their markets. Most basic research in industry is done by companies that have at least a 50 percent market share. They have both the greatest resources to spare for this type of activity and the most to lose if, by choosing not to do basic research, they eventually lose their technical advantage over competitors.
Most of the basic research in computer science has been done at universities under government contract, at AT&T Bell Labs in New Jersey and in Illinois, at IBM labs in the United States, Europe, and Japan, and at the Xerox PARC in California.
During the 1970s, the Computer Science Laboratory (CSL) at Xerox PARC was the best place in the world for doing computer research. Researchers at PARC invented the first high-speed computer networks and the first laser printers, and they devised the first computers that could be called easy to use, with intuitive graphical displays. The Xerox Alto, which had built-in networking, a black-on-white bit-mapped screen, a mouse, and hard disk data storage and sat under the desk looking like R2D2, was the most sophisticated computer workstation of its time, because it was the only workstation of its time.
Beyond the Alto, the laser printer, and Ethernet, what Xerox PARC contributed to the personal computer industry was a way of working—Bob Taylor’s way of working.
To accomplish so much so fast, Taylor created a flat organizational structure; everyone who worked at CSL, from scientists to secretaries, reported directly to Bob Taylor. There were no middle managers. Taylor knew his limits, though, and those limits said that he had the personal capacity to manage forty to fifty researchers and twenty to thirty support staff. Changing the world with that few people required that they all be the best at what they did, so Taylor became an elitist, hiring only the best people he could find and subjecting potential new hires to rigorous examination by their peers. Taylor was always cross-fertilizing, shifting people from group to group to get the best mix and make the most progress.
In time the dream at CSL and Xerox PARC began to fade, not because Taylor’s geniuses had not done good work but because Xerox chose not to do much with the work they had done.
As it became clear that Xerox was going to do little or nothing with their technology, some of the bolder CSL veterans began to hit the road as entrepreneurs in their own right, founding several of the most important personal computer hardware and software companies of the 1980s. They took with them Xerox technology—its look and feel too. And they took Bob Taylor’s model for running a successful high-tech enterprise—a model that turned out not to be so perfect after all.
Bill Gates is the Henry Ford of the personal computer industry… Each man consciously worked to create an industry out of something that sure looked like a hobby to everyone else.
Gates called it the “software factory,” but what he and Simonyi implemented at Microsoft was a hierarchy of metaprogrammers. Unlike Simonyi’s original vision, Gates’s implementation used several levels of metaprogrammers, which allowed a much larger organization.
The software factory allows for only a single genius—Bill Gates. But since Bill Gates doesn’t actually write the code in Microsoft’s software, that means that few flashes of genius make their way into the products. They are derivative—successful, but derivative.
In the software business, as in most manufacturing industries, there are inventive organizations and maintenance organizations… As inventive organizations grow and mature, they often convert themselves into maintenance organizations.
The Microsoft connection made the IBM PC possible, and the IBM connection ensured Microsoft’s long-term success as a software company.
The PC was a big success and rapidly became the top-selling microcomputer.
In both hardware and software, successful reinvention takes place along the edges of established markets. It’s usually not enough just to make another computer or program like all the others; the new product has to be superior in at least one respect. Reinvented products have to be cheaper, or more powerful, or smaller, or have more features than the more established products with which they are intended to compete. These are all examples of edges. Offer a product that is in no way cheaper, faster, or more versatile—that skirts no edges—and buyers will see no reason to switch from the current best-seller.
Lotus Development Corp. in Cambridge, Massachusetts, bet nearly $4 million on IBM and on the idea that Lotus 1-2-3 would become the compelling application that would sell the new PC. A spreadsheet program, 1-2-3 became the single most successful computer application of all.
When the IBM PC, for all its faults, instantly became the number one selling personal computer, it became the de facto industry standard, because de facto standards are set by market share and nothing else. When Lotus 1-2-3 appeared, running on the IBM, and only on the IBM, the PC’s role as the technical standard setter was guaranteed not just for this particular generation of hardware but for several generations of hardware.
IBM compatibility quickly became the key, and the level to which a computer was IBM compatible determined its success.
Reverse engineering the IBM PC’s ROM-BIOS took the efforts of fifteen senior programmers over several months and cost $1 million for the company that finally did it: Compaq Computer.
Crunching the numbers harder than IBM had, the Compaq founders discovered that a smaller company with less overhead than IBM’s could, in fact, bring out a lower-priced product and still make an acceptable profit.
For companies like IBM, the eventual problem with a hardware standard like the IBM PC is that it becomes a commodity. Companies you’ve never heard of in exotic places like Taiwan and Bayonne suddenly see that there is a big demand for specific PC power supplies, or cases, or floppy disk drives, or motherboards, and whump! the skies open and out fall millions of Acme power supplies, and Acme deluxe computer cases, and Acme floppy disk drives, and Acme Jr. motherboards, all built exactly like the ones used by IBM, just as good, and at one-third the price. It always happens.
Commoditization is great for customers because it drives prices down and forces standard setters to innovate. In the absence of such competition, IBM would have done nothing.
Alone among the microcomputer makers of the 1970s, the people of Apple saw themselves as not just making boxes or making money; they thought of themselves as changing the world.
Lisa, the computer, was born after Jobs toured Xerox PARC in December 1979, seeing for the first time what Bob Taylor’s crew at the Computer Science Lab had been able to do with bit mapped video displays, graphical user interfaces, and mice.
Steve Jobs saw the future that day at PARC and decided that if Xerox wouldn’t make that future happen, then he would. Within days, Jobs presented to Markkula his vision of Lisa, which included a 16-bit microprocessor, a bit-mapped display, a mouse for controlling the on-screen cursor, and a keyboard that was separate from the main computer box. In other words, it was a Xerox Alto, minus the Alto’s built-in networking.
Still, when Lisa hit the market in 1983, it failed. The problem was its $10,000 price, which meant that Lisa wasn’t really a personal computer at all but the first real workstation.
John Warnock, founder of Adobe Systems. Warnock is the father that Steve Jobs always wished for. He’s also the man who made possible the Apple LaserWriter printer and desktop publishing. He’s the man who saved the Macintosh.
Apple needed an edge against all these would-be competitors, and that edge was the laser printer.
America’s advantage in the PC business doesn’t come from our education system, from our fluoridated water, or, Lord knows, from our tax structure. And it doesn’t come from some innate ability we have to run big companies with thousands of employees and billions in sales. The main thing America has had going for it is the high-tech start-up, and, of course, our incredible willingness to fail.
Success in a large organization, whether it’s a university or IBM, is generally based on appearance, not reality. It’s understanding the system and then working within it that really counts, not bowling scores or body bags.
In the world of high-tech start-ups, there is no system, there are no hard and fast rules, and all that counts is the end product.
What makes start-ups possible at all is the fact that there are lots of people who like to work in that kind of environment. And Americans seem more willing than other nationalities to accept the high probability of working for a company that fails. Maybe that’s because to American engineers and programmers, the professional risk of being with a start-up is very low. The high demand for computer professionals means that if a start-up fails, its workers can always find other jobs.
There is an enormous difference between starting a company and running one. Thinking up great ideas, which requires mainly intelligence and knowledge, is much easier than building an organization, which also requires measures of tenacity, discipline, and understanding. Part of the reason that nineteen out of twenty high-tech start-ups end in failure must be the difficulty of making this critical transition.
Surfing is the perfect metaphor for high-technology business. People who are astute and technically aware can see waves of technology leaving basic research labs at least a decade before they become commercially viable. There are always lots of technology waves to choose from, though it is not always clear right away which waves are going to be the big ones. Great ideas usually appear years—sometimes decades—before they can become commercial products. It takes that long both to bring the cost of a high-tech product down to where it’s affordable by the masses, and it can take even longer before those masses finally perceive a personal or business need for the product.
Having chosen his or her wave, the high-tech surfer has to ride long enough to see if the wave is really a big one. This generally takes about three years.
It is better to get off a wave too early (provided that you have another wave already in sight) than to ride it too long.
Surfing is best done on the front of the wave. That’s where the competition is least, the profit margins are highest, and the wave itself provides most of the energy propelling you and your company toward the beach.
Knowing when to move to the next big wave is by far the most important skill for long-term success in high technology; indeed, it’s even more important than choosing the right wave to ride in the first place.
Related Books
- “Hard Drive: Bill Gates and the Making of the Microsoft Empire” by Wallace and Erickson
- “Where Wizards Stay Up Late: Origins of the Internet” by Hafner and Lyon
- “The Social Organism: A Radical Understanding of Social Media” by Luckett and Casey
If you would like to learn more about the history of technology, read my book From Poverty to Progress: How Humans Invented Progress, and How We Can Keep It Going.
