
Invisible Engines: How Software Platforms Drive Innovation and Transform Industries by David Evans

Summary

  1. “By focusing on the software platform we hope to offer the reader a perspective on the business dynamics and strategies of industries, old and new, that have been powered by these invisible engines…All of us quickly recognized that software platform businesses have at least two sides. Software platforms consist of services that are often made available to developers through APIs. They are also made available to computer users, but those computer users typically avail themselves of API-based services by buying applications that in turn use APIs. It is only a slight exaggeration to say that all software platform makers all the time invest in getting both developers and users to use their platforms. The developers/users are like the men/women, cardholders/merchants, advertisers/eyeballs, and buyers/sellers that we mentioned above. In fact, software platforms sometimes appeal to more than two distinct groups—including hardware makers and content providers. The economics of two-sided platforms provides a number of insights into pricing, design, organization, and governance of platform-based businesses. We were interested in understanding how this new economic learning could help shed light on the strategies followed by software platforms. On the flip side, we were interested in understanding how a diverse set of industries based on software platforms could be probed to provide insights for students of this new economics. This book is the result. It blends economics, history, and business analysis. It is intended for anyone who wants to better understand the business strategies that have been followed in industries based on software platforms. We focus on pricing, product design, and integration into downstream or upstream suppliers.”

Key Takeaways

  1. Most successful software platforms have exploited positive feedbacks (or network effects) between applications and users: more applications attract more users, and more users attract more applications. Nurturing both sides of the market helped Microsoft garner thousands of applications and hundreds of millions of users for its Windows platform.
  2. The modular approach has numerous advantages. If a new program (or other complex system) can be specified as N modules, N teams can work in parallel. Moreover, individual modules can subsequently be improved without touching other parts of the overall program, and they can be used in other programs.
  3. Operating systems provide services to applications through Application Programming Interfaces (APIs). These services range from rudimentary hardware services, such as moving a cursor on a monitor, to sophisticated software services, such as drawing and rotating three-dimensional objects. The APIs serve as interfaces between these services and applications…It is easy to see why application developers find the ability to access system services through APIs appealing. Rather than every application developer writing hundreds of lines of code to allocate memory to an object, to take the example above, the operating system developer writes 116 lines of code and makes the system services this code provides available to all application developers through the API.
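To make the API point concrete, here is a minimal sketch in Python (names and structure are ours, not the book's): the platform implements a service once, and every application reaches it through the API instead of rewriting it.

```python
# Minimal sketch of the API idea (hypothetical names): the platform writes a
# service once; every application gets it by calling the API.

class PlatformAPI:
    """Stand-in for an operating system's API layer."""

    def allocate(self, size: int) -> bytearray:
        # The platform implements allocation once, behind the interface...
        return bytearray(size)


class Application:
    def __init__(self, api: PlatformAPI):
        self.api = api  # ...and each application simply calls it.

    def run(self) -> int:
        buf = self.api.allocate(1024)  # no per-app memory-management code
        buf[0] = 42
        return buf[0]


print(Application(PlatformAPI()).run())  # -> 42
```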
  4. Software platforms make services available through APIs. Developers benefit from these because they avoid having to write some of their own code. Users benefit from a greater variety of and lower prices for applications. The economics of multisided platforms provides a set of tools for understanding the past, present, and future of software platforms.
  5. Multisided businesses can generate profits for themselves and benefits for their customers if they can figure out ways to increase and then capture indirect network externalities. There are three major ways in which they do this. First, they serve as matchmakers. Second, they build audiences. Advertising-supported media do mainly that: they use content to attract eyeballs and then sell access to those eyeballs to advertisers. Third, they reduce costs by providing shared facilities for the customers on each side. That’s the shopping mall case with which we began.
  6. Businesses in multisided markets often subsidize one side of the market to get the other side on board—sometimes explicitly by charging low or negative prices. A dating club may charge men a higher price just because they have more inelastic demand and because it is easy to identify that group of consumers. But businesses in multisided markets have an additional reason to price discriminate: by charging one group a lower price the business can charge another group a higher price; and unless prices are low enough to attract sufficient numbers of the former group, the business cannot obtain any sales at all. In contrast, economic analyses of multisided platforms, along with the industry case studies discussed in the following chapters, show that successful multisided platform businesses must pay careful attention to all relevant groups, and typically must worry more about balance among them than about building share with one of them. Getting the balance right seems to be more important than building shares. Platform markets do not tip quickly because as a practical matter, it takes time to get things right. And the first entrant often does not win in the end: many other firms may come in and successfully tweak the pricing structure, product design, or business model. The businesses that participate in such industries have to figure out ways to get both sides on board. One way to do this is to obtain a critical mass of users on one side of the market by giving them the service for free or even paying them to take it. Especially at the entry phase of firms in multisided markets, it is not uncommon to see precisely this strategy. Another way to solve the problem of getting the two sides on board simultaneously is to invest to lower the costs of consumers on one side of the market. As we saw earlier, for instance, Microsoft invests in the creation of software tools that make it easier for application developers to write application software for Microsoft operating systems and provides other assistance that makes developers’ jobs easier. In some cases, firms may initially take over one side of the business in order to get the market going.
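The get-both-sides-on-board logic can be made concrete with a toy model; every coefficient below is invented for illustration. Participation on each side rises with the other side's participation and falls with its own price, so charging both sides can collapse the market while subsidizing one side and earning on the other sustains it.

```python
# Toy two-sided-platform model (all numbers hypothetical). Participation on
# each side rises with the other side's participation, falls with its own price.

def participation(p_dev: float, p_user: float, rounds: int = 200):
    devs, users = 0.5, 0.5  # initial shares on each side
    for _ in range(rounds):  # iterate to a fixed point
        devs = max(0.0, min(1.0, 0.4 + 0.8 * users - 0.5 * p_dev))
        users = max(0.0, min(1.0, 0.2 + 0.9 * devs - 0.4 * p_user))
    return devs, users

def profit(p_dev: float, p_user: float) -> float:
    devs, users = participation(p_dev, p_user)
    return p_dev * devs + p_user * users

print(round(profit(1.0, 1.0), 2))   # 0.0  -> charging both sides kills adoption
print(round(profit(-0.2, 1.5), 2))  # 0.14 -> subsidize developers, earn on users
```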
  7. The copyleft provision means that if people choose to distribute software that is based in part on other software covered by the GPL, they must distribute their new software under the GPL. GPL software thereby propagates itself.
  8. Bundling features into the software platform is often efficient for the platform producer and for end users, as it is for most information goods, because it lowers distribution costs and expands demand.
  9. Multisided platforms must consider marginal costs and price sensitivity in pricing, like single-sided businesses, but they must also consider which side values the other side more. Software platforms generally charge low prices on one side in order to attract customers who can then be made available to the other side. Getting the balance right among all sides is more important than building market share.
  10. Per-copy charges also helped Microsoft capitalize on its investment in programming languages in the face of great uncertainty as to which computer makers would succeed. A flat fee would have earned less from the top sellers and would have discouraged other makers from even trying. Microsoft retained this basic pricing model when it went into the operating system business.
  11. In retrospect, having multiple operating systems run on a hardware platform is a poor strategy. The idea, of course, was to ensure that the hardware, not the operating system, became the standard that defined the platform and determined its evolution. Indeed, IBM followed an important economic principle for traditional industries: all firms would like everyone else in the supply chain to be competitive. IBM didn’t seem to recognize that this was far from a traditional industry. If IBM’s strategy had worked, and if several operating systems had been installed on substantial numbers of IBM PCs, what would have happened? Most likely, having multiple operating systems would have made the hardware platform less popular than having a single operating system. Applications are generally written for software platforms, not the underlying hardware. The more fragmented the installed base of operating systems, the less attractive it is to write an application for any one of them.
  12. Four key strategies helped Microsoft obtain the leading position in personal computers: (1) offering lower prices to users than its competitors; (2) intensely promoting API-based software services to developers; (3) promoting the development of peripherals, sometimes through direct subsidies, in order to increase the value of the Windows platform to developers and users; and (4) continually developing software services that provide value to developers directly and to end users indirectly.
  13. Technically, this is a two-part tariff, consisting of an access fee (the price of the razor) plus a usage fee (the price of the blade). Here the blade can be thought of as having two related roles. It meters the use of the durable good, and it sorts customers into those who are willing to pay more and those who are willing to pay less. These metering devices tend to increase profits and help companies better recover their fixed costs of investment. Because it is particularly attractive to make money on the blades, it is especially attractive to reduce the price of the razor, perhaps to below cost, or perhaps even to zero in extreme cases. For video game console makers this razorblade strategy made a lot of sense. Getting the console into the hands of many people increased the demand for the games it could play. Moreover, it made buying a console less risky for households, who had no good way of knowing how valuable the console would be until they saw the games produced for it. The game-console company, which was in the best position to forecast the quality of those games, took the risk: it lost money if consumers didn’t buy many games, and it made money if they did. The people who ultimately bought a lot of games were those who valued the console the most, so making profits mainly or even entirely on games enabled the console makers to earn the most from those willing to pay the most for their system.
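In symbols (our notation, not the book's), the two-part tariff and the razor-blade logic look like this:

```latex
% Two-part tariff: access fee A (the razor or console) plus usage fee p
% on each of the q units used (blades or games):
T(q) = A + p\,q
% Razor-blade strategy: set A at or below the razor's cost (even A = 0) and
% recover margin on usage, so the heaviest users, who value the system most,
% contribute the most profit:
\pi(q) = (A - c_{\text{razor}}) + (p - c_{\text{blade}})\,q
```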
  14. When consumers value product differentiation and platforms can offer innovative and unique features, multiple platforms can coexist despite indirect network effects that make bigger better.
  15. The console video gaming industry operates a radically different business model from other software platform industries. Game manufacturers tightly integrate hardware and software systems; they offer consoles to consumers at less than manufacturing cost, and they earn profits by developing games and charging third-party game developers for access to their platforms.
  16. Palm, on the other hand, regrouped. It surveyed Zoomer buyers to find out what they liked and didn’t like, what they used and didn’t use: What these people said opened the company’s eyes. More than 90% of Zoomer owners also owned a PC. More than half of them bought Zoomer because of software (offered as an add-on) that transferred data to and from a PC. These were business users, not retail consumers. And they didn’t want to replace their PCs—they wanted to complement them. People weren’t asking for a PDA that was smart enough to compete with a computer. They wanted a PDA that was simple enough to compete with paper.
  17. When you’re playing Bobby Fischer—and you want to win—don’t play chess. Make sure whatever game you’re playing—be it network delivery of media vs. stand-alone PC, whatever you’re in—that you’re not playing a game someone else has mastered when you have an option to play another game. —Rob Glaser, Founder of RealNetworks, May 2001
  18. Interestingly, many are made by Microsoft, which integrated into mouse production in 1983 mainly to be sure that the sort of mouse specified by its nascent Windows system would be available in the marketplace. Microsoft developed and patented a mouse that could connect to a PC through an existing serial port rather than to a special card installed within the computer. This innovation reduced the cost of the mouse and thus of mouse-using computers running Windows. Apple as a vertically integrated hardware and software platform maker has always produced its own mice.
  19. What is the cure? From A’s point of view, one cure is to have many competing producers of good b. Competition will then hold the price of b close to cost (including a reasonable return on capital) regardless of A’s pricing, so that A both effectively determines the system price (via the price of a) and captures all the economic profit. Generally, it is more attractive to rely on others to supply a complement (instead of buying it or making it), all else equal, if there are many producers of that complement who compete intensely. Hence the common strategic advice, “Commoditize the complements.”
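A compact way to state this logic (notation ours): the buyer cares only about the total system price, so once competition pins the complement's price to cost, A sets the system price through its own price and keeps the profit.

```latex
% The buyer pays the system price
P = p_a + p_b .
% With many competing producers of b, p_b \to c_b (cost plus a normal return),
% so A effectively chooses P through p_a and captures the system profit:
\pi_A = (p_a - c_a)\,Q(p_a + c_b)
```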
  20. In a famous 1951 paper, Nobel Laureate George Stigler argued that this proposition implies that “vertical disintegration is the typical development in growing industries, vertical integration in declining industries.”
  21. Interestingly, we are aware of no examples of software platforms that initially integrated into the applications/games/content that subsequently exited that business entirely. On the other hand, almost all such platforms have adopted a two-sided strategy and made significant investments in attracting third-party suppliers. Partial integration is the norm. The only exceptions are those successful software platform vendors that launched without integration; they have remained out of the applications business. The tendency of computer-based industries to disintegrate over time is even clearer—with interesting exceptions—when we consider integration with the supply of basic hardware and peripherals. The Microsoft strategy of having the hardware complement its operating system produced by a competitive, technologically dynamic industry has served to make its operating systems more valuable and to speed their market penetration. Microsoft is not above using integration on occasion to stimulate important markets for complements, as its entry into mouse production, discussed earlier, illustrates.
  22. In a rephrasing of Mr. Katz’s words, Michael Dell told Microsoft upon refusing the Xbox deal offered to him: When Sony cuts the prices on their PlayStations, their stock price goes up. Every time I cut prices, my stock price goes down. If you don’t understand why that happens, you don’t understand the console business. I understand why this is strategic to Microsoft. I don’t understand why this is strategic to Dell.
  23. “Oh, ‘tanstaafl.’ Means ‘There ain’t no such thing as a free lunch.’ And isn’t,” I added, pointing to a FREE LUNCH sign across room, “or these drinks would cost half as much. Was reminding her that anything free costs twice as much in the long run or turns out worthless.” —Robert Heinlein
  24. In practice, it generally does matter which side pays, because two key assumptions made in the textbook discussion don’t apply. First, there are often significant transactions costs that prevent the customers on the two sides of most markets from just “sorting it out” themselves. Take the payment card example. Although most card systems prohibit merchant surcharging because it degrades the value of their product to cardholders, several countries have barred card systems from imposing such a no-surcharge rule. In those countries, however, most merchants don’t surcharge. One reason is that it is costly to impose small charges on customers. Those merchants that do surcharge often charge more than they are charged by the card system—an indication that they are using the fact that a customer wants to use her card as a basis for groupwise price discrimination.
  25. When balance matters in a mature two-sided business, the pricing problem is much more complex than in a single-sided business. Marginal cost and price responsiveness on both sides matter for both prices, and so does the pattern of indirect network effects. In general, if side A cares more about side B than B cares about A, then, all else equal, A will contribute more total revenue. Thus, newspapers make their money from selling advertising, not from selling papers. The textbook pricing formula for a single-sided market gives the optimal markup over marginal cost as 1 over a measure of price responsiveness (the price elasticity of demand), so low price responsiveness implies high markups. The corresponding formula for a two-sided business involves marginal costs on both sides, price responsiveness on both sides, and measures of the strength of indirect network effects in both directions. In particular, balance may require charging a price below marginal cost to a group with low price responsiveness, something a single-sided business would never do, if it is critical to attract members of that group in order to get members of the other group on board.
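The single-sided formula referenced here is the standard Lerner rule. A much-simplified sketch of the two-sided version (our symbols; the full formulas in the literature are messier) treats the margin an extra side-A member creates on side B as an offset to A's marginal cost, which is exactly how a price below cost can be optimal:

```latex
% Single-sided markup rule, with price elasticity of demand \epsilon:
\frac{p - c}{p} = \frac{1}{\epsilon}
% Simplified two-sided analogue for side A: the cross-side benefit e_B acts
% like a reduction in A's marginal cost, so p_A < c_A can be optimal when
% e_B is large even if A's demand is not very price-responsive:
\frac{p_A - (c_A - e_B)}{p_A} = \frac{1}{\epsilon_A}
```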
  26. The idea is initially to subsidize one side (or, more generally, to do whatever it takes) in order to get it on board even though the other side is not yet on board, and to use the presence of the subsidized side to attract the other side. This differs from the single-sided penetration pricing strategy discussed above because the key here is to generate indirect network effects, to use the subsidized side as a magnet to attract the other side. After entry has been successfully effected and both sides are on board, of course, the rationale for the initial subsidy vanishes, and one would expect to see a corresponding shift in pricing policy. One of the regularities we discuss below, however, is that pricing structures—the relative amounts paid by the various sides—appear fairly robust over time; there are not many examples of pricing low to one side at first and then raising prices significantly later.
  27. A fundamental decision facing all multisided platform businesses is choice of a price structure: How much should the platform vendor charge each side relative to the others? Since transactions involving some sides may have significant associated variable costs (the production and distribution costs of video game consoles, for instance), the most illuminating way to analyze observed price structures is to look at the contributions of each side to gross margin or variable profits: revenue minus side-specific variable cost. Should a two-sided platform derive most of its gross margin from one side of the market, and if so, which side, or should it choose a more balanced structure, with both sides making significant contributions to gross margin?
  28. Like all multisided platforms, the pricing structures of the software platforms we have encountered in this book reflect the need to get all unintegrated sides on board: end users, application/game/content developers, and manufacturers of hardware and peripheral equipment. The structures we have examined have three remarkable features. First, all of them are extremely skewed: almost all earn a disproportionate share of their variable profits on only one side of the market, either end users or developers. Second, for all but video games, the platform earns the bulk of its net revenues from end users. The third remarkable feature, which we consider in the next section, is that these structures have been stable over time.
  29. A firm selling two goods, A and B, can choose among three product-line strategies:
    1. Components selling occurs when the firm offers A and B separately (cars and bicycle racks).
    2. Pure bundling occurs when the firm only offers A and B together as a single bundled product, AB (men’s laced shoes).
    3. Mixed bundling occurs when the firm offers the bundle AB and either or both of its components, A and B (such as the Sunday New York Times and the New York Times Book Review).
  30. It is common to bundle together products that are complements, such as automobiles and tires, but firms may find that it pays to bundle products that aren’t complements. We already saw an example of this above. Bundling persuaded two consumers to buy a product even though each wanted only a single component. This saved the manufacturer costs. The idea that bundling of noncomplements can be used to enhance profits goes back to a classic paper by Nobel Prize winning economist George Stigler. Stigler tried to explain why movie distributors at one time required theaters to take bundles of pictures. Bundling can be used in a different way to facilitate price discrimination, which we discussed in the preceding chapter. That is, if different groups of consumers place different values on groups of components, bundles can be designed so that those with stronger demand pay more. The idea is that it is possible to design bundles of components that cause consumers to sort themselves, by the bundles they choose, into groups with different willingness to pay. (Marketers call this “segmentation.”) In the case of autos, some will want the car with the sports package, while others will want only the basic package. The seller can then charge a premium to groups that have a particularly high demand for a particular package and offer an especially aggressive price to consumers who are very price-sensitive but willing to take the no-frills deal. For this to work, there must be a predictable correlation between combinations of components and demand (for example, price-sensitive consumers generally have a low demand for frills). A number of studies have found, for example, that automobile companies have much higher markups on luxury models than on base models. Bundling drives innovation and creates industries.
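Stigler's point is easy to reproduce with two hypothetical buyers whose valuations for the two goods are negatively correlated; all numbers below are invented for illustration.

```python
# Stigler-style bundling sketch (hypothetical valuations): buyers whose
# tastes for goods A and B are negatively correlated.
buyers = {"buyer1": {"A": 9, "B": 3}, "buyer2": {"A": 3, "B": 9}}

def best_revenue(values: list[int]) -> int:
    # A buyer purchases when value >= price; try each candidate price.
    return max(p * sum(v >= p for v in values) for p in values)

separate = sum(best_revenue([v[g] for v in buyers.values()]) for g in ("A", "B"))
bundled = best_revenue([v["A"] + v["B"] for v in buyers.values()])

print(separate)  # 18: each good sells at 9 to the one buyer who values it highly
print(bundled)   # 24: the bundle at 12 is bought by both, capturing full value
```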
  31. The ability to select bundles of features to sell helps firms segment their customers, control costs, and enhance profits. Bundled products offer consumers convenience, lower costs, and products tailored to their needs and wants.
  32. Bundling decisions by multisided platforms, such as software platforms, are more complex since they must take into account the effect on all customer groups. Multisided businesses must consider both the additional customers they get on one side as a result of including a new feature and the additional customers they will get on the other side from having those additional customers. They may also include features that harm one side directly but benefit the platform overall by getting more customers on board on another side.
  33. Bundling makes sense for businesses whenever the cost of adding additional features is lower than the additional sales generated thereby—even if most purchasers do not value or use all the features in a product bundle.
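Stated as a rule of thumb in symbols (ours, not the book's), where for a multisided platform the revenue term sums over every side and includes the cross-side feedback described above:

```latex
% Add feature f to the bundle when its cost is below the extra revenue it
% generates across all sides, counting indirect network effects:
\Delta C(f) \;<\; \sum_{s \,\in\, \text{sides}} \Delta R_s(f)
```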
  34. Creative destruction has been a hallmark of economic progress for millennia, but it has proceeded at a glacial pace for most of history. The Industrial Revolution sped this process up. Even so, it took decades for change to filter through the economy following innovations such as the spinning jenny, steam engine, and electric generator. The information technology revolution has quickened the pace of industrial change greatly. The plummeting costs of computer processing and storage make it possible to create products and industries that were not only infeasible but also unimaginable a few years earlier. Software platforms further accelerate the process of creative destruction, mainly because code is digital and malleable. Think how easy it is to add a new feature to a software platform and distribute that change electronically over the Internet to potentially billions of computing devices around the world.
  35. One is familiar: developers. TiVo is evangelizing its software platform by providing tools and offering prizes for the best applications in several categories, including games, music, and photos.
  36. History teaches us that it takes decades for technological changes to work their way through the economy, destroying, creating, and transforming industries. The third industrial revolution got off to a quick start. We suspect that it will continue through at least the first few decades of the twenty-first century and that our invisible engines will ultimately touch most aspects of our business and personal lives.

What I got out of it

  1. Some of the examples are a bit outdated but the principles are just as valuable as ever – how to think about multisided markets, pricing, positioning, and so much more

Inside the Tornado by Geoffrey Moore

Summary

  1. For those within the high tech sector, or who manage investments in these companies, this imperative translates into a series of deceptively simple questions: what can we do during a tornado to best capitalize on our opportunity? How can we tell when one is coming, and what can we do to prepare? How can we sense when it is ending, and what should we do then? Finally, going forward, how can we reframe our strategic management concepts to better accommodate tornado market dynamics in general?

Key Takeaways

  1. The winning strategy does not just change as we move from stage to stage; it actually reverses the prior strategy. This is why it is so difficult and counterintuitive – what made you successful at an earlier stage causes failure at later stages. In the early market you must not segment; in the chasm and bowling alley you must segment; in the tornado you must not segment; on Main Street you must segment.
  2. Truly discontinuous innovations – paradigm shifts – are new products or services that require the end user and the marketplace to dramatically change their past behavior, with the promise of gaining equally dramatic new benefits. 
  3. The only way to cross the chasm is to put all your eggs in one basket. That is, key to a winning strategy is to identify a single beachhead of pragmatist customers in a mainstream market segment and to accelerate the formation of 100% of their whole product. The goal is to win a niche foothold in the mainstream as quickly as possible – that is what is meant by crossing the chasm. Then, once in the tornado, you need to quickly switch strategies and gain mass market share at any cost, positioning your products horizontally as global infrastructure
    1. Many leaders are not cut out to lead the company through each of these phases. That’s fine and to be expected, but know what stage you’re in, what type of CEO you have, and when they might need to be replaced 
  4. Once any infrastructure is substantially deployed, power shifts from the builders – the professional services firms – to the operators, or what we have come to call the transaction services firms. The key to the transaction services model is that the requisite infrastructure has already been assimilated (keeping support costs down) and amortized (minimizing ongoing investment).
  5. For every stage of the technology adoption life cycle, there is an optimal business model
    1. early market – professional services
    2. bowling alley – application products
    3. tornado – infrastructure products – a period of mass-market adoption when the general marketplace switches over to the new infrastructure paradigm
    4. main street – transaction services
  6. This sequence of events unleashes a vortex of market demand. Infrastructure, to be useful, must be standard and global, so once the market moves to switch out the old for the new, it wants to complete this transition as rapidly as possible. All the pent-up interest in the product is thus converted into a massive purchasing binge, causing demand to vastly outstrip supply. Companies grow at hypergrowth rates, with billions of dollars of revenue seeming to appear from out of nowhere.
  7. Overview of the tech adoption lifecycle
    1. The forces that operate in the bowling alley argue for a niche-based strategy that is highly customer-centric
    2. Those in the tornado push in the opposite direction toward a mass-market strategy for deploying a common standard infrastructure
    3. Then on Main St., market forces push back again toward a customer-centric approach, focusing on specific adaptations of this infrastructure for added value through mass customization
    4. Given these dramatic reversals in strategy, it is imperative that organizations be able to agree on where their markets are in the life cycle
    5. In the meantime, the economic cataclysm of the tornado deconstructs and reconstructs the power structure in the market so rapidly that simply understanding who is friend and who is foe becomes a challenge
    6. Within the newly emerging market structure, companies must compete for advantage based on their status within it
    7. Positioning in this context consists of a company taking its rightful place in the hierarchy of power and defending it against challengers
    8. And finally, moving fluidly from strategy to strategy is the ultimate challenge of any organization, demanding an extraordinarily flexible response from its management team
  8. Safe path is to overinvest when invading any new segment, seeking to accelerate market leadership, and then divert resources as soon as the position is achieved
  9. Post-tornado market share by revenue tends to be 50% for the gorilla, 15% for chimp 1, 15% for chimp 2, and 30% for the monkeys
  10. The lessons that Oracle taught – attack the competition ruthlessly, expand your distribution channel as fast as possible, ignore the customer
  11. The lessons that HP taught – just ship, extend distribution channels, drive to the next lower price point
  12. The lessons that Wintel taught – recruit partners to create a powerful whole product, institutionalize this whole product as the market leader, commoditize the whole product by designing out your partners
  13. +1 opportunities – what do we have to offer at little or no incremental cost to ourselves that the market would pay us more money for? Companies that sell a compelling fantasy, like Nike and Mont Blanc, do this better than anyone
  14. Recap
    1. Bowling alley: product leadership, customer intimacy
    2. Tornado: product leadership, operational excellence
    3. Main St: operational excellence, customer intimacy
  15. Trust, it turns out, is a complicated and challenging relationship, as much so in business as in parenting or marriage. Like everything else we have been discussing in recent chapters, it is ultimately about power. The paradox of trust is that by intelligently relinquishing power, one gains it back many times over. Once you reach your personal limits, this is the only economy of scale that can help. And because hypergrowth markets will push you to your personal limits faster than most other challenges in business, this is a fitting thought on which to close this book.

What I got out of it

  1. Fascinating insights into the paradoxical path that it takes to be successful with technologically disruptive companies

The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner

Summary

  1. “This book is about innovation—about how it happens, why it happens, and who makes it happen. It is likewise about why innovation matters, not just to scientists, engineers, and corporate executives but to all of us. That the story is about Bell Labs, and even more specifically about life at the Labs between the late 1930s and the mid-1970s, isn’t a coincidence.” The people who helped make it happen include Mervin Kelly, Jim Fisk, William Shockley, Claude Shannon, John Pierce, and William Baker.

Key Takeaways

  1. Where is the knowledge we have lost in information? —T. S. Eliot, The Rock
  2. Yet understanding the circumstances that led up to that unusual winter of 1947 at Bell Labs, and what happened there in the years afterward, promises a number of insights into how societies progress. With this in mind, one might think of a host of reasons to look back at these old inventions, these forgotten engineers, these lost worlds.
  3. Edison’s genius lay in making new inventions work, or in making existing inventions work better than anyone had thought possible. But how they worked was to Edison less important.
  4. Contrary to its gentle image of later years, created largely through one of the great public relations machines in corporate history, Ma Bell in its first few decades was close to a public menace—a ruthless, rapacious, grasping “Bell Octopus,” as its enemies would describe it to the press. “The Bell Company has had a monopoly more profitable and more controlling—and more generally hated—than any ever given by any patent,” one phone company lawyer admitted.
  5. AT&T’s savior was Theodore Vail, who became its president in 1907, just a few years after Millikan’s friend Frank Jewett joined the company. In appearance, Vail seemed almost a caricature of a Gilded Age executive: Rotund and jowly, with a white walrus mustache, round spectacles, and a sweep of silver hair, he carried forth a magisterial confidence. But he had in fact begun his career as a lowly telegraph operator. Thoughtfulness was his primary asset; he could see almost any side of an argument. Also, he could both disarm and outfox his detractors. As Vail began overseeing Bell operations, he saw that the costs of competition were making the phone business far less profitable than it had been—so much so, in fact, that Vail issued a frank corporate report in his first year admitting that the company had amassed an “abnormal indebtedness.” If AT&T were to survive, it had to come up with a more effective strategy against its competition while bolstering its public image.
  6. Vail didn’t do any of this out of altruism. He saw that a possible route to monopoly—or at least a near monopoly, which was what AT&T had always been striving for—could be achieved not through a show of muscle but through an acquiescence to political supervision. Yet his primary argument was an idea. He argued that telephone service had become “necessary to existence.” Moreover, he insisted that the public would be best served by a technologically unified and compatible system—and that it made sense for a single company to be in charge of it. Vail understood that government, or at least many politicians, would argue that phone subscribers must have protections against a monopoly; his company’s expenditures, prices, and profits would thus have to be set by federal and local authorities. As a former political official who years before had modernized the U.S. Post Office to great acclaim, Vail was not hostile toward government. Still, he believed that in return for regulation Ma Bell deserved to find the path cleared for reasonable profits and industry dominance. In Vail’s view, another key to AT&T’s revival was defining it as a technological leader with legions of engineers working unceasingly to improve the system.
  7. The Vail strategy, in short, would measure the company’s progress “in decades instead of years.” Vail also saw it as necessary to merge the idea of technological leadership with a broad civic vision. His publicity department had come up with a slogan that was meant to rally its public image, but Vail himself soon adopted it as the company’s core philosophical principle as well. It was simple enough: “One policy, one system, universal service.” That this was a kind of wishful thinking seemed not to matter.
  8. “Of its output,” Arnold would later say of his group, “inventions are a valuable part, but invention is not to be scheduled nor coerced.” The point of this kind of experimentation was to provide a free environment for “the operation of genius.” His point was that genius would undoubtedly improve the company’s operations just as ordinary engineering could. But genius was not predictable. You had to give it room to assert itself.
  9. From the start, Jewett and Arnold seemed to agree that at West Street there could be an indistinctness about goals. Who could know in advance exactly what practical applications Arnold’s men would devise? Moreover, which of these ideas would ultimately move from the research department into the development department and then mass production at Western Electric? At the same time, they were clear about larger goals. The Bell Labs employees would be investigating anything remotely related to human communications, whether it be conducted through wires or radio or recorded sound or visual images.
  10. The industrial lab showed that the group—especially the interdisciplinary group—was better than the lone scientist or small team. Also, the industrial lab was a challenge to the common assumption that its scientists were being paid to look high and low for good ideas. Men like Kelly and Davisson would soon repeat the notion that there were plenty of good ideas out there, almost too many. Mainly, they were looking for good problems.
  11. Quantum mechanics, as it was beginning to be called, was a science of deep surprises, where theory had largely outpaced the proof of experimentation. Some years later the physicist Richard Feynman would elegantly explain that “it was discovered that things on a small scale behave nothing like things on a large scale.” In the quantum world, for instance, you could no longer say that a particle has a certain location or speed. Nor was it possible, Feynman would point out, “to predict exactly what will happen in any circumstance.”
  12. The Great Depression, as it happened, was a boon for scientific knowledge. Bell Labs had been forced to reduce its employees’ hours, but some of the young staffers, now with extra time on their hands, had signed up for academic courses at Columbia University in uptown Manhattan.
  13. “The [Bell] System,” Danielian pointed out, “constitutes the largest aggregation of capital that has ever been controlled by a single private company at any time in the history of business. It is larger than the Pennsylvania Railroad Company and United States Steel Corporation put together. Its gross revenues of more than one billion dollars a year are surpassed by the incomes of few governments of the world. The System comprises over 200 vassal corporations. Through some 140 companies it controls between 80 and 90 percent of local telephone service and 98 percent of the long-distance telephone wires of the United States.”
  14. The 512A was an example of how, if good problems led to good inventions, then good inventions likewise would lead to other related inventions, and that nothing was too small or incidental to be excepted from improvement. Indeed, the system demanded so much improvement, so much in the way of new products, so much insurance of durability, that new methods had to be created to guarantee there was improvement and durability amid all the novelty.
  15. We usually imagine that invention occurs in a flash, with a eureka moment that leads a lone inventor toward a startling epiphany. In truth, large leaps forward in technology rarely have a precise point of origin. At the start, forces that precede an invention merely begin to align, often imperceptibly, as a group of people and ideas converge, until over the course of months or years (or decades) they gain clarity and momentum and the help of additional ideas and actors. Luck seems to matter, and so does timing, for it tends to be the case that the right answers, the right people, the right place—perhaps all three—require a serendipitous encounter with the right problem. And then—sometimes—a leap. Only in retrospect do such leaps look obvious.
  16. There was something in particular about the way he [William Shockley] solved difficult problems, looking them over and coming up with a method—often an irregular method, solving them backward or from the inside out or by finding a trapdoor that was hidden to everyone else—to arrive at an answer in what seemed a few heartbeats.
  17. By intention, everyone would be in one another’s way. Members of the technical staff would often have both laboratories and small offices—but these might be in different corridors, therefore making it necessary to walk between the two, and all but assuring a chance encounter or two with a colleague during the commute. By the same token, the long corridor for the wing that would house many of the physics researchers was intentionally made to be seven hundred feet in length. It was so long that to look down it from one end was to see the other end disappear at a vanishing point. Traveling its length without encountering a number of acquaintances, problems, diversions, and ideas would be almost impossible. Then again, that was the point. Walking down that impossibly long tiled corridor, a scientist on his way to lunch in the Murray Hill cafeteria was like a magnet rolling past iron filings.
  18. Essentially Kelly was creating interdisciplinary groups—combining chemists, physicists, metallurgists, and engineers; combining theoreticians with experimentalists—to work on new electronic technologies.
  19. If the ingredients in the alloy weren’t pure—if they happened to contain minute traces of carbon, oxygen, or nitrogen, for instance—Permendur would be imperfect. “There was a time not so long ago when a thousandth of a percent or a hundredth of a percent of a foreign body in a chemical mixture was looked upon merely as an incidental inclusion which could have no appreciable effect on the characteristics of the substance,” Frank Jewett, the first president of the Labs, explained. “We have learned in recent years that this is an absolutely erroneous idea.”
  20. For Scaff and Theurer—and, in time, the rest of the solid-state team at Bell Labs—one way to think of these effects was that purity in a semiconductor was necessary. But so was a controlled impurity. Indeed, an almost vanishingly small impurity mixed into silicon, having a net effect of perhaps one rogue atom of boron or phosphorus inserted among five or ten million atoms of a pure semiconductor like silicon, was what could determine whether, and how well, the semiconductor could conduct a current. One way to think of it—a term that was sometimes used at the Labs—was as a functional impurity.
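For a sense of scale, a back-of-envelope calculation (silicon's atomic density of roughly 5×10²² atoms per cm³ is a standard figure; the ratios come from the passage above):

```python
# Scale of a "functional impurity": one dopant atom per 5-10 million
# silicon atoms, per the passage above.
SI_ATOMS_PER_CM3 = 5e22  # standard figure for crystalline silicon

for ratio in (5e6, 10e6):
    dopants = SI_ATOMS_PER_CM3 / ratio
    print(f"1 in {ratio:.0e} -> ~{dopants:.0e} dopant atoms per cm^3")
# Even at these vanishing ratios, ~1e16 impurity atoms per cm^3 are enough
# to set whether and how well the crystal conducts.
```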
  21. The formal purpose of the new solid-state group was not so much to build something as to understand it. Officially, Shockley’s men were after a basic knowledge of their new materials; only in the back of their minds did a few believe they would soon find something useful for the Bell System.
  22. On November 17, Brattain and an electrochemist in the solid-state group, Robert Gibney, explored whether applying an electrolyte—a solution that conducts electricity—in a particular manner would help cut through the surface states barrier. It did. Shockley would later identify this development as a breakthrough and the beginning of what he called “the magic month.” In time, the events of the following weeks would indeed be viewed by some of the men in terms resembling enchantment—the team’s slow, methodical success effecting the appearance of preordained destiny. For men of science, it was an odd conclusion to draw. Yet Walter Brattain would in time admit he had “a mystical feeling” that what he ultimately discovered had been waiting for him.
  23. Any Bell scientist knew about the spooky and coincidental nature of important inventions. The origins of their entire company—Alexander Bell’s race to the patent office to beat Elisha Gray and become the recognized inventor of the telephone—was the textbook case.
  24. If an idea begat a discovery, and if a discovery begat an invention, then an innovation defined the lengthy and wholesale transformation of an idea into a technological product (or process) meant for widespread practical use. Almost by definition, a single person, or even a single group, could not alone create an innovation. The task was too variegated and involved.
  25. “It is the beginning of a new era in telecommunications and no one can have quite the vision to see how big it is,” Mervin Kelly told an audience of telephone company executives in 1951. Speaking of the transistor, he added that “no one can predict the rate of its impact.” Kelly admitted that he wouldn’t see its full effect before he retired from the Labs, but that “in the time I may live, certainly in 20 years,” it would transform the electronics industry and everyday life in a manner much more dramatic than the vacuum tube. The telecommunications systems of the future would be “more like the biological systems of man’s brain and nervous system.” The tiny transistor had reduced dimensions and power consumption “so far that we are going to get into a new economic area, particularly in switching and local transmission, and other places that we can’t even envision now.” It seemed to be some kind of extended human network he had in mind, hazy and fantastical and technologically sophisticated, one where communications whipped about the globe effortlessly and where everyone was potentially in contact with everyone else.
  26. He could remember, too, that as the tubes became increasingly common—in the phone system, radios, televisions, automobiles, and the like—they had come down to price levels that once seemed impossible. He had long understood that innovation was a matter of economic imperatives. As Jack Morton had said, if you hadn’t sold anything you hadn’t innovated, and without an affordable price you could never sell anything. So Kelly looked at the transistor and saw the past, and the past was tubes. He thereby intuited the future.
  27. “A Mathematical Theory of Communication”—“the magna carta of the information age,” as Scientific American later called it—wasn’t about one particular thing, but rather about general rules and unifying ideas. “He was always searching for deep and fundamental relations,” Shannon’s colleague Brock McMillan explains. And here he had found them. One of his paper’s underlying tenets, Shannon would later say, “is that information can be treated very much like a physical quantity, such as mass or energy.”
  28. One shouldn’t necessarily think of information in terms of meaning. Rather, one might think of it in terms of its ability to resolve uncertainty. Information provided a recipient with something that was not previously known, was not predictable, was not redundant. “We take the essence of information as the irreducible, fundamental underlying uncertainty that is removed by its receipt,” a Bell Labs executive named Bob Lucky explained some years later. If you send a message, you are merely choosing from a range of possible messages. The less the recipient knows about what part of the message comes next, the more information you are sending.
  29. (1) All communications could be thought of in terms of information; (2) all information could be measured in bits; (3) all the measurable bits of information could be thought of, and indeed should be thought of, digitally. This could mean dots or dashes, heads or tails, or the on/off pulses that comprised PCM.
  30. His calculations showed that the information content of a message could not exceed the capacity of the channel through which you were sending it. Much in the same way a pipe could only carry so many gallons of water per second and no more, a transmission channel could only carry so many bits of information at a certain rate and no more. Anything beyond that would reduce the quality of your transmission. The upshot was that by measuring the information capacity of your channel and by measuring the information content of your message you could know how fast, and how well, you could send your message. Engineers could now try to align the two—capacity and information content.
  31. Shannon’s paper contained a claim so surprising that it seemed impossible to many at the time, and yet it would soon be proven true. He showed that any digital message could be sent with virtual perfection, even along the noisiest wire, as long as you included error-correcting codes—essentially extra bits of information, formulated as additional 1s and 0s—with the original message. In his earlier paper on cryptography, Shannon had already shown that by reducing redundancy you could compress a message to transmit its content more efficiently. Now he was also demonstrating something like the opposite: that in some situations you could increase the redundancy of a message to transmit it more accurately.
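In modern notation, the claims in these passages correspond to results that are in (or follow directly from) Shannon's 1948 paper: the entropy measure of information, the noisy-channel coding theorem, and the capacity of a noisy bandlimited channel.

```latex
% Entropy: the average information of a source with symbol probabilities p_i,
% measured in bits per symbol:
H = -\sum_i p_i \log_2 p_i
% Noisy-channel coding theorem: with suitable error-correcting codes, any
% rate R below the channel capacity C is achievable with arbitrarily small
% error, as the passage describes:
R < C
% Capacity of a channel of bandwidth B with Gaussian noise (signal power S,
% noise power N):
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```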
  32. And yet Kelly would say at one point, “With all the needed emphasis on leadership, organization and teamwork, the individual has remained supreme—of paramount importance. It is in the mind of a single person that creative ideas and concepts are born.” There was an essential truth to this, too—John Bardeen suddenly suggesting to the solid-state group that they should consider working on the hard-to-penetrate surface states on semiconductors, for instance. Or Shockley, mad with envy, sitting in his Chicago hotel room and laying the groundwork for the junction transistor. Or Bill Pfann, who took a nap after lunch and awoke, as if from an edifying dream, with a new method for purifying germanium. Of course, these two philosophies—that individuals as well as groups were necessary for innovation—weren’t mutually exclusive. It was the individual from which all ideas originated, and the group (or the multiple groups) to which the ideas, and eventually the innovation responsibilities, were transferred.
  33. He would acknowledge that building devices like chess-playing machines “might seem a ridiculous waste of time and money. But I think the history of science has shown that valuable consequences often proliferate from simple curiosity.” “He never argued his ideas,” Brock McMillan says of Shannon. “If people didn’t believe in them, he ignored those people.”
  34. In truth, the handoff between the three departments at Bell Labs was often (and intentionally) quite casual. Part of what seemed to make the Labs “a living organism,” Kelly explained, were social and professional exchanges that moved back and forth, in all directions, between the pure researchers on one side and the applied engineers on the other. These were formal talks and informal chats, and they were always encouraged, both as a matter of policy and by the inventive design of the Murray Hill building.
  35. Physical proximity, in Kelly’s view, was everything. People had to be near one another. Phone calls alone wouldn’t do. Kelly had even gone so far as to create “branch laboratories” at Western Electric factories so that Bell Labs scientists could get more closely involved in the transition of their work from development to manufacture.
  36. Bell Labs had the advantage of necessity; its new inventions, as one of Kelly’s deputies, Harald Friis, once said, “always originated because of a definite need.”
  37. To innovate, Kelly would agree, an institute of creative technology required the best people, Shockleys and Shannons, for instance—and it needed a lot of them, so many, as the people at the Labs used to say (borrowing a catchphrase from nuclear physics), that departments could have a “critical mass” to foster explosive ideas.
  38. There was no precise explanation as to why this was such an effective goad, but even for researchers in pursuit of pure scientific understanding rather than new things, it was obvious that their work, if successful, would ultimately be used. Working in an environment of applied science, as one Bell Labs researcher noted years later, “doesn’t destroy a kernel of genius—it focuses the mind.”
  39. An instigator is different from a genius, but just as uncommon. An instigator is different, too, from the most skillful manager, someone able to wrest excellence out of people who might otherwise fall short. Somewhere between Shannon (the genius) and Kelly (the manager), Pierce steered a course for himself at Bell Labs as an instigator. “I tried to get other people to do things, I’m lazy,” Pierce once told an interviewer.
  40. Pierce’s real talent, according to Friis and Pierce himself, was in getting people interested in something that hadn’t really occurred to them before.
  41. Pierce had been correct in some respects about the traveling wave tube’s potential. But as he came to understand, inventions don’t necessarily evolve into the innovations one might at first foresee. Humans all suffered from a terrible habit of shoving new ideas into old paradigms. “Everyone faces the future with their eyes firmly on the past,” Pierce said, “and they don’t see what’s going to happen next.”
  42. A terrestrial signal could be directed toward the orbiting satellite in space; the satellite, much like a mirror, could in turn direct the signal to another part of the globe. Pierce didn’t consider himself the inventor of this idea; it was, he would later say, “in the air.”
  43. “Ideas may come to us out of order in point of time,” the first director of the Rockefeller Institute for Medical Research, Simon Flexner, once remarked. “We may discover a detail of the façade before we know too much about the foundation. But in the end all knowledge has its place.”
  44. Why move in this direction? What kind of future did the men envision? One of the more intriguing attributes of the Bell System was that an apparent simplicity—just pick up the phone and dial—hid its increasingly fiendish interior complexity. What also seemed true, and even then looked to be a governing principle of the new information age, was that the more complex the system became in terms of capabilities, speed, and versatility, the simpler and sleeker it appeared. ESS was a case in point.
  45. “I liked Fisk very much. But the combination of Fisk, who didn’t know a lot about what was going on in the bowels of the place, and Julius, who knew everything about what was going on in the bowels of the place, was a good combination.”
  46. Colleagues often stood amazed that Baker could recall by name someone he had met only once, twenty or thirty years before. His mind wasn’t merely photographic, though; it worked in some ways like a switching apparatus: He tied everyone he ever met, and every conversation he ever had, into a complex and interrelated narrative of science and technology and society that he constantly updated, with apparent ease.
  47. To Pollak, this was a demonstration not of Bill Baker’s cruelty but of his acumen—in this case to push his deep belief that science rests on a foundation of inquiry rather than certainty. Also, it revealed how nimble Baker’s mind really was. “A very small number of times in my life I’ve been in the presence of somebody who didn’t necessarily answer the question I asked. They answered the question I should have asked,” Pollak says. “And Bill Baker was one of those people. And there are other people who just build a mystique and give the impression of a mystique around them. And Bill had that, too.”
  48. New titles might not have increased his influence. By the start of the 1960s Baker was engaged in a willfully obscure second career, much like the one Mervin Kelly had formerly conducted, a career that ran not sequentially like some men’s—a stint in government following a stint in business, or vice versa—but simultaneously, so that Baker’s various jobs in Washington and his job at Bell Labs intersected in quiet and complex and multifarious ways. Baker could bring innovations in communications to the government’s attention almost instantly.
  49. “So often,” says Ian Ross, who worked in Jack Morton’s department at Bell Labs doing transistor development in the 1950s, “the original concept of what an innovation will do”—the replacement of the vacuum tube, in this case—“frequently turns out not to be the major impact.” The transistor’s greatest value was not as a replacement for the old but as an exponent for the new—for computers, switches, and a host of novel electronic technologies.
  50. Innovations are to a great extent a response to need.
  51. In the wake of the 1956 agreement, AT&T appeared to be indestructible. It now had the U.S. government’s blessing. It was easily the largest company in the world by assets and by workforce. And its Bell Laboratories, as Fortune magazine had declared, was indisputably “the world’s greatest industrial laboratory.” And yet even in the 1960s and 1970s, as Bill Baker’s former deputy Ian Ross recalls, the “long, long history of worry about losing our monopoly status persisted.” To a certain extent, Bill Baker and Mervin Kelly believed their involvement in government affairs could lessen these worries. In the view of Ross and others, such efforts probably helped delay a variety of antitrust actions. Ross recalls, “Kelly set up Sandia Labs, which was run by AT&T, managed by us, and whenever I asked, ‘Why do we stay with this damn thing, it’s not our line of business,’ the answer was, ‘It helps us if we get into an antitrust suit.’ And Bell Labs did work on military programs. Why? Not really to make money. It was part of being invaluable.”
  52. The fundamental goal in making transistor materials is purity; the fundamental goal in making fiber materials is clarity. Only then can light pass through unimpeded; or as optical engineers say, only then can “losses” of light in the fiber be kept to an acceptable minimum.
  53. Indeed, a marketing study commissioned by AT&T in the fall of 1971 informed its team that “there was no market for mobile phones at any price.” Neither man agreed with that assessment. Though Engel didn’t perceive it at the time, he later came to believe that marketing studies could only tell you something about the demand for products that actually exist. Cellular phones were a product that people had to imagine might exist.
  54. Pierce later remarked that one thing about Kelly impressed him above all else: It had to do with how his former boss would advise members of Bell Labs’ technical staff when they were asked to work on something new. Whether it was a radar technology for the military or solid-state research for the phone company, Kelly did not want to begin a project by focusing on what was known. He would want to begin by focusing on what was not known. As Pierce explained, the approach was both difficult and counterintuitive. It was more common practice, at least in the military, to proceed with what technology would allow and fill in the gaps afterward. Kelly’s tack was akin to saying: Locate the missing puzzle piece first. Then do the puzzle.
  55. Shannon had become wealthy, too, through friends in the technology industry. He owned significant shares in Hewlett-Packard, where his friend Barney Oliver ran the research labs, and was deeply invested in Teledyne, a conglomerate started by another friend, Henry Singleton. Shannon sat on Teledyne’s board of directors.
  56. “Ideas and plans are essential to innovation,” he remarked, “but the time has to be right.”
  57. “It is just plain silly,” he wrote, “to identify the new AT&T Bell Laboratories with the old Bell Telephone Laboratories just because the new Laboratories has inherited buildings, equipment and personnel from the old. The mission was absolutely essential to the research done at the old Laboratories, and that mission is gone and has not been replaced.”
  58. At the time of the breakup, in fact, it was widely assumed in the business press that IBM and AT&T would now struggle for supremacy. What undermined such an assumption was the historical record: Everything Bell Labs had ever made for AT&T had been channeled into a monopoly business. “One immediate problem for which no amount of corporate bulk can compensate is the firm’s lack of marketing expertise,” one journalist, Christopher Byron of Time, noted. It was a wise point. Bell Labs and AT&T had “never really had to sell anything.” And when they had tried—as was the case with the Picturephone—they failed. Government regulation, as AT&T had learned, could be immensely difficult to manage and comply with. But markets, they would soon discover, were simply brutal. AT&T’s leaders, such as CEO Charlie Brown, “had never had the experience or the training to compete,” Irwin Dorros, a former Bell Labs and AT&T executive, points out. “They tried to apply the skills that they grew up with, and it didn’t work.” In later years, the downsizing at Bell Labs, in terms of both purpose and people, would mostly be linked to this inability to compete.
  59. The purpose of innovation is sometimes defined as new technology. But the point of innovation isn’t really technology itself. The point of innovation is what new technology can do. “Better, or cheaper, or both”—Kelly’s rule—is one way to think about this goal.
  60. A large group of physicists, certainly, created a healthy flow of ideas. But Kelly believed the most valuable ideas arose when the large group of physicists bumped against other departments and disciplines, too. “It’s the interaction between fundamental science and applied science, and the interface between many disciplines, that creates new ideas,” explains Herwig Kogelnik, the laser scientist. This may indeed have been Kelly’s greatest insight.
  61. Eugene Kleiner, moreover, a founding partner at the premier venture capital firm Kleiner Perkins, was originally hired by Bill Shockley at his ill-fated semiconductor company. But the Silicon Valley process that Kleiner helped develop was a different innovation model from Bell Labs. It was not a factory of ideas; it was a geography of ideas. It was not one concentrated and powerful machine; it was the meshing of many interlocking small parts grouped physically near enough to one another so as to make an equally powerful machine. The Valley model, in fact, was soon so productive that it became a topic of study for sociologists and business professors. They soon bestowed upon the area the title of an “innovation hub.”
  62. “You may find a lot of controversy over how Bell Labs managed people,” John Mayo, the former Bell Labs president, says. “But keep in mind, I don’t think those managers saw it that way. They saw it as: How do you manage ideas? And that’s very different from managing people. So if you hear something negative about how John Pierce managed people, I’d say, well, that’s not surprising. Pierce wasn’t about managing people. Pierce was about managing ideas. And you cannot manage ideas and manage people the same way. It just doesn’t work. So if somebody tells you Pierce wasn’t a great manager . . . you say, of what?”
  63. Pierce, to put it simply, was asking himself: What about Bell Labs’ formula was timeless? In his 1997 list, he thought it boiled down to four things: A technically competent management all the way to the top. Researchers didn’t have to raise funds. Research on a topic or system could be and was supported for years. Research could be terminated without damning the researcher.
  64. What seems more likely, as the science writer Steven Johnson has noted in a broad study of scientific innovations, is that creative environments that foster a rich exchange of ideas are far more important in eliciting important new insights than are the forces of competition.
  65. To think long-term toward the revolutionary, and to simultaneously think near-term toward manufacturing, comprises the most vital of combinations.

What I got out of it

  1. It was great to learn how dominant AT&T was and how it structured the organization to take advantage of the talent at Bell Labs. The ingredients that stand out: building or inventing things that will ultimately have to go to market, having a diverse group of people who interact often, and Pierce’s four timeless factors: “A technically competent management all the way to the top. Researchers didn’t have to raise funds. Research on a topic or system could be and was supported for years. Research could be terminated without damning the researcher.”

The Soul of a New Machine by Tracy Kidder

Summary

  1. Kidder brings the computer revolution to life by chronicling day-to-day work inside Data General

Key Takeaways

  1. IBM set up two main divisions, each one representing the other’s main competition.
  2. Herb Richman, who had helped to found Data General, said, “We did everything well.” Obviously, they did not manage every side of their business better than everyone else, but these young men (all equipped with large egos, as one who was around them at this time remarked) somehow managed to realize that they had to attend with equal care to all sides of their operation—to the selling of their machine as well as to its design, for instance. That may seem an elementary rule for making money in a business, but it is one that is easier to state than to obey. Some notion of how shrewd they could be is perhaps revealed in the fact that they never tried to hoard a majority of the stock, but used it instead as a tool for growth. Many young entrepreneurs, confusing ownership with control, can’t bring themselves to do this.
  3. When they chose their lawyer, who would deal with the financial community for them, they insisted that he invest some of his own money in their company. “We don’t want you running away if we get in trouble. We want you there protecting your own money,”
  4. Richman also remembered that before they entered into negotiations over their second public offering of stock, after the company had been making money for a while and the stock they’d already issued had done very well indeed, their lawyer insisted that each of the founders sell some of their holdings in the company and each “take down a million bucks.” This so that they could negotiate without the dread of losing everything (“Having to go back to your father’s gas station,” Richman called that nightmare). As for the name of the theory behind selling enough stock to become millionaires, Richman told me, “I don’t know how you put it in the vernacular. We called it the Fuck You Theory.”
  5. “DEC owned 85 percent of the business and there was no strong number two. We had to distinguish ourselves from DEC,” Kluchman remembered. “DEC was known as a bland entity. Data General was gonna be unbland, aggressive, hustling, offering you more for your money…. We spread the idea that Data General’s salesmen were more aggressive than DEC’s, and they were, because ours worked on commissions and theirs worked on salaries. But I exaggerated the aggressiveness.” According to Kluchman, DEC actually gave them some help in setting up “the Hertz-Avis thing.” DEC’s management, he said, ordered their salesmen to warn their customers against Data General. “It was great! Because their customers hadn’t heard about us.”
  6. Where did the risks lie? Where could a company go badly wrong? In many cases, a small and daily growing computer company did not fall on hard times because people suddenly stopped wanting to buy its products. On the contrary, a company was more likely to asphyxiate on its own success. Demand for its products would be soaring, and the owners would be drawing up optimistic five-year plans, when all of a sudden something would go wrong with their system of production.
  7. You did not have to be the first company to produce the new kind of machine; sometimes, in fact, it was better not to be the first. But you had to produce yours before the new market really opened up and customers had made other marriages. For once they are lost, both old and prospective customers are often gone for good.
  8. Some of the engineers closest to West suspected that if he weren’t given a crisis to deal with once in a while, he would create one. To them he seemed so confident and happy in an emergency.
  9. By the mid-1960s, a trend that would become increasingly pronounced was already apparent: while the expense of building a computer’s hardware was steadily declining, the cost of creating both user and system software was rising. In an extremely bold stroke, IBM took advantage of the trend. They announced, in the mid-sixties, all at one time, an entire family of new computers—the famous 360 line. In the commerce of computers, no single event has had wider significance, except for the invention of the transistor. Part of the 360’s importance lay in the fact that all the machines in the family were software compatible.
  10. Software compatibility is a marvelous thing. That was the essential lesson West took away from his long talks with his friend in Marketing. You didn’t want to make a machine that wasn’t compatible, not if you could avoid it. Old customers would feel that since they’d need to buy and create all new software anyway, they might as well look at what other companies had to offer, they’d be likely to undertake the dreaded “market survey.” And an incompatible machine would not make it easy for new customers to buy both 16-bit Eclipses and the new machine.
  11. Kludge is perhaps the most disdainful term in the computer engineer’s vocabulary: it conjures up visions of a machine with wires hanging out of it, of things fastened together with adhesive tape.
  12. West had a saying: “The game around here is getting a machine out the door with your name on it.”
  13. Cray was a legend in computers, and in the movie Cray said that he liked to hire inexperienced engineers right out of school, because they do not usually know what’s supposed to be impossible. West liked that idea. He also realized, of course, that new graduates command smaller salaries than experienced engineers. Moreover, using novices might be another way in which to disguise his team’s real intentions. Who would believe that a bunch of completely inexperienced engineers could produce a major CPU to rival North Carolina’s?
  14. The term – West invented the term, not the practice – was “signing up.” By signing up for the project you agreed to do whatever was necessary for success. You agreed to forsake, if necessary, family, hobbies, and friends—if you had any of these left (and you might not if you had signed up too many times before). From a manager’s point of view, the practical virtues of the ritual were manifold. Labor was no longer coerced. Labor volunteered. When you signed up you in effect declared, “I want to do this job and I’ll give it my heart and soul.”
  15. How do such moments occur? “Hey,” Wallach said, “no one knows how that works.” He remembered that during the time when he was working on the Navy computer for Raytheon—the one that got built and then scrapped—he was at a wedding and the solution to a different sort of problem popped into his mind. He wrote it down quickly on the cover of a matchbook. “I will be constantly chugging away in my mind,” he explained, “making an exhaustive search of my data bank.”
  16. Much of the engineering of computers takes place in silence, while engineers pace in hallways or sit alone and gaze at blank pages. Alsing favored the porch and staring out at trees. When writing code, he said, he often felt that he was playing an intense game of chess with a worthy opponent. He went on: “Writing microcode is like nothing else in my life. For days there’s nothing coming out. The empty yellow pad sits there in front of me, reminding me of my inadequacy. Finally, it starts to come. I feel good. That feeds it, and finally I get into a mental state where I’m a microcode-writing machine. It’s like being in Adventure. Adventure’s a completely bogus world, but when you’re there, you’re there. “You have to understand the problem thoroughly and you have to have thought of all the myriad ways in which you can put your microverbs together. You have a hundred L-shaped blocks to build a building. You take all the pieces, put them together, pull them apart, put them together. After a while, you’re like a kid on a jungle gym. There are all these constructs in your mind and you can swing from one to the other with ease.
  17. “West’s not a technical genius. He’s perfect for making it all work. He’s gotta move forward. He doesn’t put off the tough problem, the way I do. He’s fearless, he’s a great politician, he’s arbitrary, sometimes he’s ruthless.”
  18. “One never explicitly plays by these rules.” And West remarked that there was no telling which rules might be real, because only de Castro made the rules that counted, and de Castro was once quoted as saying, “Well, I guess the only good strategy is one that no one else understands.”
  19. Not Everything Worth Doing Is Worth Doing Well.
  20. There’s no such thing as a perfect design. Most experienced computer engineers I talked to agreed that absorbing this simple lesson constitutes the first step in learning how to get machines out the door. Often, they said, it is the most talented engineers who have the hardest time learning when to stop striving for perfection. West was the voice from the cave, supplying that information: “Okay. It’s right. Ship it.”
  21. In fact, the team designed the computer in something like six months, and may have set a record for speed. The task was quite complex.
  22. That fall West had put a new term in his vocabulary. It was trust. “Trust is risk, and risk avoidance is the name of the game in business,” West said once, in praise of trust. He would bind his team with mutual trust, he had decided. When a person signed up to do a job for him, he would in turn trust that person to accomplish it; he wouldn’t break it down into little pieces and make the task small, easy and dull.
  23. “With Tom, it’s the last two percent that counts. What I now call ‘the ability to ship product’—to get it out the door.”
  24. Rasala liked a contentious atmosphere, a vigorous, virile give-and-take among himself and his crew. “Smart, opinionated and nonsensitive, that’s a Hardy Boy,” he declared. Above all, Rasala wanted around him engineers who took an interest in the entire computer, not just in the parts that they had designed.
  25. Firth had just begun to study programming, but the error was “just obvious” to him. Remembering this incident years later, Firth said that the engineer had probably been “programming by rote. He wanted to make his program look like programs he’d seen before, and that clearly wasn’t gonna work.” Firth always tried to avoid such an approach. “I like to work around ‘why,’ ” he told me. “I prefer not to know the established limits and what other people think, when I start a project.”
  26. He also said: “No one ever pats anybody on the back around here. If de Castro ever patted me on the back, I’d probably quit.”
  27. The clerk had some trouble figuring what the beer we bought ought to cost, and as we left, West said, out of her earshot, “Ummmmh, one of the problems with machines like that. You end up making people so dumb they can’t figure out how many six-packs are in a case of beer.”
  28. West didn’t seem to like many of the fruits of the age of the transistor. Of machines he had helped to build, he said, “If you start getting interested in the last one, then you’re dead.” But there was more to it. “The old things, I can’t bear to look at them. They’re clumsy. I can’t believe we were that dumb.” He spoke about the rapidity with which computers became obsolete. “You spend all this time designing one machine and it’s only a hot box for two years, and it has all the useful life of a washing machine.” He said, “I’ve seen too many machines.” One winter night, at his home, while he was stirring up the logs in his fireplace, he muttered, “Computers are irrelevant.”
  29. “It doesn’t matter how hard you work on something,” says Holberger. “What counts is finishing and having it work.”
  30. “I get quite a lot of work done in the morning while taking a shower,” says Veres. “Showers are kinda boring things, all things considered.” Now in the shower, before leaving for work, he conceives a new approach.
  31. “The way West was with us, it provided a one-level separation—someone far enough away to lay blame on.”
  32. At one point, Jim Guyer said: “We didn’t get our commitment to this project from de Castro or Carman or West. We got it from within ourselves. Nobody told us we had to put extra effort into the project.” Ken Holberger burst out laughing. Guyer raised his voice. “We got it from within ourselves to put extra effort in the project.” Laughing hard, Holberger managed to blurt out, “Their idea was piped into our minds!” “The company didn’t ask for this machine,” cried Guyer. “We gave it to them. We created that design.” Others raised their voices. Quietly, Rasala said, “West created that design.”
  33. Engineers are supposed to stand among the privileged members of industrial enterprises, but several studies suggest that a fairly large percentage of engineers in America are not content with their jobs. Among the reasons cited are the nature of the jobs themselves and the restrictive ways in which they are managed. Among the terms used to describe their malaise are declining technical challenge; misutilization; limited freedom of action; tight control of working patterns.
  34. “He set up the opportunity and he didn’t stand in anyone’s way. He wasn’t out there patting people on the back. But I’ve been in the world too long and known too many bosses who won’t allow you the opportunity. He never put one restriction on me. Tom allowed me to take a role where I could make things happen. What does a secretary do? She types, answers the phone, and doesn’t put herself out too much. He let me go out and see what I could get done. You see, he allowed me to be more than a secretary there.
  35. West never passed up an opportunity to add flavor to the project. He helped to transform a dispute among engineers into a virtual War of the Roses. He created, as Rasala put it, a seemingly endless series of “brushfires,” and got his staff charged up about putting them out. He was always finding romance and excitement in the seemingly ordinary. He welcomed a journalist to observe his team; and how it did delight him when one of the so-called kids remarked to me, “What we’re doing must be important, if there’s a writer covering it.”
  36. West sits in his office and declares, “The only way I can do this machine is in this crazy environment, where I can basically do it any way that I want.”
  37. Steve Wallach gave the speech he had once dreaded, describing Eagle’s architecture to a jury of peers, at a meeting of a society of computer professionals, and when he was done, they got up and applauded—“the ultimate reward,” he said.

What I got out of it

  1. Really insightful read on a company and time I didn’t know much about. West seems to have been an amazing leader, someone who was able to inspire his team to do great things quickly, ship them out the door, and make his idea their idea – the keystone for any leader

Fiber: The Coming Tech Revolution and Why America Might Miss It by Susan Crawford

Summary

  1. Susan Crawford goes into why fiber is superior to copper and cable and why making it ubiquitous in the US is so important.

Key Takeaways

  1. What China, Singapore, the Nordic countries, Korea, Japan, and Hong Kong have that other developed countries don’t is last-mile fiber going into residents’ homes. If copper wire is a 2-inch-wide pipe, the fiber being used in these countries is like a 15-mile-wide river – that is how superior data transmission over fiber is compared to copper
  2. The US is falling behind in this instantaneous connectivity, which could hurt us as other countries such as China move ahead and are able to iterate and innovate faster with nearly zero-latency connectivity. Just as the installation of electric lightbulbs was a wedge for other electric appliances and innovations, this instant connectivity will open up huge markets
  3. The problem with fiber is not capacity or longevity (electricity, water, and hardly anything else can damage it) but distribution: everybody who wants access to the fiber has to be directly coupled into it, or close enough that wireless signals can cover the last stretch
  4. Only about 14% of connections in the US are fiber-based, whereas fiber is the norm in Singapore. In addition, US fiber connections are extremely expensive and difficult to come by unless you live in a very rich area. The US was the global leader when it came to copper but is falling far behind in fiber compared to other developed nations. The problem lies in latency and scarcity.
  5. Copper has to be close to the central source and is subject to interference, and cable will never be as frictionless as glass. Glass doesn’t have to be amplified and can carry far more data than copper or cable; it is also more flexible and easier to upgrade
  6. The world is going wireless but fiber is still vital. Wireless needs wires to travel any distance whatsoever. They are complementary, mutually beneficial. Only fiber will be able to handle the flood of data that comes when everyone is connected, mobile and able to access constant and fast connectivity
  7. 5G hopes to use multipath encoding over 10 MHz channels, which helps encode more information on the same frequencies and push up against the Shannon limit (a short capacity sketch follows this list). However, this requires a huge investment in towers and base stations: AT&T needed about 30,000 base stations to amplify, encode, and send out signals for its 4G rollout, but it is estimated that 10 million base stations would be needed for full 5G coverage. Only fiber can handle the capacity needed to make this happen. It may sound paradoxical, but the future of wireless depends on fiber
  8. Stockholm is leading the way in creating ubiquitous, cheap, and fast connectivity. The city considers connectivity a basic right; the government paid to lay down the infrastructure and then leased the fiber out to companies to recoup its investment. This has been immensely profitable – throwing off nearly $30 million in free cash flow per year, which is being used to expand the service and to subsidize other city goals
  9. The cities and countries that are able to make fiber utility-like will have a leg up in terms of economic growth and innovation
  10. The great capital investment needed to install fiber is sometimes the choke point, and often the return is not directly measurable. Like electricity did for electrical innovations, constant connectivity from fiber spurs creativity, innovation, and growth – the backbone of a healthy and growing economy
  11. Although laying down fiber is capital intensive, 80% of the cost is labor, which means a lot of jobs would be created. Also, although it typically takes close to 10 years to repay the initial investment, the returns after that are quite healthy: since additional investment is limited, it is mostly straight cash flow from then on
  12. Upfront capital costs are a big negative, as are current state laws and regulations
  13. The author ends with a good analogy: just as railroads, and later the highways, opened up the country and spurred growth, fiber is the next highway we need to build
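
A quick aside on the Shannon limit mentioned in point 7: the Shannon-Hartley theorem caps the error-free data rate of any single channel at C = B · log2(1 + S/N). Here is a minimal sketch of that arithmetic (the bandwidth and SNR values are my own illustrative assumptions, not figures from the book):

    from math import log2

    def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
        """Shannon-Hartley limit: the maximum error-free bit rate of a
        channel with the given bandwidth and signal-to-noise ratio."""
        return bandwidth_hz * log2(1 + snr_linear)

    # Illustrative only: a 10 MHz channel at 30 dB SNR.
    bandwidth = 10e6               # 10 MHz, echoing the 5G example above
    snr = 10 ** (30 / 10)          # 30 dB converted to a linear ratio
    print(f"{shannon_capacity_bps(bandwidth, snr) / 1e6:.1f} Mbit/s")  # ~99.7

No encoding scheme can beat this figure on such a channel, which is why 5G’s capacity gains have to come from more spectrum, more antennas, and vastly more base stations – all of which, per the book’s argument, need fiber behind them.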

What I got out of it

  1. I thought the author was a bit dramatic and repeated points, but the central point is important: ubiquitous fast connectivity spurs innovation and creativity, the backbone of a healthy economy

Blitzscaling: The Lightning-Fast Path to Building Massively Valuable Companies by Reid Hoffman, Chris Yeh

Summary

  1. Blitzscaling is when you put speed over efficiency, even in the face of uncertainty. The constant, fast feedback this generates will help you adapt, evolve, and move forward faster than your competitors. Getting this feedback early and moving quickly on it is the name of the game – especially if you are a platform with a two-sided model. Blitzscaling is a risky decision but, if your competitor has taken this path, it is less risky than doing nothing. This book will walk you through how to do it, when to do it, why to do it, and what it looks like. The costs and inefficiencies are worth it because the downside of not blitzscaling when new technology enables it is far greater – irrelevance.

Key Takeaways

  • Blitzscaling Overview
    • Blitzscaling will help you make better decisions in situations where speed is the ultimate superpower
    • Blitzscaling works as both offense and defense – you can catch competitors off guard, and if you don’t blitzscale, you might not survive. You can leverage your initial competitive advantage into a long-term one before the market and competitors can respond. You can get easier access to capital, as investors prefer to back market leaders, allowing you to further your advantage over competitors. Blitzscaling allows you to set the playing field to your advantage
  1. McKinsey found that companies growing at 60% when they reach $100 million in revenue are 8x more likely to reach $1 billion than those of similar size growing at 20%. They have first-scaler advantage. At this point, the ecosystem around the massive company recognizes it as the market leader and shifts its behavior to better suit it, which leads to positive feedback loops
  2. Startups, just like certain materials and chemicals, go through phase changes. A dominant global leader is not simply a startup times a thousand; it is a fundamentally different machine. Just as ice skates are useless on water, the tactics used in the startup phase may be useless once you have achieved product-market fit.
  3. The five phases of Blitzscaling: Family, tribe, village, city, nation
  4. The first step is creating a business model that can grow. This sounds elementary, but it’s amazing how many startup founders miss this simple piece. You must have a business model that can scale or else it’ll break before you can reach dominance. Business model innovation is more important than most people think, as technology today is not the differentiator it used to be. Most great startups are like Tesla, which combined existing technologies in a unique and special way, rather than like SpaceX, which had to invent its own technologies
  5. Blitzscaling is a strategic innovation and hurls much common wisdom out the window. Founders should only blitzscale when they determine that the most important factor in their company’s survival is speed into the market. It is a big bet but can pay off handsomely.
  6. The revenue model doesn’t have to be perfect when you start. Your only goal is to scale into a market that is winner-take-all or winner-take-most. However, not every company should blitzscale – not if product-market fit isn’t there or if the business model isn’t there
  7. You should blitzscale when there’s a big new opportunity – when the size of the market and gross margins overlap to create potentially huge value – and when there is no dominant market leader or oligopoly controlling the market
  8. Another way to think about blitzscaling is by climbing learning curves faster than others
  9. Blitzscaling is not meant to go on forever. You should stop when growth slows, when average revenue per employee slows, when gross margins begin to decline, and when other similar leading indicators show that your growth is slowing. You should also slow down when you are reaching the upper bounds of a market
  10. In blitzscaling mode, raise more cash (much more cash) than you think you’ll need – typically enough for 18 to 24 months of survival. When trying to raise money, nothing is more powerful than not needing it. Only spend money on things that are life-or-death if not solved
  11. As startups blitzscale, they have to balance responsibility with their power
  12. Try to partner with currently blitzscaling companies or companies which have blitzscaled in the past
  13. Managing Growth
    1. The role and skills needed by the CEO and top management are different for every level and size of the startup. It is never static and is always changing
    2. Business model growth factors
      1. Market size – paying customers, great distribution, fixed and expanding margins
      2. Distribution – leveraging existing networks, virality
      3. High gross margins – more revenues lead to more cash on hand which can be put to use, more attractive to investors
      4. Network effects – direct, indirect, two-sided, local, compatibility and standards
    3. There are two growth limiters: product market fit and operational scalability
    4. 8 key transitions 
      1. From small teams to large teams. This can be a tough psychological change for founders and early employees as it is now impossible to be part of every decision and have clarity into every department 
      2. From generalists to specialists 
      3. From managers to executives. Executives organize and lead managers, and managers execute day-to-day operations. Hire people who are known to at least one current team member, start them small and let them prove their value and gain others’ trust, then think about promoting them
      4. From dialogue to broadcasting. Establishing formal and consistent communication is extremely important as you grow. Chesky sends out a weekly email on Sunday nights that highlights growth metrics but also gives the team updates and clarity on how the company is doing and other important topics so that everyone continues to feel involved and informed
      5. From improvisation and inspiration to data. At the beginning you have no customers to listen to, but over time you have to track team metrics and analyze the data so that you can improve and adapt. Track raw user engagement and churn to begin with, and then customize and go deeper as necessary for your product or service. No company should have more than 3 to 5 metrics, as more tends to lead to confusion. What matters isn’t necessarily what data you collect but what data gets presented to decision-makers.
      6. From single-threading to multithreading. The author doesn’t know of one startup that didn’t start out singularly focused. They can branch out from there, but it is important to have a deep focus when you’re first getting started
      7. From pirate to navy. From continuous offense to a blend of offense and defense. You must strike a balance between the power of being small and nimble and the benefits of being large and having scale. Much like JBS Haldane stipulates, you are fundamentally different when you scale. You can’t run a city the same way you run a tribe and you can’t run a nation the same way you run a city
      8. From founder to leader. Your role as the founder will change as the company scales and grows, and you must adapt to it or you won’t be serving the company as it needs you to. You have to keep your personal learning curve ahead of the business’s growth curve. There are three ways to scale yourself: delegation, amplification, and simply getting better.
    5. Do things that don’t scale when you’re growing quickly. It might be better to use a hack that you’ll have to throw away later than to take your time writing an elegant piece of software
    6. Ignore your customers – at least at this stage in your growth. Provide whatever customer service you can that doesn’t slow you down – most likely, this will be no customer service. However, you cannot ignore culture: a strong culture is really important and is defined by consistent values and actions across the company. The executive in charge of the functional area that drives the company’s culture tends to be the natural successor to the CEO
  14. Awesome analysis of Zara, the clothing retailer, which uses blitzscaling techniques. Although it is a retailer, Zara uses speed to its advantage and focuses less on efficiency
  15. Incumbents have some natural advantages such as scale, the power and resources to continuously innovate, longevity, and mergers and acquisitions but the disadvantages include poorly aligned incentives, managerial overhang, lack of risk appetite, public pressure since they’re publicly traded, etc.
  16. A good way to gauge risk is by thinking through knowns versus unknowns and systemic versus non-systemic risk. Then: act immediately if there’s some big systemic risk; do something short-term to patch the problem; note the problem now so that you can solve it later; or let it burn (if it is unknown and non-systemic)
  17. Instability and change are the new norm, and the only way to thrive is to adapt faster than the change around you. Be an infinite learner; be a first responder who is willing and able to act; veer toward industries, people, and companies that are biased toward blitzscaling, as this is where the fastest and biggest growth lies
  18. Other
    1. Real value is created when innovative technologies allow innovative products/services, with an innovative business model, to emerge
    2. It’s important to differentiate between first mover advantage and first scaler advantage. First movers often die but successful first scalers tend to achieve a very powerful position
    3. Do everything by hand until it’s too painful. Then automate it
    4. Common patterns of dominant businesses:
      1. Bits versus atoms (software/digital rather than physical)
      2. Platforms
      3. Free or freemium
      4. Marketplaces
      5. Subscriptions
      6. Digital goods
      7. Newsfeeds which drive user engagement and retention
    5. You must focus on adaptation rather than optimization
    6. You should always have a Plan B and a Plan Z to fall back on: Plan B in case your first option doesn’t work out, and Plan Z in case the worst-case scenario comes up
    7. In the early days prioritize hiring those who can add value immediately and not the absolute perfect candidate
    8. Tolerate bad management. At the beginning it is more important to move quickly than to have perfect organization and processes in place
    9. Launch a product that embarrasses you. You don’t want to wait until it’s perfect; you want to get it out and see what the market thinks of it
    10. You have to listen to your customers – not only to what they say, but also to know when to ignore them; you must learn to blend data and intuition
    11. You have to know which fires to fight, which ones to say no to, and which ones you actually have some control over. Only then can you know which problems to tackle and in which order. Distribution, product, customer service, and operations are some of the most important

What I got out of it

  • Blitzscaling is the pursuit of growth and speed, even in the face of uncertainty. It is a big gamble but is necessary sometimes if coming to market first, fastest, and biggest gives you a shot at owning a big market. A great playbook for anybody thinking about pursuing this strategy

Increasing Returns and Path Dependence in the Economy by Brian Arthur

Summary

  1. The idea of increasing returns has come up every few decades but Brian Arthur’s precise and fully-modeled papers caused us to clearly understand what kinds of models have what kinds of implications. One outstanding characteristic of Arthur’s viewpoint is that it is emphatically dynamic in nature. Learning by using or doing plays an essential role, as opposed to static examples of returns to scale (those based on volume-area relations). The object of study is a history. Another distinctive feature of most of the work is its stochastic character. This permits emphasis on the importance of random deviations for long-run tendencies. Other tendencies include the multiplicity of possible long-run states, depending on initial conditions and on random fluctuations over time, and the specialization (in terms of process or geographical location) in an outcome achieved. Increasing returns may also serve as a reinforcement for early leading positions and so act in a manner parallel to more standard forms of increasing returns. A similar phenomenon occurs even in individual learning, where again successes reinforce some courses of action and inhibit others, thereby causing the first to be used more intensively, and so forth. There are in all of these models opposing tendencies, some toward achieving an optimum, some toward locking in on inefficient forms of behavior.

Key Takeaways

  1. The papers here reflect two convictions I have held since I started work in this area. The first is that increasing returns problems tend to show common properties and raise similar difficulties and issues wherever they occur in economics. The second is that the key obstacle to an increasing returns economics has been the “selection problem” – determining how an equilibrium comes to be selected over time when there are multiple equilibria to choose from. Thus the papers here explore these common properties – common themes – of increasing returns in depth. And several of them develop methods, mostly probabilistic, to solve the crucial problem of equilibrium selection. 
  2. Arthur studied electrical engineering, so he was already vaguely familiar with positive feedback. He became more intrigued when he read about the history of the discovery of the structure of DNA, read whatever he could about molecular biology and enzyme reactions, and followed these threads back to the domain of physics. In this work, outcomes were not predictable, problems might have more than one solution, and chance events might determine the future rather than be averaged away. The key to this work, I realized, lay not in the domain of the science it was dealing with, whether laser theory, or thermodynamics, or enzyme kinetics. It lay in the fact that these were processes driven by some form of self-reinforcement, or positive feedback, or cumulative causation – processes, in economic terms, that were driven by nonconvexities. Here was a framework that could handle increasing returns.
    1. Great discoveries tend to come from outside the field 
  3. Polya process – a path-dependent process in probability theory in which early random draws are reinforced and so shape the long-run outcome (a toy simulation appears after this list)
  4. In looking back on the difficulties in publishing these papers, I realize that I was naive in expecting that they would be welcomed immediately in the journals. The field of economics is notoriously slow to open itself to ideas that are different. The problem, I believe, is not that journal editors are hostile to new ideas. The lack of openness stems instead from a belief embedded deep within our profession that economics consists of rigorous deductions based on a fixed set of foundational assumptions about human behavior and economic institutions. If the assumptions that mirror reality are indeed etched in marble somewhere, and apply uniformly to all economics problems, and we know what they are, there is of course no need to explore the consequences of others. But this is not the case. The assumptions economists need to use vary with the context of the problem and cannot be reduced to a standard set. Yet, at any time in the profession, a standard set seems to dominate. I am sure this state of affairs is unhealthy. It deters many economists, especially younger ones, from attempting approaches or problems that are different. It encourages use of the standard assumptions in applications where they are not appropriate. And it leaves us open to the charge that economics is rigorous deduction based upon faulty assumptions. At this stage of its development, economics does not need orthodoxy and narrowness; it needs openness and courage.
  5. I did not set out with an intended direction but if I have had a constant purpose it is to show that transformation, change, and messiness are natural in the economy. The increasing-returns world in economics is a world where dynamics, not statics, are natural; a world of evolution rather than equilibrium; a world of probability and chance events. Above all, it is a world of process and pattern change
  6. Positive Feedbacks in the Economy
    1. Diminishing returns, which conventional economic theory is built around, imply a single equilibrium point for the economy, but positive feedback – increasing returns – makes for many possible equilibrium points. There is no guarantee that the particular economic outcome selected from among the many alternatives will be the “best” one. Furthermore, once random economic events select a particular path, the choice may become locked in regardless of the advantages of the alternatives
    2. Increasing returns do not apply across the board – the resource-based portions of the economy, agriculture and mining, are subject to diminishing returns caused by limited amounts of fertile land or high-quality deposits. However, areas of the economy that are knowledge-based are largely subject to increasing returns. Even the production of aircraft is subject to increasing returns – it takes a large initial investment, but each plane after that costs only a fraction of it. In addition, producing more units means gaining more experience in the manufacturing process and achieving greater understanding of how to produce additional units even more cheaply. Moreover, experience gained with one product or technology can make it easier to produce new products incorporating similar or related technologies. Not only do the costs of producing high-technology products fall as a company makes more of them, but the benefits of using them increase. Many items such as computers or telecommunications equipment work in networks that require compatibility; when one brand gains a significant market share, people have a strong incentive to buy more of the same product so as to be able to exchange information with those using it already.
    3. Timing is important too in the sense that getting into an industry that is close to being locked in makes little sense. However, early superiority does not correlate with long term fitness 
    4. Like punctuated equilibrium, most of the time the perturbations are averaged away but once in a while they become all important in tilting parts of the economy into new structures and patterns that are then preserved and built on in a fresh layer of development 
  7. Competing technologies, increasing returns, and lock-in by historical events 
    1. There is indeterminacy of outcome and nonergodicity (path dependence, where small events cumulate to make the system gravitate toward one outcome rather than others). There may be potential inefficiency and nonpredictability. Although individual choices are rational, there is no guarantee that the side selected is, from any long-term viewpoint, the better of the two. The dynamics thus take on an evolutionary flavor with a “founder effect” mechanism akin to that in genetics (the second sketch after this list simulates this dynamic)
  8. Path dependent processes and the emergence of macrostructure 
    1. Many situations dominated by increasing returns are most usefully modeled as dynamic processes with random events and natural positive feedbacks or nonlinearities. We call these nonlinear Polya processes and show that they can model a wide variety of increasing returns and positive feedback problems. In the presence of increasing returns or self-reinforcement, a nonlinear Polya process typically displays a multiplicity of possible asymptotic outcomes. Early random fluctuations cumulate and are magnified or attenuated by the inherent nonlinearities of the process. By studying how these build up as the dynamics of the process unfold, we can observe how an asymptotic outcome becomes “selected” over time
    2. Very often individual technologies show increasing returns to adoption – the more they are adopted, the more is learned about them; in turn, the more they are improved, and the more attractive they become. Very often, too, there are several technologies that compete for shares of a “market” of potential adopters
  9. Industry location patterns and the importance of history 
    1. This study shows that it is possible to put a theoretical basis under the historical-accident-plus-agglomeration argument (a mostly arbitrary event determines where a city is established, but then more people flock to it, it receives more investment, more buildings go up, etc., which leads to agglomeration and increasing returns).
  10. Information Contagion
    1. When a prospective buyer is making purchasing decisions among several available technically-based products, choosing among different computer workstations, say, they often augment whatever publicly available information they can find by asking previous purchasers about their experiences – which product they chose, and how it is working for them. This is a natural and reasonable procedure; it adds information that is hard to come by otherwise. But it also introduces an “information feedback” into the process whereby products compete for market share. The products new purchasers learn about depend on which products the previous purchasers “polled” or sampled and decided to buy. They are therefore likely to learn more about a commonly purchased product than one with few previous users. Hence, where buyers are risk-averse and tend to favor products they know more about, products that by chance win market share early on gain an information-feedback advantage. Under certain circumstances a product may come to dominate by this advantage alone. This is the information contagion phenomenon
  11. Self-Reinforcing Mechanisms in Economics
    1. Dynamical systems of the self-reinforcing or autocatalytic type – systems with local positive feedbacks – in physics, chemical kinetics, and theoretical biology tend to possess a multiplicity of asymptotic states or possible “emergent structures”. The initial starting state combined with early random events or fluctuations acts to push the dynamics into the domain of one of these asymptotic states and thus to “select” the structure that the system eventually “locks into”. 
    2. Self-reinforcing mechanisms are variants of or derive from four generic sources:
      1. Large set up or fixed costs (which give the advantage of falling unit costs to increased output)
      2. Learning effects (which act to improve products or lower their cost as their prevalence increases)
      3. Coordination effects (which confer advantages to “going along” with other economic agents taking similar action)
      4. Self-reinforcing expectations (where increased prevalence on the market enhances beliefs of further prevalence)
    3. Besides these four sources, we might note other analogies with physical and biological systems. The market starts out even, symmetric, yet it ends up asymmetric: there is “symmetry breaking.” An “order” or pattern in market shares “emerges” through initial market “fluctuations.” The two technologies compete to occupy one “niche” and the one that gets ahead exercises “competitive exclusion” on its rival. And if one technology is inherently superior and appeals to a larger proportion of purchasers, it is more likely to persist: it possesses “selectional advantage.”
    4. Some more characteristics: multiple equilibria (multiple “solutions” are possible but the outcome is indeterminate, not unique and predictable); possible inefficiency, lock-in, path dependence
    5. We can say that the particular equilibrium is locked in to a degree measurable by the minimum cost to effect changeover to an alternative equilibrium. In many economic systems, lock-in happens dynamically, as sequential decisions “groove” out an advantage that the system finds it hard to escape from. Exiting lock-in is difficult and depends on the degree to which the advantages accrued by the inferior “equilibrium” are reversible or transferable to an alternative one. It is difficult when learning effects and specialized fixed costs are the source of reinforcement. Where coordination effects are the source of lock-in, often advantages are transferable. As long as each user has certainty that the others also prefer the alternative, each will decide independently to “switch”. Inertia must be overcome though because few individuals dare change in case others do not follow
  12. Path Dependence, Self-Reinforcement, and Human Learning
    1. There is a strong connection between increasing returns mechanisms and learning problems. Learning can be viewed as competition among beliefs or actions, with some reinforced and others weakened as fresh evidence and data are obtained. But as such, the learning process may then lock in to actions that are neither optimal nor predictable, through the influence of small events
    2. What makes this iterated-choice problem interesting is the tension between exploitation of knowledge gained and exploration of poorly understood actions (the third sketch after this list illustrates this tension). At the beginning many actions will be explored or tried out in an attempt to gain information on their consequences. But in the desire to gain payoff, the agent will begin to emphasize or exploit the “better” ones as they come to the fore. This reinforcement of “good” actions is both natural and economically realistic in this iterated-choice context, and any reasonable algorithm will be forced to take account of it.
  13. Strategic Pricing in Markets and Increasing Returns
    1. Overall, we find that producers’ discount rates are crucial in determining whether the market structure is stable or unstable. High discount rates damp the effect of self-reinforcement and lead to a balanced market, while low discount rates enhance it and destabilize the market. Under high discount rates, firms that achieve a large market share quickly lose it again by pricing high to exploit their position for near-term profit. And so, in this case the market stabilizes. Under low discount rates, firms price aggressively as they struggle to lock in a future dominant position; and when the market is close to balanced shares, each drops its price heavily in the hope of reaping future monopoly rents. The result is a strong effort by each firm to “tilt” the market in its favor, and to hold it in an asymmetric position if successful. And so, in this case strategic pricing destabilizes the market
    2. The simple dynamics and stochastic model of market competition analyzed in this paper reveals striking properties. First, positive feedback or self-reinforcement to market share may result in bistable stationary distributions with higher probabilities assigned to asymmetric market shares. The stronger the positive feedback, the lower the probability of passing from the region of relative prevalence of one product to that of the other. Second, when producers can influence purchase probabilities by prices, in the presence of positive feedback, optimal pricing is highly state-dependent. The producers struggle for market shares by lowering prices, especially near pivot states with balanced shares. 
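
The first sketch is a toy Polya urn (takeaways 3 and 8): each draw adds another ball of the color drawn, so early luck is self-reinforcing, and different runs settle into very different long-run shares. The setup is my own minimal illustration, not a model from the book:

    import random

    def polya_urn(steps: int, seed: int) -> float:
        """Start with one red and one white ball; each draw returns the ball
        plus one more of the same color (self-reinforcement). Returns the
        final share of red balls."""
        rng = random.Random(seed)
        red, white = 1, 1
        for _ in range(steps):
            if rng.random() < red / (red + white):
                red += 1
            else:
                white += 1
        return red / (red + white)

    # Same process, different early luck -> different long-run shares.
    print([round(polya_urn(10_000, seed), 3) for seed in range(5)])

Within any single run the share settles down, but which share it settles to depends on the early draws – path dependence in miniature.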
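The second sketch is an Arthur-style competing-technologies model (takeaways 6-8 and the information-contagion story): two agent types arrive in random order, each naturally leaning toward a different technology, but every prior adoption raises a technology's payoff. The payoff numbers and returns coefficient are assumptions for illustration, not parameters from the papers:

    import random

    def competing_technologies(arrivals: int, seed: int) -> str:
        """R-agents lean toward technology A, S-agents toward B, but each
        unit of prior adoption adds r to a technology's payoff (increasing
        returns). Once one installed base is big enough, both agent types
        choose it -- the market locks in."""
        rng = random.Random(seed)
        n_a = n_b = 0
        r = 0.1                        # returns to adoption (assumed)
        for _ in range(arrivals):
            if rng.random() < 0.5:     # an R-agent: base payoffs favor A
                choose_a = 1.0 + r * n_a >= 0.8 + r * n_b
            else:                      # an S-agent: base payoffs favor B
                choose_a = 0.8 + r * n_a >= 1.0 + r * n_b
            if choose_a:
                n_a += 1
            else:
                n_b += 1
        return f"A: {n_a}, B: {n_b}"

    # Identical parameters, different arrival orders -> often different winners.
    for seed in range(5):
        print(seed, competing_technologies(2000, seed))

Whichever technology gets a couple of adopters ahead converts even the agents who naturally prefer the rival, so every individual choice is rational yet the outcome is indeterminate in advance and possibly inefficient – lock-in by small historical events.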
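The third sketch makes the exploration/exploitation tension of takeaway 12 concrete with a simple two-action learning loop (a bandit-style illustration of my own, not from the book): pure exploitation of early winners can lock in to the worse action, while a little exploration usually escapes.

    import random

    def iterated_choice(explore_prob: float, seed: int) -> int:
        """Two actions with true mean payoffs 0.5 and 0.6. The agent keeps a
        running payoff estimate per action and usually exploits the
        best-looking one, exploring at random with probability explore_prob.
        Returns the action it believes best at the end (1 is truly better)."""
        rng = random.Random(seed)
        means = [0.5, 0.6]
        est, n = [0.0, 0.0], [0, 0]
        for _ in range(5000):
            if rng.random() < explore_prob or 0 in n:
                a = rng.randrange(2)                # explore
            else:
                a = 0 if est[0] > est[1] else 1     # exploit current belief
            payoff = means[a] + rng.uniform(-0.5, 0.5)
            n[a] += 1
            est[a] += (payoff - est[a]) / n[a]      # incremental mean
        return 0 if est[0] > est[1] else 1

    # Pure exploitation sometimes locks in to the inferior action 0;
    # 10% exploration almost always settles on the better action 1.
    print([iterated_choice(0.0, s) for s in range(10)])
    print([iterated_choice(0.1, s) for s in range(10)])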

What I got out of it

  1. Influential read discussing self-reinforcement, lock-in, increasing returns in knowledge-based economies/industries, path dependence, and more. Extremely applicable for business, investing, economics, learning, and more. A great mental model to have in your toolbox

The Nature of Technology: What it is and How it Evolves by Brian Arthur

Summary

  1. This book is an argument about what technology is and how it evolves. Technologies are put together from pieces – themselves technologies – that already exist. Technologies therefore share ancestry and are combined, and combined again, to create further technologies. Technology evolves much as a coral reef builds itself from the activities of small organisms – it creates itself from itself; all technologies are descended from earlier technologies. Technologies are not “inventions” that come from nowhere, so in a sense technology created itself

Key Takeaways

  1. Technology, Evolution, Recursion, Phenomena
    1. Technologies have a recursive structure and collectively advance by capturing phenomena and putting them to use. The economy arises from technologies and therefore issues forth from all these capturings of phenomena and their subsequent combinations
    2. We are caught between two huge and unconscious forces: our deepest hope as humans lies in technology, but our deepest trust lies in nature. These forces are like tectonic plates grinding inexorably into each other in one long, slow collision. The collision is not new, but more than anything else it is defining our era. Technology is steadily creating the dominant issues and upheavals of our time. We are moving from an era where machines enhance the natural to one that brings in technologies that resemble or replace the natural. As we learn to use these technologies, we are moving from using nature to intervening directly within nature. And so the story of the century will be about the clash between what technology offers and what we feel comfortable with.
    3. We have great understanding of individual technologies but very little in the way of a general understanding, much as in 1800 there was great understanding of the family relationships among animals but few principles, like evolution, to hold all this knowledge together. Missing, in other words, is a theory of technology – an “-ology” of technology
    4. For me, how technology evolves is the central question in technology, because if we could understand its evolution we could understand that most mysterious of processes: innovation. Combination drives change, or at least the innovation of technology. Invention proceeds from the constructive assimilation of pre-existing elements into new syntheses, so the very cumulation of earlier technologies begets further cumulation. The more there is to invent with, the greater will be the number of inventions. These two pieces lead to a theory of the evolution of technology: novel technologies arise by combination of existing technologies, and existing technologies beget further technologies.
    5. Why we are seeing change, innovations, disruption at levels never before seen – there are more building blocks than ever before that can be combined and recombined in new ways, leading to new innovations. This trend seems likely only to continue
    6. The change in vision I am proposing is from standalone technologies, each with a fixed purpose, to seeing them as objects that can be formed into endless new combinations. These technologies can be easily combined and they form building blocks which can be used again and again. Technology, once a means of production, is becoming a chemistry
    7. Arthur gives three definitions of technology:
      1. A means to fulfill a human purpose: a device, method, or process
      2. An assemblage of practices and components
      3. The entire collection of devices and practices available to a culture
      4. Common to all three are combination, recursiveness, and reliance on one or more natural effects
    8. Technology consists of parts organized into component systems or modules and some of these form the central assembly and others have supporting functions. This is a general rule: what starts as a series of parts loosely strung together, if used heavily enough, congeals into a self-contained unit. The modules of technology over time become standardized units. In this sense technologies have a recursive structure as they consist of technologies within technologies all the way down to the elemental parts. There is no characteristic scale for technology as every technology stands ready, at least potentially, to become a component in further technologies at a higher level 
    9. Combination is inherently a very disciplined process as all these different modules must not only work together but further the primary function 
    10. Just as higher-level technologies are composed of a series of assemblies and subassemblies, they are also composed of a series of natural phenomena. A technology may rest on one or two phenomena (a truck uses the burning of fuel and low friction to roll) or on several (detecting planets that are too far away to see directly). In either case, it is combinations of natural effects that we exploit for greater technology.
    11. Phenomena are the source of all technologies. The essence of technology lies in capturing phenomena and orchestrating them to fulfill a purpose. Phenomena, or simply natural effects, exist independently of humans and of technology; they have no use attached to them. A principle, by contrast, is the idea of the use of a phenomenon for some purpose, and it exists very much in the world of humans and of use. In practice, before phenomena can be used for technology, they must be harnessed and set up to work properly. They can barely be used in raw form; they must be coaxed to operate satisfactorily and may only work in a narrow range of conditions, so the right combination of means to set them up for the purpose intended must first be found. A practical technology therefore consists of many phenomena working together. Technology can then be thought of as a collection of phenomena captured and put to use – in its essence, certain phenomena programmed for some purpose. Technology can then be seen as a metabolism where the phenomena are the genes of technology – they interact in complex ways and converse with each other, much as subroutines in computer programs call each other. Biology programs genes into myriad structures, and technology programs phenomena to myriad uses.
    12. I like to think of phenomena as hidden underground – not available until discovered and mined into. This is general with phenomena: a family of phenomena is mined into over time. Some effects uncovered earlier create methods and understandings that help uncover later ones. One effect leads to another, then to another, until eventually a whole vein of related phenomena has been mined into. A family of effects forms a set of chambers connected by seams and passageways – one leading to another. And that is not all: the chambers in one place, one family of effects, lead through passageways to chambers elsewhere, to different families. Quantum phenomena could not have been uncovered without the prior uncovering of electrical phenomena. Phenomena form a connected system of excavated chambers and passageways; the whole system underground is connected. This build-out happens slowly, as earlier forms of instruments and devices help uncover later ones. In this way, the uncovering of phenomena builds itself out of itself. Phenomena accumulate by bootstrapping their way forward.
    13. Not every phenomenon, of course, has an immediate use, but when a family of phenomena is uncovered, a train of technology typically follows.
    14. Technology is not merely applied science. It is better to say it builds both from science and from its own experience. Science is in no small part the probing of nature via instruments and methods – via technology
    15. Evolution works by new technologies forming from existing ones which act as building blocks. Sometimes these blocks come from radical innovation but novel building block elements also arise from standard day-to-day engineering. 
    16. Novel technologies come from linking, conceptually or physically, the needs of some purpose with an exploitable effect (or set of effects). Invention, we can say, consists in linking a need with some effect to satisfactorily achieve that need
    17. Technologies tend to become more complex – much more complex – as they mature. 
  2. Domains
    1. The greatest innovations are new domainings – a switching to a new cluster of technologies. They not only allow a wholly new and more efficient way to carry out a purpose but open up entirely new possibilities, as when the provision of power switched from being expressed in waterwheel technology to steam. A change in domain is the main way in which technology progresses, though a novel domain may appear to have little direct importance early on. Such components and the way they are used do not just reflect the style of the times, they define the style of the times. An era does not just create technology, technology creates the era.
    2. Half of the effectiveness of a domain lives in its reach – the possibilities it opens up. The other half lives in using similar combinations again and again for different purposes.
    3. The domain’s grammar determines how its elements fit together, and the conditions under which they fit together determine what works. Where do such grammars arise from? Ultimately, of course, from nature. Behind the grammar of electronics lies the physics of electron motions and the laws of electrical phenomena. A grammar determines how the elements interrelate, interact, and combine to generate structures. Grammars in large part reflect our understanding of how nature works in a particular domain. Mastery of a technology is in fact difficult to achieve because of its grammar; unlike a linguistic one, this grammar changes rapidly.
    4. Domains are worlds in the sense that experts lose themselves in them. They disappear mentally into them just as we disappear into the world of English when we write a letter. They think in terms of purposes and work these backwards into individual operations in their mental world, much as a composer works a musical theme back into the instrumental parts that will express it. Some domains have deep worlds with a lot of possibilities. What can be accomplished easily in the domain’s world constitutes that domain’s power. Understanding this leads to the natural conclusion that an object or business activity, to be worked on effectively, must be brought into more than one world to make use of what can be accomplished in each. But there is a general lesson here: cost accumulates anywhere an activity leaves one world and enters another. Shipping a freight container by sea is not expensive, but transferring freight from the domain of rail into the shipping container world requires the cumbersome and expensive technologies of railheads, docks, container-handling cranes, and stevedoring. Such bridging technologies are usually the most awkward aspect of a domain. They create delays and bottlenecks and therefore run up costs, but they are necessary because they make the domain available and control what can enter and leave its world. We can think of a domain as containing a small number of central operations that are streamlined and cheap – maritime container transportation, say. But surrounding these, on the outer edges of the domain, are the slower and more awkward technologies that allow activities to enter the world and leave it when finished – the docks and gantry cranes of that world. These in general are costly. Domains reflect the power of the worlds they create but also their limitations. There is nothing static about these worlds. What can be accomplished constantly changes as a domain evolves and as it expands its base of phenomena. One implication is that innovation is not so much a parade of inventions with subsequent adoptions. It is a constant re-expressing or redomaining of old tasks within new worlds of the possible.
    5. If we see technologies as having dynamic insides, we can better understand how technologies modify themselves over their lifetimes. A technology’s interior components are changing all the time. As better parts are substituted, materials improve, methods for construction change, the phenomena the technology is based on are better understood, and new elements become available, its parent domain develops. So technology is not a fixed thing that produces a few variations or updates from time to time. It is a fluid thing – dynamic, alive, highly configurable, and highly changeable over time. The second difference lies in how we see technology’s possibilities in its collective sense. Technology does not just offer a set of limited functions. It provides a vocabulary of elements that can be put together or programmed in endlessly novel ways for endlessly novel purposes.
  3. Design & Invention
    1. Requirements start from the key purpose and proceed outward, the needs of one assembly determining those of the next. A design is a set of compromises. Intention comes first and the means to fulfill it – the combination of components – fall in behind it. Design is expression 
    2. Many innovations and great designs do not come from genius but from an accumulation of knowledge and expertise slowly gathered over years 
    3. The search is continuous, conceptual, wide, and often obsessive. This continuous thinking allows the subconscious to work, possibly to recall an effect or concept from past experience, and it procures a subconscious alertness so that when a candidate principle or a different way to define the problem suggests itself the whisper at the door is heard. Strangely, for people who report such breakthroughs, the insight arrives whole, as if the subconscious had already put the parts together. And it arrives with a “knowing” that the solution is right – a feeling of its appropriateness, its elegance, its extraordinary simplicity. The insight comes to an individual person, not a team, for it wells always from an individual subconscious. And it arrives not in the midst of activities or in frenzied thought, but in moments of stillness. One must be open to see a purpose for what appears to be a spurious effect 
    4. At the creative heart of invention lies appropriation, some sort of mental borrowing that comes in the form of a half-conscious suggestion.
    5. Invention at its core is mental association. Principles often apply across fields, and at the core of this mechanism – call it principle transfer – is seeing an analogy.
    6. An emerging technology always emerges from a cumulation of previous components and functionalities already in place. This is the pyramid of causality. Particularly important is knowledge – both scientific and technical – that has accumulated over time.
    7. Origination is at bottom a linking – a linking of the observational givens of a problem with a principle (a conceptual insight) that roughly suggests these, and eventually with a complete set of principles that reproduces these. At heart, all inventions had the same mechanism: all link a purpose with a principle that will fulfill it, and all must translate that principle into working parts 
    8. A technology develops not just by the direct efforts applied to it. Many of a technology’s parts are shared by other technologies, so a great deal of development happens automatically as components improve in other uses “outside” that technology. A technology piggybacks on the external development of its components. This internal replacement is part of what makes technologies more complex as they age but so does structural deepening. Sometimes changing internal components won’t do, so adding assemblies or systems is needed. 
    9. Origination is not just a new way of doing things, but a new way of seeing things. But it threatens. It can cause an emotional mismatch between the potential of the new and the security of the old. Old technologies can lock in because of this, causing a phenomenon we can call adaptive stretch: it is easier to reach for the old technology and adapt it by “stretching” it to cover the new circumstances. There is a natural cycle. A new principle arrives, begins development, runs into limitations, and its structure elaborates. The new base principle is simpler, but in due course it becomes elaborated itself.
    10. Just as pulling on one thread of a spider’s web causes the web to stretch and reshape itself in response, so the arrival of a new technology causes the web of prices and production in the economy to stretch and reshape itself across all industries. Cheaper steel due to the Bessemer process caused railroads, construction, and heavy machinery to change their costs and what they could offer their consumers.
    11. Innovation emerges when people are faced by problems: particular, well-specified problems. It arises as solutions to these are conceived of by people who command many means or many functionalities that they can combine. It is enhanced by funding that enables this, by training and experience in myriad functionalities, by the existence of special projects and labs devoted to the study of particular problems, and by local cultures which foster deep craft. But it is not a monopoly of a single region or country or people. It arises anywhere problems are studied and sufficient background exists in the pieces that will form solutions. In fact we can see that innovation has two main themes. One is this constant finding or putting together of new solutions out of existing toolboxes of pieces and practices. The other is industries constantly combining their practices and processes with functionalities drawn from newly arriving toolboxes, new domains. This second theme, like the first, is about the creation of new processes and arrangements, new means to purposes. But it is more important, because a new domain of significance – think of the digital one – is encountered by all industries in an economy. As this happens, the domain combines some of its offerings with arrangements native to many industries. The result is new processes and arrangements, new ways of doing things – not just in one area of application but all across the economy.
    12. Because all technologies come from some combination of past technologies, the value of a technology lies not only in what can be done with it but also in what further possibilities it will lead to. Inventions beget more inventions as there are more possible combinations, leading to exponential growth (a back-of-envelope sketch of this follows at the end of these takeaways). Even if new technologies can potentially be supplied by the combination of existing ones, they will only come into existence if there exists some need, some demand for them – or, better yet, opportunities for technology niches they can usefully occupy.
  4. Other
    1. Ironically, we can say that design works by combining and manipulating clichés. But, still, a beautiful design always contains some unexpected combination that shocks us with its appropriateness.
    2. We must get comfortable with technologies with non-physical effects, such as organizational or behavioral ones: the monetary system, contracts, symphonies, algorithms, legal codes, and so on.
    3. All explanations are constructions from simpler parts
    4. I do not believe there is any such thing as genius. Rather it is the possession of a very large quiver of functionalities and principles. 
    5. New bodies of technology tend to have their leading edge highly concentrated in one country or region, because real advanced technology issues not from knowledge but from something we’ll call deep craft. It is more than knowledge; it is a set of knowings: knowing what is likely to work and not work, knowing what methods to use, what principles, what parameters. It derives from a shared culture of beliefs, an unspoken culture of common experience. Deep knowings in one technology can be levered into deep knowings in another. Technology proceeds out of deep understandings of phenomena, and these become embedded as a deep set of shared knowings that reside in people, establish themselves locally, and grow over time. This is why countries that lead in science lead also in technology. And so, if a country wants to lead in advanced technology, it needs to do more than invest in industrial parks or vaguely foster innovation. It needs to build its basic science without any stated purpose of commercial use, and it needs to culture that science in a stable setting with funding and encouragement. Let the science seed itself commercially in small startup companies, and allow these nascent ventures to grow and sprout with minimal interference. Allow the science and its commercial applications to seed new revolutions. Building a capacity for advanced technology is not like planning production in a socialist economy but more like growing a rock garden: planting, watering, and weeding are more appropriate than five-year plans.
    6. Human needs are not just created by biological needs or prosperity but are also created directly by individual technologies. Once we possess rocketry, we experience a need for space exploration. However, the vast majority of niches for technology are created not from human needs but from the needs of technologies themselves. The reasons are several. For one thing, every technology by its very existence sets up an opportunity for fulfilling its purpose more cheaply or efficiently, and so for every technology there exists always an open opportunity. For another, every technology requires supporting technologies to manufacture it, organize for its production and distribution, maintain it, and enhance its performance, and these require their own supporting technologies in turn. The third reason technologies generate needs is that they often cause problems indirectly, and this generates needs, or opportunities, for solutions.
    7. These technologies and their needs grow fractally. Entertainment used to consist of public speeches or shows but now novels, movies, podcasts, sports and so much more exist too. 
    8. Arthur thinks of the economy as the set of arrangements and activities by which a society satisfies its needs. The economy is an expression of its technologies. The economy in this way emerges from its technologies. It constantly creates itself out of its technologies and decides which new technologies will enter it. Notice the circular causality at work here. Technology creates the structure of the economy and the economy mediates the creation of novel technology and therefore its own creation
    9. Technologies can cause structural change in the economy and this change is fractal – it branches out at lower levels just as an embryonic arterial system branches out as it develops into smaller arteries and capillaries 
    10. The more high-tech and sophisticated technologies become, the more biological they become. We are beginning to appreciate that technology is as much metabolism as mechanism. As we come to better understand biology, we are steadily seeing it as more mechanistic as we better understand the mechanisms behind it. Conceptually at least, biology is becoming technology and, physically, technology is becoming biology. The two are starting to close on each other and, indeed, as we move deeper into genomics, more than this, they are starting to intermingle.
    11. As technology becomes more biological and generative, the economy reflects this too. In the generative economy, management derives its competitive advantage not from its stock of resources and its ability to transform these into finished goods but from its ability to translate its stock of deep expertise into ever new strategic combinations. Reflecting this, nations will prosper not so much from the ownership of resources as from the ownership of specialized scientific and technical expertise
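
A back-of-envelope sketch of the combinatorial point in the takeaways above (inventions beget inventions), using made-up numbers rather than anything from the book: if every subset of two or more existing technologies is a candidate combination, the space of possible inventions grows exponentially with the number of building blocks.

```python
# Count candidate combinations of n building-block technologies,
# treating every subset of 2+ blocks as a potential new combination
# (an illustrative assumption; Arthur gives no such formula).
def candidate_combinations(n: int) -> int:
    return 2**n - n - 1  # all subsets minus the empty set and the n singletons

for n in (5, 10, 20, 40):
    print(f"{n} building blocks -> {candidate_combinations(n):,} candidate combinations")
# 5 -> 26; 10 -> 1,013; 20 -> 1,048,555; 40 -> ~1.1 trillion
```

Each new building block roughly doubles the space of possible combinations, which is one way to read the claim that innovation accelerates as the stock of technologies grows.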

What I got out of it

  1. A fascinating and deep read about technology, how it evolves, permeates, and builds off of itself. Some rich language and concepts to apply to many disparate fields

The Dream Machine: JCR Licklider and the Revolution that Made Computing Personal by Mitchell Waldrop

Summary

  1. Licklider was far ahead of his generation in seeing the potential for computers – for making them humane and individual, in democratizing access to information, creating a symbiosis between man and machine. It was his work in the Pentagon, along with that of many other visionaries, that made this possible – that allowed the standalone computer with a mouse and a graphical user interface to come into existence. His desire to understand how the brain worked as a system fueled his curiosity. Lick went on to form the ARPA Information Processing Techniques Office in 1962 and started the research funding for interactive computing and pervasive worldwide networks that has resulted in most of the technology we use today and also fueled the next generations of computing researchers – many of whom became the founders and mainstays of Xerox PARC. When computers were a short step removed from mechanical data processors, Lick’s treatises on human/computer symbiosis shifted our understanding of what computers were and could be.

Key Takeaways

  1. Lick’s goal was to forge ahead with the human/computer symbiosis and weave an interconnected, self-perpetuating community into a single computer network – an electronic medium to connect everyone: the ARPAnet. Today it is known as the internet and everything we now associate with it.
  2. JCR Licklider may be one of the most intuitive geniuses of all time. He simply saw in his head how information flowed, and how people, things, and ideas are interconnected
  3. Lick, while humble and nice, hated sloppy work, glib answers, and never took anything for granted. He was mischievous and a little anarchical. He was never satisfied with the ordinary and always pushed the limits. His grounding in psychology was essential for his later work with computers as he always tried to design the computer and how it functioned to best meet the needs of the humans operating it. Lick approached every problem as a systems problem rather than a detailed or individual problem
  4. The first high-profile project he worked on was related to acoustics for the war, and his boss had a simple mantra: hire the best people, buy them the best machines money can buy, inspire them to no end, and work them 14 hours a day. With this formula they achieved nearly everything they set out to do.
  5. Norbert Wiener was a prodigious character at MIT. He was a genius in multiple ways, especially mathematics, where he was able to use his intuition and form physical models in his head of the problem rather than merely manipulating symbols on the page. He had the hologram in his head.
  6. Alan Turing didn’t like seeing what others had accomplished before him. He preferred to figure things out for himself, and though he wasted a lot of time reinventing the wheel, he came to understand things deeply.
  7. John von Neumann’s stored program concept created software and changed computing, opening up the potential that we associate with computers today.
  8. Claude Shannon thought of information through a 5-part framework: source, transmitter, communication medium, receiver, destination. This simple framework helped him think through the purpose of information and not get bogged down in details. Information ought to measure how much you learn from a given message: if you already knew everything in the message, its information content is zero. However, information and meaning are separated as they relate to computers. Shannon also proved that it is possible to get a message through with perfect fidelity no matter how much static or distortion there is or how faint the signal – transmission may get slow and the codes long, but noise can be overcome. This is the fundamental theorem of information theory. Shannon didn’t like how information and meaning could be too easily confused, so he asked von Neumann for a new name, and he came up with one immediately: entropy. Information is entropy – it has the same formula as the physicists’ formula for entropy, a mathematical variable related to the flow of heat. Information is everywhere and in everything; it is as old as time and ties together the mind-body problem, computation, communication, and more (a small worked example of Shannon’s measure follows at the end of these takeaways).
  9. Lick was interested in every domain and was always pulling in new ideas from different fields. He loved novel ideas and would always push himself and others to think about things differently in order to gain new or deeper insights. While Lick had high expectations for his team, he was extremely devoted and his team knew it – he had built a tribe more than a research group. Lick optimized for creativity and productivity, so he cared very little for credit. He would give his ideas and insights away for others to work on and publish so that he could get more done.
  10. Understanding how our brain works brought together information theory, logic, communication, cognitive science, behavioral psychology, and much more. Two key breakthroughs were understanding chunking and that it matters tremendously how our neurons fire and are organized – not just the raw number of neurons we have
  11. When Lick was brought on to head up the new ARPA project there was no budget, no mandate, no charter. This was perfect as they could simply talk about and work on the most important questions and topics as they came up, not being pigeonholed or sucked into a specific purpose but able to adjust and adapt to everything new that was happening
  12. A key realization for Lick was that if all his visions were to come true, he had to create a self-reinforcing and self-sustaining community among all the different groups who were contributing to the project. Without this focus and insight, many of these dreams might have been lost, forgotten, or never achieved for some other reason.
  13. Corbató created the first open source system, which led to the software boom and the PC. Controversial at the time, he followed the dictum that if you create something useful, people will use it. This was significantly different from other utilities of the past because rather than value flowing just one way (like electricity to users), value now flows two ways: from software to user and from user back to software. This had tremendous implications.
  14. Lick gave people plenty of space as long as they were doing something interesting and living up to his high standards. However, if not, he could be ruthless and shut down programs that weren’t performing.
  15. For all of Lick’s strengths, he was terrible administratively, frustrating his colleagues and friends as they had to badger him for weeks or months to get anything done. And when everything was funded by ARPA, this was a huge deal.
  16. Lick at ARPA and Bob Taylor at Xerox PARC had to learn how to find a way to get their groups all to move together, to give their groups a sense of cohesion and purpose without crushing their spontaneity and creativity. They had to set things up and create an environment where people would follow their own instincts and self-organize. This is the fundamental dilemma of management. Bob Taylor spent years traveling and getting to know the cultures of different high-performing groups, and he took the time to speak to the youngest people there – not only to pick up their ideas but to understand what their values were and how he could cater to them. Taylor’s style of research can be summed up as: don’t just invent the future, go live in it. Don’t worry about the cost for now, but whatever you invent, make sure to use it and then show others how to use it and why it’s helpful. The only mandatory program was a once-weekly discussion from the program leaders about what they were doing, and for an hour the other people would have at him. This created a sense of cohesion and purpose and also flushed out ideas before going too far along the wrong path. These meetings often got heated, and Taylor would help turn them from “class 1” to “class 2” meetings, meaning they would go from yelling at each other to having to explain the other side’s position to their satisfaction. This worked amazingly well to flush out ideas and improve communication.
  17. Xerox PARC’s main vision was to create the digital office, an integrated symbiosis between working man and machine. Broadly, it was split into two groups – one focused on hardware and the other on applications. Low cost, high performance, and high-quality graphics were a thread which ran through everything they were trying to do. Moore’s Law was just beginning to take hold, and those who were still sold on time-sharing began to be able to see the possibility of an individual, high-powered machine for everybody.
    1. There was this thread that ran through Vannevar Bush, Licklider, Doug Engelbart, Alan Kay, and others. It was the ascent of man, it was like the Holy Grail. PARC would rationalize it according to what Xerox needed but whenever they could phrase an idea to align with this path everybody’s eyes would light up, hitting a sort of resonance frequency. 
      1. Engelbart’s “Mother of All Demos” – showing off technology which set fire to the vision of the future and what could be
  18. Alan Kay was one of the key members of PARC’s team and was a prodigy from a young age. He learned to read by the age of three and read hundreds of books before going to school. By that young age he knew that a lot of what the teachers were telling him was wrong or at least that there were multiple points of view. The teachers did not like this. He never distinguished art from science and was one of the key pioneers in this field. 
  19. Good names are incredibly important for prototypes – they have to be familiar, easy to spell, easy to use, easy to understand, have a broad theme, and conjure up pleasant feelings. 
  20. Alan Kay mentions that in the history of art, it is not the adults who actually invent the new medium who do amazing things, but the first generation of kids to grow up with it who do
  21. Xerox was growing so quickly in the late 1960s and 1970s that they almost choked on their own growth. In order to survive, they had to bring in management, marketing, and finance types – mostly from IBM and Ford. While this helped them survive their amazing growth, it also reinforced some bad lessons – that nothing exists or is useful unless it can be shown and captured on a spreadsheet – and eventually this led to the demise of Xerox PARC and that era of research and innovation. Jim O’Neil became the numbers guy and shut down much of the spontaneous generation and innovation because if it didn’t meet his numbers he couldn’t “see it” and wouldn’t buy into it. When sales and finance call all the shots, the company is on a downward spiral, as it is not able to innovate or think long term.
  22. Xerox PARC was an Eden in many ways but what allowed them to flourish was the vision, the people, and an abundance mentality. The fact that they had money to spend and didn’t have to jump through hoops to get it. When there is scarcity you don’t have a community, you just have a bunch of people trying to survive. In 1975 Xerox’s printer and copier business was being threatened and this was their cash cow. The instinct is to keep pouring money into this in order to save it but sometimes that isn’t appropriate. You must know when to cannibalize or disrupt yourself 
  23. You always got the sense that Lick was playing. He was like a kid in a candy store. His exploratory and curious child-like mind never went away. He was not suited to be an administrator or manager but was a visionary and community builder. He encouraged people and showed them what was possible, what they were really working towards 
  24. DEC took advantage of open architecture and was able to foster creativity and uses for their machines that they never would have been able to come up with. Many people loved the ability to tinker, upgrade, or personalize what they bought rather than buying a finished package from, say, IBM. Roberts and his Altair machine would follow DEC’s lead and make it an open architecture, which unleashed a wave of entrepreneurialism and garage startups by the hundreds – filling all sorts of niches and launching some of the world’s biggest and most successful companies (such as Microsoft).
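
A minimal worked example of Shannon’s measure from takeaway 8 above, H = −Σ p·log₂(p) bits; the coin probabilities are my own illustration, not from the book:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin toss carries 1 bit per outcome; a biased coin carries less,
# and a certain outcome carries 0 bits - matching the idea that a fully
# predictable message teaches you nothing.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47
```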

What I got out of it

  1. An incredibly fun read – detailing not only the people and the history behind the computer revolution, but the atmosphere, thinking, and optimism which fueled it

Crossing the Chasm by Geoffrey Moore

Summary

  1. Navigating the uncharted waters of the chasm requires beacons that can be seen above the waves, and that is what models in general, and the chasm model in particular, are for. Models are like constellations—they are not intended to change in themselves, but their value is in giving perspective on a highly changing world. The chasm model represents a pattern in market development that is based on the tendency of pragmatic people to adopt new technology when they see other people like them doing the same. This causes them to hang together as a group, and the group’s initial reaction, like teenagers at a junior high dance, is to hesitate and watch. This is the chasm effect. The tendency is very deep-rooted, and so the pattern is very persistent. As a result, marketers can predict its appearance and build strategies to cope with it, and it is the purpose of this book to help in that process. To be specific, the point of greatest peril in the development of a high-tech market lies in making the transition from an early market dominated by a few visionary customers to a mainstream market dominated by a large block of customers who are predominantly pragmatists in orientation. The gap between these two markets, heretofore ignored, is in fact so significant as to warrant being called a chasm, and crossing this chasm must be the primary focus of any long-term high-tech marketing plan. A successful crossing is how high-tech fortunes are made; failure in the attempt is how they are lost.
Key Takeaways

  1. Background & Fundamentals of Crossing the Chasm
    1. It is only natural to cling to the past when the past represents so much of what we have strived to achieve. This is the key to Crossing the Chasm. The chasm represents the gulf between two distinct marketplaces for technology products—the first, an early market dominated by early adopters and insiders who are quick to appreciate the nature and benefits of the new development, and the second a mainstream market representing “the rest of us,” people who want the benefits of new technology but who do not want to “experience” it in all its gory details. The transition between these two markets is anything but smooth. Indeed, what Geoff Moore has brought into focus is that, at the time when one has just achieved great initial success in launching a new technology product, creating what he calls early market wins, one must undertake an immense effort and radical transformation to make the transition into serving the mainstream market. This transition involves sloughing off familiar entrepreneurial marketing habits and taking up new ones that at first feel strangely counterintuitive.
    2. The basic forces don’t change, but the tactics have become more complicated. Moreover, we are seeing a new effect which was just barely visible in the prior decade, the piggybacking of one company’s offer on another to skip the chasm entirely and jump straight into hypergrowth. In the 1980s Lotus piggybacked on VisiCalc to accomplish this feat in the spreadsheet category. In the 1990s Microsoft did the same thing to Netscape in browsers. The key insight here is that we should always be tracking the evolution of a technology rather than a given company’s product line—it’s the Technology Adoption Life Cycle, after all. Thus it is spreadsheets, not VisiCalc, Lotus, or Excel, that constitute the adoption category, just as it is browsers, not Navigator or Explorer. In the early days products and categories were synonymous because technologies were on their first cycles. But today we have multiple decades of invention to build on, and a new offer is no longer quite as new or unprecedented as it used to be.
    3. If we step back from this chasm problem, we can see it as an instance of the larger problem of how the marketplace can cope with change in general. For both the customer and the vendor, continually changing products and services challenge their institution’s ability to absorb and make use of the new elements. What can marketing do to buffer these shocks? Fundamentally, marketing must refocus away from selling product and toward creating relationships. Relationship buffers the shock of change. Marketing’s first deliverable is that partnership. This is what we mean when we talk about “owning a market.” Customers do not like to be “owned,” if that implies lack of choice or freedom. The open systems movement in high tech is a clear example of that. But they do like to be “owned” if what that means is a vendor taking ongoing responsibility for the success of their joint ventures. Ownership in this sense means abiding commitment and a strong sense of mutuality in the development of the marketplace. When customers encounter this kind of ownership, they tend to become fanatically loyal to their supplier, which in turn builds a stable economic base for profitability and growth.
    4. The fundamental requirement for the ongoing interoperability needed to sustain high tech is accurate and honest exchange of information. Your partners need it, your distribution channel needs it and must support it, and your customers demand it.
    5. The fundamental basis of market relations is to build and manage relationships with all the members that make up a high-tech marketplace, not just the most visible ones. In particular, it means setting up formal and informal communications not only with customers, press, and analysts but also with hardware and software partners, distributors, dealers, VARs, systems integrators, user groups, vertically oriented industry organizations, universities, standards bodies, and international partners. It means improving not only your external communications but also your internal exchange of information among the sales force, the product managers, strategic planners, customer service and support, engineering, manufacturing, and finance.
      1. Must see through every stakeholder’s eyes and create win-win relationships. This becomes even more complicated with public, high-tech companies given the number of constituents
    6. The problem, since these techniques are antithetical to each other, is that you need to decide which one – fad or trend – you are dealing with before you start. It would be much better if you could start with a fad, exploit it for all it was worth, and then turn it into a trend. That may seem like a miracle, but that is in essence what high-tech marketing is all about. Every truly innovative high-tech product starts out as a fad—something with no known market value or purpose but with “great properties” that generate a lot of enthusiasm within an “in crowd.” That’s the early market. Then comes a period during which the rest of the world watches to see if anything can be made of this; that is the chasm. If in fact something does come out of it—if a value proposition is discovered that can predictably be delivered to a targetable set of customers at a reasonable price—then a new mainstream market forms, typically with a rapidity that allows its initial leaders to become very, very successful. The key in all this is crossing the chasm—making that mainstream market emerge. This is a do-or-die proposition for high-tech enterprises; hence, it is logical that they be the crucible in which “chasm theory” is formed.
    7. The rule of thumb in crossing the chasm is simple: Pick on somebody your own size.
    8. These are the two “natural” marketing rhythms in high tech— developing the early market and developing the mainstream market. You develop an early market by demonstrating a strong technology advantage and converting it to product credibility, and you develop a mainstream market by demonstrating a market leadership advantage and converting it to company credibility. By contrast, the “chasm transition” represents an unnatural rhythm. Crossing the chasm requires moving from an environment of support among the visionaries back into one of skepticism among the pragmatists. It means moving from the familiar ground of product-oriented issues to the unfamiliar ground of market-oriented ones, and from the familiar audience of like-minded specialists to the unfamiliar audience of essentially uninterested generalists.
    9. Market Development Strategy Checklist. This list consists of a set of issues around which go-to-market plans are built, each of which incorporates a chasm-crossing factor, as follows:
      1. Target customer
      2. Compelling reason to buy
      3. Whole product
      4. Partners and allies
      5. Distribution
      6. Pricing
      7. Competition
      8. Positioning
      9. Next target customer
  2. Technology Adoption Life Cycle – The Cause for the Chasm
    1. To recap the logic of the Technology Adoption Life Cycle, its underlying thesis is that technology is absorbed into any given community in stages corresponding to the psychological and social profiles of various segments within that community. This profile is, in turn, the very foundation of the High-Tech Marketing Model. That model says that the way to develop a high-tech market is to work the curve left to right, focusing first on the innovators, growing that market, then moving on to the early adopters, growing that market, and so on, to the early majority, late majority, and even to the laggards. In this effort, companies must use each “captured” group as a reference base for going on to market to the next group. Thus, the endorsement of innovators becomes an important tool for developing a credible pitch to the early adopters, that of the early adopters to the early majority, and so on. The idea is to keep this process moving smoothly, proceeding something like passing the baton in a relay race or imitating Tarzan swinging from vine to well-placed vine. It is important to maintain momentum in order to create a bandwagon effect that makes it natural for the next group to want to buy in. Too much of a delay and the effect would be something like hanging from a motionless vine—nowhere to go but down. As you can see, the components of the life cycle are unchanged, but between any two psychographic groups a gap has been introduced. This symbolizes the dissociation between the two groups—that is, the difficulty any group will have in accepting a new product if it is presented in the same way as it was to the group to its immediate left. Each of these gaps represents an opportunity for marketing to lose momentum, to miss the transition to the next segment, thereby never to gain the promised land of profit-margin leadership in the middle of the bell curve. The key to winning over this segment is to show that the new technology enables some strategic leap forward, something never before possible, which has an intrinsic value and appeal to the nontechnologist. This benefit is typically symbolized by a single, compelling application, the one thing that best captures the power and value of the new product. If the marketing effort is unable to find that compelling application, then market development stalls with the innovators, and the future of the product falls through the crack.
    2. It turns out our attitude toward technology adoption becomes significant—at least in a marketing sense—any time we are introduced to products that require us to change our current mode of behavior or to modify other products and services we rely on. In academic terms, such change-sensitive products are called discontinuous innovations. The contrasting term, continuous innovations, refers to the normal upgrading of products that does not require us to change behavior.
    3. Of course, talking this way about marketing merely throws the burden of definition onto market, which we will define, for the purposes of high tech, as:
      1. A set of actual or potential customers
      2. For a given set of products or services
      3. Who have a common set of needs or wants
      4. Who reference each other when making a buying decision.
    4. The goal should be to package each of the phases such that each phase
      1. Is accomplishable by mere mortals working in earth time
      2. Provides the vendor with a marketable product
      3. Provides the customer with a concrete return on investment that can be celebrated as a major step forward.
    1. Innovators
      1. Visionaries are not looking for an improvement; they are looking for a fundamental breakthrough.
      2. In sum, visionaries represent an opportunity early in a product’s life cycle to generate a burst of revenue and gain exceptional visibility.
    2. Early Adopters
      1. What the early adopter is buying is some kind of change agent. By being the first to implement this change in their industry, the early adopters expect to get a jump on the competition, whether from lower product costs, faster time to market, more complete customer service, or some other comparable business advantage. They expect a radical discontinuity between the old ways and the new, and they are prepared to champion this cause against entrenched resistance. Being the first, they also are prepared to bear with the inevitable bugs and glitches that accompany any innovation just coming to market.
    3. Early Majority (Pragmatists)
      1. The real news, however, is not the two cracks in the bell curve, the one between the innovators and the early adopters, the other between the early and late majority. No, the real news is the deep and dividing chasm that separates the early adopters from the early majority. This is by far the most formidable and unforgiving transition in the Technology Adoption Life Cycle, and it is all the more dangerous because it typically goes unrecognized. The reason the transition can go unnoticed is that with both groups the customer list and the size of the order can look the same.
      2. The early majority want to buy a productivity improvement for existing operations. They are looking to minimize the discontinuity with the old ways. They want evolution, not revolution. They want technology to enhance, not overthrow, the established ways of doing business. And above all, they do not want to debug somebody else’s product. By the time they adopt it, they want it to work properly and to integrate appropriately with their existing technology base. This contrast just scratches the surface relative to the differences and incompatibilities between early adopters and the early majority. Let me just make two key points for now: Because of these incompatibilities, early adopters do not make good references for the early majority. And because of the early majority’s concern not to disrupt their organizations, good references are critical to their buying decisions. So what we have here is a catch-22. The only suitable reference for an early majority customer, it turns out, is another member of the early majority, but no upstanding member of the early majority will buy without first having consulted with several suitable references.
      3. Of course, to market successfully to pragmatists, one does not have to be one—just understand their values and work to serve them. To look more closely into these values, if the goal of visionaries is to take a quantum leap forward, the goal of pragmatists is to make a percentage improvement—incremental, measurable, predictable progress. If they are installing a new product, they want to know how other people have fared with it. The word risk is a negative word in their vocabulary—it does not connote opportunity or excitement but rather the chance to waste money and time.
      4. If pragmatists are hard to win over, they are loyal once won, often enforcing a company standard that requires the purchase of your product, and only your product, for a given requirement. This focus on standardization is, well, pragmatic, in that it simplifies internal service demands. But the secondary effects of this standardization—increasing sales volumes and lowering the cost of sales—are dramatic. Hence the importance of pragmatists as a market segment.
      5. When pragmatists buy, they care about the company they are buying from, the quality of the product they are buying, the infrastructure of supporting products and system interfaces, and the reliability of the service they are going to get. In other words, they are planning on living with this decision personally for a long time to come.
      6. Pragmatists won’t buy from you until you are established, yet you can’t get established until they buy from you. Obviously, this works to the disadvantage of start-ups and, conversely, to the great advantage of companies with established track records. On the other hand, once a start-up has earned its spurs with the pragmatist buyers within a given vertical market, they tend to be very loyal to it, and even go out of their way to help it succeed. When this happens, the cost of sales goes way down, and the leverage on incremental R&D to support any given customer goes way up. That’s one of the reasons pragmatists make such a great market.
      7. Overall, to market to pragmatists, you must be patient. You need to be conversant with the issues that dominate their particular business. You need to show up at the industry-specific conferences and trade shows they attend. You need to be mentioned in articles that run in the magazines they read. You need to be installed in other companies in their industry. You need to have developed applications for your product that are specific to the industry. You need to have partnerships and alliances with the other vendors who serve their industry. You need to have earned a reputation for quality and service. In short, you need to make yourself over into the obvious supplier of choice. This is a long-term agenda, requiring careful pacing, recurrent investment, and a mature management team
      8. Conservatives like to buy preassembled packages, with everything bundled, at a heavily discounted price. The last thing they want to hear is that the software they just bought doesn’t support the printer they have installed. They want high-tech products to be like refrigerators—you open the door, the light comes on automatically, your food stays cold, and you don’t have to think about it. The products they understand best are those dedicated to a single function—word processors, calculators, copiers, and fax machines. The notion that a single computer could do all four of these functions does not excite them—instead, it is something they find vaguely nauseating. The conservative marketplace provides a great opportunity, in this regard, to take low-cost, trailing-edge technology components and repackage them into single-function systems for specific business needs. The quality of the package should be quite high because there is nothing in it that has not already been thoroughly debugged. The price should be quite low because all the R&D has long since been amortized, and every bit of the manufacturing learning curve has been taken advantage of. It is, in short, not just a pure marketing ploy but a true solution for a new class of customer. There are two keys to success here. The first is to have thoroughly thought through the “whole solution” to a particular target end user market’s needs, and to have provided for every element of that solution within the package. This is critical because there is no profit margin to support an after-purchase support system. The other key is to have lined up a low-overhead distribution channel that can get this package to the target market effectively.
      9. Just as the visionaries drive the development of the early market, so do the pragmatists drive the development of the mainstream market. Winning their support is not only the point of entry but the key to long-term dominance. But having done so, you cannot take the market for granted. To maintain leadership in a mainstream market, you must at least keep pace with the competition. It is no longer necessary to be the technology leader, nor is it necessary to have the very best product. But the product must be good enough, and should a competitor make a major breakthrough, you have to make at least a catch-up response.
      10. The key to making a smooth transition from the pragmatist to the conservative market segments is to maintain a strong relationship with the former, always giving them an open door to go to the new paradigm, while still keeping the latter happy by adding value to the old infrastructure. It is a balancing act to say the least, but properly managed the earnings potential in loyal mature market segments is very high indeed.
      11. So the corollary lesson is, we must use our experience with the pragmatist customer segment to identify all the issues that require service and then design solutions to these problems directly into the product.
      12. In sum, the pragmatists are loath to buy until they can compare. Competition, therefore, becomes a fundamental condition for purchase. So, coming from the early market, where there are typically no perceived competing products, with the goal of penetrating the mainstream, you often have to go out and create your competition. Creating the competition is the single most important marketing decision made in the battle to enter the mainstream. It begins with locating your product within a buying category that already has some established credibility with the pragmatist buyers. That category should be populated with other reasonable buying choices, ideally ones with which the pragmatists are already familiar. Within this universe, your goal is to position your product as the indisputably correct buying choice.
      13. In sum, to the pragmatist buyer, the most powerful evidence of leadership and likelihood of competitive victory is market share. In the absence of definitive numbers here, pragmatists will look to the quality and number of partners and allies you have assembled in your camp, and their degree of demonstrable commitment to your cause.
    4. Late Majority
      1. Simply put, the early majority is willing and able to become technologically competent, where necessary; the late majority, much less so. When a product reaches this point in the market development, it must be made increasingly easier to adopt in order to continue being successful. If this does not occur, the transition to the late majority may well stall or never happen.
    5. Laggards
      1. Skeptics—the group that makes up the last one-sixth of the Technology Adoption Life Cycle—do not participate in the high-tech marketplace, except to block purchases. Thus, the primary function of high-tech marketing in relation to skeptics is to neutralize their influence. In a sense, this is a pity because skeptics can teach us a lot about what we are doing wrong
  3. The D-Day Strategy – Choose a Target Niche
    1. Entering the mainstream market is an act of aggression. The companies that have already established relationships with your target customer will resent your intrusion and do everything they can to shut you out. The customers themselves will be suspicious of you as a new and untried player in their marketplace. No one wants your presence. You are an invader. This is not a time to focus on being nice. As we have already said, the perils of the chasm make this a life-or-death situation for you. You must win entry to the mainstream, despite whatever resistance is posed. That’s it. That’s the strategy. Replicate D-Day, and win entry to the mainstream. Cross the chasm by targeting a very specific niche market where you can dominate from the outset, force your competitors out of that market niche, and then use it as a base for broader operations. Concentrate an overwhelmingly superior force on a highly focused target. It worked in 1944 for the Allies, and it has worked since for any number of high-tech companies.
    2. The D-Day strategy prevents this mistake. It has the ability to galvanize an entire enterprise by focusing it on a highly specific goal that is (1) readily achievable and (2) capable of being directly leveraged into long-term success. Most companies fail to cross the chasm because, confronted with the immensity of opportunity represented by a mainstream market, they lose their focus, chasing every opportunity that presents itself, but finding themselves unable to deliver a salable proposition to any true pragmatist buyer. The D-Day strategy keeps everyone on point—if we don’t take Normandy, we don’t have to worry about how we’re going to take Paris. And by focusing our entire might on such a small territory, we greatly increase our odds of immediate success.
      1. This isn’t rocket science, but it does represent a kind of discipline. And it is here that high-tech management shows itself most lacking. Most high-tech leaders, when it comes down to making marketing choices, will continue to shy away from making niche commitments, regardless. Like marriage-averse bachelors, they may nod in all the right places and say all the right things, but they will not show up when the wedding bells chime.
      2. “Never attack a fortified hill.” Same with beachheads. If some other company got there before you, all the market dynamics that you are seeking to make work in your favor are already working in its favor. Don’t go there.
    3. One of the most important lessons about crossing the chasm is that the task ultimately requires achieving an unusual degree of company unity during the crossing period. This is a time when one should forgo the quest for eccentric marketing genius, in favor of achieving an informed consensus among mere mortals. It is a time not for dashing and expensive gestures but rather for careful plans and cautiously rationed resources—a time not to gamble all on some brilliant coup but rather to focus everyone on making as few mistakes as possible. One of the functions of this book, therefore—and perhaps its most important one—is to open up the logic of marketing decision making during this period so that everyone on the management team can participate in the marketing process. If prudence rather than brilliance is to be our guiding principle, then many heads are better than one. If marketing is going to be the driving force—and most organizations insist this is their goal—then its principles must be accessible to all the players, and not, as is sometimes the case, be reserved to an elect few who have managed to penetrate its mysteries.
    4. The consequences of being sales-driven during the chasm period are, to put it simply, fatal.
    5. Segment. Segment. Segment. One of the other benefits of this approach is that it leads directly to you “owning” a market. That is, you get installed by the pragmatists as the leader, and from then on, they conspire to help keep you there. This means that there are significant barriers to entry for any competitors, regardless of their size or the added features they have in their product. Mainstream customers will, to be sure, complain about your lack of features and insist you upgrade to meet the competition. But, in truth, mainstream customers like to be “owned”—it simplifies their buying decisions, improves the quality and lowers the cost of whole product ownership, and provides security that the vendor is here to stay. They demand attention, but they are on your side. As a result, an owned market can take on some of the characteristics of an annuity—a building block in good times, and a place of refuge in bad—with far more predictable revenues and far lower cost of sales than can otherwise be achieved.
    6. For all these reasons—for whole product leverage, for word-of-mouth effectiveness, and for perceived market leadership—it is critical that, when crossing the chasm, you focus exclusively on achieving a dominant position in one or two narrowly bounded market segments. If you do not commit fully to this goal, the odds are overwhelmingly against your ever arriving in the mainstream market.
    7. The key to moving beyond one’s initial target niche is to select strategic target market segments to begin with. That is, target a segment that, by virtue of its other connections, creates an entry point into a larger segment. For example, when the Macintosh crossed the chasm, the target niche was the graphic arts department in Fortune 500 companies. This was not a particularly large target market, but it was one that was responsible for a broken, mission-critical process—providing presentations for executives and marketing professionals.
    8. The niche wins—presuming the beachhead strategy is conducted correctly—by getting a fix for its specialized problem. And the vendor wins because it gets certified by at least one group of pragmatists that its offering is mainstream. So, because of the dynamics of technology adoption, and not because of any niche properties in the product itself, platforms must take a vertical market approach to crossing the chasm even though it seems unnatural.
    9. The answer is that when you are picking a chasm-crossing target, it is not about the number of people involved, it is about the amount of pain they are causing. In the case of the pharmaceutical industry’s regulatory affairs function, the pain was excruciating.
    10. This is a standard pattern in crossing the chasm. It is normally the departmental function that leads (they have the problem), the executive function that prioritizes (the problem is causing enterprise-wide grief), and the technical function that follows (they have to make the new stuff work while still maintaining all the old stuff).
    11. The more serious the problem, the faster the target niche will pull you out of the chasm. Once out, your opportunities to expand into other niches are immensely increased because now, having one set of customers solidly behind you, you are much less risky to back as a new vendor.
  4. Next Target Segment
    1. The second key is to have lined up other market segments into which you can leverage your initial niche solution. This allows you to reinterpret the financial gain in crossing the chasm. It is not about the money you make from the first niche: It is the sum of that money plus the gains from all subsequent niches. It is a bowling alley estimate, not just a head pin estimate, that should drive the calculation of gain.
    2. First you divide up the universe of possible customers into market segments. Then you evaluate each segment for its attractiveness. After the targets get narrowed down to a very small number, the “finalists,” then you develop estimates of such factors as the market niches’ size, their accessibility to distribution, and the degree to which they are well defended by competitors.
    3. Now, the biggest mistake one can make in this state is to turn to numeric information as a source of refuge or reassurance. You need to understand that informed intuition, rather than analytical reason, is the most trustworthy decision-making tool to use. The key is to understand how intuition—specifically, informed intuition—actually works. Unlike numerical analysis, it does not rely on processing a statistically significant sample of data in order to achieve a given level of confidence. Rather, it involves conclusions based on isolating a few high-quality images—really, data fragments—that it takes to be archetypes of a broader and more complex reality. These images simply stand out from the swarm of mental material that rattles around in our heads. They are the ones that are memorable. So the first rule of working with an image is: If you can’t remember it, don’t try, because it’s not worth it. Or, to put this in the positive form: Only work with memorable images.
    4. Target-customer characterization is a formal process for making up these images, getting them out of individual heads and in front of a marketing decision-making group. The idea is to create as many characterizations as possible, one for each different type of customer and application for the product. (It turns out that, as these start to accumulate, they begin to resemble one another so that, somewhere between 20 and 50, you realize you are just repeating the same formulas with minor tweaks, and that in fact you have outlined 8 to 10 distinct alternatives.)
    5. It is extremely difficult to cross the chasm in consumer markets. Almost all successful crossings happen in business markets, where the economic and technical resources can absorb the challenges of an immature product and service offering.
    6. There are five elements you need to capture:
      1. Scene or situation: Focus on the moment of frustration. What is going on? What is the user about to attempt?
      2. Desired outcome: What is the user trying to accomplish? Why is this important?
      3. Attempted approach: Without the new product, how does the user go about the task?
      4. Interfering factors: What goes wrong? How and why does it go wrong?
      5. Economic consequences: So what? What is the impact of the user failing to accomplish the task productively?
  5. Whole Product Package
    1. Systems integrators could just as easily be called whole product providers—that is their commitment to the customer.
    2. The whole product model provides a key insight into the chasm phenomenon. The single most important difference between early markets and mainstream markets is that the former are willing to take responsibility for piecing together the whole product (in return for getting a jump on their competition), whereas the latter are not.
    3. Tactical alliances have one and only one purpose: to accelerate the formation of whole product infrastructure within a specific target market segment. The basic commitment is to codevelop a whole product and market it jointly. This benefits the product manager by ensuring customer satisfaction. It benefits the partner by providing expanded distribution into a hitherto untapped source of sales opportunities.
    4. To sum up, whole product definition followed by a strong program of tactical alliances to speed the development of the whole product infrastructure is the essence of assembling an invasion force for crossing the chasm. The force itself is a function of actually delivering on the customer’s compelling reason to buy in its entirety. That force is still rare in the high-tech marketplace, so rare that, despite the overall high-risk nature of the chasm period, any company that executes a whole product strategy competently has a high probability of mainstream market success.
    5. Review the whole product from each participant’s point of view. Make sure each vendor wins, and that no vendor gets an unfair share of the pie. Inequities here, particularly when they favor you, will instantly defeat the whole product effort—companies are naturally suspicious of each other anyway, and given any encouragement, will interpret your entire scheme as a rip-off.
    6. The fundamental rule of engagement is that any force can defeat any other force—if it can define the battle. If we get to set the turf, if we get to set the competitive criteria for winning, why would we ever lose? The answer, depressingly enough, is because we don’t do it right. Sometimes it is because we misunderstand either our own strengths and weaknesses, or those of our competitors. More often, however, it is because we misinterpret what our target customers really want, or we are afraid to step up to the responsibility of making sure they get it.
  6. Distribution
    1. The number-one corporate objective, when crossing the chasm, is to secure a channel into the mainstream market with which the pragmatist customer will be comfortable.
    2. In other words, during the chasm period, the number-one concern of pricing is not to satisfy the customer or to satisfy the investors, but to motivate the channel.
    3. To sum up, when crossing the chasm, we are looking to attract customer-oriented distribution, and one of our primary lures will be distribution-oriented pricing.
    4. When functioning at its best, within the limits just laid out, direct sales is the optimal channel for high tech. It is also the best channel for crossing the chasm.
    5. All other things being equal, however, direct sales is the preferred alternative because it gives us maximum control over our own destiny.
    6. First and foremost, the retail system works optimally when its job is to fulfill demand rather than to create it.
  7. Positioning
    1. To sum up, your market alternative helps people identify your target customer (what you have in common) and your compelling reason to buy (where you differentiate). Similarly, your product alternative helps people appreciate your technology leverage (what you have in common) and your niche commitment (where you differentiate). Thus you create the two beacons that triangulate to teach the market your positioning.
    2. You can keep yourself from making most positioning gaffes if you will simply remember the following principles:
      1. Positioning, first and foremost, is a noun, not a verb. That is, it is best understood as an attribute associated with a company or a product, and not as the marketing contortions that people go through to set up that association.
      2. Positioning is the single largest influence on the buying decision. It serves as a kind of buyers’ shorthand, shaping not only their final choice but even the way they evaluate alternatives leading up to that choice. In other words, evaluations are often simply rationalizations of preestablished positioning.
      3. Positioning exists in people’s heads, not in your words. If you want to talk intelligently about positioning, you must frame a position in words that are likely to actually exist in other people’s heads, and not in words that come straight out of hot advertising copy.
      4. People are highly conservative about entertaining changes in positioning. This is just another way of saying that people do not like you messing with the stuff that is inside their heads. In general, the most effective positioning strategies are the ones that demand the least amount of change.
    3. Given all of the above, it is then possible to talk about positioning as a verb—a set of activities designed to bring about positioning as a noun. Here there is one fundamental key to success: When most people think of positioning in this way, they are thinking about how to make their products easier to sell. But the correct goal is to make them easier to buy. Think about it: most people resist being sold to but enjoy buying. By focusing on making a product easy to buy, you are focusing on what the customers really want. In turn, they will sense this and reward you with their purchases. Thus, easy to buy becomes easy to sell. The goal of positioning, therefore, is to create a space inside the target customer’s head called “best buy for this type of situation” and to attain sole, undisputed occupancy of that space. Only then, when the green light is on, and there is no remaining competing alternative, is a product easy to buy.
  8. Pricing
    1. Set pricing at the market leader price point, thereby reinforcing your claims to market leadership (or at least not undercutting them), and build a disproportionately high reward for the channel into the price margin, a reward that will be phased out as the product becomes truly established in the mainstream, and competition for the right to distribute it increases.
  9. Other
    1. So how can we guarantee passing the elevator pitch test? The key is to define your position based on the target segment you intend to dominate and the value proposition you intend to dominate it with. Within this context, you then set forth your competition and the unique differentiation that belongs to you and that you expect to drive the buying decision your way. Here is a proven formula for getting all this down into two short sentences. Try it out on your own company and one of its key products. Just fill in the blanks:
      1. For (target customers—beachhead segment only) who are dissatisfied with (the current market alternative), our product is a (new product category) that provides (key problem-solving capability). Unlike (the product alternative), we have assembled (key whole product features for your specific application).
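      2. A purely illustrative fill-in, borrowing the Macintosh example above (the specifics here are hypothetical, not the book’s): For graphic arts departments in Fortune 500 companies who are dissatisfied with the cost and turnaround of outside production services, the Macintosh is a desktop publishing platform that provides in-house creation of professional-quality presentations. Unlike general-purpose PCs, we have assembled the key whole product pieces (page layout software, fonts, and a laser printer) for that specific application.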
    2. So building relationships with business press editors, initially around a whole product story, is a key tactic in crossing the chasm.
    3. The purpose of the postchasm enterprise is to make money. This is a much more radical statement than it appears. To begin with, we need to recognize that this is not the purpose of the prechasm organization. In the case of building an early market, the fundamental return on investment is the conversion of an amalgam of technology, services, and ideas into a replicable, manufacturable product and the proving out that there is some customer demand for this product. Early market revenues are the first measure of this demand, but they are typically not—nor are they expected to be—a source of profit.
    4. How wide is the chasm? Or, to put this in investment terms, how long will it take before I can achieve a reasonably predictable ROI from an acceptably large mainstream market? The simple answer to this question is, as long as it takes to create and install a sustainable whole product. The chasm model asserts that no mainstream market can occur until the whole product is in place.
    5. The key is to initiate the transition by introducing two new roles during the crossing-the-chasm effort. The first of these might be called the target market segment manager, and the second, the whole product manager. Both are temporary, transitional positions, with each being a stepping stone to a more traditional role. Specifically, the former leads to being an industry marketing manager, and the latter to a product marketing manager. These are their “real titles,” the ones under which they are hired, the ones that are most appropriate for their business cards. But during the chasm transition they should be assigned unique, one-time-only responsibilities, and while they are in that mode, we will use their “interim” titles. The target market segment manager has one goal in his or her short job life—to transform a visionary customer relationship into a potential beachhead for entry into the mainstream vertical market that particular customer participates in.
What I got out of it
  1. Awesome playbook for building out a high-tech company and framework for how to invest in them (see The Gorilla Game for further notes on the investing portion). The innovators gladly take on new technology, but it is the pragmatists, or early majority, who need proof of concept: evidence that you will be around for a while and that other respected players are using your product or service before they buy in. They are where the real profits lie. The chasm forms between these innovators and pragmatists, and your strategy, focus, and mindset have to shift when attempting to tap into the mainstream.