Tag Archives: Innovation

Working Backwards by Colin Bryar, Bill Carr

Summary

  1. The authors lay out the leadership principles of what it means to be “Amazonian.” Working backwards is all about starting from the customer’s perspective, working backwards step by step, and questioning assumptions until you figure out exactly what you want to build and how you want to deliver it. It is all about seeking truth. This process takes more time on the front end, but it is far less expensive and difficult to course-correct in the early innings than at the stage where you already have an operating business.

Key Takeaways

  1. Especially in the early days, one bad customer experience can influence hundreds if not thousands of future customers. That is why Jeff was so maniacal about customer service. What distinguishes Amazon is that its leadership principles are deeply integrated into every person, process, and function across the company.
  2. The 14 Amazon leadership principles
    1. Customer obsession
    2. Ownership
    3. Invent and simplify
    4. Leaders are right, a lot
    5. Learn and be curious
    6. Hire and develop the best
    7. Insist on the highest standards
    8. Think big
    9. Bias for action
    10. Frugality
    11. Earn trust
    12. Dive Deep
    13. Have backbone; disagree and commit
    14. Deliver results
  3. Good intentions don’t work, mechanisms do.
    1. Three core mechanisms help translate the principles into action: the annual planning process, the S-Team goals, and the annual compensation plan, which ties rewards to serving customers. These operating plans are meant to intertwine individual and company goals so that everything is self-reinforcing and mutually beneficial. This takes tremendous planning and effort, but it helps align everyone in the organization.
    2. Amazon is different in the sense that the senior team focuses relentlessly on execution and details, not just strategy. Compensation is very much tied to long-term equity performance, which in turn ties to how well the company is serving customers.
  4. The Bar Raiser program is one of Amazon’s most successful and widespread practices. It is an efficient way to hire the right people that is scalable, teachable, and contains feedback loops for the team and the interviewee. This ensures a sustainable way to hire great people who always raise the bar for the company as a whole. The process starts with a phone screen. If the candidate looks like a likely hire, they are brought in for a behavioral interview loop that includes a Bar Raiser. Four or five different people interview the candidate, and each writes up their thoughts and takeaways afterwards. Then there is a group discussion where the feedback is read collectively and the candidate is discussed. If the group decides to move forward, they do reference checks and then hire the candidate if they choose to do so.
  5. One way Amazon has combated the friction that comes with becoming a large organization is what they call single-threaded leadership. This person heads up an autonomous team that does nothing but focus on one major initiative, and they have complete responsibility and accountability for it. The best way to fail at a big initiative is to make it somebody’s part-time job.
  6. What matters is not raw speed but velocity (speed plus direction).
  7. Be aware of where there exist dependencies – whether technical, organizational, or otherwise – that slow your progress down because you have to rely on others to accomplish what you need. Systems with too many dependencies are said to be tightly coupled. Amazon shifted to small, autonomous teams, embodied in the now-famous two-pizza rule. Microservices are built by small, independent teams that are able to move quickly and independently yet offer great services and features.
  8. Amazon does not focus or spend time on morale-boosting events. Instead, it focuses on attracting world-class talent and empowering them to build things that scale. Focusing on controllable input metrics rather than uncontrollable output metrics leads to sustainable and meaningful growth. High morale is an output, not an input.
  9. Two-pizza teams are most effective in product development, and each team is given specific, agreed-upon metrics, which helps keep everyone aligned and on task. However, Amazon found that it was not the size of the team that predicted success but having the right leader, one with the necessary skills, experience, authority, and capacity to build the team and lead it to success.
  10. The highest base salary at Amazon is $160,000. There are no bonuses or any other sort of extra compensation, just stock that vests over the following 18 to 24 months. This is difficult if you are looking for a short-term payoff, but very lucrative if you believe in Amazon and its long-term prospects.
  11. Amazon was able to move into a completely tangential business with AWS through its single-threaded leadership principle, ruthlessly iterating and keeping customer service top of mind.

What I got out of it

  1. A great, inside look into what makes Amazon, Amazon. Always start with what the customer wants and work backwards from there. 

Diffusion of Innovations by Everett Rogers

Summary

  1. This book is about social change, moving from information to innovation. “Information is a difference in matter-energy that affects uncertainty in a situation where a choice exists among a set of alternatives. One kind of uncertainty is generated by innovation, defined as an idea, practice, or object that is perceived as new by an individual or another unit of adoption. An innovation presents an individual or an organization with a new alternative or alternatives, as well as new means of solving problems. However, the probability that the new idea is superior to previous practice is not initially known with certainty by individual problem solvers. Thus, individuals are motivated to seek further information about the innovation in order to cope with the uncertainty that it creates. Information about an innovation is often sought from peers, especially information about their subjective evaluations of the innovation. This information exchange about a new idea occurs through a convergence process involving interpersonal networks. The diffusion of innovations is essentially a social process in which subjectively perceived information about a new idea is communicated from person to person. The meaning of an innovation is thus gradually worked through a process of social construction.”

Key Takeaways

  1. Diffusion is a social matter even more than a technical one – how potential adopters view a change agent affects their willingness to adopt new ideas
    1. Lateral Networks, Culture, Hierarchies, Galilean Relativity
  2. A technological innovation embodies information and thus reduces uncertainty about cause-effect relationships in problem solving
  3. Attributes that help speed diffusion 
    1. (Perceived) Relative advantage – the improvement an innovation offers over what precedes it; perceived advantage matters more than objective advantage
      1. Many adopters want to participate actively in customizing an innovation to fit their unique situation. An innovation diffuses more rapidly when it can be reinvented, and its adoption is more likely to be sustained
      2. The importance of taking into account people’s perceptions of an innovation cannot be overstressed
      3. Rationality = using most effective means to reach a goal
      4. Fastest routes to adoption come when felt needs are met
      5. Mass media has a short, spiky effect on adoption whereas interpersonal communication is more sustainable 
    2. Compatibility – degree to which an innovation is perceived as being consistent with the existing values, past experiences, and needs of potential adopters
      1. This dependence on the experience of near peers suggests that the heart of the diffusion process consists of the modeling and imitation by potential adopters of their network partners who have previously adopted. Diffusion is a very special process that involves interpersonal communication relationships 
      2. One of the most distinctive problems in the diffusion of innovations is that the participants are usually quite heterophilous. Homophilous situations slow the spread of the innovation, as these groups tend to socialize “horizontally” and don’t break through to other groups/classes
        1. Strength of Weak Ties
      3. The structure of a social system can facilitate or impede the diffusion of innovations. The impact of the social structure on diffusion is of special interest to sociologists and social psychologists, and the way in which the communication structure of a system affects diffusion is a particularly interesting topic for communication scholars. Katz remarked, “It is as unthinkable to study diffusion without some knowledge of the social structures in which potential adopters are located as it is to study blood circulation without adequate knowledge of the veins and arteries.”
        1. Opinion leaders thus exemplify and express the system’s structure. These are incredibly powerful and valuable members to have on your side
        2. A communication network consists of interconnected individuals who are linked by patterned flows of information
    3. Complexity – degree to which an innovation is perceived as difficult to understand and use
      1. There are 5 main steps in the innovation-decision process – knowledge, persuasion, decision, implementation, and confirmation
    4. Trialability – degree to which an innovation may be experimented with on a limited basis
    5. Observability – degree to which the results of an innovation are visible to others
      1. Salience = degree of importance to an individual; the more salient an innovation, the more people want information about it and will tell others about it
  4. Social marketing – segmentation and formative research lead to effective messages, positioning, price, communication channels
  5. Tactics to reach critical mass 
    1. Highly respected individuals in a system’s hierarchy should be targeted for initial adoption of the interactive innovation
    2. Individuals’ perceptions of the innovation can be shaped, for instance, by implying that adoption of it is inevitable, that it is very desirable, or that the critical mass has already occurred or will occur soon 
      1. Chicken and egg…
    3. Introduce to intact groups whose members are likely to be relatively more innovative
    4. Incentives for early adoption of the interactive innovation should be provided, at least until critical mass is reached 
  6. Look for change agents and innovation champions who stand behind your product and who throw their support behind you, thus overcoming the indifference or resistance that the new idea may provoke
    1. “One of the greatest pains to human nature is the pain of a new idea. It…makes you think that after all, your favorite notions may be wrong, your firmest beliefs ill-founded…Naturally, therefore, common men hate a new idea, and are disposed more or less to ill-treat the original man who brings it.” – Walter Bagehot, Physics and Politics
  7. Routinization occurs when the innovation has become incorporated into the regular activities of the organization and loses its separate identity. Sustainability, a closely related concept to routinization, is defined by the degree to which an innovation continues to be used after the initial effort to secure adoption is completed. Sustainability is more likely if widespread participation has occurred in the innovation process, if reinvention took place, and if an innovation champion was involved. This fifth stage, routinization, marks the end of the innovation process in an organization
  8. As much as change is about adapting to the new, it is about detaching from the old – Ronald Burt
  9. Stages in the Innovation-Decision Process
    1. Knowledge, persuasion, decision, implementation, confirmation

What I got out of it

  1. Seems like the “godfather” of the books that Geoffrey Moore and others have written. Learning about the attributes that help speed diffusion – perceived relative advantage, compatibility, complexity, trialability, and observability – was worth the price of admission

Invisible Engines: How Software Platforms Drive Innovation and Transform Industries by David Evans

Summary

  1. “By focusing on the software platform we hope to offer the reader a perspective on the business dynamics and strategies of industries, old and new, that have been powered by these invisible engines…All of us quickly recognized that software platform businesses have at least two sides. Software platforms consist of services that are often made available to developers through APIs. They are also made available to computer users, but those computer users typically avail themselves of API-based services by buying applications that in turn use APIs. It is only a slight exaggeration to say that all software platform makers all the time invest in getting both developers and users to use their platforms. The developers/users are like the men/women, cards/merchants, advertisers/eyeballs, and buyers/sellers that we mentioned above. In fact, software platforms sometimes appeal to more than two distinct groups—including hardware makers and content providers. The economics of two-sided platforms provides a number of insights into pricing, design, organization, and governance of platform-based businesses. We were interested in understanding how this new economic learning could help shed light on the strategies followed by software platforms. On the flip side, we were interested in understanding how a diverse set of industries based on software platforms could be probed to provide insights for students of this new economics. This book is the result. It blends economics, history, and business analysis. It is intended for anyone who wants to better understand the business strategies that have been followed in industries based on software platforms. We focus on pricing, product design, and integration into downstream or upstream suppliers.”

Key Takeaways

  1. Most successful software platforms have exploited positive feedbacks (or network effects) between applications and users: more applications attract more users, and more users attract more applications. Nurturing both sides of the market helped Microsoft garner thousands of applications and hundreds of millions of users for its Windows platform.
  2. The modular approach has numerous advantages. If a new program (or other complex system) can be specified as N modules, N teams can work in parallel. Moreover, individual modules can subsequently be improved without touching other parts of the overall program, and they can be used in other programs.
  3. Operating systems provide services to applications through Application Programming Interfaces (APIs). These services range from rudimentary hardware services, such as moving a cursor on a monitor, to sophisticated software services, such as drawing and rotating three-dimensional objects. The APIs serve as interfaces between these services and applications…It is easy to see why application developers find the ability to access system services through APIs appealing. Rather than every application developer writing hundreds of lines of code to allocate memory to an object, to take the example above, the operating system developer writes 116 lines of code and makes the system services this code provides available to all application developers through the API.
  4. Software platforms make services available through APIs. Developers benefit from these because they avoid having to write some of their own code. Users benefit from a greater variety of and lower prices for applications. The economics of multisided platforms provides a set of tools for understanding the past, present, and future of software platforms.
  5. Multisided businesses can generate profits for themselves and benefits for their customers if they can figure out ways to increase and then capture indirect network externalities. There are three major ways in which they do this. First, they serve as matchmakers. Second, they build audiences. Advertising-supported media do mainly that: they use content to attract eyeballs and then sell access to those eyeballs to advertisers. Third, they reduce costs by providing shared facilities for the customers on each side. That’s the shopping mall case with which we began.
  6. Businesses in multisided markets often subsidize one side of the market to get the other side on board—sometimes explicitly by charging low or negative prices. A dating club may charge men a higher price just because they have more inelastic demand and because it is easy to identify that group of consumers. But businesses in multisided markets have an additional reason to price discriminate: by charging one group a lower price the business can charge another group a higher price; and unless prices are low enough to attract sufficient numbers of the former group, the business cannot obtain any sales at all. In contrast, economic analyses of multisided platforms, along with the industry case studies discussed in the following chapters, show that successful multisided platform businesses must pay careful attention to all relevant groups, and typically must worry more about balance among them than about building share with one of them. Getting the balance right seems to be more important than building shares. Platform markets do not tip quickly because as a practical matter, it takes time to get things right. And the first entrant often does not win in the end: many other firms may come in and successfully tweak the pricing structure, product design, or business model. The businesses that participate in such industries have to figure out ways to get both sides on board. One way to do this is to obtain a critical mass of users on one side of the market by giving them the service for free or even paying them to take it. Especially at the entry phase of firms in multisided markets, it is not uncommon to see precisely this strategy. Another way to solve the problem of getting the two sides on board simultaneously is to invest to lower the costs of consumers on one side of the market. As we saw earlier, for instance, Microsoft invests in the creation of software tools that make it easier for application developers to write application software for Microsoft operating systems and provides other assistance that makes developers’ jobs easier. In some cases, firms may initially take over one side of the business in order to get the market going.
  7. The copyleft provision means that if people choose to distribute software that is based in part on other software covered by the GPL, they must distribute their new software under the GPL. GPL software thereby propagates itself.
  8. Bundling features into the software platform is often efficient for the platform producer and for end users, as it is for most information goods, because it lowers distribution costs and expands demand.
  9. Multisided platforms must consider marginal costs and price sensitivity in pricing, like single-sided businesses, but they must also consider which side values the other side more. Software platforms generally charge low prices on one side in order to attract customers who can then be made available to the other side. Getting the balance right among all sides is more important than building market share.
  10. Per-copy charges also helped Microsoft capitalize on its investment in programming languages in the face of great uncertainty as to which computer makers would succeed. A flat fee would have earned less from the top sellers and would have discouraged other makers from even trying. Microsoft retained this basic pricing model when it went into the operating system business.
  11. In retrospect, having multiple operating systems run on a hardware platform is a poor strategy. The idea, of course, was to ensure that the hardware, not the operating system, became the standard that defined the platform and determined its evolution. Indeed, IBM followed an important economic principle for traditional industries: all firms would like everyone else in the supply chain to be competitive. IBM didn’t seem to recognize that this was far from a traditional industry. If IBM’s strategy had worked, and if several operating systems had been installed on substantial numbers of IBM PCs, what would have happened? Most likely, having multiple operating systems would have made the hardware platform less popular than having a single operating system. Applications are generally written for software platforms, not the underlying hardware. The more fragmented the installed base of operating systems, the less attractive it is to write an application for any one of them.
  12. Four key strategies helped Microsoft obtain the leading position in personal computers: (1) offering lower prices to users than its competitors; (2) intensely promoting API-based software services to developers; (3) promoting the development of peripherals, sometimes through direct subsidies, in order to increase the value of the Windows platform to developers and users; and (4) continually developing software services that provide value to developers directly and to end users indirectly.
  13. Technically, this is a two-part tariff, consisting of an access fee (the price of the razor) plus a usage fee (the price of the blade). Here the blade can be thought of as having two related roles. It meters the use of the durable good, and it sorts customers into those who are willing to pay more and those who are willing to pay less. These metering devices tend to increase profits and help companies better recover their fixed costs of investment. Because it is particularly attractive to make money on the blades, it is especially attractive to reduce the price of the razor, perhaps to below cost, or perhaps even to zero in extreme cases. For video game console makers this razorblade strategy made a lot of sense. Getting the console into the hands of many people increased the demand for the games it could play. Moreover, it made buying a console less risky for households, who had no good way of knowing how valuable the console would be until they saw the games produced for it. The game-console company, which was in the best position to forecast the quality of those games, took the risk: it lost money if consumers didn’t buy many games, and it made money if they did. The people who ultimately bought a lot of games were those who valued the console the most, so making profits mainly or even entirely on games enabled the console makers to earn the most from those willing to pay the most for their system (a worked sketch of this two-part tariff follows this list).
  14. When consumers value product differentiation and platforms can offer innovative and unique features, multiple platforms can coexist despite indirect network effects that make bigger better.
  15. The console video gaming industry operates a radically different business model from other software platform industries. Game manufacturers tightly integrate hardware and software systems; they offer consoles to consumers at less than manufacturing cost, and they earn profits by developing games and charging third-party game developers for access to their platforms.
  16. Palm, on the other hand, regrouped. It surveyed Zoomer buyers to find out what they liked and didn’t like, what they used and didn’t use: What these people said opened the company’s eyes. More than 90% of Zoomer owners also owned a PC. More than half of them bought Zoomer because of software (offered as an add-on) that transferred data to and from a PC. These were business users, not retail consumers. And they didn’t want to replace their PCs—they wanted to complement them. People weren’t asking for a PDA that was smart enough to compete with a computer. They wanted a PDA that was simple enough to compete with paper.
  17. When you’re playing Bobby Fischer—and you want to win—don’t play chess. Make sure whatever game you’re playing—be it network delivery of media vs. stand-alone PC, whatever you’re in—that you’re not playing a game someone else has mastered when you have an option to play another game. —Rob Glaser, Founder of RealNetworks, May 2001
  18. Interestingly, many are made by Microsoft, which integrated into mouse production in 1983 mainly to be sure that the sort of mouse specified by its nascent Windows system would be available in the marketplace. Microsoft developed and patented a mouse that could connect to a PC through an existing serial port rather than to a special card installed within the computer. This innovation reduced the cost of the mouse and thus of mouse-using computers running Windows. Apple as a vertically integrated hardware and software platform maker has always produced its own mice.
  19. What is the cure? From A’s point of view, one cure is to have many competing producers of good b. Competition will then hold the price of b close to cost (including a reasonable return on capital) regardless of A’s pricing, so that A both effectively determines the system price (via the price of a) and captures all the economic profit. Generally, it is more attractive to rely on others to supply a complement (instead of buying it or making it), all else equal, if there are many producers of that complement who compete intensely. Hence the common strategic advice, “Commoditize the complements.”
  20. In a famous 1951 paper, Nobel Laureate George Stigler argued that this proposition implies that “vertical disintegration is the typical development in growing industries, vertical integration in declining industries.”
  21. Interestingly, we are aware of no examples of software platforms that initially integrated into the applications/games/content that subsequently exited that business entirely. On the other hand, almost all such platforms have adopted a two-sided strategy and made significant investments in attracting third-party suppliers. Partial integration is the norm. The only exceptions are those successful software platform vendors that launched without integration; they have remained out of the applications business. The tendency of computer-based industries to disintegrate over time is even clearer—with interesting exceptions—when we consider integration with the supply of basic hardware and peripherals. The Microsoft strategy of having the hardware complement its operating system produced by a competitive, technologically dynamic industry has served to make its operating systems more valuable and to speed their market penetration. Microsoft is not above using integration on occasion to stimulate important markets for complements, as its entry into mouse production, discussed earlier, illustrates.
  22. In a rephrasing of Mr. Katz’s words, Michael Dell told Microsoft upon refusing the Xbox deal offered to him: When Sony cuts the prices on their PlayStations, their stock price goes up. Every time I cut prices, my stock price goes down. If you don’t understand why that happens, you don’t understand the console business. I understand why this is strategic to Microsoft. I don’t understand why this is strategic to Dell.
  23. “Oh, ‘tanstaafl.’ Means ‘There ain’t no such thing as a free lunch.’ And isn’t,” I added, pointing to a FREE LUNCH sign across room, “or these drinks would cost half as much. Was reminding her that anything free costs twice as much in the long run or turns out worthless.” —Robert Heinlein
  24. In practice, it generally does matter which side pays, because two key assumptions made in the textbook discussion don’t apply. First, there are often significant transactions costs that prevent the customers on the two sides of most markets from just “sorting it out” themselves. Take the payment card example. Although most card systems prohibit merchant surcharging because it degrades the value of their product to cardholders, several countries have barred card systems from imposing such a no-surcharge rule. In those countries, however, most merchants don’t surcharge. One reason is that it is costly to impose small charges on customers. Those merchants that do surcharge often charge more than they are charged by the card system—an indication that they are using the fact that a customer wants to use her card as a basis for groupwise price discrimination.
  25. When balance matters in a mature two-sided business, the pricing problem is much more complex than in a single-sided business. Marginal cost and price responsiveness on both sides matter for both prices, and so does the pattern of indirect network effects. In general, if side A cares more about side B than B cares about A, then, all else equal, A will contribute more total revenue. Thus, newspapers make their money from selling advertising, not from selling papers. The textbook pricing formula for a single-sided market gives the optimal markup over marginal cost as 1 over a measure of price responsiveness (the price elasticity of demand), so low price responsiveness implies high markups (a compact rendering of this formula follows this list). The corresponding formula for a two-sided business involves marginal costs on both sides, price responsiveness on both sides, and measures of the strength of indirect network effects in both directions. In particular, balance may require charging a price below marginal cost to a group with low price responsiveness, something a single-sided business would never do, if it is critical to attract members of that group in order to get members of the other group on board.
  26. The idea is initially to subsidize one side (or, more generally, to do whatever it takes) in order to get it on board even though the other side is not yet on board, and to use the presence of the subsidized side to attract the other side. This differs from the single-sided penetration pricing strategy discussed above because the key here is to generate indirect network effects, to use the subsidized side as a magnet to attract the other side. After entry has been successfully effected and both sides are on board, of course, the rationale for the initial subsidy vanishes, and one would expect to see a corresponding shift in pricing policy. One of the regularities we discuss below, however, is that pricing structures—the relative amounts paid by the various sides—appear fairly robust over time; there are not many examples of pricing low to one side at first and then raising prices significantly later.
  27. A fundamental decision facing all multisided platform businesses is choice of a price structure: How much should the platform vendor charge each side relative to the others? Since transactions involving some sides may have significant associated variable costs (the production and distribution costs of video game consoles, for instance), the most illuminating way to analyze observed price structures is to look at the contributions of each side to gross margin or variable profits: revenue minus side-specific variable cost. Should a two-sided platform derive most of its gross margin from one side of the market, and if so, which side, or should it choose a more balanced structure, with both sides making significant contributions to gross margin?
  28. Like all multisided platforms, the pricing structures of the software platforms we have encountered in this book reflect the need to get all unintegrated sides on board: end users, application/game/content developers, and manufacturers of hardware and peripheral equipment. The structures we have examined have three remarkable features. First, all of them are extremely skewed: almost all earn a disproportionate share of their variable profits on only one side of the market, either end users or developers. Second, for all but video games, the platform earns the bulk of its net revenues from end users. The third remarkable feature, which we consider in the next section, is that these structures have been stable over time.
  29. Three ways a firm can sell two products, A and B:
    1. Components selling occurs when the firm offers A and B separately (cars and bicycle racks)
    2. Pure bundling occurs when the firm only offers A and B together as a single bundled product, AB (men’s laced shoes)
    3. Mixed bundling occurs when the firm offers the bundle AB and either or both of its components, A and B (such as the Sunday New York Times and the New York Times Book Review)
  30. It is common to bundle together products that are complements, such as automobiles and tires, but firms may find that it pays to bundle products that aren’t complements. We already saw an example of this above. Bundling persuaded two consumers to buy a product even though each wanted only a single component. This saved the manufacturer costs. The idea that bundling of noncomplements can be used to enhance profits goes back to a classic paper by Nobel Prize winning economist George Stigler. Stigler tried to explain why movie distributors at one time required theaters to take bundles of pictures. Bundling can be used in a different way to facilitate price discrimination, which we discussed in the preceding chapter. That is, if different groups of consumers place different values on groups of components, bundles can be designed so that those with stronger demand pay more. The idea is that it is possible to design bundles of components that cause consumers to sort themselves by the bundles they choose into groups with different willingness to pay. (Marketers call this “segmentation.”) In the case of autos, some will want the car with the sports package, while others will want only the basic package. The seller can then charge a premium to groups that have a particularly high demand for a particular package and offer an especially aggressive price to consumers that are very sensitive to price but are also willing to take the no-frills deal. For this to work, there must be a predictable correlation between combinations of components and demand (for example, price-sensitive consumers generally have a low demand for frills). A number of studies have found, for example, that automobile companies have much higher markups on luxury models than on base models. Bundling drives innovation and creates industries. (A worked example of the Stigler bundling logic follows this list.)
  31. The ability to select bundles of features to sell helps firms segment their customers, control costs, and enhance profits. Bundled products offer consumers convenience, lower costs, and products tailored to their needs and wants.
  32. Bundling decisions by multisided platforms, such as software platforms, are more complex since they must take into account the effect on all customer groups. Multisided businesses must consider both the additional customers they get on one side as a result of including a new feature and the additional customers they will get on the other side from having those additional customers. They may also include features that harm one side directly but benefit the platform overall by getting more customers on board on another side.
  33. Bundling makes sense for businesses whenever the cost of adding additional features is lower than the additional sales generated thereby—even if most purchasers do not value or use all the features in a product bundle.
  34. Creative destruction has been a hallmark of economic progress for millennia, but it has proceeded at a glacial pace for most of history. The Industrial Revolution sped this process up. Even so, it took decades for change to filter through the economy following innovations such as the spinning jenny, steam engine, and electric generator. The information technology revolution has quickened the pace of industrial change greatly. The plummeting costs of computer processing and storage make it possible to create products and industries that were not only infeasible but also unimaginable a few years earlier. Software platforms further accelerate the process of creative destruction, mainly because code is digital and malleable. Think how easy it is to add a new feature to a software platform and distribute that change electronically over the Internet to potentially billions of computing devices around the world.
  35. One is familiar: developers. TiVo is evangelizing its software platform by providing tools and offering prizes for the best applications in several categories, including games, music, and photos.
  36. History teaches us that it takes decades for technological changes to work their way through the economy, destroying, creating, and transforming industries. The third industrial revolution got off to a quick start. We suspect that it will continue through at least the first few decades of the twenty-first century and that our invisible engines will ultimately touch most aspects of our business and personal lives.
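
To make the razor-and-blades pricing in takeaway 13 concrete, here is a minimal Python sketch with hypothetical numbers (the console price, cost, and per-game royalty are invented for illustration, not figures from the book) showing how a two-part tariff meters usage and sorts customers by willingness to pay.

```python
# Minimal sketch with hypothetical numbers (not from the book) of a two-part
# tariff: an access fee (the razor/console) plus a usage fee (the blades/games).

CONSOLE_PRICE = 250      # access fee the household pays
CONSOLE_COST = 300       # hypothetical manufacturing cost: console sold below cost
ROYALTY_PER_GAME = 7     # usage fee the platform collects per game sold

def platform_profit(games_bought: int) -> int:
    """Platform profit from one household over the console's lifetime."""
    access_margin = CONSOLE_PRICE - CONSOLE_COST     # negative: the razor is subsidized
    usage_margin = games_bought * ROYALTY_PER_GAME   # the blades recover the subsidy
    return access_margin + usage_margin

for games in (2, 8, 20):  # light, average, and heavy gamers
    print(f"{games:>2} games -> platform profit ${platform_profit(games)}")

# Light users are effectively subsidized, while heavy users -- the people who
# value the console the most -- end up contributing the most profit.
```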
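
The “textbook pricing formula” referenced in takeaway 25 is the standard single-sided Lerner rule; a compact rendering, under the usual textbook assumptions, is:

```latex
% Optimal markup over marginal cost equals 1 over the price elasticity of demand.
\[
  \frac{p - c}{p} \;=\; \frac{1}{\lvert \varepsilon \rvert}
\]
```

The two-sided analogue the authors describe keeps this structure but also involves marginal costs and price responsiveness on both sides plus the strength of the cross-side network effects, which is why a platform can rationally price one side below marginal cost.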
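
As a worked illustration of the Stigler-style bundling argument in takeaway 30, here is a minimal Python sketch with invented valuations for two customers with opposite tastes:

```python
# Hypothetical Stigler-style illustration (all numbers invented) of why bundling
# two non-complements, A and B, can raise revenue when customers have opposite tastes.

valuations = {
    "customer_1": {"A": 8, "B": 3},
    "customer_2": {"A": 3, "B": 8},
}

components_price = 8   # best single per-product price here: only high-value buyers purchase
bundle_price = 11      # equal to each customer's combined valuation for A and B

# Components selling: a customer buys a product only if their valuation covers the price.
components_revenue = sum(
    components_price
    for buyer in valuations.values()
    for value in buyer.values()
    if value >= components_price
)

# Pure bundling: a customer buys the bundle if their combined valuation covers its price.
bundle_revenue = sum(
    bundle_price
    for buyer in valuations.values()
    if sum(buyer.values()) >= bundle_price
)

print(f"components selling: ${components_revenue}, pure bundling: ${bundle_revenue}")
# Prints $16 vs. $22: bundling evens out differences in willingness to pay across
# components, which is the segmentation logic described in the takeaway above.
```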

What I got out of it

  1. Some of the examples are a bit outdated but the principles are just as valuable as ever – how to think about multisided markets, pricing, positioning, and so much more

The Innovator’s Solution: Creating and Sustaining Successful Growth by Clayton Christensen

Summary

  1. The solution to the innovator’s dilemma is two-fold: first, get top-level commitment by framing an innovation as a threat during the resource allocation process; later, shift responsibility for the project to an autonomous organization that can frame it as an opportunity 

Key Takeaways

  1. Overview
    1. The structures and initial conditions that are required for successful growth are enumerated in the chapters in this book. They include starting with a cost structure in which attractive profits can be earned at low price points and which can then be carried up market; being in a disruptive position relative to competitors so that they are motivated to flee rather than fight; starting with a set of customers who had been nonconsumers so that they are pleased with modest products; targeting a job that customers are trying to get done; skating to where the money will be, not to where it was; assigning managers who have taken the right courses in the school of experience and putting them to work within processes and organizational values that are attuned to what needs to be done; having the flexibility to respond as a viable strategy emerges; and starting with capital that can be patient for growth. If you start in conditions such as these, you do not need to see deeply into the future. Attractive choices that lead to success will present themselves. It is when you start in conditions that are opposite to these that attractive options may not appear, and the right choices will be difficult to make
    2. Never copy others. One of the most valuable contributions you can make is to keep watching for changes in circumstances. If you do this, you can understand when and why changes need to be made long before the evidence is clear to those whose vision is not clarified by theory. 
    3. Never say yes to a strategy that targets customers and markets that look attractive to an established competitor. 
    4. If your team targets customers who already are using pretty good products, send them back to see if they can find a way to compete against nonconsumption
    5. If there are no nonconsumers available, ask your team to explore whether a low-end disruption is feasible
    6. Never try to change the behavior or process of the customer
    7. Segment the market in ways that mirror the jobs that customers are trying to get done
    8. Look towards the low-end for the opportunity to change the basis of competition
    9. Develop competencies where money will be made in the future rather than where it was made in the past 
    10. Integration to modularity is a key cycle: competition forces modularity, which leads to power, which leads to integration (the rise and fall)
    11. Be impatient for profit and keep your company growing so that you can be patient for growth 
  2. Executives must answer 3 sets of questions to determine whether an idea has disruptive potential
    1. Is there a large population of people who have not had the money, equipment, or skill to do this thing for themselves, and as a result have gone without it altogether or have needed to pay someone with more expertise to do it for them?
    2. To use the product or service, do customers need to go to an inconvenient, centralized location?
    3. Are there customers at the low end of the market who would be happy to purchase a product with less (but good enough) performance if they could get it at a lower price?
    4. Can we create a business model that enables us to earn attractive profits at the discount prices required to win the business of these overserved customers at the low end?
    5. Is the innovation disruptive to all of the significant incumbents in the industry? If it appears to be sustaining to one or more significant players in the industry, then the odds will be stacked in that firm’s favor, and the entrant is unlikely to win
  3. Other
    1. Need to develop products for the circumstance and not the customer, the chain needs to communicate the circumstance, and not necessarily to the customer
    2. There needs to be symmetry of motivation across the entire chain of entities that add value to the product on its way to the end customer. Win/win
    3. Disruption causes others to be uninterested in what you are doing. This is exactly what you want with competitors: you want them to ignore you. But offering something that is disruptively unattractive to your customers – which includes all of the downstream entities that compose your channel – spells disaster. Companies in your channel are customers with a job to get done, which is to grow profitably
    4. To succeed with a nonintegrated, specialist strategy, you need to be certain you’re competing in a modular world. When the functionality and the reliability of a product are not good enough to meet customers’ needs, then the companies that will enjoy significant competitive advantage are those whose product architectures are proprietary and that are integrated across the performance-limiting interfaces in the value chain. 
    5. Core competence, as it is used by many managers, is a dangerously inward-looking notion. Competitiveness is far more about doing what customers value than doing what you think you’re good at. 
    6. The Law of conservation of attractive profits states that in the value chain there is a requisite juxtaposition of modular and interdependent architectures, and of reciprocal processes of commoditization and decommoditization, that exists in order to optimize the performance of what is not good enough. The law states that when modularity and commoditization cause attractive profits to disappear at one stage in the value chain, the opportunity to earn attractive profits with proprietary products will usually emerge at an adjacent stage. 
    7. Emergent processes should dominate in circumstances in which the future is hard to read and in which it is not clear what the right strategy should be. This is almost always the case during the early phases of a company’s life. However, the need for emergent strategy arises whenever a change in circumstances portends that the formula that worked in the past may not be as effective in the future. On the other hand, the deliberate strategy process should be dominant once a winning strategy has become clear, because in those circumstances effective execution often spells the difference between success and failure
    8. Senior executives have 3 jobs when it comes to disruptive growth: be the interface between disruptive and mainstream business and determine which of the corporation’s resources should go to the new business and which should not; shepherd the creation of a process we call the disruptive growth engine; and to sense when the circumstances are changing and to keep teaching others to recognize these signals – start before you need to, get a senior manager in charge, get an expert team of movers and shapers, train the troops 

What I got out of it

  1. Great to read these books alongside Moore’s books – so much to learn from the general progression and regression of entire industries, what to look out for, and how to take advantage of them

The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail by Clayton Christensen

Summary

  1. The research reported in this book supports his latter view: it shows that in the cases of well-managed firms, good management was the most powerful reason they failed to stay atop their industries. Precisely because these firms listened to their customers, invested heavily in new technologies that would provide their customers more and better products of the sort they wanted, and because they carefully studied market trends and systematically allocated investment capital to innovations that promised the best returns, they lost their positions of leadership. What this implies at a deeper level is that many of what are now widely accepted principles of good management are, in fact, only situationally appropriate. There are times at which it is right not to listen to customers, right to invest in developing lower-performance products that promise lower margins, and right to aggressively pursue small, rather than substantial, markets.

Key Takeaways

  1. One common theme to all of these failures, however, is that the decisions that led to failure were made when the leaders in question were widely regarded as among the best companies in the world
  2. The failure framework is built upon 3 findings. The first is that there is a strategically important distinction between what I call sustaining technologies and those that are disruptive. Second, the pace of technological progress can, and often does, outstrip what markets need. This means that the relevance and competitiveness of different technological approaches can change with respect to different markets over time. And third, customers and financial structures of successful companies color heavily the sorts of investments that appear to be attractive to them, relative to certain types of entering firms
  3. Case for investing in disruptive technologies can’t be made confidently until it is too late
  4. Established firms confronted with disruptive technology typically viewed their primary development challenge as a technological one: to improve the disruptive technology enough that it suits known markets. In contrast, the firms that were most successful in commercializing a disruptive technology were those framing their primary development challenge as a marketing one: to build or find a market where product competition occurred along dimensions that favored the disruptive attributes of the product. 
  5. It has almost always been the case that disruptive products redefine the dominant distribution channels, because dealers’ economics – their models for how to make money – are powerfully shaped by the mainstream value network, just as the manufacturers’ are.
  6. Principles of disruptive innovation
    1. Companies depend on customers and investors for resources – difficult for companies tailored for high-end markets to compete in low-end markets as well. Creating an independent organization that can compete in these disruptive technologies is the only viable way for established firms to harness this principle. Promise of upmarket margins, simultaneous upmarket movement of customers, and the difficulty of cutting costs to move downmarket profitably create a powerful barrier to downward mobility. In fact, cultivating a systematic approach to weeding out new product development initiatives that would likely lower profits is one of the most important achievements of any well-managed company. Creates a vacuum in the low-end market that attracts competition
    2. Small markets don’t solve the growth needs of large companies – create small organizations that get excited about small opportunities and small wins
    3. Markets that don’t exist can’t be analyzed – those who need analysis and quantification before they invest become paralyzed when faced with disruptive technologies
    4. Technology supply may not equal market demand – sometimes “good enough” is competitive and established firms tend to overshoot what the market demands. Moves from functionality to reliability to convenience to price
    5. Not wise to always be a technological leader or a follower – need to take distinctly different postures depending on whether they are addressing a disruptive or sustaining technology. Disruptive technologies have a large first-mover advantage and leadership is important

What I got out of it

  1. Great way to think about how you could do all the right things and still lose. Helmer’s counterpositioning in action

Inside the Tornado by Geoffrey Moore

Summary

  1. For those within the high tech sector, or who manage investments in these companies, this imperative translates into a series of deceptively simple questions: what can we do during a tornado to best capitalize on our opportunity? How can we tell when one is coming, and what can we do to prepare? How can we sense when it is ending, and what should we do then? Finally, going forward, how can we reframe our strategic management concepts to better accommodate tornado market dynamics in general?

Key Takeaways

  1. The winning strategy does not just change as we move from stage to stage, it actually reverses the prior strategy. This is why this is so difficult and counterintuitive – what made you successful at an earlier stage causes failure at later stages. Early stages you must not segment, in the chasm and bowling alley you must segment, in the tornado you must not segment, on main street you must segment
  2. Truly discontinuous innovations – paradigm shifts – are new products or services that require the end user and the marketplace to dramatically change their past behavior, with the promise of gaining equally dramatic new benefits. 
  3. The only way to cross the chasm is to put all your eggs in one basket. That is, key to a winning strategy is to identify a single beachhead of pragmatist customers in a mainstream market segment and to accelerate the formation of 100% of their whole product. The goal is to win a niche foothold in the mainstream as quickly as possible – that is what is meant by crossing the chasm. Then, once in the tornado, you need to quickly switch strategies and gain mass market share at any cost, positioning your products horizontally as global infrastructure
    1. Many leaders are not cut out to lead the company through each of these phases. That’s fine and to be expected, but know what stage you’re in, what type of CEO you have, and when they might need to be replaced 
  4. Once any infrastructure is substantially deployed, power shifts from the builders – the professional services firms – to the operators, or what we have come to call the transaction services firms. The key to the transaction services model is that the requisite infrastructure has already been assimilated (keeping support costs down) and amortized (minimizing ongoing investment)
  5. For every stage of the technology adoption life cycle, there is an optimal business model
    1. early market – professional services. 
    2. bowling alley – application products
    3. tornado – infrastructure products – a period of mass-market adoption when the general marketplace switches over to the new infrastructure paradigm
    4. main street – transaction services
  6. This sequence of events unleashes a vortex of market demand. Infrastructure, to be useful, must be standard and global, so once the market moves to switch out the old for the new, it wants to complete this transition as rapidly as possible. All the pent-up interest in the product is thus converted into a massive purchasing binge, causing demand to vastly outstrip supply. Companies grow at hypergrowth rates, with billions of dollars of revenue seeming to appear from out of nowhere.
  7. Overview of the tech adoption life cycle
    1. The forces that operate in the bowling alley argue for a niche-based strategy that is highly customer-centric
    2. Those in the tornado push in the opposite direction toward a mass-market strategy for deploying a common standard infrastructure
    3. Then on Main St., market forces push back again toward a customer-centric approach, focusing on specific adaptations of this infrastructure for added value through mass customization
    4. Given these dramatic reversals in strategy, it is imperative that organizations be able to agree on where their markets are in the life cycle
    5. In the meantime, the economic cataclysm of the tornado deconstructs and reconstructs the power structure in the market so rapidly that simply understanding who is friend and who is foe becomes a challenge
    6. Within the newly emerging market structure, companies must compete for advantage based on their status within it
    7. Positioning in this context consists of a company taking its rightful place in the hierarchy of power and defending it against challengers
    8. And finally, moving fluidly from strategy to strategy is the ultimate challenge of any organization, demanding an extraordinarily flexible response from its management team
  8. Safe path is to overinvest when invading any new segment, seeking to accelerate market leadership, and then divert resources as soon as the position is achieved
  9. Post tornado market share by revenue tends to be 50% for the gorilla, 15% for chimp 1, 15% for chimp 2, and 30% for the monkeys 
  10. The lessons that Oracle taught – attack the competition ruthlessly, expand your distribution channel as fast as possible, ignore the customer
  11. The lessons that HP taught – just ship, extend distribution channels, drive to the next lower price point
  12. The lessons that Wintel taught – recruit partners to create a powerful whole product, institutionalize this whole product as the market leader, commoditize the whole product by designing out your partners
  13. +1 opportunities – what do we have to offer at little or no incremental cost to ourselves that the market would pay us more money for? Compelling fantasy like Nike and Mont Blanc do this better than anyone
  14. Recap
    1. Bowling alley: product leadership, customer intimacy
    2. Tornado: product leadership, operational excellence
    3. Main St: operational excellence, customer intimacy
  15. Trust, it turns out, is a complicated and challenging relationship, as much so in business as in parenting or marriage. Like everything else we have been discussing in recent chapters, it is ultimately about power. The paradox of trust is that by intelligently relinquishing power, one gains it back many times over. Once you reach your personal limits, this is the only economy of scale that can help. And because hypergrowth markets will push you to your personal limits faster than most other challenges in business, this is a fitting thought on which to close this book

What I got out of it

  1. Fascinating insights into the paradoxical path that it takes to be successful with technologically disruptive companies

The Mythical Man-Month: Essays on Software Engineering by Frederick P. Brooks

Summary

  1. This book is a belated answer to Tom Watson’s probing questions as to why programming is hard to manage…Briefly, I believe that large programming projects suffer management problems different in kind from small ones, due to division of labor. I believe the critical need to be the preservation of the conceptual integrity of the product itself. These chapters explore both the difficulties of achieving this unity and methods for doing so. The later chapters explore other aspects of software engineering management….The Mythical Man-Month is only incidentally about software but primarily about how people in teams make things. There is surely some truth in this

Key Takeaways

  1. As a rule of thumb, I estimate that a programming product costs at least three times as much as a debugged program with the same function…A programming systems component costs at least three times as much as a stand-alone program of the same function. Put together, a complete programming systems product costs roughly nine times as much as a simple program.
  2. First, one must perform perfectly. The computer resembles the magic of legend in this respect, too. If one character, one pause, of the incantation is not strictly in proper form, the magic doesn’t work. Human beings are not accustomed to being perfect, and few areas of human activity demand it. Adjusting to the requirement for perfection is, I think, the most difficult part of learning to program. Next, other people set one’s objectives, provide one’s resources, and furnish one’s information. One rarely controls the circumstances of his work, or even its goal. In management terms, one’s authority is not sufficient for his responsibility. It seems that in all fields, however, the jobs where things get done never have formal authority commensurate with responsibility. In practice, actual (as opposed to formal) authority is acquired from the very momentum of accomplishment. The dependence upon others has a particular case that is especially painful for the system programmer. He depends upon other people’s programs. These are often maldesigned, poorly implemented, incompletely delivered (no source code or test cases), and poorly documented. So he must spend hours studying and fixing things that in an ideal world would be complete, available, and usable. The next woe is that designing grand concepts is fun; finding nitty little bugs is just work. With any creative activity come dreary hours of tedious, painstaking labor, and programming is no exception.
  3. The challenge and the mission are to find real solutions to real problems on actual schedules with available resources.
  4. More software projects have gone awry for lack of calendar time than for all other causes combined. Why is this cause of disaster so common? First, our techniques of estimating are poorly developed. More seriously, they reflect an unvoiced assumption which is quite untrue, i.e., that all will go well. Second, our estimating techniques fallaciously confuse effort with progress, hiding the assumption that men and months are interchangeable.
    1. Key point – men and months are not interchangeable, but we make assumptions that they are
  5. Fifth, when schedule slippage is recognized, the natural (and traditional) response is to add manpower. Like dousing a fire with gasoline, this makes matters worse, much worse. More fire requires more gasoline, and thus begins a regenerative cycle which ends in disaster.
  6. For the human makers of things, the incompletenesses and inconsistencies of our ideas become clear only during implementation. Thus it is that writing, experimentation, “working out” are essential disciplines for the theoretician.
  7. The second fallacious thought mode is expressed in the very unit of effort used in estimating and scheduling: the man-month. Cost does indeed vary as the product of the number of men and the number of months. Progress does not. Hence the man-month as a unit for measuring the size of a job is a dangerous and deceptive myth. It implies that men and months are interchangeable.
  8. The bearing of a child takes nine months, no matter how many women are assigned. Many software tasks have this characteristic because of the sequential nature of debugging.
  9. Since software construction is inherently a systems effort—an exercise in complex interrelationships—communication effort is great, and it quickly dominates the decrease in individual task time brought about by partitioning. Adding more men then lengthens, not shortens, the schedule. (A toy model of this tradeoff is sketched after this list.)
  10. For some years I have been successfully using the following rule of thumb for scheduling a software task: 1/3 planning, 1/6 coding, 1/4 component test and early system test, 1/4 system test with all components in hand. (A small worked split follows this list.)
  11. Programming managers have long recognized wide productivity variations between good programmers and poor ones. But the actual measured magnitudes have astounded all of us. In one of their studies, Sackman, Erikson, and Grant were measuring performances of a group of experienced programmers. Within just this group the ratios between best and worst performances averaged about 10:1 on productivity measurements and an amazing 5:1 on program speed and space measurements! In short the $20,000/year programmer may well be 10 times as productive as the $10,000/year one. The converse may be true, too. The data showed no correlation whatsoever between experience and performance. (I doubt if that is universally true.)
  12. I have earlier argued that the sheer number of minds to be coordinated affects the cost of the effort, for a major part of the cost is communication and correcting the ill effects of miscommunication (system debugging). This, too, suggests that one wants the system to be built by as few minds as possible.
  13. The dilemma is a cruel one. For efficiency and conceptual integrity, one prefers a few good minds doing design and construction. Yet for large systems one wants a way to bring considerable manpower to bear, so that the product can make a timely appearance. How can these two needs be reconciled? A proposal by Harlan Mills offers a fresh and creative solution. Mills proposes that each segment of a large job be tackled by a team, but that the team be organized like a surgical team rather than a hog-butchering team. That is, instead of each member cutting away on the problem, one does the cutting and the others give him every support that will enhance his effectiveness and productivity.
  14. Even though they have not taken centuries to build, most programming systems reflect conceptual disunity far worse than that of cathedrals. Usually this arises not from a serial succession of master designers, but from the separation of design into many tasks done by many men. I will contend that conceptual integrity is the most important consideration in system design. It is better to have a system omit certain anomalous features and improvements, but to reflect one set of design ideas, than to have one that contains many good but independent and uncoordinated ideas.
  15. The purpose of a programming system is to make a computer easy to use…Because ease of use is the purpose, this ratio of function to conceptual complexity is the ultimate test of system design. Neither function alone nor simplicity alone defines a good design. This point is widely misunderstood. As soon as ease of use is held up as the criterion, each of these is seen to be unbalanced, reaching for only half of the true goal. Ease of use, then, dictates unity of design, conceptual integrity. Conceptual integrity in turn dictates that the design must proceed from one mind, or from a very small number of agreeing resonant minds.
  16. Architecture must be carefully distinguished from implementation. As Blaauw has said, “Where architecture tells what happens, implementation tells how it is made to happen.”
  17. Not trivial, however, is the principle that such mini-decisions be made consistently throughout.
  18. In most computer projects there comes a day when it is discovered that the machine and the manual don’t agree. When the confrontation follows, the manual usually loses, for it can be changed far more quickly and cheaply than the machine.
  19. The project manager’s best friend is his daily adversary, the independent product-testing organization. This group checks machines and programs against specifications and serves as a devil’s advocate, pinpointing every conceivable defect and discrepancy. Every development organization needs such an independent technical auditing group to keep it honest.
  20. The second reason for the project workbook is control of the distribution of information. The problem is not to restrict information, but to ensure that relevant information gets to all the people who need it.
  21. The purpose of organization is to reduce the amount of communication and coordination necessary; hence organization is a radical attack on the communication problems treated above.
  22. The means by which communication is obviated are division of labor and specialization of function.
  23. On larger projects it is very rarely workable, for two reasons. First, the man with strong management talent and strong technical talent is rarely found. Thinkers are rare; doers are rarer; and thinker-doers are rarest.
  24. Practice is the best of all instructors. —PUBLILIUS
  25. Experience is a dear teacher, but fools will learn at no other. —POOR RICHARD’S ALMANAC
  26. The linear extrapolation of such sprint figures is meaningless. Extrapolation of times for the hundred-yard dash shows that a man can run a mile in under three minutes.
  27. Fostering a total-system, user-oriented attitude may well be the most important function of the programming manager.
  28. First, writing the decisions down is essential. Only when one writes do the gaps appear and the inconsistencies protrude. The act of writing turns out to require hundreds of mini-decisions, and it is the existence of these that distinguishes clear, exact policies from fuzzy ones.
  29. Chemical engineers learned long ago that a process that works in the laboratory cannot be implemented in a factory in only one step. An intermediate step called the pilot plant is necessary to give experience in scaling quantities up and in operating in nonprotective environments. For example, a laboratory process for desalting water will be tested in a pilot plant of 10,000 gallon/day capacity before being used for a 2,000,000 gallon/day community water system.
    1. Scaling Effects
  30. In most projects, the first system built is barely usable. It may be too slow, too big, awkward to use, or all three. There is no alternative but to start again, smarting but smarter, and build a redesigned version in which these problems are solved. The discard and redesign may be done in one lump, or it may be done piece-by-piece. But all large-system experience shows that it will be done. Where a new system concept or new technology is used, one has to build a system to throw away, for even the best planning is not so omniscient as to get it right the first time. The management question, therefore, is not whether to build a pilot system and throw it away. You will do that. The only question is whether to plan in advance to build a throwaway, or to promise to deliver the throwaway to customers. Seen this way, the answer is much clearer.
  31. Structuring an organization for change is much harder than designing a system for change.
  32. “Things are always at their best in the beginning,” said Pascal. C. S. Lewis has stated it more perceptively: That is the key to history. Terrific energy is expended—civilizations are built up—excellent institutions devised; but each time something goes wrong. Some fatal flaw always brings the selfish and cruel people to the top, and then it all slides back into misery and ruin. In fact, the machine conks. It seems to start up all right and runs a few yards, and then it breaks down.
  33. A good workman is known by his tools. —PROVERB
  34. The most pernicious and subtle bugs are system bugs arising from mismatched assumptions made by the authors of various components.
  35. Many poor systems come from an attempt to salvage a bad basic design and patch it with all kinds of cosmetic relief. Top-down design reduces the temptation. I am persuaded that top-down design is the most important new programming formalization of the decade.
  36. Add one component at a time. This precept, too, is obvious, but optimism and laziness tempt us to violate it.
  37. Lehman and Belady offer evidence that quanta should be very large and widely spaced or else very small and frequent. The latter strategy is more subject to instability, according to their model. My experience confirms it: I would never risk that strategy in practice.
  38. How does one control a big project on a tight schedule? The first step is to have a schedule. Each of a list of events, called milestones, has a date. Picking the dates is an estimating problem, discussed already and crucially dependent on experience. For picking the milestones there is only one relevant rule. Milestones must be concrete, specific, measurable events, defined with knife-edge sharpness. It is more important that milestones be sharp-edged and unambiguous than that they be easily verifiable by the boss. Rarely will a man lie about milestone progress, if the milestone is so sharp that he can’t deceive himself. But if the milestone is fuzzy, the boss often understands a different report from that which the man gives.  Sharp milestones are in fact a service to the team, and one they can properly expect from a manager. The fuzzy milestone is the harder burden to live with. It is in fact a millstone that grinds down morale, for it deceives one about lost time until it is irremediable. And chronic schedule slippage is a morale-killer.
  39. The preparation of a PERT chart is the most valuable part of its use. Laying out the network, identifying the dependencies, and estimating the legs all force a great deal of very specific planning very early in a project. The first chart is always terrible, and one invents and invents in making the second one. (A minimal critical-path sketch follows this list.)
  40. Most of the big past gains in software productivity have come from removing artificial barriers that have made the accidental tasks inordinately hard, such as severe hardware constraints, awkward programming languages, lack of machine time. How much of what software engineers now do is still devoted to the accidental, as opposed to the essential? Unless it is more than 9/10 of all effort, shrinking all the accidental activities to zero time will not give an order of magnitude improvement. Therefore it appears that the time has come to address the essential parts of the software task, those concerned with fashioning abstract conceptual structures of great complexity. I suggest:
    1. Exploiting the mass market to avoid constructing what can be bought.
    2. Using rapid prototyping as part of a planned iteration in establishing software requirements.
    3. Growing software organically, adding more and more function to systems as they are run, used, and tested.
    4. Identifying and developing the great conceptual designers of the rising generation.
  41. The gap between the best software engineering practice and the average practice is very wide—perhaps wider than in any other engineering discipline. A tool that disseminates good practice would be important.
  42. The development of the mass market is, I believe, the most profound long-run trend in software engineering. The cost of software has always been development cost, not replication cost. Sharing that cost among even a few users radically cuts the per-user cost. Another way of looking at it is that the use of n copies of a software system effectively multiplies the productivity of its developers by n. That is an enhancement of the productivity of the discipline and of the nation.
  43. No other part of the conceptual work is so difficult as establishing the detailed technical requirements, including all the interfaces to people, to machines, and to other software systems. No other part of the work so cripples the resulting system if done wrong. No other part is more difficult to rectify later. Therefore the most important function that software builders do for their clients is the iterative extraction and refinement of the product requirements. For the truth is, the clients do not know what they want. They usually do not know what questions must be answered, and they almost never have thought of the problem in the detail that must be specified.
  44. I would go a step further and assert that it is really impossible for clients, even those working with software engineers, to specify completely, precisely, and correctly the exact requirements of a modern software product before having built and tried some versions of the product they are specifying. Therefore one of the most promising of the current technological efforts, and one which attacks the essence, not the accidents, of the software problem, is the development of approaches and tools for rapid prototyping of systems as part of the iterative specification of requirements. A prototype software system is one that simulates the important interfaces and performs the main functions of the intended system, while not being necessarily bound by the same hardware speed, size, or cost constraints. Prototypes typically perform the mainline tasks of the application, but make no attempt to handle the exceptions, respond correctly to invalid inputs, abort cleanly, etc. The purpose of the prototype is to make real the conceptual structure specified, so that the client can test it for consistency and usability.
  45. Incremental development—grow, not build, software. I still remember the jolt I felt in 1958 when I first heard a friend talk about building a program, as opposed to writing one. In a flash he broadened my whole view of the software process. The metaphor shift was powerful, and accurate. Today we understand how like other building processes the construction of software is, and we freely use other elements of the metaphor, such as specifications, assembly of components, and scaffolding. The building metaphor has outlived its usefulness. It is time to change again. If, as I believe, the conceptual structures we construct today are too complicated to be accurately specified in advance, and too complex to be built faultlessly, then we must take a radically different approach. Let us turn to nature and study complexity in living things, instead of just the dead works of man. Here we find constructs whose complexities thrill us with awe. The brain alone is intricate beyond mapping, powerful beyond imitation, rich in diversity, self-protecting, and self-renewing. The secret is that it is grown, not built. So it must be with our software systems. Some years ago Harlan Mills proposed that any software system should be grown by incremental development. That is, the system should first be made to run, even though it does nothing useful except call the proper set of dummy subprograms. Then, bit by bit it is fleshed out, with the subprograms in turn being developed into actions or calls to empty stubs in the level below. I have seen the most dramatic results since I began urging this technique on the project builders in my software engineering laboratory class. Nothing in the past decade has so radically changed my own practice, or its effectiveness. The approach necessitates top-down design, for it is a top-down growing of the software. It allows easy backtracking. It lends itself to early prototypes. Each added function and new provision for more complex data or circumstances grows organically out of what is already there. The morale effects are startling. Enthusiasm jumps when there is a running system, even a simple one. Efforts redouble when the first picture from a new graphics software system appears on the screen, even if it is only a rectangle. One always has, at every stage in the process, a working system. I find that teams can grow much more complex entities in four months than they can build. The same benefits can be realized on large projects as on my small ones. (A stub-growing sketch follows this list.)
  46. The differences are not minor—it is rather like Salieri and Mozart. Study after study shows that the very best designers produce structures that are faster, smaller, simpler, cleaner, and produced with less effort. The differences between the great and the average approach an order of magnitude.
  47. My first proposal is that each software organization must determine and proclaim that great designers are as important to its success as great managers are, and that they can be expected to be similarly nurtured and rewarded. Not only salary, but the perquisites of recognition—office size, furnishings, personal technical equipment, travel funds, staff support—must be fully equivalent. How to grow great designers? Space does not permit a lengthy discussion, but some steps are obvious:
    1. Systematically identify top designers as early as possible. The best are often not the most experienced.
    2. Assign a career mentor to be responsible for the development of the prospect, and keep a careful career file.
    3. Devise and maintain a career development plan for each prospect, including carefully selected apprenticeships with top designers, episodes of advanced formal education, and short courses, all interspersed with solo design and technical leadership assignments.
    4. Provide opportunities for growing designers to interact with and stimulate each other.
  48. Turski and I both insist that pipe-dreaming inhibits forward progress and wastes effort.
  49. Capers Jones, writing first in a series of memoranda and later in a book, offers a penetrating insight, which has been stated by several of my correspondents. “NSB,” like most writings at the time, was focused on productivity, the software output per unit of input. Jones says, “No. Focus on quality, and productivity will follow.” He argues that costly and late projects invest most of the extra work and time in finding and repairing errors in specification, in design, in implementation. He offers data that show a strong correlation between lack of systematic quality controls and schedule disasters. I believe it.
  50. Representation is the essence of programming.
  51. Fixing a defect has a substantial (20 to 50 percent) chance of introducing another.
  52. Vyssotsky: “I have found it handy to carry both ‘scheduled’ (boss’s dates) and ‘estimated’ (lowest-level manager’s dates) dates in the milestone report. The project manager has to keep his fingers off the estimated dates.”
  53. The subsystem boundaries must be at those places where interfaces between the subsystems are minimal and easiest to define rigorously.
  54. Featuritis. The besetting temptation for the architect of a general purpose tool such as a spreadsheet or a word processor is to overload the product with features of marginal utility, at the expense of performance and even of ease of use. The appeal of proposed features is evident at the outset; the performance penalty is evident only as system testing proceeds. The loss of ease of use sneaks up insidiously, as features are added in little increments, and the manuals wax fatter and fatter.
  55. If one believes, as I have argued at many places in this book, that creativity comes from individuals and not from structures or processes, then a central question facing the software manager is how to design structure and process so as to enhance, rather than inhibit, creativity and initiative. Fortunately, this problem is not peculiar to software organizations, and great thinkers have worked on it. E. F. Schumacher, in his classic, Small is Beautiful: Economics as if People Mattered, proposes a theory of organizing enterprises to maximize the creativity and joy of the workers. For his first principle he chooses the “Principle of Subsidiary Function” from the Encyclical Quadragesimo Anno of Pope Pius XI: It is an injustice and at the same time a grave evil and disturbance of right order to assign to a greater and higher association what lesser and subordinate organizations can do. For every social activity ought of its very nature to furnish help to the members of the body social and never destroy and absorb them. . . . Those in command should be sure that the more perfectly a graduated order is preserved among the various associations, in observing the principle of subsidiary function, the stronger will be the social authority and effectiveness and the happier and more prosperous the condition of the State.
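
A minimal toy model (mine, not Brooks’) of the men-and-months point in items 7–9 above: assume a perfectly partitionable pile of work plus a fixed coordination cost for every pair of people, so communication channels grow as n(n−1)/2 while each person’s share of the work shrinks. Every number below is an illustrative assumption.

```python
# Toy model of Brooks' law: partitionable work shrinks with headcount,
# but pairwise communication overhead grows as n * (n - 1) / 2.
# All figures are made up purely for illustration.

def schedule_months(team_size, work_months=60.0, overhead_per_pair=0.3):
    """Calendar months for `work_months` person-months of partitionable work,
    charging `overhead_per_pair` person-months of coordination per pair."""
    pairs = team_size * (team_size - 1) / 2
    total_effort = work_months + overhead_per_pair * pairs
    return total_effort / team_size

for n in (2, 5, 10, 20, 30):
    print(f"{n:2d} people -> {schedule_months(n):5.1f} calendar months")
# Past some team size the added coordination outweighs the added hands and the
# calendar time starts climbing again: men and months are not interchangeable.
```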
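
Brooks’ 1/3–1/6–1/4–1/4 rule of thumb (item 10 above) is easy to apply mechanically. A small sketch, assuming a hypothetical 12-month schedule:

```python
from fractions import Fraction

# Brooks' rule of thumb for splitting a software schedule.
RULE = {
    "planning": Fraction(1, 3),
    "coding": Fraction(1, 6),
    "component test and early system test": Fraction(1, 4),
    "system test, all components in hand": Fraction(1, 4),
}
assert sum(RULE.values()) == 1  # the fractions cover the whole schedule

def split_schedule(total_months):
    return {phase: float(frac * total_months) for phase, frac in RULE.items()}

# Hypothetical 12-month project.
for phase, months in split_schedule(12).items():
    print(f"{phase:40s} {months:4.1f} months")
# Only 2 of the 12 months go to coding; half the schedule is testing --
# the half most often squeezed out when estimates slip.
```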
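
On the PERT point (item 39 above), the value is in being forced to enumerate tasks, dependencies, and estimates explicitly. Here is a minimal earliest-finish (critical-path) calculation over a made-up task network; every task name and duration below is an assumption for illustration only.

```python
# Made-up task network: name -> (duration in weeks, prerequisites).
tasks = {
    "spec":        (3, []),
    "design":      (4, ["spec"]),
    "code":        (6, ["design"]),
    "test_plan":   (2, ["spec"]),
    "system_test": (4, ["code", "test_plan"]),
}

finish = {}

def earliest_finish(name):
    """Earliest finish = own duration + latest earliest finish of any prerequisite."""
    if name not in finish:
        duration, deps = tasks[name]
        finish[name] = duration + max((earliest_finish(d) for d in deps), default=0)
    return finish[name]

for t in tasks:
    earliest_finish(t)

last = max(finish, key=finish.get)
print(f"minimum schedule: {finish[last]} weeks, ending with '{last}'")
# Laying out even this tiny network forces the "very specific planning"
# Brooks describes -- and the first chart is always terrible.
```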
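
A tiny sketch of what “grow, don’t build” can look like in practice (item 45 above): the top-level program runs from day one against dummy subprograms that are fleshed out later. The function names are hypothetical, chosen only to show the shape of the technique.

```python
# Top-down growth: a driver that runs on day one, calling stubs.
# Each stub is later fleshed out into real code, or into calls to
# lower-level stubs -- so there is a working system at every stage.

def read_orders(path):
    print(f"[stub] would read orders from {path}")     # replace with real parsing later
    return []

def price_orders(orders):
    print(f"[stub] would price {len(orders)} orders")  # real pricing comes later
    return orders

def print_report(orders):
    print(f"[stub] report for {len(orders)} orders")   # real reporting comes later

def main():
    orders = read_orders("orders.csv")
    print_report(price_orders(orders))

if __name__ == "__main__":
    main()  # runs (and does nothing useful) from the very first day
```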

What I got out of it

  1. The importance of thinking in parallel vs. series; adding margins of safety (things always go wrong, so either bake that into your assumptions or pay for it dearly later); adding developers to a late software project generally makes it even later; using as few minds as possible keeps the system conceptually whole and easy to use (top-down design is one of the most important considerations); the importance of testing and iteration at every step of the process (grow, don’t build, software); and sharp rather than fuzzy milestones

The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner

Summary

  1. “This book is about innovation—about how it happens, why it happens, and who makes it happen. It is likewise about why innovation matters, not just to scientists, engineers, and corporate executives but to all of us. That the story is about Bell Labs, and even more specifically about life at the Labs between the late 1930s and the mid-1970s, isn’t a coincidence.” The people who helped make it happen include Mervin Kelly, Jim Fisk, William Shockley, Claude Shannon, John Pierce, and William Baker.

Key Takeaways

  1. Where is the knowledge we have lost in information? —T. S. Eliot, The Rock
  2. Yet understanding the circumstances that led up to that unusual winter of 1947 at Bell Labs, and what happened there in the years afterward, promises a number of insights into how societies progress. With this in mind, one might think of a host of reasons to look back at these old inventions, these forgotten engineers, these lost worlds.
  3. Edison’s genius lay in making new inventions work, or in making existing inventions work better than anyone had thought possible. But how they worked was to Edison less important.
  4. Contrary to its gentle image of later years, created largely through one of the great public relations machines in corporate history, Ma Bell in its first few decades was close to a public menace—a ruthless, rapacious, grasping “Bell Octopus,” as its enemies would describe it to the press. “The Bell Company has had a monopoly more profitable and more controlling—and more generally hated—than any ever given by any patent,” one phone company lawyer admitted.
  5. AT&T’s savior was Theodore Vail, who became its president in 1907, just a few years after Millikan’s friend Frank Jewett joined the company. In appearance, Vail seemed almost a caricature of a Gilded Age executive: Rotund and jowly, with a white walrus mustache, round spectacles, and a sweep of silver hair, he carried forth a magisterial confidence. But he had in fact begun his career as a lowly telegraph operator. Thoughtfulness was his primary asset; he could see almost any side of an argument. Also, he could both disarm and outfox his detractors. As Vail began overseeing Bell operations, he saw that the costs of competition were making the phone business far less profitable than it had been—so much so, in fact, that Vail issued a frank corporate report in his first year admitting that the company had amassed an “abnormal indebtedness.” If AT&T were to survive, it had to come up with a more effective strategy against its competition while bolstering its public image.
  6. Vail didn’t do any of this out of altruism. He saw that a possible route to monopoly—or at least a near monopoly, which was what AT&T had always been striving for—could be achieved not through a show of muscle but through an acquiescence to political supervision. Yet his primary argument was an idea. He argued that telephone service had become “necessary to existence.” Moreover, he insisted that the public would be best served by a technologically unified and compatible system—and that it made sense for a single company to be in charge of it. Vail understood that government, or at least many politicians, would argue that phone subscribers must have protections against a monopoly; his company’s expenditures, prices, and profits would thus have to be set by federal and local authorities. As a former political official who years before had modernized the U.S. Post Office to great acclaim, Vail was not hostile toward government. Still, he believed that in return for regulation Ma Bell deserved to find the path cleared for reasonable profits and industry dominance. In Vail’s view, another key to AT&T’s revival was defining it as a technological leader with legions of engineers working unceasingly to improve the system.
  7. The Vail strategy, in short, would measure the company’s progress “in decades instead of years.” Vail also saw it as necessary to merge the idea of technological leadership with a broad civic vision. His publicity department had come up with a slogan that was meant to rally its public image, but Vail himself soon adopted it as the company’s core philosophical principle as well. It was simple enough: “One policy, one system, universal service.” That this was a kind of wishful thinking seemed not to matter.
  8. “Of its output,” Arnold would later say of his group, “inventions are a valuable part, but invention is not to be scheduled nor coerced.” The point of this kind of experimentation was to provide a free environment for “the operation of genius.” His point was that genius would undoubtedly improve the company’s operations just as ordinary engineering could. But genius was not predictable. You had to give it room to assert itself.
  9. From the start, Jewett and Arnold seemed to agree that at West Street there could be an indistinctness about goals. Who could know in advance exactly what practical applications Arnold’s men would devise? Moreover, which of these ideas would ultimately move from the research department into the development department and then mass production at Western Electric? At the same time, they were clear about larger goals. The Bell Labs employees would be investigating anything remotely related to human communications, whether it be conducted through wires or radio or recorded sound or visual images.
  10. The industrial lab showed that the group—especially the interdisciplinary group—was better than the lone scientist or small team. Also, the industrial lab was a challenge to the common assumption that its scientists were being paid to look high and low for good ideas. Men like Kelly and Davisson would soon repeat the notion that there were plenty of good ideas out there, almost too many. Mainly, they were looking for good problems.
  11. Quantum mechanics, as it was beginning to be called, was a science of deep surprises, where theory had largely outpaced the proof of experimentation. Some years later the physicist Richard Feynman would elegantly explain that “it was discovered that things on a small scale behave nothing like things on a large scale.” In the quantum world, for instance, you could no longer say that a particle has a certain location or speed. Nor was it possible, Feynman would point out, “to predict exactly what will happen in any circumstance.”
  12. The Great Depression, as it happened, was a boon for scientific knowledge. Bell Labs had been forced to reduce its employees’ hours, but some of the young staffers, now with extra time on their hands, had signed up for academic courses at Columbia University in uptown Manhattan.
  13. “The [Bell] System,” Danielian pointed out, “constitutes the largest aggregation of capital that has ever been controlled by a single private company at any time in the history of business. It is larger than the Pennsylvania Railroad Company and United States Steel Corporation put together. Its gross revenues of more than one billion dollars a year are surpassed by the incomes of few governments of the world. The System comprises over 200 vassal corporations. Through some 140 companies it controls between 80 and 90 percent of local telephone service and 98 percent of the long-distance telephone wires of the United States.”
  14. The 512A was an example of how, if good problems led to good inventions, then good inventions likewise would lead to other related inventions, and that nothing was too small or incidental to be excepted from improvement. Indeed, the system demanded so much improvement, so much in the way of new products, so much insurance of durability, that new methods had to be created to guarantee there was improvement and durability amid all the novelty.
  15. We usually imagine that invention occurs in a flash, with a eureka moment that leads a lone inventor toward a startling epiphany. In truth, large leaps forward in technology rarely have a precise point of origin. At the start, forces that precede an invention merely begin to align, often imperceptibly, as a group of people and ideas converge, until over the course of months or years (or decades) they gain clarity and momentum and the help of additional ideas and actors. Luck seems to matter, and so does timing, for it tends to be the case that the right answers, the right people, the right place—perhaps all three—require a serendipitous encounter with the right problem. And then—sometimes—a leap. Only in retrospect do such leaps look obvious.
  16. There was something in particular about the way he [William Shockley] solved difficult problems, looking them over and coming up with a method—often an irregular method, solving them backward or from the inside out or by finding a trapdoor that was hidden to everyone else—to arrive at an answer in what seemed a few heartbeats.
  17. By intention, everyone would be in one another’s way. Members of the technical staff would often have both laboratories and small offices—but these might be in different corridors, therefore making it necessary to walk between the two, and all but assuring a chance encounter or two with a colleague during the commute. By the same token, the long corridor for the wing that would house many of the physics researchers was intentionally made to be seven hundred feet in length. It was so long that to look down it from one end was to see the other end disappear at a vanishing point. Traveling its length without encountering a number of acquaintances, problems, diversions, and ideas would be almost impossible. Then again, that was the point. Walking down that impossibly long tiled corridor, a scientist on his way to lunch in the Murray Hill cafeteria was like a magnet rolling past iron filings.
  18. Essentially Kelly was creating interdisciplinary groups—combining chemists, physicists, metallurgists, and engineers; combining theoreticians with experimentalists—to work on new electronic technologies.
  19. If the ingredients in the alloy weren’t pure—if they happened to contain minute traces of carbon, oxygen, or nitrogen, for instance—Permendur would be imperfect. “There was a time not so long ago when a thousandth of a percent or a hundredth of a percent of a foreign body in a chemical mixture was looked upon merely as an incidental inclusion which could have no appreciable effect on the characteristics of the substance,” Frank Jewett, the first president of the Labs, explained. “We have learned in recent years that this is an absolutely erroneous idea.”
  20. For Scaff and Theurer—and, in time, the rest of the solid-state team at Bell Labs—one way to think of these effects was that purity in a semiconductor was necessary. But so was a controlled impurity. Indeed, an almost vanishingly small impurity mixed into silicon, having a net effect of perhaps one rogue atom of boron or phosphorus inserted among five or ten million atoms of a pure semiconductor like silicon, was what could determine whether, and how well, the semiconductor could conduct a current. One way to think of it—a term that was sometimes used at the Labs—was as a functional impurity.
  21. The formal purpose of the new solid-state group was not so much to build something as to understand it. Officially, Shockley’s men were after a basic knowledge of their new materials; only in the back of their minds did a few believe they would soon find something useful for the Bell System.
  22. On November 17, Brattain and an electrochemist in the solid-state group, Robert Gibney, explored whether applying an electrolyte—a solution that conducts electricity—in a particular manner would help cut through the surface states barrier. It did. Shockley would later identify this development as a breakthrough and the beginning of what he called “the magic month.” In time, the events of the following weeks would indeed be viewed by some of the men in terms resembling enchantment—the team’s slow, methodical success effecting the appearance of preordained destiny. For men of science, it was an odd conclusion to draw. Yet Walter Brattain would in time admit he had “a mystical feeling” that what he ultimately discovered had been waiting for him.
  23. Any Bell scientist knew about the spooky and coincidental nature of important inventions. The origins of their entire company—Alexander Bell’s race to the patent office to beat Elisha Gray and become the recognized inventor of the telephone—was the textbook case.
  24. If an idea begat a discovery, and if a discovery begat an invention, then an innovation defined the lengthy and wholesale transformation of an idea into a technological product (or process) meant for widespread practical use. Almost by definition, a single person, or even a single group, could not alone create an innovation. The task was too variegated and involved.
  25. “It is the beginning of a new era in telecommunications and no one can have quite the vision to see how big it is,” Mervin Kelly told an audience of telephone company executives in 1951. Speaking of the transistor, he added that “no one can predict the rate of its impact.” Kelly admitted that he wouldn’t see its full effect before he retired from the Labs, but that “in the time I may live, certainly in 20 years,” it would transform the electronics industry and everyday life in a manner much more dramatic than the vacuum tube. The telecommunications systems of the future would be “more like the biological systems of man’s brain and nervous system.” The tiny transistor had reduced dimensions and power consumption “so far that we are going to get into a new economic area, particularly in switching and local transmission, and other places that we can’t even envision now.” It seemed to be some kind of extended human network he had in mind, hazy and fantastical and technologically sophisticated, one where communications whipped about the globe effortlessly and where everyone was potentially in contact with everyone else.
  26. He could remember, too, that as the tubes became increasingly common—in the phone system, radios, televisions, automobiles, and the like—they had come down to price levels that once seemed impossible. He had long understood that innovation was a matter of economic imperatives. As Jack Morton had said, if you hadn’t sold anything you hadn’t innovated, and without an affordable price you could never sell anything. So Kelly looked at the transistor and saw the past, and the past was tubes. He thereby intuited the future.
  27. “A Mathematical Theory of Communication”—“the magna carta of the information age,” as Scientific American later called it—wasn’t about one particular thing, but rather about general rules and unifying ideas. “He was always searching for deep and fundamental relations,” Shannon’s colleague Brock McMillan explains. And here he had found them. One of his paper’s underlying tenets, Shannon would later say, “is that information can be treated very much like a physical quantity, such as mass or energy.”
  28. One shouldn’t necessarily think of information in terms of meaning. Rather, one might think of it in terms of its ability to resolve uncertainty. Information provided a recipient with something that was not previously known, was not predictable, was not redundant. “We take the essence of information as the irreducible, fundamental underlying uncertainty that is removed by its receipt,” a Bell Labs executive named Bob Lucky explained some years later. If you send a message, you are merely choosing from a range of possible messages. The less the recipient knows about what part of the message comes next, the more information you are sending.
  29. (1) All communications could be thought of in terms of information; (2) all information could be measured in bits; (3) all the measurable bits of information could be thought of, and indeed should be thought of, digitally. This could mean dots or dashes, heads or tails, or the on/off pulses that comprised PCM. (A short entropy sketch follows this list.)
  30. His calculations showed that the information content of a message could not exceed the capacity of the channel through which you were sending it. Much in the same way a pipe could only carry so many gallons of water per second and no more, a transmission channel could only carry so many bits of information at a certain rate and no more. Anything beyond that would reduce the quality of your transmission. The upshot was that by measuring the information capacity of your channel and by measuring the information content of your message you could know how fast, and how well, you could send your message. Engineers could now try to align the two—capacity and information content.
  31. Shannon’s paper contained a claim so surprising that it seemed impossible to many at the time, and yet it would soon be proven true. He showed that any digital message could be sent with virtual perfection, even along the noisiest wire, as long as you included error-correcting codes—essentially extra bits of information, formulated as additional 1s and 0s—with the original message. In his earlier paper on cryptography, Shannon had already shown that by reducing redundancy you could compress a message to transmit its content more efficiently. Now he was also demonstrating something like the opposite: that in some situations you could increase the redundancy of a message to transmit it more accurately. (A toy repetition-code sketch follows this list.)
  32. And yet Kelly would say at one point, “With all the needed emphasis on leadership, organization and teamwork, the individual has remained supreme—of paramount importance. It is in the mind of a single person that creative ideas and concepts are born.” There was an essential truth to this, too—John Bardeen suddenly suggesting to the solid-state group that they should consider working on the hard-to-penetrate surface states on semiconductors, for instance. Or Shockley, mad with envy, sitting in his Chicago hotel room and laying the groundwork for the junction transistor. Or Bill Pfann, who took a nap after lunch and awoke, as if from an edifying dream, with a new method for purifying germanium. Of course, these two philosophies—that individuals as well as groups were necessary for innovation—weren’t mutually exclusive. It was the individual from which all ideas originated, and the group (or the multiple groups) to which the ideas, and eventually the innovation responsibilities, were transferred.
  33. He would acknowledge that building devices like chess-playing machines “might seem a ridiculous waste of time and money. But I think the history of science has shown that valuable consequences often proliferate from simple curiosity.” “He never argued his ideas,” Brock McMillan says of Shannon. “If people didn’t believe in them, he ignored those people.”
  34. In truth, the handoff between the three departments at Bell Labs was often (and intentionally) quite casual. Part of what seemed to make the Labs “a living organism,” Kelly explained, were social and professional exchanges that moved back and forth, in all directions, between the pure researchers on one side and the applied engineers on the other. These were formal talks and informal chats, and they were always encouraged, both as a matter of policy and by the inventive design of the Murray Hill building.
  35. Physical proximity, in Kelly’s view, was everything. People had to be near one another. Phone calls alone wouldn’t do. Kelly had even gone so far as to create “branch laboratories” at Western Electric factories so that Bell Labs scientists could get more closely involved in the transition of their work from development to manufacture.
  36. Bell Labs had the advantage of necessity; its new inventions, as one of Kelly’s deputies, Harald Friis, once said, “always originated because of a definite need.”
  37. To innovate, Kelly would agree, an institute of creative technology required the best people, Shockleys and Shannons, for instance—and it needed a lot of them, so many, as the people at the Labs used to say (borrowing a catchphrase from nuclear physics), that departments could have a “critical mass” to foster explosive ideas.
  38. There was no precise explanation as to why this was such an effective goad, but even for researchers in pursuit of pure scientific understanding rather than new things, it was obvious that their work, if successful, would ultimately be used. Working in an environment of applied science, as one Bell Labs researcher noted years later, “doesn’t destroy a kernel of genius—it focuses the mind.”
  39. An instigator is different from a genius, but just as uncommon. An instigator is different, too, from the most skillful manager, someone able to wrest excellence out of people who might otherwise fall short. Somewhere between Shannon (the genius) and Kelly (the manager), Pierce steered a course for himself at Bell Labs as an instigator. “I tried to get other people to do things, I’m lazy,” Pierce once told an interviewer.
  40. Pierce’s real talent, according to Friis and Pierce himself, was in getting people interested in something that hadn’t really occurred to them before.
  41. Pierce had been correct in some respects about the traveling wave tube’s potential. But as he came to understand, inventions don’t necessarily evolve into the innovations one might at first foresee. Humans all suffered from a terrible habit of shoving new ideas into old paradigms. “Everyone faces the future with their eyes firmly on the past,” Pierce said, “and they don’t see what’s going to happen next.”
  42. A terrestrial signal could be directed toward the orbiting satellite in space; the satellite, much like a mirror, could in turn direct the signal to another part of the globe. Pierce didn’t consider himself the inventor of this idea; it was, he would later say, “in the air.”
  43. “Ideas may come to us out of order in point of time,” the first director of the Rockefeller Institute for Medical Research, Simon Flexner, once remarked. “We may discover a detail of the façade before we know too much about the foundation. But in the end all knowledge has its place.”
  44. Why move in this direction? What kind of future did the men envision? One of the more intriguing attributes of the Bell System was that an apparent simplicity—just pick up the phone and dial—hid its increasingly fiendish interior complexity. What also seemed true, and even then looked to be a governing principle of the new information age, was that the more complex the system became in terms of capabilities, speed, and versatility, the simpler and sleeker it appeared. ESS was a case in point.
  45. “I liked Fisk very much. But the combination of Fisk, who didn’t know a lot about what was going on in the bowels of the place, and Julius, who knew everything about what was going on in the bowels of the place, was a good combination.”
  46. Colleagues often stood amazed that Baker could recall by name someone he had met only once, twenty or thirty years before. His mind wasn’t merely photographic, though; it worked in some ways like a switching apparatus: He tied everyone he ever met, and every conversation he ever had, into a complex and interrelated narrative of science and technology and society that he constantly updated, with apparent ease.
  47. To Pollak, this was a demonstration not of Bill Baker’s cruelty but of his acumen—in this case to push his deep belief that science rests on a foundation of inquiry rather than certainty. Also, it revealed how nimble Baker’s mind really was. “A very small number of times in my life I’ve been in the presence of somebody who didn’t necessarily answer the question I asked. They answered the question I should have asked,” Pollak says. “And Bill Baker was one of those people. And there are other people who just build a mystique and give the impression of a mystique around them. And Bill had that, too.”
  48. New titles might not have increased his influence. By the start of the 1960s Baker was engaged in a willfully obscure second career, much like the one Mervin Kelly had formerly conducted, a career that ran not sequentially like some men’s—a stint in government following a stint in business, or vice versa—but simultaneously, so that Baker’s various jobs in Washington and his job at Bell Labs intersected in quiet and complex and multifarious ways. Baker could bring innovations in communications to the government’s attention almost instantly.
  49. “So often,” says Ian Ross, who worked in Jack Morton’s department at Bell Labs doing transistor development in the 1950s, “the original concept of what an innovation will do”—the replacement of the vacuum tube, in this case—“frequently turns out not to be the major impact.” The transistor’s greatest value was not as a replacement for the old but as an exponent for the new—for computers, switches, and a host of novel electronic technologies.
  50. Innovations are to a great extent a response to need.
  51. In the wake of the 1956 agreement, AT&T appeared to be indestructible. It now had the U.S. government’s blessing. It was easily the largest company in the world by assets and by workforce. And its Bell Laboratories, as Fortune magazine had declared, was indisputably “the world’s greatest industrial laboratory.” And yet even in the 1960s and 1970s, as Bill Baker’s former deputy Ian Ross recalls, the “long, long history of worry about losing our monopoly status persisted.” To a certain extent, Bill Baker and Mervin Kelly believed their involvement in government affairs could lessen these worries. In the view of Ross and others, such efforts probably helped delay a variety of antitrust actions. Ross recalls, “Kelly set up Sandia Labs, which was run by AT&T, managed by us, and whenever I asked, ‘Why do we stay with this damn thing, it’s not our line of business,’ the answer was, ‘It helps us if we get into an antitrust suit.’ And Bell Labs did work on military programs. Why? Not really to make money. It was part of being invaluable.”
  52. The fundamental goal in making transistor materials is purity; the fundamental goal in making fiber materials is clarity. Only then can light pass through unimpeded; or as optical engineers say, only then can “losses” of light in the fiber be kept to an acceptable minimum.
  53. Indeed, a marketing study commissioned by AT&T in the fall of 1971 informed its team that “there was no market for mobile phones at any price.” Neither man agreed with that assessment. Though Engel didn’t perceive it at the time, he later came to believe that marketing studies could only tell you something about the demand for products that actually exist. Cellular phones were a product that people had to imagine might exist.
  54. Pierce later remarked that one thing about Kelly impressed him above all else: It had to do with how his former boss would advise members of Bell Labs’ technical staff when they were asked to work on something new. Whether it was a radar technology for the military or solid-state research for the phone company, Kelly did not want to begin a project by focusing on what was known. He would want to begin by focusing on what was not known. As Pierce explained, the approach was both difficult and counterintuitive. It was more common practice, at least in the military, to proceed with what technology would allow and fill in the gaps afterward. Kelly’s tack was akin to saying: Locate the missing puzzle piece first. Then do the puzzle.
  55. Shannon had become wealthy, too, through friends in the technology industry. He owned significant shares in Hewlett-Packard, where his friend Barney Oliver ran the research labs, and was deeply invested in Teledyne, a conglomerate started by another friend, Henry Singleton. Shannon sat on Teledyne’s board of directors.
  56. “Ideas and plans are essential to innovation,” he remarked, “but the time has to be right.”
  57. “It is just plain silly,” he wrote, “to identify the new AT&T Bell Laboratories with the old Bell Telephone Laboratories just because the new Laboratories has inherited buildings, equipment and personnel from the old. The mission was absolutely essential to the research done at the old Laboratories, and that mission is gone and has not been replaced.”
  58. At the time of the breakup, in fact, it was widely assumed in the business press that IBM and AT&T would now struggle for supremacy. What undermined such an assumption was the historical record: Everything Bell Labs had ever made for AT&T had been channeled into a monopoly business. “One immediate problem for which no amount of corporate bulk can compensate is the firm’s lack of marketing expertise,” one journalist, Christopher Byron of Time, noted. It was a wise point. Bell Labs and AT&T had “never really had to sell anything.” And when they had tried—as was the case with the Picturephone—they failed. Government regulation, as AT&T had learned, could be immensely difficult to manage and comply with. But markets, they would soon discover, were simply brutal. AT&T’s leaders, such as CEO Charlie Brown, “had never had the experience or the training to compete,” Irwin Dorros, a former Bell Labs and AT&T executive, points out. “They tried to apply the skills that they grew up with, and it didn’t work.” In later years, the downsizing at Bell Labs, in terms of both purpose and people, would mostly be linked to this inability to compete.
  59. The purpose of innovation is sometimes defined as new technology. But the point of innovation isn’t really technology itself. The point of innovation is what new technology can do. “Better, or cheaper, or both”—Kelly’s rule—is one way to think about this goal.
  60. A large group of physicists, certainly, created a healthy flow of ideas. But Kelly believed the most valuable ideas arose when the large group of physicists bumped against other departments and disciplines, too. “It’s the interaction between fundamental science and applied science, and the interface between many disciplines, that creates new ideas,” explains Herwig Kogelnik, the laser scientist. This may indeed have been Kelly’s greatest insight.
  61. Eugene Kleiner, moreover, a founding partner at the premier venture capital firm Kleiner Perkins, was originally hired by Bill Shockley at his ill-fated semiconductor company. But the Silicon Valley process that Kleiner helped develop was a different innovation model from Bell Labs. It was not a factory of ideas; it was a geography of ideas. It was not one concentrated and powerful machine; it was the meshing of many interlocking small parts grouped physically near enough to one another so as to make an equally powerful machine. The Valley model, in fact, was soon so productive that it became a topic of study for sociologists and business professors. They soon bestowed upon the area the title of an “innovation hub.”
  62. “You may find a lot of controversy over how Bell Labs managed people,” John Mayo, the former Bell Labs president, says. “But keep in mind, I don’t think those managers saw it that way. They saw it as: How do you manage ideas? And that’s very different from managing people. So if you hear something negative about how John Pierce managed people, I’d say, well, that’s not surprising. Pierce wasn’t about managing people. Pierce was about managing ideas. And you cannot manage ideas and manage people the same way. It just doesn’t work. So if somebody tells you Pierce wasn’t a great manager . . . you say, of what?”
  63. Pierce, to put it simply, was asking himself: What about Bell Labs’ formula was timeless? In his 1997 list, he thought it boiled down to four things: A technically competent management all the way to the top. Researchers didn’t have to raise funds. Research on a topic or system could be and was supported for years. Research could be terminated without damning the researcher.
  64. What seems more likely, as the science writer Steven Johnson has noted in a broad study of scientific innovations, is that creative environments that foster a rich exchange of ideas are far more important in eliciting important new insights than are the forces of competition.
  65. To think long-term toward the revolutionary, and to simultaneously think near-term toward manufacturing, comprises the most vital of combinations.
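
A short sketch of what “all information could be measured in bits” (item 29 above) means concretely: Shannon’s entropy, H = −Σ p·log₂ p over the symbol probabilities, gives the average number of bits per symbol needed to encode a source. The sample message below is arbitrary.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message):
    """Shannon entropy H = -sum(p * log2 p) over observed symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

msg = "one policy, one system, universal service"
h = entropy_bits_per_symbol(msg)
print(f"about {h:.2f} bits per symbol, {h * len(msg):.0f} bits for the whole message")
# A perfectly predictable message ("aaaa...") carries 0 bits; the less
# predictable the next symbol, the more uncertainty it resolves.
```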
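
And a crude illustration of the error-correcting-codes claim (item 31 above): a 3× repetition code with majority voting. This is nowhere near Shannon’s capacity-approaching codes; it only shows how spending extra redundant bits buys back accuracy on a noisy channel. The flip probability and message length are arbitrary assumptions.

```python
import random

rng = random.Random(0)  # fixed seed so the demo is reproducible

def encode(bits):
    """Repeat every bit three times (adds redundancy)."""
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_prob=0.1):
    """Flip each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote over each group of three received bits."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

message = [rng.randint(0, 1) for _ in range(10_000)]
raw = noisy_channel(message)
corrected = decode(noisy_channel(encode(message)))

print("uncoded errors: ", sum(a != b for a, b in zip(message, raw)))
print("3x-coded errors:", sum(a != b for a, b in zip(message, corrected)))
# The coded transmission spends three times the bandwidth, but any single
# flip within a triple is voted away; per-bit errors drop from about p
# to roughly 3*p**2 for small p.
```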

What I got out of it

  1. It was great to learn how AT&T’s dominance let it structure the organization to take advantage of the talent at Bell Labs. The ingredients that stand out: building or inventing things that ultimately have to go to market, a diverse group of people who interact often, and Pierce’s timeless formula: “A technically competent management all the way to the top. Researchers didn’t have to raise funds. Research on a topic or system could be and was supported for years. Research could be terminated without damning the researcher.”

Loonshots: How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries by Safi Bahcall

Summary

  1. “I’ve always appreciated authors who explain their points simply, right up front. So here’s the argument in brief: The most important breakthroughs come from loonshots, widely dismissed ideas whose champions are often written off as crazy. Large groups of people are needed to translate those breakthroughs into technologies that win wars, products that save lives, or strategies that change industries. Applying the science of phase transitions to the behavior of teams, companies, or any group with a mission provides practical rules for nurturing loonshots faster and better.”

Key Takeaways

  1. The Bush-Vail Rules: Many of the lessons in this book are adapted from how Vannevar Bush at the Office of Scientific Research and Development (OSRD) and Theodore Vail at AT&T handled and fostered loonshots
    1. Separate the phases
      1. Separate your artists and soldiers
        1. Create separate groups for inventors and operators: those who may invent the next transistor vs. those who answer the phone; those who design radically new weapons vs. those who assemble planes. You can’t ask the same group to do both, just like you can’t ask water to be liquid and solid at the same time
      2. Tailor the tools to the phase
        1. Wide management spans, loose controls, and flexible (creative) metrics work best for loonshot groups. Narrow management spans, tight controls, and rigid (quantitative) metrics work best for franchise groups
      3. Watch your blind side: nurture both types of loonshots
        1. Make sure your loonshot nursery seeds both types of loonshots, especially the type you are least comfortable with. S-type loonshots are the small changes in strategy no one thinks will amount to much. P-type loonshots are technologies no one thinks will work.
    2. Create dynamic equilibrium
      1. Love your artists and soldiers equally
        1. Artists tend to favor artists; soldiers tend to favor soldiers. Teams and companies need both to survive and thrive. Both need to feel equally valued and appreciated. (Try to avoid calling one side “bozos.”)
      2. Manage the transfer, not the technology: be a gardener, not a Moses
        1. Innovative leaders with some successes tend to appoint themselves loonshot judge and jury (the Moses Trap). Instead, create a natural process for projects to transfer from the loonshot nursery to the field, and for valuable feedback and market intelligence to cycle back from the field to the nursery. Help manage the timing of the transfer: not too early (fragile loonshots will be permanently crushed), not too late (making adjustments will be difficult). Intervene only as needed, with a gentle hand. In other words, be a gardener, not a Moses.
      3. Appoint and train project champions to bridge the divide
        1. Soldiers will resist change and see only the warts on the baby-stage ideas from artists. Artists will expect everyone to appreciate the beautiful baby underneath. They may not have the skills to convince soldiers to experiment and provide the feedback that is crucial for ultimate success. Identify and train bilingual specialists, fluent in both artist-speak and soldier-speak, to bridge the divide
    3. Spread a system mindset
      1. Keep asking why the organization made the choices that it did
        1. Level 0 teams don’t analyze failures. Level 1 teams assess how product features may have failed to meet market needs (outcome mindset). Level 2 teams probe why the organization made the choices that it did (system mindset). They analyze both successes and failures because they recognize that good outcomes don’t always imply good decisions (got lucky), just as bad outcomes don’t always imply bad decisions (played the odds well). In other words, they analyze the quality of decisions, not just the quality of outcomes.
      2. Keep asking how the decision-making process can be improved
        1. Analyzing a product or a market may be technically challenging, but it is a familiar and straightforward exercise. Analyzing why a team arrived at a decision can be both unfamiliar and uncomfortable. It requires self-awareness from team members; the self-confidence to acknowledge mistakes, especially interpersonal ones; and the candor and trust to give and receive delicate feedback. The process is likely to be more efficient, and less painful, when it is mediated by a neutral expert from outside the team.
      3. Identify key influences – people involved, data considered, analyses conducted, how choices were framed, how market or company conditions affected that framing – as well as both financial and nonfinancial incentives for individuals and for the team as a whole. Ask how those influences can be changed to enhance the decision-making process in the future
      4. Identify teams with outcome mindset and help them adopt system mindset
    4. Raise the magic number
      1. Reduce return-on-politics
        1. Make lobbying for compensation and promotion decisions difficult. Find ways to make those decisions less dependent on an employee’s manager and more independently assessed and fairly calibrated across the company.
      2. Use soft equity (nonfinancial rewards)
        1. Identify and apply nonfinancial rewards that make a big difference: for example, peer recognition and intrinsic motivators
      3. Increase project–skill fit (scan for mismatches)
        1. Invest in the people and the processes that will scan for a mismatch between employees’ skills and their assigned projects, and will help managers adjust roles or employees transfer between groups. The goal is to have employees stretched neither too much nor too little by their roles.
      4. Fix the middle (reduce perverse incentives for middle managers)
        1. Identify and fix perverse incentives, the unintended consequences of well-intentioned rewards. Pay special attention to the dangerous middle-manager levels, the weakest point in the battle between loonshots and politics. Shift away from incentives that encourage battles for promotion and toward incentives centered on outcomes. Celebrate results, not rank.
      5. Bring a gun to a knife fight (engage a chief incentives officer)
        1. Competitors in the battle for talent and loonshots may be using outmoded incentive systems. Bring in a specialist in the subtleties of the art – a chief incentives officer.
      6. Fine-tune the spans (wide for loonshots groups; narrow for franchise groups)
        1. Widen management spans in loonshot groups (but not in franchise groups) to encourage looser controls, more experiments, and peer-to-peer problem solving
    5. For anyone championing a loonshot, anywhere:
      1. Mind the False Fail
        1. Is a negative outcome due to a flaw in the idea or in the test? What would you have to believe for it to be a flaw in the test? How might you evaluate that hypothesis?
      2. Listen to the Suck with Curiosity (LSC)
        1. When you have poured your soul into a project, you will be tempted to argue with critics and dismiss whoever challenges you. You will improve your odds of success by setting aside those urges and investigating, with genuine curiosity, the underlying reasons why an investor declines, a partner walks, or a customer chooses a competitor. It’s hard to hear that no one likes your baby. It’s even harder to keep asking why.
      3. Apply system rather than outcome mindset
        1. Everyone will make wrong turns in navigating the long, dark tunnel through which every loonshot travels. You will gain much more (and feel much better) by trying to understand the process by which you arrived at those decisions. How did you prepare? What influenced you? How might you improve your decision-making process?
      4. Keep your eyes on SRT: spirit, relationships, time
        1. When championing a loonshot, it’s easy to lose sight of what’s important – of why you are doing what you are doing. A little obsession can be good. Too much can backfire. What’s helped me, on occasion, to pull back from the edge – to create a more sustainable and productive level of obsession – is stepping back to think on SRT.

What I got out of it

  1. A beautiful and powerful framework for how to foster and handle loonshots. Important for any size company or venture

The Dream Machine: JCR Licklider and the Revolution that Made Computing Personal by Mitchell Waldrop

Summary

  1. Licklider was far ahead of his generation in seeing the potential for computers – for making them humane and individual, democratizing access to information, and creating a symbiosis between man and machine. It was his work at the Pentagon, along with that of many other visionaries, that made this possible – that allowed the standalone computer with a mouse and a graphical user interface to come into existence. His desire to understand how the brain worked as a system fueled his curiosity. Lick went on to form the ARPA Information Processing Techniques Office in 1962 and started the research funding for interactive computing and pervasive worldwide networks that has resulted in most of the technology we use today, and that also fueled the next generations of computing researchers – many of whom became the founders and mainstays of Xerox PARC. When computers were a short step removed from mechanical data processors, Lick’s treatises on human/computer symbiosis shifted our understanding of what computers were and could be.

Key Takeaways

  1. Lick’s goal was to forge ahead with human/computer symbiosis and create an interconnected, self-perpetuating system around a single computer network – an electronic medium to connect everyone, the ARPAnet. Today it is known as the internet and everything we now associate with it
  2. JCR Licklider may be one of the most intuitive geniuses of all time. He simply saw in his head how information flowed, and how people, things, and ideas are interconnected
  3. Lick, while humble and nice, hated sloppy work, glib answers, and never took anything for granted. He was mischievous and a little anarchical. He was never satisfied with the ordinary and always pushed the limits. His grounding in psychology was essential for his later work with computers as he always tried to design the computer and how it functioned to best meet the needs of the humans operating it. Lick approached every problem as a systems problem rather than a detailed or individual problem
  4. The first high-profile project he worked on was related to acoustics for the war, and his boss had a simple mantra: hire the best people, buy them the best machines money can buy, inspire them to no end, and work them 14 hours a day. With this formula they achieved nearly everything they set out to do
  5. Norbert Wiener was a prodigious character at MIT. He was a genius in multiple ways, especially in mathematics, where he was able to use his intuition and form physical models of a problem in his head rather than merely manipulating symbols on the page. He had the hologram in his head
  6. Alan Turing didn’t like looking at what others had accomplished before him; he preferred to figure things out for himself. He wasted a lot of time reinventing the wheel, but he came to understand things deeply.
  7. John von Neumann’s stored-program concept created software and changed computing, opening up the potential that we associate with computers today
  8. Claude Shannon thought of information through a five-part framework: source, transmitter, communication medium, receiver, destination. This simple framework helped him think through the purpose of information without getting bogged down in details. Information ought to measure how much you learn from a given message; if you already knew everything in a message, its information content would be zero. As it relates to computers, however, information is separated from meaning. Shannon also proved that it is possible to get a message through with perfect fidelity no matter how much static or distortion there is or how faint the signal – the transmission may eventually get slow and the codes long, but it is possible to overcome noise. This is the fundamental theorem of information theory. Shannon didn’t like how information and meaning could be too easily confused, so he asked von Neumann for a new name, and von Neumann came up with one immediately: entropy. Information is entropy – it has the same form as the physicists’ formula for entropy, a mathematical quantity related to the flow of heat (a short formula sketch follows this list). Information is everywhere and in everything; it is as old as time and ties together the mind-body problem, computation, communication, and more
  9. Lick was interested in every domain and was always pulling in new ideas from different fields. He loved novel ideas and would always push himself and others to think about things differently in order to gain new or deeper insights. While Lick had high expectations for his team, he was extremely devoted, and his team knew it – he had built a tribe more than a research group. Because Lick optimized for creativity and productivity, he cared very little for credit. He would give his ideas and insights away for others to work on and publish so that he could get more done
  10. Understanding how our brain works brought together information theory, logic, communication, cognitive science, behavioral psychology, and much more. Two key breakthroughs were understanding chunking and that it matters tremendously how our neurons fire and are organized – not just the raw number of neurons we have
  11. When Lick was brought on to head up the new ARPA project there was no budget, no mandate, no charter. This was perfect as they could simply talk about and work on the most important questions and topics as they came up, not being pigeonholed or sucked into a specific purpose but able to adjust and adapt to everything new that was happening
  12. A key realization for Lick was that if all his visions were to come true, he had to create a self-reinforcing and self-sustaining community among all the different groups contributing to the project. Without this focus and insight, many of these dreams might have been lost, forgotten, or otherwise never achieved
  13. Corbató created the first open-source system, which led to the software boom and the PC. Controversial at the time, he followed the dictum that if you create something useful, people will use it. This was significantly different from other utilities of the past because rather than value flowing just one way (like electricity to users), value now flows two ways: from software to user and from user back to software. This had tremendous implications
  14. Lick gave people plenty of space as long as they were doing something interesting and living up to his high standards. However, if they weren’t, he could be ruthless and shut down programs that weren’t performing
  15. For all of Lick’s strengths, he was terrible administratively, frustrating his colleagues and friends, who had to badger him for weeks or months to get anything done. And when everything was funded by ARPA, this was a huge deal
  16. Lick at ARPA and Bob Taylor at Xerox PARC had to learn how to get their groups to move together – to give them a sense of cohesion and purpose without crushing their spontaneity and creativity. They had to set things up and create an environment where people would follow their own instincts and self-organize. This is the fundamental dilemma of management. Bob Taylor spent years traveling and getting to know the cultures of different high-performing groups, and he took the time to speak to the youngest people there – not only to pick up their ideas but to understand what their values were and how he could cater to them. Taylor’s style of research can be summed up as: don’t just invent the future, go live in it. Don’t worry about the cost for now, but whatever you invent, make sure to use it, then show others how to use it and why it’s helpful. The only mandatory program was a once-weekly discussion in which a program leader presented what they were doing and, for an hour, the other people would have at him. This created a sense of cohesion and purpose and also flushed out ideas before they went too far along the wrong path. These meetings often got heated, and Taylor would help turn them from “class 1” to “class 2” disagreements, meaning they would go from yelling at each other to having to explain the other side’s position to that side’s satisfaction. This worked amazingly well to flush out ideas and improve communication.
  17. Xerox PARC’s main vision was to create the digital office, an integrated symbiosis between working man and machine. Broadly, it was split into two groups – one focused on hardware and the other on applications. Low cost, high performance, and high-quality graphics were a thread that ran through everything they were trying to do. Moore’s Law was just beginning to take hold, and those who had been sold on time-sharing began to see the possibility of an individual, high-powered machine for everybody
    1. There was this thread that ran through Vannevar Bush, Licklider, Doug Engelbart, Alan Kay, and others. It was the ascent of man, it was like the Holy Grail. PARC would rationalize it according to what Xerox needed but whenever they could phrase an idea to align with this path everybody’s eyes would light up, hitting a sort of resonance frequency. 
      1. Engelbart’s “Mother of All Demos” – showing off technology which set fire to the vision of the future and what could be
  18. Alan Kay was one of the key members of PARC’s team and was a prodigy from a young age. He learned to read by the age of three and read hundreds of books before going to school. By that young age he knew that a lot of what the teachers were telling him was wrong or at least that there were multiple points of view. The teachers did not like this. He never distinguished art from science and was one of the key pioneers in this field. 
  19. Good names are incredibly important for prototypes – they have to be familiar, easy to spell, easy to use, easy to understand, have a broad theme, and conjure up pleasant feelings. 
  20. Alan Kay mentions that in the history of art, it is not the adults who actually invent the new medium who do amazing things, but the first generation of kids to grow up with it who do
  21. Xerox was growing so quickly in the late 1960s and 1970s that they almost choked on their own growth. In order to survive, they had to bring in management, marketing, and finance types – mostly from IBM and Ford. While this helped them survive their amazing growth, it also reinforced some bad lessons – that nothing exists or is useful unless it can be shown and captured on a spreadsheet – and eventually this led to the demise of Xerox PARC and that era of research and innovation. Jim O’Neil became the numbers guy and shut down much of the spontaneous generation and innovation because if something didn’t meet his numbers he couldn’t “see it” and wouldn’t buy into it. When sales and finance call all the shots, the company is on a downward spiral, as it is not able to innovate or think long term
  22. Xerox PARC was an Eden in many ways, but what allowed them to flourish was the vision, the people, and an abundance mentality – the fact that they had money to spend and didn’t have to jump through hoops to get it. When there is scarcity you don’t have a community, you just have a bunch of people trying to survive. In 1975 Xerox’s printer and copier business – its cash cow – was being threatened. The instinct is to keep pouring money into such a business in order to save it, but sometimes that isn’t appropriate. You must know when to cannibalize or disrupt yourself
  23. You always got the sense that Lick was playing. He was like a kid in a candy store. His exploratory and curious child-like mind never went away. He was not suited to be an administrator or manager but was a visionary and community builder. He encouraged people and showed them what was possible, what they were really working towards 
  24. DEC took advantage of open architecture and was able to foster creativity and uses for their machines that they never would’ve been able to come up with on their own. Many people loved the ability to tinker, upgrade, or personalize what they bought rather than buying a finished package from, say, IBM. Roberts and his Altair machine would follow DEC’s lead with an open architecture, which unleashed a wave of entrepreneurialism and garage startups by the hundreds – filling all sorts of niches and launching some of the world’s biggest and most successful companies (such as Microsoft)
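A quick aside on the Shannon takeaway (#8 above): the summary says information has “the same formula” as physical entropy but never writes it down. Below is a minimal LaTeX sketch of the two standard results the passage gestures at – discrete entropy and noisy-channel capacity. The notation (p_i, B, S/N) is my own shorthand, not quoted from the book.

```latex
% Shannon entropy: the average information (in bits) per message from a
% source whose symbols occur with probabilities p_1, ..., p_n.
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i
% If one outcome is certain (some p_i = 1), then H(X) = 0: a fully
% predictable message carries no information. Up to a constant, this is the
% same expression as statistical-mechanics entropy, hence von Neumann's
% suggested name.

% Shannon-Hartley channel capacity: the maximum rate C (bits per second)
% at which messages can be sent with arbitrarily low error over a channel
% of bandwidth B and signal-to-noise ratio S/N. Below C, noise can always
% be overcome, though the codes get longer and the effective rate slower.
C = B \log_2\!\left(1 + \frac{S}{N}\right)
```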

What I got out of it

  1. An incredibly fun read – detailing not only the people and the history behind the computer revolution, but the atmosphere, thinking, and optimism which fueled it