      The wireless network evolved to a point that enabled widespread adoption of cell phones and alphanumeric pagers. The big applications were phone calls and email, then text messaging. The consumer era had begun. The business era had lasted nearly twenty years—from 1975 to 1995—but no business complained when it ended. Technology aimed at consumers was cheaper and somewhat easier to use, exactly what businesses preferred. It also rewarded a dimension that had not mattered to business: style. It took a few years for any vendor to get the formula right.

      The World Wide Web in the mid-nineties was a beautiful thing. Idealism and utopian dreams pervaded the industry. The prevailing view was that the internet and World Wide Web would make the world more democratic, more fair, and more free. One of the web’s best features was an architecture that inherently delivered net neutrality: every site was equal. In that first generation, everything on the web revolved around pages, every one of which had the same privileges and opportunities. Unfortunately, the pioneers of the internet made omissions that would later haunt us all. The one that mattered most was the choice not to require real identity. They never imagined that anonymity would lead to problems as the web grew.

      Time would expose the naïveté of the utopian view of the internet, but at the time, most participants bought into that dream. Journalist Jenna Wortham described it this way: “The web’s earliest architects and pioneers fought for their vision of freedom on the Internet at a time when it was still small forums for conversation and text-based gaming. They thought the web could be adequately governed by its users without their need to empower anyone to police it.” They ignored early signs of trouble, such as toxic interchanges on message boards and in comments sections, which they interpreted as growing pains, because the potential for good appeared to be unlimited. No company had to pay the cost of creating the internet, which in theory enabled anyone to have a website. But most people needed tools for building websites, application servers, and the like. Into the breach stepped the “open source” community, a distributed network of programmers who collaborated on projects that created the infrastructure of the internet. Andreessen came out of that community. Open source had great advantages, most notably that its products delivered excellent functionality, evolved rapidly, and were free. Unfortunately, there was one serious problem with the web and open source products: the tools were not convenient or easy to use. The volunteers of the open source community had one motivation: to build the open web. Their focus was on performance and functionality, not convenience or ease of use. That worked well for the infrastructure at the heart of the internet, but not so much for consumer-facing applications.

      The World Wide Web took off in 1994, driven by the Mosaic/Netscape browser and sites like Amazon, Yahoo, and eBay. Businesses embraced the web, recognizing its potential as a better way to communicate with other businesses and consumers. This change made the World Wide Web geometrically more valuable, just as Metcalfe’s Law predicted. The web dominated culture in the late nineties, enabling a stock market bubble and ensuring near-universal adoption. The dot-com crash that began in early 2000 left deep scars, but the web continued to grow. In this second phase of the web, Google emerged as the most important player, organizing and displaying what appeared to be all the world’s information. Apple broke the code on tech style—its products were a personal statement—and rode the consumer wave to a second life. Products like the iMac and iPod, and later the iPhone and iPad, restored Apple to its former glory and then some. At this writing, Apple is the most valuable company in the world. (Fortunately, Apple is also the industry leader in protecting user privacy, but I will get to that later.)
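      Metcalfe’s Law, for reference, holds that the value of a communications network grows roughly with the square of the number of connected users, because that is how the number of possible connections among them grows:

      \[ V(n) \propto n^2, \qquad \text{since } \binom{n}{2} = \frac{n(n-1)}{2} \approx \frac{n^2}{2} \text{ possible pairwise connections exist among } n \text{ users.} \]

      Every business and consumer that joined the web did not just add one more site; it multiplied the possible connections among all the existing ones.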

      In the early years of the new millennium, a game-changing model challenged the page-centric architecture of the World Wide Web. Called Web 2.0, the new architecture revolved around people. The pioneers of Web 2.0 included people like Mark Pincus, who later founded Zynga; Reid Hoffman, the founder of LinkedIn; and Sean Parker, who had cofounded the music file-sharing company Napster. After Napster, Parker launched a startup called Plaxo, which put address books in the cloud. It grew by spamming every name in every address book to generate new users, an idea that would be copied widely by social media platforms that launched thereafter. In the same period, Google had a brilliant insight: it saw a way to take control of a huge slice of the open internet. No one owned open source tools, so there was no financial incentive to make them attractive for consumers. They were designed by engineers, for engineers, which could be frustrating to non-engineers.

      Google saw an opportunity to exploit the frustration of consumers and some business users. Google made a list of the most important things people did on the web, including searches, browsing, and email. In those days, most users were forced to employ a mix of open source and proprietary tools from a range of vendors. Most of the products did not work together particularly well, creating friction that Google could exploit. Beginning with Gmail in 2004, Google created or acquired compelling products in maps, photos, videos, and productivity applications. Everything was free, so there were no barriers to customer adoption. Everything worked together. Every app gathered data that Google could exploit. Customers loved the Google apps. Collectively, the Google family of apps replaced a huge portion of the open World Wide Web. It was as though Google had unilaterally put a fence around half of a public park and then started commercializing it.

      The steady march of technology in the half century prior to 2000 produced so much value—and so many delightful surprises—that the industry and customers began to take positive outcomes for granted. Technology optimism was not a law of nature, but engineers, entrepreneurs, and investors treated it like one, believing that everything they did made the world a better place. Most participants bought into some form of the internet utopia. What we did not realize at the time was that the limits imposed by scarce processing power, memory, storage, and network bandwidth had acted as a governor, limiting the damage from mistakes to a relatively small number of customers. Because the industry had done so much good in the past, we all believed that everything it would create in the future would also be good. It was not a crazy assumption, but it was a lazy one that would breed hubris.

      When Zuck launched Facebook in early 2004, the tech industry had begun to emerge from the downturn caused by the dot-com meltdown. Web 2.0 was in its early stages, with no clear winners. For Silicon Valley, it was a time of transformation, with major change taking place in four arenas: startups, philosophy, economics, and culture. Collectively, these changes triggered unprecedented growth and wealth creation. Once the gravy train started, no one wanted to get off. When fortunes can be made overnight, few people pause to ask questions or consider side effects.

      The first big Silicon Valley change related to the economics of startups. Hurdles that had long plagued new companies evaporated. Engineers could build world-class products quickly, thanks to the trove of complementary software components, like the Apache server and the Mozilla browser, from the open source community. With open source stacks as a foundation, engineers could focus all their effort on the valuable functionality of their app, rather than building infrastructure from the ground up. This saved time and money. In parallel, a new concept emerged—the cloud—and the industry embraced the notion of centralization of shared resources. The cloud is like Uber for data—customers don’t need to own their own data center or storage if a service provides it seamlessly from the cloud. Today’s leader in cloud services, Amazon Web Services (AWS), leveraged Amazon.com’s retail business to create a massive cloud infrastructure that it offered on a turnkey basis to startups and corporate customers. By enabling companies to outsource their hardware and network infrastructure, paying a monthly fee instead of the purchase price of an entire system, services like AWS lowered the cost of creating new businesses and shortened the time to market. Startups could mix and match free open source applications to create their software infrastructure. Updates were made once, in the cloud, and then downloaded by users, eliminating what had previously been a very costly and time-consuming process of upgrading individual PCs and servers. This freed startups to focus on their real value added, the application that sat on top of the stack. Netflix, Box, Dropbox, Slack, and many other businesses were built on this model.
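      To make the cloud model concrete, here is a minimal sketch, not drawn from any of the companies named above, of how a startup might keep customer data in rented cloud storage instead of buying and running its own hardware. It assumes an AWS account with credentials already configured, the boto3 Python library, and a hypothetical, pre-existing storage bucket named "example-startup-data".

```python
# A rough illustration of the "rent infrastructure from the cloud" model:
# instead of buying servers and disks, a startup stores data in AWS S3 and
# pays a metered fee for what it uses.
#
# Assumptions (not from the book): AWS credentials are already configured,
# boto3 is installed (pip install boto3), and the bucket
# "example-startup-data" already exists. All names here are hypothetical.
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "example-startup-data"


def save_profile(user_id: str, profile: dict) -> None:
    """Write a user profile to rented cloud storage rather than a local disk."""
    s3.put_object(
        Bucket=BUCKET,
        Key=f"profiles/{user_id}.json",
        Body=json.dumps(profile).encode("utf-8"),
    )


def load_profile(user_id: str) -> dict:
    """Read the profile back; no physical hardware is owned or maintained."""
    response = s3.get_object(Bucket=BUCKET, Key=f"profiles/{user_id}.json")
    return json.loads(response["Body"].read())


if __name__ == "__main__":
    save_profile("alice", {"name": "Alice", "plan": "free"})
    print(load_profile("alice"))
```

      The point is the economics: the startup pays only for the storage and requests it actually uses, and leaves the hardware, upgrades, and data-center operations to the provider.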

      Thus began the “lean startup” model. Without the huge expense and operational burden of creating a full tech infrastructure, new companies did not have to aim for perfection when they launched a new product, which had been Silicon Valley’s primary model to that point. For a fraction of the cost, they could create a minimum viable product (MVP), launch it, and see what happened.
