To mark the passing of Steve Jobs, this seems like a good time to review Apple's contribution over its 35-year history to what we now call open innovation.
Since Wednesday’s news of Jobs’ untimely death, various press reports have pointed to Apple’s success under Jobs in creating or transforming multiple industries. Such coverage often blames Apple’s loss of the market-share war on its proprietary Macintosh platform, but the story of its problems during the 1990s was always more complicated than that.
When it comes to open architecture, Apple’s record is mixed. The company has had its proprietary side — particularly recently with the iPhone, which was originally locked to a single carrier, banned some applications but not others, and was accompanied by misleading claims about its reasons for being closed.
However, Apple under Steve Jobs was also a pioneer in open innovation for technology-based industries. There were three ways that Apple set a new standard for how a firm can profit from external sources of innovation — and also how it can enable third-party innovators to profit as well.
1. Component-Based Business Models
Apple has always shipped its own proprietary software. The two Steves recognized earlier than any other computer maker that the value of a computer comes from the software, not the case or power supply. Apple has always been a system integrator, a crucial approach to open innovation in component-based industries.
Steve Wozniak is recalled as the electronics tinkerer who graduated from the Homebrew Computer Club to a Los Altos garage. However, even marketing guru Steve Jobs was (as he recalled in a 1995 interview) introduced to electronics by building gadgets out of standardized components.
The Apple 1, II, ///, Macintosh and other products were made out of standardized components, including CPUs from MOS Technology and Motorola. With the rise of global supply chains during the 1990s, we now take this for granted, but it was certainly not normal in the 1960s and 1970s for market-leading computer companies like IBM and DEC.
Apple also helped fuel the success of a small software company called Microsoft — which provided the Basic interpreter for the Apple II and its new Word and Excel applications for the Macintosh. And in the 21st century, Apple supported open source by bundling and providing resources for technologies such as CUPS and, particularly, WebKit.
2. Standardized Complements
IBM certainly started the software industry with its crucial 1969 decision to unbundle software from hardware, made in the shadow of an ongoing antitrust investigation. However, Apple played a comparable role in enabling the 1979 release of the first true software “application.”
The problem was that before the Apple II, nearly all software required some technical skill to install and/or configure. No two mainframe configurations were exactly the same, but with only a few thousand units sold of most mainframe models, hand-tailored installation remained manageable. Even the popular CP/M systems had a wide range of configurations that posed a challenge for user-installed software.
The Apple II offered software developers a standard configuration, a large target market, and with it economies of scale. The result was VisiCalc, the first “killer app” — a program that by itself provided a reason to buy a computer.
3. Direct Distribution of Complements
Steve Jobs was gone from Apple from 1985 to 1997, and during those dark ages many of us despaired of the company’s survival. This was also the era when the company had its worst problem with “Not Invented Here.”
When Jobs returned, the most crucial contribution to open innovation of the Jobs II era was the invention of the App Store. Yes, the App Store was important to the success of Apple’s iPhone franchise. And yes, its success came not just from the concept but from executing better than the competition, even as Google took a more open approach with its own app store. Even when competitors had a larger installed base and more applications, Apple’s approach generated more revenue for more developers than any of its rivals.
Apple revolutionized the distribution of third-party software with its App Store. Before it, software developers had to fight for scarce shelf space, whether physical or electronic, to get distribution. (That’s why my company got out of the software products business in 1993.) The iPhone App Store eliminated this barrier, enabling entry by even the smallest software developer, with more than 500,000 applications available to any iPhone owner. (Windows claims millions of applications, but there is no practical way for the average user to identify or acquire the vast majority of these apps.)
Apple also created markets for third-party hardware with its physical retail stores. The various makers of iPod (and later iPhone) doodads — such as cases and external speakers — would not have enjoyed the sales they did if these complementary products were not internationally distributed alongside the core product.
Apple certainly gained success through partly open, high-margin strategies. No one was more intent on value capture than Steve Jobs, and the company did many things to justify its proprietary reputation.
However, the success of Apple also paved the way for the success of many other innovators, setting the pattern for the rest of the IT industry: integrating hardware and software components, creating standardized markets for complements, and then accelerating the distribution of those complements.