July 18, 2013

What I learned about UI in Brighton

After 2½ days at #oui2013 at the University of Brighton, user innovation (and a few open innovation) researchers have scattered to the wind. It’s hard to summarize 52 hours in one post, but I’ll give it a try.

The location was great, host Steve Flowers was very gracious, and it was quite handy to be across the motorway from the SPRU thought leaders.

What did I learn? I learned quite a bit about being at the beach on a summer weekend near London, and also how expensive it is to host 140 people (or 200 for the US workshops) with all meals for three days. I learned a lot about my own work on firm external collaborations (more in another post).

Here is Flowers’ word cloud from the ~100 paper titles. (To keep things in proportion, the cloud omits “innovation” and the abstract word cloud omits “von Hippel”).

Below are random thoughts about what I learned about UI topics; apologies to any of the presenters if I mangled the story from their talks. I've also excerpted from a few of about 80 photos of the conference, which I have posted for general viewing on Picasa.

Diffusion of User Innovations

One of the big topics at the conference was what happens after users innovate. Diffusion of such innovations was a major focus of the talk by Jeroen de Jong, a self-described “part of Eric’s mercenary army.” He started out with an amusing video by Martin Aslander, who created a wooden iPad stand that was featured in Wired and then saw an explosion of interest on his website.

Based on a large-scale survey in Finland, users disseminate only 19% of their innovations. One reason users might not share their innovations is that they’re too narrow or idiosyncratic to be of value to other users; in the Finnish survey, however, 85% of these innovations were thought (by the inventor) to be of general value. Of that 85%, 22% were diffused: 6% by transfer to commercial firms and 16% via peer-to-peer diffusion.
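A bit of arithmetic reconciles the two headline numbers: 85% judged generally valuable, times a 22% diffusion rate among those, comes to roughly 19% of all user innovations. A quick sketch (using the percentages exactly as reported above; nothing here comes from the underlying dataset):

```python
# Back-of-the-envelope check of the Finnish diffusion figures as reported.
general_value_share = 0.85   # innovations the inventor thinks have general value
diffused_of_those   = 0.22   # share of those that actually diffused
commercial_share    = 0.06   # ...via transfer to commercial firms
peer_share          = 0.16   # ...via peer-to-peer diffusion

# The two channels should add up to the diffusion rate among
# general-value innovations:
assert abs((commercial_share + peer_share) - diffused_of_those) < 1e-9

# Overall share of *all* user innovations that diffuse:
overall_diffusion = general_value_share * diffused_of_those
print(f"{overall_diffusion:.1%}")   # prints 18.7%, i.e. the ~19% headline figure
```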

This is a very useful framework: peer diffusion vs. commercial diffusion. The one thing missing is subdividing commercial diffusion between existing firms and new firms. The latter would include both user entrepreneurship and non-user entrepreneurship. So I propose this typology:
  1. Non-diffusion
  2. Non-commercial (social) diffusion
  3. Commercialization by an existing firm
  4. Commercialization by a new (unrelated) firm
  5. Commercialization by the user entrepreneur
De Jong’s approach is an important step forward in defining the impact of user innovation, and I hope this classification approach will take root.

In a minitalk Wednesday, Christoph Stockstrom of TUHH looked at a particular example of user innovation in services — surgical procedures. As we know from previous work, user innovation by healthcare providers is common, but how does it diffuse? Yes, the attitude of the user innovator has a big impact on diffusion, but what’s also crucial is the involvement of the initial round of early adopters, who (for medical innovations) become trainers who interpret and diffuse the innovation to others.

Rational Non-Diffusion

Ben Martin (R) listens to Manabu Mizuno present his minitalk
On the final day Wednesday, there were a lot of interesting 2-minute minitalks — too many to follow up by attending the full talk. While I decided to focus on people (and work) that I know, several talks seemed worthy of mentioning.

Two of these minitalks were from Japan. The most fun minitalk all day was by Manabu Mizuno of Hannan University. The example he showed was of a “bird deterrent device” (think 21st century scarecrow), based on people hanging used CDs. (From prior bird deterrent efforts, I assume the reflections off the CD scare the birds away).

It was a very simple user invention, but Mizuno asked: how would you commercialize it? First, can a simple hanging CD be turned into a real product, such as a vest of hanging CDs? Second, if it’s such a simple idea — one that can’t be patented and can’t be kept a secret — how can firms appropriate the returns from such an innovation, and if they can’t, how will the innovation be diffused?

Self Organization

Another Japan-oriented talk was by Peter Svensson and colleagues. He talked about how — after Fukushima — citizens and scientists self-organized to improvise low-cost real-time radiation measurement equipment. Instead of waiting for commercial equipment to be developed, the urgency of monitoring radiation levels for public health prompted volunteers to adapt technologies to solve the problem at hand.

To me, the Safecast.jp network provides a fascinating example of a) how an exogenous shock stimulated concerted user innovation and b) how the network self-organized to harness that enthusiasm.

This seems like a wonderful example of a research design that allows us to study the early phase of coordination in user innovation communities.

Rational Irrationality and Randomness

The opening plenary talk by Nik Franke Tuesday asked “Are users really Spocks?”, summarizing his two 2010 JPIM articles, his 2013 JPIM article and his 2013 Organization Science paper. The graphical illustrations (using the Star Trek character) brought several rounds of laughter, but unfortunately from my vantage point I couldn’t get a good picture.

If I understood the argument, it was that there are more reasons why users innovate (and share innovations) than just the personal “functional benefit.” One reason is a sense of accomplishment; this is old hat to open source researchers, and also something (from two 2010 JPIM articles) Nik mentioned in his plenary in Vienna two years ago.

His Org Science paper focuses on the role of affect in the decision to share — something that seems to be assumed away in the stylized stories of altruistic “free revealing” that have been told in the past. I love the title of the latest paper: “Does this sound like a fair deal?”

Franke also had my favorite title of the conference, “Does God play dice?” After running a study with a €50,000 prize to find optimal solutions, Franke and colleagues drew random subsamples of the participants to estimate what the results would have been with smaller communities. It turns out that 22 factors that predict community success explained 11% of the variance, while community size (the number of participants in the draw) explained 60% of the variance.
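For intuition about how a single dominant variable can dwarf a large set of weak predictors, here is a purely synthetic sketch (simulated data with coefficients of my own choosing, not Franke’s data or model) comparing the R² of the two regressions:

```python
# Synthetic illustration: one dominant predictor ("community size") vs.
# many weak "success factor" predictors. All data and coefficients are
# invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 500
size = rng.normal(size=n)                # stand-in for community size
factors = rng.normal(size=(n, 22))       # 22 weak success factors

# Coefficients chosen so size accounts for roughly 60% of the variance,
# the 22 factors together roughly 11%, and noise the rest.
success = (0.77 * size
           + factors @ (0.07 * np.ones(22))
           + rng.normal(scale=0.54, size=n))

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1 - (y - X1 @ beta).var() / y.var()

print(f"22 factors alone: R^2 = {r_squared(factors, success):.2f}")
print(f"size alone:       R^2 = {r_squared(size, success):.2f}")
```

With these arbitrary coefficients, the size-only regression explains several times as much variance as all 22 factors combined, mirroring the 60%-vs.-11% pattern reported in the talk.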

This both confirms what we know about the importance of attracting participation — and also (as Franke noted) is a humbling check on our attempts to ascribe success to causal mechanisms.

July 15, 2013

The economic drag of innovation fraud

Today at #oui2013 at the University of Brighton, the final session featured three journal editors from SPRU on the other side of the A23 motorway: Ben Martin (co-editor of Research Policy), Joe Tidd (managing editor of International Journal of Innovation Management) and Paul Nightingale (co-editor of Industrial and Corporate Change).

The three editors explained their respective journals and topical interests, and gave doctoral students and others unfamiliar with the journal advice on how to avoid a desk reject. (Tip #1: only send papers that fit the journal’s stated scope).

There were no questions during Q&A, so I asked the question that has been on the mind of many experienced innovation scholars: “How is the journal process changing in the light of high profile retractions?” The answer revealed that trying to avoid a repeat of the recent embarrassing and systematic fraud has already created a drag on the innovation publishing system.

Martin (two retractions a year ago) said that academic misconduct had no impact on his workload 6-8 years ago, but now requires one day a week. The journal is trying both to design new processes to prevent fraudulent articles from getting through, and to follow up on the suspicions of authors and editors.

Meanwhile, Nightingale said that academic fraud (presumably the May 2013 retraction) cost him two months to investigate (“I don’t have two months”). Facing the threat of litigation over the retractions, Oxford University Press provided legal defense.

Essay on Research Integrity

During the coffee break, Martin informed me of the recent publication of his editorial on academic integrity (Martin, 2013) — an essay that I read in draft form last year. By my count, the essay lists 4 examples of authors attempting plagiarism and 12 cases of self-plagiarism.

The final paragraph in the latter section provided the most complex example of self-plagiarism in the essay:
More recently, an even more complicated case was brought to the attention of RP Editors by two individuals who independently had been asked to review papers by the same author (a professor at a European university) submitted to two other journals. They discovered that the author concerned had published an astonishing total of over 60 journal articles since 2004. Since this number was too great to handle, the two reviewers concentrated their attention on 15 articles published in leading journals over the period 2007–2010 (including three published in Research Policy), all of which formed part of a single stream of research emerging from a survey of over 100 firms in Europe that the author had conducted. They found that in these papers, similar analyses had been carried out with differing combinations from a large set of variables (sometimes relabelled, to add to the confusion) with no references to indicate the prior papers the author had already produced on the same broad theme. Moreover, in some cases, a given variable was treated as a dependent variable and in others as an independent variable. Perhaps more worryingly, variables that were demonstrated to be significant in some papers were then (presumably deliberately) omitted in the analysis reported in other papers. The author was asked for an explanation. This explanation was deemed unsatisfactory by the RP Editors, with the result that two of the RP papers had to be formally retracted.
Other Thoughts

Many attendees at OUI have followed the 13 OI retractions (and 3 withdrawn papers) of the past year by reading this blog and Retraction Watch. All are awaiting the results of the long-delayed investigations at WHU and Mannheim.

One OI scholar said the policy of many journals — rejecting plagiarized papers — paralleled the lax 1980s policy of the Budapest subway system: if you were caught having failed to pay for a subway ride, the penalty was the price of a subway ride. Without severe penalties, there was no deterrent against misconduct. (In contrast, Martin 2013 mentions examples of attempted plagiarists told by RP not to submit any paper for 1 or 2 years).

Another OI scholar saw the recent raft of retractions as being to innovation studies what American cyclist Lance Armstrong was to professional cycling. The cheating has become such a huge scandal precisely because it featured one of Europe’s most successful young innovation scholars. Using banned substances to gain unfair advantage seems like an apt metaphor for cheating in academic research papers.


Ben R. Martin, “Whither research integrity? Plagiarism, self-plagiarism and coercive citation in an age of research assessment,” Research Policy, Volume 42, Issue 5, June 2013, pp. 1005–1014. DOI: 10.1016/j.respol.2013.03.011

July 14, 2013

52 hours of Open and User Innovation

I’m now in Brighton, the 19th century English seaside resort, awaiting the start of the 11th Open and User Innovation Workshop, which starts Monday morning at 9 a.m. at the University of Brighton. While I’m not excited about jet lag (or the drunken Englishmen who party outside my beach-area hotel), I am excited to be attending what I consider the premier venue for discussing innovation beyond the firm.

I first attended in 2008, so this will be my sixth consecutive OUI (née UOI) conference; nowadays I attend it in preference to the much larger (and less focused) Academy of Management, which I typically attend in alternate years. As in most years (and at most conferences), when attending sessions I’m torn between hearing friends, learning something new, and monitoring related work upstream (which I might cite), downstream (which cites me), and directly competing.

According to my analysis of the program, there are 98 papers in 20 sessions across 2½ days. Twelve of these are in 3 sessions titled “open innovation,” while “firms and users” (from a UI perspective) account for another 16 papers across 3 sessions.

I won’t be presenting in any of those, but instead in one of 3 sessions (15 papers) related to crowdsourcing (not counting an additional 6 papers in a 4th session on crowdfunding). (My paper with Frank Piller bridges user innovation, open innovation and co-creation in developing a model for firm and user collaboration).

As noted, I’d like to monitor topics that I’m actively researching, particularly two papers first presented last year at OUI 2012. One topic is health innovation, which has 7 papers in a session directly competing with the West-Piller paper on Tuesday. The other is community — both a topic from last year and something I’ve researched for almost a decade — which is indirectly represented by 4 papers on open source and 5 papers on community motivation.

Of course, the major topics of user innovation (including lead users and toolkits) are still represented. What’s disappointing is a decline in user entrepreneurship (only 2 papers), which I hope does not represent a broader decline of interest in the topic.

Topic                          # Papers   Day
Crowdfunding                       6      Monday
Crowdsourcing I                    6      Tuesday
Crowdsourcing II                   4      Wednesday
Entrepreneurship                   2      Wednesday
Firms and User Innovation I        6      Monday
Firms and Users II                 5      Tuesday
Firms and Users III                5      Wednesday
Health Innovation                  7      Tuesday
Law and Policy                     4      Wednesday
Lead Users                         6      Wednesday
Motivation (Community)             5      Wednesday
Motivation (Crowdsourcing)         5      Tuesday
Open Innovation I                  5      Monday
Open Innovation II                 4      Tuesday
Open Innovation III                3      Wednesday
Open Source                        4      Monday
Surveys                            4      Monday
Toolkits                           3      Wednesday
User Innovation I                  6      Monday
User Innovation II                 8      Tuesday

Hope to see many of you tomorrow…

July 9, 2013

CFP: OI in Bath

This summer I’m making two trips to England to speak about open innovation. The first is for next week’s Open and User Innovation Workshop being hosted at the University of Brighton.

The second will be for a two-day workshop (Sept 19-20) at the University of Bath, entitled “Strategizing open innovation: foundations for new approaches”. The session is being hosted by Felicia Fai and Anthony Roath of the School of Management, and features four guest speakers:
  • Joel West, Keck Graduate Institute of Applied Life Sciences, Claremont, California, USA
  • Richard Whittington, Saïd Business School, University of Oxford, UK
  • Teppo Felin, Saïd Business School, University of Oxford, UK
  • Letizia Mortara, Institute for Manufacturing, University of Cambridge, UK
Not listed is Bath’s newest faculty member, widely cited OI scholar Ammon Salter, who joins the school this fall.

Although the keynote speakers have been selected, the organizers are waiting for submissions to determine the rest of the program:
We invite submissions that relate to the adoption of open innovation (OI) practices and their implications for strategic development. This may include (non-exclusively or non-exhaustively): adjustments in managerial mindsets, the creation of new resources, the development of new or adjustment of existing capabilities, the creation of new business models, structures and strategies for OI.

To signal your interest, please send a summary of your idea and how it contributes to the workshop theme in no more than 500 words. A maximum of 30 submissions will be selected for discussion at the workshop.
Submissions are due July 28 (US time). See the call for proposals for more details.

July 5, 2013

Managing Open Innovation in Large Firms

Those who follow the rest of the OpenInnovation.net website will have seen last week’s article announcing the results of the 2013 executive survey on open innovation. But for those who didn’t — or who didn’t open the report — I thought I’d summarize a few highlights.

The 40-page report is entitled “Managing Open Innovation in Large Firms.” It was co-authored by Henry Chesbrough (faculty director of the Garwood Center for Corporate Innovation) and Sabine Brunswicker (Head of Open Innovation at the Fraunhofer Institute for Industrial Engineering). It is available for free download at the Berkeley website.

The goal of the study was to understand the nature and degree of firms’ support for open innovation. Who did they study?
To perform the first quantitative study on open innovation in large firms we emailed our survey on open innovation to senior executives at the headquarters of more than 2,840 large and stock market listed firms. Our sample included all large companies in Europe and US, with revenues annually in excess of US$ 250 million and more than 1,000 employees. This sampling frame was drawn to fill a gap as there is no quantitative study on open innovation in very large firms. We received usable survey responses from 125 firms in November and December 2012. We sent the survey to at least one contact person at the company headquarters. Our primary contact was the Chief Executive Officer or the Chief Operations Officer. We also sent our survey to the Chief Technology Officer or a senior executive responsible for strategy or business development (e.g. VP Strategy or Business Development) if contact details were available.
The responses were overweighted toward manufacturing and European companies, but were representative in terms of size and age.

While I recommend the full report to everyone (it’s free!), a few findings caught my eye.

First (p.6) was the degree of OI adoption, which averaged 78% across the entire sample. Within manufacturing, high-tech firms were the highest (90.91%) and low-tech manufacturing the lowest (40%) among all the industry categories, with two other categories (“medium-high-tech” and “medium-low-tech”) in between. Other than low-tech manufacturing, the only below-average industries were financial services (including real estate and insurance) and transportation/utilities.

Next was how they measured OI practices, using Dahlander & Gann (2010)’s 2×2 of inbound vs. outbound and pecuniary vs. non-pecuniary (p.10). Not surprisingly, inbound dominates outbound by better than 4:1. Key OI partners (p.15) include customers, universities, suppliers, indirect customers (e.g. consumers, if the firm lacks a direct consumer relationship) and public research organizations. Unlike the open source firms famously studied by Joachim Henkel, page 17 includes the dog-bites-man conclusion that
Firms are more likely to receive freely revealed information from outside participants than they are to provide it to others. In other words, large firms are net “takers” of such freely revealed information.
Finally, the strategic objectives for using external partners (p.18) emphasized new technologies, partners and opportunities over reducing R&D costs. I have been quite pessimistic about the impact of inbound OI on internal R&D — based on a few anecdotal announcements by US and Japanese companies — but apparently this is not widespread (at least among this sample).

Other aspects of the study examine the internal organization of open innovation within firms (Chapter 5) and how firms measure the impact of open innovation (Chapter 6). Both chapters suggest other opportunities for future research.

July 1, 2013

Newco corralling new IP

UCLA is trying a new approach to licensing its IP, using an external 501(c)(3). The plan is stirring up considerable controversy.

At its May 15 meeting, the UC Regents voted to create an external nonprofit corporation, tentatively called “Newco.” Here are some key excerpts from the board agenda:
The primary mission of Newco and its Board will be to: (i) improve UCLA’s rate of invention disclosures per year; (ii) increase UCLA’s volume of patent applications per year; (iii) increase the overall flow of licensing royalties back to UCLA; and (iv) better position UCLA to win large, multi-year ISR contracts. This improved performance will not only benefit UCLA, but the surrounding community and the public at large.

In comparison to its peers at other universities, UCLA OIP-ISR has historically underperformed in the areas of technology transfer and ISR. UCLA believes this underperformance has not been caused by a lack of productivity among UCLA faculty or a deficiency in UCLA scholarship, but rather due to deficiencies in the current process of managing IP and ISR at UCLA.
Last week, the plan inspired a breathless exposé in a publication I’d never heard of, the East Bay Express. The opening excerpts:
June 26, 2013

Public Research for Private Gain
UC Regents recently approved a new corporate entity that will likely give a group of well-connected businesspeople control over how academic research is used.
By Darwin BondGraham

In a unanimous vote last month, the Regents of the University of California created a corporate entity that, if spread to all UC campuses as some regents envision, promises to further privatize scientific research produced by taxpayer-funded laboratories.
There are so many problems with the article, it’s hard to know where to start. The new plan would change what patents get filed and licensed for the UCLA medical center. It won’t be privatizing the research: the same research will be done, and it would still be licensed to firms.

The UC has been licensing IP to businesses for decades. Patents get licensed because it generates money, because it gets the technology into society’s hands, and because (after Bayh-Dole) it’s the law. Without Herb Boyer of UCSF (and Stan Cohen of Stanford) and their famous patent, there would be no recombinant DNA, no gene splicing, no Genentech.

The new plan for UCLA will decide differently which patents to prosecute (file) and license. An unpaid external board — the volunteer directors of Newco — will decide which ones to pursue and which ones not to. In the whole pipeline of developing and commercializing university research, the plan only changes the final few steps. (As far as I can tell, it doesn’t change the pricing of licenses nor the allocation of royalties to inventors and places within the university).

The exposé suggests that the plan parallels the University of Washington’s “Center for Commercialization,” but I can’t get enough info on either one to tell for sure. Certainly other schools have tried drastic measures in hopes of making their tech transfer more entrepreneurial.

I’m not sure what’s wrong with UCLA. According to the UC tech transfer FY2011-2012 annual report, Berkeley (no surprise) leads the pack, but by various measures UCLA, UCSD and UCSF all seem to be comparable. In some years, UCR does much better than some of the other UC campuses — perhaps owing to its century-long history as an agricultural research station.

However, in looking into the proposal, the proponents point to a March 2011 study “An Ecosystem for Entrepreneurs at UCLA,” written by the oft-cited Bill Ouchi, who in recent years has focused his efforts on changing troubled organizations. Based on benchmarking against 16 peer universities, UCLA was neither the best nor the worst. The study holds up Columbia as a role model, because Columbia generates $154 million in licensing income on research expenditures of $604 million. In terms of license revenue per research dollar, Columbia is 9x as effective as UCLA.
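The arithmetic behind that comparison, using the figures as cited (the UCLA number is only implied by the 9x claim, not reported directly):

```python
# License income per research dollar, per the figures cited above.
columbia_license_income = 154e6   # USD, annual licensing income
columbia_research_spend = 604e6   # USD, annual research expenditures

columbia_ratio = columbia_license_income / columbia_research_spend
print(f"Columbia: {columbia_ratio:.1%} license income per research dollar")

# If Columbia is 9x as effective as UCLA on this measure, UCLA's
# implied ratio is:
ucla_ratio = columbia_ratio / 9
print(f"UCLA (implied): {ucla_ratio:.1%}")
```

Columbia’s ratio works out to about 25.5 cents of license income per research dollar, so the implied UCLA figure is under 3 cents.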

Some of the plan seems unrealistic: it calls for “setting the national standard in technology transfer”: not everyone can be above average, not even at Lake Wobegon High. Private schools are inherently less bureaucratic and (in the best case) more entrepreneurial. Regressing to the mean of UC ratios is probably more realistic than matching Columbia’s results (assuming Columbia has topped the customary pillars of tech licensing — MIT and Stanford).

Still, as an experiment it seems like a worthwhile one. The plan calls for a performance evaluation after two years and every five years thereafter. It would be worth trying this experiment elsewhere (e.g. with Berkeley or UCSD’s engineering school) but not making it UC-wide for all colleges until it has passed at least two outside evaluations.