MESSAGE FROM THE EDITOR
Crowdsourcing Scientific Innovation
Scientific innovation has tended to be a very private endeavor, with principal investigators working in relative isolation, delving deeply into specific research areas. This has
limited available services and infrastructure, as we’ve discussed in these pages previously.1 More important, though,
is the potential impact of investigator isolation on the types
of research questions we address and on multidisciplinary innovation. For those working in information technology, the
Internet has become a powerful tool for increasing the pool
of ideas and solutions, and for accelerating knowledge development. As we recognize the need for greater efficiency and
multidisciplinary collaboration in biomedical research, can
we learn from examples of revolutionary Web-based approaches?
The Internet has spawned a revolution in collaboration.
With more people contributing, productivity, efficiency, and
creativity may all increase. Perhaps the most obvious example of this is Wikipedia. Since its launch in 2001, it has become the most comprehensive encyclopedia available, with
more than 9 million articles in over 200 languages. It was initially launched as an adjunct to a professionally edited online encyclopedia, mainly to get ideas for new entries and to
flesh out references, but it became obvious that the publicly
edited Wikipedia was a better means of creating the content
outright and the professionally edited version was abandoned. Wikipedia is now orders of magnitude larger than
the Encyclopedia Britannica, and its overall accuracy is comparable, at least in noncontroversial areas.2 The users of
Wikipedia edit its content and also identify areas for new entries. In the biomedical sciences, similar wiki resources have
sprung up for RNA libraries, radiology, microarray results,
and many other areas. An army of volunteers has created unrivaled resources, which are tremendously valuable. Of
course, the value is not realized by the creators of Wikipedia
or these other wikis, who still rely on donations, but the costs
of development are also tiny compared to professional models.
Open-source software is another major collaborative endeavor driven by the Internet. Linux, the powerful computer
operating system, may be the most widely known open-source software, but there are now many examples in various
areas. In the sciences, many programs to analyze imaging,
genomics, or other data are open source. Like Wikipedia,
open-source software relies on volunteers from the community of users to identify and fix bugs, and to produce new
tools and other functionality. Similarly, the developers of
open-source software make little or no money, although some
do profit by assisting with installations or providing user support.
A related trend in business extends the crowdsourcing notion to solving specific problems usually handled
by one’s own personnel. Innocentive (www.innocentive.com)
is a company that allows an organization to post a problem
that it wants solved. For example, Prize4Life, a nonprofit organization devoted to finding a treatment for ALS, has posted
a challenge on Innocentive: it will pay up to $1 million to
any group that identifies a reliable biomarker for ALS disease
progression that can be used in phase II clinical trials. Some
challenges are much smaller, such as a request for a new
method of selecting drug dose ranges for clinical trials
($25,000), or of analyzing survival data in oncology trials
($10,000). Innocentive acts as a broker, advertising the challenge, collecting responses, and even assisting with contracting and intellectual property transfers or licensing. The
process is something like crowdsourcing, with a much larger
array of potential innovators trying to solve a problem, but it
is competitive rather than collaborative, with a single team
obtaining the award or contract. Furthermore, the intellectual property exchange is controlled. Finally, whereas the
community users are defining the needs and filling them in
both Wikipedia (for new entries) and in open-source software (for new tools), here an organization is requesting a solution from others. Innocentive is really more similar to a
system for handling numerous, specific RFPs, almost like a
“Help Wanted” board for projects rather than people.
In an even more original model, engineers at the University of Leipzig
started a website called Cofundos where a group of people
come together to define a need or a set of related needs.3 The
site is limited to software tools right now but the concept
could be extended to other areas. The group defining the
need then commits a certain amount (usually less than $100
each, pooled to around $1,000 collectively) for an answer to
the need. A contractor who is willing to deliver at that price
is then selected by those putting up the money. Sometimes
posted comments will reveal that a solution is already available or could be adapted inexpensively. One could imagine
that the Prize4Life goal could have been posted here, allowing other foundations and even individual donors to add to
the bounty for a biomarker for ALS. Again, this is another
example of crowdsourcing but here the crowd is defining and
funding the research goal and an individual or single group
is coming forward to address it.
In essence, the NIH is managing its own crowdsourcing.
In its role as the public’s representative to fund biomedical
research, it puts out broad calls for proposals, open to a large
universe of potential investigators. The investigators then individually define the needs, describe a solution and, if
funded, execute that plan. Although the initial steps of applying and review are handled confidentially, the end result is in the public domain, making the model a little simpler than ones requiring tracking of intellectual property transfers.
This NIH model clearly works well for focused research
but it impedes open collaboration at several steps. First, although proposals may improve during peer review, a truly
collaborative process of building proposals, such as that supported by Cofundos, could more reliably enhance their quality. With a broader group of contributors, project proposals
might capture deeper expertise from a range of individuals
and could better combine input from diverse disciplines.
Second, projects might be executed more efficiently and effectively if a group of people were working together. Such a
collaborative approach could reduce the start-up costs and
inefficiencies inherent in supporting a multifaceted project
at a single site.
One can imagine a new program to support an open call
for research proposals to advance the neurosciences, on which
ideas for projects could be posted freely by funders or investigators and edited by others. With this series of posted ideas,
funders could identify particularly promising collaborative
proposals and fund together the best ones in their area. For
example, Prize4Life and the ALS society could work together
to seed a proposal idea on ALS biomarkers. A group from
several institutions could work together on the website to respond, editing and building on each other’s suggestions.
Other responses could also be posted and edited collaboratively. Finally, the sponsors and their advisors could select
and fund the most compelling proposal.
Science is of course a competitive endeavor, with high value placed on individual creativity and the provenance of ideas. Ego and the desire for personal success will always drive scientific discovery, but at least in theory these very human needs could easily be met by open-source models. By encoding the specific contributions of each participant, web-based problem solving provides a transparent and permanent record of the life history of the project from conception to completion. Credit can be assigned, and funding decisions based, on this historical record.
Such a mechanism for funding biomedical research may be a pipedream, but it would be imprudent to ignore the rapid advances in collaboration brought by the Internet. Although groupthink can slow innovation, capitalizing broadly on the diverse areas of expertise represented by a wider group of collaborators can clearly accelerate our efforts as knowledge workers. Engaging funders more fully and directly could also promote greater interest and contributions.
S. Claiborne Johnston MD, PhD and Stephen L. Hauser MD,
Editors
References:
1. Johnston SC, Hauser SL. Investigator balkanization. Ann Neurol 2008;64:A11-12.
2. Giles J. Internet encyclopaedias go head to head. Nature
2005;438:900-901.
3. Auer S, Braun-Thurmann H. Towards bottom-up, stakeholder-driven research funding: open science and open peer review. http://www.informatik.uni-leipzig.de/~auer/publication/OpenScience.pdf. Accessed May 21, 2009.