Tuesday, November 29, 2011

Allen Institute for Brain Science Open for Science

Great WSJ article about the Allen Institute for Brain Science opening its data to the public free of charge.

Edit: I notice the WSJ article requires a subscription; very 'un-open' of Paul Allen. Regardless, I was able to read it after searching for the article on Google News with the keywords "Open Science Paul Allen."

Saturday, November 26, 2011

"Climategate" and Openness

Here's a piece about "Climategate." While I, like everyone else, am tired of tacking "-gate" onto every scandal or perceived scandal, the actual scandal here traces back to a need for openness. I want to quote a passage from this piece that goes right back to the need for Open Science, albeit with a slightly politicized slant:

"Voters in a democracy do not argue about science. They argue about the authority of scientists. And scientists’ claim to authority comes from the perception that, in fact, they do not let their vanities and rivalries influence their work...scientists pursue only the truth...some prominent members of the climate-change establishment were not operating in a spirit of openness."

Friday, November 25, 2011

Facebook for Scientists and Open Science Study

Hope everyone has enjoyed his or her respective Thanksgiving traditions. Here is an interesting article on the prospects of Facebook for Scientists. Something perhaps more titillating is this project on science as a public enterprise. It has several links discussing open science in detail. (One is a link to The Lancet, which requires free registration to view.)

Tuesday, November 15, 2011

A Map of Science's Future

From the Institute for the Future, a literal map of the future of science. We of the open science persuasion are found in the upper right-hand corner under the heading "Massively Multiplayer Data," and across pretty much the entire right-hand side (the "interstellar clouds of creation").

This is only a speculative exercise, but it's pretty cool that open science is right there with decrypting the human brain, finding extraterrestrial life, nanotechnology, etc.

Check it out:

http://boingboing.net/wp-content/uploads/2011/11/IFTF_SR-1454A_FutureofScience_Mapside_only.jpg

The Toaster Project

Here's a fun piece about an amateur scientist (among other titles), Thomas Thwaites, who made his own toaster from scratch. I saw him on The Colbert Report tonight; you can try to catch a rerun, or you can watch his TED talk here:

http://www.ted.com/talks/thomas_thwaites_how_i_built_a_toaster_from_scratch.html


It's great to see inquisitive, thoughtful minds out there who want to know how things work, even if the end product is a ridiculous fire hazard, as in the case of this toaster. Amateur scientists are an untapped resource full of good ideas, yet they remain unfairly silent because of their unfamiliarity with academia's entrenched publication system. Hopefully, opening up science can provide the platform amateur scientists need to share their data and ideas.

Sunday, November 13, 2011

Open Science and Entrepreneurship

Let me tell you a true story of…luck. Very recently, a group of MBA students at the University of Louisville has been winning entrepreneurial competitions, including the 2011 Venture Labs Investment Competition (formerly, and still colloquially, known as Moot Corp.) and the 2011 Rice University Business Plan Super Bowl. This group, operating under the name TNG Pharmaceuticals, has been the beneficiary of discovering some researchers at Auburn University with a “vaccine” that inactivates the blood-thinning factors in the horn fly’s saliva; the horn fly has a $1 billion impact on the cattle industry. TNG Pharmaceuticals is marketing the vaccine as FlyVax and is poised to make lots of money as long as the USDA gives it the green light.


How this group found these researchers is the more interesting point. The members began a web-based search through the Offices of Technology Transfer (OTTs) of various universities. These OTTs serve several functions. Most important developments in medical science begin in labs, but those developments have limited scope beyond meeting a narrow research goal. Turning these advances into something impactful requires further steps, including more R&D, testing, approval by the appropriate regulatory bodies, manufacturing, and distribution. More or less, OTTs carry out technology transfer mandates by retaining title to inventions and licensing them to private entities to ensure use, commercialization, and public availability.


The word “luck” I used earlier is not my own; it comes from one of the collaborators at TNG Pharmaceuticals. It was luck that the research they found had been sitting on the shelf for more than five years. But that raises the question: what other research is out there sitting on the shelf that is potentially highly lucrative and marketable? I doubt the cure for cancer is waiting, but valuable research can lie dormant for years until others bring it back to life or repurpose it. This is where Open Science can prove valuable. OTTs have a necessary function that Open Science is not intrinsically designed to cover, but Open Science can fill the gaps and offer good, inactive research a chance to be rekindled, reexamined, and repurposed. Who knows, maybe the future online forums constituting the fruit of Open Science labors will hold some entrepreneur’s moneymaker!


Some links for those curious:

http://techcrunch.com/2011/05/09/bplan-winners-mccombs-2011/

http://money.cnn.com/galleries/2011/smallbusiness/1105/gallery.rice_business_plan_winners.fortune/2.html

http://www.ott.nih.gov/about_nih/about.aspx

Thursday, November 10, 2011

The increasing rate of retractions

This article was published back in August of this year. After I read it, I was amazed. Fellow OSK readers, you have got to check it out.

Retractions of scientific studies are surging

Wednesday, November 9, 2011

Publication Bias


I have been ruminating lately on the idea of publication bias and some of its negative consequences.

For those of you who are not familiar with the term, publication bias refers to the fact that studies finding statistically significant results are published more frequently than those finding no significance. A term was coined back in 1979 for this: the “file drawer effect”. It refers to the fact that many studies are conducted but never reported because of non-significant results (1). These studies languish in the proverbial file drawer, never seeing the light of day, unpublishable within the current model.
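To make the file drawer effect concrete, here is a minimal simulation sketch (my own illustration, not part of the original argument, with purely hypothetical numbers): 1,000 two-group studies are run in which the true effect is zero, and only those reaching p < 0.05 get "published."

```python
# Minimal sketch of the file drawer effect (all parameters are illustrative assumptions).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_studies = 1000      # hypothetical number of independent studies
n_per_group = 30      # hypothetical sample size per arm

published = []
for _ in range(n_studies):
    drug_x = rng.normal(0.0, 1.0, n_per_group)  # true effect of drug X: none
    drug_y = rng.normal(0.0, 1.0, n_per_group)  # true effect of drug Y: none
    _, p_value = stats.ttest_ind(drug_x, drug_y)
    if p_value < 0.05:                          # only "significant" results get written up
        published.append(p_value)

print(f"Studies run:       {n_studies}")
print(f"Studies published: {len(published)} (roughly 5% by chance alone)")
# Every published study here reports a "difference" that is not real;
# the other ~95% sit in the proverbial file drawer.
```

In this toy setup the published record consists entirely of false positives, even though not a single real effect exists anywhere in the data.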

If researcher A goes through the effort of conducting a study evaluating the differences between drugs X and Y, and the null hypothesis is upheld (i.e., there is no difference between drug X and drug Y), then why spend all the time and money writing up a manuscript only to have it rejected? In the modern world of academic science, the age-old mantra of “publish or perish” still holds true. For their careers to progress, researchers need to produce a steady supply of manuscripts in a timely manner. In my opinion, it is not that people do not want to publish manuscripts with null results; it is that there is very limited real estate in journals.

Positive results (i.e. significant results) are, flat out, sexy. I mean, what is more appealing, reading that drug A does not cure “incurable disease” or that drug B cures “incurable disease”? Obviously, drug B makes a better story.

If a journal’s only purpose were to keep us abreast of current developments in a given field, then this situation would be fine; that is the function of tabloids. But that is not a journal’s sole purpose. Journals are supposed to act as the repository of emerging scientific knowledge, sexy or dull.

Anyone who has tried to learn anything new knows we often learn the most from our mistakes. There is value not just in reading about proven solutions, but also in being aware of the ideas that were not successful.

If you take this idea out to its logical conclusion, you can see a situation in which there are several hypothetical researchers working on the same problem. Let’s pretend their intention is to find a cure for breast cancer (since there really are many separate scientists working on this right now). The different labs are all working as hard as they can, trying different solutions independently. Yet, many of their failed attempts at cures will not make it into the journals for others to see. This means that others out there are making the same mistakes, completely unaware of the failed attempts of their peers.

This is very inefficient, costly, and dangerous.

A better solution would be an open science approach, in which all attempts at science are published in a way that allows loose collaboration between labs. One lab could learn from the failed experiments of another, expediting the process and introducing a level of efficiency that doesn't currently exist. In this model, there would be no more reinventing of the wheel; reading that somebody in Kansas, Russia, or Nigeria has already attempted your failed idea would allow you to move your limited resources on to the next step.
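To put a rough number on that inefficiency, here is a toy sketch (my own, with made-up parameters): a number of labs each test candidate approaches until one works, either repeating each other's failures in isolation or skipping approaches already reported as failed in an openly shared record.

```python
# Toy sketch of duplicated failed experiments, with and without shared negative results.
# All parameters are hypothetical.
import random

random.seed(0)

N_LABS = 20           # hypothetical labs chasing the same problem
N_APPROACHES = 50     # hypothetical candidate approaches; exactly one works
WORKING = random.randrange(N_APPROACHES)

def experiments_without_sharing():
    """Each lab works in isolation and may repeat others' failures."""
    total = 0
    for _ in range(N_LABS):
        order = random.sample(range(N_APPROACHES), N_APPROACHES)
        for approach in order:
            total += 1
            if approach == WORKING:
                break
    return total

def experiments_with_sharing():
    """Failed approaches are published openly, so no one tries them twice."""
    total = 0
    known_failures = set()
    for _ in range(N_LABS):
        remaining = [a for a in range(N_APPROACHES) if a not in known_failures]
        random.shuffle(remaining)
        for approach in remaining:
            total += 1
            if approach == WORKING:
                return total          # once found, the whole community can stop
            known_failures.add(approach)
    return total

print("Experiments without sharing:", experiments_without_sharing())
print("Experiments with sharing:   ", experiments_with_sharing())
```

Under these assumptions, the isolated labs collectively run hundreds of experiments, most of them retreading failures someone else has already hit, while the sharing community stops after at most a few dozen.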

(1) Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86(3), 638–641. doi:10.1037/0033-2909.86.3.638

Monday, November 7, 2011

Another take on Open Science

I was perusing the internet and came across this article on the openscience.org blog. It contains an entry, written back in 2009, that gives a good definition of what open science is. It is always useful to get other perspectives on this topic.

Another Take on Open Science

Saturday, November 5, 2011

Another example of poor vetting

Why do we keep letting this happen? All it takes is the ability to fool two or three people and up goes your "research". Nowhere in the article is there any mention of the responsibility the journals also share in letting this guy's work get published.

Psychologist admits faking data in dozens of studies

Tuesday, November 1, 2011

All in peer-reviewed journals...

Paying for the back-end; research is only the beginning of the cost

Let’s be realistic: new ideas are too expensive.


The research part is only the tip of the iceberg.


After years of meticulous research, you must publish, and this is enormously expensive. Manuscript publication may include various opaque “first-copy costs” and indexing costs, as well as back-end costs associated with sustaining an editorial staff and infrastructure at many thousands of peer-reviewed journals. These costs arrive either as direct charges for publishing your manuscript or as license fees for viewing other published manuscripts to cite in your paper. You may also have to pay for translation into English, or from English into other languages. You might even purchase a subscription to an individual journal just so you can offer a copy of your published manuscript to your mom and dad!


In addition to publishing your research in a journal, you will also try to present it at a reputable conference, where you’ll be able to share with, and learn from, other experts in your field. In this case, you’ll be paying for airfare, lodging, and transportation. You might even pay to submit your abstract to that conference for consideration! You will certainly pay to print a glossy 35”x55” poster to help your peers visually understand your research, unless you plan to be maligned for a lack of professionalism and credibility. There will be additional registration fees for attending the conference to hear selected research presentations. And you will probably purchase a few drinks of KY bourbon to make these mounting costs easier to rationalize.


Succinctly: if you can’t afford to reference old journal articles, you can’t publish new manuscripts. If you can’t afford to support journal infrastructure, you can’t disseminate new ideas. If you don’t attend conferences, you aren’t part of the discussion. There is no good alternative to this process.


For every chunk of new information you create, you’ll move through this process once. If you’re good, these costs will be paid many, many times throughout your career.


Where does the money go? Who is making sure this money is spent effectively?


How much should it cost to share an important new idea with the world?


It’s easy to blame journals, conferences, airlines, etc. for these costs. They are prohibitively expensive, elitist in ensuring that only researchers with financial backing are taken seriously, and conservative in their support of the status quo over progressive science. Unfortunately, researchers are responsible for sustaining these exorbitant costs, too.


Scientists participate in a system that both requires and rewards dutiful, enduring payment of these costs. They are systemic, rationalized, and accepted, in that they are accounted for in research and development costs. Ultimately, they are anticipated by principal investigators at universities, in industrial settings, and at government laboratories. Thriving within this system is a major piece of achieving recognition as a scientist, securing research grants and a laboratory, and reaching tenure. Spending money to disseminate information in this way is part of being able to continue doing great science.


There is nothing sinister on either side; the system demands payment then rewards participation.


It is a monopoly.


At OSK, we believe all parties involved are doing the best they can to discover great things with as little waste, and in as much haste, as the system allows. However traditional this process may be, there are probably cheaper ways to do it.


We’d like to see more accountability across the board. We believe discussion about the merits of other information systems must emerge, allowing the community to identify the best ways to innovate. For a brief moment in human history, we believe both publishers and researchers should be held up to scrutiny to justify these exorbitant costs and to identify the best way to move forward with science.


The result of reducing/eliminating unnecessary back-end research costs will be fewer barriers to entry into research, less money wasted on infrastructure and bureaucracy, a greater breadth of big ideas, and higher quality science. Cutting these costs will ultimately require a concerted effort by publishers/conference facilitators and researchers.


(The above hyperlinks were the first examples I found illustrating these very typical publishing costs – no harm is meant to individual entities therein)

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 United States License.