Category Archives: Open Access

This is more like it

Copy number by dbking
Copy Number Variation Detection:
[Via Bench Marks]

With the sequencing of the human genome came the startling revelation that the number of copies of a given gene can vary widely between individuals. This Copy Number Variation (CNV) contributes to our species’ genetic diversity, but it has also been linked to genetic diseases. This month’s issue of Cold Spring Harbor Protocols features a new method for detecting copy number variation. Like all of our monthly featured protocols, it’s freely accessible to subscribers and non-subscribers alike.

Copy Number Variation Detection Via High-Density SNP Genotyping
describes the use of PennCNV, a new computational tool for CNV detection in data from genomic arrays. Developed in the laboratory of Maja Bucan at the University of Pennsylvania, the software is freely available for download. Analysis with PennCNV will provide a more comprehensive understanding of genome variation and will aid in studies seeking the causes of genetic diseases. More information on PennCNV can be found in this Genome Research article, PennCNV: An integrated hidden Markov model designed for high-resolution copy number variation detection in whole-genome SNP genotyping data.
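PennCNV’s real hidden Markov model is far richer than anything I could show here (it jointly models Log R Ratio and B Allele Frequency over several states), but the core idea – decoding hidden copy-number states from noisy per-probe array signal – can be sketched with a toy Viterbi decoder. Everything below (the state means, noise level, and signal values) is invented for illustration:

```python
import math

# Toy illustration only. PennCNV's actual HMM has more states and models
# two signal channels; this sketch just decodes a copy-number path from
# invented one-channel intensity values.

STATES = [1, 2, 3]                   # copy number: deletion, normal, duplication
MEANS = {1: -0.5, 2: 0.0, 3: 0.4}    # assumed mean signal for each state
SIGMA = 0.15                         # assumed measurement noise
STAY = 0.95                          # probability of keeping the same state

def log_gauss(x, mu):
    # log of a Gaussian density; the constant term cancels across states
    return -0.5 * ((x - mu) / SIGMA) ** 2 - math.log(SIGMA * math.sqrt(2 * math.pi))

def viterbi(signal):
    """Most likely copy-number state for each probe along the chromosome."""
    switch = (1 - STAY) / (len(STATES) - 1)
    prev = {s: math.log(1 / len(STATES)) + log_gauss(signal[0], MEANS[s])
            for s in STATES}
    back = []
    for x in signal[1:]:
        cur, ptr = {}, {}
        for s in STATES:
            def score(t):
                return prev[t] + math.log(STAY if t == s else switch)
            best = max(prev, key=score)
            cur[s] = score(best) + log_gauss(x, MEANS[s])
            ptr[s] = best
        back.append(ptr)
        prev = cur
    state = max(prev, key=prev.get)   # best final state, then trace back
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]

# a deletion (around -0.5) and a duplication (around +0.4) amid normal probes
signal = [0.02, -0.01, -0.48, -0.55, -0.51, 0.03, -0.02, 0.01,
          0.42, 0.40, 0.39, 0.02, -0.03, 0.00]
print(viterbi(signal))  # [2, 2, 1, 1, 1, 2, 2, 2, 3, 3, 3, 2, 2, 2]
```

The sticky transition probability is what makes this a CNV caller rather than a per-probe threshold: a single noisy probe will not flip the state, but a run of shifted probes will.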

I took the liberty of showing the entire post from David’s blog because, in contrast to my story below, this demonstrates a very good approach for publishing scientific work online.

It highlights a useful new protocol that can be downloaded for free. It also links to a Genome Research article that I can download for free. Nice. I can quickly get up to speed on a novel protocol.

Protocols, particularly new ones, are very useful to have. Making a small number available for free is a nice way to get people to check out the journal. I have it in my newsfeed. I like CSHP and enjoy David’s blog tremendously. Now I just need to find a way to become an adjunct professor at some research organization with an institutional license so I can read all the articles.


Some science journals are messed up

I posted this at my personal blog but thought it might be of interest here, since it demonstrates just how online tools have changed the way scientific research is published, presented and read.

flying snake by Beige Alert
Why snakes don’t have legs:
[Via 2collab public bookmarks]

Tags: Hox gene, Homeobox gene, Limb
Authors: Cunliffe, Vincent
Source: Trends in Genetics; 15, 8, Page 306; 1 August 1999
Sharing: Public

I’m providing a detailed examination of an online journey I took this morning, one that demonstrates how the Internet has altered the landscape for publishing articles in scientific journals. Online access certainly changes how we search for and read articles. It is also changing where we choose to publish.

So I see this interesting title for an article – Why snakes don’t have legs – in my newsfeed. I click on through (why it is on 2collab I do not know) and get this page. Great. ScienceDirect, which usually charges for journal access. But this is an article from 1999. Surely it will be open by now?

Nope. They want $31.50 for a nine-year-old article, with no abstract or any other way to determine whether it is worth the price. $31.50! First, few articles in science that are nine years old are worth $5, much less $31.50. Second, with no abstract, how am I even supposed to figure out whether it is worth the price?

This greatly limits access to the article and encourages readers to get the information by routes other than reading it. Why would a scientist want to publish an article that no one will read? We want as many people as possible to see our wonderful work. This is not like literature or art, where older is better.

Seems to me that this is a losing business model. I can see paying a premium for up-to-date work. I understand someone has to get paid, and I can easily pay a reasonable price. But $31.50?! For an article that is almost a decade old!? That makes no sense in an online world.

Very few articles in biology that are ten years old retain much value. Just a few years ago, I would have been stuck but now I have other tools.

I went to PubMed, the database of journal articles, and did a search for “snakes AND legs”. I got 48 articles. The critical one appears to be by Cohn and Tickle, “Developmental basis of limblessness and axial patterning in snakes”, in Nature from June 1999. Great. I have a subscription to Nature, so this article is available to me, but reading it without a subscription would cost $35! Wow! But at least this one has an abstract.

The evolution of snakes involved major changes in vertebrate body plan organization, but the developmental basis of those changes is unknown. The python axial skeleton consists of hundreds of similar vertebrae, forelimbs are absent and hindlimbs are severely reduced. Combined limb loss and trunk elongation is found in many vertebrate taxa, suggesting that these changes may be linked by a common developmental mechanism. Here we show that Hox gene expression domains are expanded along the body axis in python embryos, and that this can account for both the absence of forelimbs and the expansion of thoracic identity in the axial skeleton. Hindlimb buds are initiated, but apical-ridge and polarizing-region signalling pathways that are normally required for limb development are not activated. Leg bud outgrowth and signalling by Sonic hedgehog in pythons can be rescued by application of fibroblast growth factor or by recombination with chick apical ridge. The failure to activate these signalling pathways during normal python development may also stem from changes in Hox gene expression that occurred early in snake evolution.

Sounds really interesting to me, but I’m still not sure it is worth $35. But right above that link from PubMed is another one – from Current Biology, “How the snake lost its legs”. It is also a ScienceDirect link, but this one is available for free, and it has nice pictures to go with its discussion of the Cohn and Tickle article.

So partial success. Now I have a better idea of the article’s content. All the other links from PubMed dealing with snakes and THEIR legs, as opposed to snakes and the legs they bite, have costs to access, up to $39.

Except for this nifty one from the Journal of Experimental Biology – “Becoming airborne without legs: the kinematics of take-off in a flying snake, Chrysopelea paradisi” (The picture above is of a flying snake.) Open access and more recently published. Not exactly on topic but it comes with movies! These were just not possible to see without online access. And the movies are really cool and help explain what the author of the paper was describing. You can actually see the difference between a J-loop takeoff and other modes. Plus, flying snakes sound like something from a B-movie.

Back to the topic. I went to Google and searched “Cohn Tickle snake”. The top response is from a USA Today article about why snakes do not have legs. In the article there are links to Martin J. Cohn and Cheryll Tickle. Clicking the Cohn link takes me to his page at the University of Florida. Not a lot here but there is a link to his personal site.

Now we get the Cohn lab page. I could just email him and ask for a copy of the paper (a slightly updated approach to the old method of sending reprint requests by snail mail). But there is a link to Publications.

And here we find the PDF of the paper I was looking for. A quick run-through reveals that it is a paper I will find interesting (I love Hox stuff). But I would not have paid over $30 for it.

I certainly believe that downloading a paper from an open archive maintained by its author is an ethical way to obtain it (it is just the online version of the reprint request, remember). So, it took me less than 10 minutes to find a copy of the article online. (And it turns out that if I had looked at my Google results a little more closely, I would have found a direct link to the publications page, saving myself some time.)

I think that, for all but the most highly paid of us, 10 minutes of time is worth less than $10. This seems about right: a paper for $5 I would buy immediately, while at much over $10 I will go searching. I may not succeed, but I can usually find an email link and request a copy from the author.

Online archives maintained by authors are becoming more common and are a basic aspect of many Open Access initiatives. Paying a small premium for access to a current article is reasonable, especially if it is convenient. But any business plan that charges a huge premium for decade-old work needs serious rethinking.

So, for a few minutes of my time I got the article for free and also got to see some nice movies of snakes flying. Not a bad way to travel in an online world.


Web 2.0 and the Enterprise

scuba by jayhem
How 300,000 IBM employees use Bluepedia wiki:
[Via Grow Your Wiki]

IBM gets wikis. In a 300,000+ person enterprise, a wiki enables emergent collaboration and expertise:

BluePedia is an encyclopedia of general knowledge about IBM, co-authored by IBMers for IBMers, which enables the collection of expertise and know-how of more than 300,000 IBMers around the world into a simple, searchable resource that is easily expanded, shared and used. The single, global co-authoring platform enables the development and implementation of a common worldwide vocabulary and easy recognition and identification of subject matter experts.

300,000 is a lot. Not many companies are going to have that many potential contributors for a wiki. But their press release shows there is a lot more IBM is doing with Web 2.0 technologies. I am sure we will hear more in the next year.

I did like this from the release:

Web 2.0 technologies create open, collaborative spaces that eliminate the traditional hurdles created by time and distance that businesses worldwide have traditionally faced. The marriage of videos, blogs, and custom publishing enable working professionals to exchange ideas and perspectives using rich, multi-dimensional platforms that foster a two-way dialogue within an enterprise.

As a result, employees can leverage the technology available at their fingertips, regardless of time and place, to drive innovative ideas throughout their enterprises. By linking with several other development sites, guests experienced how IBM technologies drive efficiency and innovation across the enterprise and tap into high-value skills from the company’s top talent around the world to solve the specific needs of its clients.

Companies whose basic products depend on the continuing creativity and innovation of their employees will see tremendous increases in productivity with these tools. The key is that the tools have to be as flexible and open as possible, allowing new uses to be created by the user, not the vendor.

The world moves too fast to wait for the vendor to provide the latest tools. IBM will fail here if it locks users into something bloated like Lotus. Lotus was useful for certain directed tasks but was unwieldy when required to adapt to changing or novel environments; it required a superior development staff to keep up. Web 2.0 tools will only succeed when the actual development is minimal and users can accomplish what they need themselves.


Publishable science

Open science:
[Via Michael Nielsen]

The invention of the scientific journal in the 17th and 18th centuries helped create an institution that incentivizes scientists to share their knowledge with the entire world. But scientific journals were a child of the paper-and-ink media of their time. Scientific papers represent only a tiny fraction of the useful knowledge that scientists have to share with the world:

Enabled by a new media form, the internet, the last few years have seen a modest expansion in the range of knowledge that can be published and recognized by the scientific community:

The most obvious examples of this expansion are things like video and data.
However, there are many other types of useful knowledge that scientists have, and could potentially share with the world. Examples include questions, ideas, leads, folklore knowledge, notebooks, opinions of other work, workflows, simple explanations of basic concepts, and so on.
Each of these types of knowledge can be the basis for new online tools that further expand the range of what can be published by scientists:

It’s fun to think about what tools would best serve the needs associated with each type of knowledge. This is already starting to happen with tools and ideas like open notebook science, the science exchange, SciRate, and the Open Wetware wiki.

This is a very good point to make. The range of publishable information has increased tremendously. We are no longer limited by what the printing press is capable of displaying, or by the number of pages that can be printed each month.

This opens up the possibility of making available not only the things that went right but also those that went wrong. Preventing others from following a dead end would be useful.

Underlying this apparent problem is an opportunity to develop tools to assist scientists in finding relevant information, and to ensure that what they publish – their questions, ideas, and so on – is seen by those people who will most benefit. Ideally, the result will be not only a great expansion in the range of what is published, but also a great improvement in the quality of information that we encounter.

The reason new tools will be developed is that this approach will allow researchers to attack very complex problems in a much more efficient manner than those limiting themselves to the printing press. Success will breed success.

There are, of course, major cultural barriers to acceptance of these new tools. At present, there are few incentives to make use of new ideas like open notebook science. Why blog your ideas online, when someone else could be working on a paper on the same subject? This isn’t speculation, it’s already happening, and sometimes the blog posts are better – but try telling that to a tenure review committee.

Similar comments were made regarding Open Source: what incentive would there be for creating software for free? It may well be that Open Science is not rewarded in the same fashion as science on paper. I think it is more likely that academia will change to provide proper rewards.

Certainly there are places other than a university to pursue research. In particular, I think there will be even larger growth in non-profit research institutions over the next generation. They do not usually have the same arcane tenure problems universities do, and they often reward people more the way a corporation does than the way academia does: for accomplishing work that meets the institution’s goals rather than for where they published.

The successful institutions will find and use the tools that solve problems. They will also find ways to reward those who successfully use them.

At the moment, many of these institutions are found in biotech and human health but as more money and focus moves towards using innovative tools to promulgate science, there will be ones for every discipline. And, as the brain drain from academia to these institutions increases, universities will either have to adapt or they will wither.

More flexibility. More collaborative environments. Less overhead. I believe that these research foundations will be the leaders in promulgating open science. It is to their advantage to do so.


Credit where credit is due

oil drop by Shereen M
Who needs coauthors?:
[Via Survival Blog for Scientists]

Young people in tenure-track positions feel they have to collect as many authorships as possible. Questions like “Will I be a coauthor?” and demands such as “I have to be a coauthor” are part of daily conversations in science institutes.
But junior scientists are not the only ones eager to boost their CVs with authored papers.
[More]

Biology papers usually have large numbers of authors; it is rare to see a major paper in Nature or Science with only two. Modern papers are often the results of collaborative research between multiple institutions. That makes it easier to get your name on a lot of papers, but it also makes proper assignment of credit difficult.

Credit for papers can be incredibly important and manipulation of the credit is not unheard of. Harvey Fletcher was a graduate student for Robert Millikan around 1910. Fletcher developed and designed the oil-drop experiments that measured the charge on an electron as well as investigations on Brownian motion that led to a better determination of Avogadro’s number.

Now, Fletcher could use a published paper in lieu of his Ph.D. thesis, but only if he was its sole author.

Millikan proposed that Fletcher be the sole author on the Brownian motion work while Millikan would be sole author on the electron charge work, even though Fletcher’s work was critical to both. Millikan knew which one would be the more important paper. As a graduate student, Fletcher really had no choice but to acquiesce to Millikan’s proposal.

Millikan published as sole author of the paper on the charge of the electron; Fletcher wrote on Avogadro’s constant. Millikan won the Nobel Prize in 1923. Although Fletcher became the first physics student to graduate from the University of Chicago summa cum laude, he spent most of the next 38 years outside academia, working at Bell Laboratories.

Although he did not win the Nobel Prize, Fletcher had a tremendous impact on many of the technologies developed in the 20th century. At Bell Labs, he not only became ‘the father of stereophonic sound’ but also directed the labs that developed the transistor.

What this shows is that while a true genius cannot be stopped by who published what, in the scientific world, particularly in academia, the assignment of credit has huge ramifications. Almost anyone who takes physics learns about Millikan and the oil-drop experiment. Who knows about Fletcher?

These days, the person who did the research is often first author, and the person who directed the research or whose lab supported it is last. Everyone else, with smaller contributions, falls in between.

But this can change. With 20 authors, no one ever gets to the last one when the article is referenced; the bibliography will just read ‘Smith, et al.’ So sometimes the director of the lab is placed as first author instead of last, so that everyone sees their name in the references.

So how does proper credit actually get assigned? In large measure, who designed the critical experiment, who simply provided reagents and who had critical intellectual input are all hidden from general view. This permits political pressure, such as what Millikan applied to Fletcher, to determine placement, rather than actual worth.

Huge battles have been waged over where one’s name gets placed in a paper. Since this is what the world will see, it is worth it for many people to spend all their political capital to get a choice placement on a paper. A lot of scientific blood may have been spilt in order to get on a paper published in Nature.

Sometimes those in the know have an idea of proper credit, but tenure committees, grant committees and other vetting bodies can have a difficult time telling just what contribution a scientist made to a paper with 40 authors.

There have been some attempts at better clarifying this, with authors making statements about who did what. Perhaps as we move away from the current model of publishing to one more digital in nature, there will be approaches to simplify this process.

In particular, there will have to be a way to assign credit for things other than just the number of publications. Scoring the impact people had on those publications, what work they actually performed, and where they fit in the process that led to novel scientific discoveries will become more likely if the social media aspects of Science 2.0 come to be appreciated.

Because every one of those aspects can be time-stamped and made accessible using things like wikis and weblogs, in ways that email will never accomplish. Openness and transparency, important aspects of successful Web 2.0 tools, will also make it possible to track the progress of creativity and innovation more accurately. Surely rewards will follow.

Will Science 2.0 make it less likely that political pressure can be used to claim credit that is not deserved? We being human, the pressures may never disappear. But Science 2.0 should make it a little more difficult to claim credit after the fact. Fletcher kept the secret of Millikan’s proposal until after his own death. In those days, it was easier to control the flow of information, to hide political manipulations of the research.

Now, not as much.


How podcasts work

podcast

Video: Podcasting in Plain English | Common Craft – Explanations In Plain English:
[Via Common Craft]

These videos are always worth watching and do a wonderful job explaining how many Web 2.0 tools work. The videos can be downloaded and embedded into intranet pages for employees, allowing them to better understand the technologies.

The fact that these videos use such a low-tech approach to teaching about high-tech tools makes them very original and eye-catching.


Fighting malaria with Web 2.0

mosquito by aussiegall
Social networking site aims to help fight malaria:
[Via News at Nature – Most Recent]

New website gives smaller African projects a bigger profile.
[More]

An interesting approach – using social networking tools to increase awareness of anti-malaria projects and help fundraising efforts. It is a novel way to use these tools, but I wonder whether such efforts would be more successful as stand-alone projects or under the wings of larger social media entities.


Knowledge hoarding

diffusion by Bitterjug
Is knowledge hoarding all about your pay cheque?:
[Via Library clips]

The other day I posted on Participation is the currency of the knowledge economy.
The word “participation” can be interchanged with “social capital”, “conversation”, “contribution”, or “knowledge sharing”, but I chose “participation” because “conversation” cannot happen without “participation.” And “participation” sounds more involved, sustained, or perpetual than “contribution” or “knowledge sharing.”

Anyway, in that post I mentioned that the way companies currently operate is driven by each worker building their “intellectual capital” to get ahead and to differentiate themselves. The more “intellectual capital” you have, the more you are worth to the company. This means workers compete with each other, or at least try to hold unique knowledge that makes them an asset to the firm. In this environment, “knowledge sharing” would be the worst thing you could do, as you would be giving away your “edge”, giving away what makes you a unique asset to the company.

Of course we all know about the “wisdom of crowds”: an open and transparent participation model leads to ideas and conversation, which lead to discovery and collaboration. The act of sharing and finding saves others from re-inventing the wheel, saving money and project cycle-time.
A company that runs on a social capital model runs on the notion that “two minds are better than one”, so why not have a culture where these minds have open dialogue? In the end, this access to knowledge to help you with your work and to find new work brings the company closer to innovation and to more honest client relationships.

No matter how simple the tools, and even if people understand the benefits of “knowledge sharing”, it just won’t happen if the company culture is about “intellectual capital” rather than “social capital.”
[More]

Organizations that depend on the creativity and innovation of their employees will not be as successful if they tolerate knowledge hoarding as those that build learning, collaborative communities.

The diffusion of innovations has been well studied. It is an outgrowth of human social networks. The rate at which information traverses the network will determine how rapidly a new idea gets accepted and used.

If certain people hoard information, they prevent this flow. In hierarchical companies, this hoarding can be useful to the hoarder, since they can position themselves as the node through which the information must flow. Knowledge is power.

In the highly networked world found in many companies today, however, this is more difficult. Preventing information flow along other routes becomes harder. The hoarder loses all power if someone else spreads the knowledge.

Just as the Internet routes around damage when a node goes down, so do well-connected human social networks route around knowledge hoarders, diminishing their power.
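That routing-around can be sketched in a few lines. The five-person team below is entirely hypothetical: knowledge spreads along links, and we measure how far an idea gets once the hoarder “H” is cut out of the flow:

```python
from collections import deque

# Hypothetical team graphs, invented for illustration: an idea spreads
# along edges via breadth-first search, skipping the removed hoarder.

def reachable(edges, start, removed=None):
    """People an idea starting at `start` can reach, ignoring `removed`."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt != removed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hierarchy: every conversation runs through the hoarder H
hierarchy = [("A", "H"), ("B", "H"), ("C", "H"), ("D", "H")]
# Networked team: same people, plus direct peer-to-peer links
networked = hierarchy + [("A", "B"), ("B", "C"), ("C", "D")]

print(sorted(reachable(hierarchy, "A", removed="H")))  # ['A'] - the idea is stuck
print(sorted(reachable(networked, "A", removed="H")))  # ['A', 'B', 'C', 'D']
```

In the pure hierarchy the hoarder is a cut vertex and the idea dies with him; add even a few peer links and his removal costs the network nothing.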

Companies that lessen the power of hoarders will have more rapid and successful diffusion of new ideas that can have huge impacts on the bottom line. Organizations that fail to deal with hoarders will not be as adaptive or as responsive to innovation. And, if the hoarder takes their information elsewhere, the organization is left with much less than it had before.


Cutting edge Open Science

lecture hall by yusunkwon
Best. Freshmen. Evar.:
[Via Unqualified Offerings]

By Thoreau

I decided to give my freshmen a taste of real physics. I offered extra credit to anybody who could give me a useful critique of my grant proposal. Amazingly enough, two of my students actually rose to the occasion. Although they couldn’t really dissect the science, they could tell that I wasn’t really explaining why this would be significant for the field, and they told me what I’d need to say to convince them of the significance. (I guess some people just can’t appreciate the inherent AWESOMENESS of simulating a new technique for optical nanolithography and identifying the necessary molecular parameters.) They earned themselves some extra credit points for the upcoming midterm. Prior to this these students flew under my radar, but if this grant gets funded, they’ll be the first ones that I consider for research assistantships.

I don’t know many researchers who would do this but Thoreau accomplished something very useful. Not only were several deficiencies in the grant identified but the students may have lined up some nice work for themselves. A nice win-win situation.

I think the extra-credit idea is a nice approach. Anyone who can make it through a government grant (which can run well over 60 pages) should get some credit just for finishing it. The students were able to identify holes even without understanding the exact protocols.

I wonder if this could be applied further down the system, during the grant review process – not having students critique, but finding a way to open up the review process to a wider group of people.

I know from comments reviewers have given my grants that sometimes they really did not read what was written, since the text directly contradicted their comments. I have had comments from two reviewers that directly contradicted each other.

Now, these days, very few grants are awarded the first time they are submitted. So being able to answer comments is important. But what if the comments themselves are useless? Perhaps using a more Long Tail approach would help.

Obviously there are barriers to overcome (e.g. proprietary information) but I wonder?


Spanning the Chasm

chasm by soylentgreen23
Is There Still A Chasm?:
[Via SmoothSpan Blog]

An interesting post by Leigh has popped up on Techmeme. She wonders, as I have, whether the fundamental notion of Moore’s Chasm has changed. Leigh’s question is whether the generation that grew up on Technology still even thinks of it as early adoption, or if the behaviour has become so widespread that there really is no Chasm any longer.

It’s an interesting question, but I believe there will always be a Chasm of some sort. My question is whether the Early Adopter crowd is now so large, and the Internet so effective at reaching them, that perhaps it is possible to build a business without the painful dislocation that is Crossing the Chasm. Perhaps there are enough on the Early Adopter side to make a tidy business after all.

[I’m working on a more extended form of this comment I left at SmoothSpan. I hope to have it posted soon.]

If I remember correctly, the original adoption curve came from observations of the acceptance rates of new varieties of hybrid corn. So the curve itself describes a social phenomenon, not a technical one. I’m not convinced that there are really more early adopters than before. But, the rapid rate of change presented by new technologies may alter things. The rate of diffusion of technological change through the different groups may not be able to keep up with the furious rate of the change itself.

My feeling is that the problem is not what percentage sits in the early adopter segment. There are really four chasms, one between each pair of the five groups. Up to now, the most noticeable was the one between the early adopters and the early majority; crossing that chasm meant a narrow majority had adopted the new technology.
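For reference, Rogers’ textbook segmentation puts roughly 2.5% of adopters as innovators, 13.5% as early adopters, 34% each in the early and late majorities, and 16% as laggards, which places the famous chasm at about 16% cumulative adoption. A small sketch (the percentages are the standard ones; the helper function is my own illustration, not anything from the posts quoted above):

```python
# Rogers' standard adopter shares, derived from a normal curve.
SEGMENTS = [
    ("innovators", 0.025),
    ("early adopters", 0.135),
    ("early majority", 0.34),
    ("late majority", 0.34),
    ("laggards", 0.16),
]

def segment_at(adoption_fraction):
    """Name the segment the marginal adopter at this cumulative share belongs to."""
    cumulative = 0.0
    for name, share in SEGMENTS:
        cumulative += share
        if adoption_fraction <= cumulative:
            return name
    return SEGMENTS[-1][0]

# The classic chasm sits at ~16% cumulative adoption, between
# the early adopters and the early majority.
print(segment_at(0.10))  # early adopters
print(segment_at(0.20))  # early majority
print(segment_at(0.60))  # late majority
```

Each boundary in that list is a potential chasm; the 16% one has simply been the hardest to cross.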

But new technologies move so rapidly that each of the four chasms may now be much wider, not just the early adopter-early majority one. The early adopters can move so far ahead of those a little slower that the gap looks huge. (I’m going through this with Twitter. I almost do not want to start, because those who have been using it for even a short time seem so much more advanced and are doing so many ‘magical’ things that I fear I will never catch up. And I am an early adopter of technology.)

So my feeling is that there is a greater need for those who can span the chasms and help increase the rate of diffusion. It is rather like holding a tiger’s tail, since the rate of change itself seems to be accelerating.
