The Winner’s Curse

trophies by Snap®
Current Biomedical Publication System: A Distorted View of the Reality of Scientific Data?:
[Via Scholarship 2.0: An Idea Whose Time Has Come]

Why Current Publication Practices May Distort Science

Young NS, Ioannidis JPA, Al-Ubaydli O

PLoS Medicine Vol. 5, No. 10, e201 / October 7 2008

[doi:10.1371/journal.pmed.0050201]

Summary

The current system of publication in biomedical research provides a distorted view of the reality of scientific data that are generated in the laboratory and clinic. This system can be studied by applying principles from the field of economics. The “winner’s curse,” a more general statement of publication bias, suggests that the small proportion of results chosen for publication are unrepresentative of scientists’ repeated samplings of the real world.

The self-correcting mechanism in science is retarded by the extreme imbalance between the abundance of supply (the output of basic science laboratories and clinical investigations) and the increasingly limited venues for publication (journals with sufficiently high impact). This system would be expected intrinsically to lead to the misallocation of resources. The scarcity of available outlets is artificial, based on the costs of printing in an electronic age and a belief that selectivity is equivalent to quality.

Science is subject to great uncertainty: we cannot be confident now which efforts will ultimately yield worthwhile achievements. However, the current system abdicates to a small number of intermediates an authoritative prescience to anticipate a highly unpredictable future. In considering society’s expectations and our own goals as scientists, we believe that there is a moral imperative to reconsider how scientific data are judged and disseminated.

Full Text Available At:


[http://medicine.plosjournals.org/perlserv/?request=get-document&doi=10.1371/journal.pmed.0050201]
[More]

There are also several other article links at Scholarship 2.0 that are important to read. In particular, there is a discussion of the Winner’s Curse. This is the observation that the winner of an auction in which all the bidders have similar information will usually have overbid. The Winner’s Curse was first noted among oil companies bidding on offshore leases.

These authors make the point that publication in scientific journals may also suffer from a bias that resembles the Winner’s Curse. The winner of an auction must bid a price above the mean estimate of value in order to succeed. The authors argue that, in a similar way, papers often must report results well beyond the mean in order to get published.

It is an intriguing speculation and one that might deserve further examination. The data that do get published may be misleading, and this may be one reason why early published clinical trials are often not replicated later.
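To make this selection effect concrete, here is a minimal simulation sketch (my own illustration, not something from the paper): many studies estimate the same true effect with noise, but only the most extreme estimates clear an arbitrary “publication” threshold, so the mean of the published subset lands far above the truth. Every number in it is an illustrative assumption.

```python
# Hypothetical sketch of the winner's-curse style selection effect in
# publication: many noisy estimates of the same true effect, but only the
# most impressive ones are "published". All parameters are made up.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2      # the real effect every study is trying to measure
NOISE_SD = 0.5         # sampling noise in each study's estimate
N_STUDIES = 10_000     # number of independent studies
PUBLISH_CUTOFF = 0.8   # only estimates at least this large get "published"

estimates = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_STUDIES)]
published = [e for e in estimates if e >= PUBLISH_CUTOFF]

print(f"True effect:              {TRUE_EFFECT:.2f}")
print(f"Mean of all estimates:    {statistics.mean(estimates):.2f}")
print(f"Mean of published subset: {statistics.mean(published):.2f}")
print(f"Fraction published:       {len(published) / N_STUDIES:.1%}")
```

With these made-up numbers, only about one estimate in nine crosses the threshold, and the published mean comes out roughly five times the true effect, even though every individual study was honest.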

And they make the point that the huge amount of data being generated has not seen a corresponding increase in the venues available to publish those data or the conclusions based on them. This increases the likelihood that what is actually published does not represent the ‘real value’ of the data.

I expect this work to produce some interesting discussions. I would be surprised if it is entirely true but it does propose some changes that might be worth implementing.


Context wanted

photo by jared
Wanted: Too much news:
[Via Buzzworthy]

Information overload is something people generally try to avoid. But when there’s actual earth-shaking news like the current financial crisis, many people actively seek it out. The New York Times looks at the latest iteration of this phenomenon:
[More]

The key change from even five years ago is that finding information is now easy. It used to be that trained professionals with years of experience were needed to track down important information. They had to examine lots of journals, catalog the information in arcane ways, and generally learn where things were. Now almost anyone can do it.

It used to require huge amounts of money to create a new song or a new video. Not anymore. Technology changed that. The same is true for finding facts. The problem is that there are now too many facts, too much data. Converting them into useful information requires context.

Context is what is now desperately needed. Information used to be hard to retrieve and the context was usually supplied during the retrieval process. Now retrieval is easy but context is hard. Context can only come from social interactions provided by other humans. This is the power of human social networks and social media.

They provide context, if properly used.


Very, very fast

speeding by NathanFromDeVryEET
Your personal health: Free means education:
[Via business|bytes|genes|molecules]

One more post today. Russ Altman talks about how the cost of genotyping is asymptoting to free (eerie parallels to Chris Anderson’s Free). The post comes on the heels of the launch of Complete Genomics and the news that they will enable a $5000 genome in 2010. But that’s not the part that I want to talk about. It’s another part of the post:

We must work to educate every day people about how to interpret their genome and how to use the information beneficially. Trying to protect them in a paternalistic manner is going to fail, I fear–because it will be too easy to get the information cheaply. So solutions that rely on government regulations or mandatory insertion of a physician in the process of ordering genetic sequencing (don’t get me wrong, I love physicians–I am one) are not going to work.

Given past writings on this subject, no surprise, I agree wholeheartedly. We need to educate, and educate fast.

Many things will change when sequencing a genome becomes almost as fast in reality as it is on CSI. Without proper education, people will be open to really bad science.

The best approach may not be the government or physicians, who have no stake in really explaining things. Perhaps there will be trusted genome advocates whose job is to carefully explain to someone what their genome really means, in a way that truly educates them, and to provide quarterly updates as new information appears.


Nice point

Jay Cross on Twitter:
[Via Gurteen Knowledge-Log]
By David Gurteen

Do you still not understand all the fuss over Twitter? It took me a while too, but the penny has recently dropped for Jay Cross. See what he has to say.

Twitter provides an instant, real-time connection to the people you want to be connected to.

Credit:: Jay Cross

It is often hard to see how a new innovation can be used, or how to fit it into one’s workflow. Twitter may not be for everybody, but ideas like this are helpful.


Sharing can be hard and easy

sharing by Mr. Kris
Connected – Why is it so hard to get smart people to share?:
[Via Knowledge Jolt with Jack]

I came across Why is it so hard to get smart people to share? from Gia Lyons via a mention on the actKM mailing list. She covers some of the common downsides to attempting brain dumps from experts. Her notes reflect many of the conversations on this topic.

“There is a brigade charge underway to capture the wisdom (knowledge + experience) of the retiring corporate crowd. The urgency is perhaps driven by the fact that these “wisdom holders” will retire, then turn around and charge their former employers a hefty consulting fee for continuing their services. Not a bad gig if you can get it. But, those who have tried the knowledge management (KM) thing in the past will tell you that this harnessing, leveraging, capturing, harvesting – pick your favorite over-used word – is a hard row to hoe. And for the record, please do not try to harness or harvest my knowledge. I am not a horse, nor a corn crop.”

Back before knowledge management was a business term, expert systems work included the Knowledge Engineer role (and still does). This person was responsible for developing appropriate representations of the body of knowledge in the expert system. Quite often this included interviewing the experts to try to elucidate their rules and expertise: knowledge harvesting. While it worked reasonably well, there were always elements that either could not be discovered or could not be articulated by these means. As a result, expert systems never quite got to the point of perfection predicted by early proponents. And there was always some unsettling aspect of using expert systems alone that made people shy away.

Part of the problem is that the individual really sees no advantage to helping create such a system for the vague ‘group.’ People help other people but it takes a special kind of process to do a lot of work purely for the group without any recognition for performing the task.

Too many of these KM programs do not really take into account human needs and nature. The best way to move the tacit information of the expert into the explicit world of the community is to demonstrate to the expert how it will save them time and give them more of it to spend on what they want to do. This tacit-explicit transformation of information works best when it is an emergent property of a person’s workflow, not the primary goal.

The trap, I think, is in thinking that KM (or any other knowledge discipline) is only about writing things down. This trap is easy to fall into when the focus of the discussion is on the technology, rather than on people and process.

Back to basics then. There are experts within your business, and that expertise is all over the map from arcane technical topics to customer experts to company lore experts. They are employed because their expertise supports the business at some level.

Experts do a lot of things in the context of their work. They apply their expertise to solving business problems, whether in the lab or in the board room. They spend time honing their expertise: talking with people, attending conferences, reading, doing blue-sky experiments, etc. They also respond to questions and requests related to this expertise. (Experts do a lot of non-expert things in the company too — including learning from other experts.)

Most corporate experts hate repeating the same things over and over again. By putting an FAQ about their area of expertise up on a wiki, they can easily remove those sorts of interruptions and concentrate on problems that require their focus. Thus, there is still a tacit-explicit transformation, but it obviously helps the expert and provides more time for them to use their expertise where it can really help.

And, as an added bonus, there are now metrics to demonstrate how important the expert is. Simply observing how often the FAQ is accessed, and by whom, will be valuable. It becomes possible to compare the importance of different experts in the community, making it easier to reward them and easier for the experts to demonstrate their usefulness.
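As a purely hypothetical sketch of what such a metric could look like, the snippet below tallies views and distinct readers per FAQ page from a simple access log. The log format, page names, and user names are all invented for illustration; a real wiki would expose its own logs or statistics interface.

```python
# Hypothetical sketch: tally views and distinct readers for each expert's FAQ
# page from a simple "timestamp user page" access log. The log format and the
# example entries are assumptions made for illustration only.
from collections import Counter

log_lines = [
    "2008-10-01T09:12:00 alice FAQ:Mass-Spec",
    "2008-10-01T09:45:00 bob   FAQ:Mass-Spec",
    "2008-10-02T11:02:00 carol FAQ:Cloning",
    "2008-10-03T14:30:00 bob   FAQ:Mass-Spec",
]

page_views = Counter()
readers_by_page = {}

for line in log_lines:
    _timestamp, user, page = line.split()
    page_views[page] += 1
    readers_by_page.setdefault(page, set()).add(user)

for page, views in page_views.most_common():
    readers = len(readers_by_page[page])
    print(f"{page}: {views} views from {readers} distinct readers")
```

Even something this crude gives the expert, and the organization, a concrete record of who relies on which expertise.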

So, when it comes to the experts, what do we want them to do? All of these things – in the right balance at the right times. Do we really want them to spend time being interviewed by knowledge engineers or writing down what they know outside of any context? Why not facilitate their current work? I don’t think these projects should get in the way of their work by forcing them into artificial “harvesting” situations.

There are many directions to go from here. For example, maybe the mentees should be writing down what they learn, instead of asking the expert to take on the entire burden. If the knowledge transfer job is necessary, then it has to be in the context of work happening now or in recollection of how a particular project ran. At least then there is some context around which to hang expertise.

There is a balance of sorts between being the expert, becoming a better expert, and growing others’ expertise. Adding to the workload only upsets this balance – and upsets the very people we are asking to “share.”

People generally help other people, not a faceless organization. But people often like recognition for their help. A simple use of Web 2.0 tools helps accomplish both of these, while permitting the organization to capture the expertise of its people in a useful fashion. A real win-win.


The harder stuff

hard by kevindooley
Getting Web 2.0 right: The hard stuff vs. the harder stuff:
[Via O’Reilly Radar]
Here begins a typical success story:

I had a powerful conversation recently in Europe with one of the top executives of a major industrial company. They have 100K+ employees in over 50 countries. When he joined five years ago their business was struggling and in need of major transformation; their stock was at two dollars a share, they had ethics issues and product quality problems – you name the malady, they were suffering from it…

Fast forward to 2008 and now they are one of the most extraordinary success stories in Europe – stock is over $28 a share, great profits, growing operations, well regarded in the business community etc. When you fly through a European airport they are everywhere.

I asked him how they were able to turn such a large, multinational ship around.

He told me most executives talk about “the hard stuff” vs. “the soft stuff”. Their focus for success in the organization is on the hard stuff – finance, technology, manufacturing, R&D, Sales – where the money is to be found, where costs savings are to be made. The soft stuff – leadership, culture, change and implementation – is there in rhetoric but not in reality (e.g., “people are our most important resource”). But the truth is that it is not the “hard stuff” vs. the “soft stuff”, but the hard stuff vs. the harder stuff. And it is this “harder stuff” that drives both revenues and profits by making or breaking a decision, leading a project to a successful conclusion – or not, and allowing for effective collaboration within a business unit or an organization – or not. He told me it was a consistent focus on the harder stuff that allowed them to turn their company around.

The harder stuff is that way because current approaches offer few ways to measure it. How can one demonstrate that collaboration was critical? How much collaboration was needed for success? Who needed to be involved that was not?

One of the magic aspects of Web 2.0 technologies is that they often provide just these sorts of data. When used well, they explicitly show what went into a collaboration, who needs to be involved, and who is holding things up.

This is an apt description of the problems we face in bringing Web 2.0 into the enterprise. Web 2.0 is a game changer – it holds the potential to turbo-charge back office functions, foster collaboration and transform every business unit in the enterprise. Yet the resistance occurs when it comes down to implementing Web 2.0 because it represents a series of shifts that challenge traditional business culture and models of leadership. How often have I heard the knee-jerk reaction, “we can’t let our customers talk to each other” or “we don’t share our data” or “we are going to upgrade to a new platform – we are on a three year plan to get it done” (I keep a list of these reactions so please help me add to it). If developing a web 2.0 strategy is the hard stuff – moving that strategy forward is the harder stuff – and the bigger the company I work with – the harder the harder stuff is.

They need to understand that companies using these technologies will be ahead in two major ways: their employees will be more productive and innovative, and they will have better metrics to refine the process and make it work even better.


Comments are a conversation

conversation by cliff1066

Dan Schwabel’s 5 Free Tools For Reputation Management introduced me to a new listening tool, BackType. It solves the problem of monitoring blog comments where people specifically mention you. People can make comments about you on other blogs, and if you only track links from blog posts, you won’t see them. BackType lets you find, follow and share comments from across the web. I gave it a whirl and it turned up some interesting results.

You can also track other bloggers and see where they commented — I might do this only to study how the masters do it. An old trick is to observe people who do social media really well and learn from observation. It’s interesting to observe Chris Brogan’s commenting activity.

Update: Based on a comment to this post, I’m adding some context to comment trackers.

These services let you track conversations that are important to your organization and issue. They also allow content creators to aggregate their online activity and expertise from across the social Web into one centralized, portable profile.

Questions To Ask Before You Dive In:

What do you need to track?
How will you respond to negative comments?
Will you respond to all comments?
How to prioritize?
Which tool is right for you?

Why Commenting and Comment Tracking Is Important

Commenting is the lifeblood of blogging and key to building a community.
They’re a way to get more minds into the story.
They’re a way to annotate someone’s thoughts such that the ideas can take on another dimension.
They’re a way to establish authority in your content niche.

[More]

Blogs are useful by themselves to the individual blogger. But when comments are added, they allow others to become part of the conversation and to create connections back to their own networks. Comments enhance the power of blogs.

However, keeping track of comments across a series of blogs can be difficult. Comments are not usually included in the RSS newsfeeds for the sites. The tools described here help provide a solution, making it easier to follow conversations that are important to you, even when they are not happening directly on your own blog.
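For the simplest case, where a blog does publish a comment feed, you can even roll a bare-bones tracker yourself. The sketch below uses the third-party feedparser library to scan a couple of comment feeds for keywords; the feed URLs and keywords are placeholders, and a service like BackType does far more than this.

```python
# Minimal comment-tracking sketch using feedparser (pip install feedparser).
# The feed URLs below are placeholders; many blog platforms publish a
# site-wide comment feed alongside the main post feed.
import feedparser

comment_feeds = [
    "http://example-blog-one.com/comments/feed",   # hypothetical URL
    "http://example-blog-two.com/feed/comments",   # hypothetical URL
]

KEYWORDS = ("yourname", "yourproject")  # conversations you want to follow

for url in comment_feeds:
    feed = feedparser.parse(url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in KEYWORDS):
            print(entry.get("title", "(no title)"), "->", entry.get("link", ""))
```

Dedicated services save you from maintaining lists like this, but the principle is the same: pull the comment streams together in one place and filter for what matters to you.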

Enhancing conversations is what Web 2.0 is all about.


A wake-up lawsuit

tracks by *Micky
Zotero facing a lawsuit:
[Via Bench Marks]

I’ve written about Zotero before, it’s an intriguing tool, essentially a Firefox plug-in for managing your reference list and other pieces of information. It’s a bit of a hybrid between online management tools like Connotea and things like Papers which you store on your own computer.

The bad news is that Thomson Reuters, the manufacturers of EndNote, are suing George Mason University and the Commonwealth of Virginia because a new version of Zotero lets you take your EndNote reference lists and convert them for use in Zotero. Yes, this is the same Thomson of Thomson ISI, secret gatekeepers of journal impact factors. They really seem to be going out of their way to lose what little goodwill they have left with the scientific community. It will be interesting to see if this reverse engineering for interoperability holds up in court as something that should be prevented.

This is sadly typical. I loved EndNote back in the 90s because it was a great Mac product, much better for my needs than its competition, Reference Manager, which was much more of a Windows product. Niles Software really listened to what people wanted and added some very useful features, such as linking a library to a Word document so you could put citations directly into Word.

I convinced others at my company to buy it. I had searches for a wide variety of topics. The purchase of Niles Software by ISI (now part of Thomson) started a period of fitful Mac updates and costly upgrades. I have since moved to other applications (most recently Sente) that did what I wanted for a more reasonable price.

This lawsuit seems like a losing gambit to me, since any user can convert their library to EndNote XML, which any other application can read. All it will do is drive users away from Thomson’s software as customers find new uses for their own data.
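As a rough illustration of why that open export matters, here is a hedged sketch that reads an exported EndNote XML file with Python’s standard library and pulls out titles, years, and authors. The element names (record, titles/title, dates/year, contributors/authors/author) follow a typical EndNote XML export, but treat them as assumptions and check them against your own file.

```python
# Hedged sketch: extract basic fields from an EndNote XML export using only
# the standard library. Element names are based on a typical export and
# should be verified against the actual file; the path is a placeholder.
import xml.etree.ElementTree as ET

def text_of(parent, path):
    """Join all text under the first matching element, or return ''."""
    node = parent.find(path)
    return "".join(node.itertext()).strip() if node is not None else ""

def extract_references(xml_path):
    tree = ET.parse(xml_path)
    references = []
    for record in tree.iter("record"):
        references.append({
            "title": text_of(record, "titles/title"),
            "year": text_of(record, "dates/year"),
            "authors": [
                "".join(a.itertext()).strip()
                for a in record.findall("contributors/authors/author")
            ],
        })
    return references

if __name__ == "__main__":
    for ref in extract_references("my_library.xml"):  # placeholder filename
        print(ref["year"], "-", ref["title"])
```

Once the references are in a plain structure like this, writing them back out as BibTeX, RIS, or anything else is straightforward, which is exactly why a documented export format protects the user.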

The database really belongs to the user, not to Thomson, but they try to obscure that by using a proprietary format. This hurts the end user. Say I have a 10-year-old EndNote library I forgot to convert, with perhaps 8,000 entries, and an old version of EndNote that no longer works in OS X. How am I supposed to move it to what I currently use without having to purchase EndNote simply for this one task? My favorite example of this horrendous process is the Mac cookbook program Mangia!

In the early 90s, this was the best program of its type, bar none. It let you build a huge recipe library that could be easily displayed and searched, and it could print out a grocery list. Many of us loved it, but it does not work in OS X.

Mangia is no longer produced by anyone. The database it created was proprietary and undocumented, and the program had no export feature. Now it no longer runs on any modern computer, so every user is left with a database that they created but can no longer use. There are workarounds to try to get at the data, but they are not satisfactory, and they require the user to be able to run Mangia, which is effectively impossible for virtually everyone using a Mac today (I think my mother kept an old Mac around just so she could still use this one program. She has a huge library of recipes).

So all I have on my computer is a dead database of Mangia recipes that can never be used again. All that work over the years to create a database, and it is useless. This is why people need to be careful when they choose a database application.

Companies that respond to enduser innovation by suing, rather than innovating, are not ones that I see being very successful in the long run. There are other programs that can do the same thing. They are often created by companies that are more responsive and user friendly than larger companies. Suing users just drives people to more open formats.

More from Bench Marks:

More importantly, it’s yet again, a lesson in tying yourself to one locked-down proprietary format for your data and your work tools. If you’ve put a huge amount of time and effort into maintaining your EndNote list and a better tool comes along and becomes the standard, all that work may go to waste and you’ll have to start over again. A similar lesson was learned last week from anyone who purchased music downloads from WalMart. Richard Stallman recently gave a warning along the same lines about the much-hyped concept of “cloud computing”.
As you experiment with new online tools for your research, heed these lessons well. Demand tools that support open standards and open formats, tools where if you put in an effort (and most of these tools demand a lot of effort), you can get that work out again so you don’t have to repeat it for the next tool you try. Further discussion here and here.

This gets at the same topic: who owns the data? There are some very important and useful aspects to having data in the cloud. It makes it very easy for people to access their data from anywhere. Small groups can have a slew of Web 2.0 applications up and running with little cost for maintenance or upkeep. This has some very real benefits.

But it must be balanced against the possibility that you no longer control the data. Your work is on servers belonging to someone else. They can change ownership and all of a sudden the cloud is not so free. To me, cloud computing is great for things that need rapid prototyping, easy access and are, at heart, ephemeral.

There are many types of data. Some of it is short term. It used to only be found on yellow sheets of paper or perhaps the multiple drafts of a paper. These data fit quite well in the cloud. I have an email address in the cloud that I only use for online purchases. Anything going there is a result of those purchases and does not clog up my real email.

But it is foolhardy for any organization to put the guts of its data anywhere it does not have absolute control over. Losing access to these data would have severe ramifications for the business.

So, echoing David, stay away from anything that ties you into a specific, closed format. It can come back to bite you big time.
