The Winner’s Curse

Current Biomedical Publication System: A Distorted View of the Reality of Scientific Data?:
[Via Scholarship 2.0: An Idea Whose Time Has Come]

Why Current Publication Practices May Distort Science

Young NS, Ioannidis JPA, Al-Ubaydli O

PLoS Medicine Vol. 5, No. 10, e201 / October 7 2008

[doi:10.1371/journal.pmed.0050201]

Summary

The current system of publication in biomedical research provides a distorted view of the reality of scientific data that are generated in the laboratory and clinic. This system can be studied by applying principles from the field of economics. The “winner’s curse,” a more general statement of publication bias, suggests that the small proportion of results chosen for publication are unrepresentative of scientists’ repeated samplings of the real world.

The self-correcting mechanism in science is retarded by the extreme imbalance between the abundance of supply (the output of basic science laboratories and clinical investigations) and the increasingly limited venues for publication (journals with sufficiently high impact). This system would be expected intrinsically to lead to the misallocation of resources. The scarcity of available outlets is artificial, based on the costs of printing in an electronic age and a belief that selectivity is equivalent to quality.

Science is subject to great uncertainty: we cannot be confident now which efforts will ultimately yield worthwhile achievements. However, the current system abdicates to a small number of intermediates an authoritative prescience to anticipate a highly unpredictable future. In considering society’s expectations and our own goals as scientists, we believe that there is a moral imperative to reconsider how scientific data are judged and disseminated.

Full Text Available At:


[http://medicine.plosjournals.org/perlserv/?request=get-document&doi=10.1371/journal.pmed.0050201]
[More]

There are also several other article links at Scholarship 2.0 that are important to read. In particular, there is a discussion of the Winner’s Curse. This is the observation that the winner of an auction in which all the bidders have similar information will usually overbid. The phenomenon was first noticed among oil companies bidding on offshore leases.

These authors make the point that publication in scientific journals may suffer from a bias that resembles the Winner’s Curse. The winner of an auction has to bid a price above the mean estimate in order to succeed. The authors argue that, in a similar way, papers often have to present data beyond the mean in order to get published.

It is an intriguing speculation and one that deserves further examination. The data that do get published may be misleading, and this may be one reason why early published clinical trials so often fail to be replicated later.
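To make the statistical point concrete, here is a minimal sketch in Python (with made-up numbers, not data from the paper) of how selecting only the most striking results inflates the apparent effect:

```python
import random
import statistics

# Hypothetical setup: many labs estimate the same true effect,
# each with independent measurement noise.
TRUE_EFFECT = 1.0
NOISE_SD = 2.0
N_LABS = 10_000

estimates = [random.gauss(TRUE_EFFECT, NOISE_SD) for _ in range(N_LABS)]

# "Publication" keeps only the top 5% of estimates, a crude stand-in
# for journals selecting the most striking results.
cutoff = sorted(estimates)[int(0.95 * N_LABS)]
published = [e for e in estimates if e >= cutoff]

print("mean of all estimates:      ", round(statistics.mean(estimates), 2))
print("mean of 'published' results:", round(statistics.mean(published), 2))
# The published mean lands well above the true effect of 1.0:
# the winner's curse applied to publication.
```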

And they make the point that the huge amount of data being generated has not seen a corresponding increase in the venues for publishing the data or the conclusions based on it. This increases the likelihood that what is actually published does not represent the ‘real value’ of the data.

I expect this work to produce some interesting discussions. I would be surprised if it is entirely true, but it does propose some changes that might be worth implementing.


Context wanted

Wanted: Too much news:
[Via Buzzworthy]

Information overload is something people generally try to avoid. But when there’s actual earth-shaking news like the current financial crisis, many people actively seek it out. The New York Times looks at the latest iteration of this phenomenon:
[More]

The key change from even five years ago is that finding information is now easy. It used to be that trained professionals with years of experience were needed to track down important information. They had to examine lots of journals, catalog the information in arcane ways, and generally learn where things were. Now almost anyone can do it.

It used to require huge amounts of money to create a new song or a new video. Not anymore; technology changed that. The same goes for finding facts. The problem is that there are now too many facts, too much data. Converting all that data into information requires context.

Context is what is now desperately needed. Information used to be hard to retrieve and the context was usually supplied during the retrieval process. Now retrieval is easy but context is hard. Context can only come from social interactions provided by other humans. This is the power of human social networks and social media.

They provide context, if properly used.


Very, very fast

Your personal health: Free means education:
[Via business|bytes|genes|molecules]

One more post today. Russ Altman talks about how the cost of genotyping is asymptoting to free (eerie parallels to Chris Anderson’s Free). The post comes on the heels of the launch of Complete Genomics and the announcement that they will enable a $5000 genome in 2010. But that’s not the part that I want to talk about. It’s another part of the post:

We must work to educate everyday people about how to interpret their genome and how to use the information beneficially. Trying to protect them in a paternalistic manner is going to fail, I fear, because it will be too easy to get the information cheaply. So solutions that rely on government regulations or mandatory insertion of a physician in the process of ordering genetic sequencing (don’t get me wrong, I love physicians; I am one) are not going to work.

Given past writings on this subject, no surprise, I agree wholeheartedly. We need to educate, and educate fast.

Many things will change when sequencing a genome becomes almost as fast in reality as it is on CSI. Without proper education, people will be open to some really bad science.

The best approach may not come from the government or from physicians, who have no real stake in explaining things. Perhaps there will be trusted genome advocates whose job is to carefully explain to someone what their genome really means, in a way that truly educates them, and to provide quarterly updates as new information appears.


Nice point

Jay Cross on Twitter:
[Via Gurteen Knowledge-Log]
By David Gurteen

Do you still not understand all the fuss over Twitter? It took me a while too, but the penny has recently dropped for Jay Cross. See what he has to say.

Twitter provides an instant, real-time connection to the people you want to be connected to.

Credit: Jay Cross

It is often hard to see how a new innovation can be used and how to fit it into one’s workflow. Twitter may not be for everybody, but ideas like this are helpful.


Sharing can be hard and easy

Connected – Why is it so hard to get smart people to share?:
[Via Knowledge Jolt with Jack]

I came across Why is it so hard to get smart people to share? from Gia Lyons via a mention on the actKM mailing list. She covers some of the common downsides to attempting brain dumps from experts. Her notes reflect many of the conversations on this topic.

“There is a brigade charge underway to capture the wisdom (knowledge + experience) of the retiring corporate crowd. The urgency is perhaps driven by the fact that these “wisdom holders” will retire, then turn around and charge their former employers a hefty consulting fee for continuing their services. Not a bad gig if you can get it. But, those who have tried the knowledge management (KM) thing in the past will tell you that this harnessing, leveraging, capturing, harvesting – pick your favorite over-used word – is a hard row to hoe. And for the record, please do not try to harness or harvest my knowledge. I am not a horse, nor a corn crop.”

Back before knowledge management was a business term, expert systems work included the Knowledge Engineer role (and still does). This person was responsible for developing appropriate representations of the body of knowledge in the expert system, and quite often that meant interviewing the experts to try to elicit their rules and expertise: knowledge harvesting. While it worked okay, there were always elements that either could not be discovered or could not be articulated by these means. As a result, expert systems never quite got to the point of perfection predicted by early proponents, and there was always some unsettling aspect of using expert systems alone that made people shy away.

Part of the problem is that the individual sees no real advantage in helping create such a system for a vague ‘group.’ People help other people, but it takes a special kind of process to get someone to do a lot of work purely for the group without any recognition for performing the task.

Too many of these KM programs do not really take human needs and nature into account. The best way to move the tacit knowledge of the expert into the explicit world of the community is to show the expert how doing so saves them time and frees them to spend more of it on what they want to do. This tacit-explicit transformation works best when it is an emergent property of a person’s workflow, not the primary reason for it.

The trap, I think, is in thinking that KM (or any other knowledge discipline) is only about writing things down. This trap is easy to fall into when the focus of the discussion is on the technology, rather than on people and process.

Back to basics then. There are experts within your business, and that expertise is all over the map from arcane technical topics to customer experts to company lore experts. They are employed because their expertise supports the business at some level.

Experts do a lot of things in the context of their work. They apply their expertise to solving business problems, whether in the lab or in the board room. They spend time honing their expertise: talking with people, attending conferences, reading, doing blue-sky experiments, etc. They also respond to questions and requests related to this expertise. (Experts do a lot of non-expert things in the company too — including learning from other experts.)

Most corporate experts hate repeating the same things over and over again. By putting an FAQ about their area of expertise up on a wiki, they can remove many of those interruptions and concentrate on problems that require their focus. There is still a tacit-explicit transformation, but it obviously helps the experts and gives them more time to use their expertise where it can really help.

And, as an added bonus, there are now metrics to demonstrate how important the expert is. Simply observing how often the FAQ is accessed, and by whom, is valuable. It becomes possible to compare the importance of different experts in the community, which makes it easier to reward them and easier for the experts to demonstrate their usefulness.
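As a rough illustration of the kind of metric this makes possible, here is a small Python sketch that tallies FAQ hits from a wiki access log. The log file name, its one-entry-per-line "timestamp user page" format, and the /faq/ namespace are all assumptions made for the example:

```python
from collections import Counter

page_hits = Counter()
visitors = Counter()

# Hypothetical log format: "timestamp user page" on each line.
with open("wiki_access.log") as log:
    for line in log:
        parts = line.split()
        if len(parts) < 3:
            continue
        _timestamp, user, page = parts[:3]
        if page.startswith("/faq/"):      # assumed FAQ namespace
            page_hits[page] += 1
            visitors[user] += 1

print("Most-consulted FAQ pages:", page_hits.most_common(3))
print("Heaviest FAQ users:      ", visitors.most_common(3))
```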

So, when it comes to the experts, what do we want them to do? All of these things – in the right balance at the right times. Do we really want them to spend time being interviewed by knowledge engineers or writing down what they know outside of any context? Why not facilitate their current work? I don’t think these projects should get in the way of their work by forcing them into artificial “harvesting” situations.

There are many directions to go from here. For example, maybe the mentees should be writing down what they learn, instead of asking the expert to take on the entire burden. If the knowledge transfer job is necessary, then it has to be in the context of work happening now or in recollection of how a particular project ran. At least then there is some context around which to hang expertise.

There is a balance of sorts between being the expert, becoming a better expert, and growing others’ expertise. Adding to the workload only upsets this balance – and upsets the very people we are asking to “share.”

People generally help other people, not a faceless organization, but they often like recognition for their help. A simple use of Web 2.0 tools accomplishes both, while permitting the organization to capture the expertise of its people in a useful fashion. A real win-win.


The harder stuff

Getting Web 2.0 right: The hard stuff vs. the harder stuff:
[Via O’Reilly Radar]
Here begins a typical success story:

I had a powerful conversation recently in Europe with one of the top executives of a major industrial company. They have 100K+ employees in over 50 countries. When he joined five years ago their business was struggling and in need of major transformation; their stock was at two dollars a share, they had ethics issues and product quality problems – you name the malady, they were suffering from it…

Fast forward to 2008 and now they are one of the most extraordinary success stories in Europe – stock is over $28 a share, great profits, growing operations, well regarded in the business community etc. When you fly through a European airport they are everywhere.

I asked him how they were able to turn such a large, multinational ship around.

He told me most executives talk about “the hard stuff” vs. “the soft stuff”. Their focus for success in the organization is on the hard stuff – finance, technology, manufacturing, R&D, Sales – where the money is to be found, where costs savings are to be made. The soft stuff – leadership, culture, change and implementation – is there in rhetoric but not in reality (e.g., “people are our most important resource”). But the truth is that it is not the “hard stuff” vs. the “soft stuff”, but the hard stuff vs. the harder stuff. And it is this “harder stuff” that drives both revenues and profits by making or breaking a decision, leading a project to a successful conclusion – or not, and allowing for effective collaboration within a business unit or an organization – or not. He told me it was a consistent focus on the harder stuff that allowed them to turn their company around.

The harder stuff is harder because few current approaches can measure it. How can one demonstrate that collaboration was critical? How much collaboration was needed for success? Who needed to be involved but was not?

One of the magic aspects of Web 2.0 technologies is that they often provide just these sorts of data. When used well, they explicitly show what went into a collaboration, who needs to be involved, and who is holding things up.

This is an apt description of the problems we face in bringing Web 2.0 into the enterprise. Web 2.0 is a game changer – it holds the potential to turbo-charge back office functions, foster collaboration and transform every business unit in the enterprise. Yet the resistance occurs when it comes down to implementing Web 2.0 because it represents a series of shifts that challenge traditional business culture and models of leadership. How often have I heard the knee-jerk reaction, “we can’t let our customers talk to each other” or “we don’t share our data” or “we are going to upgrade to a new platform – we are on a three year plan to get it done” (I keep a list of these reactions so please help me add to it). If developing a web 2.0 strategy is the hard stuff – moving that strategy forward is the harder stuff – and the bigger the company I work with – the harder the harder stuff is.

They need to understand that companies using these technologies will be ahead in two major ways: their employees will be more productive and innovative, and they will have better metrics to refine the process and make it work even better.


Comments are a conversation


Dan Schwabel’s 5 Free Tools For Reputation Management introduced me to a new listening tool, BackType. It solves the problem of monitoring blog comments where people specifically mention you. People can make comments about you on other blogs, and if you only track links from blog posts, you won’t see them. BackType lets you find, follow, and share comments from across the web. I gave it a whirl and it turned up some interesting results.

You can also track other bloggers and see where they commented — I might do this only to study how the masters do it. An old trick is to watch people who do social media really well and learn from what they do. It’s interesting to observe Chris Brogan’s commenting activity.

Update: Based on a comment to this post, I’m adding some context to comment trackers.

These services let you track conversations that are important to your organization and issue. They also allow content creators to aggregate their online activity and expertise from across the social Web into one centralized, portable profile.

Questions To Ask Before You Dive In:

What do you need to track?
How will you respond to negative comments?
Will you respond to all comments?
How to prioritize?
Which tool is right for you?

Why Commenting and Comment Tracking Is Important

Commenting is the lifeblood of blogging and key to building a community.
They’re a way to get more minds into the story.
They’re a way to annotate someone’s thoughts so that the ideas can take on another dimension.
They’re a way to establish authority in your content niche.

[More]

Blogs are useful by themselves to the individual blogger. But when comments are added, others can become part of the conversation and leave connections back into their own networks. Comments enhance the power of blogs.

However, keeping track of comments across a series of blogs can be difficult. Comments are not usually included in a site’s RSS feeds. The tools described here help provide a solution, making it easier to follow the conversations that matter to you even when they are not happening on your own blog.
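For readers who prefer to roll their own, here is a minimal sketch of the same idea using the Python feedparser package. It assumes the blogs you follow expose comment feeds, and the feed URLs shown are placeholders:

```python
import feedparser  # pip install feedparser

# Placeholder URLs: substitute the comment feeds of the blogs you follow.
comment_feeds = [
    "https://example-blog-one.com/comments/feed",
    "https://example-blog-two.com/?feed=comments-rss2",
]

for url in comment_feeds:
    feed = feedparser.parse(url)
    for entry in feed.entries[:5]:            # latest few comments per feed
        print(entry.get("title", "(no title)"))
        print("  ", entry.get("link", ""))
```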

Enhancing conversations is what Web 2.0 is all about.


A wakeup lawsuit

Zotero facing a lawsuit:
[Via Bench Marks]

I’ve written about Zotero before; it’s an intriguing tool, essentially a Firefox plug-in for managing your reference list and other pieces of information. It’s a bit of a hybrid between online management tools like Connotea and things like Papers, which you store on your own computer.

The bad news is that Thomson Reuters, the manufacturers of EndNote, are suing George Mason University and the Commonwealth of Virginia because a new version of Zotero lets you take your EndNote reference lists and convert them for use in Zotero. Yes, this is the same Thomson of Thomson ISI, secret gatekeepers of journal impact factors. They really seem to be going out of their way to lose what little goodwill they have left with the scientific community. It will be interesting to see if this reverse engineering for interoperability holds up in court as something that should be prevented.

This is sadly typical. I loved EndNote back in the 90s because it was a great Mac product, much better for my needs than its competitor, Reference Manager, which was much more of a Windows product. Niles Software really listened to what people wanted and added some very useful features, such as linking a library to a Word document so you could put citations directly into Word.

I convinced others at my company to buy it. I had searches for a wide variety of topics. The purchase of Niles Software by ISI (now part of Thomson) started a period of fitful Mac updates and costly upgrades. I have since moved to other applications (most recently Sente) that did what I wanted for a more reasonable price.

This lawsuit seems like a losing gambit to me, since any user can export their library to EndNote XML, which any other application can read. All the suit will do is drive users away from Thomson’s software as customers find new uses for their data.
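As a rough sketch of why an XML export defuses the lock-in, here is how a few lines of standard-library Python could read such a file. The file name is hypothetical, and the element paths (record, titles/title, contributors/authors/author) follow a typical EndNote XML export but may differ between versions:

```python
import xml.etree.ElementTree as ET

tree = ET.parse("my_library.xml")  # hypothetical exported library

for record in tree.iter("record"):
    # Pull the title text, falling back if the element is missing.
    title_el = record.find("titles/title")
    title = "".join(title_el.itertext()).strip() if title_el is not None else "(untitled)"
    authors = ["".join(a.itertext()).strip()
               for a in record.findall("contributors/authors/author")]
    print(title)
    print("   ", "; ".join(authors))
```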

The database really belongs to the user, not to Thomson, but Thomson tries to obscure that by using a proprietary format. This hurts the end user. Say I have a 10-year-old EndNote library I forgot to convert, with perhaps 8000 entries, and an old version of EndNote that no longer works in OS X. How am I supposed to move it to what I currently use without having to purchase EndNote simply for this one task? My favorite example of this horrendous process is the Mac cookbook program Mangia!

In the early 90s, this was the best program of its type, bar none. It let you build a huge recipe library that could be easily displayed and searched, and it could print a grocery list. Many of us loved it, but it does not work in OS X.

Mangia is no longer produced by anyone. The database it created was proprietary and undocumented, and the program had no export feature. Now it no longer runs on any current computer, so every user has a database they created that is unusable. There are workarounds to try to get at the data, but they are not satisfactory, and they require the user to be able to run Mangia, which is effectively impossible for virtually everyone using a Mac today. (I think my mother kept an old Mac around just so she could still use this one program. She has a huge library of recipes.)

So all I have on my computer is a dead database of Mangia recipes that can never be used again. All that work over the years to create it, and it is useless. This is why people need to be careful when they choose a database application.

Companies that respond to end-user innovation by suing, rather than innovating, are not ones that I see being very successful in the long run. There are other programs that can do the same things, often created by companies that are more responsive and user friendly than the larger players. Suing users just drives people toward more open formats.

More from Bench Marks:

More importantly, it’s yet again, a lesson in tying yourself to one locked-down proprietary format for your data and your work tools. If you’ve put a huge amount of time and effort into maintaining your EndNote list and a better tool comes along and becomes the standard, all that work may go to waste and you’ll have to start over again. A similar lesson was learned last week from anyone who purchased music downloads from WalMart. Richard Stallman recently gave a warning along the same lines about the much-hyped concept of “cloud computing”.
As you experiment with new online tools for your research, heed these lessons well. Demand tools that support open standards and open formats, tools where if you put in an effort (and most of these tools demand a lot of effort), you can get that work out again so you don’t have to repeat it for the next tool you try. Further discussion here and here.

This gets at the same topic: who owns the data? There are some very important and useful aspects to having data in the cloud. It makes it easy for people to access their data from anywhere, and small groups can have a slew of Web 2.0 applications up and running with little cost for maintenance or upkeep. These are very real benefits.

But it must be balanced against the possibility that you no longer control the data. Your work is on servers belonging to someone else; they can change ownership, and all of a sudden the cloud is not so free. To me, cloud computing is great for things that need rapid prototyping and easy access and that are, at heart, ephemeral.

There are many types of data. Some of it is short term. It used to only be found on yellow sheets of paper or perhaps the multiple drafts of a paper. These data fit quite well in the cloud. I have an email address in the cloud that I only use for online purchases. Anything going there is a result of those purchases and does not clog up my real email.

But it is foolhardy for any organization to put the guts of its data anywhere it does not have absolute control over. Losing access to these data would have severe ramifications for the business.

So, echoing David, stay away from anything that ties you into a specific, closed format. It can come back to bite you big time.


A generational war

Social Media vs. Knowledge Management: A Generational War:
[Via Enterprise 2.0 Blog]

You’d think Knowledge Management (KM), that venerable IT-based social engineering discipline which came up with evocative phrases like “community of practice,” “expertise locater,” and “knowledge capture,” would be in the vanguard of the 2.0 revolution. You’d be wrong. Inside organizations and at industry fora today, every other conversation around social media (SM) and Enterprise 2.0 seems to turn into a thinly-veiled skirmish within an industry-wide KM-SM shadow war. I suppose I must be a little dense, because it took not one, not two, but three separate incidents before I realized there was a war on. Here’s what’s going on: KM and SM look very similar on the surface, but are actually radically different at multiple levels, both cultural and technical, and are locked in an undeclared cultural war for the soul of Enterprise 2.0. And the most hilarious part is that most of the combatants don’t even realize they are in a war. They think they are loosely-aligned and working towards the same ends, with some minor differences of emphasis. So let me tell you about this war and how it is shaping up. Hint: I have credible neutral “war correspondent” status because I was born in 1974.

[More]

A very clear post that describes the conflict between Boomer and Millennial thinking when it comes to dealing with large amounts of data. Knowledge management (Boomer) is a top-down, put-the-data-in-the-proper-bin sort of approach. There are names for each bin, and everything needs to fit into the correct one.

Social media (Millennial) uses human social networks in a bottom-up approach that lets the data determine where it should go. Any bin it ends up in is an emergent property of the network created by the community.
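A toy Python contrast, with invented documents and tags, may make the difference clearer:

```python
from collections import Counter, defaultdict

# Top-down KM: a fixed taxonomy; every document must be filed into a
# predefined bin by whoever curates the system.
taxonomy = {"assay-results": [], "protocols": [], "meeting-notes": []}
taxonomy["protocols"].append("doc-17")

# Bottom-up social tagging: readers tag freely, and the 'bins' emerge
# from whatever labels the community converges on.
user_tags = defaultdict(Counter)
user_tags["doc-17"].update(["qpcr", "protocol", "troubleshooting"])
user_tags["doc-17"].update(["qpcr", "primer-design"])
user_tags["doc-42"].update(["qpcr", "assay-results"])

for doc, tags in user_tags.items():
    print(doc, "->", tags.most_common(2))   # the emergent categories
```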

Read the whole post for a nice dissection of what is happening in this war. Just remember that age is not as important as attitude; there are Boomers who get social media and Millennials who do not.

I think it comes down to one personality wanting things to be black and white (the data is in a database on THIS computer) while the other deals well with shades of gray (the data is in the cloud and not really any one place).

I did my post-doc in a chemistry lab, as the only biologist, and I saw something very valuable. Chemistry is very process-driven; the purpose of a process is to reproduce success. If a process, say a particular chemical synthesis, did not work, as in the yield was 10% instead of 90%, it was not the fault of the process. The reagents were bad or the investigator was incompetent, but the process was still valid.

So chemistry selected for people who were very process-driven and wanted things tightly controlled and well defined.

Biology has a very different regard for process. The same process (say the cloning of a gene) can be done on two different days and get different results (10 colonies of cells one day; 500 the next). Biology is really too complex to be able to control everything. A lot of things can go wrong and it can be really easy to fool oneself with results.

So biology, particularly at the cutting edge, selects for people who can filter out extraneous bits of data, can be comfortable with conditional results and with the general anarchy that can occur. Every molecular biologist has experienced the dreaded ‘everything stops working, so I have to remake every buffer, order new reagents and spend a month trying to figure out what happened, knowing that things will start working again for no real reason.’

The chemists in my post-doc lab hated biology because of the large variance in its results compared to chemistry. Biologists are often happy to be within an order of magnitude of expected results.

One way of thinking has to know whether Schrödinger’s cat is dead or alive, while the other is comfortable knowing it is simultaneously dead and alive.

Biology needs the Millennial approach because it is creating data at too fast a pace to put it all into bins. Social networks can help tremendously with the filters needed to find knowledge in the huge amounts of data.


More information

Magical Thinking:
[Via FasterCures]
Margaret Anderson, COO, FasterCures

I appreciated the message of Carol Diamond and Clay Shirky’s recent piece in the August 2008 Health Affairs titled “Health Information Technology: A Few Years of Magical Thinking?” In it they say that proponents of health IT must resist “magical thinking,” such as the notion that isolated work on technology will transform our broken system. It’s interesting to think about systems change at the front end, and how easy it is to get stars in our eyes about how things like health IT or personalized medicine will transform the world as we know it, and how all of our problems will then magically go away.

The article discusses how it might be easier to implement health IT if the whole system is redesigned rather than having IT bolted on. IT will not fix the problems without key changes in how medicine is practiced.

A press release discusses some of their points.

Diamond and Shirky propose an alternative route to using health IT to help transform the U.S. health system. “This alternative approach would focus on a minimal set of standards at first,” they say, and would make utility for the user and improved health outcomes, rather than vendor agreement, the key criteria.

Diamond and Shirky’s alternative approach “would mean working simultaneously on removing other obstacles while concentrating on those standards necessary for sharing the information, however formatted in the short term, to flow between willing and authorized participants. Finally, it would require clear policy statements that will guide the design of technology.”

This sounds like a bottom-up approach, with the end user driving the technology rather than the health IT vendors. More from Margaret Anderson:

Cell phones, email, and the Internet have certainly transformed things in ways we couldn’t have imagined, but they’ve introduced problems we couldn’t have imagined. Technologies such as FAX machines have been leapfrogged over. Problems such as the overabundance of information, and the speed of information flow are here to stay it seems. In the case of health IT, FasterCures sees it as a vital bridge to the future of more rapid information collection, characterization, and analysis which could speed our time to cures.

But there needs to be careful attention to the fact that too much information, particularly in the health field, can make it much harder to make accurate decisions. Eventually we will get the complexity of the system under control, but in the meantime there will be some problems. FasterCures is examining them.

We are working on a white paper for the U.S. Department of Health and Human Services about educating and building awareness among consumers about personalized healthcare. This is another area where we must resist “magical thinking” and get down to brass tacks. Too often, the discussion about personalized medicine has been at a 30,000 foot level. For this paper, we’ve talked to many patient advocacy and disease research groups and everyone holds their breath about the potential power that these technologies may hold for their disease areas. They all want more targeted therapies with fewer side effects, which is ultimately the promise of personalized medicine. But they also recognize its complexities. It needs to take into account the world of co-morbidities we all live in; even if baby boomers are out running marathons and eating their greens and blueberries, the reality is that many of us are living with many conditions and diseases, not just one. It will probably raise costs before it can lower them. It’s unlikely many diseases will yield to the relatively easy HER2-Herceptin gene-to-drug relationship. Patients are likely to get much more information about their genetic makeup than they can act on in the near-term.

Health care is still too complex in most cases. The real magical thinking comes in the form of so many fraudulent ‘cures’ that have plagued mankind for thousands of years. Perhaps as we really get IT involved in health, we can begin to gain a fuller understanding of what causes disease and how to attempt a cure.
