Category Archives: Science

Successful failure

visions by Jordi Armengol (Xip)

Clarity of vision:
[Via business|bytes|genes|molecules]

Be stubborn on vision and flexible on details
– Jeff Bezos

Those words, which I heard recently, have stuck in my head (or rather in Evernote, since I furiously typed them into my iPhone as I heard them).

Over the years, I have seen too many companies in the life science industry, even pharma in recent years, lurch around, almost figuring out what they need to be doing as they go along. That’s why so many fail. Let’s say you are running a small biotech or bioinformatics shop. You need to be sure what your vision is, identify the actionable milestones you need to achieve, and then figure out what you need to do to hit each milestone. This is not just for a company: if you’re a product manager, think about your product line, and so on. The times I have been successful were times when I had a clear vision of where I wanted to be and then figured out a path (stress on “a path”) to get there. When you’re too reactive, it just doesn’t work.

Having worked at a couple of biotechs, I know some of the pitfalls. Vision is great and actionable milestones are a must, but in biology both are usually based on extremely limited knowledge of very complex systems.

Enbrel is a great example. Originally developed to fight septic shock, it passed every milestone but one: it failed to show an ameliorative effect in clinical trials. But rather than toss it away, Immunex was able to rework it into a premier rheumatoid arthritis drug. Flexible on the details.

Part of the real problem is that many businesses feel that the details always have to be right, that the company will only succeed if it always succeeds on the details. While this might be true, it is also impossible, especially in something as complex as biology.

Effective companies take the approach Immunex did. We wanted to kill projects as quickly as possible, or at least put them on the back burner. Three times a year, all the projects were reviewed by scientific management, with possible participation by all the members of Discovery Research, whether they had a Ph.D. or not. Based on the manpower available and the limited resources we had, projects were given a priority from, say, 1 to 3.

Ones were hot, and everyone wanted to work on them. Twos could go either way and were also worth working on. Threes were back burner. The key was to have limited resources along with a lot of possible projects, so there was always something important to work on if your project moved to 3. But, with judicious use of time and resources, a back-burner project could be resurrected.

This is because it was still possible to work on a back-burner project. One just had to be able to justify the time, or propose a bake-off to finally demonstrate which approach would be best. Actually, many important projects came out of some of these skunkworks projects. But a lot of projects died a quick, merciful death under this high level of vetting.

Flexible means working for a successful failure. That can be the best win of all.


Discussing Web 2.0

boat by notsogoodphotography
Are scientists missing the boat?:
[Via Bench Marks]

….or has that boat already sailed?

I’ve read many a blog posting or magazine article declaring that scientists are behind the curve, and we biologists have been slow to pick up the new online tools that are available. I’ve repeatedly asked for examples of other professions that are ahead of the curve that we can use as models (are there social networks of bakers sharing recipes and discussing ovens?), but haven’t seen much offered in response. I tend to think that it’s not a question of scientists being slow, it’s that the tools being offered aren’t very appealing. Note how quickly scientists moved from paper journals to online versions, which only took as long as it did because of the slow progress on the part of journal publishers getting their articles up on the web. The advantages of online journals were obvious, and in comparison, the advantages of joining “Myspace for scientists” are less evident.

Are social networks (”Meet collaborators! Discuss papers!”) ever going to see heavy use from the biology community? Or are we starting to see that they’ve run their course in general, and scientists were prescient in not wasting their time?
[More]


There are too many advantages arising from these Web 2.0 tools (e.g. the ability to leverage human social networks to examine large datasets) for scientists to ignore them. However, the race will not be to accumulate 5,000 friends, as is often seen out in the wild.

In a closed environment, such as a corporation, there are some very good uses for wikis, blogs, etc. They can not only help workflow tremendously but can also provide new metrics for tracking just who contributed what to a project.

Moving tacit information from inside someone’s head into an explicit database will have important consequences for many organizations.

I don’t think the next generation will shun these tools. They will just have a better idea of how to interact with them more usefully, with a focus that can really help their workflow.


More discussion

communication by dalbera
Is Science Being Distorted?:
[Via The Scholarly Kitchen]

A recent PLoS Medicine article claims that information economics distort science. But maybe it’s an obsession with journals distorting the views of the authors.
[More]

As I said earlier, I thought there would be some interesting discussions. I guess one way of looking at it is that all the ‘good’ data gets published in the prestigious journals and there is nowhere for the ‘bad’ data to be published until after a clinical trial fails. Whether this is a real bias problem is difficult to assess.

I think any problem is due more to the complexity of human health studies than any bias but we have to keep moving forward as best as we can.


A new approach to publishing

type by Marcin Wichary
An experiment in open access publishing:
[Via Bench Marks]

The new edition of Essentials of Glycobiology, ” the largest, most authoritative volume available on the structure, synthesis, and biology of glycans (sugar chains), molecules that coat cell surfaces and proteins and play important roles in many normal and disease processes” came out yesterday. What’s particularly interesting about this edition is that it is simultaneously being released online in a freely accessible version, which will hopefully allow the textbook to reach a wider audience.

The theory often espoused is that online release of books leads to higher sales of the print edition, and for us, this is a good test case. Quoting from the press release, John Inglis, Executive Director and Publisher of CSHL Press notes that,

“We will be tracking its usage and how readers of the site respond to the availability of a print version, for both research and teaching purposes.”

“This is an innovative development in the distribution of an established textbook that we hope will benefit readers, authors and editors, and the publisher,” says Ajit Varki, M.D., the book’s executive editor and a leader of the Consortium of Glycobiology Editors, which initiated the project. Varki is Professor at the University of California, San Diego. The Consortium also includes Professors Richard Cummings, Emory University; Jeffrey Esko, UC San Diego; Hudson Freeze, Burnham Institute for Medical Research; Pamela Stanley, Albert Einstein College of Medicine, New York; Carolyn Bertozzi, UC Berkeley; Gerald Hart, Johns Hopkins University School of Medicine; and Marilynn Etzler, UC Davis.

The online edition of Essentials of Glycobiology can be found here, and the print version can be ordered here.

This is a very interesting experiment. I know that there are books I want to have on hand so I can access important data when I am not online, usually when I am writing. Being online can be distracting then.

But sometimes when I am online, I want a quick fact. Then finding it in an authoritative source is really important. I personally think that this sort of dual use could be very productive. It has been successful for some works of fiction.

I too will be looking to see how well this works.


The Winner’s Curse

trophies by Snap®
Current Biomedical Publication System: A Distorted View of the Reality of Scientific Data?:
[Via Scholarship 2.0: An Idea Whose Time Has Come]

Why Current Publication Practices May Distort Science

Young NS, Ioannidis JPA, Al-Ubaydli O

PLoS Medicine Vol. 5, No. 10, e201 / October 7 2008

[doi:10.1371/journal.pmed.0050201]

Summary

The current system of publication in biomedical research provides a distorted view of the reality of scientific data that are generated in the laboratory and clinic. This system can be studied by applying principles from the field of economics. The “winner’s curse,” a more general statement of publication bias, suggests that the small proportion of results chosen for publication are unrepresentative of scientists’ repeated samplings of the real world.

The self-correcting mechanism in science is retarded by the extreme imbalance between the abundance of supply (the output of basic science laboratories and clinical investigations) and the increasingly limited venues for publication (journals with sufficiently high impact). This system would be expected intrinsically to lead to the misallocation of resources. The scarcity of available outlets is artificial, based on the costs of printing in an electronic age and a belief that selectivity is equivalent to quality.

Science is subject to great uncertainty: we cannot be confident now which efforts will ultimately yield worthwhile achievements. However, the current system abdicates to a small number of intermediates an authoritative prescience to anticipate a highly unpredictable future. In considering society’s expectations and our own goals as scientists, we believe that there is a moral imperative to reconsider how scientific data are judged and disseminated.

Full Text Available At:


[http://medicine.plosjournals.org/perlserv/?request=get-document&doi=10.1371/journal.pmed.0050201]
[More]

There are also several other article links at Scholarship 2.0 that are important to read. In particular, there is a discussion of the Winner’s Curse. This is the observation that the winner of an auction in which all the bidders have similar information will usually overbid. The Winner’s Curse was first observed among oil companies bidding on offshore leases.

These authors make the point that publication in scientific journals may suffer from a bias that resembles the Winner’s Curse. The winner of an auction must bid beyond the mean estimate in order to succeed. The authors argue that, in a similar way, papers often present data beyond the mean in order to get published.
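The overbidding mechanism itself is easy to see in a small Monte Carlo sketch (the numbers and setup here are hypothetical illustrations, not taken from the paper): every bidder’s estimate of a common value is unbiased, yet the winning bid, being the one selected, is systematically too high.

```python
import random

def winning_bid(true_value=100.0, n_bidders=10, noise_sd=20.0):
    """Each bidder estimates the common value with unbiased Gaussian
    noise and bids that estimate; the highest bid wins the auction."""
    bids = [random.gauss(true_value, noise_sd) for _ in range(n_bidders)]
    return max(bids)

random.seed(0)
trials = [winning_bid() for _ in range(10_000)]
average = sum(trials) / len(trials)

# Although every individual estimate is centered on the true value (100),
# selecting the maximum bid pushes the average winning bid well above it.
print(round(average, 1))
```

The analogy to publication: each experiment is one “bid,” and journals publish the extreme results, so the published record overstates the underlying effect even when no individual study is biased.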

It is an intriguing speculation and one that might deserve further examination. The data that do get published may be misleading and this may be a reason why early published clinical trials are often not replicated later.

And they make the point that the huge amount of data being generated has not seen a corresponding increase in the places to publish the data or conclusions based on the data. This introduces greater likelihood that what is actually published does not represent the ‘real value’ of the data.

I expect this work to produce some interesting discussions. I would be surprised if it is entirely true but it does propose some changes that might be worth implementing.


Very, very fast

speeding by NathanFromDeVryEET
Your personal health: Free means education:
[Via business|bytes|genes|molecules]

One more post today. Russ Altman talks about how the cost of genotyping is asymptoting to free (eerie parallels to Chris Anderson’s Free). The post comes on the heels of the launch of Complete Genomics and their announcement that they will enable a $5000 genome in 2010. But that’s not the part that I want to talk about. It’s another part of the post:

We must work to educate every day people about how to interpret their genome and how to use the information beneficially. Trying to protect them in a paternalistic manner is going to fail, I fear–because it will be too easy to get the information cheaply. So solutions that rely on government regulations or mandatory insertion of a physician in the process of ordering genetic sequencing (don’t get me wrong, I love physicians–I am one) are not going to work.

Given past writings on this subject, no surprise, I agree wholeheartedly. We need to educate, and educate fast.

There are many things that will be changed when sequencing a genome becomes almost as fast in reality as it is on CSI. Without proper education about this, people will be open to really bad science.

The best approach may not be through the government or physicians, who have no real stake in explaining things. Perhaps there will be trusted genome advocates, whose job is to carefully explain to someone what their genome really means in a way that truly educates them, and to provide quarterly updates of new information.


A generational war

photo by kentbye
Social Media vs. Knowledge Management: A Generational War:
[Via Enterprise 2.0 Blog]

You’d think Knowledge Management (KM), that venerable IT-based social engineering discipline which came up with evocative phrases like “community of practice,” “expertise locater,” and “knowledge capture,” would be in the vanguard of the 2.0 revolution. You’d be wrong. Inside organizations and at industry fora today, every other conversation around social media (SM) and Enterprise 2.0 seems to turn into a thinly-veiled skirmish within an industry-wide KM-SM shadow war. I suppose I must be a little dense, because it took not one, not two, but three separate incidents before I realized there was a war on. Here’s what’s going on: KM and SM look very similar on the surface, but are actually radically different at multiple levels, both cultural and technical, and are locked in an undeclared cultural war for the soul of Enterprise 2.0. And the most hilarious part is that most of the combatants don’t even realize they are in a war. They think they are loosely-aligned and working towards the same ends, with some minor differences of emphasis. So let me tell you about this war and how it is shaping up. Hint: I have credible neutral “war correspondent” status because I was born in 1974.

[More]

A very clear post that describes the conflict between Boomer and Millennial thinking when it comes to dealing with large amounts of data. Knowledge management (Boomer) is a top-down, put-the-data-in-the-proper-bin sort of approach. There are names for each bin, and everything needs to fit in the correct one.

Social media (Millennial) uses human social networks in a bottom-up approach that allows the data to determine where it should go. Any bin that it should go into is an emergent property of the network created by the community.

Read the whole post for a nice dissection of what is happening in this War. Just remember that Age is not as important as attitude. There are Boomers who get social media and Millennials who do not.

I think it is that one personality wants things to be black and white (the data is in a database on THIS computer) while the other deals well with shades of gray (the data is in the cloud and not really anyplace).

I did my post-doc in a chemistry lab, as the only biologist there, and I saw something very valuable. Chemistry is very process-driven. The purpose of a process is to reproduce success. If a process, say a particular chemical synthesis, did not work (the yield was 10% instead of 90%), it was not the fault of the process. The reagents were bad or the investigator was incompetent. But the process was still valid.

So chemistry selected for people who were very process-driven and who wanted things tightly controlled and well defined.

Biology has a very different regard for process. The same process (say the cloning of a gene) can be done on two different days and get different results (10 colonies of cells one day; 500 the next). Biology is really too complex to be able to control everything. A lot of things can go wrong and it can be really easy to fool oneself with results.

So biology, particularly at the cutting edge, selects for people who can filter out extraneous bits of data, can be comfortable with conditional results and with the general anarchy that can occur. Every molecular biologist has experienced the dreaded ‘everything stops working, so I have to remake every buffer, order new reagents and spend a month trying to figure out what happened, knowing that things will start working again for no real reason.’

Chemists in my post-doc lab hated biology because of the large variance in results compared to chemistry. Biologists are often happy to be within an order of magnitude of expected results.

One way of thinking has to know whether Schrodinger’s cat is dead or alive, while the other is comfortable with knowing it is simultaneously dead and alive.

Biology needs the Millennial approach because it is creating data at too fast a pace to put it all into bins. Social networks can help tremendously with the filters needed to find knowledge in the huge amounts of data.


More information

pills by blmurch
Magical Thinking:
[Via FasterCures]
Margaret Anderson, COO, FasterCures

I appreciated the message of Carol Diamond and Clay Shirky’s recent piece in the August 2008 Health Affairs titled “Health Information Technology: A Few Years of Magical Thinking?” In it they say that “proponents of health IT must resist ‘magical thinking,’ such as the notion that isolated work on technology will transform our broken system.” It’s interesting to think about systems change at the front end, and how easy it is to get stars in our eyes about how things like health IT or personalized medicine will transform the world as we know it, and how all of our problems will then magically go away.

The article discusses how it might be easier to implement IT in health if the whole system is redone, rather than bolting on IT. IT will not fix the problems without key changes in how medicine is practiced.

A press release discusses some of their points.

Diamond and Shirky propose an alternative route to using health IT to help transform the U.S. health system. “This alternative approach would focus on a minimal set of standards at first,” they say, and would make utility for the user and improved health outcomes, rather than vendor agreement, the key criteria.

Diamond and Shirky’s alternative approach “would mean working simultaneously on removing other obstacles while concentrating on those standards necessary for sharing the information, however formatted in the short term, to flow between willing and authorized participants. Finally, it would require clear policy statements that will guide the design of technology.”

Sounds like a bottom-up approach with the end user driving the technology, rather than the health IT vendors. More from Margaret Anderson:

Cell phones, email, and the Internet have certainly transformed things in ways we couldn’t have imagined, but they’ve introduced problems we couldn’t have imagined. Technologies such as FAX machines have been leapfrogged over. Problems such as the overabundance of information, and the speed of information flow are here to stay it seems. In the case of health IT, FasterCures sees it as a vital bridge to the future of more rapid information collection, characterization, and analysis which could speed our time to cures.

But there needs to be careful attention to the fact that too much information, particularly in the health field, can make it much harder to make accurate decisions. Eventually we will get the complexity of the system under control, but in the meantime there will be some problems. FasterCures is examining them.

We are working on a white paper for the U.S. Department of Health and Human Services about educating and building awareness among consumers about personalized healthcare. This is another area where we must resist “magical thinking” and get down to brass tacks. Too often, the discussion about personalized medicine has been at a 30,000 foot level. For this paper, we’ve talked to many patient advocacy and disease research groups and everyone holds their breath about the potential power that these technologies may hold for their disease areas. They all want more targeted therapies with fewer side effects, which is ultimately the promise of personalized medicine. But they also recognize its complexities. It needs to take into account the world of co-morbidities we all live in; even if baby boomers are out running marathons and eating their greens and blueberries, the reality is that many of us are living with many conditions and diseases, not just one. It will probably raise costs before it can lower them. It’s unlikely many diseases will yield to the relatively easy HER2-Herceptin gene-to-drug relationship. Patients are likely to get much more information about their genetic makeup than they can act on in the near-term.

Health care is still too complex in most cases. The real magical thinking comes in the form of so many fraudulent ‘cures’ that have plagued mankind for thousands of years. Perhaps as we really get IT involved in health, we can begin to gain a fuller understanding of what causes disease and how to attempt a cure.


Marketing for research

atomium by txd
Attention, science and money:
[Via business|bytes|genes|molecules]

Interesting observation by Kevin Kelly. He says

Where ever attention flows, money will follow

To some extent, that’s somewhat obvious. Peter Drucker, whom I admire a lot, said the following

Marketing and innovation produce results; all the rest are costs

Part of the problem with many corporations that commercialize science and technology is that they focus only on the marketing and not the innovation. I remember being told by a higher-up that marketing made money (“For every dollar we spend on marketing, we get $3 back”) but that research cost money, money that was never directly recouped.

There are good metrics for marketing, not so much for innovation. Yet without the latter the former has nothing to do.

Attention can be driven by many mechanisms, marketing being the most effective one. The key is gaining sufficient mindshare, which is often accompanied by a flow of capital. In science, money follows topics of research that have mindshare. Similarly, people fund companies in areas that generate mindshare, for whatever reason.

The question I often ask myself, both from my time as a marketer and as someone interested in science communication, is how can we bring more mindshare to some of our efforts and science in general. What does money flow mean? Is it just research funding? Is it investment in such concepts as “bursty work”? Take something else Kelly writes

New things that don’t work or serve no purpose are quickly weeded out of the system. But the fact that something does work or is helpful is no longer sufficient for success.

Part of the problem is that many researchers feel the data should speak for itself. They fail to realize that gaining mindshare or convincing people requires social interactions. It is a very rare thing that requires no further work in order to sell itself.

We all realize that nothing in science is this way. That is, when we deal with each other, we realize that further experimentation is required to convince us of a new innovation. Few things just emerge fully formed from Zeus’s head. We know the process for marketing to our peers: publications, conferences, and seminars.

But the idea of doing something similar to get innovations out to non-scientists is not on a researcher’s radar screen. We don’t have enough time for that. Perhaps just recognizing that there is a process people go through to adopt an innovation, and attempting to facilitate some of those steps, would go a long way.

I have written about the lack of marketing in science (stealing shamelessly from Larry Page). It’s critical that we do a better job of highlighting the power of our activities and learn some marketing tricks along the way. No, I am not talking about the in-your-face stuff that gives marketing a bad name, but about the kinds of activities that maintain attention and get people to notice. The good news: many of us already do that, perhaps without even realizing it. It’s still niche awareness, but I have a feeling that we are close to crossing the hump and bringing some of our activities into the mainstream.

KK link via Michael Nielsen

Marketing is really just convincing people to make a change in their life, to adopt an innovation. It may have a bad odor in science (because ads make people want things that they do not really need) but marketing is really what everyone does who truly wants to compete for mindshare.

We just need to do it in a way that supports research while helping others through the process of adopting innovations.


Browsing for researchers

I use an RSS reader and read feeds because it is part of my writing process. Lately, my RSS reading habits have changed. I haven’t given up on it completely, but my process has changed. My feeds are organized into folders and the folders ordered by priority. Like a farmer tending his crops, I’d scan through each folder, each feed, bookmarking and annotating what caught my eye, and looking for patterns and connections. This cycle of scanning, capturing, analyzing patterns, and writing a blog post is part of my routine.

It still is, but I now use other methods for scanning. It’s more like hanging out in a village square or a pub — conversations, news, and resources come to me. I’m finding new links and posts either through twitter, comments on my blog post, or through people who have linked to me.

So, it’s like I have a left brain, orderly, linear way to scan and a right brain, wildly creative way to scan.

RSS and newsreaders present an incredible set of tools for filtering through a lot of information very rapidly. It is as if you are directly hooked into a diverse group of communities in real time. You can see how different items spread through a linked community and drive communication.

And the orderly and wildly creative approaches to connecting complement each other, engaging our tacit information, aiding creativity and innovation, and producing opportunities to alert other communities.

I like how Chris Brogan describes his reading goals.

1. Reading what friends write.
2. Reading about the “new marketing” industry and the tech industry (fishbowl).
3. Reading what people recommend.
4. Reading off the wall stuff that inspires new thoughts (outside the bowl).

This sounds very much like an early adopter, someone who has connections to other media outlets but uses trusted insiders to decide what to use.

Michele Martin wrote a post summarizing a paper titled How Knowledge Workers Use the Web and pulls out some of the classifications referenced in the paper. My RSS reading is mostly information gathering or browsing.

Finding–Looking for something specific, such as an answer to a specific question.
Information gathering–Less specific than finding, this is research that’s focused on a particular goal that’s broader-based than simply getting a specific piece of information.
Browsing–Visiting personal or professional sites with no specific goal in mind other than to “stay up-to-date” or be entertained.
Transacting–Using the web to execute a transaction, such as banking or shopping.
Communicating–Participating in chat rooms or forums (remember–this was done in 2002, prior to Facebook and the explosive growth of blogs, etc.)
Housekeeping–Using the web to check or maintain the accuracy and functionality of web-based resources, such as looking for dead links, cleaning up outdated information, etc.

One of the major aspects of scientific research and innovation comes from browsing: reading about something not directly related to a specific problem but which may provide valuable insight into it. This used to be relatively easy: sit in the library once a week, go through the tables of contents of all the journals that came in that week, and carefully write down the bibliographic information on note cards so the articles could be examined later at leisure.

Serendipity could raise its head. But the Internet made searching so much easier, so too many scientists spend their time on the first step, finding. This is, of course, very important, but you will only find what you are looking for. Serendipity is reduced.

A personal example. Many years ago, I was working on inducing protein production in E. coli from specific gene segments. We typically did this by shifting the temperature, which resulted in the inactivation of a repressor and the expression of the gene.

However, for large-scale production (think thousands of liters) this was not tenable. It was effectively impossible to raise the temperature of the vessel quickly enough to make it a viable approach.

I happened to be reading the Table of Contents of the Journal of Bacteriology and saw a paper which discussed some of the biological effects on the bacteria when the pH of the media was shifted to a more acidic condition. I recognized some of the bacterial proteins involved as being similar to the repressor we used.

So I went out and did some experiments and determined that by dropping the pH, large amounts of the specific protein could be produced. Dropping some acid in a large vessel and stirring quickly can rapidly expose all the cells to the same conditions and induce protein production.

But the pH shift could also be run under somewhat different conditions, resulting in up to 15 times more recombinant protein being produced.

So, for me, the really important aspect of RSS/newsreaders is bringing browsing back. Every journal has newsfeeds now. I can typically go through several thousand titles in an hour, bookmark the ones I want to examine later and even post the links to a blog, where I can add comments.
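That title-scanning step can be sketched with the Python standard library alone; the feed contents and keywords below are hypothetical stand-ins for a journal’s table-of-contents feed.

```python
import xml.etree.ElementTree as ET

# A hypothetical journal table-of-contents feed, inlined for illustration.
RSS = """<rss version="2.0"><channel>
  <item><title>Acid-induced gene expression in E. coli</title>
        <link>http://example.org/1</link></item>
  <item><title>Crystal structure of a membrane transporter</title>
        <link>http://example.org/2</link></item>
  <item><title>pH-dependent repressor inactivation</title>
        <link>http://example.org/3</link></item>
</channel></rss>"""

KEYWORDS = ("pH", "acid")  # topics worth a second look

def scan(feed_xml, keywords):
    """Return (title, link) pairs whose title mentions any keyword."""
    root = ET.fromstring(feed_xml)
    hits = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        if any(k.lower() in title.lower() for k in keywords):
            hits.append((title, item.findtext("link", "")))
    return hits

for title, link in scan(RSS, KEYWORDS):
    print(title, "->", link)
```

In practice one would fetch real feed URLs (for example with a feed-parsing library) and push the bookmarked hits into a blog or notes file; the point is that a crude keyword pass over a few thousand titles is cheap, and everything it surfaces can be examined later at leisure.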

My blog becomes my online note card file for interesting articles.
