Archive for the ‘Science’ Category
It happens ever so rarely: an otherwise trifling experience early in the day – something seen, heard or read – makes the entire day.
“Science is a lot like sex. Sometimes something useful comes of it, but that’s not the reason we’re doing it.”
…..is awarded for the discovery of gene-knockout technologies in mice. Recipients are Drs Mario R. Capecchi (University of Utah), Martin J. Evans (Cardiff University) and Oliver Smithies (UNC – my former stomping ground!).
This is a discovery that would have won the award sooner or later. Genetically modified mice are now an indispensable part of research into gene function and regulation (more details on the methods and the research here). What is interesting is that gene-regulation techniques have won the prize in two consecutive years, and that gene knockout, the older discovery, won after RNA interference.
An article by Ajit Balakrishnan (CEO of Rediff) in Business Standard has kicked up a bit of a storm in the desi blog-circles. The original article talks about the difficulties Mr Balakrishnan encountered while attempting to start an industry-academia collaboration with Soumen Chakrabarti, a professor of computer science at the Indian Institute of Technology, Mumbai (IITB).
Landing the Soumen catch turned out to be the easy part. Getting to engage IIT Bombay in a commercial relationship was to be a near-impossible task. The process for such an engagement is unchartered territory for Indian academic institutions. We settled on a compromise: we hired two of his star graduate students (or more accurately he persuaded them to join us instead of doing what all their classmates did—emigrate to America). Since then, we have been happily working together; whenever we run into a really tough computer science problem, we could get to Soumen through his students.
Response to the article has been typical. At the libertarian/free market-supporting Indian Economy Blog, Karthik held this up as an example of the failure of academic bureaucracy, lack of incentives and such. In a swift rebuttal, defenders of the anti-free market bastion, Krish and Abi, strongly supported IITB’s policies on collaboration and even laid the blame squarely on Balakrishnan.
IMHO, there are many sweeping generalizations and unsubstantiated statements on both sides. So herein, I try to play my usual role of finding the middle ground.
Firstly, a brief aside on the academia-market relationship in the capitalist mecca of the US. Even there, from my experience, academia is usually suspicious of industry (though academics usually don’t mind the money that comes from collaborating with it, or the free dinners!). There is something in the mentality of academicians that looks down heavily on business – PhDs who have moved to industry are said – in a jocular manner, of course – to have gone to the ‘dark side’ (perhaps the huge disparity between salaries and educational levels has something to do with it – but for now we shall avoid those murky waters). Recently, however, most US universities have set up an office or a whole department devoted to technology commercialization and intellectual property management of their academic research output. Some universities have even started to offer post-graduate courses on it (interestingly, Australia seems to have been way ahead of the game, with the university I am with now having started such a unit in the early 80s). The goal of such units is to discover or even create commercialization opportunities for the research in the university laboratories. Universities have been quick to discover this as a good source of revenue at a time of funding crunches from the government. But it is also an important function for these units to ensure that the industry does not take undue advantage of a researcher’s hard work, and perhaps to prevent a recurrence of the University of Rochester’s Cox-2 patent loss to Pfizer.
Coming back to the Indian context: while the regular university does not have much of a research output to speak of, the IITs, especially in disciplines such as Computer Science and Electrical/Electronics, have for some years conducted successful research collaborations with industry. This has happened both at the individual level (i.e. a professor acting as a consultant to some industry) and at the institutional level (an industry providing support for research programs), and with both private and public sectors (e.g. see this for IIT-Delhi). Also, a quick look at the IITB website will show that there are two umbrella organizations, SINE and E-cell, that serve as business incubators to foster entrepreneurship (perhaps our diplomatic ex-IITB-ian friend can further enlighten us on the practice of start-ups branching off from research at IITs). Therefore, Ajit Balakrishnan’s or Karthik’s assertion that the “process for such an engagement is unchartered territory for Indian academic institutions” sounds rather extreme.
However, the devil is in the details – so even though IITB has a system in place for consultancy (e.g. see these informal guidelines by Soumen Chakrabarti himself), without first-hand knowledge it is difficult to judge how efficient such a system is. One could speculate that for Mr Balakrishnan, going through the IIT system would have taken a few months’ time, which is an eternity in the world of business. In that case, it was certainly a ‘near-impossible’ task for him to get his project kick-started (especially given that it relates to the cutting-edge field of web search engines). It is possible too that IIT’s current system is geared more towards collaborations with big companies than with small start-ups. Given the quickly changing technology and economic landscapes, the IITs (and other universities too) need to be more nimble in responding to such needs. Therefore, there is certainly scope for improvement in bringing academia and industry together through offices of technology commercialization along the lines of the US or Australian models described above.
Finally, Krish’s childishly written ad hominems against ‘free market fundamentalists’ merit a more detailed fisking. While I do not have the time or patience to rant in detail, let me say that technology eventually needs to come out of the laboratory into the marketplace. While there are academics who pursue knowledge for the sake of knowledge (and certainly we need such people, and government support for their research), anybody who makes a blanket (not to mention extremely naive) statement like “academicians are against the patent system” is living in a fool’s paradise. Most human beings are slaves to incentives, and I cannot believe that providing an additional incentive will harm scientific or technological progress.
(Thanks to Rohit for original article H/T)
Not the one that you eat, you glutton… today celebrates the mathematical symbol Pi. Only in the American system of writing dates, though.
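The date only spells out Pi because the American convention puts the month before the day; a quick illustrative sketch (Python, date chosen arbitrarily):

```python
from datetime import date

pi_day = date(2007, 3, 14)

# US convention (month/day) reads as 3.14...
us_style = pi_day.strftime("%m/%d")    # "03/14"

# Most of the rest of the world writes day/month
intl_style = pi_day.strftime("%d/%m")  # "14/03"

print(us_style, intl_style)
```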
[Cross-posted from here. ]
In an interview with New York Times, Harvard Business School professor Clayton Christensen offers some radical proposals to improve the US health care system (link via).
Prof Christensen believes that the current system is woefully inefficient in terms of affordability and accessibility. His basic premise is that, with advances in medical diagnoses and cures, treatment of certain diseases should be more widely available rather than restricted to a handful of trained professionals (i.e. doctors) and institutions (hospitals).
The whole interview is worth reading, but here are some parts that I found intriguing.
Q. The nation’s medical system is regularly offering increasingly advanced procedures and treatments. Isn’t that a good thing?
A. If you look at the progress that today’s hospitals and the medical profession have made, they continue to push the leading edge of what’s very difficult to do. But that’s a very different dimension of performance improvement than the one that makes more people better off, and that is making it affordable and accessible. In other industries, whenever affordability and accessibility have come, it has not come from making mainframe computers better but rather from commoditizing mainframes so that average people with average money can have access to high-quality computing, meaning personal computers. It came from disruptive technology rather than improvements on the existing system. Michael Dell could assemble one of these things in his dorm room.
Q. What’s the relevance to health care?
A. In health care, rather than replicating the expensive expertise of Mount Sinai Medical Center or Mass General Hospital or replicating the expensive expertise of doctors, we have to commoditize their expertise. That comes through the precise ability to diagnose the diseases that people have. Our ability to diagnose the diseases is moving ahead at a breathtaking pace, but regulation and reimbursement are trapping the delivery of rules-based medicine in high-cost business models.
Q. Are you saying doctors rather than the pharmaceutical industry are the root cause of what’s gone wrong?
A. The pharmaceutical industry has been focused on therapy, not diagnosis. The medical profession has simply accepted that many of these diseases are well-diagnosed, when in fact they aren’t. As a consequence, we haven’t moved the health care profession into a world where nurses can provide diagnosis and care. Regulation is keeping the treatment in expensive hospitals when in fact much lower cost-delivery models are available.
Q. Wouldn’t your solution require a dramatically different regulatory environment?
A. It differs state by state. In Massachusetts, nurses cannot write prescriptions. But in Minnesota, nurse practitioners can. So there has emerged in Minnesota a clinic called the MinuteClinic. These clinics operate in Target stores and CVS drugstores. They are staffed only by nurse practitioners. There’s a big sign on the door that says, “We treat these 16 rules-based disorders.” They include strep throat, pink eye, urinary tract infection, earaches and sinus infections.
These are things for which very unambiguous, “go, no-go” tests exist. You’re in and out in 15 minutes or it’s free, and it’s a $39 flat fee. These things are just booming because high-quality health care at that level is defined by convenience and accessibility. That’s a commoditization of the expertise. To have those same disorders treated in Massachusetts, you’ve got to go to a regular doctor, go through a long wait in their office, you go in and see the doctor for two minutes. He says, “You have an earache,” which you knew already, and then they charge you $150.
Having gone through the experience of being offered an appointment three weeks in the future for a current cold symptom, and then having to wait hours in various rooms to see a doctor for ten minutes, I find the idea of such small clinics particularly welcome (there is actually one being built alongside our neighborhood Eckerd).
My main question is whether medical science is really at the stage where a majority of diagnoses can be made confidently by such ‘go, no-go’ metrics.
The professor also talks about the problem of having non-integrated players in the health care system:
The current health care system is divided into buckets. You have the insurers, the employers who put up the money, the providers such as doctors and nurses, and the hospitals. Because they exist as independent companies, they can each improve themselves, but they can’t re-architect the system in the way that it needs to be changed.
There are two health care systems in the West, Intermountain Health Care in Utah and Kaiser Permanente in California, that are in fact integrated across each of those pieces of the system. They are far ahead of the rest of the world in bringing rules-based diagnosis and therapy in cost-effective business models to their patients.
Of course, all these changes cannot happen without a strong will from the government to overcome the regulatory framework of the current system.
The government will be the hardest because a lot of the regulations that require that care be given by people with particular expertise and in expensive hospitals were put in place during a prior era when the science was not really as well-defined. The regulations just haven’t kept up with the science.
The blog itself is a really cool attempt by several really smart people (yes, it does make you wonder why they let me in) to convey the beauty of science and technology in layman’s terms. Pliss to check it out regularly. We are also hoping to bring some perspectives to day-to-day scientific news.
….goes to Roger Kornberg of Stanford University "for his studies of the molecular basis of eukaryotic transcription".
(Note: A slightly expanded version of this post is now up at Desicritics.org)
Those who are familiar with the field of Molecular Biology might hesitate (as I did early this morning) for a second and think – ‘hasn’t this guy already won a Nobel?’ But that would be the father, Arthur Kornberg (also at Stanford), who won the Physiology or Medicine award in 1959 for research in a closely connected field: discovering how the basic information molecules of life, DNA and RNA1, are synthesized and assembled. The Nobel Prize press release mentions
Forty-seven years ago, the then twelve-year-old Roger Kornberg came to Stockholm to see his father, Arthur Kornberg, receive the Nobel Prize in Physiology or Medicine (1959)
Talk about pressure, huh! Other than the Braggs back in 1915, I can’t remember of any other case where father and son both have received Nobel Prizes.
The mention of William and Lawrence Bragg is appropriate in this context, since their research almost a century ago on the technique of elucidating molecular structures by bombarding crystals with X-rays has enabled Roger Kornberg to investigate the mechanism by which genetic information is decoded (of course, the field of X-ray crystallography has become incredibly sophisticated since 1915!). As most people know, long strands of molecules called DNA store the genetic information of life. But the actual physical work-horses in the cell are proteins. The flow of information from DNA to proteins is through intermediates called messenger RNAs (the Central Dogma of Life). Kornberg’s contribution, in very simple terms, was to obtain very detailed ‘snapshots’ of the molecular processes involved in the creation of RNA molecules from DNA. This has shed light on how this mechanism, called ‘transcription’, is regulated in cells.
It is interesting to note that the relevant scientific papers on which the Academy based this prize were published in 2001. This is a relatively quick turnaround for the Nobel committee, which usually waits many years for the research to be validated and its impact to be appropriately judged before rewarding it. For example, although Watson and Crick described the double helix structure of DNA in 1953, they got the Nobel only in 1962. Additionally, it is rare (and even rarer in modern times) for a single person to be awarded the full prize. Both are indicative of the importance of this work.
Also interesting to note that both the Physiology/Medicine and the Chemistry Awards for this year are related to the regulation of genetic information in cells.
UPDATE: I feel very stupid now for having said this: "I can’t remember of any other case where father and son both have received Nobel Prizes." Should have done better research. Two other cases, those of Niels Bohr/Aage Bohr and J.J. Thomson/George Thomson, should have jumped to my mind – since I read about them back in my high-school days. Especially the Thomsons – since the father got the Nobel for discovering the electron while the son got it for showing the wave nature of electrons! And additionally, as pointed out by Ruchira in the comments, the Curies made it pretty much a family tradition! The full list, along with other fun facts, can be found here. Bad form for a trivia geek like me to forget these facts.
1: DNA = deoxyribonucleic acid; RNA = ribonucleic acid
….was announced today, and the winners are, quite appropriately in my humble opinion, Andrew Z. Fire and Craig C. Mello for their discovery of the mechanism of "RNA interference – gene silencing by double-stranded RNA".
It is always very exciting when a scientific technique that you are familiar with, and that is routinely used in your line of work, obtains the ultimate recognition. In the short (in scientific terms) eight years since the discovery of its mechanism, RNA interference has turned out to be a great boon for basic biomedical research. The ability to silence a particular gene within a cell helps immensely in understanding the effects of that gene on particular biochemical pathways of the cell. There is also the potential for clinical application in the future.
Perhaps most of you know about this already, but it is still worth posting: Anousheh Ansari, the first woman ‘space tourist’ (also the first Iranian, among a host of other ‘firsts’) visiting the International Space Station, is blogging about her experiences.
It is a nice and cozy feeling. As you may know, the station makes a complete orbit every 90 minutes, so when I talk about night don’t think of it as night on Earth when it is dark outside. The sun rises and sets during each orbit and you can watch 32 beautiful sunrises and sunsets over the course of the day.
That’s ubercool!
Blogging from space – now that just sets the standard for blogging pretty high. Technically it isn’t live-blogging, since she doesn’t have internet access – as she mentioned to Google’s Larry Page over a phone call – so she is probably e-mailing her posts. Still, it is quite impressive and provides interesting glimpses into life on the space station.
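Her figure of 32 sunrises and sunsets a day follows directly from the 90-minute orbit; a quick back-of-the-envelope check (Python):

```python
# The ISS completes one orbit roughly every 90 minutes
orbit_minutes = 90
orbits_per_day = 24 * 60 // orbit_minutes  # 16 orbits in a 24-hour day

# Each orbit brings one sunrise and one sunset
events_per_day = orbits_per_day * 2

print(orbits_per_day, events_per_day)  # 16 orbits, 32 sunrises + sunsets
```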
Wired magazine has an article on how the traditional ‘peer-review’ process for publishing scholarly scientific articles is undergoing a gradual change, both under pressure from, and to maximize the benefits of, a digitally connected age.
“Peer review was brilliant when distribution was a problem and you had to be selective about what you could publish,” says Chris Surridge, managing editor of the online interdisciplinary journal PLoS ONE. But the Web has remapped the universe of scientific publishing – and as a result, peer review may finally get fixed.
The proof: In June, Nature began experimenting with a new method online. Authors submitting papers can choose a two-track process. While the work goes through the usual peer review drill, a preprint version gets posted on the Web. Anyone – even you – can comment, as long as you attach your name, affiliation, and email address. As of July, 25 articles had undergone this process, and the journal plans to issue a report late this year on how the test went. (Full disclosure: Wired editor in chief Chris Anderson participated in the project.) “The whole point of peer review is to help the editors select papers that are going to move science forward,” says Linda Miller, US executive editor of Nature and the Nature research journals (Nature Biotechnology, Nature Genetics, et cetera). “If there’s a better way, then why not? How could I say no?”
In other quarters, traditional peer review has already been abandoned. Physicists and mathematicians today mainly communicate via a Web site called arXiv. (The X is supposed to be the Greek letter chi; it’s pronounced “archive.” If you were a physicist, you’d find that hilarious.) Since 1991, arXiv has been allowing researchers to post prepublication papers for their colleagues to read. The online journal Biology Direct publishes any article for which the author can find three members of its editorial board to write reviews. (The journal also posts the reviews – author names attached.) And when PLoS ONE launches later this year, the papers on its site will have been evaluated only for technical merit – do the work right and acceptance is guaranteed. “Data becomes useful only if it’s shared,” Surridge says. “At the moment, our mechanisms for sharing information are the traditional journals, and if they’re hard to get into, data is completely lost.”
All these are certainly exciting new ways of sharing your data. At the same time, posting your data online is really not very different from presenting your work at conferences in the form of seminars or posters, with the added advantage of being cost-effective in terms of both time and money. Additionally, you get a much wider audience than you would at a conference, and you may get to pick the brains of scientists from remote corners of the world. This could also potentially lead to better chances of discovering collaborators and fewer instances of different groups repeating the same work, thereby optimizing research output.
Unfortunately, as the article points out, old habits are difficult to change. The major problem with this system will be the question of how to evaluate the value of a publication. An academic scientist is pretty much judged by his/her publication record – both its quality and quantity. As any graduate student or post-doc knows, a first-author Nature, Science or Cell paper is pretty much a sure-shot ticket to a tenure-track position. The reason for that is the exclusiveness of these journals. An open review may be viewed as diluting the value of the final product.
But seriously: Who cares? An up-and-coming researcher can get more attention from the right experts by publishing something earthshaking on arXiv than by pushing it through the usual channels. Crazy ideas will get batted around in moderated forums, which is pretty much what the Internet is for. Eventually, printed journal articles will be quaint artifacts. Scientific papers will be living documents with data published on Web pages – commented on, linked to, and mirrored by labs doing the same work 6,000 miles away. Every research effort will have thousands of reviewers working in real time. Today’s undergrads have never thought about the world any differently – they’ve never functioned without IM and Wikipedia and arXiv, and they’re going to demand different kinds of review for different kinds of papers. It’s in their nature.
Ultimately, I think the article is a bit too optimistic about the future of peer review. But then, ten years ago, I never really thought that going to the library to get a copy of a journal paper would become an antiquated practice (you can get almost all papers in PDF format from journal websites nowadays).
Lazy bum that I am – I love this ‘link to funnier/more interesting stuff with short comments’ system.
Anyway, I got this in an e-mail from a colleague. Anybody who has struggled to get regular high-school/college laboratory physics (or any other science) experiments to work using antiquated, partially functioning instruments will definitely sympathise with this. It starts off with:
Abstract: The exponential dependence of resistivity on temperature in germanium is found to be a great big lie. My careful theoretical modeling and painstaking experimentation reveal 1) that my equipment is crap, as are all the available texts on the subject and 2) that this whole exercise was a complete waste of my time.
….and gets funnier. This graph, in particular, is priceless:
Reminds me of a much simpler experiment in high-school physics lab where we were trying to ‘prove’ Ohm’s Law using a particularly temperamental voltmeter. No matter what the actual current/voltage was, we could make the instrument show whatever value we wanted by tapping various parts of it.
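For the record, the ‘great big lie’ the abstract is debunking is the textbook result that an intrinsic semiconductor’s resistivity falls exponentially with temperature, roughly ρ(T) ≈ ρ₀·exp(Eg/2kT). A minimal sketch of what the lab was presumably trying to verify (Python; the prefactor ρ₀ is an arbitrary illustrative value, not a measured constant, and the band gap is an approximate room-temperature figure):

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
E_GAP_GE = 0.67            # band gap of germanium, eV (approximate)

def intrinsic_resistivity(temp_k, rho0=1.0):
    """Textbook exponential model: rho ~ rho0 * exp(Eg / (2 k T)).

    rho0 is an arbitrary prefactor, for illustration only.
    """
    return rho0 * math.exp(E_GAP_GE / (2 * K_BOLTZMANN_EV * temp_k))

# Resistivity should drop sharply as the sample warms up
cold = intrinsic_resistivity(280.0)
warm = intrinsic_resistivity(320.0)
print(cold / warm)  # a factor of several between 280 K and 320 K
```

In a real lab, of course, contact resistance and flaky equipment swamp this tidy exponential – which is the whole joke.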