I’m a clinician, but I’m also a translational scientist. It’s not uncommon for those of us in medicine involved in some combination of basic and clinical research to argue about exactly what that means. The idea is that translational science is supposed to be the process of “translating” basic science discoveries from the laboratory into medicine, be it in the form of drugs, treatments, surgical procedures, laboratory tests, diagnostic tests, or anything else that physicians use to diagnose and treat human disease. Trying to straddle the two worlds, to turn discoveries in basic science into usable medicine, is more difficult than it sounds. Many are the examples of promising discoveries that appeared as though they should have led to useful medical treatments or tests but, for whatever reason, didn’t work when attempted in humans.
Of course, if there’s one thing that the NIH and other funding agencies have been emphasizing, it’s been “translational research,” or, as I like to call it, translation über alles. Here’s the problem: If you don’t have basic science discoveries to translate, then translational science becomes problematic, virtually impossible even. Translational research depends upon a pipeline of basic science discoveries to serve as the starting point for developing new treatments and tests. Indeed, like many others who appreciate this, I’ve been concerned that in recent years, particularly with tight budgets, the NIH has been overemphasizing translational research at the expense of basic research.
So it was with interest and disappointment that I read Matt Ridley’s latest op-ed in the Wall Street Journal entitled The Myth of Basic Science. In it, he tries to argue that it is innovation and technology that drive scientific discovery, not scientific discovery that drives technological breakthroughs. It’s a profoundly misguided argument that boils down to two central ideas. I’m quoting him directly because on Twitter he seems to be claiming that his critics are attacking straw men (they aren’t):
- “Most technological breakthroughs come from technologists tinkering, not from researchers chasing hypotheses. Heretical as it may sound, ‘basic science’ isn’t nearly as productive of new inventions as we tend to think.”
- “Governments cannot dictate either discovery or invention; they can only make sure that they don’t hinder it.” (Or, as he quotes others elsewhere, government funding of research is not particularly productive.)
He starts out with what struck me as one of the stranger straw men I’ve ever seen, namely that because there has been so much parallel technological discovery, technology is fast on the way to “developing the kind of autonomy that hitherto characterized biological entities.” (One can’t help but wonder if he’s been watching too many Terminator movies.) Here’s a taste:
Suppose Thomas Edison had died of an electric shock before thinking up the light bulb. Would history have been radically different? Of course not. No fewer than 23 people deserve the credit for inventing some version of the incandescent bulb before Edison, according to a history of the invention written by Robert Friedel, Paul Israel and Bernard Finn.
The same is true of other inventions. Elisha Gray and Alexander Graham Bell filed for a patent on the telephone on the very same day. By the time Google came along in 1996, there were already scores of search engines. As Kevin Kelly documents in his book “What Technology Wants,” we know of six different inventors of the thermometer, three of the hypodermic needle, four of vaccination, five of the electric telegraph, four of photography, five of the steamboat, six of the electric railroad. The history of inventions, writes the historian Alfred Kroeber, is “one endless chain of parallel instances.”
All of which is true. However, the relevance of this observation to basic science being a “myth” is tenuous at best. So where’s the straw man? It comes later in the article:
Politicians believe that innovation can be turned on and off like a tap: You start with pure scientific insights, which then get translated into applied science, which in turn become useful technology. So what you must do, as a patriotic legislator, is to ensure that there is a ready supply of money to scientists on the top floor of their ivory towers, and lo and behold, technology will come clanking out of the pipe at the bottom of the tower.
But, wait, you say. Isn’t that what I just said, that there must be a continual flow of new scientific discoveries to be translated into therapies (or technologies)? Yes and no. First of all, no one who knows anything about science believes that “innovation can be turned on and off like a tap” or that you can just throw money at basic scientists and expect technology to come “clanking out of the pipe at the bottom of” the ivory tower. The process is far more complicated than that. Basic science is hit or miss; you can’t predict which discoveries will or will not be translatable into something useful. In medicine, for instance, it’s virtually impossible to predict whether the discovery of, say, a given enzyme involved in cancer progression will yield a useful drug target. Moreover, anyone who knows anything about basic science being translated into useful products knows that both kinds of science are important. You need basic science as the grist for translational science; there must be a balanced approach. In the case of medicine (and because I’m a medical researcher I naturally concentrate mostly on medical research), complaints about the NIH are not that it’s funding translational research but that its emphasis has become unbalanced.
Indeed, unwittingly, Ridley’s examples actually support this view. For the sake of argument, let’s not get into the weeds of whether technological advances are becoming akin to a self-sustaining, evolving system in which human beings are “just along for the ride,” as Ridley puts it, because for what I’m about to say it really doesn’t matter whether that’s true or not. (Personally, I think Ridley’s view is exaggerated.) Think about why these various inventions were invented in parallel by so many people, and I bet you’ll see where I’m going with this.
What if the reason for parallel inventions was that the necessary prerequisite discoveries in basic science and engineering had been made, thus making those inventions finally possible? By the early 1800s, the basic physics for photography, for instance, had been around for centuries, dating all the way back to the pinhole camera and the camera obscura. Optics had been worked out for microscopes and telescopes. All that was required was a means of recording images, and that took chemistry, and a number of scientists and inventors were working on that, leading to the Daguerreotype and William Fox Talbot’s silver images on paper. Given that at the time a number of people were working on the problem of photography, it is not surprising that more than one discovered the chemistry that was needed to make photographs a reality.
Elsewhere, Ridley argues:
When you examine the history of innovation, you find, again and again, that scientific breakthroughs are the effect, not the cause, of technological change. It is no accident that astronomy blossomed in the wake of the age of exploration. The steam engine owed almost nothing to the science of thermodynamics, but the science of thermodynamics owed almost everything to the steam engine. The discovery of the structure of DNA depended heavily on X-ray crystallography of biological molecules, a technique developed in the wool industry to try to improve textiles.
This is a profound misunderstanding of how basic science is translated into useful products. For instance, it is true that there were steam engines before the laws of thermodynamics were worked out, and it’s true that the steam engine had a huge influence in the formalization of the laws of thermodynamics. Actually, one has to ask which steam engine Ridley means, given that rudimentary steam engines date back to the first century AD and there were several varieties of steam engines developed in the 17th century. I’m guessing that what he means is the Newcomen engine, developed in 1712. Or perhaps he means James Watt’s steam engine, patented in 1781, which was the precursor to the steam engines that powered ships and industry in the 19th century and beyond. Whichever steam engine he means, Ridley’s description glosses over thermodynamic research done before the steam engine, such as Boyle’s Law, which led to Denis Papin building a steam digester, which was a closed vessel with a tightly fitting lid that built up a high pressure of steam. (In fact, Papin worked closely with Boyle from 1676-1679 to develop the steam digester.) Papin later added a steam release valve that kept his machine from exploding. Watching the valve move up and down, he came up with the idea of a piston and cylinder engine but didn’t follow through with his design. That was left to the engineer Thomas Savery and Thomas Newcomen and then, decades later, James Watt. In the 19th century, the steam engine was an excellent tool that helped scientists formalize the laws of thermodynamics. Basically, discoveries in thermodynamics, such as Boyle’s Law, facilitated designing the steam digester and steam engine and later improving the steam engine. In turn, engineering improvements in the steam engine contributed to the understanding of thermodynamics during the 19th century.
Why did I go through all this? It’s because, even if, as Ridley states, there was once a linear view of progress, of translation if you will, from basic science discoveries to products, be they medicines or the steam engine, that view is long gone. It is now understood that basic science drives the development of products and those products drive basic science. So, yes, elucidating the double helical structure of DNA was not possible until the development of X-ray crystallography. So what if X-ray crystallography was originally developed for the wool industry? If I go back another step, X-ray crystallography itself depends on the understanding of so much basic physics that it couldn’t exist until after (1) X-rays were discovered and (2) diffraction patterns and X-ray scattering were understood. These all depended on discoveries taking place over a roughly 25-year period from 1895, when X-rays were discovered, to 1920, by which time the technique of X-ray crystallography had been validated on several crystals. Without the basic science of X-rays, diffraction, scattering, and crystallography, the structure of DNA couldn’t have been elucidated when it was, more than three decades after the 1920s. Even this view ignores all the basic science in genetics and biochemistry from the preceding decades that had identified DNA as the basis of heredity, determined its chemical constituents, and gone a long way toward teasing out hints of how DNA might encode information, all information without which the X-ray crystallographic structure would have meant little.
As I said, you never know what basic science will discover, which basic science discoveries will lead to useful products, or ultimately what sorts of uses they will be put to.
Basically, Ridley is attacking a straw man version of basic science, and nothing in his article rebuts the “myth” of basic science, as the title calls it. Through it all, he seems not to understand the difference between R&D, which is “translational research” (research intended to result in a product or the improvement of a product), and basic research, which is, well, research with the intent of discovering new scientific knowledge. In computer companies, R&D might lead to new computer chips. In a car company, R&D might lead to a more efficient engine that can be produced more cheaply. In basic science research, the goal is not nearly as defined, and scientists don’t necessarily know what they will find or where their investigation will take them. In any case, there is nothing contradictory about a bunch of inventors or engineers tinkering and producing inventions like the electric light in parallel, because basic science and technology have to progress enough to produce the prerequisite understanding before such inventions become possible. When that happens, when the conditions are ripe for inventions like the telephone to be invented by several people, it’s because the basic science groundwork has been laid. It might have been laid decades before, with practical applications incrementally developed until a specific invention becomes possible, or, as in the case of Papin working with Boyle to invent a steam engine, the basic science groundwork and practical application might progress rapidly hand-in-hand.
As I read Ridley’s op-ed, I kept asking myself: What, exactly, is he getting at? Why did he choose these examples? So what if technological progress happens simultaneously in many places by many people? So what if technology is like a biological species, evolving in response to whatever selective pressures there might be?
Ridley’s purpose becomes clear when he starts citing Terence Kealey, a biochemist turned economist. I had heard of Matt Ridley before, although not recently. He wrote one of my favorite science books, Genome: The Autobiography of a Species in 23 Chapters. However, besides that, I was not very familiar with him. Kealey, on the other hand, I had never heard of. So I Googled him, and I quickly learned that he is an adjunct scholar at the Cato Institute and is known for arguing that government money distorts the scientific enterprise. I also learned that he’s an anthropogenic global climate change denialist and even chaired the Global Warming Policy Foundation, which has been described as the “UK’s most prominent source of climate-change denial” and whose “review” of temperature records has been seriously criticized as incompetent and ideologically driven.
OK, so Kealey is a climate change denialist, which casts his critical thinking skills with respect to science in great doubt, but maybe he knows economics:
For more than a half century, it has been an article of faith that science would not get funded if government did not do it, and economic growth would not happen if science did not get funded by the taxpayer. It was the economist Robert Solow who demonstrated in 1957 that innovation in technology was the source of most economic growth—at least in societies that were not expanding their territory or growing their populations. It was his colleagues Richard Nelson and Kenneth Arrow who explained in 1959 and 1962, respectively, that government funding of science was necessary, because it is cheaper to copy others than to do original research.
“The problem with the papers of Nelson and Arrow,” writes Mr. Kealey, “was that they were theoretical, and one or two troublesome souls, on peering out of their economists’ aeries, noted that in the real world, there did seem to be some privately funded research happening.” He argues that there is still no empirical demonstration of the need for public funding of research and that the historical record suggests the opposite.
This argument is strange. For one thing, no one that I’m aware of claims that “science would not get funded if the government didn’t do it and economic growth would not happen if science did not get funded by the taxpayer.” The question is what kinds of science would and wouldn’t be funded by private sources. Overwhelmingly, the kinds of science funded by nongovernmental sources tend to be R&D (e.g., pharmaceutical or technology companies doing research that can be directly translated into a product) or philanthropy-funded research (e.g., the Susan G. Komen Foundation, the March of Dimes, or other charitable organizations that fund research on a specific topic). So, yes, research would be funded without the government. It would just tend to be much more “translational” and/or targeted at specific problems.
There’s also this dubious assertion by Kealey, cited approvingly by Ridley:
After all, in the late 19th and early 20th centuries, the U.S. and Britain made huge contributions to science with negligible public funding, while Germany and France, with hefty public funding, achieved no greater results either in science or in economics. After World War II, the U.S. and Britain began to fund science heavily from the public purse. With the success of war science and of Soviet state funding that led to Sputnik, it seemed obvious that state funding must make a difference.
Huh? In the late 19th and early 20th centuries, Germany ruled physics, producing scientists like Max Planck, Albert Einstein, Werner Heisenberg, and many others who made revolutionary discoveries that laid the groundwork for modern quantum physics. Germany was a powerhouse in science back then (and still is, only nowhere near as dominant). For example, in the early 20th century Germany won 14 of the first 31 Nobel Prizes in Chemistry. Just look at the Nobel Prizes in the sciences! Until 1965, Germany had won a larger percentage of science Nobel Prizes than any other country. I also note that the US didn’t catch up with France on that score until the 1940s. Obviously, Nobel Prizes are not in and of themselves a measure of how good a country is at science, but they do suggest where the most innovative research was occurring one to a few decades earlier, given that the science that wins Nobel Prizes is usually at least a decade old, to give time to see its significance. One can’t help but note that there is a correlation between the dominance of the US in Nobel Prizes and the start of large-scale government funding of science. Does correlation mean causation in this case? Not necessarily, given all the other factors that could impact this measure, but this observation is still a piece of data that at least calls Kealey’s assertion into question.
Basically, Ridley postulates the “myth” of basic science as a means of arguing that current patent policy is too stringent and protects monopoly (which is an arguable point) and that government funding “crowds out” private funding and prevents discoveries from being made:
To most people, the argument for public funding of science rests on a list of the discoveries made with public funds, from the Internet (defense science in the U.S.) to the Higgs boson (particle physics at CERN in Switzerland). But that is highly misleading. Given that government has funded science munificently from its huge tax take, it would be odd if it had not found out something. This tells us nothing about what would have been discovered by alternative funding arrangements.
And we can never know what discoveries were not made because government funding crowded out philanthropic and commercial funding, which might have had different priorities. In such an alternative world, it is highly unlikely that the great questions about life, the universe and the mind would have been neglected in favor of, say, how to clone rich people’s pets.
At this point, I gave up on Ridley. First, he’s downplaying the number of discoveries made with government funding, such as NIH and NSF funding. Also, the Internet is rather a big deal to dismiss so breezily as a “highly misleading” example, given how it has so thoroughly transformed our world over the last 25 years or so—mostly by private companies taking advantage of and building on the government-supported infrastructure and protocols. As for the last “what if” assertion, I did facepalm on that one, given that we actually do have a sort of “living experiment” going on right now regarding what happens when government funding dries up. The NIH budget has been more or less static for over a decade and thus has declined significantly in real dollars. As a result, private sources have stepped in. Have their priorities been better for the country? Not really:
Yet that personal setting of priorities is precisely what troubles some in the science establishment. Many of the patrons, they say, are ignoring basic research — the kind that investigates the riddles of nature and has produced centuries of breakthroughs, even whole industries — for a jumble of popular, feel-good fields like environmental studies and space exploration.
Historically, disease research has been particularly prone to unequal attention along racial and economic lines. A look at major initiatives suggests that the philanthropists’ war on disease risks widening that gap, as a number of the campaigns, driven by personal adversity, target illnesses that predominantly afflict white people — like cystic fibrosis, melanoma and ovarian cancer.
A Nature editorial describes the problem well:
We applaud and fully support the injection of more private money into science, whether clinical or basic. Nevertheless, it is important for each funding body to take into account the kinds of research being heavily supported by the others, to avoid putting all our eggs into a few baskets and shortchanging areas that may yet have crucial contributions to make.
Ridley, given his seeming free market proclivities, might prefer market-based philanthropy to fund science over government funding (and that is his right), but he is sadly deluded if he thinks that private sources don’t “distort” scientific priorities every bit as much as he accuses government funding of doing. At least governments try to look at what will benefit the nation (or large parts of the nation); private philanthropists might or might not do that. Many simply respond to personal interests, personal tragedies, and, sometimes, crackpot ideas. Now it’s true that the government is by no means immune to crackpot ideas (witness the NCCIH, formerly known as NCCAM), but funding such ideas has to work within already established rules for peer review. The same is not true of private funding, where the philanthropist or foundation can basically make up any rules he or it likes.
In the end, I would argue that science should be funded by both government and private sources. What the optimal balance is will depend on the country, its priorities, and its economic resources. Contrary to what Ridley and Kealey argue, it doesn’t have to be a zero sum game.
Finally, let’s revisit Ridley’s picture of technological progress as developing to become like a biological organism, complete with evolution in response to selective pressures. Now let’s carry that analogy farther than Ridley did. Just because evolution by natural selection still occurs in animals and plants doesn’t mean that selective breeding (i.e., guiding that evolution with human intent) doesn’t remain useful and effective in specific cases, such as breeding crops, dogs, horses, pigs, and other animals. Similarly, even if scientific and technological progress is evolving like species of organisms, that doesn’t mean that guiding the evolution of specific “species” of science by directing funding to human-decided priorities through government funding isn’t useful and effective. In the end, all Ridley is arguing is that he prefers the science priorities of foundations, companies, and wealthy donors to priorities decided by government. To him one (private funding) is desirable and “natural,” while the other is “distorting.” Yet there is no fundamental difference in how much the whims of a few private donors and various industries “distort” science compared to government, other than that there are more checks and quality control on how government doles out research funds. In the end, Ridley just likes how private sources distort research priorities but doesn’t like how government distorts them.
77 replies on “The “myth” of basic science?”
Many of the patrons, they say, are ignoring basic research — the kind that investigates the riddles of nature and has produced centuries of breakthroughs, even whole industries — for a jumble of popular, feel-good fields like environmental studies and space exploration.
And just exactly what is “feel good” about environmental studies?
Let’s Just Say (TM from another list) that I’m old enough to have seen the Cuyahoga River on fire.
What kinds of science would not be funded by the private sector? Mostly the really big, really important projects like the moon landings, the LHC, and ITER, just to name three. Are/were they worth doing? For the most part, yes, although there are some doubts about ITER, but if it works, it could save the world.
Guys like this are promoting an anti-government ideology that is just not appropriate in a modern society, and they will cherry pick just about anything to make a point.
This would have been enough for me to infer that it was free-market nonsense. Opinion pieces in the WSJ have been thus for as long as I have been aware of them. I see that Ridley’s piece lived down to my expectations.
An ironic combination there. Tim Berners-Lee, the inventor of HTTP, was employed at CERN at the time (1991), and he invented HTTP to let teams of scientists working on different kinds of computer hardware and operating systems in different countries communicate with one another. HTTP was one of several internet protocols in use in the early 1990s: anybody remember gopher or archie? Those others mostly fell by the wayside because HTTP was the protocol that got developed into the commercial web we know today.
I should also add that this fellow shows a fundamental misunderstanding of the psychological basis of economics and financial markets, which, because they are human undertakings, have a distinctly emotional and fundamentally irrational tone, which is why they must be regulated in order to be trusted.
Ridley himself is notorious for climate obfuscationism:
So because we have developed better tools to do some basic science, basic science does not lead to innovation?
His previous example was making more sense (steam engine and thermodynamics – although I would like more insight into this).
Since DNA is not a physical by-product of X-ray technology, and it’s discovery led to a host of biomedical and scientific applications, I fail to see how it has not been an innovation sprouting from basic science.
Bit of the egg-and-chicken question, except the egg is from a lizard and the chicken has come out of the shed to eat it.
Since he is speaking of astronomy, I would point out that the demand for giant telescopes and powerful computers in a pure “basic science” did lead to improvements in both fields of optical parts and electronic hardware.
One can also point at general relativity and GPS/cell phones: you need the first to correct the clocks aboard navigation satellites, so you can get the nifty satellite-talking gadgets everybody is enjoying today.
Wait, it’s a success, either way you look at it – commie-funded Russian program put the Sputnik in space, first dog in space, first man in space; and taxpayer-funded NASA eventually managed to put men on the Moon, repeatedly;
But it’s not an argument in favor of state-funded science?
Wait, is he saying that a more pure commercially-oriented approach would have gone into more altruistic research and would have avoided the pitfall of short-term-profit applications?
tl;dr: what’s the color of the sky on his planet?
“and its discovery led”
Vade retro, apostrophe.
The Cato Institute!
Perhaps he prefers the sort of market led R&D that Boiron does. Or claim tax relief for at least.
From an economic point of view, it is true that the results of basic science translate into knowledge available in any country, whereas inventions can be protected, so if you want financial returns, it is better to invest in technology. However, technology is highly dependent on basic science, because it is dependent on knowledge.
If there is a “myth” of basic science, it is in the way today’s basic science is articulated with technology at the institutional level. Invention does not need NEW basic science, but rather knowledge of all the existing information on a question and the will to solve a problem.
Building your reputation in basic science today can be done by using up-to-date technology to make spectacular experiments, often confirming what is already known, presented in a way that may convince journal editors that your results will be translated into an invention very soon. Most of the basic science that is echoed by the media does not represent a major advance in our knowledge. Important inventions may not require a basic science breakthrough, but basic science which does not improve our knowledge and is not used by inventors is just a technological bluff.
That does seem to be Ridley’s point, and it’s easily falsified. Thanks to huge advances in integrated circuit technology (a technology that would not exist if we had not developed quantum theory), there is now a discipline called computational biology, which did not exist twenty years ago. When I was a student, Rutherford’s derisive “stamp collecting” epithet could be reasonably applied to most of biology. Today, these computers are helping us to make fundamental quantitative discoveries in biology–exactly the sort of discoveries that Orac needs in order to translate them into medical practice.
Obviously we do not need basic research into vulnerabilities of pathogenic microorganisms, because pharmaceutical companies are already cranking out dozens of novel, effective antibiotics.
It is perhaps worth pointing out that basic science can be approximately divided into theoretical and experimental. These days the latter tends to be quite a bit more expensive than the former. Private foundations can suffice for theoretical science while big experiments often require the backing of society, as expressed through their governments.
Yup. The biggest problem with biomedical research, for example, is that it’s become so expensive to carry out.
Begs the question, of course, of what “we tend to think.”
And who he is including in “we.”
IMHO, that’s not much of a myth. Gods seducing beautiful maidens while in the form of a barnyard animal – that’s the mark of a quality myth.
And we can never know what discoveries were not made because government funding crowded out philanthropic and commercial funding, which might have had different priorities.
Given the dismal funding success for applications to NIH and NSF I’m pretty sure that other funding sources are in constant demand.
Didn’t Ridley prove the virtues of the free market over statism by inheriting a bank and driving it into the ground, requiring the British Government to take it over and bail out its debtors with state money?
He seems to have kept a low profile since then in the hope that people would forget.
Why has it become so expensive to carry out? Why did a paper in Nature contain three simple figures 40 years ago, while now you need many compound figures to publish in any journal? Was research less scientific at that time? IMO, basic science has become a competition in which the search for truth is dispensable.
I mean Orac #14
I suppose we’ll never know what discoveries could have been made had the millions spent to promote climate denial and on tobacco industry shenanigans and attempts by other business shills and lobby groups to mislead the public had instead been spent on (legitimate) scientific research.
Judging from the WSJ comments, the article will make sense, and seems to be targeted, to individuals who accept libertarian fantasy world economics.
For anyone else, it’s an exercise in WTF am I reading.
Transparency. It was taken on trust when the scientists who published papers with the three figures that they had also done all the background work they stated they had done, and reviewers didn’t make them show all those extra figures. Also, it was mainly Science and Nature that emphasized such brevity. Other journals were less stringent.
There were also very strict page limitations because of cost, and in those days figures had to be made by hand. It was very tedious, and I know what I speak of. Indeed, I remember making figures for my first published first-author paper in the early 1990s by cutting out strips from the autoradiographs and matting them myself with letters for panels and captions, then taking photos of them and waiting for them to be developed. (No digital photography back then!) Each figure took me hours to do, and for some of them I had to redo them when they didn’t photograph well.
In these days of electronic publishing, with Photoshop, graphic programs, PowerPoint, etc., it’s easy to make and publish lots of figures. Also, more and more journals want to see the “data not shown” figures, forcing authors to publish them in electronic Supplemental Data sections. I remember reviewing a paper for Science with only three or four figures in the paper itself that had 13 or 14 multipanel figures in the Supplemental Data section. It was so ridiculous that I actually complained to the editor and asked if including that many figures was really necessary. (I didn’t think it was.)
Not entirely. There were some pretty harshly critical comments there too, even more critical than my post.
BTW, should I cross post this post to my not-so-super-secret other blog this weekend? 🙂
Or beautiful blue-skinned G-d-boys manifesting themselves in a thousand places at once so as to make love to a thousand milkmaids. (Seriously, Hindus have some of the best myths. Buddhist ones are all boring and didactic.)
He wrote one of my favorite science books, Genome: The Autobiography of a Species in 23 Chapters.
I remember “Genome” containing some useful information, interspersed with pop evo-psych and a lot of weird political non-sequiturs — Ridley never misses a chance to cheer-lead for Thatcherism.
Chapter 20 is subtitled “Politics” and is a lengthy apologia for the inaction and denialism from the Thatcher and Major governments about prion diseases in human food resulting from free-market forces in agriculture (i.e. the BSE outbreak).
We learn that the ministers and industry-approved experts who stonewalled on the possibility of contaminated food were acting responsibly (even if they proved to be completely wrong), with the corollary that the scientists warning about prions were irresponsible alarmists (never mind that they were right).
I didn’t think much of Ridley after reading that.
Ridley’s wife is a neuroscientist who conducts a lot of government-funded pure research into human colour vision. AWKWARD.
“Ridley never misses a chance to cheer-lead for Thatcherism.”
I strongly support military expeditions to islands populated by sheep.
I would’ve replaced Edison with Tesla, Faraday, or even Newton.
re those myths (or a g0d masquerading as a swan or a shower of coins)
why is it that the really best ones involve a manifestation in order to screw human females?
I guess we matter.
It’s all about sex with gods, but not so much with the goddesses.
If you want “supernatural females stole my body essence for their insemination program” mythology, you have to go to the alien-abduction literature.
Orac # 22
Not only transparency. If you compare the content of a paper, there were far fewer techniques in each paper then. You could publish a paper with only a sequence back when sequencing was a new technique, even when the results were not interesting. It is obvious to me that the technology criterion has been important for decades, and we are now at a stage where it is driving basic science.
I hate when that happens. Stupid government hogs all the scientists and labs, and doesn’t leave any for commercial enterprises and philanthropists, so they have to make do with a Fisher-Price microscope and a 1962 chemistry set. /sarcasm
I remember back in the late 70’s, PBS had a series called Connections https://en.wikipedia.org/wiki/Connections_(TV_series) It was based on a book by the same name, and I have a copy of it around here somewhere. (As I remember, my copy has a price tag, but not a UPC code. It’s from the dark ages.) A quick search of the interwebs shows that the episodes are available online.
It’s about how scientific discoveries and technologies build on each other. The same story told by our host about steam engines is included in one of the chains of events in one chapter.
As best as I remember, while I found a few of the “connections” a little tenuous, overall it was well done and fairly interesting. (Connections 2 and Connections 3 not so much.)
Stan from Texas
Has this guy somehow failed to catch on to the notion that the sole purpose of modern business is to make money for the shareholders? Any results of commercial research will be held closely secret for as long as possible in order to maximize the revenue for the company. There is no other objective. Benefit to anyone or anything beyond the shareholders or owners is purely coincidental.
I read Matt Ridley’s op-ed with some interest and disbelief. He does not seem to understand the difference between basic science and technical innovation. Just the selection of Bell or Edison shows this. They were both good/great engineers or tinkerers, but neither made any contribution to basic science. This is not science; this is technology. His quoting Adam Smith shows the same lack of understanding. Again, he is talking about engineering, not science.
He seems to waffle on forever without understanding this. Certainly, he failed to point out any instance where private enterprise resulted in important breakthroughs in basic science. I don’t remember Einstein working for Krupp or Ford when he published his papers.
I really wonder what he thought he was illustrating with his discussion of Newton and Leibniz. Ignoring the fact that the Calculus is not strictly speaking science, why are we surprised that two researchers discover/develop something almost simultaneously? They were working in the same scientific community and had access to essentially all the same information. Oh, and just as a passing point, Newton was a Cambridge Don and later a civil servant; Leibniz was a life-long civil servant. Not a hint of private enterprise.
Mind, a climate denier http://www.desmogblog.com/matt-ridley like an anti-vaxer is unlikely to let facts impede him, even if he understood the issues.
“Crowded out”. Because there are so few scientists and researchers in the world that their ability to spend money becomes the limiting factor. So every dollar provided from public funds is a dollar taken away from the capacity for philanthropies and businesses to fund research. WTF?
The only way this makes sense is if Ridley is trying to provide a rationale for activities that soak up public money, thereby preventing the government imposing its smothering blanket of research funds, and allowing a thousand flowers of creativity to bloom.
Activities like, say, needing a £27 billion loan (much of it a write-off) through incompetent Ridley-related business decisions.
Here’s the other thing about basic research: you never know where it will take you.
A personal example: I once studied the ecology of blue-bellied lizards (fence lizards) in the California desert. Fun, if very hot, work, but who cares? Well, it turns out that those silly lizards might have something to do with the low prevalence of Lyme disease on the West coast. Once it has an impact on human health, people tend to care a lot more.
But if you’d never studied the lizards in the first place, for their own sake, you wouldn’t have anyplace to start.
The reason doing biological research is so expensive is all the “gatekeepers” who take a cut of the ever shrinking meager funding.
You can’t know the “value” of basic research until after you have done it and figured out where it fits into the grand scheme of things.
The “gatekeepers” certainly don’t know the value of basic research. They can’t even pretend to know if the science is outside of their field. That is why they try to use non-science based metrics to evaluate “proposals”, such as the citation index of papers the PI has written.
Liberals love “general science”/“general research” because those things don’t require accountability or productivity.
Dan Akroyd’s character sums it up well in 11 seconds:
They just want to play in the sandbox paid for with other peoples’ money. And don’t ask them questions. Because, Hey…
@ See Nuovo
Precisely. The problem with basic science is that administrations have introduced surrogate indicators for its evaluation, like productivity measured by impact factors, level of technology, number of PhDs supervised, etc. These policies, in turn, are never evaluated, although they lead to the collapse of basic science, because they create a Ponzi scheme and distort the aim of research.
Your glomming on to this well-known fυckwit doesn’t speak well to your ability to think, Daniel Corcos.
Narad, if I remember well, on many occasions you have answered SN. And actually the reason I am answering SN and you is that I know my message will be read by the others, who are able to judge the ability to think of any of us.
The auditorium lights dim. The chatter quiets down, slowly.
“What was this show again?” whispers one spectator to another.
“It purports to be a piece of controversial street theatre, ‘a spectacular spectacle and a one-man underground expose of misguided politics, the alienation of modern living and the tragedy of humanity losing its soil.’ ”
“Sounds a bit pretentious…” he whispers. “Wait, soil?”
“Says so in this photocopied pamphlet the guy was handing out in the foyer.”
“That thing looks like crap. The spectacle could have done with a graphic department.”
“Uh? It says here there’s a monkey involved, but somebody’s crossed it out with a pen…”
“What? Oh no!” the spectator gasps in shock, making several people turn their heads to watch. “Let me see!” He snatches the pamphlet and stares at it with mounting disgust.
The curtain starts to open, jerkily, as if pulled inexpertly by hand.
“Oh fuck no… Not this again…” the man mutters to himself.
“What? What is it?” whispers the other.
The curtains come to an abrupt halt, as the pulling mechanism gets stuck, the stage only half revealed.
“It’s him,” the man groans.
The curtains twitch as someone jerks the rope repeatedly.
See Noevo walks onto the stage. Stage lights are lit, but they illuminate the curtain more than they do the lone figure standing there. It moves about until it finds a spotlight. See Noevo blinks in the sudden glare. It is wearing the same clothes as in its last performance, but the outfit is dirtier and even more ragged than before. Tucked under See’s belt is what looks like a crude sock puppet vaguely resembling a monkey, made out of a brown wool sock and mustard coloured felt.
“Welcome, ladies and gentlemen, to a night to remember, to a spectacle so grand you will tell of it to your grandchildren, to your children’s children, and to their children for generations! You will be mystified, stupefied, awed into tears, and saved by spontaneous conversion brought upon by me, from the eternal lake of fire! This!” it gestures grandly about, “This will be my finest show yet!”
“Let’s start with some YouTube videos…”
The campaigns to raise funding may be driven by stories of personal adversity told mostly by white women, I’m guessing because those are the women with the greatest access to money and power. The campaigns are driven by personal stories because Americans are innumerate and crave titillation. Statements like “the 5 year survival rate for ovarian cancer is 45.6%” mean nothing to people who don’t understand fractions, let alone percentages. The story, “I cut off my boobs so I wouldn’t die like my mother did when she was 41,” nets a few million more dollars for research.
Yes, the incidence of ovarian cancer is slightly higher in white women than in other ethnic groups. The mortality rates have dropped very slightly overall* and dropped more for white women. (The squeaky wheel gets the grease?) The mortality rate for black women is about the same as it was in 1999, and the rate has increased for Asian-American and Hispanic women.*
* partly due to separating out the stats for primary peritoneal cancers. * http://www.cdc.gov/cancer/ovarian/statistics/race.htm
Good Herr Doktor #18, 25 and 36
You highlight some of the difficulties in taking anything which Matt Ridley says seriously: a cheerleader for free market capitalism who, when the bank he was in charge of screwed up royally, did not accept any responsibility but screamed for a public money bail out… As has been pointed out, his “science” books are a cover for advancing his political ideology and he’s not even good at that, displaying a distinct lack of knowledge of the area he did his degree in (zoology, and, FFS, Dawkins was probably one of his lecturers/tutors)…
And then we look at how he has any sort of position of prominence. He has not got there by any effort, skill nor ability: he was born into it. He is a member of my local minor aristocracy (the family pile is a few miles from here); his uncle was a member of Thatcher’s cabinet; he is very well connected, what with also being an Old Etonian; he has never had to work, everything has fallen into his lap…
This bloke is a charlatan and without his hereditary privileges and family connections would not be taken remotely seriously.
The way in which the author tortures his examples to fit his narrative is especially evident for me (being Portuguese) in the way he dismisses the contribution of astronomy and cartography to the Portuguese explorers in the 13th-15th centuries. “Recent scholarship” (which he conveniently fails to cite) says it was actually “trial and error by the sailors” – while I’m sure this is partially true for actual *sailing*, I have a hard time believing that all the accurate charts that allowed Portuguese sailors to not get lost and starve to death came about as a fluke from an enterprising sailor who enjoyed drawing. The truth is, in this case as in many others, that innovation from trial and error only became useful when systematized and coupled with actual expertise. E.g. a Crown monopoly assured strict standardization of ship parts, ammunition, etc., while a state-run navigation school supplied the ships not only with capable navigators and officers, but also quartermasters who could do math (what a concept!).
To play devil’s advocate, there could be a negative impact on people’s generosity toward charities if they have the feeling that the government is already providing enough via tax money.
We French/Europeans tend to give less money to charities than the Americans do (thanks to our nanny states doing a slightly better job at providing basic healthcare and education coverage – the idea of giving money to the local clinic or school is completely preposterous to most French people).
OTOH, it’s about local charities, like the above-mentioned local clinic. The donor can picture emotionally the benefits of his donation every time he drives by the clinic. Private funds for scientific research are a different kettle of fish. You need to invest in a bit of media coverage to raise awareness that such a foundation exists and that there is a need. So the perception of the level of public funding matters less.
We French have a few of these philanthropic foundations. They tend to focus on specific illnesses (cancer, myopathy). The League against Cancer drains quite a bit of money*, but it’s not money one could use for research beside cancer-related topics. A bit of a limitation on which fields may spawn innovation. I mean, AFAIK, there is no philanthropic foundation for research in electronics or computers or chemistry.
* Let’s throw into the fray that French donations are 75% tax-deductible (after a certain threshold, it drops to 66%), meaning that more than two-thirds of the money NGOs get is actually “diverted” tax money.
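To spell out that deduction arithmetic (a quick illustrative sketch; the 75% and 66% rates are the ones stated above, while the euro amounts and function name are just for illustration):

```python
# Back-of-the-envelope sketch of the French donation tax-deduction
# arithmetic described above. Rates are from the comment (75% below
# a threshold, 66% above); the amounts are hypothetical.

def taxpayer_share(donation: float, rate: float = 0.75) -> float:
    """Portion of a donation effectively refunded by the state."""
    return donation * rate

donation = 100.0
refund = taxpayer_share(donation)       # the state refunds 75
out_of_pocket = donation - refund       # the donor really pays 25

# Of every 100 euros the NGO receives, 75 is "diverted" tax money,
# i.e. more than the two-thirds mentioned above. Even at the lower
# 66% rate the state still covers about two-thirds.
assert refund / donation > 2 / 3
assert taxpayer_share(donation, rate=0.66) / donation >= 0.66
print(f"donor pays {out_of_pocket:.0f}, state pays {refund:.0f}")
```

So the “diverted tax money” framing above is just the deduction rate read from the NGO’s side of the ledger.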
Oh, and some of us French scientists are quite happy that Bill & Melinda Gates are willing to fund us, on top of our government subsidies.
(And to castigate the ignorance of the above troll, BMGF asks quite a lot of questions about what we did with all this money; actually, even government agencies tend to be very demanding on reports and financial summaries; if we were bankers, we would have far less paperwork to do on where the money went)
tl;dr: I finally agree that there is nothing to stop rich individuals or private companies from funding whatever research they like, public funding or not.
the way he dismisses the contribution of astronomy and cartography to the Portuguese explorers in the 13th-15th centuries. “Recent scholarship” (which he conveniently fails to cite) say it was actually “trial and error by the sailors”
“Recent scholarship” translates into “rhetoric spouted by someone else at the same rightwing thinktank”. Ridley is regurgitating — practically verbatim — claims made a few years ago by Terence Kealey.
The funny thing is, Kealey also said this:
So, it’s OK for governments to fund science to “democratize” it, but, dammit, it’s really unnecessary because the magic of the free market would take care of it.
“So having the NIH funded by the government [and likewise the FDA and EPA] could produce a countervailing pressure against the tobacco companies”.
This would be worrisome because tobacco and cancer are good for GDP.
Ridley rather admires Edward Hooper and his “oral polio vaccine begat HIV” hypothesis.
I had to read that several times to make sure I’d read it correctly. That argument is literally nonsensical. How does government funding “crowd out” philanthropic and commercial funding? How does government funding stop philanthropists and private companies from also funding research?
I wish to modify the above. His argument is not merely nonsensical: it is illogical, irrational, and not even wrong.
Orac quoting Kealey:
And how does Kealey think the people who have enough money to create private foundations get their money? It’s still big corporations, just one step removed. And of course these private foundations will have an agenda of some kind–Kealey works for one. Sometimes that agenda will align with the public interest, but that’s not the way to bet.
I’d take his warnings about Bush or Blair using government-funded research to support corporations more seriously if Kealey didn’t act as if that was a good thing.
A nice rebuttal to Ridley’s central argument, using the iPhone and Steve Jobs as an example:
I believe that in the UK, the Institute of Physics (IoP) contracted a consulting firm to try and track the development of some basic research into some kind of technology. Apparently, this firm then came back to the IoP and said that they simply weren’t able to do it. It was far too complicated to try and build some kind of simplistic chain going from “basic science” to “economically valuable technology”.
2 things, very random…
1. I think that picture is of Beth McNally, a scientist I actually know.
2. Does Orac make comments on the A/V Club Walking Dead reviews? Does the man sleep?
…and Then There’s Physics
I suppose that the Institute had proposed to pay them as a function of the benefit from “economically valuable technology”.
If they were paid by the hour, they would not have declined the proposal.
Ridley, with his intellectually dishonest technique of starting with the conclusions and working his way back, making up causal links where none exist, like some bent policeman or lousy lawyer framing some unsuspecting individual, certainly attracts the clicks!
Even if we stick to translational / R&D work, many would argue that the state does much better than the ‘market knows best’ [ex-financiers] like Ridley would try to argue. Take
… and on the question of our warming planet (let’s assume for a moment that Ridley agrees there is a problem to solve) … will it be private enterprise which alone can fix this? Not according to Bill Gates …
“Gates admitted that the private sector is too selfish and inefficient to do the work on its own, and that mitigating climate change would be impossible without the help of government research and development”
Ridley has only a few ideas (1. technology will sort out all the world’s problems; 2. the world’s resources are functionally infinite; 3. the market knows best; 4. don’t trust self-serving scientists unless they agree with 1-3) and he recycles them ad nauseam, as click bait for his next book no doubt.
Anyone for a ‘Ignore Matt Ridley’s nonsense’ week sometime soon?
In a perfect world there would be oodles of money for basic science, both from government and private.
In this world, one has to wonder whether the west getting all its products from China is long-term viable (for the west). If it’s not, we might have to ask ourselves whether the public spending burden is not a handicap for manufacturing.
For the Thatcher-bashers and dirigistes: the most spectacular government sponsored science/industry projects in the UK in the 20th C. were Concorde and the AGR. Just google on those two – there’s people who think they were the biggest disasters to strike the British economy, from which it has never recovered.
Those advocating more dirigisme never seem to get called out on where they think it should end. The logical end points have been tried, in the USSR and DDR, for example.
The most important part of our host’s essay seems to be lost on some contributors:- “there must be a balanced approach “.
Peter G – Concorde etc. were engineering programmes, with no pretensions to basic science. And while we are on ‘bashing’, Crossrail will exceed the cost of Concorde development even allowing for inflation, and is an example of a very well run UK engineering project. Let’s not rehearse why Concorde failed commercially, but let’s not ‘bash’ UK engineering.
Private partners like Arup, Atkins, etc. are involved in Crossrail of course, but none would have had the commercial or political wherewithal to initiate such a project in the UK.
And don’t get me started on Thatcher. I have a friend – a retired entrepreneur who ran a semiconductor business – who spits blood whenever her name is mentioned, because of her Government’s confused approach to that industry in the UK.
A balanced approach is indeed needed, and the US have traditionally been the masters at navigating the public – private R&D interface (something Thatcherism failed to foster in the UK, because of ‘market knows best – sell off all our assets’ dogma).
If it ain’t broke?
Yes, my comment #21 should have read “the article will make sense, and seems to be targeted, only to individuals who accept libertarian fantasy world economics”, which judging from the positive comments seems to be the case.
I assumed the “harsh” comments were coming from people with similar feelings to my own “WTF I am reading” reaction to Ridley’s absurd opinion.
If someone had given me the op-ed and told me it was a submission to a high school essay contest run by Cato or some similar “libertarian” propaganda machine, I would have believed them.
Truth is that Ridley has no argument that would withstand even the mildest level of scrutiny.
Now if he’d gone with something along the lines of “The myth of libertarian economics”, he could easily have built a case.
Especially given Edison’s commercially motivated disinformation campaign against Alternating Current.
“In the end, Ridley just likes how private sources distort research priorities but doesn’t like how government distorts them.”
Part of this is seen in how he views climate science conducted by government organizations. Far as I know he hasn’t gone full conspiracy mode yet, but does think government science is distorted and puts the emphasis on the wrong parts. As others have linked to, Ridley has a long history of distortion in a few areas as his ideology blinds him completely.
E.g. he claimed no-one could see the bank failure coming so he wasn’t at fault. Later investigation revealed emails and letters and meetings that warned him, often directly, that if he continued with his current policies the bank would fail, and here, exactly, is how and why, and here’s what you need to do to keep that from happening.
Orac, you are a person who uses the phrase “climate change denialist,” and as such your critical thinking faculties are called into question.
However I think you are right on this point. Edison and 22 other people would not have been in a position to invent the light bulb if Faraday and Maxwell had not developed the basic science of electricity.
The dependency between pure and applied science works both ways. Ridley is always interesting to read, but here he overstates his case.
Anyone who thinks a person who uses the term “climate change denialist” (which in my case was clearly short for anthropogenic global climate change denialist) lacks critical thinking skills is in reality the person who himself lacks critical thinking skills. 🙂
See “straw man”, “ad hominem”, “argument from authority” for starters.
Have you read anything from the GWPF, or just articles about it in the Independent? If you are going to read one thing, read this: http://www.thegwpf.org/patrick-moore-should-we-celebrate-carbon-dioxide
I also recommend spending some time at http://judithcurry.com/ (who linked your article).
And when you finish reading Judith Curry on climate change you can go to mercola.com for current medical information.
I like Ridley, but the best thing about his WSJ article is that it prompted your response. Thanks.
Anyone who suggests going to the GWPF or Judith Curry’s blog to learn about climate science really shouldn’t be telling other people their critical thinking faculties are lacking. The GWPF promotes nonsense, and Judith Curry seems to have trouble understanding percentages.
Matt Ridley’s article reads like a parable. I would title it, “This Is What Happens When Smart People Become Contaminated with Ideology.”
Having read the WSJ article, Ridley’s persistence about technology evolving like biology, has a whiff of The Singularity to it. Particularly where he gets into comparisons between the Internet and biological brains. Speaking of quackery, perhaps he wants to upload his soul to a machine?
Ridley’s bottom line is this:
All the money government spends on science, should instead be given to his private-sector cronies to spend on science. That way his cronies can get rich in the process.
In other words, his article is nothing more than propaganda for a grab.
The way it runs well is via common sense. Government funding concentrates on projects that tend to be large, ones that benefit from a pooling of resources beyond the means of many private candidates. Examples: the man on the moon, many in defence, some satellites, military projects.
You should not underestimate the contributions of the private sector. A great deal of the discovery of mineral resources and their commercialisation was done by companies that started out small. The successful ones tended to grow faster than comparative growth rates under governments.
We in the private sector devised a one liner decades ago. How do you create a small company? Answer, you show a large one to a bureaucrat.
Seriously, there are not many observers now who have seen both public and private sector funding compete. In this age of, say, demonstrated largesse for conformists, like the US government wasting many billions on climate change, where does one find a younger investigator who would not like it?
“In this age of say demonstrated largesse for conformists, like the US government wasting many billions on climate change…”
The fact that you reject the truth of climate change tells me everything I need to know.