Friday, December 9, 2011

Jeremy Shane: The Best of Times, and the Worst of Times, for Health and Healthcare

Editor's Note: Jeremy Shane is the President and COO of the Health Central Network, based in the Washington DC area.  Jeremy is a leader on both the business side of healthcare and the activist side.  So this guest piece reflects not only Jeremy's deep understanding of the industry, and the challenges and opportunities that it faces, but also his passion for Serious Medicine.  As he observes, only a Serious Medicine approach to Alzheimer's, to cite one urgent example, will save the US from economic ruination as future entitlement bills come due.

Yet ultimately Jeremy sees more good news than bad news on the horizon; so count him as the rarest of contrarians--a well-informed optimist.

The Best of Times, and the Worst of Times, for Health and Healthcare

A single page--page B4 of Thursday’s Wall Street Journal--provides a wonderful synopsis of the state of the US health industry.  The page includes four articles: “Pfizer Cancer Drug Advances,” “Justices Weigh Patents on Medical Tests,” “Novartis Retools Brain Work,” and “Astra Cuts Sales Force in US 24%.”

Collectively, these articles paint a hopeful picture for health, but a very difficult outlook for health companies.  Great science will continue to excite and frustrate, scientific breakthroughs will test long-held regulatory approaches and law, and traditional models of drug development are in big trouble.

First, on the critical issue of curing serious disease, comes the Pfizer article.  On Wednesday an FDA advisory committee gave a unanimous nod to a new Pfizer drug, axitinib, for kidney cancer.  The FDA likely will give its final approval for the drug soon.  Clinical trials showed the drug prolonged survival for kidney cancer patients whose disease stopped responding to initial treatments.  There are two other treatments on the market for advanced kidney cancers.  Pfizer’s drug seems to work as well as the others, and better in some cases.  That’s the good news.  The challenge now is to figure out for which types of kidney cancer each advanced drug is most likely to work best.  The only thing worse than a drug that doesn’t work is a drug that sometimes works and sometimes doesn’t, but in either case for reasons doctors don’t understand.  It means some patients will have false hope and lose time that could be spent trying something else that might--or ideally would--work.  So at the same conference where Pfizer announced results for axitinib, the PREDICT Consortium, a group of European researchers, announced a new study to see if different combinations of kidney cancer drugs could be associated definitively with specific tumor types.  That’s progress.

Yesterday’s articles also highlight how science and technology are eroding long-held regulatory approaches.  The FDA will approve Pfizer’s drug as a “second-line” treatment, meaning most patients will get it only after widely-accepted chemotherapy, surgery, or radiation therapies have been tried and have failed.  FDA rules do not prohibit doctors from using second-line drugs as part of an initial treatment plan, but because the drug wasn’t tested with “treatment naïve” patients before FDA approval, the choice to use it earlier is riskier for doctors and patients.  Ironically, the same research approaches making highly targeted drugs possible also suggest that tumors evolve once they are bombarded with chemotherapy, making them harder to kill.  Drugs used as second-line therapies might work better in combination with other drugs if used at the outset.  Or, they might not.  But right now, because of FDA rules governing clinical trials, drug manufacturers rarely test drugs on newly diagnosed patients if a reasonably effective chemotherapy regime exists (reasonably effective meaning the chemo will shrink the tumor or slow its growth but is unlikely to kill it for good).  Patients lose as a result.  Oncologists play the odds, more comfortable with tried-and-true though imperfect approaches, unsure in the absence of trial data whether using a second-line drug at the outset will improve survival.  And the specter of a malpractice suit if they try (and fail) cannot be far from the oncologist’s mind.  In time, research will win out, and it is possible that for many kinds of fast-growing or complex tumors, going nuclear as early as possible on multiple biochemical fronts is a surer path to total victory.

Yesterday’s story about the Supreme Court deliberations highlights how scientific and technological advances also are undermining long-held legal standards.  Patent law is intended to encourage new discovery, protecting breakthrough ideas while allowing others to benefit from publication of new approaches and to build on new discoveries (for a fee) in their own products.  Patent law has become murky in addressing new techniques to improve diagnosis, adjust drug doses, or determine individuals’ risk of disease based on their genetics.  Yesterday’s Supreme Court case, Mayo v. Prometheus, concerns a test that helps doctors pick the best dose for a Crohn’s Disease drug.  Tests like the one in Prometheus are at the heart of targeted medicine.  By definition, for a targeted therapy to be picked, doctors and patients must know what they’re aiming at, requiring sophisticated tests.  The question facing the Justices in this case, and in one expected next year, is: What makes a test patentable?  If a test merely detects substances that humans naturally produce, is it really creating something new?  What if the substance identified by a test is a rare mutation--it is a natural substance, but absent innovative techniques it might not be identified?  The answers to these questions could have a large impact on the pace at which new cures are discovered.  If patent law is too loose, and too many patents are issued, patent-holders could extract monopoly prices and stifle innovation.  But allowing too few patents eliminates the incentive to risk money and time to discover new things, since there is no certainty that investment can be recouped.  Whatever the Supreme Court decides this year and next, it seems likely that health innovators will spend more time in court (like software and wireless phone companies) fighting over patents, diverting precious time and money from breakthrough science into law firms’ coffers.

Patent law may be about to become messier, but the skirmish obscures the larger import of cases like Prometheus.  Whatever the law, science and technology are changing the way doctors and patients make health decisions.  Bit by bit, test by test, products like those offered by Prometheus are demystifying diagnostic and treatment choices.  As technology takes the margin of error out of previously subjective medical judgments, it will take the premium out of what doctors charge to make the diagnostic or treatment choice.  This is not to say all medical science can or should be reduced to a set of algorithms, though companies such as IBM, with its Jeopardy-winning Watson software, may try.  Doctors will continue to add a lot of value in their surgical skill and ability to manage difficult cases.  The body will remain a complex place to do business for a long time yet.  But the way medical professionals work is changing forever and for better.  The more that can be determined about a person’s medical condition objectively and precisely, the more consistent will be choices about treatment, and the more equitably treatments can become accessible to patients, whether they are at a vaunted Memorial Sloan Kettering or a health clinic in rural Kansas.

So, patent law disputes will over time become beside the point, even as they remain costly distractions.  We can hardly expect nine Justices, however learned, to solve a problem that--regardless of their words--will be overrun by the discovery of scientific fact.  The high priests of the legal profession represent an industry marching ever-deeper into nuance and obfuscation as science marches in the opposite direction, towards clarity and consistency in making treatment decisions.

This brings us to the articles about Novartis and AstraZeneca and the breakdown of the Big Pharma business model.  Over the last twenty years, big pharma funded R&D on many disease areas with fat profit margins from a few multi-billion dollar blockbusters.  Once the FDA approved a drug, pharma companies sent legions of sales reps to knock on doctors’ doors, persuading them to treat patients with the newest compound.  This business model worked well for a while.  New medications that came to market could be used by millions or tens of millions of people.  New treatments in many conditions, like high cholesterol, worked much better than older drugs, or even postponed costly surgeries.  Drug companies learned to create demand in conditions like erectile dysfunction where no real treatments existed, appealing to aging Baby Boomers’ anxieties about youthfulness and self-image.  Over time “lifestyle” drugs attracted elite scorn, but it is worth remembering that for a company like Pfizer, male angst over performance in bed funded research in niche categories like kidney cancer.  (Lilly, maker of the other leading ED drug, Cialis, also plowed profits into drugs for hard-to-treat conditions as well as broader-scale areas like depression and chronic pain.)

Regardless, the days of blockbuster sugar daddies are over.  Big drugs, such as Pfizer’s Lipitor, are going off patent, crushing profit margins.  And “push” marketing, driven by salespeople pounding the pavement and knocking on doors, makes less sense.  Today’s targeted drugs are used by fewer doctors, specializing in less-common conditions.  Even in oncology--a fairly broad specialty--there are only about 30,000 medical professionals in the largest cancer society, compared to the total universe of about 600,000 US doctors who prescribe drugs.  So companies like AstraZeneca, which have not found many new targeted drugs, are in a bind.  Unable to support speculative research in multiple areas or justify large sales forces, big pharma is laying off marketers and researchers en masse.  Before AstraZeneca, rounds of layoffs hit Merck, Novartis, Pfizer, GSK, and others.

There is another looming threat to American health and welfare that could produce a few blockbusters: Alzheimer’s Disease.  But AD has proven to be a vexing disease to understand.  Researchers are facing hurdles to developing “druggable” compounds, medications that work in predictable ways and can be delivered in consistent doses to deep recesses of the brain.  AD is likely to follow what we are learning about cancer and autoimmune disease.  That is, AD will probably come to be described as Alzheimer’s Diseases, plural, with different varieties triggered by a range of genetic and environmental interactions, requiring a spectrum of combination therapies.

Companies like Novartis, which have tried and failed to develop Alzheimer’s killers, now see that brain conditions defy resolution until we take a holistic approach to understanding brain disorders--degenerative ones like Alzheimer’s, developmental ones like autism, and behavioral ones like schizophrenia.  They are taking a step back to study brain anatomy, chemistry, genetics, and immunology, hoping that collaboration on core brain processes will mean many steps forward on multiple fronts.  Novartis, and the other pharma companies that remain interested in brain disorders, realize that even the largest product team focused solely on Alzheimer’s cannot hope to crack the “case.”  Brain research will be bigger than any company, any patent law, any FDA rule, bigger too than Obamacare or Ryancare, since no insurance scheme from left or right can hope to cover the daunting cost of caring for people with multi-decade degenerative, developmental, and traumatic brain conditions.  Only great science and willing research participants--cures, in other words--can.

That, in a nutshell, is the state of the health industry today.  It is tempting to say it’s Dickensian, the best of times and the worst of times, but that gives too much credit to the down side.  Vibrant societies rely on creative destruction and objective truths, enabling new ways for innovators to profit and for the citizenry to prosper.  Great science, or cures as most people think of it, requires perseverance and openness to new ways of thinking.  Rent-seekers and beneficiaries of now-outdated regulatory schemes will be compelled by public pressure and scientific truth to step aside or join the fight.  If they get seriously ill, you can bet they will.


Editor's post-script: A very important point is tucked away at the end of the paragraph about the Prometheus case: No matter how the Supreme Court rules on that case, to the extent that cutting-edge science can be replicated, in the form of both treatments and tests, the problem of the doctor-skill differential--which tends to disadvantage rural locales and other under-served areas--is diminished.  That is, if the necessary intellectual capital is all captured in the test or the treatment, then the skill of the doctor or healthcare provider is de-emphasized.  Doctors and other healthcare providers will always be vital, but if the test or treatment can be standardized and replicated on a mass scale, then everyone, everywhere, can get the benefit of the best that medicine has to offer.  And that's a good thing.

Tuesday, December 6, 2011

All Visions Must Pass: Donald Berwick Departs the Centers for Medicare & Medicaid Services

Dr. Donald Berwick, now departed from his 17-month recess appointment as Administrator of the Centers for Medicare & Medicaid Services (CMS)--he was unable to win confirmation by the US Senate--took some parting shots at "waste" in the system in a New York Times interview. Berwick, who oversaw the spending of more than $800 billion in outlays, declared that as much as 30 percent of health spending is wasted. OK, we're all against waste, but who defines it? The federal government? Has Uncle Sam earned the credibility to make judgments about waste?

His valedictory interview notwithstanding, Berwick is probably best known, to admirers and detractors alike, as America's leading fan of Britain's National Health Service (NHS). As he said to an NHS audience in July 2008, "I am romantic about the NHS; I love it." And he went on to flatter the NHS in rapturous terms: "You are unified, movingly and most nobly, by your nation's promise to make good on an idea: the idea that health care is a human right.  The NHS is a bridge--a towering bridge--between the rhetoric of justice and the fact of justice."

The NHS does, indeed, represent a kind of pinnacle. It is the culmination of a belief system that originated in 19th century Germany. In the early part of that century, G.W.F. Hegel lyricized about the wondrous justice-giving powers of the "universal" bureaucratic state, not so different from the government of his Prussian homeland. Later in that century, another Prussian, Bismarck, reified and solidified Hegel's idealism into the practical reality of a bureaucratic welfare state; neo-Hegelians finally had achieved their utopian vision. For them, the welfare state became a secularized godhead, boasting the power to transubstantiate mere tax money into glorious and ennobling political structures. In the US, neo-Hegelians called themselves Progressives; the English word "Progressive" was inspired by the Deutsche Fortschrittspartei, the German Progress Party, founded in 1861.

Of course, American progressives, yearning to enact their vision of modernization and uplift, were not inspired only by Bismarck, they were inspired also by the humming factories that improved productivity and generated prosperity. So Henry Ford, having mastered the assembly line, became a hero, as did Frederick Winslow Taylor, the pioneering "efficiency expert." Across the political spectrum, right to left, from the US to the USSR, "Fordism" and "Taylorism" were admired for their industrializing powers. For their part, American progressives reasoned that if factories were efficient thanks to Ford and Taylor, they could be made even more efficient without the "waste" of capitalist competition. And they further reasoned that people like themselves could make the whole nation more efficient. As John Dewey wrote in his 1935 book, Liberalism and Social Action, "Organized social planning . . . is now the sole method of social action."

So it made sense that healthcare, too, should be planned, modernized, and socialized. The Beveridge Report, produced by the British government in 1942 as the blueprint for the NHS, asserted that national healthcare should be seen as part of a "comprehensive policy of social progress."

Later in the 20th century, Dr. Berwick was swept up by the same progressive idea: planners would improve social welfare and, at the same time, eliminate waste. Berwick founded the Cambridge, Massachusetts-based Institute for Healthcare Improvement (IHI), which describes itself as follows:

IHI was founded in the late 1980s by Don Berwick and a group of visionary individuals committed to redesigning health care into a system no longer plagued by errors, waste, delay, and unsustainable social and economic costs.

Berwick has been open in his admiration of such contemporary efficiency experts as W. Edwards Deming and of companies that have streamlined "just in time" techniques, such as Toyota. And the progressive healthcare dream has stayed steady now for a century; Berwick's declaration that 30 percent of healthcare spending is "wasted," for example, is perfectly congruent with Barack Obama's 2008 promise to eliminate waste and so cut a family's healthcare spending by $2500, or one-third. The promise still stands, of course, even if we are still waiting for the facts to catch up. Indeed, the lag time might lead some to conclude that perhaps the government is not the efficiency machine that Berwick and Obama might wish it to be. The progressive dream of enlightened management, it seems, will never die.

Some things have changed, to be sure, in the 100 years since Teddy Roosevelt ran for president on the Progressive ticket, promising, among other platform planks, national health insurance. For one thing, progressives have figured out how to profit from their progressivism; Berwick's IHI paid him more than $2.3 million in 2008. Indeed, such a fat paycheck is perfectly in keeping with the spirit of an age in which policy experts become rich as well as powerful. White House healthcare czarina Nancy-Ann DeParle, to name another, received $5.8 million as a consultant to health insurance companies in the three years prior to her entry into the Obama administration.

Yet at the same time, there can be no doubt that Berwick has full faith in the transcendence of what he has been doing. As he told the Times, "We are a nation headed for justice, for fairness and justice in access to care." Indeed, he continued, putting the cause of providing universal health insurance in grandiose terms: "There is a moon shot here." By "moon shot," Berwick meant a project that can grip the popular imagination the way it has gripped the imagination of so many Democratic Party intellectuals. Yet if most Americans don't see health insurance in such lustrous terms, well, that is indeed a problem for the latest generation of Hegelians.

In fact, Berwick lamented that the public hadn't yet grasped the greatness of the vision: "Somehow we have not put together that story in a way that's compelling."

One problem, perhaps, lies in Berwick's zeal for health rationing, seen as a necessary component of health justice. Yet that zeal is not shared by most Americans. So when Berwick declared in 2009, "The decision is not whether or not we will ration care--the decision is whether we will ration with our eyes open," those words were widely used against him. But Berwick has, in fact, gone even further, past the political point of no return:

Any healthcare funding plan that is just, equitable, civilized and humane must, must redistribute wealth from the richer among us to the poorer and the less fortunate. Excellent healthcare is by definition redistributional.

Alas, poor Berwick--we knew him too well. Whereas he saw government-run healthcare as taking us toward the light, others saw darkness. The NHS, for example, looks bad to most Americans with access to Google. Even the briefest search yields up headlines such as this, from the December 1 edition of The Telegraph, the UK newspaper: "Loved ones not always told their relative is on controversial 'death pathway'/ NHS doctors are failing to inform up to half of families that their loved ones have been put on a scheme to help end their lives, the Royal College of Physicians has found." It seems that tens of thousands of NHS patients are being put on what the Telegraph called the "death pathway," as a play on what the NHS calls--using a euphemism for euthanasia--the "care pathway." According to NHS rules caregivers [sic] are allowed to withhold care from terminal patients after receiving consent from the family. Yet a government audit found that some 2500 families, in the city of Liverpool alone, were not so consulted.

It's from reports such as this that some in the US get the idea that eliminating "waste" is code-talk for eliminating patients. So maybe "death panels" in the US aren't such a stretch, after all. That's why most American advocates of national health insurance tend to shy away from any comparison to the NHS. But not, as we have seen, Dr. Berwick, who has always been forthright in his NHS-philia. Most likely that's why he wasn't confirmed by the Senate; when his recess appointment to CMS expired, he had to go back to Cambridge.

No doubt Berwick will soon find a way to pass his vision on to a new generation, but after a century of progressive activism on healthcare, perhaps he--and all of us--might think about a new course of action. It seems that US healthcare advocates have hit the point of diminishing returns; after all, we have had mostly universal healthcare coverage for decades now. Medicaid and Medicare were created in the 60s, and EMTALA gave everyone the right to emergency-room treatment, without regard for ability to pay. Yet such piecemeal approaches did not meet Berwickian efficiency standards, it would seem, and so the Obama administration charged ahead with the Affordable Care Act, signed into law in 2010. And later that same year, the Democrats, of course, were clobbered in the midterm, and now "Obamacare" is still deeply under water politically.

So maybe the Hegelian-Bismarckian-Berwickian vision has played itself out. Maybe it's in the nature of the health-and-welfare state these days that it a) can't keep up with fast-fluxing market forces, b) can't keep up with the even faster changes in social networking and computerized transparency and publicity, c) can't withstand the onslaught of special interests, who honeycomb any kind of legislation with baroque and unpopular mandates and set-asides, d) can't come to grips with the reality that lawyers and judges end up running any public system these days, and e) can't comprehend the fact that the real problem with a disease such as Alzheimer's is not the financing of the disease but, rather, the disease itself. And therefore the whole idea of securing additional healthcare finance for the population is less important than the idea of securing the medicine to keep people healthy in the first place. The best way to eliminate "waste" in the healthcare system is to eliminate the need for so many people to be processed through the system.

We shouldn't be surprised if Berwick writes a book about his experiences in Washington. Here's a suggested title, although he probably won't use it: All Things Must Pass.

Monday, November 28, 2011

DARPA to the rescue! If civilians have lost interest in developing cures, let the military do it.


Wired magazine reports that the Defense Advanced Research Projects Agency (DARPA) is working on a new approach to antibiotics--or post-antibiotics.  As Wired's Katie Drummond writes, DARPA is seeking proposals that could completely replace traditional antibiotics with a whole new kind of bacteria killer:

Darpa wants researchers to use nanoparticles--tiny, autonomous drug delivery systems that can carry molecules of medication anywhere in the body, and get them right into a targeted cell. Darpa would like to see nanoparticles loaded with "small interfering RNA (siRNA)" -- a class of molecules that can target and shut down specific genes. If siRNA could be reprogrammed "on-the-fly" and applied to different pathogens, then the nanoparticles could be loaded up with the right siRNA molecules and sent directly to cells responsible for the infection.

Drummond allows that it might seem hard to believe that DARPA could pull off something like this, but in fact, the theory has already been proven.  Last year, she notes, researchers were able to engineer siRNA and load it into nanoparticles that were injected into four primates infected with the Ebola virus, thereby arresting the killer disease.

But wait--there's more.  Not only does DARPA seek to bring about this whole new approach, skipping past familiar modes and mechanisms, it is also seeking ways to compress the timeline for new cures, from years down to mere days.

So it's a daunting, if enticing, prospect.  DARPA does, indeed, have a big vision.  At a time when most healthcare "experts" talk only of finance and bean-counting and rationing--that is, the demand-side of medicine--DARPA wants to jump in on the supply-side of medicine, that is, the creation of actual cures; it's the Pentagon, not the Department of Health and Human Services, that wants to decisively intervene in the course of disease and save lives.  Audacious?  Sure.  Impractical?  Maybe.  Popular?  Absolutely, if it works.  But as Drummond concludes:

If anybody can design a new paradigm for medicine, and a new way to mass-produce it, our money's on the military. After all, we've got them to thank for figuring out how to manufacture the medication that got us into this mess in the first place: penicillin.

Indeed, the military, British and American, was the impetus for the serious development of antibiotics.  The medicinal qualities of the blue-green mould Penicillium notatum had been observed as far back as the Middle Ages, but those positive properties were not recorded in a scientific treatise until 1875.  Yet serious scientific inquiry did not begin for another five decades after that.  Alexander Fleming had been a British military doctor during World War One, working in the mud and filth of the trenches, observing firsthand the lethality of infected wounds.  For the next decade, Fleming kept seeking a remedy for infection--until that fortuitous moment in 1928, when he noticed that a stray mould was inhibiting bacteria growing in a petri dish.  He called the bacteria-killing substance penicillin.

Yet in the following years, progress remained slow, as Fleming and others at St. Mary's Hospital in London struggled for a decade to purify and extract the antibiotic agent and turn it into a usable drug.  It was not until World War Two that urgent military necessity led to increased funding for Fleming--and to the rapid acceleration of penicillin research and development, mostly in the US.  This heroic story was ably told by Lauren Belfer in her 2010 novel, A Fierce Radiance.  

Vannevar Bush, the director of the Office of Scientific Research and Development--DARPA's predecessor agency--ordered that penicillin research be a national priority second only to the atom-bomb project.   And it worked.  By 1944, penicillin was being produced in the millions of doses by Pfizer, working on a government contract.  As a result of this public-private partnership--this medical-industrial complex, if one prefers--more gains were made in the battle against deadly infection than in all the previous years of human history.

Fleming and two fellow researchers were awarded the Nobel Prize for Medicine in 1945.  As Bush observed in that same year:

The death rate for all diseases in the Army, including the overseas forces, has been reduced from 14.1 per thousand in the last war to 0.6 per thousand in this war.  Such ravaging diseases as yellow fever, dysentery, typhus, tetanus, pneumonia, and meningitis have been all but conquered by penicillin and the sulfa drugs, the insecticide DDT, better vaccines, and improved hygienic measures. Malaria has been controlled. There has been dramatic progress in surgery.

So while we don't yet know if DARPA's new plan for siRNA will truly work, history tells us that if the military really puts its mind to work on a challenge, that challenge can often be overcome.  Why?   Because the military has a strong claim on national resources--and not just tax revenue.  In the past, to achieve an urgent objective, the military has black-boxed its budgets, dragooned brain power, and bulldozed any and all obstacles.

To cite one germane non-medical example, Gen. Leslie Groves, leader of the Manhattan Project, did not pause over Environmental Impact Statements when he occupied Oak Ridge, Tennessee, and set up a nuclear bomb factory that brought in 75,000 people, and he certainly did not hold public hearings in advance of the 1945 atomic tests at the Trinity site in New Mexico.    Such military mobilization of resources is a hard and Hobbesian process, but it has one virtue: It works.   If the goal is important--and theoretically, at least, the wartime military doesn't have any goals that are not important--then the Manhattan Project sums up the way the process can work to shorten the war, reduce casualties, and guarantee victory.

Similar tales could be told about the wartime (including the Cold War) invention or acceleration of such 20th century technologies as radar, synthetic rubber, aviation, electronics, nuclear power, the internet, and GPS.  As an aside, the fact that each of these inventions contributed not only to military victory but also to civilian wealth is yet another bonus of constructive public-private partnerships, and a reminder that the US military has been one of the principal drivers of the American economy all through our history.  And so, too, in the case of DARPA's siRNA project; if it works, we will all owe those defense nerds yet another huge debt.

By contrast, the results for innovation and the economy in the absence of military mobilization can be painfully slow--even deadly slow.  In a free and pluralistic society, every economic activity is eventually surrounded by claimants and rent-seekers of various kinds; these claimants and rent-seekers can be variously described as remoras, barnacles, or lampreys.   That is, they can be mildly symbiotic, a slight burden--or they can be lethally parasitic.

The dismal economic consequences of runaway pluralism were ably described by the economist Mancur Olson in his 1982 book, The Rise and Decline of Nations; Olson went so far as to suggest it was more economically beneficial to lose a war than to suffer the endlessly cumulated sedimentations of special-interest encrustation.  The non-catastrophic solutions to such "demosclerosis"--to recall Jonathan Rauch's encapsulation of the Olson argument--are relatively straightforward; such solutions include deregulation and an overall opening up of clotted economic arteries.  But as we have seen in our time, it's easier to prescribe those solutions than it is to implement them.

Typically, what's needed is at least some kind of crisis--some wake-up call; a default, if not a  defeat.  Civilian leaders can sometimes make the most of a sense of urgency and crisis--but military leaders always can.

As we have seen in recent decades, bad news on the medical front has not been in any way galvanic--the situation gets worse and worse. Indeed, the worsening seems to be part of a deliberate policy of looting the medical industry to achieve other governmental goals.  

No wonder, then, that we have been losing the war against infection for some time now, and nobody in the US government, other than DARPA, seems to have noticed.  Yes, it might seem to be a strange world when all the agencies and committees that have the word "health" in their title have been allowing the problem to worsen, to the point where the number of new antibiotics has fallen by more than 80 percent over the last quarter century, even amidst louder warnings about the rise of deadly "superbugs."  Yet as the historical record shows, even well-meaning civilians have not been able to overcome the cumulative blockages of the trial lawyers, the FDA, and the overall brain-drain and capital-drain out of the pharma sector.

Enter the Pentagon and DARPA, coming from a different world, pursuing different goals.  By no means is the military always a paragon of efficiency, but mission-focused command-and-control does have its bottom-line virtues.  For the most part, the military is able to fend off civilian predations and Olsonian sclerosis, because generals and admirals can invoke national security--and, at a more gut level, the well-being of our fighting forces--in order to push their projects through.

"Compared to war," General George S. Patton said during World War Two, "all other forms of human endeavor shrink to insignificance."   War is, indeed, catalytic; it does unleash vast amounts of public exertion and public forbearance.   But war, of course, is also tragic, even if, as in World War Two, the larger benefits of improved medicine save lives during and after the war.

In a better world, advocates for Serious Medicine, such as a new kind of instantaneous bacteria-killer, would be able to act just as decisively in the fight against microbes as generals can in the fight against men.   That is, we would enjoy the benefits of saving lives without predicating the effort on taking lives.   Until then, however, we can conclude that those generals and admirals care more about the well-being of their men and women than our elected politicians care about the well-being of us civilians.

So yes, someday, we should have a MARPA, for Medical Advanced Research Projects Agency, as a more mission-focused version of the NIH.   We should mimic the military's sense of purpose on the civilian side, without firing a shot.

But until that happens, we should be thankful that we have a DARPA.  

Saturday, November 19, 2011

The FDA’s Rejection of Avastin: Not Part of the Solution, Part of the Problem

The Food and Drug Administration’s decision to restrict the use of Avastin for breast cancer has attracted some cautious supporters in unexpected places.  MPT’s own Paul Howard, for example--not generally regarded as a fan of the contemporary FDA--writes, “This is one case where I think the FDA did the right thing.”

Well, here’s another perspective: This is a case where the FDA did the wrong thing.  It’s wrong for patients, wrong for the country, and wrong even for the long-term cause of saving money.   We need to do more against cancer, not less.  And paradoxical as it may seem, if we do more, we will not only save more lives, but we will ultimately spend less money.  Indeed, medical history tells us that only when we do more--that is, increase innovation and productivity--do we end up spending less.  That’s the lesson of polio from the 50s, of AIDS in the 80s and 90s, and of heart disease over the last half-century.  And it could be the lesson of breast cancer, too--but only if we take the same dynamic pro-science, pro-innovation approach.

Today, the FDA, echoing the thinking of the larger federal government, seems content to fight mere skirmishes in the war on cancer.   Yet absent any sort of strategy for victory, the casualty toll will continue to mount.   Last summer, at an FDA hearing in Washington, one woman, Priscilla Howard, declared, “Despite the potential side effects from Avastin, metastatic breast cancer has only one--death.” She added that Avastin had controlled her cancer for 32 months: “I want every available weapon in my arsenal as I fight this devastating disease.”  But now, thanks to the FDA’s action against Avastin, that arsenal has been depleted.  Indeed, it’s a safe bet that the future arsenal will be depleted even more; Uncle Sam has just sent a clear signal to researchers and developers: Don’t assume that the government is interested in financing future progress against cancer.  If you develop a new drug, the burden is all on you.  In addition, you will confront both implicit and explicit price controls.  

In fact, the FDA’s Avastin decision should be seen in the context of overall public policy in the last few decades, which can be summed up in three points:

First, the dominant healthcare policy elites, influenced by the environmental movement, have adopted a generally skeptical view of technological advancement in medicine.  Since the 60s, technology has been seen by many as a source of alienation, pollution, and even, in a metaphorical sense, mutilation.  In 1984, Dick Lamm--who had led the fight against the proposed Denver Olympics before going on to serve three terms as Colorado’s Democratic governor--struck an elitist chord when he applied the same limits-to-growth ethos to healthcare.  Older Americans should pass from the scene sooner, rather than later, he said, for the sake of future generations: “We’ve got a duty to die and get out of the way with all of our machines and artificial hearts and everything else like that and let the other society, our kids, build a reasonable life.”  With the conspicuous exception of the fight against AIDS--which was treated as an all-out war, thanks to the intervention of figures from the popular culture, as opposed to the policy culture--this go-slow approach has dominated the chattering classes.  Indeed, the Kaiser Family Foundation has noted this gulf between the elites and the masses; the elites want less healthcare as a matter of national policy, and the public, by contrast, wants more.

Second, policy makers see the need to control healthcare costs as a way of making national health insurance more acceptable and affordable.  To put it bluntly, if people die, that’s cheaper for the system, at least in the short run.   Such sentiments are rarely articulated in public, of course, but the public is nevertheless suspicious of what the elites are up to.   And so, for example, when a panel within the Obama Department of Health and Human Services put forth new and more restrictive guidelines calling for fewer mammograms, the public rose up and the new rules were withdrawn, although not before the “death panel” meme was born.   Interestingly, the same panel put forth similarly restrictive guidelines on prostate cancer screening, and those new rules have not been withdrawn--perhaps a  reminder that prostate-cancer-minded men are not as organized and energized as breast-cancer-minded women.  Meanwhile, the cost-controlling effect of the Independent Payment Advisory Board, part of the Affordable Care Act of 2010, remains to be seen.  But here’s a prediction: IPAB will be much more effective at controlling abstract costs, defined as future speculative research, than it will be at controlling tangible costs, defined as money flowing directly to patients and caregivers.  In other words, IPAB will impose “savings” in exactly the sort of research that could ultimately save lives.   In the past, the federal government has been good at making long-term investments, e.g., the railroads, aviation, and the Internet.  But in the current political environment, the healthcare imperative is for immediate savings--in time for the next fiscal year, or the next election.

Third, we now see the additional pressure of the “deficit hawks,” culminating in the so-called Super Committee, which has raised the static-analysis view of deficit-reduction to the pinnacle of national thinking.   Official Washington will be happy if there’s a deal in the next few days or weeks--any deal.  It’s not hard, of course, to find skeptics who believe that the spending restrictions will not be meaningful, but it would appear that the Establishment has settled on the idea that an agreement of some kind is desperately needed--if only, some might say, to save the same Establishment from losing face.  Yet if and when those possible spending caps are broken, it’s more likely that immediate costs--say, increasing payments to doctors or hospitals--will be accommodated, as opposed to longer-term research.  So once again, cancer researchers and developers are on notice; the real money will be in treating cancer, not in beating cancer.  And the same will hold true for other diseases, such as Alzheimer’s.   The care may ultimately cost more than the cure, but the feds are interested in paying only for the care.  And as always, we get what we pay for.

Back to Avastin: If the drug is used less, that’s a savings to the government, in the short run.    Yet as the population ages, diseases such as cancer--as well as other illnesses, such as Alzheimer’s--seem destined to become more prevalent, and the nation will have to bear the  expense.   So while the price-controlling approach to cancer research is likely to “work” in terms of restricting cancer drugs, it is ultimately doomed to fail as a means of controlling costs.   Caring for increasing numbers of sick people for long periods of time is costly--and those people, by the way, are voters.

So how can prices for healthcare be lowered?  The answer is the same for medicine as for everything else--improved productivity, getting more for less.  That’s been the secret of the Scientific Revolution over the last four centuries, and of the Industrial Revolution over the last three centuries.  As Adam Smith explained in The Wealth of Nations, developing a more efficient way to make something as simple as a pin could increase overall output by a factor of 240--an increase of roughly 24,000 percent.  Such gains have been routine over these hundreds of years, accounting for the material abundance that we enjoy today.  So it’s perverse that all the aforementioned policy elites are following a different policy path when it comes to medicine.  Instead of saying, “Push ahead, so that we can have more for less,” the elites have taken an anti-Smithian stand; they have taken a neo-Malthusian stand, arguing for rationing and scarcity.  And such neo-Malthusianism is the ultimate animating philosophy behind the FDA’s decision against Avastin.  If everybody “knows” that we need to cut back and make do with less, here is the FDA’s opportunity to be on “the right side of history.”

So the challenge for the rest of us is to rediscover Smith, and to reject Malthus yet again.  We must apply Smithian wisdom to the systematized research, and mass production, of medicine.  That is, we must take the time-tested scientific and industrial principles of growth and insist that they be applied to medicine.  And if we do that, Avastin would be seen in a new light.  The drug may or may not prove to be a great cancer treatment, but surely, at minimum, the use of the drug will save some lives, as well as help teach us about what works against cancer.  Edison didn’t get the lightbulb right the first time he tried, nor did Einstein develop the theory of relativity in the first draft.  The process of discovery can be lengthy--and expensive.  But as we have seen, the cost of non-discovery is even greater, and ultimately more expensive.

This further point--that we learn by doing, as millions of actors set in motion a Hayekian process of discovery that no bureaucrat could plan for, or account for--is worth pausing over, because it speaks to what saves lives in medicine.  

A powerful illustration of discovery in action comes from Harvard economist David Cutler, who describes the process by which heart disease has become vastly more survivable and vastly less expensive on a per-patient basis.   Cutler recalls that in 1955, President Dwight D. Eisenhower suffered a heart attack.  His doctors prescribed . . . bed rest.  That was the best they could do, even for the commander-in-chief, the leader of the free world.   The remedy was certainly low-cost, although for the leader of the free world, no expense would have been spared.   In fact, Cutler comments, the treatment Ike received was counter-productive: “We know today that bed rest is ineffective.  It does not prevent further heart damage, and it can lead to other complications, such as blood clots in the veins and lungs.”  In other words, the treatment for the heart attack was making the president’s condition worse.  Early failure is a familiar enough phenomenon in any scientific inquiry, and medicine is no exception.  So the challenge, therefore, is to keep pushing forward, figuring it out as one goes along.   Such problem-solving is the basic method of all science and all engineering.

By the 1970s, Cutler continues, open heart surgery had become common.  Such procedures were an improvement, albeit with huge drawbacks; any patient who spends time in a hospital runs the risk, for example, of nosocomial infections--that is, infections acquired in the hospital.  Such infections are estimated to occur in five percent of all acute-care hospital stays, causing perhaps 70,000 deaths a year.  But even as progress was being made on such surgeries, the development of alternatives continued: the first angioplasties were attempted in the 1970s, and stents followed in the years after.  Drugs emerged, too, such as statins.  Meanwhile, science became more aware of dietary and lifestyle issues as they affect heart disease, giving people new tools to help their own health and longevity.  In addition, that old medicine-chest standby, aspirin, was now seen in a new light.  So we can see that for many, the advance of science has led to some surprisingly simple and elegant solutions, based not on faith or superstition, but on a century of accumulated scientific wisdom.  When a basic problem is solved, it stays solved, at minimal cost; for example, for as long as people want to use the wheel, the wheel will work, and for as long as people wish to avoid rickets, Vitamin D will work.  And at the same time, we have developed the sort of heavy scientific machinery, including the pacemaker, that is keeping, for example, Dick Cheney alive.  The cumulative wisdom of simple solutions, together with complex solutions, has worked: As Cutler observes, heart disease is three-fourths more survivable than it was in Eisenhower’s time.  And that’s been a huge boost to our society and economy; unfortunately, the federal bean-counters have chosen not to notice, and so the positive-feedback impact of cures has never been factored into national budgeting.

So that means, unfortunately, that progressive scientific health solutions--as opposed to redistributive bureaucratic health semi-solutions--have never been taken seriously by the budget “experts.”  And so, absent that policy support, we haven’t made as much progress on some other diseases.   If the healthcare policy elites could forget their training and bring themselves to see medical progress as a fiscal winner, of course they would demand the sorts of changes in the legal and regulatory environment that would foster more and better medicine.  But at the rate we are going, they won’t change, and so the inhibitory environment won’t change.  

So the Avastin decision is a sign of the times, a part of the problem--and certainly not part of the solution.

This piece was cross-posted at the Manhattan Institute's Medical Progress Today. 

Monday, October 24, 2011

Helping the paralyzed to walk--can Uncle Sam compute the value of that?

A company called Ekso Bionics has developed a wearable, battery-powered exoskeleton that enables the paralyzed to walk; that's one Amanda Boxtel, above, walking for the first time in a long time.   A very heartening story--and it's not just a story, it's real.

No doubt such exoskeletons are expensive, although, of course, the price will fall if and when they are mass-produced and new competitors enter the market. 

Of course, if the US government were to buy such bionic devices for, say, America's wounded warriors coming back with crippling injuries from Iraq and Afghanistan, that expenditure would simply be measured as a cost by the Congressional Budget Office.    Given the tenor of the times, it's a safe bet that the bean-counters who rule Washington these days will say that we can't afford it.   Would the CBO even attempt to score the possibility that more wounded warriors would be able to go back to work, and be more productive?  And of course, the way things work in DC these days, if CBO doesn't score it, then it has not happened. 

Moreover, if we had built this exoskeleton industry here at home, then the cost of buying the equipment would be balanced by the jobs and profits that we would be generating here on the home front.


Sunday, October 9, 2011

More on the Serious Medicine Crash

Data from the California Healthcare Institute and the Boston Consulting Group provide yet another metric of the decay of healthcare investment in the US.  Note the steep falloff in the last three years.  These numbers are not adjusted for inflation.

H/T: Manhattan Institute's Medical Progress Today. 

Serious Medicine Crash update

According to the Medical Innovation & Competitiveness Coalition, a unit of the National Venture Capital Association, medical investment is dramatically falling off:

The survey found that U.S. venture capital firms have been decreasing their investment in biopharmaceutical and medical device companies over the past three years and expect to further curtail such investment in the future. Overall 39 percent of respondent firms have decreased their investments in   life sciences companies over the last three years and the same percentage expect to further decrease these investments over the next three years, some by greater than 30 percent. This is roughly twice the number of firms that have increased and/or expect to increase investment.

While 40 and 42 percent of firms expect to decrease investment in biopharmaceutical and medical device companies respectively, 42 and 54 percent expect to increase their investment in non-FDA regulated healthcare services and healthcare information technology companies respectively.

In another alarming sign, survey respondents expect to see significant investment decreases in companies fighting serious and highly prevalent conditions including cardiovascular disease, diabetes, obesity, cancer, and neurological diseases.

“More than 100 million Americans suffer from diseases for which there are still no cures, or even meaningful therapeutic options. To conquer disease and relieve suffering, we must have a medical innovation pipeline that is as strong and robust as possible,” said Margaret Anderson, executive director, FasterCures. “Bringing critical therapies to market requires venture capital investment to spur a thriving life sciences industry as well as having a regulatory system that’s appropriately resourced and equipped to ensure innovation is translated to better health.” 

H/T: Manhattan Institute's Medical Progress Today

Sunday, September 25, 2011

"The Super Committee Faces the Challenge of Knowing the Unknowable"

From the Manhattan Institute's Medical Progress Today blog

Serious Medicine Strategy is now part of the Manhattan Institute's Medical Progress Today Blog

Here's one of several pieces I have written for the new Manhattan Institute Medical Progress Today blog.  

Sunday, September 11, 2011

The return of fatalism: One time-tested way to save money on healthcare--embrace illness and death

Is disease a blessed event hastening our path to heaven?   Should we look beyond the pain and suffering and focus only on the end goal?  It's easy to mock this don't-worry-be-happy-just-die admonition from a 1799 religious pamphlet:

Let us learn a lesson from the seafaring man, then, and regard the bright side of even our afflictions. Instead of considering sicknesses and diseases to be only so many painful visitations, let us try to regard them, also, as so many different roads to the golden gates of heaven.


Few admit to thinking that way today, but it does seem as if, in our dismissal of science, we are strangely returning to that sort of fatalism.  Every civilization puts certain things in the ascendancy as opposed to other things: One civilization builds pyramids, another builds cathedrals, another builds grand boulevards, another builds McMansions.  In other words, each civilization makes a series of choices: What's important?  Monumentalism?  Clericalism?  Royalism?  Militarism?  Economic dynamism?  Another choice is science and scientific advance--either a culture celebrates, and fosters, scientific advance, or it doesn't.  And if a culture doesn't celebrate technological progress, then, in a dynamic world, it is likely not only to fail its own people, but also to be overcome by rivals.  That was the story of China relative to the West from 1500 to 1945 or so, and it could be the story, in reverse, of the 21st century and beyond.  On this 9-11 anniversary, we rightly pay tribute to those who were lost ten years ago today, but surely just as important is making sure that it doesn't happen again.  And such prevention requires active measures.  Passivity and fatalism are no answer--at least not a good answer.

And yet it sure seems that for the most part, we are on an anti-science course in America today.  Yes, we have plenty of schools and institutes with "science" on their nameplate, but oftentimes they seem at least as interested in politics, and political correctness, as in science.  And of course, while the larger culture is happy enough to get a next-generation smartphone, the larger culture also seems happy enough to assume that these wondrous mini-machines will be developed and produced by foreigners.  And of course, the climate of regulation and litigation sends an unmistakable message: anything made in the USA can be targeted by bureaucrats and trial lawyers, as politicians either cheer them on or watch passively.

Nowhere is this adversarial culture more evident than in the area of medical R&D.  As noted here at SMS, there's been a crash in Serious Medicine, which will have obvious and deleterious effects on all of us, and yet the political system has been clueless.

So today we spend money financing disease and its ravages, and yet we seem uninterested in intervening to stop the disease.  Out of the $2.6 trillion that the US spends on healthcare, only around $113 billion goes for medical R&D, and that category of R&D covers everything from cancer to botox.  In other words, as a percentage of our total spend, very little is directed toward urgent national problems, such as Alzheimer's.   It wasn't always thus--for a while, we focused on polio, and we beat it.  For a while, we focused on AIDS, and we beat it back, at least in the US.    But now, we seem content just to deal with the ravages of disease.  And yet ironically, this approach isn't cheap at all, because the epidemics we confront are not the quick death of the black plague, but rather the slow disability and death that come from chronic illnesses such as Alzheimer's Disease.    So we get the worst of both worlds: no cures and we spend a fortune. 

Except maybe for a lingering few Christian Scientists and maybe some Greens, we don't do this out of a religious or quasi-religious feeling, but merely out of inertia and sloppy thinking, backed up, of course, by some quiet players who gain money and power out of the status quo.  After all, plenty of current constituencies benefit from a definition of healthcare that focuses on retroactive finance, as opposed to proactive science--think nursing homes, financiers, and non-science-minded "experts" in "public policy." 

The French critic Georges Bernanos argued, “The worst, the most corrupting of lies are problems poorly stated.”  And so we see today: the problem of healthcare has been defined away from cures and defined instead as long-term care.  And so the issue becomes insurance of various kinds, and not science of any kind.  So we might as well embrace the fatalism of that 18th century pamphlet.

Hat tip: Marc Abrahams at Improbable.com 

Thursday, September 8, 2011

Frederic Bastiat, Call Your Office: What the Committee for a Responsible Federal Budget Would Like to See--and What Can Be Seen: Two Different Things

Under the headline “What We Hope to See From the Super Committee,” the Committee for a Responsible Federal Budget (CRFB) weighed in yesterday with its recommendations to the Joint Select Committee on Deficit Reduction (the Super Committee), which holds its first meeting on Capitol Hill today. The CRFB, of course, is perhaps the pre-eminent “deficit hawk” organization in Washington DC, and so its recommendations carry great weight among wonks, pundits, and, inevitably, politicians.


But what, exactly, is it recommending?  What would CRFB have the Super Committee do?  The September 7 document outlines five sets of recommendations, mostly aimed at reinforcing the determination and credibility of the Super Committee and, by extension, Congress.   But here we will examine just one recommendation: the admonition to “Go Long,” as in, long term.  As the CRFB document puts it, the Super Committee must take entitlement spending head-on, addressing “the long-term drivers”:

Any serious fiscal plan must address the long-term drivers of our growing debt. The Super Committee must enact serious reforms to Social Security, Medicare, Medicaid, and other federal health spending.


In other words, it’s all the major entitlements that must go under the budget knife.   Okay, fair enough: Entitlements account for almost three-fifths of federal spending, and so it makes sense to look into those budget categories for savings.   The CRFB document pushes hard in this direction, advocating an overhaul of spending--including federal health spending--well beyond the familiar ten-year time horizon for federal budgeting.  Indeed, the CRFB looks ahead a full four decades, all the way to 2050: 

Based on our projections, federal health and retirement spending is slated to grow substantially, from below 10 percent of GDP today to 12 percent by 2021, 15 percent by 2035, and 17 percent by 2050.  This is due both to population aging (largely because of the retirement of the baby boom population) and to rapid health care cost growth.


Okay, so a major increase in costs is foreseen. But let's ask ourselves: How do we really know what healthcare and retirement spending is going to be in 2050? What do we really know about the middle of the 21st century--that is, what things will be available, and how much they will cost? To be sure, part of the cost increase is relatively foreseeable, because of the aging of America; the over-65 population is projected to grow almost three times faster than the overall population, and the elderly will increase from about 12 percent of the population today to about 20 percent in 2050. And old people, to be sure, generally cost more to treat than young people. Moreover, for the most part, population increases and demographic shifts are relatively easy to project--although some forecasters, such as the notorious Malthusian Paul Ehrlich, author of The Population Bomb, have still managed to be grossly wrong.
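For readers curious how such a share projection is built, here is a hedged sketch; the growth rates are illustrative assumptions chosen only to be roughly consistent with the "three times faster" and 12-to-20-percent figures above, not official Census projections.

```python
# Hypothetical illustration of a population-share projection.
# Assumed rates (illustrative, not official): overall population +0.7%/yr,
# over-65 population growing about three times faster at +2.1%/yr.
overall_growth = 0.007
over65_growth = 0.021
years = 2050 - 2011            # 39 years of compounding

share_2011 = 0.12              # over-65 share of the population today (per the text)
share_2050 = share_2011 * ((1 + over65_growth) / (1 + overall_growth)) ** years
print(f"Projected over-65 share in 2050: {share_2050:.0%}")  # ~21%, near the ~20% cited above
```

The demographic piece, in other words, is simple compounding; it is the cost-per-person piece that no one can compound with any confidence.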

But how much do we know about the future costs of healthcare? Answer: not much more than we know about what life in 2050 will be like. A moment's reflection tells us that the year 2050 falls into the category of what the great free-market economist Frederic Bastiat called the “unseen,” as opposed to the “seen.” That is, some things just aren't knowable in advance. And to draw upon the wisdom of another free-market economist, Friedrich Hayek, it's a “fatal conceit” to think that anyone can plan that far ahead. Quick questions: What will a computer look like in 40 years? What will the Internet look like in 40 years? If we don't know the answers to those questions, we don't know what healthcare is going to cost. To gain perspective, we might think 40 years back to telephones and early computers: How have they changed since the early 70s? Answer: Thanks to the cost-crashing, productivity-exponentializing power of Moore's Law, they have changed in every imaginable way, and in ways that nobody back then could have imagined. So to the extent that computers and the Internet are now thoroughly woven into the fabric of everything we do, it's a safe bet that computers and the Net--or whatever they will have evolved into four decades hence--will have similarly transformed and retransformed medicine in the decades to come. And perhaps changes in healthcare will change the length of our productive worklives in some dramatic way as well. In other words, we have no idea what federal healthcare and retirement spending is going to be in 40 years.

Thus a lesson straight from Bastiat and Hayek: Don’t try to predict things that can’t be predicted.  The future--to borrow the distinction made by RAND national security expert Gregory Treverton--is not a puzzle, it is a mystery.  Puzzles, of course, can ultimately be solved by piecing information together--and yet mysteries are, well, mysterious.  In this world, anyone who says he can solve a mystery by examining the pieces of a puzzle is, at best, wrong, and, at worst, dangerous. 

If we continue with our current healthcare strategy--which can be defined as decreasing amounts of new technology, plus increasing amounts of labor and financial inputs--then we will, indeed, get a costly future such as CRFB projects.    Today, for example, nearly six million Americans suffer from Alzheimer’s Disease (AD), and that number is expected to quadruple in the next four decades.  Once again, projections about the future might be suspect, but those concerning the health consequences of an aging population are easy enough to foresee, especially in the absence of any dramatic scientific intervention.  And as of today, we have no proven effective treatment for AD.  Zero.   As a result, AD care is not only labor-intensive--nursing home care for increasing armies of incapacitated dementia victims--but it also meets the definition of “futile care.”   And yet the idea of attacking the true cause of rising AD costs--as opposed to lamentations about those costs in the future--is unmentioned by CRFB.  Indeed, the idea of seeking a cure for AD is essentially ignored by the entire category of Beltway “budget experts.”  The economists and lawyers and talking-point-writers who dominate the DC debate seem oblivious to scientific transformation as an alternative budget strategy.  Three critical words vital to the self-esteem of any bureaucracy are “not invented here.”   That is, if we didn’t think of it, it can’t be worth considering.  

So we might ask: How, under the current medical-technological regime, are we going to save money on AD?   Will we simply reduce the nursing care for dementia victims?   As always, the affluent will be able to buy their way out of personal neglect--even if they have failed to buy their way out of the disease, thus demonstrating an ultimate grim equality of result.   But what about those who depend on Medicare and Medicaid?  What will happen to them?  What are the horror stories to come?   Moreover, how will those people vote?  The brave talk of inside-the-Beltway lobbying groups and legislative bodies doesn’t hold up well against popular passion expressed in the streets and at the ballot box.   That was the story of the federal government’s short-lived catastrophic health insurance program in the 80s,  of Clintoncare in the 90s, and of Obamacare in the last three years.   In other words, Members of Congress who vote for the sort of cuts that CRFB is advocating are likely to be rewarded by opinion-leaders inside the Beltway--and punished by voters outside the Beltway.   That is, elected officials can become un-elected officials and then, as a consolation prize, get a good seat at the Gridiron Dinner. 

Yet there is another path, completely ignored by CRFB, and that is the path of medical progress. The words “medicine,” “research,” and “cures” do not appear at all in the CRFB document. And yet it is only through medical progress that genuine medical transformation can occur. Profound transformation is achieved by visionaries and scientists, not by financiers and bean-counters.

It’s worked that way in the past.  Let’s take polio as an example.  Back in the early 1950s, economists calculated that the polio epidemic, then raging, would cost the US economy $100 billion a year by the year 2000 (more like $1 trillion in today’s dollars).  Yet instead of accepting the basic premise of that projection--that the polio epidemic would continue forever--we changed the basic premise.  That is, we developed the polio vaccine.   And so, instead, our expenses for polio are essentially zero.  And what’s the smart way to think about that kind of budgeting?   Going back to that projection from the 1950s, deficit hawks might have said that $100 billion a year is too much money.  So should they have said $90 billion?  Or, even hawkier, $50 billion?  The fact is, if polio were still engulfing us today, such reductions would be politically disastrous; voters would show their fury at the ballot box.  
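As a rough check on that dollar conversion, here is a minimal sketch; the ninefold price multiplier from the early 1950s to today is an assumption used only for illustration, not an official inflation series.

```python
# Rough dollar conversion behind the polio example above.
# Assumption (illustrative only): consumer prices rose roughly ninefold
# between the early 1950s and today.
projected_cost_1950s_dollars_bn = 100   # $100 billion a year, as projected then
assumed_price_multiplier = 9            # ~9x, early 1950s -> today (assumption)

cost_today_bn = projected_cost_1950s_dollars_bn * assumed_price_multiplier
print(f"Roughly ${cost_today_bn} billion a year in today's dollars")
# i.e., on the order of the $1 trillion figure cited above
```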

Meanwhile, Alzheimer’s today is costing the US economy $172 billion a year, according to the Alzheimer’s Association.    And the cumulative cost is headed up to $20 trillion by 2050.   Deficit hawks might say those dollar figures should be reduced by 10, 20, maybe even 50 percent or more; that’s what deficit hawks do.      But there’s still that nagging issue of “how?”  As we have seen, AD care is expensive; indeed, labor-intensive healthcare runs into the iron logic of Baumol’s Law--if it’s labor-intensive, it will be expensive.   In other words, it’s one thing to declare that eldercare should be X-percent cheaper in the future, it’s another thing to achieve those savings.    As the news from the UK reminds us, the obvious solution is rationing--rationing that costs lives.   And from rationing, it’s not hard to get to even more draconian cost-saving solutions.   As always, it’s easier to envision these solutions inside a marble palace of planning, as opposed to at an actual patient’s  bedside.
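To see how an annual cost of $172 billion can cumulate to something on the order of $20 trillion by 2050, consider this hedged compounding sketch; the 5 percent annual growth rate is purely an assumption chosen to illustrate the arithmetic, not the Alzheimer's Association's own model.

```python
# Illustrative compounding: how ~$172 billion a year can cumulate to ~$20 trillion.
# Assumption: annual Alzheimer's costs grow ~5% per year (illustrative only).
annual_cost_bn = 172.0   # 2011 annual cost, per the Alzheimer's Association figure above
growth_rate = 0.05       # assumed annual growth rate

years = 2050 - 2011 + 1  # 2011 through 2050 inclusive
cumulative_bn = sum(annual_cost_bn * (1 + growth_rate) ** t for t in range(years))
print(f"Cumulative cost, 2011 through 2050: ~${cumulative_bn / 1000:.0f} trillion")  # ~$21T
```

The point of the sketch is not the precise growth rate; it is that once the annual figure compounds for four decades, trimming it by 10 or 20 percent at the margin barely dents the cumulative bill.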

Interestingly, CRFB seems to have grappled with this issue--the issue that its proffered solution is unpopular with the American people; the polls, after all, show that by 3:1 or 4:1 margins, people don’t want to see cuts in Medicare.   So perhaps in anticipation of a likely political backlash, CRFB is telling the Super Committee that it’s okay to “backload” the cuts--that is, have them come beyond the ten-year time horizon.   Super Committee members are told, in other words, that it’s okay to make negligible cuts in the near term, as long as big cuts are made in the long term:  

To reassure markets and put our budget on a sustainable path over the long-term, the Super Committee must therefore address the growth of the nation's largest entitlement programs, and give priority to those reforms with the potential to slow long-term growth paths (even if they do not have significant scoreable savings this decade). Reforms to Social Security, Medicare, and Medicaid are central to improving the long-term imbalances. [underlining in original]


Okay, so in other words, the CRFB is telling the Super Committee that it’s all right to make small cuts in the near term, over the next decade, as long as the big cuts come later.   But we might ask: Isn’t that the essence of kicking the can down the road--that is, telling Super Committee members, and Congress as a whole, that the big cuts need come only many elections hence, in 2021 or thereafter?   And what economic problem does that solve?  Aren’t we in a crisis now? 

What could be the reason for this budget-fudging?   One possibility is that CRFB accepts the Keynesian argument that immediate cuts in spending--and thus in consumer demand--might damage the economy.  Another possibility is that entitlement cuts in the near term are just too painful politically; that is, if such cuts are on the menu where the voters can see them, the whole project collapses as the politicians flee.   

Yet we might ask: If the work of CRFB is not about cuts anytime soon, then what is it about, exactly?   A third possible explanation is that the CRFB simply wants a deal.  That is to say, CRFB wants something that it can call a success.  Or, to be even more cynical about it, with apologies to the late Sen. George Aiken, it wants something that can be dubbed a “victory,” no matter how transparently risible that “victory” might be.   That is, something that proves that the American Establishment can still do something--anything.   Establishmentarians always like to emphasize their credibility, their seeming competency.  And the CRFB is the epitome of an operationally conservative establishment--not ideologically right-wing, just eager to tamp things back down to something that looks like normalcy.  

So back to CRFB's recommendations to the Super Committee. And if those numbers are built on bad assumptions? Well, that's a problem to be addressed at some later time. The important thing, it seems, is that we have a deal in our time.

The problem for the rest of us, though, is that the problems we face are not normal--and neither are the solutions. Science, and scientific progress, are inherently disruptive, and if the political Establishment can't handle that truth, well then, we need a better Establishment. Indeed, for reasons that C. P. Snow outlined a half-century ago, our contemporary political and cultural institutions can barely handle science and what it brings. And so the Establishment just pretends that science doesn't exist. But as Galileo said under his breath at the Inquisition, eppur si muove. That is, just because the reigning orthodoxy says that something isn't happening, that doesn't mean it isn't happening. As they say, if we don't deal in reality, other people will. Reality is a stubborn thing. Indeed, the only way the Establishment can possibly make its straight-line projections work is by squashing science--and we know how that ends.

Then and now, politics, and the desire for orthodoxy, have trumped looming reality. The obvious reality is that if we don't do something radical about AD and other costly diseases, none of these budget deals is going to work out as the American people might hope. Either we will spend vastly more than the “deficit hawks” wish, or we will move toward rationing--or worse.

Today we see huge edifices of political thought being built on projections about the budget deficit in 2021, 2050, or even later. There's nothing wrong with predicting the future--so long as there's an adequate amount of humility in the predictions. But the bold predictions of CRFB as to what will happen in the middle of this century are based on a kind of know-nothing hubris--the hubris that pretends to knowledge about what is unseen and mysterious. And that is, indeed, a fatal conceit.

Governor Rick Perry takes on cancer at the Reagan Library presidential debate

From the Politico/NBC Republican presidential debate last night: 

PERRY: But here's the facts of that issue. There was an opt-out in that piece of -- it wasn't legislation. It was an executive order.

I hate cancer. We passed a $3 billion cancer initiative that same legislative session of which we're trying to find over the next 10 years cures to cancers. Cervical cancer is caused by HPV. We wanted to bring that to the attention of these thousands of -- of -- of -- tens of thousands of young people in our state. We allowed for an opt-out.

I don't know what's more strong for parental rights than having that opt-out. There's a long list of diseases that cost our state and cost our country. It was on that list.

Now, did we handle it right? Should we have talked to the legislature first before we did it? Probably so. But at the end of the day, I will always err on the side of saving lives.

 
Rick Perry was responding to questions about his controversial 2007 executive order to vaccinate girls in Texas with Gardasil, as a way of warding off the human papilloma virus that can cause cervical cancer.   Obviously anything to do with sex--especially teen- and pre-teen sex--gets into touchy issues of family and family values and parental rights, but equally obviously, vaccinations as a whole are a good idea.   Think polio vaccine, for example, or, in an earlier era, the smallpox vaccine.   Perry has since expressed regret for the way that the Gardasil issue was handled, but the larger story of how Texas is fighting cancer is quite interesting--and quite compelling.    


And fighting cancer in Texas--or anywhere--isn't just a good thing for compassionate health reasons; it's also a job-creator. The Texas Medical Center in Houston, for example, employs 61,000 people. Statewide, and nationwide, medicine is a much bigger industry--one of the most dynamic industries we have.

Tuesday, September 6, 2011

What Happens When “The doctor will see you now” Is Replaced by “Show me the money”?

A fascinating article appeared in Monday’s New York Times, headlined, “Adjusting, More M.D.’s Add M.B.A.”  That is, medical doctors are now getting master’s degrees in business administration.  As the Times explains:

As recently as the late 1990s, there were only five or six joint M.D./M.B.A degree programs at the nation’s universities, said Dr. Maria Y. Chandler, a pediatrician with an M.B.A. who is an associate clinical professor in the medical and business schools at the University of California, Irvine. “Now there are 65,” she said.


Mark V. Pauly, a longtime leader of the health care management program at the Wharton School at the University of Pennsylvania, said, “A light bulb went off and they realize that health care is a business.”


And so doctors are learning business as well. As Penn's Mark Pauly says in the Times article, “healthcare is a business.” That point is worth pausing over: medicine as a business. Some cynics will say that this is nothing new, that doctors have always been in it for themselves, but as the Times story makes clear, a genuinely new businesslike trend is in the offing. So how should we regard this trend? What does it mean for doctors--and for the rest of us?

On the one hand, it's hard to blame doctors for shifting toward a greater focus on businesslike profit-maximization. Why? Because while doctors are admired by the general public, they are not respected by powerful blocs in society. A 2010 Gallup poll shows that health professionals (nurses, pharmacists, and doctors--but hereafter, for simplicity's sake, we'll just say “doctors”) finish at or near the top in the public's rankings of the most admired professions. Yet at the same time, doctors are very much disrespected by big financial players in society, including lawyers, insurance companies, and governments. And so doctors must weigh the specific negative power of those antagonistic blocs against the general esteem of the public--and today, the power blocs have the edge. That is, the real muscle in our society belongs to lawyers bent on suing doctors, as well as medical drug- and device-makers. And other segments of society, including insurance companies and governments at all levels, are seemingly guided by a sole objective: to control and cut costs, no matter what the other consequences might be. In such an environment, perhaps it makes sense for doctors to muscle up financially in self-defense. Okay, so that's why doctors are getting MBAs.

But on the other hand, what about the rest of us and our health? What about the public interest? Yes, it's a free country, but are we as patients better off if doctors take time away from medicine to focus on business? That is, if doctors become so focused on making money that they take two years to get a formal MBA? And more importantly: If doctors become doctor-businesspersons, is that really good for our national health? Might we be better off, instead, if we could think of ways to incentivize doctors to put more energy into medicine and the healing arts?

One thing is sure: If the doctor-business melding trend continues, big things in our society will change.  First of all, it means an inevitable downshift in the public esteem of doctors.   One ultimate source of medical prestige is the feeling that doctors are motivated by at least a modicum of altruism.  It’s at least a little bit of altruism, people think, that inspires doctors to put themselves through the rigors of strenuous education in order to help others live better and longer.  The public realizes, however vaguely, that the Hippocratic Oath, composed some 2500 years ago, is still the guiding medical-ethical document for the profession.  The modern American version, for example, includes these idealistic lines:

I will prevent disease whenever I can, for prevention is preferable to cure.


I will remember that I remain a member of society, with special obligations to all my fellow human beings, those sound of mind and body as well as the infirm.


If I do not violate this oath, may I enjoy life and art, respected while I live and remembered with affection thereafter. May I always act so as to preserve the finest traditions of my calling and may I long experience the joy of healing those who seek my help.


Obviously not all doctors have lived up to these high standards, but just as obviously, most doctors have--that's why doctors are so respected. Moreover, the good that the medical profession, and medicine itself, has done is undeniable; over the last two centuries in the US, for instance, life expectancy has more than doubled, while the infant mortality rate has plummeted by some 98 percent.

Yet what might happen if doctors just became another species of businessmen?  What happens if the Hippocratic Oath is supplanted by profit-and-loss statements?

One early clue comes from those same Gallup rankings, which showed that the honesty and ethics of doctors are rated “very high/high” by 66 percent of the population, while business executives are so rated by just 15 percent.   So if doctors turn into just another category of businesspeople, it’s logical to assume that their prestige will drift down toward the general level of businesspeople.  

More urgently for most of us, what’s the health impact of purely businesslike doctors?   What happens when you go to such a doctor?   Will he or she want what’s best for you, the patient?  Or what’s best for his/her bottom line?   There’s a big difference.  As noted, the Hippocratic Oath stipulates, “I will prevent disease whenever I can, for prevention is preferable to cure.”   But of course, from a doctor’s point of view, cure, or attempted cure, is a better financial bet than prevention.  From a purely financial utilitarian point of view, separated from any ethical framework, it makes perfect sense not to tell a fair-skinned patient about the danger of too much sun--and then treat the patient, a few years or decades later, for skin cancer. 

Once again, despite the Hippocratic Oath and all the other medical canons, we have seen plenty of profiteering doctors who eagerly order unnecessary and duplicative tests and even superfluous operations.   Still, so far at least, we can say that these are rotten apples in the professional barrel.  Yet what will happen if and when medicine becomes fully “incorporated” and the new goal for doctors becomes hitting revenue targets?   If this were to happen, the most obvious consequence is that the prestige of the medical profession would plummet.

Some doctors, to be sure, might say, “Fine. Keep your prestige. We'll take the money.” Yet meanwhile, for every doctor who reaps the monetary benefit of an MBA, there are another hundred doctors who have been proletarianized--that is, turned into overworked and underpaid wage-slaves--by Medicare or the insurance companies. That is, proletarianized doctors are put to work on the medical equivalent of an assembly line, told what to do by a bureaucracy, told what to prescribe and how fast to do it. Yet either way--whether doctors learn how to make more money as corporate operators or simply accept becoming salaried employees of a public or private bureaucracy--the uniqueness of their profession will be lost. And that will be a huge loss to the commonweal.

Once again, those words: “healthcare is a business.”  As a matter of national policy, is this really what we want?   Do we want to eliminate the elevating aura of professionalism and move to total bottom-line-ism? 

We might learn the fate that doctors could face from the precedent of lawyers. Once upon a time, lawyers were seen as a self-regulating guild, in which private appetites were subordinated to what we would now call “the public interest.” In England, mother country of American law, new barristers would be “called to the bar” and then spend their careers as members of an “inn of court”--a combination of insider's club, workplace, and classroom for continuing legal education. The guiding idea was that senior barristers would oversee the proper development of the legal profession, providing discipline and sanction as needed. And such supervision was indeed needed, because, after all, lawyers have extraordinary power--most obviously, the power to make or break individuals and firms with lawsuits. And so the masters of the bar prohibited, for example, champerty--the bankrolling of someone else's lawsuit in exchange for a share of the winnings. Were these long-ago lawyers motivated by idealism? Not necessarily; they might have made the practical calculation that if they didn't police their own profession, others would do it for them.

In the US, bar associations attempted to fulfill the same policing and oversight function.  And so, for example, the American Bar Association forbade lawyers from advertising, beyond the use of business cards. 

Those rules were overturned in the 1970s on free-speech grounds, in keeping with a larger sense that the US economy needed to be deregulated.  Critics of the old system were right to criticize it as cliquish and self-protecting, but they were wrong to think that the new wide-open system would be an improvement.  Today, the legal profession has been entrepreneurialized; one consequence has been the rise of John Edwards-type legal buccaneers--lawyers becoming millionaires and even billionaires.   Are we really better off being a nation of all lawsuits, all the time?   Americans curious as to the cumulative harmful impact of this litigation might take a look at Walter Olson’s website, Overlawyered.com.

Another ill consequence comes to our health in the form of those ubiquitous television spots, trolling for clients, featuring phone numbers such as 1-800-BAD DRUG. In other words, legal predators are now free to seek out plaintiffs to sue medical providers for anything and everything. So now we see what happens when a profession is fully and totally commercialized, even as it maintains its coercive power; now, more than ever, the power to sue is the power to destroy. In today's legal environment, ethical canons and traditions of restraint have melted away in the white heat of publicity- and profit-seeking litigation. The cost to doctors is bad enough, in terms of malpractice insurance, but the cost to patients has been infinitely greater, in terms of damage done to medical research and development. As noted at SeriousMedicineStrategy.org in the past, the real story of the past two decades has been the precipitous decline in the number of new drugs and medical devices coming to market, as well as the wipeout of the medical venture-capital sector.

So over the last few decades, lawyers have been de-regulated and de-professionalized, and the result is that the profession has been enriched, even as its ethical prestige has been degraded.   So now we could go further and ask: What would happen if other professions were similarly de-professionalized and profit-maximized?  What would happen to the police?  Or to the military?  Or to the courts?   To be sure, there are libertarian theorists who think such privatization would be a great idea, but mercifully, not too many Americans agree with them.  

For most of us, it makes sense to see our society as a series of sectors, each with its own sectoral  rules, under the overall umbrella of the Constitution.  Public officials are supposed to operate according to one set of rules.  So are the clergy.  And families, too, have some unique rights.  This is the essence of pluralism, and it is also the essence of common sense.   Edmund Burke called these different groups “little platoons,” while Peter Berger and Richard John Neuhaus called them “mediating structures.”  It’s impossible to imagine society functioning without these legal and traditional privileges.   

Yet at the same time, untrammeled market forces, unmodified by the morality or ethics of non-market entities, are a threat to each of these little platoons and mediating structures.   And that’s why we should worry about what is happening to the medical profession--what doctors are doing to themselves, and what they would, in turn, do to us.  

In recent decades, we can note, advocates of pure market forces have gained ascendancy in business thinking.  One oft-heard argument is that the greatest goal of a corporation should be to maximize “shareholder value,” and therefore all other corporate goals should be subordinated to that prime objective.   And so, for example, the interests of corporate stakeholders, as opposed to shareholders, are given short shrift as a matter of policy and ideology.  Such a view may have its place for a company making widgets, but if the same value-system migrates into a medical office, trouble will ensue.   

It's not hard, for example, to foretell that corporatized doctors, schooled in the new verities of corporate methods, will see the Hippocratic Oath as less and less of a hindrance to their pursuit of profit through any possible avenue. And so just as “innovative” financing schemes became the bane of the financial markets in recent years, so new-style doctors could find “innovative” ways to extract money not only out of patients, but also out of society. Organ sales from willing donors--or unwilling donors? Any number of Coma-like scenarios? So then we will get the worst of both worlds: greedy doctors who do a bad job for patients, while costing the individual and the country even more money.

Some will say, of course, that there’s no alternative, no turning back.  The forces of modernity--from global competition to the Internet to the rising health consciousness of the citizenry--are shaking up the medical profession.  And so we must go forward, we are told, into the further  transformation of the medical profession.   

It is indeed true that we can’t go home again to the idea of a country doctor.    But we don’t have to give up on the traditional mores of medicine.  We can keep the best of our sacred medical tradition and yet also reap the best of what science has to offer.  How?  We will explore that in the next installment.