Tuesday, March 29, 2011

"Evan Bayh Admits Health Care Bill Does Not Stop Rising Health Costs"


Mediaite's Alex Alvarez catches this damning admission from former Indiana Senator Evan Bayh: Obamacare doesn't do anything to control healthcare costs:

The real issue that was not addressed, Laura, that you’ve raised now, and I think appropriately, is the cost, the cost to both the government and to your listeners. We need to take steps now to get the costs of health care under control. That was not dealt with really in an aggressive way in this legislation. I think it now needs to be.

Interesting that Bayh voted for the bill that passed last year.  And yet now he says, present and future tense, that "we need to take steps now." 

So when will we start controlling healthcare costs? Answer: When we realize that the intellectual model that we have been following--rationing not only care, but also, in effect, rationing medicine and medical research--is a non-starter. Yes, Obamacare promised to do just that, but, well, that's politics.

The reality is that restricting payments for the treatment of ill people is not politically acceptable. By contrast, restricting the flow of capital to medical research is, unfortunately, possible, because the public doesn't quite know what's going on. But such a restriction makes healthcare even more expensive, because it's cheaper to vaccinate than to treat, and it's cheaper to cure than to care.

Unfortunately, few Americans understand the nature of the drug and device "pipeline." As the economist Bastiat said, there's the "seen" and the "unseen" in political economy, and long-term medical research is mostly unseen.

The opposite model, of course, is Serious Medicine, which argues that when people are healthier, they are less expensive to care for, and they can work longer and more productively.


Sunday, March 27, 2011

The US Government neglects a once-great industrial sector--pharma. Where, exactly, are the jobs we need going to come from? To say nothing of the cures?

The Export-Import Bank of the United States is holding a conference in Washington DC this week, and what's notable about the agenda is the absence of any representation of the pharma/medical device sectors on the list of speakers.   One would surmise, therefore, that the medical sector just isn't that important to export-minded economic planners.   Instead the speakers are heavy with diplomats, reps from tech companies, from construction companies--and, of course, well-connected DC-based lobbyists.

Indeed, it's easy to see why the pharma sector was overlooked. In 2009, according to the International Trade Administration's Office of Health and Consumer Goods, pharmaceutical exports from the US were about $46 billion, while imports were $82.5 billion--leaving the US with a negative pharma trade balance of roughly $36.5 billion. Not so long ago, the US pharma sector was a huge net exporter, but decades of political attacks on the pharma industry, combined with restrictive legislation, have taken their toll. This decline in US pharma competitiveness is measurable and quantifiable, as Michael Milken noted recently.
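The arithmetic behind that trade gap is easy to check; here is a minimal sketch, using the ITA figures cited above:

```python
# US pharma trade balance, 2009 (figures from the International Trade
# Administration's Office of Health and Consumer Goods, as cited above)
exports_bn = 46.0   # US pharmaceutical exports, $ billions
imports_bn = 82.5   # US pharmaceutical imports, $ billions

balance_bn = exports_bn - imports_bn
print(f"Pharma trade balance: {balance_bn:+.1f} billion USD")
# A negative balance means the US imports more medicine than it exports.
```

The stated figures imply a deficit of about $36.5 billion.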

We should worry about the decline of any US industry, of course, but surely we should worry in particular about the decline of the medical industry, for the simple reason that medicine is a key variable as to whether or not we keep our health. Admittedly, the rest of the world is making medicine, but overseas firms are making mostly generics--copies of extant medicines. In other words, the rest of the world is mostly not innovating; it is free-riding, and that's why we are suffering a Serious Medicine Crash. In the long run, we will have to find a way to recapitalize the medical sector, and we probably will, because demand for medicine is always strong. That renascent pharma sector just might not be in the US, that's all.

In the words of Serious Medicine strategist Jeremy Shane:

One of the fastest ways to reduce costs for American consumers and increase use of American technology would be to export the fruits of our drug research and diagnostics.  The larger the population to which new drugs or genetic or diagnostic tests are distributed, the faster we will learn which drugs work best for which groups of people, and the greater the opportunity for US-based innovators to realize value from great science and US based manufacturing of new therapies.

But none of these economic opportunities seems to be of interest to the Obama administration. Having enacted a new healthcare finance scheme at enormous fiscal and political cost, the Obamans don't seem interested in revisiting any facet of the healthcare issue. Indeed, to the extent that the drying up of the drug pipeline means that government agencies spend less on healthcare--no matter what the long-term consequence to public health--the Obamans might even be happy to see the medical sector wither away. That's speculative, of course, but we know from the Ex-Im speakers' roster that nobody in Washington is doing anything to change the faltering status quo.


Improving Alzheimer's Treatment: The Key to Solving the Retirement/Entitlement Crisis


A sobering article in Newsmax about what Americans expect to see in their own retirement.  Under the headline "Obama's Fiscal Policies Doom Retirement for Millions," writer Chris Gonsalves cites data from the Employee Benefits Research Institute, showing how Americans fear that they won’t have enough money for their retirement and, as an understandable consequence, are planning to work longer.  
It’s perfectly understandable that Americans are pessimistic about the traditional retire-at-65 scenario, and are resigned to working longer. But from a public policy point of view, all of us could help if we could create a political constituency in favor of a cure--or even a significant improvement in treatment--for diseases that afflict the elderly. Arthritis is one such malady, as are COPD and diabetes. But perhaps the most feared of all is Alzheimer’s Disease (AD).

It’s AD, among other illnesses, that could hobble aging Americans, preventing them from working the jobs they will likely need to be working in the decades ahead.   

According to the EBRI report, a rising percentage of Americans are fearful that the retirement income they need won’t be there for them: 

27 percent of American workers are not confident they'll have enough money to retire and live well. That's up 5 percent from last year, and marks the highest level of unease ever measured in the 21 years of the survey by EBRI, a Washington-based nonprofit research firm focused on health, savings, and retirement issues.  Only 13 percent of the 1,258 adults surveyed in January say they are very confident about their retirement outlook, tied for the all-time low set in 2009.

And so as a result, people are planning on working longer: 

Fully 36 percent of Americans expect to be working after age 65. That's up from 11 percent in 1991 and 25 percent in 2006. Nearly three quarters of workers, 74 percent, say they expect to have to work after they retire to make ends meet. Currently fewer than 23 percent of retirees report working for pay.

The EBRI report focuses on the need for seniors, and future seniors, to save more money now.  But we might further note that these numbers underscore the reality that, as a practical political matter, nobody in Washington is going to be slashing away at entitlement spending, no matter what the elite deficit commissions might wish.    If Americans are fearful of their retirement security, and the retirement security of loved ones, they’re not going to agree to cuts in such spending--the polls show that overwhelmingly.

And yet if we step back, we can see that the entitlement crisis is really a retirement crisis. That is, people need money on which to live. If they can depend on the government, that's not so bad, so long as the government has the money. But if they can't depend on the government, at least not as soon as they thought, well, they had better be healthy enough to work.

To be sure, if America were to confront a genuine Greece-style crisis, everything would be on the table, and everything would be subject to cuts. But for now, so long as Washington can find money to pay for Afghanistan, Libya, NPR, and wasteful education programs, neither senior citizens nor their advocacy groups are going to agree to any kind of cuts. So if the policy elites wish to continue their anti-deficit campaign, it would behoove them to heed the wisdom of, for example, Sandra Day O’Connor and Maria Shriver, both of whom wrote last fall that we are spending hundreds of billions, headed toward trillions, on AD, while spending virtually nothing on a cure. The real cost savings for Medicare, they added, will come from successfully treating AD. And of course, as noted here at SMS in the past, if we could forestall AD, we could then talk about raising the retirement age for entitlement benefits. And that would eliminate much of the deficit/debt overhang.

Oh and by the way, if we could develop an AD treatment, we would have developed a new export industry, because an enriching and aging world wants the same medicines that we want.  

But alas, few in the political/policy world seem to be talking about a cure strategy. They would rather argue about financialist ideology. Such debates can be important, but the problem staring us in the face is medical, not financial. And so the gloom deepens.

Friday, March 18, 2011

"The Pro-Diabetes Board" -- a reminder of the ultimate cost of shortsighted penny-wise, pound-foolish approaches to health spending

A powerful editorial in The Wall Street Journal this morning, "The Pro-Diabetes Board," details efforts by Washington State to use comparative effectiveness review (CER) methods to clamp down on Medicaid and other state spending by de-funding glucose self-monitoring tools, such as finger sticks. As the Journal editorial notes, self-monitoring is long established as the best way to deal with diabetes, but now, in this new Obamacare era, the rationers are at work--motivated in part by a desire to save money and in part, we can say, by an ideological desire to shrink the healthcare sector, just as the President says we must. In their zeal to "bend the curve," they are ignoring established medical best practice.

As the Journal notes, this "Scarcitarian" approach to diabetes is likely to travel from Washington State to Washington DC: 

Which brings us from Washington state to Washington, D.C. The Health Technology Assessment program's director, Leah Hole-Curry, was appointed last year as a governor of the comparative effectiveness board established by ObamaCare. The national board is known as the Patient-Centered Outcomes Research Institute, yet at an early meeting in November, Ms. Hole-Curry and the other 14 governors debated whether or not patients were the institute's "primary constituents."

Now this agenda is on autopilot. The institute is built on self-executing funding—that is, not subject to annual appropriations like other federal programs—and dedicated taxes on insurers. At the very least Americans deserve some honesty about who these people are and what they favor.

Former Arkansas governor Mike Huckabee has been attacked for questioning CER in his recent book, but here we see the way the rationing process works out in practice, just as Huckabee said. We might pause to note that the idea of CER is perfectly valid, because more information is always better than less information.

CER is valid, that is, with two caveats: First, it has to make room for personalized medicine (and personalized medicine will not flourish till the trial lawyers are pushed out of the way, as discussed here many times) and second, CER has to be managed by experts that people trust.   The "death panel" allegation struck a nerve because there's just too much evidence, here and around the world, that nationalized health systems end up pushing toward euthanasia.  And so the public simply will not have trust if they get the feeling that those running the healthcare system do not have their best interests at heart.  

We might further note, by the way, the ultimate futility of this Scarcitarian approach to cost-saving.  Washington State is not going to save money, long term, with this CER-ish method, and neither is Washington DC.  Why?  Because people are still sick, whether or not they are being properly treated.  Indeed, if they aren't treated, they will get sicker faster.  It is shortsighted, penny-wise and pound foolish, because diabetes needs to be treated properly from the beginning, and good treatment includes good monitoring.

The cynical "best" that can be said about such an approach is that it saves money for the incumbent regime.   If Washington State, or Washington DC spend less money on healthcare this year or next, that's good news for whoever's in charge.  That is, he or she can say, "Hey, I saved money on my watch!"

Never mind, of course, that as diabetes worsens, it gets more expensive: nerve damage and intense pain, amputated limbs, and, of course, kidney failure resulting in dialysis, which costs an average of $77,000 a year per patient. Of course, from Washington State's point of view, there might be a kind of sneaky fiscal logic at work, because dialysis is a federal program. So if patients get sick enough, they become a federal burden. That's a federal entitlement, in place since 1972, with no sign of it going away, even if we wanted it to.

And thus a key Serious Medicine Point: We have had universal coverage, long before the enactment of Obamacare.   It's just that the way we provide that coverage, at present, is fundamentally incompetent, because we mostly focus on healthcare finance, as opposed to actual health itself.  Sick people are more expensive than healthy people.  So by managing national health badly, we spend more.  It's almost as if our healthcare system were designed by the nursing home industry.

Gary Puckrein, president of the National Minority Quality Forum, makes the further point that four in ten diabetics are admitted to a hospital every year--that's expensive, and inevitably costs localities and states money, as well as the feds.    Indeed, overall, diabetes costs the US some $174 billion a year, according to the American Diabetes Association.

So if we really want to save money on diabetes, we would make the hardheaded calculation that it's time to do something preemptive about diabetes--like curing it. But that's not a thought that the Obama administration, locked into its retroactive fiscal model, seems to be contemplating.

Friday, March 11, 2011

An American Center for Cures in Los Angeles

Intriguing article in the Los Angeles Business Journal by Dr. Robert Kotler, arguing that Los Angeles should be the home of the American Center for Cures (ACC).  The idea is to establish one place to be a hub for not just medical research, but medicine development.   As in, consciously turning research into cures for people, the way that the Defense Advanced Research Projects Agency funds specific ideas and inventions for the US military.

The ACC is the brainchild of Chicago businessman Lou Weisbach and Miami urologist Dr. Rick Boxer.

Weisbach had been arguing to locate the ACC in Chicago, but of course, the ACC could flourish anywhere there's adequate capital and sufficient legal and regulatory protection.  What's most important is to get the ACC going somewhere.  

Tuesday, March 8, 2011

Alzheimer's Disease: "I say the cure will come as quickly as the American people want it to come.”


Writing for the Penn Gazette, the alumni publication of the University of Pennsylvania, Samuel Hughes takes a close look at the Ivy League school's efforts against Alzheimer's Disease.

One of the Penn experts quoted is Dr. John Trojanowski, the William Paul Measey-Truman G. Schnabel Jr. Professor of Geriatric Medicine and Gerontology, who notes that the annual cost of Alzheimer’s care in the US is now about $172 billion. Globally, the cost is about $604 billion, and by 2050, that number could rise as high as $3 trillion, Trojanowski says. A five-year delay could cut that number to around $1.5 trillion. “Half of $3 trillion is certainly a lot of money,” he adds. “But it’s far less than $3 trillion.”

The whole article is fascinating and full of detail, but in these paragraphs, Hughes lays out both the enormous cost and the enormous potential.  We could have a cure, if we really wanted one, says Trojanowski: 

Thirty years ago the late, great medical essayist Lewis Thomas called Alzheimer’s “the disease of the century.” While AIDS may have justifiably stolen the spotlight in the 20th century, the demographics and staggering costs associated with Alzheimer’s make it well-positioned to reclaim the title in the 21st.

“When Alzheimer described Alzheimer’s disease in 1906, life expectancy was 48, and the top 10 or 20 causes of death were infectious diseases,” points out Trojanowski. “A hundred years later, people are living to an average age of 78 in developed countries. And now Alzheimer’s, which was ignored, has become an epidemic. Alzheimer’s has replaced diabetes as the sixth leading cause of death in developed countries. 

“The current [global] cost of Alzheimer’s disease is $604 billion,” he adds. “If those costs were the economic output of a country, then the cost of Alzheimer’s care would mean that Alzheimer’s is between Turkey and Indonesia as the 17th-largest economy in the world. If it were a company, it would be the largest company in the world, larger than Walmart and Exxon Mobil. It’s affecting China, Southeast Asia, Australia, Indonesia. So it is a global problem. A global epidemic—with horrendous costs.

“We really owe it to ourselves and future generations to create a world without Alzheimer’s disease,” he adds. “And I think we can. Twenty years ago I wouldn’t have said that. We didn’t know enough. When asked at support groups by families that had an Alzheimer’s patient, I would almost tearfully have to say, ‘I have no idea.’ As a physician, to admit that there was nothing that you could do—and that you had no idea when something could be done—was emotionally difficult. And now it’s changed so dramatically that I say the cure will come as quickly as the American people want it to come.”

Words worth repeating: "The cure will come as quickly as the American people want it to come." 

One of the arguments of this blog is that Serious Medicine does, in fact, need a strategy. And strategy, of course, means aligning means and ends. That is, can we mobilize what we have to achieve our goals? It's not easy, of course, but the greater the stakes, the greater the reward for success, and the greater the cost for failure. And right now, we are losing the war on Alzheimer's. But, as Dr. Trojanowski says, we don't have to lose. We could win this fight against AD if we wanted to.

Monday, March 7, 2011

“The operating environment for pharma is worsening rapidly.” If the pharma sector is under-capitalized, then we will be under-medicalized.


"The operating environment for pharma is worsening rapidly." That's a quote from a Morgan Stanley research document, cited in The New York Times story this morning, headlined "Patent Woes Threaten Drug Firms."

This is what a Serious Medicine Crash looks like.

The reality is that the pharma sector has suffered a severe de-capitalization--and it's likely to get worse.  As the Times piece notes, the stock prices for Pfizer and Merck have fallen some 60 percent in the last decade, even as the S&P index has risen 19 percent.   And that's why Big Pharma is laying off thousands--including researchers.

This shrinkage is due in part to a drying up of the new-drug pipeline--a drying-up exacerbated, of course, by the trial lawyers and the FDA--and also to other factors, such as the rise of generics. Some 75 percent of prescription drugs consumed in the US are generic; in addition, governments are imposing ever-tougher price controls, around the world and now, increasingly, in the US. As the Times observes:

The drug industry has long said that Americans fueled the research engine, spending much more per capita on prescriptions than in any other nation, and paying the highest prices for prescribed medicines.

In other words, just as the US dollar has been the reserve currency for the world, and the US military has been the "reserve defense" for the world, so too the US drug market has been the "reserve pharmacy" for the world.  That is, demand here in the US provided the economic surplus to the pharma companies to make the drugs needed.

But that "reserve pharmacy" role seems to be ending.   Here's a look at this chart in the Times, based on the same data depicted in past postings at SMS.  As we can see, expenditures up, drug approvals down.


So we can ask: What's the new plan for replenishing the drug pipeline?  Not just for us, here in the US, but for the world?    Some people, of course, despise the pharma industry, and would like to see it crippled or crushed.  And so we might ask: What's their alternative plan for new medicines?   Where, in their reckoning, should new drugs come from?   Of course, most people don't really have an opinion on the pharma industry, either way.  Still we might ask those folks: What are you going to do when you confront a Serious Illness, and you discover that the medical cupboard is bare?   As Thomas Hobbes said, "Hell is truth seen too late."

Ultimately, American leaders--and world leaders--are  going to have to figure out a new way to inject capital into the pharma sector.  Not for the sake of next-gen lifestyle drugs such as Viagra or botox or Latisse, but for the sorts of drugs needed to combat premature death and disability.   Alzheimer's, for example, is a worldwide problem; if hundreds of millions of people suffer premature dementia, that's going to be a lot more expensive to treat than it would be to preempt.  

The world may or may not be able to agree on a lot of things, but surely Serious Medicine is one area in which every human being has a stake. 

Thursday, March 3, 2011

If war is too important to be left to generals, then medicine is too important to be left to politicians. The US Marines and Afghanistan; DARPA and Serious Medicine: Are we fully mobilizing our national potential for Serious MILITARY Medicine?


Are we utilizing technology to the fullest extent possible to protect our service personnel in combat zones? And to minimize fatalities and help the wounded to completely recover? And by that we don't mean "fully recover as much as possible within the constraints of current medical technology." Instead, we mean, "fully recover given the maximum mobilization of medical potential." There's a huge difference between the present-day actual and the future-possible.

And, as we shall see, the same idea--apply as much technology as possible--is as applicable to Serious Medicine as it is to the US military. Indeed, counterintuitive as it might seem to some, there's a robust common link between military technology and medicine. In the end, what matters in both "miltech" and medicine is the application of science and technology. And while war and medicine have much different purposes, in their technological dimension they share a common ancestor: the Industrial Revolution. That is, the power to produce not just one, but millions. Just as we can and should mass produce the tools we need for national security, so, too, should we mass produce the tools we need for medical security. If we do mass produce them, they will be not only more abundant, but also better and cheaper.
Unfortunately, we have fallen out of the habit of thinking that way, and that’s bad for both the military and for medicine. 

We can start by focusing on medicine. Specifically, we illustrate the difference between the medical actual and the medical possible by considering a news article that appeared on the front page of The Washington Post this morning. In a well-reported and also deeply moving story, reporter Greg Jaffe chronicles Marine Lieutenant General John F. Kelly, who lost his son, Marine Second Lieutenant Robert Kelly, in Afghanistan last November. That's Gen. Kelly pictured above, awarding a Purple Heart to a comrade of his late son, Lance Corporal Sebastian Gallegos.

After paying due respect to 2Lt. Kelly and LCpl. Gallegos--and all the other heroes who have sacrificed so much--we might next consider whether we are doing all we can to keep safe our armed services and their personnel as we put them in harm’s way.  

Here we should note that the US military has already made enormous progress in military medicine.   Back in January, The New York Times reported that the survival rate for American service personnel taken to hospitals in Afghanistan has improved dramatically, just in the past five years.  In 2005, according to the Times, 19.8 percent of those evacuated to military hospitals died.   By 2010, that death rate was down to 7.9 percent.  

To put it another way, that's a 60 percent decline in the fatality rate in just five years. It would be easy to go back further, to earlier wars, when the death rate was half or more--and of course, it wasn't that long ago that military hospitals were little more than amputation mills, or didn't exist at all.
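That 60 percent figure follows directly from the two rates the Times reported; a quick check:

```python
# Relative decline in the fatality rate for US personnel evacuated to
# military hospitals in Afghanistan (rates from the New York Times, above)
rate_2005 = 19.8  # percent of evacuees who died, 2005
rate_2010 = 7.9   # percent of evacuees who died, 2010

decline_pct = (rate_2005 - rate_2010) / rate_2005 * 100
print(f"Relative decline in fatality rate: {decline_pct:.0f}%")
```

The drop from 19.8 percent to 7.9 percent works out to a roughly 60 percent relative decline.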

Meanwhile, there's the additional issue of restorative and regenerative medicine.  Physical therapy is much improved; war has often been a catalyst for such advances, which then spin off. It’s worth remembering that plastic surgery emerged during World War One, as doctors started trying to give wounded troops at least the beginnings of a renewed post-war life.   And World War Two saw the mass production of penicillin, opening the door to a cornucopia of antibiotics.  

It’s heartening to see some of what the most forward-looking unit within the Defense Department is thinking up in terms of medicine: Wired magazine recently cited some of what the Defense Advanced Research Projects Agency has been working on. It's quite a list, according to Wired's Madhumita Venkataramanan, including programs for battlefield diagnostics, improved tissue engineering and regeneration, and even improved eyes for soldiers.

We need all of this--and much more.  Whatever our opinion as to the wisdom of the Afghan war, we will always, of course, support the troops.  They are, in the most literal sense, our brothers and sisters, and they are doing what we, through our elected government, have asked them to do.  But our support should consist of more than admiration and compassion--and maybe chipping in to a USO fundraiser.  As a nation, our collective support should be not only the best available wisdom about true national security for America, but also the best available wisdom about military technology and military medicine.  If we do that, as we have seen, not only will we be doing right by our men and women, but we will also find the benefits feeding back to our own society.   

And yet at the same time, it would be nice to think that we are prosecuting our wars in the most cost-effective way possible.   Let’s take a look at this passage from Jaffe's article, describing the paternal concern of the elder Kelly, the general, in regard to his son, the lieutenant: 

Kelly knew that Robert went out on almost every patrol with his men through mine-filled fields. One of the Marines at Bethesda told him that Robert was "living on luck." [emphasis added] 

Now we might just step back and ask ourselves: Isn't it, er, interesting that the US military is still fighting in Afghanistan by sending Americans on patrol through fields of landmines?    Yes, boots on the ground are a good idea, but boots on landmines are a bad idea.   I understand the idea of counter-insurgency, or COIN, but surely we understand that we can't win hearts and minds by marching our people around so they get blown up.  Aside from the obvious humanitarian considerations, the Afghan hearts and minds we are trying to win are not impressed with us and our ways if we can't outsmart the handmade bombs that the Taliban are planting.  Meanwhile, US casualties drain away support for a war on our homefront.
   
As Clausewitz, patron saint of all military intellectuals, reminds us, wars occur within a political context.  That is, public opinion in both Afghanistan and America will be just as decisive as any military battle.   So COIN, without the proper strategic thinking--about not only military buildup and preparation, but also the politics of the countries involved--is unlikely to prove effective over the long run.   

Yes, of course, some might say, planning is nice.  But sometimes nations must react, and quickly.  

Perhaps we had no choice, therefore, in the War on Terror. After all, as former Defense Secretary Donald Rumsfeld once put it, “You go to war with the Army you have. They're not the Army you might want or wish to have at a later time.” So sure, we had to go to Afghanistan with what we had. But wait: Rumsfeld made that "you go to war" comment on December 8, 2004. That is, he was making excuses for a lack of preparation more than three years after US forces first arrived in Afghanistan. In other words, by the time Rumsfeld said those words, he had had plenty of time to think about what equipment our warriors needed and to ask for whatever resources he felt were necessary. But perhaps, as I wrote at the time, Rumsfeld was too busy writing CYA memos to think cogently about the battlefield situation.

So there should be a better way.  But what better way?  The answer is as old as the industrial revolution: Replace "labor"--that is, the lives and limbs of our young people--with capital, and the technology that capital + ingenuity can create.   

We've done it before.  Within four years of Pearl Harbor, the US government had totally transformed its warfighting.  A war that began with infantry and battleships ended--successfully, of course--with dozens of new aircraft carriers, effective radar, B-29s and, of course, A-bombs.  Yes, there were some fiascoes of inadequate technology during the course of that march to victory, but our ultimate victory speaks for itself.   

Indeed, we won WW2 suffering a tenth of the KIAs that the enemies we defeated suffered fighting against us. In a nutshell, that's the way to win a war. As Gen. George Patton said, you don't win a war by dying for your country, you win a war by making the other s.o.b. die for his country. Admittedly, we spent close to 40 percent of our GDP on the military, but it was worth it. America’s fighting men and women were worth it. And if the Global War on Terror is worth it, then we should be willing to spend what it takes to fight it effectively.

Instead, we tried to fight in Afghanistan and Iraq on the cheap, at least financially. That is, we scrimped on the sort of military R&D that could have saved lives--that’s why our men and women are still working their way through minefields.

Perhaps politicians figured that it was too risky, politically, to ask for the resources needed to rapidly develop effective countermeasures to the insurgents, or perhaps the pols were too busy with other priorities.  

But we do know this: The result, on the ground, has been an inadequate hodgepodge of technological improvisations. As one Marine combat veteran of Iraq in 2004-5 recalled of the on-the-ground miltech that he saw in Al-Anbar province: 

Nothing more frustrating than sitting in the opened air rear of a Humvee without up-armor while watching anti-IED [improvised explosive devices, aka land mines] engineers roll along in a "Star Wars"-looking machine. The USMC has never had the funding to do much more than the basics and when we have been given $'s we have usually botched it.

In other words, more than a year after the US invasion of Iraq, Leathernecks were still riding around in “opened air” Humvees--that is, our troops were fully exposed to enemy fire.  Disgraceful.  And yet, as we know, through grit and determination, the Marines gave a magnificent account of themselves in Iraq anyway. 

After nine years, we should be doing better than marching young men and young women through mine-laden fields in Afghanistan.  Our fighting forces shouldn’t have to be “living on luck.”  If we had really wanted to, this past decade, we could have figured out something completely different--something as radically innovative as that which emerges from Silicon Valley every few months.  I have no doubt that DARPA has plenty of ideas for Iraq and Afghanistan--after all, DARPA's predecessor agency created the Internet.  Indeed, we recently learned that DARPA is experimenting with a “cheetah-bot” that would bound across the landscape like a fast cat.

If DARPA can do that in the future, it could have done it in the past, with enough resources.  Maybe, for example, DARPA could have made a "daddy long legs"-ish vehicle for COIN, so that human feet never touch the ground on patrol.  Or maybe some sort of hovercraft.  Or perhaps something altogether different.  At minimum, the Pentagon, properly resourced, should have been able to figure out how to disable every mechanical device in a given area--that is, landmine clearance on a vast scale.    

Speculative thinking?  Sure.  But that's how we win wars.  And as the French statesman Georges Clemenceau declared, "War is too important to be left to generals." 

Indeed, history shows us that generals are instinctively suspicious, even mistrustful, of technology.  That's why, for example, Billy Mitchell--one of the most brilliant American officers ever, the visionary of American airpower--ended up being court-martialed.  (Fortunately, President Franklin D. Roosevelt posthumously vindicated Mitchell.)

This is not the place to delve too far into military psychology, but suffice it to say that the record shows, over and over, that top brass have rejected new ideas more often than not.  The airplane was viewed with suspicion, as we have seen.  And so too with another military innovation, the tank--it took an outsider, Winston Churchill, to push armor on the British Army, which seemed perfectly satisfied with trench warfare.  Trench warfare, of course, in which, as Churchill said, men fight machine guns with their chests.  

History will eventually show what DOD knew and didn’t know, and what it asked for and didn’t ask for.  But if generals weren’t getting what they thought they needed, they should have protested--or resigned.  A prominent resignation or two would have gotten America’s attention.

And all the same arguments apply to military medicine--and to Serious Medicine.  If war is too important to be left to generals, then medicine is too important to be left to politicians.  

We need everything DARPA is doing now, and 10 times more.   Our troops are worth it.  So we need someone in the military, or in the civilian leadership at DOD, or in the government somewhere to say that we will set as a goal repairing spinal cords, so that those wounded in war will regain their motor power.  And if we achieve that breakthrough, it will be a breakthrough for all of us.   And so that's where the medical discussion ought to lead--toward all of us getting the benefit of DARPA-like leadership, whether we are in the military or not.

One last point: I think that a similar deep problem runs through both our medical thinking and our military thinking.  And that is a certain de-materialization of our approach to the world.  Instead of medicines and drugs, and nuts and bolts, we seem to think in terms of bailouts and finance and ideology.  We--or at least our leaders--think not in tangible realities, but in the intangibles of rhetoric and spin.  And yet it is real things--real machines--that fight wars.  And then it is other machines, and the products of machines, that save lives.  


UPDATE:  Another Washington Post reporter, David Brown, chronicles the rise of double amputations and groin injuries among US personnel in Afghanistan.  As Brown notes, the number of amputations and genitourinary (GU) wounds is up sharply:


In 2009, 75 soldiers underwent amputation and 21 lost more than one limb. In 2010, 171 soldiers had amputations and 65 lost more than one limb. GU injuries increased from 52 to 142 over the same period.

Of the 142 soldiers with genitourinary wounds who arrived at Landstuhl last year, 40 percent — 58 men in all — suffered injury to the testicles. Body armor, which has greatly reduced fatalities, usually includes a triangular flap that protects the groin from projectiles coming from the front. It doesn't protect the area between the legs from direct upward blast. Various laboratories are reportedly working on forms of shielding that would provide such protection. Medical staff at Landstuhl also noticed a rise in severe genital injuries last fall.