The New York Times' Natasha Singer takes a look at health-oriented social networking sites, including CureTogether.com and PatientsLikeMe.com.
Singer injects some quizzical notes, raising privacy concerns and asking to what degree drug companies are fronting some of these groups, but the ultimate logic of a social network for health is as powerful as that of a social network for anything else. As PatientsLikeMe puts it:
“When patients share real-world data, collaboration on a global scale becomes possible. New treatments become possible.”
Newsweek's Sharon Begley and Mary Carmichael have collaborated on a must-read cover story, entitled: "Desperately Seeking Cures: Medical Progress Isn't Making Progress Rapidly Enough--Here's Why--And How To Push Things Forward."
The Newsweekers begin by noting the fall-off in the drug pipeline:
From 1996 to 1999, the U.S. Food and Drug Administration approved 157 new drugs. In the comparable period a decade later—that is, from 2006 to 2009—the agency approved 74. Not among them were any cures, or even meaningfully effective treatments, for Alzheimer’s disease, lung or pancreatic cancer, Parkinson’s disease, Huntington’s disease, or a host of other afflictions that destroy lives.
What's happened? The co-authors explain:
“Basic research is healthy in America,” says John Adler, a Stanford University professor who invented the CyberKnife, a robotic device that treats cancer with precise, high doses of radiation. “But patients aren’t benefiting. Our understanding of diseases is greater than ever. But academics think, ‘We had three papers in Science or Nature, so that must have been [NIH] money well spent.’”
More and more policymakers and patients are therefore asking, where are the cures? The answer is that potential cures, or at least treatments, are stuck in the chasm between a scientific discovery and the doctor’s office: what’s been called the valley of death.
The barriers to exploiting fundamental discoveries begin with science labs themselves. In academia and the NIH, the system of honors, grants, and tenure rewards basic discoveries (a gene for Parkinson’s! a molecule that halts metastasis!), not the grunt work that turns such breakthroughs into drugs. “Colleagues tell me they’re very successful getting NIH grants because their experiments are elegant and likely to yield fundamental discoveries, even if they have no prospect of producing something that helps human diseases,” says cancer biologist Raymond Hohl of the University of Iowa.
In other words, Begley and Carmichael argue, scientists have come to see working for the NIH as an end in itself. An ominous development, if true.
Begley and Carmichael offer some suggestions for exiting this rut, including a greater emphasis on cooperation among "turf-jealous academics," and even direct grants to biotech companies, as championed by Sen. Arlen Specter (D-Pa.)--those biotech companies can be presumed to be more interested in actually getting drugs, and cures, to market.
Great news: Scientists are edging closer to a vaccine for breast cancer, reports the BBC:
Vincent Tuohy, from the Cleveland Clinic Lerner Research Institute, said: "We believe that this vaccine will someday be used to prevent breast cancer in adult women in the same way that vaccines have prevented many childhood diseases. "If it works in humans the way it works in mice, this will be monumental. We could eliminate breast cancer."
Sounds great, but how will the rationers in the federal government regard this possible news? How will the Congressional Budget Office score it? What if it takes another 10 years of research and trials to get the breast-cancer vaccine to market? Will that decade's worth of effort be counted as a cost--a cost to be pruned--or will it be seen as an investment, to be nurtured along?
The CBO is a hard-working bunch, and they mean no harm, but their mandate is to look at everything through a single-variable bean-counting prism. And that means that something as radical as a costly cure might not fit into their scarcitarian vision, or their computer model.
The New York Times reports on the promise of telemedicine, the long dreamed-of idea of long-distance medicine, in which doctors and other healthcare providers are instantaneously connected to patients who might not otherwise be able to get to a doctor, because they are in remote locations, are incarcerated, or are otherwise not mobile. As the Times' Milt Freudenheim notes, a fifth of all Americans live in places where primary-care physicians are "scarce," to say nothing of specialists.
The telemedicine idea speaks for itself, described ably by Freudenheim, but the Times reporter raises one interesting point that deserves much further exploration: We know how much we spend on telemedicine, but how much do we save? Freudenheim reports that interactive diagnosis--of the sort pictured above, showing a Houston physician, Dr. Jerry Jones, working from home--is a $500-million-a-year sector, part of the $3.9 billion in annual expenditures for telemedical devices such as smart phones and other kinds of sensors.
So let's see: That's $3.9 billion in expenditures, out of the $2.4 trillion that the US spends annually on healthcare. But President Obama, and just about everyone else, says that we are spending too much on healthcare. So should telemedicine share in the austerity that we are supposed to be imposing on the healthcare sector? The austerity--some call it rationing--that Uncle Sam is supposed to impose on this dynamic sector of the economy--is that really a good idea?
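To put those dollar figures side by side, here is a quick back-of-the-envelope calculation, using only the amounts cited above, showing what a tiny slice of the national healthcare bill telemedicine actually represents:

```python
# Back-of-the-envelope: telemedicine's share of US healthcare spending,
# using the figures cited in the article above.
telemedicine_spending = 3.9e9        # annual telemedicine-device spending, dollars
total_healthcare_spending = 2.4e12   # total annual US healthcare spending, dollars

share = telemedicine_spending / total_healthcare_spending
print(f"Telemedicine share of healthcare spending: {share:.2%}")  # about 0.16%
```

In other words, less than two-tenths of one percent of the healthcare bill--hardly the place to look for austerity.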
Here at Serious Medicine Strategy, we are skeptical of the a priori assumption that the US spends too much on healthcare. But even if we do spend too much on certain aspects of healthcare, a moment's reflection tells us that telemedicine is a winner, not a loser, for the US economy. And so we should have more telemedicine, not less. Why? Three reasons:
First, with telemedicine, a new niche within the healthcare sector is being pioneered, in which people get better healthcare, and are thus made healthier. (Now, if we can just keep the trial lawyers from destroying it, through liability lawsuits targeted not just at doctors, but also at the telemedicine pipelines, such as the phone companies and the cable companies. As legal expert James Wootton has warned, creative legal buccaneering will strive to find some theory, no matter how dubious, that holds deep-pocketed pipelines somehow responsible when something goes wrong.)
Second, along the way, a new industry is being created--lots of high-tech jobs, building out and servicing the telemedicine industry, as it evolves in ways that we can't foresee, not only here in the US, but also, potentially, around the world. (See iPad.) How much is that new industry worth to us? It should be worth a lot. And as a matter of dynamic-accounting fairness, the gains from this nascent industry should be deducted from the overall cost of our healthcare, so that we come up with a "net" number (healthcare costs minus telemedicine industry gains) for the overall sector.
Third, the savings of telemedicine are already accruing to various institutions, and those savings should be factored in, too, as part of our healthcare bill. It's not fair, and it's not smart, to measure the costs without measuring the benefits--which, in some cases, will far outweigh the costs. For example, the Times reports that moving an oil-rig worker from an offshore platform in some faraway place to a doctor can easily be a $10,000 roundtrip. So the telemedicine-induced savings for the crew of a single rig might total $500,000 a year. An even bigger issue is moving prisoners around as they receive medical treatment; obviously it is extremely expensive to move prisoners from prison to a doctor and back. The Times cites a further estimate that the annual savings of telemedicine to the state of California alone could total $1.2 billion. Which is to say, measured across the whole of the economy, the savings from telemedicine could be enormous. Shouldn't those savings be subtracted from the net of our national healthcare bill? Of course they should.
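The oil-rig numbers above imply a concrete figure worth spelling out: at $10,000 per evacuation and $500,000 in annual savings per rig, a single rig's crew is avoiding roughly fifty medical roundtrips a year. A quick sketch, using only the figures cited in the article:

```python
# Implied avoided medical evacuations per rig per year,
# from the figures cited in the Times article above.
cost_per_evacuation = 10_000      # dollars per offshore roundtrip to a doctor
annual_savings_per_rig = 500_000  # estimated telemedicine savings per rig, per year

avoided_trips = annual_savings_per_rig / cost_per_evacuation
print(f"Avoided evacuations per rig per year: {avoided_trips:.0f}")  # 50
```

Multiply that across every rig, prison, and remote town in the country, and the scale of the potential savings comes into focus.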
The point here is that telemedicine is a paradigm-shifting approach not only to better medical care, but also to more jobs and economic growth, as well as more savings to the country. If only Uncle Sam knew how to account for all that good news.
Rep. Debbie Wasserman-Schultz was the prime mover behind the EARLY Act--that's an acronym for Breast Cancer Education and Awareness Requires Learning Young.
As Wasserman-Schultz, herself a breast cancer survivor at a young age, observes:
In 2008, the American Cancer Society estimated that there would be 182,460 new cases of breast cancer in women. Of these cases, 10,000–11,000 of these women would be under 40 years of age. Although the incidence of breast cancer in young women is much lower than that of older women, young women's breast cancers are generally more aggressive, are diagnosed at a later stage, and result in lower survival rates. In fact, breast cancer is the leading cause of cancer deaths in young women under the age of 40.
So there's plenty of reason to think that this is a laudable bill with an important purpose.
But as Lt. Columbo on the old TV show might say, "There's just one thing." And that is, can the federal government actually accomplish all the goals that it sets out for itself? It's nice that lawmakers come up with laws with good-sounding titles, and noble purposes, but it just isn't so obvious that the government knows how to get these things done. As the BP oil spill reminds us, there can be a wide gap between what the government says it is doing and what it gets done.
That was the lesson of a piece by Sarah Kliff in Politico, headlined "Running late on the EARLY Act," which reports:
Although it’s racing to roll out consumer-friendly aspects of the health care law before November’s midterm elections, the Obama administration has just missed the deadlines to set up task forces on breast cancer and health care in Alaska.
The health care law required Health and Human Services to establish the breast cancer task force by last weekend and the Alaska task force by the first week of May. But sources familiar with the situation said the department isn’t even close to having the two panels ready.
So the government missed a deadline. Will there be any consequences for the bureaucrats who missed that deadline? And will there be any consequences for the politicians who concoct grand legislation, without seeming to worry so much as to whether or not the legislation actually works as promised?
We desperately need accountability and feedback in the government. Accountability for failure, and feedback, as in a learning process, so that we can figure out what is and is not working. We know, now, about federal competence in the BP oil spill. And thanks to Politico, we are learning, now, about federal competence in the area of women's health.
Here at SMS, we have been critical of Washington for failing to fully grasp all the opportunities that computers, social networks, and even games have put before us. How much creativity and ingenuity is waiting to be harnessed?
Today, we can only get glimpses of what is possible. One such glimpse comes from SAP Software, which has created Idea Place, to encourage info-sharing on software ideas and applications. Some of them concern medicine.
In this video, we see two SAP employees, Tobias Queck and Vandana Deep, talking through an SAP product called Paper UI. It's a digital pen that records handwriting information as it is being written. So the care provider writes the information on a regular piece of paper, and the information, stored in the pen, can be docked and synced into the overall network.
No doubt there will soon enough be a wireless version of this digital pen, so that the information goes into the network in real time, but no need to get ahead of ourselves. In the meantime, we might pause to consider the fact that we don't yet know which "paradigm" of electronic health records (EHR) will prevail. In recent years, we have presumed that some sort of tablet--most notably, perhaps, an iPhone or an iPad--will be the dominant platform.
But one can make a case that it should be the pen/stylus, because, let's face it, plain old dead-tree paper has advantages. It never runs out of battery life, it's easy to move around, it survives spills and drops, and so on. And yet of course, we want all the data to be captured, and backed up--that's what the pen is for. The grand synthesis of EHR is yet to be seen, although, of course, there may never be a grand synthesis.
Interestingly enough, SAP-ers Queck and Deep were pitching the product at a trade show, DKOM 2010, held in San Francisco earlier this year, and they were in some sort of race with the clock. As in, they had, it appears, six-and-a-half minutes to make their pitch, before a bell went off. One could not gain a sense, from the video, as to what the carrots or sticks might be, and it all seemed good-natured in any case. Which is to say, game-like elements have penetrated--suffused is probably a more accurate verb--not only the computer culture, but also the computer-medicine culture. I was also struck by the use of the word "imagineering," which I think was coined by Walt Disney, to describe the engineering used at Disneyland and the other Disney theme parks.
Nothing wrong with that. On Friday, I published a long piece for Steve Clemons' blog, The Washington Note, arguing that vast reserves of creativity and ingenuity were not being tapped for the cause of healthcare, and the biggest of those untapped reserves was the gamer culture.
As usual, The Economist's take on Craig Venter's announcement of synthetic life--more precisely, the creation of a bacterium with a synthetic genome--is thoughtful and thought-provoking. The piece was evocatively entitled, "And man made life: Artificial life, the stuff of dreams and nightmares, has arrived."
There will be much back-and-forthing as to the impact that Venter--working with Hamilton Smith--will have, including some contextualization, even minimalization, of the Venter/Smith achievement, but The Economist gets to the bottom line:
It will be a while, yet, before lifeforms are routinely designed on a laptop. But this will come. The past decade, since the completion of the Human Genome Project, has seen two related developments that make it almost inevitable. One is an extraordinary rise in the speed, and fall in the cost, of analysing the DNA sequences that encode the natural “software” of life. What once took years and cost millions now takes days and costs thousands. Databases are filling up with the genomes of everything from the tiniest virus to the tallest tree.
Indeed, this might be a case where the who-what-wheres of journalism, important as they are, prove to be inadequate compared to the speculative challenge of understanding the road ahead.
Serious Medicine Strategist Jim Woodhill puts himself in the camp of those who believe that what Venter has wrought will, indeed, be big. And to illustrate the potential scope of the Venter-ized future, Woodhill reminds us of an unjustly neglected sci-fi writer, Cordwainer Smith, who imagined for us a universe in which Serious Medicine is really serious, where Serious Medicine is the most valuable commodity in the cosmos. As Woodhill recalls it:
Smith wrote of a future world where a drug called "stroon" was distilled from the vomit of gigantic mutant sick sheep. Stroon, also known (for some unknown reason) as the "Santaclara Drug" granted its taker health without aging while he kept taking it. The value of stroon was so high that when one 18-year-old Old North Australian sheep rancher was able, with the aid of a family heirloom war computer, to corner the stroon futures market, he became wealthy enough to buy everything.
OK, some might say, that's just sci-fi, published half a century ago. But Woodhill connects Smith's work to the present debate, specifically, the way that we finance healthcare--and let that healthcare financing get the better of our overall Serious Medicine Strategizing. Let's remember, folks, the central issue of healthcare is cures, not cost. The politicians have taken their eye off that ball, because we have let them. And so the beancounters, who have gained ascendancy over the politicos, in the wonkrooms of Washington, have trumped the medical visionaries, such as Venter. That's why we could have a healthcare debate, over the last year-and-a-half, that paid almost no attention to the prospect of actually curing diseases. We will finance illness--and expand the financing of illness--but we seem to have relatively little interest in actually curing disease and restoring people to health. That's not what the average American thinks, but the average American wasn't really consulted in the debate.
As Woodhill explains:
One of my (in)sanity checks on the healthcare reform debate is to note that the discovery of a Santaclara Drug that cost, say, 20% of U.S. GDP (initially) to synthesize in great enough amounts for our entire population would be a disaster within the current accepted public policy model. I mean, 17% of GDP would go to 20%! (And more than 20% because stroon would not eliminate trauma care, maternity wards, etc., just heart disease, cancer, major mental illness...)
Woodhill is exactly right. Venter opens up the prospect of genuine medical/scientific breakthroughs, the kind that would paradigm-shift medicine, and paradigm-shift the economy, too. Just as Cordwainer Smith, in his own way, imagined a half-century ago.
Today, we might imagine what America would look like if we were systematically applying Venter's technology--in a legal and ethical way, of course--to the health challenges at hand. What if we were to help Venter, and all the others doing this work, help us? What if we launch a national medical-economic strategy to do the sort of research that would lead to the sort of wonder-drugs that would make America a medical and economic magnet for the world? Such an achievement is admittedly speculative, but if we could do it, the achievement would dwarf some minor tweak--or even a big shift--in the way that we finance healthcare.
No wonder, then, that Washington isn't interested in turning the reins of healthcare policy over to science--because whatever science creates, by definition, wouldn't be invented in DC. It might be financed in DC--and maybe it should be financed in DC--but it would be created in a lab out beyond the Beltway somewhere. And that act of medical-scientific creation would, in effect, disempower the political class.
Writing in The Washington Post this morning, Jennifer Brokaw, an emergency room physician in San Francisco, opined that one solution to the ER crisis was for doctors and policymakers to make better use of more information:
Every municipality or county should have a detailed understanding of where needs are and a plan to address them. For example, by knowing the exact numbers of diabetics, smokers, and people with kidney disease and high blood pressure in our communities, we can predict how many CT scanners or, say, vascular surgeons are needed. Gaining this knowledge will require cooperation and information-sharing among all health-care entities: insurers, hospital systems (both public and private) and managed-care systems.
OK, that’s a familiar prescription: more information. And yet such a prescription is hard to fill, because of HIPAA-ish privacy concerns, not to mention the general my-data-is-my-castle orneriness of the American population. As we have seen this year, it’s a challenge for the government to conduct a census, to say nothing of enumerating all the diseases and conditions that people have, and then sharing that information among healthcare providers.
So another approach is needed, as Dr. Brokaw is quick to point out. Social networks, most notably Facebook, seem to have become quite skilled at eliciting information from people. (Perhaps too skilled, in the waggish opinion of humorist Andy Borowitz, who jokes today that the Chinese government should forget about secret police and just “friend” the entire population of China, as a better way to keep tabs on people.)
In the meantime, the need, and the opportunity, looms large. Here’s more from Dr. Brokaw:
We need to employ the new ideas and technologies that have emerged from the Internet and social networking revolution to link patients with informed advice about their conditions. In a fully realized online system, simple medical questions can be answered by health advisers in real time. Patients can be directed to walk-in clinics, urgent-care centers or even emergency rooms when necessary.
It seems to me, here at SMS, that the key to making such information-sharing successful is to make it voluntary. Like Facebook, unlike the government, as we understand government today. And how to do that? How to get people to cough up information about their coughs, and other health issues? The answer is to make the process of information-disclosure cool, or otherwise gratifying, or remunerative, to the individual. And that requires a rethinking of the basic “business model” of healthcare, including the business model of government.
I wrote this piece in January for Fox News, noting some differences between the iPad and Obamacare: I noted that people were standing in line to buy the new tablet computer, while most people were fleeing from the thought of Obamacare. As I put it:
What the American people want is a health care system that works as well as Mac--it’s nice that it looks nice, but the real value is that it works. It gets the job done. And it’s easy to use--from the start, when you take it out of the box.
Is that too much to ask? That things work? That Uncle Sam do as good a job as Steve Jobs? If it is too much to ask--if Uncle Sam can’t be bothered to make his systems work--then don’t expect people to trust the government.
The point is not that a corporation is better than the government, as BP demonstrates. The point is that Steve Jobs is better than the government. As I also wrote in that same piece, four months ago:
We can only imagine what the world would be like today if Obama had sat down with Jobs a year ago--and, in the spirit of fair & balanced, with other top CEOs and tech visionaries, too--and said, “How can we make our health care system cool? How can we make it work better? How can we make it cheaper? How can we make it do all the things that you do in your business--drive costs down, drive performance up?” The answers that would have come back from those techsters would have been hard for a liberal Democrat to swallow--and that’s why the meeting never happened. But Silicon Valley types would have pointed the way toward a plan that would have been popular--as popular as an iPhone app -- 3 billion of which have been downloaded so far. Now that’s popular enthusiasm, the kind that the Democrats sorely need right now.
Instead, of course, Obama chose to sit down with Harry Reid and Nancy Pelosi. And more recently, Democrats have been taking advice from wordsmiths, such as Emory University’s Drew Westen. Their plan, it seems, is to talk their way out of their political problems.
Some people, of course, are creeped out by Jobs, or Apple. But the company has built one of the most trusted brands in America; most people--and this is a democracy--would agree that Jobs and Apple have been more successful at solving problems than the government. So why not find a way to arbitrage that talent and effectiveness for other problems? Isn’t it at least worth a try?
Others see the value of seeking out the best and the brightest--the true best and brightest--to solve urgent national problems. One such is comedian-commentator Bill Maher, who had this to say recently about Steve Jobs, as reported by The Huffington Post:
"America needs to focus on getting Jobs -- Steve Jobs. Because something tells me that Apple would have come up with a better idea for stopping an oil spill in the Gulf of Mexico than putting a giant box on top of it," he said.
He explained why he thought Jobs and his Apple team could do a better job running America, joking that we might have to change the country's name (Hello, "iMerica."):
“In 2001, Apple reinvented the record player. In 2007, the phone. This year, the computer. I say, for 2011, we let them take a crack at America. Our infrastructure, our business model, our institutions. Get rid of the stuff that's not working, replace it with something that does. For example, goodbye US Senate -- Hello Genius Bar! So good luck, Steve -- you'll need it!”
Maher is right. And Dr. Brokaw is right. Knowledge is power in healthcare, just as knowledge is power in every other field. The challenge is using that knowledge-power in the best way possible. Our health depends on it.
This morning I watched a fascinating documentary on NHK TV, reporting on Japanese efforts to advance stem cell research. And yet interestingly, much of the research is being done--and many of the benefits are being seen--here in the US.
The picture above, screen-grabbed from the NHK documentary, shows piglets genetically altered with fluorescent protein; many more such pics can be seen here. Such bio-bravura might seem like mere showmanship, and no doubt there's some of that, but the science of xenotransplantation is, indeed, advancing rapidly.
And that's not the half of it. The NHK documentary reported on a Cincinnati, Ohio man, Lee Spievack, who lost the tip of his finger in a hobby-shop accident five years ago. Yet his finger has reportedly been regenerated, complete with fingernail, by dropping "pixie dust" on the stub of his finger. The pixie dust, of course, is just a nickname for serious material, extracellular matrix, or ECM, derived from pig bladders, which is believed to help stoke the regrowth of tissue. It must be added that some say that the incident was a hoax, because Spievack's wound wasn't that severe.
Yet the Pentagon is extremely interested, as a way of restoring America's wounded warriors to full, or at least better, health. The NHK documentary featured on-air interviews with doctors, and wounded soldiers, expressing optimism that the ECM was helping at least a little, regrowing war-blasted muscle and even the beginnings of a blown-off finger.
So is "pixie dust"/ECM just the latest version of cold fusion? Or is it the beginning of something big? Suffice it to say, the scientific debate, and the inquiry, will continue.
Perhaps what's most interesting is how little coverage such debates receive--they seem to have lost their grip on the American imagination. Obviously other issues are important, too, but it's hard to think of any other issue more important than biotech. Not only because it concerns all of our lives, but because biotech also represents nascent industries, waiting to be born--the question is where. Japan? The US? Singapore? China? Right now, it's hard to know.
Whichever country takes the lead in biotech will likely take the lead, period, in the century to come. Although, of course, if the history of computer technology is any guide, the benefits of biotech will likely spread far and wide around the world. Japan excelled in consumer electronics, going back half a century, but it was the whole world that benefited. And the same is true, more recently, for US computer and Internet technology.
Stem cell has been extremely controversial in the US. Back in the summer of 2001, before 9-11, it seemed as though stem cells were the most vexatious issue in America. President George W. Bush reached a reasonable enough compromise, although, of course, he was not the only decisionmaker--stem cell is a worldwide phenomenon. In any case, by now, nearly a decade later, the science of stem cell has moved way beyond embryonic stem cells, to stem cells derived from all manner of adult organs, and so while there's much reason to be queasy about some of what is happening, or might be happening, the "cat is out of the bag," worldwide.
Science does what it does, a sort of permanent revolution, a point made brilliantly decades ago by Harvard's I. Bernard Cohen in his book Revolution in Science.
More recently, a different facet of the same inevitable reality was illuminated by Steven Levy, in a fascinating Wired magazine assessment of geek/hacker culture, 25 years after he published a pathbreaking book on the subject. Then and now, Levy celebrates the techno-enthusiasm of geek/hacker culture, the sense of doing something just because you can.
Catching up with Bill Gates, among others, now grown relatively old, Levy noted changes in Gates' thinking. If he were 13 again, Gates said, he would go into biology, not computers:
Just ask Bill Gates. If he were a teenager today, he says, he’d be hacking biology. “Creating artificial life with DNA synthesis. That’s sort of the equivalent of machine-language programming,” says Gates, whose work for the Bill & Melinda Gates Foundation has led him to develop his own expertise in disease and immunology. “If you want to change the world in some big way, that’s where you should start — biological molecules.” Which is why the hacker spirit will endure, he says, even in an era when computers are so ubiquitous and easy to control. “There are more opportunities now,” he says. “But they’re different opportunities. They need the same type of crazy fanaticism of youthful genius and naiveté that drove the PC industry — and can have the same impact on the human condition.”
In other words, hackers will be the heroes of the next revolution, too.
Yes, there are huge moral issues at play here--in biology far more than electronics--and those issues can't and shouldn't be dismissed, because nobody wants sci-fi dystopia. But we do want cures for disease and disability.
And so the reality is that we are charging ahead, here in the US, in Japan, and around the world.