
Is the practice of medicine impeding medical innovation?

At a recent event on the comparative advantages of small molecule vs. biologic drugs, several themes emerged which led me to re-examine the question of whether the practice of medicine is capable of keeping pace with medical innovation.

As mentioned in a previous post, the majority of patients receiving the personalized medicine drug Herceptin had not previously been given Herceptin’s diagnostic test. This is extremely important, as the diagnostic test can identify those patients most likely to benefit from the drug and exclude those who are likely to see no benefit (and will likely only get sicker as they rotate through an ineffective drug). For all the talk of the promise of personalized medicine, it appears that the gatekeepers — physicians and payers — are unaware of how to effectively prescribe personalized drugs.

This is not a new phenomenon. Antibiotic drugs have seen their effectiveness drop due to overprescription, which led to the emergence and rapid spread of antibiotic-resistant bacteria.

So, just as mis-prescription of antibiotics took the shine off many lucrative drugs, mis-prescription of personalized medicines stands to diminish their value in the same way. What is particularly striking is how little has changed: for all the advances in drug development in the decades since over-prescription of antibiotics was recognized as a problem, physicians and payers are still hard pressed to prescribe drugs effectively.

When I asked the panel about potential solutions to the problem, I was given a list of emerging technologies — bioinformatics, e-health, centralized databases, etc. — that could solve it. But adding technology to a problem isn’t necessarily going to solve it; it may just make it a more expensive problem! In reality, the solution already exists. All healthcare payers need to do is require prior authorization, including physician consultation, before Herceptin is prescribed, and require a positive result on the diagnostic test before reimbursing for the drug.

As with most simple solutions, I’m sure this one has already been conceived. So why is it not being used? Could it be that healthcare payers are afraid they might end up having to pay for a litany of diagnostic tests on all cancer patients, thereby offsetting any potential cost savings? I look forward to your thoughts in the comments section below.

4 COMMENTS

  1. I feel like patients are a big part of this problem. Patients no doubt demand those antibiotics even when their doctors tell them they are probably useless. “As long as the insurance company is paying for them, I’ll take ‘em.”

    Doctors feel the pressure to prescribe what their patients demand, or else the patients will just go somewhere else!

  2. I don’t believe that patients will have a problem with knowing that their doctor is screening their cancer for its ability to react to a treatment regimen.

    I’m dumbfounded that clinicians wouldn’t institute a screening policy in order to provide proper treatment; I was under the impression that medicine was “evidence based”. New technologies won’t help if they’re not applied carefully.

    In this example, there needs to be a common practice established. The drug companies have as much of a vested interest as physicians – it’s better for them if the treatment they’re selling actually works!

  3. The problem is that some patients who are HER2-negative still seem to respond to HER2-targeted therapy. This renders the diagnostic test moot: irrespective of whether a patient is HER2-positive or HER2-negative, they will still be prescribed Herceptin, since there is a chance they will respond. A chance is all that a cancer patient needs.

    Are there any other examples of doctors ignoring diagnostic tests in a similar way? Maybe warfarin?
