The Associated Press
An EMS Perspective
By Mike McEvoy

This certainly has broad implications for EMS, both in the consequences we might encounter in patients and in the drugs we ourselves use; that is the big issue in my mind. EMS is particularly poor at vetting the drugs it uses, partly because it lacks the pharmacological resources and expertise available to hospitals, but also because it is more susceptible to marketing influences (read: manufacturers).

Think back to 2000, when the American Heart Association changed the ACLS guidelines to recommend amiodarone; there was a clear insinuation that it was supposed to be a wonder drug for cardiac arrest. The huge marketing push to hospitals and EMS to sell this very expensive drug didn't really reveal that it made little difference to survival; it only made a difference short term. People may have been alive when they were admitted to the hospital, but nobody got out. Because of the marketing campaign, though, amiodarone made its way into a lot of ambulances. Agencies and departments with the resources to look more closely at the claims decided they couldn't see the benefits and didn't use it. New York City, for instance, had a system that was big enough, and with enough scientific resources, to scrutinize the evidence and claims.

Amiodarone is one example of how EMS can get sucked right into the same vortex discussed in this article. Like consumers, EMS is basically in the dark on the inner workings of the FDA and the manufacturers' own studies of drugs. Basically, EMS is just like a consumer: it lacks the wherewithal to evaluate a drug in any greater detail than how it is marketed.

Mike McEvoy, PhD, REMT-P, RN, CCRN, is the EMS coordinator for Saratoga County and the EMS director on the board of the New York State Association of Fire Chiefs. Read his columns, 'Drug Whys' and 'Firemedically.'
WASHINGTON — Did you know that Lunesta will help you fall asleep just 15 minutes faster? Or that a higher dose of the osteoporosis drug Zometa could damage cancer patients' kidneys and raise their risk of death?
Chances are you didn’t, and neither did your doctor. Much of what the Food and Drug Administration knows about a drug’s safety and effectiveness is not included on the label, say two drug safety experts who are calling on the agency to make that information more accessible.
In this week’s issue of the New England Journal of Medicine, researchers from Dartmouth College argue that drug labels don’t reflect the nuanced decisions the FDA makes when deciding to approve a drug. The editorial from Drs. Lisa Schwartz and Steven Woloshin recommends easy-to-read fact boxes to help patients weigh the benefits and risks of medications.
If drug labels sometimes exaggerate benefits and play down drug risks, the authors say there’s a very good reason: they are written by drugmakers.
While FDA must approve the final labeling, the actual language is drafted by the manufacturer, with input from FDA scientists.
The labeling is based on results from company studies, which generally compare results for patients taking the drug versus those taking placebo.
If FDA decides the drug’s ability to treat or prevent a disease outweighs its side effects, the agency is obligated to approve it. But Schwartz and Woloshin point out that benefits may be slim and potential harms may not be fully understood.
“The take home point is that just because a drug is approved doesn’t mean it works very well,” said Schwartz, in an interview with the Associated Press. “You really need to know more to see whether it’s worth the cost.”
Schwartz and Woloshin say FDA labeling frequently fails to provide a full picture of a drug’s effects.
In the case of Sepracor Inc.'s blockbuster sleeping pill Lunesta, it’s virtually impossible to tell how well the drug works based on the labeling, which only indicates that it worked better than placebo, or a dummy pill.
Only by wading through the FDA’s 403-page internal review of Lunesta do the details emerge: patients fell asleep 15 minutes faster and slept 37 minutes longer, on average.
“Lunesta patients still met criteria for insomnia and reported no clinically meaningful improvement in next-day alertness,” the authors state.
Despite that lackluster finding, Lunesta has grown into a $600 million-a-year product for Sepracor, helped by the company's advertisements featuring a green Lunesta moth.
Drug labels can also omit critical safety information that appears in FDA review documents.
The authors point to the example of Novartis’ Zometa, which was approved in 2001 to prevent skeletal fractures in cancer patients with brittle bones. The drug was approved in both 4-mg and 8-mg doses, despite FDA findings of increased kidney damage and death with the higher dose.
FDA went back and added language about kidney toxicity in 2008, but the information about death rates is still missing from the label.
While FDA reviews are posted online, they are often hundreds of pages long and written in extremely dense medical language.
Woloshin and Schwartz recommend FDA provide reader-friendly summaries of its drug reviews, to supplement industry-drafted drug labeling.
Earlier this year, the FDA’s panel of communication experts recommended the agency adopt fact boxes for all announcements about drug risks and benefits. Woloshin and Schwartz said they have met with FDA leadership to discuss the proposal.
A spokeswoman for the FDA declined to comment Wednesday afternoon.
In a 2006 study by Schwartz and Woloshin comparing comprehension of drug benefits, patients showed significantly better understanding with fact boxes than with traditional drug advertising: 70 percent of patients viewing the boxes correctly identified the superior heartburn medication, versus just 8 percent of those who viewed drug advertisements.