A recent article in the NYT covered the Avandia case, "For Drug Makers, a Downside to Full Disclosure";
When GlaxoSmithKline settled a lawsuit three years ago with the State of New York over the antidepressant medication Paxil, the company agreed to take an unusual step: publicly disclosing the results of its clinical trials for Paxil and other drugs.
The company, which was criticized at the time for failing to publicize all pediatric trials of Paxil, not just the positive ones, made good on its promise. The first posting on a new Web site was about 65 studies involving its popular diabetes drug, Avandia.
This week, GlaxoSmithKline learned what that greater disclosure could mean.
A cardiologist at the Cleveland Clinic, Dr. Steven Nissen, stumbled onto the Glaxo Web site while researching Avandia last April. He and a colleague quickly analyzed the data, and on Monday, The New England Journal of Medicine released its finding that Avandia posed a heightened cardiac risk.
The Numbers Guy covered the statistics behind the study;
In a meta-analysis, researchers pool results from different studies — in this case, Cleveland Clinic cardiologist Steven Nissen and statistician Kathy Wolski analyzed 42 studies. Those studies were done by many different people, and as you might expect, there was wide variation between them. Sometimes Avandia was compared with a placebo and sometimes with alternate treatments. Adverse events — namely heart attacks shown to occur with higher frequency among Avandia users — may not have been identified consistently across the different trials. And if they weren’t, Dr. Nissen would have no way to know, because he was looking at study summaries and not patient-level data. The limitations of this “study of studies” filled a lengthy third paragraph in an accompanying New England Journal of Medicine editorial.
So why, then, use meta-analysis at all? Because for drug dangers that are rare enough, even studies of thousands of patients might not suffice to separate a real risk from random statistical variation. Combining tens of thousands of patients who underwent the treatment separately, under different protocols and supervision, may be the only way to clear thresholds for statistical significance.
Whether a result is significant is determined by the value of a statistical variable called p, which depends on the magnitude of the effect, the consistency of that effect and the number of observations. Researchers can’t control the first two factors, which, in a drug trial, ought to be governed by biochemistry. They can, however, add more observations by wrapping together multiple studies, which can lower the value of p, which is a good thing. A value below 0.05 — commonly chosen as a threshold — means there is less than a 5% probability that an observed effect arose purely by chance.
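To make that threshold concrete, here is a minimal simulation sketch (my illustration, not the column's) of what p < 0.05 means: if the drug truly had no effect, how often would chance alone produce a gap in events at least as large as the one observed? All of the counts below are invented.

```python
import numpy as np

rng = np.random.default_rng(42)

n_per_arm = 1000     # hypothetical patients in each arm
event_rate = 0.01    # hypothetical underlying event rate, identical in both arms
observed_gap = 6     # hypothetical excess events seen in the treated arm

# Simulate many trials in which there is genuinely no drug effect.
sims = 100_000
treated = rng.binomial(n_per_arm, event_rate, size=sims)
control = rng.binomial(n_per_arm, event_rate, size=sims)

# Share of no-effect trials showing a gap at least this large in either direction.
share = np.mean(np.abs(treated - control) >= observed_gap)
print(f"Chance alone produces a gap of {observed_gap}+ events in {share:.1%} of simulations")

# If that share fell below 5%, the gap would clear the p < 0.05 bar. Pooling more
# patients at the same underlying rates makes a real gap grow faster than the
# noise, which is how adding observations pushes p down.
```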
To see the power of that approach, consider one Avandia trial, known as Dream (Diabetes REduction Assessment with Ramipril and Rosiglitazone Medication), whose results were reported in the Lancet last year. This study alarmed Dr. Nissen, because, as he wrote in a letter to the Lancet, patients taking Avandia had a 37% greater risk of adverse heart outcomes compared with a placebo, which he found “very disturbing.” And for every cardiac problem studied — such as angina, stroke and heart failure — the pattern was the same: The rate was higher among people taking Avandia. But because the events were so rare (for example, just 15 heart attacks in the Avandia group, compared with nine in the control group), the overall findings weren’t statistically significant. Indeed, Avandia maker GlaxoSmithKline asserted in its response yesterday to the NEJM study that the drug’s users “showed no increase in cardiovascular risk when compared to placebo” in the Dream trial.
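As a rough check on that point, a Fisher exact test on a trial of that shape shows how far 15 events versus 9 is from significance. The arm size of 2,600 patients per group is my assumption for illustration; the column says only that such trials enroll thousands of patients.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: (events, non-events) per arm; arm sizes are assumed.
avandia = (15, 2600 - 15)
placebo = (9, 2600 - 9)

odds_ratio, p = fisher_exact([avandia, placebo])
print(f"odds ratio ~ {odds_ratio:.2f}, p = {p:.2f}")
# With counts this small the p-value comes out well above 0.05, even though the
# point estimate of risk is noticeably higher in the Avandia arm.
```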
Dr. Nissen’s meta-analysis, which included the Dream study, found a similar elevation of risk for heart attacks — 43% higher among Avandia users — and, thanks to the addition of 41 other studies, managed to nudge p just below the 0.05 threshold, to 0.03.
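Mechanically, that pooling can be done with a fixed-effect Peto calculation, a standard choice for rare-event trials. The sketch below uses ten invented trials and is not a reproduction of the published analysis; it only shows how many individually inconclusive 2x2 tables combine into one pooled odds ratio and p-value.

```python
import math

# Each tuple: (events_treated, n_treated, events_control, n_control).
# All counts are invented; no single trial here is significant on its own.
trials = [
    (3, 400, 2, 400), (5, 600, 3, 620), (2, 250, 1, 240), (4, 500, 2, 480),
    (6, 700, 4, 710), (3, 350, 1, 340), (2, 300, 2, 310), (5, 550, 3, 560),
    (4, 450, 2, 440), (3, 380, 1, 390),
]

sum_o_minus_e = 0.0
sum_v = 0.0
for a, n1, c, n2 in trials:
    n = n1 + n2                                    # patients in the trial
    m = a + c                                      # events in the trial
    e = m * n1 / n                                 # expected treated events under "no effect"
    v = n1 * n2 * m * (n - m) / (n * n * (n - 1))  # hypergeometric variance
    sum_o_minus_e += a - e
    sum_v += v

log_or = sum_o_minus_e / sum_v                     # pooled Peto log odds ratio
z = sum_o_minus_e / math.sqrt(sum_v)
p = math.erfc(abs(z) / math.sqrt(2))               # two-sided normal p-value

print(f"pooled odds ratio ~ {math.exp(log_or):.2f}, p = {p:.3f}")
```

With these made-up counts the pooled estimate lands near an odds ratio of 1.7 with p around 0.03, the same qualitative pattern as the published result: many small, inconclusive trials adding up to a significant signal.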
I asked Dr. Nissen why he did a meta-analysis. He replied, “If you have a question you want to ask, and no single clinical trial is large enough to answer the question, then you have no answer at all. But if you can carefully combine the results of several trials, then you can answer the question you otherwise cannot. And that was exactly the situation we faced with Avandia.” About the technique of meta-analysis, he added, “It’s not as statistically powerful as a single large trial, and should never be a substitute. But in the absence of a single large trial, it can be quite helpful.”
Related;
The Avandia meta-analysis: critical appraisal versus hype
Bootstrapping
Avandia Heart Risks and a World Which Never Sleeps
FDA Issues Safety Alert on Avandia
Clinical Trials
Relatively Small Number of Deaths Have Big Impact in Pfizer Drug Trial
Filed under F (for forgotten);
If a drug firm funds three studies and only one shows that its product works, which finding ends up published in the Journal of the American Medical Association? And which studies go unpublished?
Eight years ago, Kabi Pharmacia, now called Pharmacia, decided the answer was the successful study. And last year, that same study, together with three later research trials that grew out of it, served as the sole sources for federal guidelines on the success rates of nicotine inhalers in helping smokers.
Researchers call it the "file drawer effect" — the quiet filing away of disappointing medical experiments. Perhaps one medical study in five enrolls thousands of patient volunteers, continues for years, then disappears, delayed indefinitely or never published.
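To see why the file drawer matters, here is a toy simulation (my sketch, with invented numbers) of a drug with no real benefit tested in three small trials, where only a trial that happens to come out both "significant" and favorable gets published.

```python
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(7)

def one_trial(n=150, rate=0.30):
    """Both arms share the same true success rate: the drug does nothing."""
    drug = rng.binomial(n, rate)
    control = rng.binomial(n, rate)
    _, p = fisher_exact([[drug, n - drug], [control, n - control]])
    return p, drug > control        # "favorable" means the drug arm did better

published_positive = 0
programs = 10_000
for _ in range(programs):
    results = [one_trial() for _ in range(3)]   # the firm funds three studies
    # File-drawer rule: publish only a trial that is both significant and favorable.
    if any(p < 0.05 and favorable for p, favorable in results):
        published_positive += 1

print(f"Share of useless drugs ending up with a published 'success': "
      f"{published_positive / programs:.1%}")
```

Even with a drug that does nothing, a few percent of such programs produce a publishable "success" purely by chance, while the offsetting negative trials stay in the drawer.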
Meta-Analysis and the Filedrawer Effect
Disclosing clinical trials;
This effort is an attempt to deal with the age-old problem of publication bias, a problem supposedly identified by the ancient Greeks, as described in a letter to the editor of the Lancet by Mark Pettigrew;
Diagoras was the original atheist and free thinker. He mocked the Eleusinian mysteries, an autumnal fertility festival which involved psychogenic drug-taking, and was outlawed from Athens for hurling the wooden statue of a god into a fire and sarcastically urging it to perform a miracle to save itself. In the context of publication bias, his contribution is shown in a story of his visit to a votive temple on the Aegean island of Samothrace. Those who escaped from shipwrecks or were saved from drowning at sea would display portraits of themselves here in thanks to the great sea god Neptune. "Surely", Diagoras was challenged by a believer, "these portraits are proof that the gods really do intervene in human affairs?" Diagoras' reply cements his claim to be the "father of publication bias": "yea, but . . . where are they painted that are drowned?"