Nissen's entire approach is a case study in how to use statistical tools to manufacture biased results. In “Why Most Published Research Findings Are False,” John Ioannidis, a professor at the Institute for Clinical Research and Health Policy Studies, Department of Medicine, Tufts-New England Medical Center, notes that a study relying on meta-analytic findings from inconclusive studies, where pooling is used to “correct” the low power of single studies, is probably false and biased.
Research findings from underpowered, early-phase clinical trials would be true about one in four times, or even less frequently if bias is present. Exploratory epidemiological studies perform even worse, especially when underpowered, but even well-powered epidemiological studies may have only a one in five chance of being true. Nissen's research combines small clinical trials into an exploratory epidemiological study that deliberately excludes patients without heart attacks, does not independently confirm that a heart attack took place, and lacks access to patient-level data, all in order to find a risk he believes is there. The result may or may not be true, but it is certainly biased and likely to be false.
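The arithmetic behind these probabilities comes from Ioannidis's positive-predictive-value formula, which combines statistical power, the significance threshold, the prior odds that a probed relationship is true, and the fraction of biased analyses. The sketch below uses that published formula, but the parameter values are illustrative choices, not figures drawn from any specific Avandia analysis:

```python
def ppv(power, alpha, R, u=0.0):
    """Positive predictive value of a claimed research finding,
    after Ioannidis (2005):
        PPV = ([1-b]R + ubR) / (R + a - bR + u - ua + ubR)
    where b = 1 - power (type II error rate), a = significance level,
    R = prior odds that a tested relationship is true, and
    u = proportion of analyses biased toward a 'positive' result."""
    b = 1.0 - power
    num = (1 - b) * R + u * b * R
    den = R + alpha - b * R + u - u * alpha + u * b * R
    return num / den

# A well-powered trial testing a moderately plausible hypothesis:
print(round(ppv(power=0.80, alpha=0.05, R=0.25), 2))        # → 0.8
# An underpowered, exploratory analysis with some bias present:
print(round(ppv(power=0.20, alpha=0.05, R=0.10, u=0.30), 2))  # → 0.12
```

The point the formula makes concrete: dropping power and adding even modest bias pushes the chance that a "significant" finding is actually true well below a coin flip.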
Indeed, the claim of a 40 percent relative increase in heart attack risk among Avandia users compared to others raises two other troubling questions. First, as Ioannidis observes, independent of molecular or genetic confirmation of cause and effect, “too large and too highly significant effects may actually be more likely to be signs of large bias in most fields of modern research. They should lead investigators to careful critical thinking about what might have gone wrong with their data, analyses, and results.”
For instance, in the 1980s a Swedish epidemiological study found that people with hip and knee replacements had a 30 percent higher risk of kidney cancer than those who had no such surgery. That finding did not cause the authors to run to Congress and the media at a politically sensitive time. Instead, it prompted further epidemiological and observational analysis, which led to the conclusion that the ‘relationship’ between orthopedic implants and kidney cancer was “noise” rather than a signal of something going wrong.
The proper response to the Avandia exercise would be to conduct further research and to put the general risk of heart events in context. Nissen, the authors of the editorials supporting his claim, and the NEJM have done neither. Instead, Nissen ran to the media and Congress with a highly speculative report, and the NEJM gave the article prominence and failed to run a cautionary editorial.