Tips on Distinguishing Good Metaanalyses From Poor Ones

Oncology NEWS International, Vol 8, No 5

SAN FRANCISCO—Although there are many good metaanalyses, derived from combining the results of numerous solid clinical trials, there are also many “filled with garbage,” Deborah Grady, MD, said at the Seventh Symposium on Clinical Trials: Design, Methods and Controversies. It is incumbent on the physician to be able to distinguish the good from the bad, said Dr. Grady, associate professor of epidemiology, biostatistics and medicine, University of California, San Francisco (UCSF).

A solid metaanalysis of clinical trials should contain several ingredients: a clear research question, a comprehensive and unbiased identification of completed trials, a definition of inclusion and exclusion criteria, and a calculation of a summary estimate of effect and confidence interval, she said.

Dr. Grady used a study on the effect of garlic on total serum cholesterol, published in the Annals of Internal Medicine (October 1993), as an example of what she considers a metaanalysis gone wrong.

Searching the Internet

The metaanalysis found that garlic, in an amount of about one-half to one clove per day, decreased total serum cholesterol levels by about 9%.

Although the research question in this metaanalysis was clear, the researchers merely searched the Internet for studies, Dr. Grady said. Not only did they fail to pursue pertinent unpublished data, but they also had few clear criteria for selecting studies. They eliminated several studies that showed garlic had little effect on cholesterol, even though these studies were large. Most of the studies included were short-term trials, in which it was not clear whether the participants had lifestyle factors, such as a low-fat diet, that would have influenced the outcomes, she said.

This metaanalysis violated an important rule of good metaanalyses, she said. “The metaanalysis should not be performed if the quality of the individual trials is poor and the results are unreliable,” Dr. Grady said. In this case, the trials were evaluated quite casually by resident physicians who were members of the same journal review club as the authors of the metaanalysis, rather than by an independent panel. Even using this approach, the studies received a quality rating of only 3 to 4 points out of 10. “It was a case of garbage in, garbage out,” she said.

In doing a metaanalysis, the researchers should calculate a weighted mean effect estimate—also known as the summary relative risk. This mathematical calculation, in effect, weights each study by its size. The researcher calculates the inverse of the variance of the effect estimate from each study; large studies tend to have a small variance while small studies have a large variance.
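
To illustrate the arithmetic, a minimal sketch of inverse-variance weighting follows (in Python). The study values are hypothetical and are not taken from the garlic trials.

# Minimal sketch of inverse-variance weighting (hypothetical values,
# not data from the garlic metaanalysis).
def summary_estimate(effects, variances):
    """Pool per-study effect estimates, weighting each by 1/variance."""
    weights = [1.0 / v for v in variances]   # large trials have small variance, so get large weight
    total = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, effects)) / total
    pooled_se = (1.0 / total) ** 0.5         # standard error of the pooled estimate
    return pooled, pooled_se

# Three hypothetical trials: change in total cholesterol (mmol/L) and its variance.
effects = [-0.40, -0.25, -0.05]
variances = [0.20, 0.05, 0.01]               # smaller variance = larger trial
pooled, se = summary_estimate(effects, variances)
print(f"pooled effect = {pooled:.3f}, 95% CI = ({pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f})")

In this toy example the largest (smallest-variance) trial dominates the pooled estimate, which is the intended behavior of the weighting.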

Metaanalyses should combine trials that are clinically homogeneous. If they differ in important ways, such as the intervention, outcome, controls, and blinding, then the metaanalysis is likely to be poor. In the garlic analysis, studies that used very different forms of garlic—tablets, spray-dried powder, liquid extract—were lumped together. “So the conclusion had very little power,” she said.

Publication Bias

Dr. Grady also concluded that there was real publication bias in this metaanalysis, since the published studies were unlikely to be representative of all studies on garlic and cholesterol.

If there is no publication bias, there should be no association between a study’s size and its findings. “A strong correlation between study outcome and sample size suggests publication bias,” she said.
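
A crude version of that check is to correlate study size with observed effect across the included trials. The sketch below uses hypothetical numbers, not the garlic data.

# Crude publication-bias check (hypothetical numbers): a strong correlation
# between sample size and observed effect is a warning sign.
from statistics import correlation   # Pearson correlation, available in Python 3.10+

sizes = [25, 60, 150, 400]                 # participants per trial
effects = [-0.45, -0.30, -0.15, -0.05]     # observed change in cholesterol
print(f"size vs. effect correlation: {correlation(sizes, effects):.2f}")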

The biggest drawback to doing a metaanalysis, Dr. Grady said, “is that it can produce a very official and reliable-looking summary estimate from garbage.” And indeed, garbage will be the result if the individual trials are of poor quality.
