In our Hindsight series, researchers highlight a historical piece of research that was, and is, significant to them. In this post, Julia Koricheva discusses the impact of Gurevitch et al.'s "A meta-analysis of competition in field experiments". Julia is a professor at Royal Holloway and an Associate Editor for Functional Ecology.
The paper which arguably had the biggest impact on my scientific career is "A meta-analysis of competition in field experiments" by Jessica Gurevitch et al., published in the American Naturalist in 1992. While it made an important contribution to research on the effects of competition, it also had a much wider impact on the field because it was one of the first studies to introduce meta-analysis as a new methodological approach to research synthesis in ecology. By now, this paper has accumulated close to 600 citations, not just in ecology but also in other biological fields, and, I would argue, has contributed significantly to changes in the way research is synthesized in ecology today.
Until the mid-1990s, literature reviews in ecology were usually commissioned from established experts in the field. These reviews did not normally have a methods section explaining how papers were selected for the review (experts should know the literature!) and were usually narrative and qualitative, i.e. they did not contain any statistical analyses. This made such reviews subjective, potentially biased and poorly equipped to explain variation in results between studies. The only quantitative approach used in reviews at that time was so-called 'vote-counting', where the reviewer would tally the number of studies showing statistically significant results in the predicted direction, non-significant results, and significant results in the opposite direction. Whichever of these categories got the most 'votes', the reviewer declared that the majority of studies supported the hypothesis, refuted the hypothesis, or that the results were inconclusive. For instance, in one of the influential earlier reviews of field competition experiments by Connell, competition was judged to be occurring if there was a statistically significant response to manipulations of population density. This approach is problematic for several reasons. First, the probability of reaching statistical significance depends on both the magnitude of the effect and the sample size. The number of statistically significant outcomes might therefore differ between two categories of studies either because sample sizes differ between the categories or because the magnitude of the effect differs. Second, vote-counting provides no information on the strength of the effect, only on whether the effect occurs (i.e. the prevalence of competition rather than its strength).
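The tallying procedure described above is simple enough to sketch in a few lines of code. The study outcomes below are entirely hypothetical, coded the way a vote-counting reviewer would classify them:

```python
from collections import Counter

# Hypothetical outcomes of 8 studies, coded as a vote-counting
# reviewer would classify them:
#   '+' = significant in the predicted direction
#   '0' = non-significant
#   '-' = significant in the opposite direction
outcomes = ['+', '0', '+', '0', '0', '-', '+', '0']

votes = Counter(outcomes)            # tally the 'votes'
verdict = votes.most_common(1)[0][0]  # category with the most votes

print(dict(votes))  # {'+': 3, '0': 4, '-': 1}
print(verdict)      # '0' -> reviewer declares the results inconclusive
```

Note what the tally throws away: the four 'non-significant' studies may all show sizeable effects that simply came from small samples, yet the vote-counter would declare the evidence inconclusive. This is precisely the weakness that effect-size-based meta-analysis addresses.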
In their review of experiments examining the effects of competition on biomass, Gurevitch et al. used a completely different approach to research synthesis called meta-analysis, which until then was almost unheard of in ecology and mostly used in medicine and the social sciences (for an interesting perspective on how Jessica Gurevitch discovered the meta-analysis approach and how it changed her career, see this blog).
Gurevitch et al. wanted to know not just whether competition occurred in a study, but how large the effect of competition actually was. To quantify the magnitude of the effect of competition on biomass, they calculated effect sizes based on the difference in biomass between the control and a treatment in which the density of the study organisms had been manipulated. They standardized this difference by dividing it by the pooled standard deviation of the control and treatment. Gurevitch et al. also wanted to know how much the effect of competition varied between studies and what was causing this variation. They compared effect sizes between different trophic levels, different habitats, different levels of population density, etc., and were able to establish which of these characteristics influenced the magnitude of the competition effect.
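The standardized mean difference described above can be sketched as follows. This is a minimal illustration of the general idea (difference in group means divided by the pooled standard deviation), not a reproduction of Gurevitch et al.'s exact estimator, and the summary statistics in the example are invented for illustration:

```python
from math import sqrt

def pooled_sd(sd_t, n_t, sd_c, n_c):
    """Pooled standard deviation of the treatment and control groups."""
    return sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))

def effect_size(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference: (treatment mean - control mean) / pooled SD."""
    return (mean_t - mean_c) / pooled_sd(sd_t, n_t, sd_c, n_c)

# Hypothetical experiment: mean biomass with competitors removed
# (treatment) vs. at ambient density (control)
d = effect_size(mean_t=12.0, sd_t=3.0, n_t=10,
                mean_c=9.0, sd_c=3.0, n_c=10)
print(round(d, 3))  # 1.0 -> biomass one pooled SD higher without competitors
```

Because the difference is expressed in standard-deviation units rather than grams or centimetres, effect sizes from experiments on very different organisms can be combined and compared, which is what makes the between-study comparisons described above possible.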
What I particularly like about Gurevitch et al.'s paper is how accessibly and clearly it is written, which is rare for a paper describing a statistical approach. Because this was one of the first applications of meta-analysis in ecology, Gurevitch et al. started with an overview of meta-analysis. They explained how this approach differs from other forms of research synthesis and described the advantages and the procedure of meta-analysis step by step. Then they proceeded to the ecological questions they addressed in the review. There were also helpful appendices explaining the inclusion criteria for studies in the review and the determination of standard deviations from individual experiments.
I also like the fact that in the discussion, in addition to interpreting their results, the authors addressed broader issues about the accessibility of published data for future research syntheses. For instance, they called for figures to "convey information in as straightforward a manner as possible" and commented that "dazzling three-dimensional designs and imaginative scales of measurement make it difficult or impossible to use the data presented". The study by Gurevitch et al. paved the way for the future use of meta-analysis in ecology and set a high bar for the standard of such reviews. I am sure this paper, and the clarity with which it was written, was among the factors contributing to the exponential growth in the use of meta-analysis in ecology since the early 1990s.
On a personal level, the paper by Gurevitch et al. changed the trajectory of my scientific career. I discovered it several years after its publication, around 1996, partly because competition was not the topic of my research and so it did not look immediately relevant. At the time I was writing a review on how plant resistance to herbivores changes when plants are stressed (e.g. by drought, pollution, or shading). Although the specific topic of the paper by Gurevitch et al. (competition) was quite different from my own research (plant stress), there were many similarities in the challenges faced when attempting to summarize the literature on the subject. Dozens (if not hundreds) of experiments had been performed on the topic, and their results varied. After reading the paper by Gurevitch et al., I realized that the meta-analytic approach would be very suitable for my review as well. With some trepidation, I emailed Jessica Gurevitch and asked her for advice on the review I was writing. I was so excited when she answered me (I guess she was equally excited that someone in ecology was interested in the new approach their paper proposed!).
To cut a long story short, with Jessica's help I was able to complete the review I was working on. I also became hooked on the meta-analysis approach and went on to conduct many more research syntheses on various ecological topics in addition to continuing my field work. Jessica and I also continued our collaboration, which led to the publication of the first Handbook of Meta-Analysis in Ecology and Evolution and a recent review in Nature describing how the field of research synthesis has changed since the introduction of meta-analysis.
The introduction of meta-analysis to ecology has made the process of research synthesis more rigorous, less subjective and more quantitative, thereby facilitating decision making in management and conservation. It has also opened the field of research synthesis to early-career researchers, and it is pleasing to see initiatives by the British Ecological Society's journals, such as the Journal of Animal Ecology's Sidnie Manton award, which encourages early-career ecologists to write a synthesis or review paper that might summarize their dissertation work, provide new insights into classic areas of animal ecology, or shed light on emerging fields in animal ecology.