Cat and fox: agents of Australian extinctions

Australia’s drylands are famous for their assemblage of ultra-cool mammals. As one example, it is difficult for us non-Australians to imagine a more endearing creature than the rock-wallaby pictured below.


Black-footed rock wallaby. Credit: Peter McDonald.

Unfortunately, numerous species of Australia’s dryland mammals are going extinct. Many of these species weigh between 35 and 5500 grams – a range that researchers have described as the critical weight range (CWR). Peter McDonald and his colleagues wanted to know what was causing these extinctions, and why they were most prevalent in the CWR. They considered two hypotheses. First, perhaps the land was becoming less productive, either from habitat destruction by humans or as a result of a changing climate; reduced plant abundance could cause herbivorous mammals to go extinct. Alternatively, perhaps newly introduced predators, notably feral cats and red foxes, were killing the native mammals so effectively that they were disappearing from the Australian drylands.

Previous research indicated that extinction rates were lower in areas that had more species living in trees and around rocks, leading McDonald to think that habitat might be influencing extinctions in important ways. In particular, he realized that rugged mountainous areas might harbor fewer predatory cats and foxes, and that these two predators tend to target prey within the CWR. Putting these ideas together, perhaps mountainous areas are refuges for Australia’s dryland CWR species, protecting them from predator-driven extinction. If so, mammal species richness would be highest in rugged, protected areas, and lowest in more open areas. If, on the other hand, mammals are going extinct because overall productivity is declining, we would expect overall species richness to be greatest in the most productive areas.
McDonald and his colleagues tested these two competing hypotheses by censusing mammals in four different types of habitats in Tjoritja National Park within the MacDonnell Ranges of central Australia. These were (1) mountain areas dominated by a sparse assemblage of shrubs and clumps of spinifex grass, (2) spinifex grasslands (with a more abundant cover of spinifex than found in the mountains), (3) Acacia shrublands, and (4) alluvial woodlands, which were the most productive and had the richest soils.




Mountains. Credit: Peter McDonald



Spinifex grasslands


Acacia shrubland


Alluvial woodland

The researchers set up a variety of different mammal traps at 90 different sites representing these four habitats to capture and identify small mammals, and they detected larger mammals by searching for fresh scat at each site. The researchers estimated productivity with the normalized difference vegetation index (NDVI), which uses satellite imagery to measure the greenness, and hence productivity, of a site or region.
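NDVI boils down to a simple band ratio. The sketch below is my own illustration (not the authors' code) of the standard formula, NDVI = (NIR − red) / (NIR + red), applied to hypothetical reflectance values:

```python
# Illustrative sketch of the NDVI formula. Healthy green vegetation
# reflects strongly in the near-infrared (NIR) and absorbs red light,
# so NDVI approaches 1 over productive sites and 0 over bare ground.

def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index for one pixel."""
    if nir + red == 0:
        return 0.0  # avoid division by zero (e.g., deep shadow)
    return (nir - red) / (nir + red)

# Hypothetical reflectance values for two contrasting sites
print(round(ndvi(0.45, 0.10), 2))  # woodland-like pixel -> 0.64
print(round(ndvi(0.25, 0.18), 2))  # sparse rocky pixel -> 0.16
```

Averaging pixel values like these over a site gives the single productivity score the researchers compared across habitats.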

In support of the predation hypothesis, more mammal species were found in the most rugged terrain.


Number of mammal species per site in relation to ruggedness of terrain. The curve is the fitted value of the regression equation.  The shaded area represents the 95% confidence interval.

In contrast to the productivity hypothesis, fewer mammal species were found in the most productive sites.


Number of mammal species per site in relation to productivity of terrain as measured by the NDVI. The curve is the fitted value of the regression equation.  The shaded area represents the 95% confidence interval.

While it’s useful to evaluate both hypotheses by measuring current species richness, the researchers also needed to know how many species were actually driven to extinction in the time since cats and foxes invaded. They reconstructed historic species richness for each habitat based on subfossil remains (remains of organisms that are only partially fossilized), on Indigenous knowledge supplied by Aboriginal Australians, and on historical accounts in the early literature.

They discovered that CWR extinctions were most prevalent in alluvial (12/12 species) and acacia (7/7 species) habitats. Spinifex habitats lost 5/6 CWR species, while mountainous habitats lost only 2/6 CWR species. Importantly, species outside of the CWR have survived relatively well in all habitats, further implicating cats and foxes as the agents of extinction.


Current (extant) and historic (pre-invasion by cats and foxes) mammalian species richness in the four habitats. The dots are the mean weight, and the lines are the weight ranges for each species.  The shaded area represents the critical weight range (CWR).

More support for the predation-habitat link comes from recent research indicating that red foxes are absent from the mountain habitat, while feral cats are substantially less abundant there. Even when present, cats are much less efficient hunters in the mountain habitat because the complex rock structure affords more refuges to prey.


Feral cat captured on camera with a fat-tailed Antechinus. Credit Tony Griffiths.

Across Australia, many CWR species have gone extinct in regions colonized by cats and foxes. McDonald and his colleagues provide solid evidence that these introduced predators are responsible for these extinctions. They urge researchers to explore other mountainous regions in Australia to see if they too are acting as refuges for CWR mammals.

note: the paper that describes this research is from the journal Conservation Biology. The reference is McDonald, P. J., Nano, C. E. M., Ward, S. J., Stewart, A., Pavey, C. R., Luck, G. W. and Dickman, C. R. (2017), Habitat as a mediator of mesopredator-driven mammal extinction. Conservation Biology, 31: 1183–1191. doi:10.1111/cobi.12905. Thanks to the Society for Conservation Biology for allowing me to use figures from the paper. Copyright © 2017 by the Society for Conservation Biology. All rights reserved.

River restoration responses

The Lippe River in Germany has been subjected to many decades of channelization, deepening, floodplain drainage, straightening and consequent shortening, with one result being that the modern Lippe is 20% shorter than it was two centuries ago. Beginning in 1996, conservation managers began reversing this trend by widening the river, raising the level of the river bed, constructing small islands within the river and terminating floodplain drainage operations over a stretch of 3.3 km. As a result of these activities, a small portion of the river looks much like it did 200 years ago.


A section of the Lippe River before (left) and after (right) restoration.

Over a 21-year period, researchers from the Arbeitsgemeinschaft Biologischer Umweltschutz have conducted systematic surveys of fish communities at the restored and unrestored sections of the river. Researchers sampled the fish community with electrofishing – passing a direct electrical current through the water – which causes the fish to swim towards the boat, where they are easily collected with nets, identified by species, and returned unharmed to the river. A data set of this length in association with a restoration project is very unusual; oftentimes (in part due to funding issues) only one survey is conducted to assess the fish community response to river restoration.

About eight years ago, while a postdoctoral researcher at Senckenberg Research Institute in Frankfurt, Germany, Stephan Stoll was asked to analyze some river restoration outcomes, and, as he describes, “became hooked to the topic.” To evaluate the response of the Lippe River fish community to restoration, a group of researchers headed by Stephanie Höckendorff, a Master’s student with Stoll, first asked a very simple question – how did fish abundance and species richness (the number of fish species) compare in the restored and unrestored regions of the river?

The graph below shows several striking trends. Abundance peaked about 2-3 years after restoration, declined sharply the next year, and recovered in subsequent years to about three times the abundance found in unrestored sections. Importantly, abundance varied extensively year-to-year. For example, if you had done only one survey in 2000, you would have erroneously concluded that restoration had no effect, which is why the researchers emphasize the importance of collecting data over a long stretch of time.


Abundance of fish in restored (Rest-gray curve) and unrestored (Cont-black curve) sections of the Lippe River.  The gray vertical bar indicates the start of the restoration project in 1997.

Species richness increased sharply, but did not reach its peak until nine years after restoration. Again, there was extensive year-to-year variation in species richness.


Fish species richness in restored (Rest-gray curve) and unrestored (Cont-black curve) sections of the Lippe River.  The gray vertical bar indicates the start of the restoration project in 1997.

Höckendorff and her colleagues were intrigued by this delay in species richness, and turned their attention to understanding what types of species benefited most from the restoration. Their analyses indicated that colonizing species, such as common minnows and three-spined sticklebacks, tended to have short life spans, early female maturity, several spawning events per year and a fusiform body shape – a body that is roughly cylindrical and tapers at both ends. Interestingly, some of the most successful colonizers took quite a long time to get well-established within the community.


Common minnows, Phoxinus phoxinus. Credit: Carlo Morelli (Etrusko25)


The three-spined stickleback, Gasterosteus aculeatus. Credit: Ron Offermans

The restored habitat was highly dynamic, experiencing periodic flooding and the formation of temporary shallow bays and shifting sandbanks. These types of habitats tend to select for minnows, sticklebacks and other opportunistic species that are attracted to periodic disturbances. These opportunistic species were quick to move in, and continued to increase in abundance over time. Importantly, several rare and endangered species also colonized the restored habitat. However, large, deep-bodied, slow maturing and long-lived species did not benefit (at least over the 17 years of the survey), as these types of species are generally favored in less dynamic habitats, which are more stable and uniform.

Overall, these findings demonstrate the benefits of river restoration to the fish communities they harbor. But some species are more likely to benefit than others, and the time-scale over which recolonization occurs is highly variable. Surveys must be repeated over a long time-scale to tell conservation managers whether their restoration efforts are successful, and how they might change their future river restoration efforts.

note: the paper that describes this research is from the journal Conservation Biology. The reference is Höckendorff, S., Tonkin, J. D., Haase, P., Bunzel-Drüke, M., Zimball, O., Scharf, M. and Stoll, S. (2017), Characterizing fish responses to a river restoration over 21 years based on species’ traits. Conservation Biology, 31: 1098–1108. doi:10.1111/cobi.12908. Thanks to the Society for Conservation Biology for allowing me to use figures from the paper. Copyright © 2017 by the Society for Conservation Biology. All rights reserved.

Sushi in Disguise

As an ecology researcher, I’ve always been attracted to systems where you might be able or inclined to eat your organism after you completed your experiment or observations. Alas, I spent my research career studying spiders, dragonflies and zebrafish, all of which are high on nutrients but low on succulence. Thus I read with considerable gastronomic anticipation an article by Demian Willette and his colleagues that studied nine different species of fish served up at local sushi restaurants in Los Angeles, California.


Mackerel, salmon and tuna (front to back) served at a Los Angeles sushi restaurant. Credit Demian Willette.

One of the co-authors, Sara Simmonds, had a great idea when she was a teaching assistant for the Introduction to Marine Science course at UCLA in 2012. Simmonds suggested that students in the class could investigate whether seafood served at sushi restaurants was always what it claimed to be, or whether it sometimes traveled under a false identity. For example, is red snapper (which does not occur in California waters) really red snapper, or might merchants substitute one of 13 rockfish species in its stead? This project allowed students to investigate a real-world marine-related topic, while also getting some experience using and applying molecular genetics tools.

Over the course of the four-year study, students ordered sushi from 26 different restaurants, confirmed the species identification with the wait-staff, and collected tissue samples from each order. They then subjected the samples to DNA barcoding, which amplifies and sequences an approximately 650-base-pair segment of the mitochondrial COI gene. Once they determined the DNA sequence, students compared it with known sequences using the Basic Local Alignment Search Tool (BLAST) database (National Center for Biotechnology Information).
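The matching step can be pictured with a toy example. The sketch below is purely illustrative – the real identifications used NCBI BLAST against curated reference sequences, and the short “barcodes” here are made up – but it shows the basic idea of scoring a query fragment against references by percent identity:

```python
# Toy sketch of barcode matching (NOT real COI sequences, and not the
# BLAST algorithm): score a query fragment against each reference by
# simple ungapped percent identity and report the closest match.

references = {
    "red seabream": "ACCTTATACCTAATCTTCGGTGCA",   # hypothetical fragment
    "red snapper":  "ACCCTGTATCTAGTATTTGGTGCC",   # hypothetical fragment
}

def percent_identity(a: str, b: str) -> float:
    """Share of matching bases, position by position."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / min(len(a), len(b))

# Barcode read from a dish sold as "red snapper"
query = "ACCTTATACCTAATCTTCGGTGCA"
best = max(references, key=lambda name: percent_identity(query, references[name]))
print(best)  # -> red seabream: the order was mislabeled
```

BLAST does far more than this (local alignments, gaps, statistical significance against millions of sequences), but the logic of "find the reference most similar to the query" is the same.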

Each year, between 40 and 52% of the fish were mislabeled. Though previous studies by other researchers had identified mislabeling, Willette and his colleagues were surprised that all 26 restaurants had at least one case of mislabeling, and that the mislabeling rate was so consistent from one year to the next.


Percentage of sushi mislabelled (left y-axis – bar symbol) and number of restaurants sampled (right y-axis – diamond symbol) by year.  Number in bar is sample size for that year.

Overall substitution rates varied dramatically from one species to another. All fish species, except bluefin tuna, were mislabeled at least once, and two species – red snapper and halibut – were always mislabeled. Red snapper was often replaced with red seabream, while halibut was usually replaced with flounder.


Percentage mislabeled (+ standard error) for each species in the study. Numbers above bars are number mislabeled (left) and total sample (right).  For example, 6 out of 47 salmon were mislabeled.

Why should we care if we’re served the wrong species of fish, as long as it tastes good? As it turns out, there are several reasons. About 33% of halibut are substituted with olive flounder, which can harbor the parasite Kudoa septempunctata, a known cause of severe food poisoning. In addition, some of the other halibut substitutes are actually overfished flounder species, so substituting these for halibut depletes already at-risk fisheries. Similar problems, in which an at-risk species substitutes for the mislabeled species, were common in tuna and yellowtail as well.

The researchers recommend that seafood mislabeling be attacked at all stages of the seafood supply chain. All seafood should be labeled with species, place of origin, and the type of fishing practice used. Inspectors must be trained to identify seafood – perhaps using portable, hand-held DNA sequencers. Retailers should be told when they sell mislabeled species, so they can insist that their suppliers deliver the correct goods. Finally, social media can be used to inform the public of consistent mislabeling, so consumers can pressure retailers to make sure that a red snapper is what it claims to be.

note: the paper that describes this research is from the journal Conservation Biology. The reference is Willette, D. A., Simmonds, S. E., Cheng, S. H., Esteves, S., Kane, T. L., Nuetzel, H., Pilaud, N., Rachmawati, R. and Barber, P. H. (2017), Using DNA barcoding to track seafood mislabeling in Los Angeles restaurants. Conservation Biology, 31: 1076–1085. doi:10.1111/cobi.12888. Thanks to the Society for Conservation Biology for allowing me to use figures from the paper. Copyright © 2017 by the Society for Conservation Biology. All rights reserved.

Seagrass scourge: when nutrient enrichment reaches the tipping point

Sean Connell has watched as South Australia has lost vast expanses of kelp forest and seagrasses over the years. One of the primary culprits associated with the loss of seagrass meadows is excessive nutrients, particularly nitrogen, which enters the ecosystem with runoff and causes an increase in algal epiphytes (epiphytes are small plants that grow on other plants). Epiphytes can negatively affect seagrass directly, by blocking sunlight needed for photosynthesis, and indirectly, by increasing the rate of cellular respiration within the ecosystem, thus using up oxygen needed by seagrass for metabolic processes.


Two dolphins swim above a bed of seagrass off the South Australian coast.

Connell and his colleagues noticed that seagrass loss was often sudden; a large seagrass meadow would appear to be in good shape, and then it would abruptly disappear. They suggested that there might be a threshold effect in the nutrient levels that seagrasses can tolerate; that these systems function well until a certain threshold in nutrient levels is crossed, above which there is an abrupt loss of seagrasses. They tested this hypothesis by subjecting plots of the seagrass Amphibolis antarctica to seven different concentrations of dissolved inorganic nitrogen (DIN) over a 10-month period, and monitoring the abundance of epiphytes and seagrass over that timespan.

The meadows were about two km offshore from Lady Bay, Fleurieu Peninsula, Australia, in about 5 meters of water. Different amounts of nitrogen fertilizer were wrapped in nylon bags (for slow, continuous release of DIN) and staked to the ocean floor. Amphibolis antarctica grows by producing new leaves at the top of each leaf cluster while dropping old leaves. Leaf turnover, the researchers’ measure of growth, is simply new leaf production minus old leaf drop. The researchers tied a small nylon cable tie at known locations on selected plants, noted how many leaves were above and below each tie at the beginning of the experiment, and recounted leaf number 10 months later. Finally, the researchers measured epiphyte growth by microscopically viewing a sample of seagrass leaves and counting the number of seagrass leaf cells that were covered by epiphytes.

Seagrass growth was relatively unaffected by all tested DIN levels.


Leaf production per day in relation to concentration of DIN.

However, leaf drop showed a strong threshold effect; leaf drop rates increased sharply between 0.13 – 0.15 mg/L of DIN.


Leaf drop per day in relation to concentration of DIN.

Putting these two graphs together, you can see (below) that leaf turnover switched from positive to negative at 0.13 – 0.15 mg/L of DIN. Negative leaf turnover translates to a sudden loss of seagrass at that threshold. At least in this system, at this location, 0.13 – 0.15 mg/L of DIN is the tipping point, beyond which the seagrass system suddenly goes into decline.


Leaf turnover per day (left y-axis and red data), and Epiphyte cover (% – right y-axis and green data), in relation to concentration of dissolved inorganic nitrogen.

The graph also shows that the tipping point coincides with an epiphyte cover of approximately 60%. It is possible that increased epiphyte cover may reduce seagrass photosynthetic rates (particularly in lower leaves), so that leaf turnover suddenly shifts into the negative zone, but the study was not designed to identify the underlying mechanism.
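The tipping-point logic is easy to sketch in a few lines. The numbers below are hypothetical stand-ins, not the measured data, but they illustrate how a threshold emerges once leaf drop overtakes leaf production:

```python
# Hedged sketch of the tipping-point logic with made-up numbers:
# turnover = leaf production minus leaf drop, and the meadow declines
# once turnover goes negative. Production stays roughly flat while
# leaf drop climbs with DIN, mimicking the pattern in the figures.

din_levels = [0.00, 0.05, 0.10, 0.13, 0.15, 0.20, 0.30]            # mg/L DIN
production = [0.020, 0.021, 0.020, 0.020, 0.019, 0.020, 0.019]     # leaves/day
leaf_drop  = [0.012, 0.013, 0.014, 0.019, 0.025, 0.031, 0.038]     # leaves/day

def tipping_point(din, prod, drop):
    """First DIN level at which net turnover (prod - drop) turns negative."""
    for d, p, l in zip(din, prod, drop):
        if p - l < 0:
            return d
    return None  # no tipping point within the tested range

print(tipping_point(din_levels, production, leaf_drop))  # -> 0.15
```

With these illustrative values the sign flip happens at 0.15 mg/L, echoing the 0.13 – 0.15 mg/L threshold the study reported.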

Seagrass meadows perform important ecosystem services, such as absorbing excess nutrients from the sediment, and providing habitat and food for a diverse group of grazers and indirectly, for their consumers. Thus seagrass conservation is vital. The danger here is that moderate levels of nutrients do not appear to have much effect on seagrass populations, but there is an abrupt shift to seagrass loss once the nutrient threshold is crossed. This makes the system very difficult to manage, because the loss occurs without warning. Australian ecologists have repeatedly failed to restore lost seagrass meadows, as simply reducing nutrient levels does not reverse the process. Thus anticipating seagrass loss before it happens is the most viable management solution for this critical ecosystem.

note: the paper that describes this research is from the journal Conservation Biology. The reference is Connell, S. D., Fernandes, M., Burnell, O. W., Doubleday, Z. A., Griffin, K. J., Irving, A. D., Leung, J. Y.S., Owen, S., Russell, B. D. and Falkenberg, L. J. (2017), Testing for thresholds of ecosystem collapse in seagrass meadows. Conservation Biology, 31: 1196–1201. doi:10.1111/cobi.12951. Thanks to the Society for Conservation Biology for allowing me to use figures from the paper. Copyright © 2017 by the Society for Conservation Biology. All rights reserved.

Crawling with caterpillars courtesy of climate change – and ants

In the book of Exodus, Yahweh inflicts upon the Egyptians ten plagues, several of which have biological bases. Plagues two, three, four and five are frogs, lice, wild animals and diseased livestock. But it is the eighth plague that is relevant to today’s tale – the locust explosion. As it turns out, insect populations have periodically exploded throughout recorded history (and no doubt before), and for many years ecologists have been trying to understand why insect populations are so variable. Rick Karban has taught a field course at Bodega Marine Reserve, California, since 1985, and, as he describes, “In some years, the bushes are dripping with caterpillars and in others they are very difficult to find.  The wooly bears (Platyprepia virginalis) are so conspicuous and charismatic that I couldn’t help wondering what was responsible for their large swings in abundance (they are more than 1,000 times as abundant in big years than in lean ones).”


Wooly bear caterpillar density during annual surveys conducted in March of each year.

The early-stage caterpillars are most common in wet, marshy habitats, but as they develop, they move to drier upland habitats where they pupate, metamorphose into moths and mate. Young caterpillars live in leaf litter, eating vegetation and decaying organic matter.


Late instar (close to pupation) wooly bear caterpillar feeding on bush lupine. Credit Rick Karban.

Karban and his colleagues recognized that insect populations are sensitive to climate, and wondered whether climate change may be playing a role in Platyprepia population explosions. But there’s much more to climate change than global warming; for example, many areas of the world expect much more variable precipitation patterns, with more big storms and more droughts. Karban and his colleagues wanted to know whether variable precipitation might affect wooly bear populations. So they examined rainfall records between 1983 and 2016, and found that numerous heavy rainfall events (over 5 cm) in the previous year were correlated with increases in caterpillar abundance.


Change in caterpillar abundance in relation to number of heavy rainfall events (over 5 cm) during the previous year. Note the y-axis is the natural logarithm of the change in abundance.
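The rainfall metric itself is simple to compute. Here is a minimal sketch with made-up daily totals (the real analysis used the 1983–2016 rainfall record for the site):

```python
# Illustrative computation of the predictor the researchers used:
# the number of heavy rainfall events (> 5 cm) in a year. The daily
# totals below are hypothetical, not the Bodega Bay record.

daily_rain_cm = [0.0, 6.2, 1.1, 5.5, 0.0, 12.3, 4.9, 0.2]

heavy_events = sum(1 for r in daily_rain_cm if r > 5)
print(heavy_events)  # -> 3 heavy events in this made-up sample
```

It was this count for the previous year, not total rainfall, that correlated with the change in caterpillar abundance.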

Karban and his students explored three hypotheses for why caterpillars increased following a year with numerous heavy rainfall events. First, perhaps more rain causes more plant growth and deeper litter, providing extra food for caterpillars. Second, heavy rains may reduce the number of predacious ground-nesting ants. Lastly, heavy rains may produce deeper, denser litter, providing refuge from predacious ants.

The researchers tested the litter-as-food hypothesis by comparing caterpillar growth rates during the summer, which usually has very little rainfall. They weighed individual caterpillars, placed them into cages and supplied them with litter from either wet or dry sites. After 30 days, they reweighed the caterpillars and found that all of them had lost weight, and that there were no significant differences in weight change between wet and dry sites. Thus, at least during the summer, there was no evidence that wet sites had better food for caterpillars.

Karban and his colleagues turned their attention to ants.


If ants stayed away from wet sites, that would suggest that rainy years may benefit caterpillars by reducing the number of ants in their habitat. To measure ant abundance, the researchers set out bait stations supplied with a sugar-laced cotton ball and 1 cm3 of hot dog. They discovered that many more ants – in particular, many more of the fearsome caterpillar-killer Formica lasioides – were recruited to dry sites than to wet sites. This suggested that years with numerous rainfall events might reduce ant abundance, at least in the wet areas preferred by young caterpillars.

The researchers tested the ant predation hypothesis by caging caterpillars in plastic deli containers that had either window-screen bottoms, which allowed ants to enter but prevented the caterpillars from leaving, or spun polyester bottoms that prevented ant access. At each of 12 field sites, the caterpillars were caged with litter that matched the depth and wetness of litter found at that site. All caterpillars protected from ants survived, while 40% of the unprotected caterpillars from dry sites and 23% of the unprotected caterpillars from wet sites were killed by predators. So ants are clearly fearsome predators, but more so under dry conditions.


A Formica lasioides ant subdues an early instar wooly bear caterpillar within the confines of a deli box. Credit Rick Karban.

But does litter wetness help protect against predacious ants? To investigate this question, the researchers placed caterpillars in deli containers that permitted ant access. At each site, two containers were placed side-by-side; one contained a caterpillar + litter from a wet site, while the other contained a caterpillar + litter from a dry site. Both containers were completely filled with litter and left in the field for 48 hours. The researchers discovered that caterpillars were 26% more likely to avoid predation if they were in a container stuffed with litter from a wet site. This suggests that litter from wet sites acts as a refuge for caterpillars against predators.


Caterpillar survival rate in relation to litter wetness.

Unfortunately, no long-term data on ant abundance are available, so we don’t know the relationship between ant and caterpillar abundance over time. But when ants were excluded, caterpillars survived well, and when ants were present, caterpillars survived best in wet sites with deep litter. It is not clear why caterpillars survive ant predation better in wet litter. One possibility is that caterpillars are more active than ants at cooler temperatures, and may be more likely to avoid them in wet and cool conditions. A second possibility is that dry litter is structurally less complex than wet litter, and ants may be more likely to move efficiently to capture caterpillars in dry terrain.

Given the predictions for more rainfall variability in coming years, Karban and his colleagues expect caterpillar abundance to fluctuate even more dramatically from year to year. In this system, and presumably other insect populations as well, multiple factors interact to determine whether there will be a population outbreak reminiscent of Pharaoh’s experience early in recorded history.

note: the paper that describes this research is from the journal Ecology. The reference is Karban, Richard, Grof-Tisza, Patrick, and Holyoak, Marcel (2017), Wet years have more caterpillars: interacting roles of plant litter and predation by ants. Ecology, 98: 2370–2378. doi:10.1002/ecy.1917. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2017 by the Ecological Society of America. All rights reserved.

Changing climate promotes prolific plants and satiated consumers

Plants in Sweden can have a difficult life, but climate change has provided a more benign environment for some of them, including the white swallow-wort, Vincetoxicum hirundinaria. This perennial herb grows in patches in sun-exposed rocky areas, in forests located below cliffs, and along the edges of wooded areas. The plant forms clumps that are heavily laden with flowers in June and July, and creates pod-like fruits in July and August.


Vincetoxicum hirundinaria growing in rocky outcrop (top photo). Vincetoxicum pods releasing their wind-dispersed seeds (bottom photo).

Christer Solbreck has had a lifelong interest in insect populations, and he has been following the insects that eat Vincetoxicum’s seeds for the past 40 years. As he described to me, surprisingly few population ecologists actually measure the amount of food available to insects. I should add that very few people have the resilience to study the same population of insects for 40 years, either. And interestingly, though this paper discusses the effect of a changing climate on seed production and seed predation, it was not Solbreck’s intent to consider climate change as a variable when he began, as climate change was not a concern of most scientists in the 1970s.

But climate change has happened in southeastern Sweden (and elsewhere), and has affected ecosystems in many different ways. Ecologists can quantify climate change by describing its effect on the vegetation period, or growing season (days above 5°C), which has increased by about 20 days since the mid 1990s.


Length of growing season (vegetation period) in southern Sweden.

During the same time period the abundance of Vincetoxicum has increased sharply.


Vincetoxicum abundance, measured as area of the research site covered, during the study.

You will note that “Vincetoxicum” has the word “toxic” in its midst; the seeds are toxic to most consumers, and are important food sources for only two insect species. Euphranta connexa females lay eggs in developing fruits of the host plant, with the emerging larva boring through the seeds and killing most of them. Lygaeus equestris is an all-purpose seed predator; both larvae and adults suck on flowers, on developing seeds within the fruits, and on dry seeds they find on the ground up to a year later.


Euphranta connexa female lays eggs in an immature seed pod.


Lygaeus equestris larva feeds on a fallen seed.

Solbreck teamed up with biostatistician Jonas Knape to analyze his data. From the beginning of the study, Solbreck suspected that annual variation in weather – particularly rainfall – might influence Vincetoxicum seed production, and consequent population growth of the two insect species. They discovered something quite unexpected; the dynamics of seed production shifted dramatically in the second half of the study, alternating annually from very high to very low production over that period. This dynamic shift coincides with the extension of the growth season as a result of climate change.


Seed pod abundance by year.

The researchers argue that there is a non-linear negative feedback relationship of the previous year’s seed production on the current year’s seed production. Negative feedback occurs when an increase in one factor or event causes a subsequent decrease in that same factor or event. In this case, an increase in seed production uses up plant resources, leading to a decrease in seed production the following year. But the effect is non-linear, and does not come into play unless Vincetoxicum produces a huge number of seeds, as shown by the graph below.


Seed production in the current year in relation to seed production in the previous year. Note that both axes are logarithmic. The curve represents the expected seed pod density generated by the statistical model, with the shaded area representing the 95% credible intervals. Open circles are data for 1977-1996, while closed circles are data for 1997-2016.
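One way to build intuition for this feedback is a toy simulation – emphatically my own sketch, not the authors' fitted statistical model – in which seed crops below a resource threshold grow steadily, while crops above the threshold deplete reserves and force a small crop the following year:

```python
# Toy model of a non-linear negative feedback (all numbers and the
# functional form are my invention for illustration): small crops
# recover, but a crop above the threshold drags down next year's crop,
# producing the high/low alternation seen in the second half of the study.

def next_seeds(current: float, capacity: float, threshold: float) -> float:
    """Next year's seed crop given this year's (arbitrary units)."""
    if current <= threshold:
        return current * 1.5                    # recovery: reserves were spared
    return capacity * threshold / current       # depletion after a mast year

crop = 40.0
series = []
for year in range(6):
    series.append(round(crop))
    crop = next_seeds(crop, capacity=100.0, threshold=30.0)
print(series)  # alternating high/low crops: [40, 75, 40, 75, 40, 75]
```

The real model is statistical rather than deterministic, but the qualitative behavior is the same: once the lengthened growing season pushed seed crops above the feedback threshold, production began to see-saw from year to year.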

The researchers also found that high rainfall in June and July increased seed production.

So how do these wild fluctuations in seed production affect insects and the plant itself? One important finding is that in high seed production years, the proportion of seeds attacked by insects plummets because the sheer number of seeds overwhelms the seed-eating abilities of the insect consumers. Ecologists describe this phenomenon as predator satiation.


Seed predation rates in relation to seed pod density.  Note that both axes are logarithmic. The curve represents the expected predation rate generated by the statistical model, with the shaded area representing the 95% credible intervals. Points are E. connexa predation rates while triangles are combined predation by both insect species.
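Predator satiation falls out of simple arithmetic: if the insects can only handle a roughly fixed number of pods per year, the proportion attacked must drop as pod density rises. A minimal sketch (the ceiling of 300 pods is an invented number, not an estimate from the paper):

```python
def proportion_attacked(pods, max_attacked=300.0):
    """Sketch of predator satiation: the seed predators can attack at
    most `max_attacked` pods per year (illustrative value), so the
    proportion attacked declines once pod density exceeds that ceiling."""
    return min(1.0, max_attacked / pods)

for pods in (100, 300, 3000, 30000):
    print(pods, proportion_attacked(pods))
# proportion falls from 1.0 to 0.1 to 0.01 as pod density rises tenfold
```

In low-production years every pod is within reach of the predators; in mast years the vast majority escape untouched.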

As a result of predator satiation, there were, on average, seven times as many healthy (unattacked) seed pods in 1997-2016 as in 1977-1996. Presumably, this increased number of healthy seeds translates into more new plants becoming established in the area. An important take-home message is that the entire dynamics of an ecosystem can change in response to changes in the environment – in this case, climate change. More long-term studies are needed to evaluate how common these shifting dynamics are likely to become in the novel environmental conditions we humans are creating.

note: the paper that describes this research is from the journal Ecology. The reference is Solbreck, Christer and Knape, Jonas (2017). Seed production and predation in a changing climate: new roles for resource and seed predator feedback? Ecology 98: 2301–2311. doi:10.1002/ecy.1941. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2017 by the Ecological Society of America. All rights reserved.

Powdery parasites pursue pedunculate oak

Studying disease transmission is tricky for many reasons. Most humans frown on what might seem like the easiest experimental protocol – release a disease into the environment and watch to see how it spreads. For his doctoral dissertation in 2006, Ayco Tack settled on a different experimental protocol – bring the potential hosts to the disease. In this study, staged in Finland, the hosts were pedunculate oak trees, Quercus robur, and the disease was the powdery mildew parasite, Erysiphe alphitoides. Almost 10 years later, Adam Ekholm continued research on the same system, with Tack as his co-supervisor.


Trees on the move. Credit: Ayco Tack.

But before moving trees around, the researchers first needed to see how the disease moved around under field conditions. Within a tree stand, powdery mildew success will depend on how many trees it occupies, how many trees it colonizes in the future, and how many trees it disappears from (extinction rate). The researchers measured these rates over a four-year period (2003–2006) on 1868 oak trees situated on the island of Wattkast in southwest Finland. They also measured the spatial connectivity of each tree to the others in the stand. In this case connectivity is a measure of the distance between a tree and other trees, weighted by the size of the other trees. So a tree with many large neighbors nearby has high connectivity, while a tree with a few distant and mostly small neighbors has low connectivity. Results varied from year to year, but in general, the researchers found higher infection rates, lower extinction rates, and some evidence of higher colonization rates in trees with high connectivity.
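A size-weighted, distance-discounted connectivity score can be made concrete with a short sketch. This follows the spirit of Hanski-style incidence-function connectivity (sum of neighbor sizes discounted by distance); the exact formula and decay constant used on Wattkast are assumptions here:

```python
import math

def connectivity(focal, trees, alpha=0.01):
    """Connectivity of one tree to the rest of the stand: each neighbor
    contributes its size, discounted exponentially by its distance from
    the focal tree, so nearby large neighbors contribute most.
    `focal` and each entry of `trees` are (x, y, size) tuples, with
    coordinates in meters; alpha is an assumed distance-decay constant."""
    fx, fy, _ = focal
    total = 0.0
    for x, y, size in trees:
        d = math.hypot(x - fx, y - fy)
        if d == 0.0:
            continue  # skip the focal tree if it appears in the list
        total += size * math.exp(-alpha * d)
    return total

# A tree with large, close neighbors scores far higher than an
# isolated tree near a few distant ones.
stand = [(0, 0, 5.0), (10, 0, 8.0), (15, 5, 6.0)]
print(connectivity((5, 0, 4.0), stand))    # well-connected tree
print(connectivity((300, 300, 4.0), stand))  # isolated tree
```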


Oak leaf infected with powdery mildew parasite. Credit: Adam Ekholm.

The importance of connectivity indicated that the parasite simply could not disperse efficiently to distant trees. But perhaps the environment plays a role in colonization rates as well. For example, fungi like powdery mildew tend to thrive in shady, humid environments. Thus a tree out in the open might resist colonization by powdery mildew more effectively than would a tree deep in the forest. To test this hypothesis, Tack and his colleagues placed 70 trees at varying distances (up to 300 meters) from an infected oak stand. On one side of the oak stand was an open field, while on the other side was closed forest. Thus two variables, distance and environment, could be investigated simultaneously.


Ayco Tack inspects an oak tree placed in an open field. Credit: Tomas Roslin.

The researchers collected infection data twice: once in the middle of the growing season (July) and again at the end of the growing season (September). Not surprisingly, infection rates were higher by the end of the growing season. In general, infection rates and infection intensity (mildew abundance) were higher in the forest than in the field, indicating a strong environmental effect. In the July survey, trees farther from the oak stand had lower infection intensity, but as infection rates increased over the course of the season, the effects of distance diminished, particularly in the forest.


Upper two graphs show the impact of habitat type on (a) proportion of trees infected and (b) mildew abundance. The lower two graphs are the influence of distance from parasite source on mildew abundance of trees set in (c) a forest habitat and (d) an open field. Mildew abundance was scored on an ordinal scale with 0 = none and 4 = very abundant.

Ten years later, Adam Ekholm, as part of his PhD dissertation on how climate affects the insect community of oak trees, added a third element to the mix – the influence of genes on disease resistance. He wondered whether certain genotypes were more resistant to powdery mildew infection. The researchers grafted twigs from 12 large “mother” trees, creating 12 groups of trees with 2 to 27 trees per group (depending on grafting success). Each tree in a given group was thus genetically identical to all other trees within that group.


Oak tree placed in the forest. Credit: Ayco Tack.

The researchers chose a site that contained a dense stand of infected oaks, but was surrounded by a grassy matrix containing only an occasional tree. To study the impact of early season exposure, Ekholm and his colleagues divided the trees into two groups: 128 trees were placed in the matrix at varying distances from the infected stand, while 58 trees were placed directly in the midst of the stand for about 50 days and then moved varying distances away. The researchers scored trees for infection at the end of the growing season (mid-September).


Trees that spent 50 days within the oak stand had much higher infection frequency and intensity than trees that were initially placed in the matrix. Some genotypes (for example genotype I in graphs C and D below) were much more resistant to infection than others (such as genotypes D and J). Finally, trees farther from the source of infection were less likely to become colonized over the course of the summer (data not shown).


Proportion of trees infected (A) and proportion of leaves infected (B) in response to early season exposure to a stand of oaks infected with the powdery mildew parasite (oak stand) or no early season exposure (matrix). Proportion of trees infected (C) and proportion of leaves infected (D) in relation to tree genotype. Genotypes are labeled A – L; numbers in parentheses are sample sizes for each group.

These findings illustrate how dispersal, host genotype and the environment influence the spread of a parasite under natural conditions. The parasite exists as a metapopulation – a group of local populations inhabiting networks of somewhat discrete habitat patches. Some populations go extinct while others successfully colonize each year, depending on distance from a source, tree genotype and environment. Ekholm and his colleagues encourage researchers to use similar experimental approaches in other host-parasite systems to evaluate how general these findings are, and to explore how multiple factors interact to shape the dynamics of disease transmission.
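The metapopulation idea can be illustrated with the classic Levins model, in which the fraction of occupied patches is set by a balance between colonization and extinction. This is a textbook sketch with invented rates, not the mildew’s fitted dynamics:

```python
def levins_step(p, colonization=0.4, extinction=0.2, dt=0.1):
    """One Euler step of the Levins metapopulation model: p is the
    fraction of trees occupied by the parasite. Colonization requires
    both occupied sources (p) and empty hosts (1 - p); extinction
    removes a constant fraction of occupied patches. The rate values
    are illustrative assumptions."""
    return p + dt * (colonization * p * (1 - p) - extinction * p)

# Iterate from a small initial occupancy; the system approaches the
# equilibrium p* = 1 - extinction/colonization = 0.5, with local
# extinctions and colonizations balancing each other every year.
p = 0.05
for _ in range(2000):
    p = levins_step(p)
print(round(p, 3))  # → 0.5
```

In this framework, lowering connectivity (a smaller effective colonization rate) pushes equilibrium occupancy down, consistent with the lower infection rates the researchers observed in poorly connected trees.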

note: the paper that describes this research is from the journal Ecology. The reference is Ekholm, Adam; Roslin, Tomas; Pulkkinen, Pertti and Tack, Ayco J. M. (2017). Dispersal, host genotype and environment shape the spatial dynamics of a parasite in the wild. Ecology. doi:10.1002/ecy.1949. The paper should come out in print very soon; meanwhile, you can also visit Dr. Tack’s website. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2017 by the Ecological Society of America. All rights reserved.