Kelp consumption curtailed by señorita

Miranda Haggerty was diving through a kelp forest when she noticed that many kelp bore a large number of tiny limpets housed in small scars that they (or a fellow limpet) had excavated on the kelp’s surface. This got her thinking about how these scars might affect the kelp and, equally important, whether there were any limpet predators that might lend the kelp a hand (or a mouth) by removing limpets.


A limpet grazes on a kelp frond. Credit: Jerry Kirkhart

Feather boa kelp (Egregia menziesii) is a foundation species within the subtidal marine system off the California coast, providing food and habitat for many species that live on or among its fronds. The tiny seaweed limpet, Lottia insessa, specializes on feather boa kelp, grazing on its fronds and living within the scars. Many invertebrates and fish live within the kelp forest, but the most abundant fish is the señorita, Oxyjulis californica. Haggerty wondered whether the señorita might benefit the kelp (directly) by removing limpets, or (indirectly) by scaring limpets away – what ecologists call a trait-mediated indirect interaction.


The señorita – a fearsome predator of limpets.  Credit: Miranda Haggerty

The first order of business was to determine whether the limpets were actually harming the kelp.  Haggerty and her colleagues approached this in two ways.  First they chose 94 kelp plants from kelp forests off the California coast.  From each individual they chose one grazed and one ungrazed frond (each 3 m long). Grazed fronds averaged 5-10 scars and at least 2 limpets per meter of length.  Every three weeks they visited their kelp to score for broken fronds. In 29 of the 30 pairs for which the breakage sequence could be determined, the grazed frond broke before the ungrazed frond (in the remaining pairs the entire plant was missing, or both fronds had broken and the researchers could not tell which broke first).


Photo of feather boa kelp showing grazing scars, including one housing a limpet (left).  Diagram of feather boa kelp showing multiple fronds (right).

But the researchers were concerned that limpets might choose to graze on weaker fronds, so that breakage was caused not by grazing scars but by limpet choice.  To address this concern, Haggerty and her colleagues chose 43 ungrazed kelp plants, placed three limpets on one frond, and chose a second, equal-sized frond as an unmanipulated control. Once again, they visited their plants every three weeks, and discovered that grazed fronds broke first in all 20 pairs for which the sequence of frond breakage could be determined.  Clearly, limpet grazing is bad news for feather boa kelp.

How does the señorita fit into this system? The researchers designed a laboratory experiment to address this question.  They used 10 large tanks (1700 L) and set up five experimental treatments to compare the direct effects of predation and the indirect effects of predator presence on limpet grazing, and ultimately on kelp survival. To isolate the direct effects of predation from the indirect effects of predator cues on limpets, Haggerty and her colleagues placed four kelp fronds into fish exclosure cages housed within the large tanks, and placed three limpets onto some of these fronds.  To mimic actual predation (the CE treatment in the table below), they removed limpets by hand at a constant rate typical of señorita predation. For the NCE treatment (testing indirect effects of predator presence), they introduced señorita into the large tank so that the limpets experienced predator cues but were not eaten. The different treatments are summarized in the table below. These experiments ran for one week, and each treatment was replicated 10 times.

Each day the researchers monitored the number of limpets and grazing scars.  After one week, Haggerty and her colleagues counted the number of grazing scars, and measured the breaking strength of each frond by clamping the frond’s end to a table and pulling on the opposite end with a spring scale until it broke. They then recorded the amount of force needed to break the frond.


Clamped kelp frond whose breaking strength has been tested.  Notice that the frond broke at a grazing scar (right). Credit: Miranda Haggerty.

Not surprisingly, the predator control (PC) kelp (limpets present without señorita) had the most scars and lost the greatest amount of tissue.  Kelp receiving the consumptive predator effect treatment (CE) had fewer scars and lost less tissue than PC.  But interestingly, kelp receiving NCE and TPE treatments had significantly fewer scars than the CE kelp, and were statistically indistinguishable from each other.  Thus, in the laboratory, the presence of señorita cues (NCE treatment) was more important than actual predation (CE treatment) in reducing kelp scarring and tissue consumption (top and middle graph below).  As a result, the NCE treated kelp were stronger (had greater breaking strength) than were the CE treated kelp (bottom graph below).


Mean (+ standard error) number of grazing scars (top), mass of tissue consumed (middle) and breaking strength (bottom) of kelp in response to five experimental treatments. CE = consumptive effect, NCE = non-consumptive effect, TPE = total predator effect, PC = predator control, LC = limpet control. Different letters above bars indicate significant differences between the means when comparing treatments.

Haggerty and her colleagues replicated this experiment, with a few modifications to the experimental design, in a field setting.  As with the laboratory experiment we’ve just discussed, the researchers found a very strong non-consumptive effect. They suspect that the limpets respond to chemical cues emitted by their señorita predators. The limpets could not have responded to several other types of sensory cues: they lack auditory organs, and the experimental design prevented the fish from casting shadows (visual cues) or transmitting vibrational cues. In addition, previous studies have shown that some limpet species use chemoreception for predator avoidance, foraging and homing. However, the nature of the chemical cue in this predator-prey system is yet to be discovered.


Schooling señorita. Credit: Miranda Haggerty

Trophic cascades occur when predators suppress herbivores, and the effects cascade down through the ecosystem to benefit the organisms those herbivores eat. In this case, señorita directly and indirectly reduce limpet grazing, which increases the survival of kelp – a foundation species for this ecosystem. The researchers point out that this trophic cascade occurs only in the southern part of the feather boa kelp’s range, because señorita are absent further north.  We don’t know whether limpets have other predators in the northern range, but we do know that the kelp are structurally more robust further north, so they (and the ecosystem) may be relatively immune to limpet-induced destruction.

note: the paper that describes this research is from the journal Ecology. The reference is Haggerty, M. B., Anderson, T. W. and Long, J. D. (2018), Fish predators reduce kelp frond loss via a trait‐mediated trophic cascade. Ecology, 99: 1574-1583. doi:10.1002/ecy.2380. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Seaweed defense – location, location, location.

If you’re ever feeling sorry for yourself, you should know that things could have been much worse; you could have been the brown seaweed, Silvetia compressa. So many problems!  Ocean waves come crashing over you, threatening to pull you off your life-sustaining substrate.  Ocean tides recede, exposing you to harsh sun and dangerously dry conditions. Perhaps worst of all, the fearsome predator Tegula funebralis eats away at your body, and you are powerless to defend yourself from its savage ravages.


Tegula snails chomp away on Silvetia seaweed in northern California. Credit: Emily Jones.

As it turns out, Silvetia is not so powerless after all.  After being partially grazed by Tegula, the seaweed can induce defenses that reduce its palatability.  From prior work, Emily Jones noticed that seaweed from northern California shorelines was much more sensitive to grazing than seaweed from southern California shorelines: it took fewer grazing snails to elicit a palatability reduction in northern Silvetia than in southern Silvetia. She decided to focus her PhD work with Jeremy Long on documenting these geographic differences and figuring out why they exist.


Emily Matthews (near) and Grace Ha (far) survey snails and seaweed in a northern California site. Credit: Emily Jones.

Environmental conditions vary along the California coast.  Northern seaweed populations experience cooler temperatures (air ~5-20 °C; water ~10-12 °C) and more nutrients (nitrate levels up to 40 µmol/L) than do southern populations (air ~5-37 °C; water ~14-20 °C; nitrate levels < 2 µmol/L). In addition, Jones and Long surveyed Tegula abundance at three northern California and three southern California sites, counting every snail in 20 quadrats placed in the low, mid and high intertidal zones at each of the six sites (360 quadrats of 0.25 × 0.25 m in total).  They discovered that seaweed was much more likely to encounter Tegula along northern coastlines.


Percent of plots with Tegula snails in northern sites (Stornetta, Moat Creek and Sea Ranch – blue bars) and southern sites (Coast, Calumet and Cabrillo – orange bars). High, Mid and Low refer to location within the intertidal zone (high is closest to shore and regularly exposed at low tide).

Given these differences in snail abundance, we can now understand why Silvetia is more sensitive in its northern range to Tegula grazing.  But how strong are these differences in sensitivity? Jones and Long developed a simple paired-choice feeding preference assay to test for differences in palatability. At each location (north and south), the researchers gave test snails a choice between feeding on seaweed that had been previously grazed by either 1, 4, 7, 10 or 13 Tegula snails, or to feed on seaweed with no grazing history.  The test snails grazed for five days, and the researchers measured the amount of seaweed consumed for each group. They discovered that even a little bit of previous grazing (the 1-snail treatment) made northern test snails prefer non-grazed northern Silvetia, while only high levels of previous grazing (the 10 and 13-snail treatments) had similar effects on southern snails tested on southern Silvetia.
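A paired-choice assay like this is usually analyzed on within-replicate differences, since each test snail serves as its own control. Here is a minimal sketch of that arithmetic as a paired t-test; the consumption values below are invented for illustration, and the authors' actual statistical analysis may differ.

```python
import math
import statistics

# Hypothetical consumption data (g of seaweed eaten over 5 days) for one
# treatment level of a paired-choice assay: each replicate snail was offered
# a previously-grazed and a non-grazed piece of Silvetia. Values are
# invented for illustration, not taken from the paper.
grazed =     [0.21, 0.18, 0.25, 0.15, 0.20, 0.17, 0.22, 0.19]
non_grazed = [0.35, 0.30, 0.28, 0.33, 0.31, 0.29, 0.36, 0.27]

# Paired t-test: work with within-replicate differences so that variation
# among individual snails cancels out.
diffs = [ng - g for g, ng in zip(grazed, non_grazed)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)           # sample standard deviation
t_stat = mean_d / (sd_d / math.sqrt(n))  # compare to t-distribution, df = n-1

print(f"mean difference = {mean_d:.3f} g, t = {t_stat:.2f}, df = {n - 1}")
```

With 7 degrees of freedom, a t statistic above the two-sided critical value of about 2.36 indicates a significant preference for non-grazed seaweed at the 0.05 level.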


Amount of previously-grazed and non-grazed Silvetia eaten by Tegula in paired choice tests. (Top) Northern Silvetia, (Bottom) Southern Silvetia. Error bars are 1 SE. * indicates significant differences in consumption rate.

These findings raised the question of whether the cooler, more nutrient-rich conditions at the northern sites were somehow causing this difference in consumption of previously-grazed seaweed.  The researchers designed a series of common garden experiments at the Bodega Marine Laboratory, in which seaweed from both locations was tested in the same environment.  Silvetia was exposed to grazing by two snails, or by no snails, for 14 days. When test snails were given the choice of non-grazed or previously-grazed northern Silvetia, they much preferred eating non-grazed Silvetia. In contrast, they showed no preference when given a similar choice between non-grazed and previously-grazed southern Silvetia. This indicates that northern seaweed responds to grazing by reducing its palatability more strongly than southern seaweed does.


Amount of previously-grazed and non-grazed northern and southern Silvetia eaten by Tegula in paired choice tests.

In theory, there could be a tradeoff between induced defenses, such as a reduction in palatability in response to grazing, and constitutive defenses, which an organism expresses all of the time.  Examples of constitutive defenses are thorns or spines in plants, and cryptic coloration or body shape in many insects.  Jones and Long found no evidence for such a tradeoff; on the contrary, southern Silvetia actually had lower levels of constitutive defenses, as both northern and southern Tegula strongly preferred eating southern Silvetia in paired choice tests.


Amount of northern and southern Silvetia eaten by northern and southern Tegula in paired choice tests.

These geographic differences in seaweed sensitivity to grazing are probably due to long-term differences in environmental history.  Southern Silvetia live in stressful conditions (high temperatures and low nutrients), and the physiological cost of mounting an induced defense against low and moderate levels of grazing may be too high to be worthwhile. We also don’t know what the overall grazing rates are in the north versus the south, and importantly, how variable grazing rates are in each location.  Highly variable grazing rates would select for a strong set of induced responses that could be turned on and off as needed, allowing seaweed (or any plant) to defend itself against new or hungrier herbivores moving into its environment.

note: the paper that describes this research is from the journal Ecology. The reference is Jones, Emily and Long, Jeremy D. 2018. Geographic variation in the sensitivity of an herbivore-induced seaweed defense. Ecology. doi: 10.1002/ecy.2407. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Indirect effects of the lionfish invasion

I’m old enough to remember when ecological studies of invasive species were uncommon.  Early on, there was a debate within the ecological community over whether such species should be called “invasive” (which conveyed to some people an aggressive image akin to a military invasion) or, more dispassionately, “exotic” or “introduced.” Lionfish (Pterois volitans), however, fit the more aggressive moniker. Native to the South Pacific and Indian Oceans, lionfish were first sighted in south Florida in 1985, and became established along the Atlantic coast of the southeastern United States and in the Caribbean Islands by the early 2000s. They are active and voracious predators, consuming over 50 different species of prey in their newly-adopted habitat. Many population ecologists study the direct consumptive effects of invasive species such as lionfish.  In some cases they find that an invasive species can deplete its prey population to very low levels, and even drive it to extinction.


A lionfish swims in a reef. Credit: Tye Kindinger

But things are not always that simple. Tye Kindinger realized that lionfish (or any predator that feeds on more than one species) could influence prey populations in several different ways.  For the present study, Kindinger considered two different prey species – the fairy basslet (Gramma loreto) and the blackcap basslet (Gramma melacara). Both species feed primarily on zooplankton, with larger individuals monopolizing prime feeding locations at the front of reef ledges, while smaller individuals are forced to feed at the back of ledges where plankton are less abundant, and predators are more common.  Thus there is intense competition both within and between these two species for food and habitat. Kindinger reasoned that if lionfish depleted one of these competing species more than the other, they could be indirectly benefiting the second species by releasing it from competition.


Fairy basslet (top) and blackcap basslet (bottom). Credit Tye Kindinger.

For her PhD research, Kindinger set up an experiment in which she manipulated both lionfish abundance and the abundance of each basslet species.  She created high density and low density lionfish reefs by capturing most of the lionfish from one reef and transferring them to another (a total of three reefs of each density).  She manipulated basslet density on each reef by removing either fairy or blackcap basslets from an isolated reef ledge within a particular reef.  This experimental design allowed her to separate out the effects of predation by lionfish from the effects of competition between the two basslet species.  Most of her results pertained to juveniles, which were about 2 cm long and favored by the lionfish.



Alex Davis captures and removes basslets beneath a ledge. Credit Tye Kindinger.

Kindinger measured basslet abundance in grams of basslet biomass per m2 of ledge area.  When lionfish were abundant, juvenile fairy basslet abundance decreased over the eight weeks of the experiment, but it did not change when lionfish were rare.  In contrast, juvenile blackcap basslet populations remained steady over the course of the study, whether lionfish were abundant or rare. Kindinger concluded that lionfish were eating more fairy basslets.


Abundance of juvenile fairy basslets (left) and blackcap basslets (right) as measured as change in overall biomass. Triangles represent high lionfish reefs and circles are low lionfish reefs.

Competition is intense between the two basslet species, and can affect feeding position and growth rate.  Kindinger’s manipulations of lionfish density and basslet density demonstrate that fairy basslet foraging and growth depend primarily on the abundance of their blackcap competitors. When competitor blackcap basslets are common (approach a biomass value of 1.0 on the x-axis on the two graphs below), fairy basslets tend to move towards the back of the ledge, and grow more slowly.  This occurs at both high and low lionfish densities.


Change in feeding position (top) and growth rate (bottom) of fairy basslets in relation to competitor (blackcap basslet) abundance (x-axis) and lionfish abundance (triangles = high, circles = low)

In contrast, blackcap basslets had an interactive response to fairy basslet and lionfish abundance. Let’s look first at low lionfish densities (circles in the graphs below).  You can see that blackcap basslets tend to move towards the back of the ledge (poor feeding position) at high competitor (fairy basslet) biomass, and also grow very slowly.  But when lionfish are common (triangles in the graphs below), blackcap basslets retain a favorable feeding position and grow quickly, even at high fairy basslet abundance.


Change in feeding position (top) and growth rate (bottom) of blackcap basslets in relation to competitor (fairy basslet) abundance (x-axis) and lionfish abundance (triangles = high, circles = low)

By preying primarily on fairy basslets, lionfish are changing the dynamics of competition between the two species. The diagram below nicely summarizes the process.  Larger fish of both species forage near the front of the ledge, while smaller fish forage further back, and the two species are evenly interspersed.  Juveniles of both species are relatively evenly distributed in the rear portion of the ledge (Figure B).  When fairy basslets are removed experimentally, the juvenile blackcap basslets move to the front of the rear portion of the ledge, as they are released from competition with fairy basslets (Figure D).  Finally, when lionfish are abundant, fairy basslets are eaten more frequently, and juvenile blackcaps benefit from the reduced competition (Figure F).


Kindinger was very surprised by the results of this study: she knew that lionfish are generalist predators that eat both basslet species, so she expected them to have similar effects on both prey species.  But they didn’t, and she does not know why.  Do lionfish prefer fairy basslets because they are more conspicuous or more active, or are blackcap basslets better at escaping lionfish predators? Whatever the mechanism, this study highlights that the indirect effects of predation by invasive species can influence prey populations in unexpected ways.

note: the paper that describes this research is from the journal Ecology. The reference is Kindinger, T. L. (2018). Invasive predator tips the balance of symmetrical competition between native coral‐reef fishes. Ecology, 99(4), 792-800. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Mangroves partner with rats in China

Many of us have seen firsthand the havoc that invasive plants can wreak on ecosystems.  We are accustomed to thinking of native plants as unable to defend themselves, much like a skinny little kid surrounded by a group of playground bullies. ‘Not so fast’ says Yihui Zhang.  As it turns out, many native plants can defend themselves against invasions, and they do so with the help of unlikely allies.

In southern China, mangrove marshes are being invaded by the salt marsh cordgrass, Spartina alterniflora, which is native to the eastern USA coastline. Cordgrass seeds can float into light gaps among the mangroves, and then germinate and choke out mangrove seedlings.  However, intact mangrove forests can resist cordgrass invasion.  Zhang and his colleagues wanted to know how they resist.


Cordgrass (pale green) meets mangrove (bright green) as viewed from space. Credit: Yihui Zhang.

Cordgrass was introduced into China in 1979 to reduce coastal erosion.  It proved up to the task, quickly transforming mudflats into dense cordgrass stands and choking out much of the native plant community.  Dense mangrove forests grow near river channels that enter the ocean, and are considerably taller than their cordgrass competitors.  The last player in this interaction is a native rat, Rattus losea, which often nests in mangrove canopies above the high tide level. At the research site (Yunxiao), many rat nests were built on mangroves, using cordgrass leaves and stems as the building material.


Rat nest constructed from cordgrass shoots rests upon a mangrove tree.  Credit Yihui Zhang.

Zhang and his colleagues suspected that cordgrass invasion into the mangrove forest was prevented by both competition from mangroves and herbivory by rats on cordgrass.


Baby rats in their nest. Credit Yihui Zhang.


To test this hypothesis, they built cages to exclude rats from three different habitats: open mudflats (primarily pure stands of cordgrass), the forest edge, and the mangrove forest understory (with almost no cordgrass). They set up control plots that also had cages, but that still allowed rats to enter.


Arrow points to resprouting cordgrass. Credit Yihui Zhang.

The researchers planted 6 cordgrass ramets (genetically identical pieces of live plant) in each plot and then monitored rodent grazing, resprouting of original shoots following grazing, and shoot survival over the next 70 days.

They discovered that the cages worked; no rats grazed inside the cages.  But in the control plots, grazing was highest in the forest understory and lowest in the mudflats (Top figure below).  Most important, both habitat type and exposure to grazing influenced cordgrass survival.  In the understory, rodent grazing was very important; only one ramet survived in the control plots, while 46.7% of ramets survived if rats were excluded.  In the other two habitats, grazing did not affect ramet survival, which was very high with or without grazing (Middle figure). Rodent grazing effectively eliminated resprouting of ramets in the understory, but not in the other two habitats (Bottom figure).
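The understory survival contrast (one surviving ramet in control plots versus 46.7% survival with rats excluded) can be checked with a Fisher exact test, sketched below. The group sizes are an assumption for illustration (15 ramets per treatment, so 46.7% corresponds to 7 survivors); the paper's actual replication may differ.

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    probability of observing a or fewer successes in row 1, given the
    row and column totals (hypergeometric distribution)."""
    n1 = a + b              # row 1 total
    k = a + c               # column 1 (success) total
    N = a + b + c + d       # grand total
    denom = comb(N, n1)
    return sum(comb(k, x) * comb(N - k, n1 - x) for x in range(a + 1)) / denom

# Understory habitat; counts assumed for illustration (15 ramets per group,
# so the paper's 46.7% survival with rats excluded equals 7 survivors):
#                  survived  died
# rats present         1      14
# rats excluded        7       8
p = fisher_one_sided(1, 14, 7, 8)
print(f"one-sided p = {p:.4f}")
```

Under these assumed counts the one-sided p is about 0.018, below 0.05, consistent with rat grazing driving understory cordgrass mortality.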


Impact of rat grazing on cordgrass in the field study in three different habitats.  Top figure is % of stems grazed, middle figure is transplant survival, and bottom figure is resprouting after grazing (there was no grazing in the rodent exclusion plots). Error bars are 1 standard error. Different letters above bars indicate significant differences between treatments.

The researchers suspected that low light levels in the understory were preventing cordgrass from resprouting after rat grazing. This was most easily tested in the greenhouse, where light conditions could be effectively controlled.  High light was 80% of the intensity of outdoor sunlight, medium light was 33% (about what strikes the forest edge), and low light was 10% (similar to mangrove understory light). Rat grazing was simulated by cutting semi-circles into the stem base, peeling back the leaf sheath, and digging out the leaf tissue. Cordgrass ramets were planted in large pots, exposed to the different light and grazing treatments, and monitored for survival, growth and resprouting following grazing.


Cordgrass growing in greenhouse under different light treatments. Credit: Yihui Zhang.

Zhang and his colleagues found that simulated grazing sharply reduced cordgrass survival from 85% to 7% at low light intensity, but had no impact on survival at medium or high light intensities.  Cordgrass did not resprout after simulated grazing at low light intensity, in contrast to approximately 50% resprouting at medium and high light intensity.


Survival (top) and resprouting (bottom) of cordgrass following simulated grazing in the greenhouse experiment.

The researchers conclude that grazing by rats and shading by mangroves are two critical factors that make mangroves resistant to cordgrass invasion. Rats tend to build their nests near the mangrove forest edge, so it is not clear how far into the forest the rat effect extends. Rats do prefer to forage in the understory (rather than right along the edge), presumably because the understory helps to protect them from predators.  In essence, mangroves compete directly with cordgrass by shading them out, and also indirectly by attracting cordgrass-eating rats. Conservation biologists need to be aware of both direct and indirect effects when designing management programs for protecting endangered ecosystems such as mangrove forests.

note: the paper that describes this research is from the journal Ecology. The reference is Zhang, Y. , Meng, H. , Wang, Y. and He, Q. (2018), Herbivory enhances the resistance of mangrove forest to cordgrass invasion. Ecology. Accepted Author Manuscript. doi:10.1002/ecy.2233. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Fungi attack plants – insects respond!

As she was preparing to do her dissertation research on the interactions between the Asian chestnut gall wasp, the chestnut blight disease and the European chestnut, Pilar Fernandez-Conradi read a lot of papers about fungal-insect-plant interactions.  She was impressed by the diversity of outcomes that resulted when plants were attacked by both insects and fungi, and wondered whether there were any generalities to glean from these research findings. She asked two basic questions. First, if a plant is infected by a fungus, is it more or less likely to be attacked by insects than is an uninfected plant?  Second, does an insect that attacks a fungal-infected plant perform better or worse than it would have on an uninfected plant?


Three-way interaction between the chestnut tree, the chestnut gall wasp, and the fungus Gnomoniopsis castanea. Female wasps induce the plant to create galls, which house developing larvae. Green globular galls (with a hint of rose-color) have not been infected by the fungus, while the very dark tissue is the remains of a gall that was attacked by the fungus. Credit: Pilar Fernandez-Conradi.

Fernandez-Conradi and her colleagues thought they were more likely to discover a negative effect of fungal infection on the preference and performance of herbivorous insects.  Several studies had shown that nutrient quantity and quality of host plants is reduced by fungal infection, so it makes sense that insects would avoid infected plants.  But the researchers also knew that fungal infection can, in some cases, actually increase the sugar concentration of some plants, so insects might prefer those plants and also develop more rapidly on them. In addition, fungal infection can induce chemical defenses in plants that might make them less palatable to insects, or alternatively, fungal infection could weaken plant defenses making them more palatable to attacking insects.

To resolve this conundrum, Fernandez-Conradi and her colleagues did a meta-analysis of the existing literature, identifying 1113 case studies based on 101 papers.  To be considered in the meta-analysis, a study had to meet the following criteria: (1) report insect preference or performance on fungal-infected vs. uninfected plants, (2) report the genus or species of the plant, fungus and insect, and (3) report the mean response and a measure of variation (standard error, standard deviation or variance). The measure of variation allows researchers to calculate an effect size, which quantifies the strength of the relationship being explored. The researchers found that, in general, insects avoid, and perform worse on, infected plants than uninfected plants.
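As a sketch of that effect-size arithmetic: one common metric in ecological meta-analyses is the log response ratio, computed for each case study from the two group means, their standard deviations, and sample sizes. The numbers below are invented for illustration, and the paper's actual effect-size metric (e.g. Hedges' d) may differ.

```python
import math

def log_response_ratio(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Log response ratio effect size with a large-sample 95% CI.
    Negative values mean the insect does worse (or feeds less) on
    fungal-infected plants than on uninfected controls."""
    lrr = math.log(mean_t / mean_c)
    # Approximate sampling variance of the log response ratio
    var = sd_t**2 / (n_t * mean_t**2) + sd_c**2 / (n_c * mean_c**2)
    half = 1.96 * math.sqrt(var)
    return lrr, (lrr - half, lrr + half)

# Hypothetical case study: mean insect growth on infected vs. uninfected
# plants (mean, SD, n for each group). Values are invented.
lrr, (lo, hi) = log_response_ratio(4.0, 1.0, 10, 5.0, 1.0, 10)
print(f"LRR = {lrr:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

A case study contributes a significant negative effect when its confidence interval lies entirely below zero.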


Mean effect size of insect preference and performance (combined) in response to fungal infection.  Error bars are 95% confidence intervals (CIs).  In this graph, and in the next two graphs as well, a solid data point indicates a statistically significant effect.  You can also test for statistical significance visually by noting that the error bar does not cross the dashed vertical line representing no effect (at the 0.0 value). The negative value indicates that insects respond negatively to fungal infection.

Fernandez-Conradi and her colleagues then broke down the data to explore several questions in more detail. For example, they wondered if the type of fungus mattered.  For their meta-analysis, they considered three types of fungi with different lifestyles: (1) biotrophic pathogens that develop on and extract nutrients from living plant tissues, (2) necrotrophic pathogens that secrete enzymes that kill plant cells, so they can develop and feed on the dead tissue, and (3) endophytes that live inside living plant tissue without causing visible disease symptoms.


Effect of fungus lifestyle on insect performance.  k = the number of studies.  Different letters to the right of CIs indicate significant differences among the variables (lifestyles).

The meta-analysis showed an important fungus-lifestyle effect (see the graph above).  Insect performance was strongly reduced on plants infected by biotrophic pathogens or endophytes, but not on plants infected by necrotrophic pathogens, where insect performance actually improved slightly (though not significantly). The researchers point out that biotrophic pathogens and endophytes both develop in living plant tissues, while necrotrophic pathogens release cell-wall-degrading enzymes that can cause the plant to release sugars and other nutrients.  These nutrients obviously benefit the fungus, but can also benefit insects that feed on the plants.

To further explore this lifestyle effect, Fernandez-Conradi and her colleagues broke down insect response into performance and preference, focusing on chewing insects, for which there were the most data. Insects showed lower performance on, and reduced preference (i.e. increased avoidance) for, plants infected with biotrophic pathogens. They performed equally poorly on endophyte-infected plants, but did not avoid them (see graph below). This was surprising, since you would expect natural selection to favor insects that choose the best plants to feed on. The problem may be that endophytic infection is essentially symptomless, so in many cases insects cannot tell that a plant is infected, and thus likely to be less nutritionally rewarding.


Effects of fungal infection on preference and performance of chewing insects.  k = the number of studies.  Different letters to the right of CIs indicate significant differences among the variables. Variables that share one letter have similar effect sizes.

Many ecological studies deal with two interacting species: a predator and its prey, or a parasite and its host. Fernandez-Conradi and her colleagues remind us that though two-species interactions are much easier to study, many important real-world interactions involve three or more species. Their meta-analysis highlights that plant infection by pathogenic and endophytic fungi reduces the performance and preference of insects that feed on these plants. But fungus lifestyle plays an important role, and may have different effects on performance and preference. Their meta-analysis also suggests related avenues for research. For example, how are plant-fungus-insect interactions modified by other species, such as viruses, bacteria and parasitoids (animals that live on or inside an insect, feeding on its tissues)? Or, what are the underlying molecular (hormonal) mechanisms that determine the response of the plant to fungal infection, and to insect attack? Finally, how does time influence both plant and insect response? Does a recent fungal infection have a different effect on insect performance and preference than a chronic one? There are very few data on these (and other) questions, but they are more likely to be pursued now that some basic relationships have been uncovered.

note: the paper that describes this research is from the journal Ecology. The reference is Fernandez‐Conradi, P., Jactel, H., Robin, C., Tack, A.J. and Castagneyrol, B., 2018. Fungi reduce preference and performance of insect herbivores on challenged plants. Ecology, 99(2), pp.300-311. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Field gentian – when it’s good to be eaten

We tend to think of plants as victims – after all, any interested herbivore can simply walk, fly or crawl over to its favorite plant and begin munching. But not so fast! In reality, plants have a variety of ways to make life difficult for potential herbivores. Plants can escape herbivores by growing in places that are not easily accessible (such as in cracks, or high enough to be out of a herbivore’s reach), or by growing at a time of year when herbivores are away from the plant’s habitat. Plants also use mechanical defenses, such as thorns, or a diverse array of chemical defenses to thwart overzealous herbivores. A third approach – tolerance – can take many forms. For example, following attack by a herbivore, some plants can increase photosynthetic rates or reduce the time until seed production. Tommy Lennartsson and his colleagues were interested in a particular form of tolerance that ecologists call overcompensation, in which damaged plants produce more seeds than undamaged plants.


Herbivores in action. Notice the difference in vegetation height inside and outside the pasture. Credit: Tommy Lennartsson.

Overcompensation is an evolutionary puzzle, because undisturbed plants produce fewer offspring than partially eaten plants. That outcome seems to fly in the face of the scientific principle that natural selection favors individuals with traits that promote reproductive success. Lennartsson and his colleagues investigated this evolutionary puzzle by comparing two subspecies of the herbaceous field gentian Gentianella campestris. The first subspecies, Gentianella campestris campestris (which we’ll just call campestris), has relatively unbranched shoot architecture when intact, growing to about 20 cm tall, but produces multiple fruiting branches when the dominant apical meristem is eaten. The second subspecies, Gentianella campestris islandica (which we’ll call islandica), is much shorter (about 5-10 cm tall), and always has a multi-branched architecture.


Two subspecies of field gentian – campestris (left) and islandica (right).

Environmental conditions and soils can vary dramatically, even on a small spatial scale. The field site was a gently sloped grassland in Sweden with coarser, drier soil on the ridge, and finer, wetter and richer soil in the valley. This created a productivity gradient, with taller vegetation in the valley. The average height of the vegetation was 15 cm in the high-productivity valley, 10 cm on the medium-productivity slope and 5 cm on the low-productivity ridge.

The researchers used this natural variation to set up an experiment that would allow them to explore hypotheses about why an undisturbed campestris is less successful than one that is partially eaten. One hypothesis (the overcompensation hypothesis) is that campestris restrains branching to conserve resources, so that when it is grazed it has plenty of resources in reserve for regrowth and for the production of prolific branches, flowers and seeds. Limited branching and limited seed production in ungrazed campestris are simply a cost of tolerance, while overcompensation after damage maximizes reproductive success. A second hypothesis (the competition hypothesis) is that restrained branching allows the plant to grow tall, so it can compete better in ungrazed pastures than the much shorter islandica can. These two hypotheses are not mutually exclusive.

To test these two hypotheses, the researchers set up 2 × 2 m experimental plots in the valley (18 plots), on the slope (12 plots) and on the ridge (6 plots). They planted 2000 seeds of each subspecies per plot, which ultimately yielded about 20 plants of each subspecies per plot. Of course there were many other neighboring plant species in these plots. In the high-productivity (valley) plots, the neighboring plants were clipped to a height of 12 cm in six plots, to 8 cm in six plots and to 4 cm in six plots. In the medium-productivity plots (where vegetation naturally grew only to 10 cm), the researchers cut neighboring plants to 8 cm in six plots and to 4 cm in six plots. Finally, in the low-productivity plots, the researchers cut neighboring plants to 4 cm in all six plots. In mid-July, half of the gentian plants in each plot were clipped to the same height as the surrounding vegetation, while the remainder were left unclipped.


Experimental plots from the valley (left), slope (middle) and ridge (right).  Black squares represent plots where neighboring plants were clipped to 12 cm, grey squares to 8 cm, and clear squares to 4 cm. Squares with slashes through them (left)  represent plots that were used for a different purpose.

The beauty of this experimental design is that, by counting seeds, the researchers could assess the reproductive success of both subspecies under conditions of high competition (when surrounded by tall neighbors) and low competition (when surrounded by shorter neighbors). At the same time, clipping the two subspecies allowed the researchers to simulate grazing in these different competitive environments. Lennartsson and his colleagues found that unclipped islandica did better than unclipped campestris when surrounded by short or medium-height neighbors, but that islandica success plummeted when the neighbors were very tall (see the left graph below). Campestris reproductive success also dropped when surrounded by tall competitors, but not as much as islandica's did, so that campestris produced twice as many seeds as islandica in the high-competition environment (also the left graph).

When plants were clipped to simulate grazing, campestris outperformed islandica in all three competitive environments. Campestris actually produced more seeds when it was clipped than when it was not clipped in the low and medium competition environments. Thus campestris overcompensated for grazing under conditions of low and moderate competition (see the right graph below).


Mean (+ standard error) seed production for unclipped (left graph) and clipped (right graph) field gentian subspecies in relation to surrounding vegetation height.  Sample sizes are in bars.

The researchers collected data on growth rates, development, survival probabilities and reproductive success for both subspecies under clipped and unclipped conditions at different levels of competition. They then used these data to model population growth in relation to the percentage of plants grazed (damage risk) at different levels of productivity. In these graphs, a stochastic growth rate of 1.0 (on the y-axis) indicates that the population is stable; values above 1.0 indicate an increasing population, and values below 1.0 a declining one.
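The idea of a stochastic growth rate can be illustrated with a toy simulation: each year the population is grazed with probability equal to the damage risk, and the long-run growth rate is the geometric mean of the yearly growth factors. This is only a sketch with made-up growth values, not the authors' model:

```python
import math
import random

def stochastic_growth_rate(lam_grazed, lam_ungrazed, damage_risk,
                           years=10000, seed=42):
    """Geometric-mean growth rate when grazing occurs with probability
    `damage_risk` each year. Growth factors are illustrative only."""
    rng = random.Random(seed)
    log_sum = 0.0
    for _ in range(years):
        lam = lam_grazed if rng.random() < damage_risk else lam_ungrazed
        log_sum += math.log(lam)
    return math.exp(log_sum / years)

# Hypothetical overcompensator: does better in years when it is grazed
print(stochastic_growth_rate(1.2, 0.9, 0.8))  # high damage risk: grows
print(stochastic_growth_rate(1.2, 0.9, 0.1))  # low damage risk: declines
```

The crossing of the 1.0 line as damage risk changes is exactly the pattern the subspecies comparisons in the graphs are built around.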


Population growth rate of both subspecies in relation to damage risk at different levels of productivity.  These models predict that the population will increase at growth rates above the dotted line (growth rate = 1.0) and decline below the dotted line.

This model shows that in high productivity environments, campestris always does better than islandica (top graph). However, the model predicts that islandica will decline at any damage level (note in the top graph that all islandica damage values yield a growth rate below 1.0), while campestris will also decline except for very high damage risks. In medium and low productivity populations (middle and bottom graphs), islandica does better than campestris when damage risk is low, but the reverse is true at high damage risk.

So how do these results relate to the two hypotheses for why an undisturbed campestris is less successful than one that is partially eaten? Campestris overcompensated for damage by producing more seeds and having positive population growth under most levels of productivity. In contrast, islandica undercompensated when damaged, but produced more seeds than campestris when ungrazed, except in the high-productivity environment. These differences in response support the hypothesis that restrained branching is favored by natural selection in environments where damage from grazing is common (the overcompensation hypothesis). But the superior performance of campestris in productive ungrazed environments supports the competition hypothesis.

Can we generalize these findings to other plants? Lennartsson and his colleagues point out that many short-lived grassland plants can’t grow tall enough to be effective competitors for light. These plants are thus restricted to environments where the surrounding plants are not very tall. Two factors commonly create conditions where there are short neighboring plants: grazing and unproductive (low nutrient) soils. When grazing is widespread, tolerance mechanisms such as overcompensation are favored by natural selection. When soils are unproductive, unrestrained branching is favored. Therefore, Gentianella campestris provides us with a natural experiment for testing hypotheses about how natural selection acts on plants to promote their reproductive success in a variable environment.

note: the paper that describes this research is from the journal Ecology. The reference is Lennartsson, T., Ramula, S. and Tuomi, J. (2018), Growing competitive or tolerant? Significance of apical dominance in the overcompensating herb Gentianella campestris. Ecology, 99: 259–269. doi:10.1002/ecy.2101. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.


Prey populations: the only thing to fear is fear itself

In reference to the Great Depression, Franklin Delano Roosevelt famously declared in his 1933 inaugural speech that “the only thing we have to fear is fear itself.” Roosevelt was no biologist, but his words could equally apply to a different type of depression – the decline of animal populations that can be caused by fear.


Roosevelt’s inauguration in 1933. Credit: Architect of the Capitol.

Ecologists have long known that predators can depress prey populations by killing substantial numbers of their prey. But only in the past two decades or so have they realized that predators can, simply by their presence, cause prey populations to decline. There are many ways this can happen, but, in general, a predation threat sensed by a prey organism can interfere with its feeding behavior, causing it to grow more slowly or to starve to death. As one example, elk populations declined after wolves were introduced to Yellowstone National Park. Many factors are associated with this decline, but one is that fear of predators causes elk to spend more time scanning and less time foraging. Elk also tend to stay away from wolf hotspots, which are often places with good elk forage.

Liana Zanette recognized that ecologists had not considered whether predator presence can cause bird or mammal parents to reduce the amount of provisioning they provide to dependent offspring, thereby reducing offspring growth and survival, and slowing down population growth. For many years, she and her colleagues have studied the Song Sparrow, Melospiza melodia, on several small Gulf Islands in British Columbia, Canada. In an early study, she showed that playbacks of predator calls reduced parental provisioning by 26%, resulting in a 40% reduction in the estimated number of nestlings that fledged (left the nest). But, as she points out, Song Sparrow parents provision their offspring for many days after fledging; she wondered whether continued perception of a predation threat during this later time period further decreased offspring survival and ultimately population growth.


The Song Sparrow, Melospiza melodia. Credit: Free Software Foundation.

Zanette’s student, Blair Dudeck, did much of the fieldwork for this study. The researchers captured nestlings six days after hatching, weighed and banded them, and fit them with tiny radio collars. They then recaptured and weighed the nestlings within a few hours of fledging (at about 12 days post-hatching) to assess nestling growth rates.


Banded sparrow nestling with radio antenna trailing from below its wing. Credit: Marek C. Allen.

Three days after the birds fledged, Dudeck radio-tracked them, and surrounded them with three speakers approximately 8 meters from where they perched. For one hour, each youngster listened to recordings of calls made by predators such as ravens or hawks, followed, after a brief rest period, by one hour of calls made by non-predators such as geese or woodpeckers (or vice-versa). During the playbacks, Dudeck observed the birds to record how often the parents visited and fed their offspring, and whether offspring behavior changed in association with predator calls. This included recording all of the offspring begging calls.


Blair Dudeck simultaneously uses a tracking device to locate Song Sparrows and a recorder mounted to his head to record their begging calls. Credit: Marek C. Allen.

Fear had a major impact on parental behavior. Parents reduced food-provisioning visits by 37% when predator calls were played, compared to when non-predator calls were played. They also fed offspring fewer times per visit, which resulted in 44% fewer meals in association with predator calls.


Mean number of parental provisioning visits (in one hour) in relation to whether predator (red) or non-predator (blue) calls were played. Error bars are 1 SE.

Hearing predator calls had no effect on offspring behavior – they continued to beg for food at a high rate, and did not attempt to hide.

Some parents were much more scared than others – in fact, some parents were not scared at all. The researchers measured parental fearfulness by subtracting the number of provisioning visits by parents during predator calls from the number of visits during non-predator calls. A more positive number indicated a more fearful parent (a negative number represents a parent who fed more in the presence of predator calls). The researchers discovered that more fearful parents tended to have offspring that were in poorer condition at day 6 and at fledging.
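The fearfulness index is simple arithmetic; a short sketch (with hypothetical visit counts) makes the sign convention concrete:

```python
def fearfulness(visits_nonpredator, visits_predator):
    """Per-parent fearfulness index: provisioning visits during
    non-predator playbacks minus visits during predator playbacks.
    Positive = fewer visits under perceived predation risk."""
    return visits_nonpredator - visits_predator

# Hypothetical parents (visits per hour of playback)
print(fearfulness(10, 4))  # 6  -> a fearful parent
print(fearfulness(6, 8))   # -2 -> fed MORE during predator calls
```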


Offspring weight on day 6 (open circles) and at fledging (solid circles) in relation to parental fearfulness.  Higher positive numbers on x-axis indicate increasingly fearful parents.

Importantly, more fearful parents tended to have offspring that died at an earlier age. Based on this finding, the researchers created a statistical model that compared survival of offspring that heard predator playbacks throughout late-development with survival of offspring that heard non-predator playbacks during the same time period. They estimated a 24% reduction in survival. Combined with their previous study on playbacks during early development, the researchers estimate that hearing predator playbacks throughout early and late development would reduce offspring survival by an amazing 53%.
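If the early-development (40%) and late-development (24%) reductions act roughly independently, combining them multiplicatively gives a figure close to the reported 53% (the paper's statistical model is more involved than this back-of-the-envelope check):

```python
# Survival through both stages is the product of stage-specific survivals,
# so the combined reduction is 1 minus that product (a simplification).
early_reduction = 0.40  # reduction in fledglings during early development
late_reduction = 0.24   # estimated survival reduction after fledging
combined = 1 - (1 - early_reduction) * (1 - late_reduction)
print(round(combined, 3))  # 0.544, close to the reported ~53%
```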

This “fear itself” phenomenon can extend to other trophic levels in a food web. For example, recent research by Zanette and a different group of researchers showed that playbacks of large-carnivore vocalizations dramatically reduced foraging by raccoons on their major prey, red rock crabs. When these carnivore playbacks were continued for a month, red rock crab populations increased sharply. This increase in crab numbers was followed by declines in the crab’s major competitor – the staghorn sculpin – and in the crab’s favorite food, a Littorina periwinkle. Thus “fear itself” can cascade through the food web, affecting multiple trophic levels in important ways that ecologists are only beginning to understand.

note: the paper that describes this research is from the journal Ecology. The reference is Dudeck, B. P., Clinchy, M., Allen, M. C. and Zanette, L. Y. (2018), Fear affects parental care, which predicts juvenile survival and exacerbates the total cost of fear on demography. Ecology, 99: 127–135. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.