Introduced quolls quell melomys

The cane toad has the distinction of being the world’s largest toad.  It was introduced to Australia in 1935 to control cane beetles that were eating sugarcane (see A toxic brew: toad vs. quoll).  Since their introduction, cane toads have expanded their range by over 2000 km from their release sites, which would be fine if all they did was eat cane beetles.  As it turns out, they are worthless at eating cane beetles, but are very good at being eaten by many predators, including the northern quoll (Dasyurus hallucatus).  This turns out poorly for the quolls, as cane toads are loaded with toxins, which quickly convert a cane-toad-fed quoll into a quoll corpse. Consequently, quoll populations are collapsing across much of Australia.

A northern quoll sports a radio collar. Credit: Chris Jolly

For his PhD work, Chris Jolly was hoping to explore whether he could use behavioral conditioning techniques to train quolls to avoid cane toads before releasing them back into the environment.  Unfortunately, while in captivity, the quolls lost their fear of predators and became easy prey for dingoes, which resulted in a failed reintroduction to Kakadu National Park. In an interesting twist, another graduate student was releasing quolls onto Indian Island for a different purpose, and Jolly decided to regroup and focus his attention on how the prey population responded to a novel predator. In 2017, 54 quolls were released on the northern part of the island, which set up a natural experiment in which quolls were present in the north, and absent in central and southern Indian Island.

Ben Phillips and John Moreen release the first batch of northern quolls on Indian Island (Kabarl), Northern Territory, Australia. Credit: Chris Jolly

Working with several researchers, Jolly established four research sites in the north and three in the south.  At each one-hectare site, the researchers set up a 10 × 10 grid of live traps (100 traps per site), baited with balls of peanut butter, rolled oats and honey. The target species was the grain-eating rodent Melomys burtonia. Over the course of the study, 439 individual melomys were captured, weighed, sexed and implanted with a microchip for identification purposes. The researchers wanted to know how the presence of predaceous quolls influenced melomys abundance, and whether melomys adjusted behaviorally to quoll presence.  

Research sites on Indian Island. Quolls were introduced to the northern part of the island in 2017.

They discovered that the three southern sites (without quolls) maintained relatively steady numbers of melomys throughout the study.  In contrast, the four northern sites (with quolls) showed a sharp decrease in melomys abundance. Complicating the issue, a wildfire broke out in August 2017, affecting only the northern part of the island.  The researchers believe this fire did not affect the melomys in any significant way, as wildfires are common in the area, and several previous studies have shown no effect of wildfires on melomys abundance.

Melomys population estimates at three southern sites (left graph) and four northern sites (right graph). The dashed orange line denotes quoll introduction, while the dashed red line indicates the wildfire in 2017. Error bars are 95% confidence intervals.

Shyness can be an adaptive behavior if predators are in your environment.  Jolly and his colleagues wanted to know if there was any difference in the shyness (or conversely – the boldness) of melomys from the north (with quolls) and south (without quolls). They set up arenas that were baited with the aforementioned peanut butter balls, and placed a live-trap with a melomys at the door to the arena.  The researchers then opened up the trap door and recorded whether the melomys entered the arena within 10 minutes. 

Experimental setup testing melomys responses to open-field tests. Credit: Chris Jolly.

After 10 minutes, each melomys was rounded up and placed back in its trap, and a red plastic bowl was put into the arena.  The trap was then reopened and the researchers recorded whether the melomys interacted with the red bowl.

Looking at the left graph, you can see that in 2017, northern melomys were much shyer than melomys from the predator-free south. By 2019, this difference was mostly gone.  But when it comes to exploring a novel object (right graph), the northern melomys still retained some of their fear in comparison to southern melomys.

Figure 4

Left graph.Proportion (+ 95% confidence intervals) of melomys that emerged from live traps within 10 minutes in the open-field test. Right graph. Proportion of melomys that interacted with the novel object in the experiment that tested for neophobia (fear of novel objects).

Lastly, Jolly and his colleagues tested the effect of living with quolls on melomys foraging behavior.  At nightfall, the researchers placed one wheat seed at 81 locations in each site. Control (unmanipulated) seeds were set out at 40 locations, while seeds that had been stored with quoll fur (and presumably smelled like quoll) were set out at 41 locations. At daybreak, the researchers counted the number of remaining seeds, so they could calculate seed removal. In the first session, conducted shortly after quoll release, they found no evidence of discrimination based on predator scent in either melomys population. But over time, the northern melomys began to discriminate based on quoll scent, while southern melomys continued to forage at the same rate on control and quoll-scented seeds.

Figure 5B

Mean seed take bias (the number of scented seeds – the number of control seeds) taken by north and south island melomys. Error bars are 95% confidence intervals.

The researchers conclude that introduction of quolls as a novel predator influenced melomys in two distinct ways.  First, quolls preyed on them and reduced melomys abundance.  But equally important, quolls changed melomys behavior. Soon after quoll introduction, invaded melomys populations were substantially shyer than the non-invaded populations.  But this changed over the next two years, with a reduction in general shyness in the invaded populations, and an increase in predator-scent aversion. In effect, melomys were fine-tuning their behavioral response to quoll invasion.

Unfortunately, the researchers can’t evaluate whether these behavioral changes result from learning or from natural selection.  Melomys has a short generation time, so natural selection could be strong, even over a short timespan.  However, because of low survival from one year to the next, there were not enough melomys to test whether individual behavior changed over time as a result of learning.  It is certainly plausible that natural selection and learning operate together to change melomys behavior following quoll introduction.  

note: the paper that describes this research is from the journal Ecology. The reference is Jolly, C. J., A. S. Smart, J. Moreen, J. K. Webb, G. R. Gillespie, and B. L. Phillips. 2021. Trophic cascade driven by behavioral fine-tuning as naïve prey rapidly adjust to a novel predator. Ecology 102(7): e03363. 10.1002/ecy.3363. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2021 by the Ecological Society of America. All rights reserved.

Stressed-out primates

The endangered black lion tamarin (Leontopithecus chrysopygus) lives in a mostly degraded and highly fragmented landscape in the state of São Paulo, Brazil.  Olivier Kaisin is a PhD student who wants to know whether declining environmental conditions are causing increased stress to the tamarins. Researchers often use glucocorticoid (GC) levels as a measure of physiological stress, as many animals, including primates, produce and release GCs in response to stress.  Many researchers have argued that prolonged elevation of GC levels has a negative impact on individual survival or reproduction, but it is not clear whether this is true for most primates. Given that 60% of primate species are currently threatened with extinction, it would be nice to know whether conservation biologists could use GC levels to identify populations that are at risk.

The black lion tamarin, Leontopithecus chrysopygus.

One of the unadvertised features of graduate programs is that students need to learn about their study system before doing research. In this spirit, before beginning his tamarin study, Kaisin (working with several other researchers) did a meta-analysis of all studies (published until 2020) that compared cortisol levels in primates from disturbed vs. undisturbed habitats to see if the type of disturbance influenced GC levels.  Disturbance types included hunting, tourism, habitat loss, ongoing logging, habitat degradation and other human activities.  Habitat loss was a reduction in forest fragment size to less than 500 hectares.  Habitat degradation resulted from logging in the past 20 years that led to changes in forest structure and diversity, but did not substantially reduce the size of the forest habitat.  Other human activities did not fit into the five disturbance types, and included activities such as mining, urbanization and access to rubbish.

The graph below shows the effects of the different disturbance types.  Hedges’ g is an effect size measure used in meta-analyses: a standardized difference between the mean values of two groups.  The midpoint of the bar (or the diamond in the case of the overall effect) is the mean value of Hedges’ g, while the endpoints of each bar (or diamond) indicate the 95% confidence interval.  If the entire interval does not overlap 0, then we can conclude that there is a statistically significant effect of that variable.  Based on this analysis, both hunting and habitat loss were associated with significant increases in glucocorticoid levels in primates, contributing to a significant overall increase in glucocorticoid levels in response to disturbance.
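For the statistically curious, here is a minimal sketch of how Hedges’ g and its 95% confidence interval are typically computed from the means, standard deviations and sample sizes of a disturbed and an undisturbed group. The numbers in the example are hypothetical, not taken from this study:

```python
import math

def hedges_g(mean_d, sd_d, n_d, mean_u, sd_u, n_u):
    """Hedges' g: bias-corrected standardized difference between the
    means of a disturbed (d) and undisturbed (u) group, with an
    approximate 95% confidence interval."""
    df = n_d + n_u - 2
    # Pooled standard deviation of the two groups
    sd_pooled = math.sqrt(((n_d - 1) * sd_d**2 + (n_u - 1) * sd_u**2) / df)
    d = (mean_d - mean_u) / sd_pooled      # Cohen's d
    g = (1 - 3 / (4 * df - 1)) * d         # small-sample bias correction
    # Approximate sampling variance of g
    var_g = (n_d + n_u) / (n_d * n_u) + g**2 / (2 * (n_d + n_u))
    half_width = 1.96 * math.sqrt(var_g)
    return g, (g - half_width, g + half_width)

# Hypothetical example: GC levels in a hunted vs. an unhunted population
g, ci = hedges_g(mean_d=55.0, sd_d=12.0, n_d=20, mean_u=45.0, sd_u=10.0, n_u=20)
print(round(g, 2), tuple(round(x, 2) for x in ci))  # -> 0.89 (0.24, 1.54)
```

Because this hypothetical interval excludes 0, the effect would count as statistically significant by the criterion described above.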

The influence of six types of disturbance on GC levels of 24 different primate species. * indicates statistically significant effects.

As Kaisin and his colleagues point out, six of the studies actually showed a significant decrease in GC levels in association with disturbance.  For example, howler monkeys had reduced GC levels in response to ongoing logging.  The researchers attribute this surprising GC decrease to the elimination of large predators from the logged forest, which substantially reduces howler monkey stress levels. As a second example, in Madagascar, an invasive tree species in the degraded site provided important fruits for red-bellied lemurs, leading to well-fed lemurs with reduced GC levels. Unfortunately, these confounding variables cannot be easily controlled, so researchers need to consider each study on a case-by-case basis. Some families of primates were more influenced by stress than others. In particular, hominids (great apes) and atelids (New World monkeys such as howler, spider and woolly monkeys) both showed significantly greater GC levels in association with stress.  Three families showed smaller increases, while three other families of primates were basically unaffected.

The influences of disturbance on eight different primate families (as measured by Hedges’ g). CI (95%) is the 95 percent confidence interval. Weight is a measure of the contribution of each primate family to the overall effect; families with more species/studies contribute more weight to the overall effect.

The researchers emphasize that many more studies are needed in order to understand when we should expect stress to elevate GC levels in primates.  For example, only one of the studies looked at stress effects on Asian primates. Future studies in endocrinological primatology should relate how prolonged stress influences fitness – including survival, growth and development and reproductive success.  In turn, this would allow the conservation community to understand the relationship between stress and future population viability.

note: the paper that describes this research is from the journal Conservation Biology. The reference is Kaisin, O., Fuzessy, L., Poncin, P., Brotcorne, F. and Culot, L., 2021. A meta‐analysis of anthropogenic impacts on physiological stress in wild primates. Conservation Biology 35(1): 101-114. Thanks to the Society for Conservation Biology for allowing me to use figures from the paper. Copyright © 2021 by the Society for Conservation Biology. All rights reserved.

Tasty truffles tempt mammalian dispersers

The first humans known to eat truffles were the Amorites (Old Testament victims of Joshua during his Canaan conquest) over 4000 years ago.  Many other animals eat truffles as well; in fact humans commonly use dogs (and sometimes pigs) to help them locate truffles, making good use of their highly developed sense of smell. Apparently pigs, and perhaps dogs as well, need to be muzzled so that they don’t consume this delightful fungal delicacy following a successful search.

Ryan Stephens was studying the small mammal community in the White Mountains of New Hampshire as part of his doctoral dissertation, and was particularly interested in what these mammals were eating.  It turned out that fungi comprised about 15% of the diet of the woodland jumping mouse, white-footed mouse, deer mouse and eastern chipmunk, and about 60% of the diet of the red-backed vole. So he and his colleagues began surveying truffles in the region, and discovered several new species in the process.  

New Hampshire’s White Mountains. Credit: Ryan Stephens

The Elaphomyces truffles we will discuss today are partners in ectomycorrhizal associations.  The fungal hyphae form a sheath around the roots of (primarily) eastern hemlock trees, providing soil nutrients to the tree in exchange for carbohydrates created by the tree’s photosynthetic processes.  What we call “truffles” are actually the fungal fruiting bodies, or sporocarps, which upon maturing develop massive numbers of spores that must somehow be dispersed.  This is a problem for an organism that is attached to underground tree roots.  The solution that has evolved in truffles is emission of volatile substances that communicate with truffle-loving mammals, informing these mammals where the truffles are, and that they are ripe and available.  Mammals dig the truffles up, eat them, and then defecate the spores in a new location that may be many meters or even kilometers away.  

Sporocarps (fruiting bodies) of the four truffle species used in this research. From left to right: (a) Elaphomyces americanus, (b) E. verruculosus, (c) E. macrosporus, (d) E. bartletti. In each photo one truffle has been cut in half, revealing the spores and the spore-bearing structures.

At the Bartlett Experimental Forest in the White Mountains, Stephens and his colleagues set up 1.1-ha grids at eight forest stands that were rich in eastern hemlock. Within each grid, they set up 48 sampling plots of 16 m² each for truffle collection.  They used a short-tined cultivator to dig up all truffles within each plot, counting, drying and weighing each sporocarp, and then analyzing each sporocarp for %N. Using simple arithmetic (and some assumptions), the researchers were able to convert %N to % digestible protein.
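The paper’s exact arithmetic is not reproduced here, but the general shape of such a conversion follows a common convention: %N is scaled to crude protein, then discounted by a digestibility coefficient. A minimal sketch with illustrative, assumed numbers:

```python
# Sketch of a common %N -> digestible protein convention (illustrative
# values, not the paper's exact assumptions): crude protein is estimated
# as %N x 6.25, then discounted for protein bound in indigestible
# compounds such as fungal chitin.

N_TO_CRUDE_PROTEIN = 6.25   # standard nitrogen-to-crude-protein factor
DIGESTIBILITY = 0.70        # assumed digestible fraction (hypothetical)

def percent_digestible_protein(percent_n):
    """Convert sporocarp %N to an estimated % digestible protein."""
    return percent_n * N_TO_CRUDE_PROTEIN * DIGESTIBILITY

print(round(percent_digestible_protein(3.0), 1))  # -> 13.1
```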

Short-tined cultivator in action, uncovering two sporocarps. Credit: Ryan Stephens.

It was impossible to measure the depth of each sporocarp, because raking the soil disturbed it too much for accurate measures.  Instead Stephens and his colleagues took advantage of previous research showing that soils dominated by ectomycorrhizal fungi have a specific pattern of how a stable isotope of heavy nitrogen (15N) is distributed in relation to the normal isotope (14N), with higher 15N concentrations the deeper you go. The sporocarps have 15N concentrations similar to the soil around them (actually slightly higher, but in a predictable way), so a researcher can measure the 15N/14N ratio of a sporocarp and estimate its depth in the soil column.  
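The logic of that estimate can be sketched in a few lines: assume soil 15N enrichment increases roughly linearly with depth and sporocarps sit a fixed offset above the surrounding soil, then invert the relationship. All the parameter values below are made up for illustration, not calibrations from the study:

```python
# Depth-from-isotopes sketch (illustrative numbers, not the paper's
# calibration): soil delta-15N rises roughly linearly with depth, and a
# sporocarp is enriched by a roughly constant offset above its soil, so
# a measured sporocarp delta-15N can be inverted to an estimated depth.

SLOPE = 1.5              # per-mil increase in soil delta-15N per cm depth (assumed)
SURFACE_D15N = 2.0       # soil delta-15N at the surface, per mil (assumed)
SPOROCARP_OFFSET = 3.0   # sporocarp enrichment above surrounding soil (assumed)

def estimate_depth_cm(sporocarp_d15n):
    """Invert the assumed linear soil profile to estimate depth (cm)."""
    soil_d15n = sporocarp_d15n - SPOROCARP_OFFSET
    return (soil_d15n - SURFACE_D15N) / SLOPE

print(estimate_depth_cm(11.0))  # -> 4.0 cm under these assumed parameters
```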

Stephens and his colleagues discovered that Elaphomyces verruculosus was, by far, the most abundant truffle (Figure a below).  Sporocarps of the four species occupied very different depths, with some overlap (Figure b).  E. verruculosus and E. macrosporus had more digestible protein than the other two species (Figure c).  Lastly, all species had similar sized sporocarps (Figure d).

Sporocarp (a) abundance, (b) depth in soil, (c) % digestible protein and (d) dry mass across the sampling grids. Thick horizontal lines in the box plots are median values, box limits are first and third quartiles, and notches are the 95% confidence intervals. The vertical lines above and below each box extend to the most distant data point that is within 1.5 times the interquartile range.

If a truffle’s sporocarp is deep below the soil surface, we might expect it to emit a stronger chemical signal than a sporocarp nearer to the surface, so it can attract a mammalian disperser.  Having a fully equipped chemical laboratory allowed the researchers to measure the quantity and type of chemical emitted by each species.  They discovered that E. macrosporus and E. bartletti, the two deepest species, had the strongest chemical signals – emitting relatively large quantities of methanol, acetone, ethanol and acetaldehyde.

Field research team of Andrew Uccello, Tyler Remick and Chris Burke – each has handfuls of E. verruculosus sporocarps. Credit: Ryan Stephens.

The question then becomes: which truffle species are small mammals most likely to eat?  If they go for the easiest to reach, they should prefer E. americanus.  If they go for the most nutritious, they should prefer E. verruculosus and E. macrosporus.  But if they prefer the ones with the strongest signal, they should focus their attention on E. macrosporus and E. bartletti.

On the surface, you might realize that it is difficult to figure out who is eating what.  This is true. To address this problem, the researchers analyzed the quantity and type of fungal spores defecated by mammals that were captured in live traps within their research grids.  Analysis of the feces indicated a consistent preference among all five species of small mammals for the two truffle species with the strongest signals, even though that required them to do a bit of extra digging.  The only exception to that trend was that red-backed voles consumed much more E. verruculosus than did the other mammals. Recall that E. verruculosus was one of the most nutritious truffles, and that fungi comprise more than 60% of a red-backed vole’s diet. So it was more important for that species to discriminate based on food quality.

Truffle selection by the small mammal community. N. insignis = woodland jumping mouse, P. maniculatus = deer mouse, M. gapperi = red-backed vole, P. leucopus = white-footed mouse, T. striatus = eastern chipmunk. Jacobs’ selection index measures consumption relative to the availability of each truffle species, with +1 representing strong preference and -1 representing strong avoidance of a particular truffle species. Error bars represent 95% confidence intervals around the mean.
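The selection index in the figure (often written Jacob’s index, from Jacobs 1974) has a standard closed form. A minimal sketch, with hypothetical numbers rather than values from the study:

```python
def jacobs_index(r, p):
    """Jacobs' selection index D = (r - p) / (r + p - 2*r*p), where r is
    the proportion of a food type in the diet and p is its proportion of
    what is available in the environment.  D runs from -1 (strong
    avoidance) through 0 (no selection) to +1 (strong preference)."""
    return (r - p) / (r + p - 2 * r * p)

# Hypothetical example: a truffle species makes up 10% of available
# sporocarps (p) but 40% of the spores in a vole's feces (r)
print(round(jacobs_index(r=0.4, p=0.1), 2))  # -> 0.71
```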

Why don’t the shallow truffle species emit stronger signals? One possibility is that a shallow truffle with a strong signal might get harvested and eaten before it is fully mature and its spores are viable. A second possibility is that shallow truffles might rely on soil disturbance or mammal activity (burrowing or just scurrying by) to make their way to the surface.  Upon reaching the surface, their spores can disperse in the wind like the spores of more traditional mushrooms. It requires considerable resources to produce these volatile compounds, so a truffle should only produce them if they are highly beneficial. Thus a stronger, energetically costly signal might not be necessary for the shallowest truffles, and may even be counterproductive.

note: the paper that describes this research is from the journal Ecology. The reference is Stephens, R. B.,  Trowbridge, A. M.,  Ouimette, A. P.,  Knighton, W. B.,  Hobbie, E. A.,  Stoy, P. C., and  Rowe, R. J..  2020.  Signaling from below: rodents select for deeper fruiting truffles with stronger volatile emissions. Ecology  101(3):e02964. 10.1002/ecy.2964. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2020 by the Ecological Society of America. All rights reserved.

Stress frequency structures communities

COVID-19 has amplified our experience of stress, but even in a COVID-free world, we share with most other organisms a continuously stressful existence, highlighted by situations affecting our survival (e.g. getting food and not becoming someone else’s food) and our reproductive success.  Today we will discuss organisms that live in a very stressful environment – the subtidal zone off of the Galapagos Islands – located just below the line demarcating the furthest extent of low tide.  One serious stress for subtidal organisms is coping with dramatically fluctuating ocean currents.  The speedy surgeonfish uses its powerful pectoral fins and slender, disc-shaped body to minimize drag, permitting feeding in high flow conditions brought about by powerful ocean waves.  In contrast, the broad-bodied, torpedo-shaped parrotfish is unable to do so; for it, fast water is too much of a drag.


Yellowtail surgeonfish (Prionurus laticlavius) stand out as voracious herbivores that can feed even in the most wave-swept coastlines of the Galapagos Islands. Credit: Dr. Alejandro Perez-Matus.

Waters near the Galapagos Islands are enriched by upwelling equatorial currents, which provide nutrients to a diverse community of plankton and benthic (attached to the ocean bottom) algae.  These in turn support a high diversity of macroinvertebrates and herbivorous fish that feed on them, including the pencil urchin, Eucidaris galapagensis, a voracious feeder on algae, barnacles and coral. This species wedges itself among rocks and crevices during the day, and emerges to feed at night.  It attaches itself (and moves very slowly) using its tube feet.  Robert Lamb, Franz Smith and Jon Witman hypothesized that given the weak attachment strength of the pencil urchin’s tube feet, it might only be an effective feeder in locations where wave action was minimal.


Robert Lamb bolts experimental cages to the rock as Eucidaris urchins stand guard at the sheltered side of Caamaño. Credit: Salome Buglass.

To explore how wave action might affect the subtidal community, the researchers set up two research locations at Caamaño and Las Palmas – both off the Galapagos Island of Santa Cruz.


Effect of wave action (exposed – dark bar, sheltered – light bar) on abundance of some of the important members of the subtidal community off of the island of Santa Cruz.


At each location, they chose an exposed site with strong wave action and a sheltered site that had much reduced wave action.  Mean flow speed was more than twice as fast at exposed sites as at sheltered sites. As you can see in the figure above, site differences in mean flow speed corresponded to differences in the subtidal community. Crustose coralline algae (red algae firmly attached to corals) were more common in sheltered sites (Figure A), while a variety of red and green macroalgae were more common at exposed sites (Figure B).  Surgeonfish (Figure C) and parrotfish (Figure D) were much more abundant in exposed areas, while pencil urchins were much more abundant in sheltered sites (Figure E).


Lamb and his colleagues wanted to know why these differences exist. They set up a series of exclosures within each of these sites using wire mesh cages to either allow fish, but not urchins (+ fish treatment), allow urchins but not fish (+ urchins), or exclude both groups of herbivores (- all).  They also had a control treatment that allowed all herbivores (+ all).


In one experiment the researchers created sandwiches made up of the delectable green alga Ulva.  For five days, they ran six replicates of each treatment at exposed and sheltered sites at Caamaño and Las Palmas. Lamb and his colleagues then harvested the sandwiches, weighed them, and calculated the percent remaining of each sandwich.


An Ulva sandwich

At exposed locations, urchins (without fish) consumed very little Ulva, while fish (without urchins) consumed about 2/3 of the Ulva (when compared to the –all controls). In contrast, at sheltered locations, urchins took some mighty significant bites from the Ulva sandwiches, while fish also ate substantial Ulva at Caamaño, but not at Las Palmas.


Percent of Ulva biomass remaining after five days of the Ulva sandwich experiment. Error bars are 1 SE.

In a related experiment, the researchers used the same cages to explore how macroalgal communities assemble themselves in the presence or absence of urchin and fish herbivores under different flow rates.  If this was not enough to consider, they also ran these experiments both during the cool season, when nutrient-rich ocean currents lead to high production, and during the warm season, when production is usually lower.  Lamb and his colleagues bolted two 13 × 13 cm polycarbonate plates to the bottom of each cage, and after two months measured the abundance and type of algae that colonized each plate.

Several trends emerge.  First, macroalgae colonized much more effectively during the cool season.  Second, urchins profoundly reduced macroalgal colonization at sheltered sites, but had little effect at exposed sites.  In contrast, fish herbivory reduced macroalgal colonization at exposed sites at Caamaño but not Las Palmas, during the warm and cool season.


Effect of herbivores on macroalgal community assembly, as measured by amount of algae colonizing the polycarbonate plates after two weeks.

In addition, the researchers set up video cameras and were able to document herbivory by 17 fish species, with drastically higher herbivory rates at exposed sites.

Lamb and his colleagues conclude that the dominant herbivores switched between urchins in sheltered, low-flow sites and fish in exposed sites. Fish can leave the resource patch when stress (flow rate) is unusually high, and return when flow rate drops, while the slow-moving pencil urchins do not have that option. The researchers argue that in many ecosystems, consumer mobility in relation to the frequency of environmental stress can predict how consumers influence community structure and assembly.  They point out that the coupling of mobility effects with environmental stress is common throughout the natural world.  As examples, many shorebirds feed on marine organisms that become available during low tides, or between crashing waves.  Large mammals in Africa can migrate long distances to escape drought-stricken areas, while smaller animals cannot undertake such long journeys.  In locally acidic regions of the Mediterranean Sea, many fish species can enter, feed and leave before experiencing toxic effects from the acid water, while slow-moving urchins are excluded from feeding in those habitats. Thus, while extreme environmental stress often decreases consumer activity, there are also times when it doesn’t.  In these cases, we need to understand how particular species will behave and perform in the stressful environment to predict how stress influences community structure and functioning.

note: the paper that describes this research is from the journal Ecology. The reference is Lamb, R. W.,  Smith, F., and  Witman, J. D..  2020.  Consumer mobility predicts impacts of herbivory across an environmental stress gradient. Ecology  101( 1):e02910. 10.1002/ecy.2910. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2020 by the Ecological Society of America. All rights reserved.

Birds and plants team up and trade off

For many years, ecologists have been puzzling over the question of why the world is so green.  Given the abundance of herbivores in the world, it seems, on the surface, that plants don’t stand a chance. The famous naturalist/ecologist Aldo Leopold was one of the first scientists to emphasize the role of predators, which provide a service for plants by eating herbivores (his example was wolves eating deer, ultimately preserving the plant community growing on a hillside).  As it turns out, there are many different predator species providing these services. Colleen Nell began her PhD program with Kailen Mooney with a keen interest in how insectivorous birds locate their prey, and how this could affect the plants that are being attacked by herbivorous insects.


 A Common Yellowthroat perches on Encelia californica. Credit: Sandrine Biziaux.

Plants are not as poorly defended as you might expect (having sat on a prickly pear cactus I can painfully attest to that).  In addition to thorns and other discouraging structures, many plants are armed with a variety of toxins that protect them against herbivores.  Thorns and toxins are examples of direct defenses.  But many plants use indirect defenses that involve attracting a predator to the site of attack.  Some plants emit volatile compounds that predators are attuned to; these compounds tell the predator that there is a yummy herbivore nearby.  Nell and Mooney recognized that plant morphology (shape and form) could also act as an indirect defense, making herbivorous insects more accessible to bird predators. They also recognized that we might expect a tradeoff between how much a plant invests in different types of defense.  For example, a plant that produces nasty thorns might not invest so much in a morphology attractive to predaceous birds.


California Coastal Cactus Wren eating an orthopteran insect on a prickly pear cactus. Credit: Sandrine Biziaux.

What is a plant morphology that attracts birds?  The researchers hypothesized that birds might be attracted to a plant with a simple branching pattern, so they could easily land on any branch that might be hosting a herbivorous insect (Encelia californica (first photo) has a simple or open branching pattern).  In contrast, birds might have a more difficult time foraging on structurally complex plants, where herbivorous insects may be difficult to reach.


Isocoma menziesii, a structurally complex plant. Credit: Colleen Nell.

The researchers chose nine common plant species from the coastal sage scrub ecosystem – a shrub-dominated ecosystem along the southern California coast. For each plant species they measured both its direct resistance and indirect resistance to herbivores.  Plants of each species were raised until they were four years old.  Then, for three months during bird breeding season, bird-protective mesh was placed over eight plants of each species, leaving five or six plants as unprotected controls.


Kailen Mooney and Daniel Sheng lower bird-protective mesh over a plant. Credit: Colleen Nell.

After three months, the researchers vacuumed all of the arthropods from the plants, measured each arthropod, and classified it to order or family to evaluate whether the arthropod was herbivorous.


Colleen Nell vacuums the arthropods from Artemisia californica. Credit: Colleen Nell.

Nell and Mooney evaluated the herbivore resistance of each plant species by measuring herbivore density on the bird-exclusion plants.  Relatively few herbivorous arthropods on plants that were protected from birds would indicate that these plants had strong direct defenses against herbivores.  The researchers also evaluated indirect defenses as the ratio of herbivore density on bird-exclusion plants to that on controls (technically the ln[exclusion density/control density]).  A much greater density of herbivores on plants protected from birds than on plants open to birds would indicate that birds were eating many herbivores. Finally, Nell and Mooney estimated plant complexity by counting the number of times a branch intersected an axis placed through the center of the plant at three different angles.  More intersecting branches indicated a more complex plant.
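The indirect-defense metric described above is simply a log response ratio. A minimal sketch, using hypothetical densities rather than data from the study:

```python
import math

def indirect_defense(density_exclusion, density_control):
    """Log response ratio ln(exclusion density / control density):
    herbivore density on mesh-covered (bird-exclusion) plants relative
    to open controls.  Values well above 0 mean birds were removing many
    herbivores (strong indirect defense); values near 0 mean birds had
    little effect."""
    return math.log(density_exclusion / density_control)

# Hypothetical example: 12 herbivores per plant under mesh vs. 4 on controls
print(round(indirect_defense(12, 4), 2))  # -> 1.1 (ln 3)
```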

The researchers expected a tradeoff between direct and indirect defenses.  As predicted, as herbivore resistance (direct defense) increased, indirect defenses from birds decreased among the nine plant species.


Tradeoff between direct herbivore resistance and indirect defense by predaceous birds, for nine common plant species in the coastal sage scrub ecosystem.

The researchers also expected that more structurally complex plants would be less accessible to birds because complex branching would interfere with bird perching and foraging.  Thus Nell and Mooney predicted that structurally more complex plants would have weaker indirect defenses from birds, which is precisely what they discovered.


Indirect defenses (from birds) in relation to plant structural complexity.

Given that structurally complex plants received little benefit from birds, you might expect that they had greater direct defenses in the form of herbivore resistance.  Once again the data support this prediction.


Direct defenses (herbivore resistance) in relation to plant structural complexity.

Initially, Nell was uncertain about whether increased plant complexity would deter insectivorous birds.  She points out that the top predators in this ecosystem are birds of prey that circle overhead in search of vulnerable birds to eat.  Structurally complex plants might provide refuge for insectivorous birds, which could lead them to spend more time foraging in complex plants.  But the research showed the opposite trend: plant complexity reduced the foraging efficiency of these small insectivorous birds, which prefer plants with relatively simple structure that are easier to access and tend to host more prey.

note: the paper that describes this research is from the journal Ecology. The reference is Nell, C. S., and K. A. Mooney. 2019. Plant structural complexity mediates trade-off in direct and indirect plant defense by birds. Ecology 100(10): e02853. 10.1002/ecy.2853. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Tadpoles shun trout across time

As a young school child (so long ago I can’t recall exactly when) I was exposed to Ernst Haeckel’s dictum that “ontogeny recapitulates phylogeny.”  More interested in language than biology at the time, I thought “cool – three words that I’m clueless about.” Though biological thinking about ontogeny – the processes of growth and development – has changed since Haeckel’s time, interest has, if anything, grown more intense across disciplines. Tiffany Garcia has explored her lifelong fascination with ontogeny by focusing her research on amphibians, which are famous for their distinct stages of development, each with unique habitats and ecological requirements. Working with eggs and tadpoles of the Pacific chorus frog (Pseudacris regilla), Garcia and her colleagues investigated whether stress associated with the presence of predators during one developmental stage (for example, the egg) would carry over to influence behavior or development in subsequent stages.


The Pacific chorus frog (Pseudacris regilla). Credit Brett Hanshew.

A tadpole’s anti-predator strategy can be influenced by other factors besides carry-over from earlier developmental stages.  For example, we might expect that tadpoles whose ancestors lived in association with predators for many generations might have evolved a different anti-predator strategy than did tadpoles whose ancestors lived in a less threatening environment (this would be an adaptive effect). Tadpoles may also show very short-term changes in behavior or development (this is termed plasticity) if exposed to a cue that indicated a possible predation threat.


Collecting newly laid eggs at Three Creeks Lake. Credit: Lindsey Thurman

These three processes operate over very different time scales (long term – adaptive; intermediate term – carry-over; short term – plastic).  Garcia and her colleagues designed an experiment to explore how these processes might interact to influence a tadpole’s anti-predator strategy.  To investigate long-term adaptive effects, the researchers collected newly laid (fertilized) eggs from lakes with and without rainbow trout (Oncorhynchus mykiss). They investigated carry-over effects by conditioning these eggs with four different environments during development: (1) trout odor, (2) cues from injured tadpoles (alarm cues), (3) trout odor paired with alarm cues, and (4) a water control (no odor or alarm cues).  The researchers created alarm cues by grinding up four juvenile tadpoles in 150 ml of water, and trout odor by housing 30 juvenile rainbow trout in a 200 L tank filled with well water.  They then conducted behavioral and developmental assays on tadpoles to see how adaptive, carry-over and plastic effects influenced tadpole growth, development and behavior.


Overview of the experimental design.

Garcia and her colleagues discovered that early exposure to trout odor had very little effect on growth and development, with body size and stage of development equivalent to those of controls.  In contrast, exposure of eggs to tadpole alarm cues or to alarm cues + trout odor resulted in smaller, less developed tadpoles (see table below).  In addition, there was no effect of evolutionary history – eggs from lakes with and without trout showed similar patterns of growth and development.


Tadpole size and development in response to the four conditioning treatments.  Higher Gosner stage numbers indicate more developed tadpoles. A tadpole hatches at Gosner stage 21 and begins metamorphosis at Gosner stage 42.

The next question is how tadpoles respond behaviorally to environments experienced over long, intermediate and short time scales.  To test tadpole anti-predator behavior, the researchers placed an individual tadpole into a tub containing a 6 X 8 cm piece of corrugated black plastic, which the tadpole could use as a refuge.  The researchers added to each tub one of the following: water (as a control (C)), tadpole alarm cues (AC), trout odor (TO), or alarm cues + trout odor (AC+TO).  After an acclimation period, a researcher noted the position of the tadpole (under the refuge or out in the open) every 20 minutes over a 3-hour period.

There were no effects of evolutionary history on refuge use: tadpoles from lakes with and without trout showed similar patterns of refuge use.  However, embryonic conditioning with alarm cues and trout odor had a large effect on refuge use.  The left graph below shows the response of tadpoles from all four conditioning groups (C, AC, TO and AC+TO) to the addition of water.  As you can see, tadpoles that hatched from eggs conditioned with AC+TO were most likely to use refuges, while tadpoles from AC-only or TO-only eggs were somewhat more likely to use refuges than controls. The pattern repeats itself when tadpole alarm cues are added to the water (second graph from left).  However, when trout odor is added to the water, the responses are much more extreme but follow the same pattern (third graph).  Lastly, when confronted with alarm cues and trout odor, tadpoles increase refuge use dramatically, but again show the same pattern, with tadpoles from control eggs using refuges the least and tadpoles from eggs conditioned with alarm cues and trout odor using refuges the most (right graph).


Refuge use by tadpoles in response to embryonic conditioning and experimental exposure. C = water control, AC = tadpole alarm cue, TO = trout odor, and AC+TO = tadpole alarm cue and trout odor. Blue bars are means and gray bars are 95% confidence intervals.

There are two processes going on here.  First, over the short term, tadpoles are more responsive to the strongest cues, increasing refuge use when exposed to both tadpole alarm cues and trout odor.  Second, over the intermediate term, there is solid evidence for carry-over effects: tadpoles that hatched from eggs conditioned with alarm cues and/or trout odor showed markedly greater refuge use than tadpoles that hatched from control eggs.

These predator-induced responses impose a cost on the tadpoles.  Tadpoles exposed to alarm cues and trout odor while still in the egg were smaller and less developed, and probably metamorphosed into smaller frogs.  Many studies have shown that smaller frogs have reduced reproductive success.  The researchers recommend further studies to explore these trade-offs between survivorship, growth rate, development rate and size at metamorphosis. Such studies are particularly important because rainbow trout are a non-native predator in these lakes.  Studies like these help conservation ecologists understand the evolution and development of predator-prey interactions when novel species are introduced into an ecosystem.

note: the paper that describes this research is from the journal Ecology. The reference is Garcia, T. S., E. M. Bredeweg, J. Urbina, and M. C. O. Ferrari. 2019. Evaluating adaptive, carry-over, and plastic antipredator responses across a temporal gradient in Pacific chorus frogs. Ecology 100(11): e02825. 10.1002/ecy.2825. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Gone gorilla

Humans and lowland gorillas (Gorilla gorilla gorilla) share many features, including strong social bonds among members of their group.  Lowland gorillas differ from humans in that one male (the silverback) dominates a group composed of several females and their offspring. Some mature males are unable to attract females and may be consigned to a solitary existence.  The silverback mates with the females in his group, and may allow other females to join.  However, if a female joins a new group with an unweaned infant, there is a high probability that the silverback will kill the infant, bringing the female into estrus more quickly so that he can father her future offspring.


A group of gorillas ranges over the landscape. Credit: Céline Genton CNRS/University of Rennes

The Odzala-Kokoua National Park in the Republic of Congo is home to several thousand lowland gorillas. Nelly Ménard and Pascaline Le Gouar (in affiliation with the ECOBIO laboratory CNRS/University of Rennes) have been studying two populations of these gorillas for over 20 years, and have identified and collected long-term data on 593 individuals from the two populations in their study. Working with their student, Alice Baudouin, and several other researchers, they documented that about 22% of the individuals were suffering from a yaws-like disease – an infectious skin disease caused by the bacterium Treponema pallidum pertenue.


A mother carries her infected infant. Credit: Ludovic Bouquier CNRS/University of Rennes

Females may disperse from their social group several times over the course of their lifetime.  Factors influencing the decision to disperse include the availability of a higher-quality silverback, reduced predation risk, and avoidance of inbreeding, resource competition and disease.  Given the prevalence and conspicuousness of yaws, the researchers suspected that these highly intelligent animals would use a variety of cues to decide whether to disperse and which group to attempt to join.  They expected that females should leave diseased silverbacks for healthy ones, that they should leave groups with numerous diseased individuals and immigrate into groups with healthy individuals, and that diseased females would be less likely to leave their group. Other factors influencing a gorilla’s decision might include group size, group age and whether she had an unweaned infant in her care.


Silverback gorilla viewed from the mirador (observation post). Credit: Céline Genton CNRS/University of Rennes.

Because they considered so many variables, the researchers used their dataset to construct models of the probability of emigration (leaving the group) and immigration (entering a new group). The research team categorized each breeding group based on the age of the oldest offspring: young (oldest offspring less than 4 years), juvenile (<7.5 years), mature (<11 years) and senescent (<14 years). Female gorillas were more likely to emigrate if their group had numerous infected individuals (graph a below) and if the silverback was severely infected (graph b). They were also more likely to leave an older breeding group, perhaps understanding that the silverback would be losing effectiveness in the near future (graph c).  Lastly, females with unweaned infants were very unlikely to leave a group (graph d), presumably unwilling to accept the risk that their infant might starve or be killed if they attempted to join a new group.


Probability an adult female emigrates from her group in relation to (a) number of severely diseased individuals within her group, (b) presence of severe lesions on the silverback, (c) age of the breeding group, and (d) presence of an unweaned infant.  Dotted lines (in graph a) and bars (in graphs b, c and d) indicate 95% confidence intervals.

The research team did a similar analysis of factors associated with female gorillas immigrating into a new breeding group.


Probability an adult female immigrates into a group in relation to (a) age of group, (b) presence of severely diseased individuals, and (c) group size. Bars (in graph a and b) and dotted lines (graph c) indicate 95% confidence intervals.

They discovered that females were much more likely to join younger groups, which had younger silverbacks (graph a).  In addition, females tended to join groups without any severely diseased individuals (graph b).  They were also attracted to smaller groups (graph c).

Based on these data, it is clear that disease strongly influences female dispersal decisions.  Females were much more likely to disperse from breeding groups with numerous infected individuals, and strongly avoided groups with more than two diseased individuals. This is not surprising, given how conspicuous these skin lesions are, particularly on the face.  Contrary to expectation, a female’s own disease status (infected or not) did not influence her tendency to disperse. The researchers suggest that dispersal might not be particularly costly to a female (assuming she does not have an unweaned infant) because the home ranges of social groups overlap broadly, making it easy to move from one group to another, and food is plentiful throughout the range.

Many features of a gorilla’s social environment influence its dispersal decisions. Because diseased females are as likely to disperse as healthy females, the pathogen may easily spread into previously uninfected gorilla populations.  On the other hand, dispersing females’ avoidance of diseased groups has the effect of quarantining those groups.  The researchers hope to gain a better understanding of how females appraise their social environment, so they can predict changes in the prevalence of this pathogen.

note: the paper that describes this research is from the journal Ecology. The reference is Baudouin, A., S. Gatti, F. Levrero, C. Genton, R. H. Cristescu, V. Billy, P. Motsch, J.-S. Pierre, P. Le Gouar, and N. Ménard. 2019. Disease avoidance, and breeding group age and size condition the dispersal patterns of western lowland gorilla females. Ecology 100(9): e02786. 10.1002/ecy.2786.  Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Hot ants defend plants from elephants

I’ve lost a lot of sleep over ants.  As a spider researcher, I often placed ants on spiderwebs to lure my spiders out of their underground retreats and onto their webs. The problem was that these harvester ants (Pogonomyrmex species) were fierce, so to minimize damage to myself, I was forced to capture them in the very early morning, when they and (alas) I were very sluggish.


Swollen thorn (domatia) that serves as living quarters for acacia ants. Credit: T. Palmer.

Todd Palmer has worked with ants for many years, including research on ant-plant mutualisms in which acacia trees provide domatia (swollen thorns) as ant living quarters and extrafloral nectaries as ant food, while ants provide protection from herbivores such as elephants, kudus and steenboks.

Similar to my efforts with ants and spiders, Palmer wanted to reduce ant-induced damage to himself and his colleagues, so he often took advantage of early morning ant sluggishness for purposes of manipulating acacia trees. On the other hand, if he wanted to study aggressive responses, he learned that mid-day was best. Recognizing the daily patterns of ant activity got Palmer, Ryan Tamashiro (Palmer’s undergraduate research student) and Patrick Milligan (Palmer’s graduate student) thinking about how these different levels of activity would influence herbivores, many of which tend to be most active during dawn and dusk when temperatures are low and ants are relatively sluggish.


Elephants are major herbivores that can cause enormous damage to acacia trees. Credit: T. Palmer.

Four species of ants live in domatia on branches of Acacia drepanolobium, the dominant tree species at Mpala Research Centre in Laikipia, Kenya.


A grove of Acacia drepanolobium. Credit: T. Palmer.

In order of relative abundance, the ant species are Crematogaster mimosae (52%), C. sjostedti (18%), Tetraponera penzigi (16%) and C. nigriceps (15%).  Previous research showed that C. mimosae and C. nigriceps are the two most effective acacia defenders.


Crematogaster nigriceps on an acacia tree. Credit: T. Palmer.

Ants are poikilotherms: their body temperature, and presumably their activity levels, fluctuate with environmental temperature.  As these ants live in acacia branches, the first order of business was to determine how branch temperature fluctuated with time of day during the 21 days of data collection.  Not surprisingly, branch temperature peaked at mid-day and was lowest at dawn and dusk (temperatures were not measured during the night).


Variation in branch surface temperature with time of day. Horizontal bars are median values; boxes are first and third quartiles.

Tamashiro, Milligan and Palmer next asked how ant activity level related to branch temperature.  Different ant species don’t get along so well, so each tree hosted only one ant species.  For each tree surveyed, the researchers counted the number of ants that passed over a 5 cm branch segment during a 30-second period (they did this twice for each tree).  They discovered a strong correlation between branch surface temperature and baseline ant activity, with C. mimosae and C. nigriceps showing the greatest activity at all temperatures and sharply increasing activity at higher temperatures.


Ant activity levels in relation to branch surface temperature. Shaded areas are 95% confidence intervals for each species.

Do higher temperatures cause a stronger aggressive response to predators or other disturbances? Tamashiro and his colleagues tested this by rapidly sliding a gloved hand over a 15 cm segment of a branch three times and then resting the gloved hand on the branch for 30 s.  They then removed the glove and counted the number of ants that had swarmed onto it.  Again, C. mimosae and C. nigriceps showed the strongest aggressive response, which increased sharply with temperature.


Aggressive swarming by ants in relation to branch surface temperature. Shaded areas are 95% confidence intervals for each species.

While a gloved hand is a nice surrogate for a threat, the researchers wanted to know how the ants would respond to a real browser, and whether the response was temperature dependent.  At the same time, they wanted to determine whether the browser would change its behavior in response to changes in ant defensive behavior at different temperatures.  They used eight Somali goats (Capra aegagrus hircus) as their browsers, and C. mimosae as the focal ant species for these trials.


Somali goats in Ali Sabieh, Djibouti. Credit: Cpl. Paula M. Fitzgerald, USMC – United States Department of Defense.

The researchers chose eight trees of similar size for their experiment, and removed ants from four of the trees by spraying them with a short-lived insecticide, preventing ant recolonization by spreading a layer of ultra-sticky solution (Tanglefoot) around the base of each treated tree.  Goats were then allowed to feed on each tree for five minutes.


Number of bites (top graph) and time spent feeding (bottom graph) by goats in relation to branch surface temperature. Shaded area is 95% confidence interval.

Tamashiro and his colleagues measured the number of bites taken (top graph) and the amount of time spent feeding (bottom graph) at different branch temperatures.  Neither measure of goat feeding was influenced by branch temperature when there were no ants on the trees (blue lines and points).  But if ants were present (red lines and points), goat feeding decreased sharply with increasing branch temperature, presumably reflecting more aggressive ant defense of the plants.

These findings have important implications for acacia trees, which are a critical species in the sub-Saharan ecosystem.  Previous research has shown that elephant damage is strongly influenced by the number of swarming ants on a particular tree: more swarming ants are associated with less elephant damage. Many vertebrate browsers feed throughout the day, but may feed preferentially at dawn and dusk, when temperatures are cooler and ant defense is weakest. Browsing is particularly problematic for acacia saplings, which are usually attacked by small-bodied vertebrates such as steenbok that forage primarily at night, when ants are least active.  Thus the effectiveness of ant defense may be compromised by mismatches between vertebrate and ant activity periods.

note: the paper that describes this research is from the journal Ecology. The reference is Tamashiro, R. A., P. D. Milligan, and T. M. Palmer. 2019. Left out in the cold: temperature-dependence of defense in an African ant–plant mutualism. Ecology 100(6): e02712. 10.1002/ecy.2712. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Mystifying trophic cascades

Within ecosystems, trophic cascades may occur when one species, usually a predator, has a negative effect on a second species (its prey), and thereby a positive effect on its prey’s prey. Today’s example considers the interaction between a group of predators (including several fish species, a sea snail and a sea star), their prey (the sea urchin Paracentrotus lividus), and the sea urchins’ prey, which comprises numerous species of macroalgae that attach to the shallow ocean floor. These predators can negatively affect sea urchin populations either by eating them (consumptive effects), or by scaring them so they forage less efficiently (nonconsumptive effects). If sea urchins are less abundant or less aggressive foragers, the net indirect effect of a large population of fish, sea snails and sea stars will be an increase in macroalgal abundance.


A large sea urchin grazing in a macroalgal community.  Notice the white halo surrounding the urchin, indicating that it has grazed all of the algae within that region. Credit: Albert Pessarrodona.

Many humans enjoy eating predatory fish, and we have overfished many of the ocean’s best fisheries, including the shallow temperate rocky reefs (4 – 12 m deep) of the northwest Mediterranean Sea. Removing these predators has caused sea urchin populations to explode; the urchins overgraze their favorite macroalgal foods, ultimately forming urchin barrens – large areas with little algal growth, low productivity and a small, nondiverse assemblage of invertebrates and vertebrates.


A sea urchin barrens whose macroalgae have been overgrazed by sea urchins. Credit: Albert Pessarrodona

Albert Pessarrodona became interested in this trophic cascade after years of diving in the Mediterranean. He noticed that in Marine Protected Areas, predatory fish abound and there are few visible urchins and lots of macroalgae. In nearby unprotected areas where fishing is permitted, urchins graze out in the open brazenly, and urchin barrens are common. He also wondered whether a second variable – sea urchin size – might play a role in this dynamic. Were large sea urchins relatively immune from predation by virtue of their large size and long spines, allowing them to forage out in the open even if predators were relatively common?


Interactions investigated in this study.  (a) Predators consume either small (left) or large (right) sea urchins (consumptive effects). (b) Sea urchins eat macroalgae. (c) Predators scare small or large sea urchins, reducing their foraging efficiency (nonconsumptive effects). (d) Predatory fish indirectly increase macroalgal abundance.

Pessarrodona and his research team used field and laboratory experiments to explore the relationship between sea urchin size and urchin survival and behavior under high-predator-risk and low-predator-risk conditions. The high-risk site was the Medes Islands Marine Reserve, which has had no fishing since 1983 and boasts a large, diverse assemblage of predatory fish, while the low-risk site was the nearby Montgri coast, which has a similar habitat structure but allows fishing. The researchers tethered 40 urchins of varying sizes to the sea bottom (about 5 m deep) in each region, left them for 24 hours, and then collected the survivors to compare survival in relation to body size under high- and low-risk conditions. They discovered that large urchins were much less likely to be eaten than small urchins, and that the probability of being eaten was substantially greater at the high-risk site.


Probability of being eaten in relation to sea urchin size (cm) in high-risk (blue line) and low-risk (green line) habitats.
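A binary outcome like this (eaten vs. survived over 24 hours) is typically modeled as a logistic function of a continuous predictor such as body size. A sketch of that curve shape, using made-up coefficients rather than the study’s fitted values:

```python
import math

def p_eaten(size_cm, intercept=2.0, slope=-0.8):
    """Logistic model of the probability an urchin is eaten as a function
    of its size. Coefficients are hypothetical, chosen only to illustrate
    the declining curve; a higher-risk site would shift the curve upward."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * size_cm)))

# Probability of predation falls steeply as urchins grow
for size in (2, 4, 6, 8):  # urchin test diameters in cm
    print(size, round(p_eaten(size), 2))
```

The negative slope is what produces curves like those in the figure: small urchins sit on the steep, high-risk part of the curve, while large urchins approach near-zero predation probability.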

Pessarrodona and his colleagues followed this up by investigating whether the relatively predation-resistant large urchins were less fearful, and thus more likely to forage effectively, even in high-risk sites. Previous studies showed that sea urchins can evaluate risk using chemical cues given off by other urchins injured in a predatory attack, or by the predators themselves. To explore the relationship between these cues and sea urchin behavior, the researchers put either large or small sea urchins into partitioned tanks with an injured sea urchin. Water flowed from one partition to the other, so the experimental sea urchins received chemical cues from the injured urchins. A control group of sea urchins was placed in similar tanks without injured urchins. The experimental sea urchins were given seagrass to feed on, and the researchers calculated feeding rates based on how much food remained after seven days.

Small sea urchins were not deterred by the presence of an injured urchin (left graph below), while large sea urchins drastically reduced their feeding rates in response to the presence of an injured urchin (middle graph). This was startling, as it flew in the face of the commonsense expectation that small sea urchins (the most susceptible to predation) should be most fearful of predator cues. The researchers repeated the experiment (under slightly different conditions), placing an actual predator (a fearsome sea snail) on the other side of the partition. Again, large urchins showed drastically reduced foraging rates (right graph below).


Sea urchin responses to predation risk cues in the laboratory. When exposed to injured urchins – symbolized as having a triangle cut out – (A) small urchins did not reduce their grazing rate, while (B) large urchins drastically curtailed grazing. (C) When exposed to a predatory snail on the other side of a partition, large urchins sharply curtailed grazing. n.s = no significant difference, **P<0.01.

It turns out that large sea urchins are the critical players in this trophic cascade because they do much more damage to algal biomass than do the smaller urchins (we won’t go through the details of that research). The question then becomes how this plays out in natural ecosystems. Do consumptive and non-consumptive effects of predators in high-risk sites reduce sea urchin abundance and reduce the foraging levels of large sea urchins so that macroalgal cover is greatly enhanced? Pessarrodona and his colleagues surveyed high-risk and low-risk sites for sea urchin density and algal abundance. They set up 45 quadrats (40 X 40 cm) at each site, measured each sea urchin’s diameter, and estimated the abundance of each type of algae by harvesting a 20 X 20 cm subsample from each quadrat and drying and weighing the sample.

The findings were striking. Small and large sea urchins were much less abundant at high-risk sites than at low-risk sites (left graph below). At the same time, macroalgae were much more abundant at high-risk sites than at low-risk sites (right graph below).


(Left graph) Density of small and large sea urchins in high-risk and low-risk habitats. (Right graph) Biomass of macroalgae of different growth structures in high-risk and low-risk habitats. Canopy algae are taller than 10 cm, while turf algae are lower stature. Codium algae are generally not grazed by sea urchins. **P<0.01, ***P<0.001.


Summary of interactions.  Arrow width indicates relative importance.

To summarize this system, predators reduce small sea urchin abundance by eating them (consumptive effects), and reduce large sea urchin foraging by intimidating them (nonconsumptive effects). The net indirect effect of predators on macroalgae is a function of these two effects. Large sea urchins are the major macroalgae consumers, but, of course, large sea urchins develop from small sea urchins.

The $64 question is why large sea urchins fear predators so much, while small (more vulnerable) urchins do not. The quick answer is that we don’t know. One possibility is that small sea urchins may be bolder in risky environments since they are more vulnerable to starvation (have fewer reserves), and also have lower reproductive potential since they are likely to die before they get large enough to reproduce. In contrast, large sea urchins can survive many days without food because of their large reserves. In addition, large urchins are close to sexual maturity, and thus may be unwilling to accept even a small risk to their well-being, which could interfere with them achieving reproductive success.

note: the paper that describes this research is from the journal Ecology. The reference is Pessarrodona, A., J. Boada, J. F. Pagès, R. Arthur, and T. Alcoverro. 2019. Consumptive and non-consumptive effects of predators vary with the ontogeny of their prey. Ecology 100(5): e02649. 10.1002/ecy.2649. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Invasive engineers alter ecosystems

Ecosystem engineers change the environment in a way that influences the availability of essential resources to organisms living within that environment.  Beavers are classic ecosystem engineers; they chop down trees and build dams that change water flow, provide habitat for many species, and alter nutrient and food availability within an ecosystem. Ecologists are particularly interested in understanding what happens when an invasive species also happens to be an ecosystem engineer: how are the many interactions between species influenced by the presence of a novel ecosystem engineer?

For her Ph.D. research, Linsey Haram studied the effects of the invasive red alga Gracilaria vermiculophylla on native estuarine food webs in the southeast USA. She wanted to know how much biomass this ecosystem engineer contributed to the system, how it decomposed, and which marine invertebrates ate it. She spent quite a lot of time in Georgia's knee-deep mud at low tide, and became acquainted with the shorebirds that zipped around her as she worked. She knew that small marine invertebrates are attracted to the seaweed and are abundant on algae-colonized mudflats, and she wondered whether the shorebirds were cueing into that. If so, the non-native alga could affect the food web both directly, by providing more food to invertebrate grazers, and indirectly, by providing habitat for marine invertebrates and thus boosting resources for shorebirds.


A least sandpiper forages on a red algae-colonized mudflat. Credit: Linsey Haram.

Since the early 2000s, Gracilaria vermiculophylla has dramatically changed estuaries in the southeast USA by creating novel habitat on mudflats that had previously been mostly bare, owing to high turbidity and a lack of hard surfaces for algal attachment. But this red alga has a symbiotic association with a native tubeworm, Diopatra cuprea, which attaches the seaweed to its tube, allowing the alga to colonize the mudflats. This creates a more hospitable environment for many different invertebrates, providing cover from heat, desiccation, and predators, while also providing food to invertebrates that graze on the algae.


Closeup view of the red alga Gracilaria vermiculophylla, an invasive ecosystem engineer.  Credit: Linsey Haram

Haram and her colleagues decided to investigate how algae presence might be influencing bird distribution and behavior.  They realized that this influence might be scale-dependent; on a large spatial scale birds may see the algae from afar and be drawn to an algae-rich mudflat, while on a smaller spatial scale, differences in foraging behavior may lead to differences in how a particular species uses the algal patches in comparison to bare patches.

To explore large-scale effects, the researchers counted all shorebirds (as viewed from a boat) on 500-meter transects along six bare mudflats and six algal mudflats. They also measured algal density (even algal mudflats have large patches without algae), and invertebrate distribution and abundance both on the surface and buried within the sediment. These surveys showed that shorebirds, in general, were much more common on algal mudflats. As you can see, this trend was stronger in some shorebird species than others, and one species (graph f below) showed no significant trend.


Field surveys of shorebird density (#/ha) on six bare mudflats compared to six mudflats colonized by Gracilaria vermiculophylla. * indicates weak trend (0.05 < P < 0.10), ** indicates a stronger difference (P < 0.05).  Bold horizontal bars are median values. Common names of species are (b) dunlin, (c) small sandpipers, (d) ruddy turnstone, (e) black-bellied plover, (f) semipalmated plover, (g) willet, (h) short-billed dowitcher.

Algal mudflats had a much greater abundance and biomass of invertebrates living on the surface, particularly isopods and snails, which presumably attracted some of these birds.  However, below the surface, there were no significant differences in invertebrate abundance and biomass when comparing mudflats with and without algae.

Having shown that on a large spatial scale shorebirds tend to visit algal mudflats, Haram and her colleagues then turned their attention to bird preferences on smaller spatial scales. First, they conducted experiments at an intermediate scale, observing bird foraging preferences on 10 × 20 plots with or without algae. They then zoomed in further, observing foraging behavior at a <1 m scale. On each sampling day, the researchers observed individuals of seven different shorebird species on a mudflat with algal patches, to see whether focal birds spent more time foraging on algal patches or on bare mud. During each 3-minute observation, researchers recorded the number of pecks made into algal patches vs. bare mud, and compared that to the expected peck distribution based on the observed ratio of algal cover to bare mud (which was 27:73).
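The peck-count comparison above amounts to testing observed foraging against the 27:73 algae-to-mud expectation. As a rough illustration of the logic (not the authors' actual analysis, and using made-up peck counts), an exact two-sided binomial test can be computed with nothing but the Python standard library:

```python
from math import comb

def binom_pvalue(k, n, p):
    """Two-sided exact binomial test: total probability of any outcome
    at least as unlikely as observing k successes in n trials."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    # Sum the probabilities of all outcomes no more likely than the observed one
    # (tiny tolerance guards against floating-point ties).
    return sum(q for q in pmf if q <= pmf[k] * (1 + 1e-9))

# Hypothetical bird: 18 of its 30 pecks landed in algae,
# even though algae covers only 27% of the mudflat.
p_val = binom_pvalue(k=18, n=30, p=0.27)
print(f"two-sided p = {p_val:.5f}")  # a small p suggests non-random preference for algae
```

A bird whose pecks fall roughly 27% in algae would yield a large p-value (consistent with random foraging), while the lopsided hypothetical counts above yield a small one, flagging a preference.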

At the smallest scale, two species, Calidris minutilla and Arenaria interpres, showed a very strong preference for foraging in algae, while a third species, Calidris alpina, showed a weak algal preference. In contrast, the small Calidris sandpipers (several difficult-to-distinguish species) and Charadrius semipalmatus strongly preferred foraging in bare mud, while the remaining two species showed no preference.


Small-scale foraging preferences (x-axis) of shorebirds. The solid blue curve is the strength of population preference (in terms of probability, y-axis) for bare mudflat, while the solid red curve is the strength of population preference for algae. Dashed curves are individual preferences. The red arrow at 0.27 indicates the proportion of the mudflat covered with algae, while the blue arrow at 0.73 represents the proportion of bare mudflat (together these indicate random foraging decisions). Filled arrows are significantly different from random, shaded arrows are slightly different from random, and unfilled arrows are random. Common names of species are: (a) dunlin, (b) least sandpiper, (c) small sandpipers, (d) ruddy turnstone, (e) semipalmated plover, (f) willet, (g) short-billed dowitcher.

If you compare the two sets of graphs above, you will note that in some cases shorebird preferences for algae are similar across large and small spatial scales, while for other species, preferences vary with spatial scale. For example, Arenaria interpres was attracted to algal mudflats on a large scale, and once present, these birds foraged exclusively amongst the algae, shunning any mud that lacked algae. Small sandpipers (Calidris species) were also attracted to algal mudflats on a large scale, but in contrast to Arenaria interpres, they foraged exclusively in bare mud rather than in the algae.

The researchers conclude that different species have different habitat preferences across spatial scales in response to Gracilaria vermiculophylla. Most, but not all, species were more attracted to mudflats that harbored the invasive ecosystem engineer. But once there, shorebird small-scale preference varied with species-specific foraging strategy. For example, the ruddy turnstone (Arenaria interpres), discussed in the previous paragraph, forages by turning over stones (hence its name), shells, and clumps of vegetation, eating any invertebrates it uncovers. Accordingly, it forages primarily in algal clumps. In contrast, willets (Tringa semipalmata), short-billed dowitchers (Limnodromus griseus) and dunlins (Calidris alpina) were all strongly attracted to algal mudflats, yet foraged essentially at random on a small spatial scale, showing little or no preference for algal clumps. The researchers explain that these three species use their very long beaks to probe deeply beneath the surface, using tactile cues to grab prey. So unlike the ruddy turnstone and other species that forage for surface invertebrates, they don't use the algae as a cue that food is available below. Thus species identity, and the morphology, behavior, and foraging niche that come with it, all shape how a community responds to an invasive ecosystem engineer.

note: the paper that describes this research is from the journal Ecology. The reference is Haram, L. E., Kinney, K. A., Sotka, E. E. and Byers, J. E. (2018), Mixed effects of an introduced ecosystem engineer on the foraging behavior and habitat selection of predators. Ecology, 99: 2751-2762. doi:10.1002/ecy.2495. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.