Spiders eat spiders sometimes

In human society, a guild is an association of craftsmen or merchants who work together to achieve a common goal. For example, 14th-century Paris boasted over 350 different guilds, including drapiers (cloth makers), knife-makers, locksmiths, helmet-makers and harness-polishers. Ecological guilds are similar to human guilds in that members of the same guild depend on the same resources for survival. But members of an ecological guild are different species, each of which uses a similar resource or group of resources.  As we shall now discover, as in human guilds, members of ecological guilds don’t always get along very well.

A guild is part of a food web, which is a summary of the feeding relationships within a community.  Israel Leinbach, Kevin McCluney and John Sabo were interested in one particular part of a food web – the relationship between a large wolf spider (Hogna antelucana), a small wolf spider (Pardosa species) and a cricket (Gryllus alogus).  Both spiders are in the same guild, because they obtain their energy from similar sources – insect prey.  This cricket specializes on willow and cottonwood leaves that fall to the ground in the semi-arid floodplain of the San Pedro River in southeast Arizona. Under natural conditions, the researchers observed the large spiders eating both the small spiders and crickets.  However, they never observed the small spider eating the relatively large cricket (which averages 20 times its mass), though small spiders are delighted to eat many other (smaller) insect species.

spideeatscricket.jpg

A large wolf spider subdues and begins to consume a cricket.  Credit: Kevin McCluney.

The researchers argue that even though guild members specialize on similar resources, it is important to consider how other resources might influence the relationships among the species.  During the dry season, water is a critical limiting resource.  As it turns out, large spiders, crickets and small spiders are very different in how much energy and water they contain. From the table below you can see that the small Pardosa spiders are very low in water content, but pack a huge amount of energy into their tiny bodies.  Crickets of both sexes have a high water content, but contain a relatively small amount of energy in their large bodies. Thus small spiders have a much higher energy/water ratio than crickets or large spiders.

SpiderTable2

Mean (+/- 1 standard error) dry mass, energy, water and energy/water ratio of the three species discussed in this report.

When water is limiting, the large spiders might devote themselves to eating crickets to take advantage of their very high water content. But when water is not limiting, the large spiders would be expected to turn their attention to eating small wolf spiders, which are much drier, but much higher in energy per unit body mass. The researchers reasoned that providing water to large spiders should increase the rate of intraguild predation (in this case, large spiders eating small spiders).

SpiderFig1

Interactions among the three species when water is limiting (Control – left) and abundant (Experimental – right).  Black arrows are direct effects, while gray arrows show the direction of energy flux.

Leinbach and his colleagues set up a mesocosm experiment using 2 × 2 × 2 meter cages in which they experimentally manipulated community composition and water availability.

spidercages.jpg

Network of cages set up in the San Pedro River floodplain. Credit: Kevin McCluney.

All cages, except controls, received either one large male or female spider, two small spiders (sex unknown) and two crickets (again either male or female). Controls received no large spiders, and were used to establish a baseline survival rate for the two potential prey items (small spiders and crickets). To test for the effects of water availability on predation by large spiders, the researchers placed water pillows that held approximately 30 ml of water into half of the enclosures. They predicted that large spiders would primarily eat energy-rich small spiders in cages with water pillows, but prefer water-rich crickets in cages without water pillows. The water pillows had minimal impact on cricket water levels, as the crickets got plenty of water from their food (green, water-rich leaves).

spiderwaterpillow.jpg

A large wolf spider sucks water from the water pillow.  Credit: Kevin McCluney.

Leinbach and his colleagues used per capita interaction strength as their quantitative measure of predation effects.  If prey survival was lower in the experimental cages than  in the control, there was a negative interaction strength – indicating that large spiders were eating a particular prey type.  When the researchers provided them with water, large spiders of both sexes ate significantly more small spiders than they did  without water supplements.
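To make the metric concrete, here is a minimal sketch of one common way to compute a per capita interaction strength from survival data (the so-called dynamic index). The function name and the numbers are mine, invented for illustration; Leinbach and colleagues may have used a different formulation.

```python
import math

def dynamic_index(prey_with_predator, prey_control, n_predators, duration_days):
    """One common per capita interaction strength metric:
    ln(N_with_predator / N_control) / (P * t).
    Negative values mean prey survived worse when the predator was present,
    i.e. the predator was eating that prey type. Illustrative only."""
    return math.log(prey_with_predator / prey_control) / (n_predators * duration_days)

# Hypothetical numbers: one large spider per cage, a 14-day trial,
# 1 of 2 small spiders surviving with the predator vs. 2 of 2 in controls.
print(dynamic_index(prey_with_predator=1, prey_control=2,
                    n_predators=1, duration_days=14))   # about -0.05
```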

SpiderFig4

Interaction strength (effect of predation) of large spider (Hogna antelucana) on the small spider (Pardosa species).  Both male and female large spiders have significant negative effects on small spiders when water is supplemented (blue bars), but have minimal effects without water supplements (gray bars).

But the story was very different with crickets.  The researchers expected that when supplemented with water, large spiders would bypass the water-rich crickets in favor of the energy-rich small spiders. Surprisingly, instead of crickets in cages with pillows surviving as well as controls, they actually survived better – at least male crickets did. One possible explanation is that spiders may emit odor (or other types of) cues that affect cricket behavior in a negative way, for example by causing them to feed more cautiously and inefficiently. Once the large spiders have killed the small spiders, there may be fewer spiders around to smell up the place, and crickets may feed more efficiently, and thus survive better.

spidermccluney.jpg

Israel Leinbach searches for spiders and crickets within a cage. Credit: Kevin McCluney.

I asked Kevin McCluney if there were any other surprising findings, and he pointed out that large male and female spiders showed very similar consumption patterns.  He expected that females would need more energy because egg production is very energy demanding.  One explanation for this lack of difference is that large male spiders may expend considerable energy wandering around in search of sexually receptive females, so their overall energy needs may be similar to those of females. Balancing the competing needs for energy, water and sex may be equally challenging for both sexes of large spiders, and may lead to adaptive feeding on different levels of the food chain as environmental conditions shift.

note: the paper that describes this research is from the journal Ecology. The reference is Leinbach, I., McCluney, K. E., and Sabo, J. L. 2019. Predator water balance alters intraguild predation in a streamside food web. Ecology 100(4): e02635. doi:10.1002/ecy.2635. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

A saltier Great Salt Lake supports a shifting ecosystem

In science, like many other fields, “who you know” can be critical to success. Eric Boyd from Montana State University was introduced to the Great Salt Lake (GSL) ecosystem by his colleague Bonnie Baxter, a professor at Westminster College and the Great Salt Lake Institute in Salt Lake City, Utah.  Baxter was fascinated by microbialites – deposits of carbonate mud of diverse shape and structure that harbor an impressive diversity and abundance of microorganisms.  Some of these microorganisms are photosynthetic, using dissolved inorganic carbon from the water to build carbohydrates; as such, they are the primary producers that feed the rest of the ecosystem. Baxter impressed upon Boyd the need to understand the ecosystem, which feeds huge populations of two consumer species, the brine fly Ephydra gracilis and the brine shrimp Artemia franciscana. Up to 10 million birds, representing about 250 species, feed on these two species over the course of a year.

dsc_0279.jpg

Eric Boyd collects samples from the north arm of GSL. Credit: Bonnie Baxter.

In 1959 a railroad causeway was built that divided GSL into a south and north arm, which differ from each other in one critical way.  The south arm receives freshwater input from three rivers, while the north arm’s only freshwater input is rain and snowmelt.  Both arms are hypersaline; the south arm is 4-5 times saltier than typical ocean water, while the north arm is about twice as salty as the south arm. Boyd and Baxter recognized that these salinity differences were probably impacting the microbial communities in the two arms; in fact preliminary observations indicated that microbialite communities were no longer forming in the north arm.  So when Melody Lindsay began her doctoral research with Boyd, she elected to investigate how salinity was influencing the microbialite communities in the lake.

DSC_0040

Melody Lindsay (right) and Bonnie Baxter (left) planning to sample in the south arm of GSL.  Credit: Jaimi Butler.

Lindsay and her colleagues collected samples of microbialite mats from the south arm of the lake where the salinity of the water measured 15.6% (as a comparison, typical ocean water is about 3.5%).  At each of six salinity levels (8, 10, 15, 20, 25 and 30%), the researchers set up three microcosms of 150 ml of lakewater, which they then inoculated with 10 grams of homogenized microbial mat. They then sampled microbial diversity and abundance four and seven weeks after beginning the experiment.

DSC_0033

Exposed microbialites along the south arm’s shoreline.  Credit: Eric Boyd.

This experiment was conceptually simple, but technically a bit of a challenge.  Microorganisms are difficult to identify and count, and in fact it is likely that some of the species were new to science. Fortunately, researchers can use molecular approaches (quantitative PCR) to measure the quantity of each type of 16S rRNA gene in each microcosm. Each species of microorganism has distinct rRNA genes, so different base sequences indicate different microorganisms.  This allows researchers to estimate how much of each species is present. One restriction is that closely related species have almost identical rRNA genes, so they may be difficult to distinguish from each other.

Overall, microorganism abundance was 152% greater after four weeks and 128% greater after seven weeks at 15% salinity. Recall that these samples came from microbialites growing at 15.6% salinity, so this finding indicates good growth at the salinity the microorganisms had recently experienced.  Interestingly, microorganisms thrived even better at 10% salinity.  But higher salinity levels, particularly 25% and 30%, were very detrimental to microbial growth.
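These percent changes are relative to the abundance measured at week 0. A minimal sketch of that arithmetic is shown below, using invented qPCR gene-copy counts chosen purely for illustration (the real counts come from the paper’s quantitative PCR data).

```python
def percent_change(copies_now, copies_week0):
    """Percent change in 16S rRNA gene copies relative to the week-0 baseline.
    +152% means abundance reached 2.52 times its starting value."""
    return 100 * (copies_now - copies_week0) / copies_week0

# Invented gene-copy counts per gram of mat at 15% salinity, for illustration only.
week0, week4, week7 = 1.0e6, 2.52e6, 2.28e6
print(percent_change(week4, week0))  # 152.0
print(percent_change(week7, week0))  # 128.0
```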

LindsayFig2

Change in abundance of the 16S rRNA gene in microcosms incubated for four and seven weeks, in comparison to abundance at week 0 for each salinity.  Significant differences are comparisons with abundance at week 0. NS = no significant difference, * P<0.1, ** P<0.01, *** P<0.001, **** P<0.0001. Error bars = 1 SE.

The researchers broke down their results into taxonomic Orders, based on the 16S rRNA sequence of each gene. The two most common Orders were Sphingobacteriales and Spirochaetales, which both grew best at low salinity. The next most common Orders were a cyanobacterium from the Order Chroococcales, and an alga from the Order Naviculales.  Species from these two taxonomic Orders are foundational to the ecosystem, because they are photosynthetic and relatively large. These dominant producers either directly or indirectly feed the rest of the ecosystem. Chroococcales grew best at intermediate salinities (10-20%), while Naviculales did best at 8-15%, but also reasonably well at 20% salinity (see the figure below for a summary of the most common Orders).

LindsayFig3

Abundance of taxonomic Orders of microorganisms incubated at different salinities for 4 and 7 weeks, in comparison to initial abundance (week 0 = yellow square). Darker green squares indicate a greater increase, and darker brown squares indicate a greater decrease in abundance.  The most common Orders are at the top, the least common at the bottom. Het = heterotroph, PP = primary producer, PhH = photoheterotroph.

Overall, primary productivity, as measured by how much dissolved inorganic carbon was taken up by the photosynthesizers, was greatest at 10 and 15% salinity, and declined sharply above 20%.  In addition, brine shrimp, one of the two important animal consumers of microorganisms, hatched and survived best at the lowest salinities.

Mating brine shrimp

Two mating brine shrimp under the watchful eyes of an observer. Credit: Hans Hillewaert.

Lindsay and her colleagues conclude that conditions in the south arm are conducive to microbialite communities and the consumers they support.  However, the north arm has much lower productivity, with salinity levels so high that salt is spontaneously crystallizing out of solution in some areas. Given that climate change models predict increased drought severity over the next century in the GSL region, it is very likely that salinity levels will rise throughout the lake.  Over the same time period, humans are expected to increase water usage from the rivers that flow into the lake, which will further drop water levels, increase salinity in GSL, and dry out many of the microbial mats. This loss of ecosystem production is expected to cascade up the ecosystem, reducing brine shrimp abundance and ultimately the abundance and diversity of the migratory birds that feed on them.

note: the paper that describes this research is from the journal Ecology. The reference is Lindsay, M. R., Johnston, R. E., Baxter, B. K., and Boyd, E. S. 2019. Effects of salinity on microbialite-associated production in Great Salt Lake, Utah. Ecology 100(3): e02611. doi:10.1002/ecy.2611. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2019 by the Ecological Society of America. All rights reserved.

Decomposition: it’s who you are and where you are

“Follow the carbon” is a growing pastime of ecologists and environmental researchers worldwide. In the process of cellular respiration, organisms use carbon compounds to fuel their metabolic pathways, so having carbon around makes life possible.  Within ecosystems, following the carbon is equivalent to following how energy flows among the producers, consumers, detritivores and decomposers. In soils, decomposers play a central role in energy flow, but we might not appreciate their importance because many decomposers are tiny, and decomposition is very slow.  We are thrilled by a hawk subduing a rodent, but are less appreciative of a bacterium breaking down a lignin molecule, even though, at their molecular heart, both processes are the same: complex carbon enters the organism and fuels cellular respiration.  However, from a global perspective, cellular respiration produces carbon dioxide as a waste product, which, if allowed to escape the ecosystem, will increase the pool of atmospheric carbon dioxide, thereby increasing the rate of global warming. So following the carbon is an ecological imperative.

As the world warms, trees and shrubs are colonizing regions that previously were inaccessible to them. In northern Sweden, mountain birch forests (Betula pubescens) and birch shrubs (Betula nana) are advancing into the tundra, replacing heath dominated by the crowberry, Empetrum nigrum. As he began his PhD studies, Thomas Parker became interested in the general question of how decomposition changes as trees and shrubs expand further north in the Arctic. On his first trip to a field site in northern Sweden he noticed that the areas of forest and shrubs produced a lot of leaf litter in autumn, yet there was no significant accumulation of this litter the following year. He wondered how the litter decomposed, and how this process might change as birch overtook the crowberry.

ParkerView

One of the study sites in autumn: mountain birch forest (yellow) in the background, dwarf birch (red) on the left and crowberry on the right. Credit: Tom Parker.

Several factors can affect leaf litter decomposition in northern climes.  First, depending on what they are made of, different species of leaves will decompose at different rates.  Second, the different types of microorganisms present will target different types of leaves with varying degrees of efficiency.  Lastly, the abiotic environment may play a role; for example, thanks to shading and the creation of discrete microenvironments, forests have a deeper snowpack, keeping soils warmer in winter and potentially elevating decomposer cellular respiration rates. Working with several other researchers, Parker tested the following three hypotheses: (1) litter from the more productive vegetation types decomposes more quickly, (2) all types of litter decompose more quickly in forest and shrub environments, and (3) deep winter snow (in forest and shrub environments) increases litter decomposition compared to heath environments.

To test these hypotheses, Parker and his colleagues established 12 transects that transitioned from forest to shrub to heath. Along each transect, they set up three 2-m² plots – one each in the forest, shrub, and heath – 36 plots in all. In September of 2012, the researchers collected fresh leaf litter from mountain birch, shrub birch and crowberry, which they sorted, dried and placed into 7 × 7 cm polyester mesh bags.  They placed six litter bags of each species at each of the 36 plots, and then harvested these bags periodically over the next three years. Bags were securely attached to the ground so that small decomposers could get in, but the researchers had to choose a relatively small mesh size to make sure they successfully enclosed the tiny crowberry leaves. This restricted access for some of the larger decomposers.

ParkerLitterBags

Some litter bags attached to the soil surface at the beginning of the experiment. Credit: Tom Parker.

To test for the effect of snow depth, the researchers also set up snow fences on nearby heath sites.  These fences accumulated blowing and drifting snow, creating a snowpack comparable to that in nearby forest and shrub plots.

Parker and his colleagues found that B. pubescens leaves decomposed most rapidly and E. nigrum leaves decomposed most slowly.  In addition, leaf litter decomposed fastest in the forest and most slowly in the heath.  Lastly, snow depth did not influence decomposition rate.

ParkerEcologyFig1

(Left graph) Decomposition rates of E. nigrum, B. nana and B. pubescens in heath, shrub and forest. (Right graph) Decomposition rates of E. nigrum, B. nana and B. pubescens in heath under three different snow depths simulating snow accumulation at different vegetation types: Heath (control), + Snow (Shrub) and ++ Snow (Forest). Error bars are 1 SE.

B. pubescens in forest and shrub lost the greatest amount (almost 50%) of mass over the three years of the study, while E. nigrum in heath lost the least (less than 30%).  However, B. pubescens decomposed much more rapidly in the forest than in the shrub between days 365 and 641. The bottom graphs below show that snow fences had no significant effect on decomposition.

ParkerEcologyFig2

Percentage of litter mass remaining for (a, d) E. nigrum, (b, e) B. nana, and (c, f) B. pubescens in heath, shrub, or forest. Top graphs (a, b, c) are natural transects, while the bottom graphs (d, e, f) represent heath tundra under three different snow depths simulating snow accumulation at different vegetation types: Heath (control), + Snow (Shrub) and ++ Snow (Forest). Error bars represent 1 SE. Shaded areas on the x-axis indicate the snow-covered season in the first two years of the study.
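Ecologists often summarize mass-loss data like these as a first-order decay constant k, assuming litter mass declines exponentially over time. The sketch below shows that calculation using round numbers similar to those quoted above; it is purely illustrative, and the paper’s own statistical analysis of decomposition rates may differ.

```python
import math

def decay_constant(fraction_remaining, years):
    """First-order litter decay constant k (per year), assuming
    mass_t = mass_0 * exp(-k * t). Illustrative only."""
    return -math.log(fraction_remaining) / years

# ~50% of mass lost over 3 years (B. pubescens in forest) vs. ~30% lost
# (E. nigrum in heath); values rounded from the text above.
print(decay_constant(0.50, 3))  # ~0.23 per year
print(decay_constant(0.70, 3))  # ~0.12 per year
```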

Why do mountain birch leaves decompose so much faster than crowberry leaves?  The researchers chemically analyzed both species and discovered that birch leaves had 1.7 times more carbohydrate than crowberry, while crowberry had 4.9 times more lipid than birch. Their chemical analysis showed that much of birch’s rapid early decomposition was a result of rapid carbohydrate breakdown. In contrast, crowberry’s slow decomposition resulted from its high lipid content, which is relatively resistant to the actions of decomposers.

ParkerResearchers

Researchers (Parker right, Subke left) harvesting soils and litter in the tundra. Credit: Jens-Arne Subke.

Parker and his colleagues did discover that decomposition was fastest in the forest independent of litter type. Forest soils are rich in brown-rot fungi, which are known to target the carbohydrates (primarily cellulose) that are so abundant in mountain birch leaves.  The researchers propose that a history of high cellulose litter content has selected for a biochemical environment that efficiently breaks down cellulose-rich leaves. Once the brown-rot fungi and their allies have done much of the initial breakdown, another class of fungi (ectomycorrhizal fungi) kicks into action and metabolizes (and decomposes) the more complex organic molecules.

The result of all this decomposition in the forest, but not the heath, is that tundra heath stores much more organic carbon than does the adjacent forest (which loses stored organic compounds to decomposers).  As forests continue their relentless march northward, replacing the heath, it is very likely that they will introduce their efficient army of decomposers to the former heathlands.  These decomposers will feast on the vast supply of stored organic carbon compounds, releasing large quantities of carbon dioxide into the atmosphere and further exacerbating global warming. This is one of several positive feedback loops expected to destabilize global climate systems in the coming years.

note: the paper that describes this research is from the journal Ecology. The reference is Parker, T. C., Sanderman, J., Holden, R. D., Blume‐Werry, G., Sjögersten, S., Large, D., Castro‐Díaz, M., Street, L. E., Subke, J. and Wookey, P. A. (2018), Exploring drivers of litter decomposition in a greening Arctic: results from a transplant experiment across a treeline. Ecology, 99: 2284-2294. doi:10.1002/ecy.2442. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Recovering soils suffer carbon loss

When dinosaurs roamed the Earth, and I was in high school, acid rain became big news.  Even my dad, who as an industrial chemist believed that industry seldom sinned, acknowledged that he could see how coal plants could release sulfur (and other) compounds, which would be converted to strong acids, borne by prevailing winds to distant destinations, and deposited by rain and snow into soils. Forest ecosystems in North America and Europe are happily, albeit slowly, recovering from the adverse effects of acid deposition, but there are some causes for concern.  At the Hubbard Brook Experimental Forest in New Hampshire, USA, researchers experimentally remediated some of the impacts of acid deposition by adding calcium silicate to a watershed (via helicopter!). A decade later, this treatment had caused a 35% decline in the total carbon stored in the soil. This result was very unexpected and alarming, because it could mean that, as the effects of acid rain wane, acid-impacted temperate forests may become major sources of carbon, with some running off into streams and some entering the atmosphere as CO2. Richard Marinos and Emily Bernhardt wanted to determine exactly what caused this carbon loss, to better understand how forests will behave in the future as they recover from acidification.

hubbrook

The forest at Hubbard Brook in the Autumn. Credit: Hubbard Brook Ecosystem Study at hubbardbrook.org

The problem is that calcium and acidity (lower pH is more acidic; higher pH is more alkaline) have different and complex effects on plants, soil microorganisms and the soils in which they live. Several previous studies demonstrated that higher soil pH (becoming more alkaline) caused an increase in carbon solubility, while higher calcium levels caused carbon to become less soluble. Soluble organic carbon forms a tiny fraction of total soil carbon, but is very important because it can be used by microorganisms for cellular respiration, and can also be leached from ecosystems as runoff. In general, soil microorganisms benefit as acidic soils recover because heavy metal toxicity is reduced, enzymes work better, and mycorrhizal associations are more robust.  Complicating the picture even more, both elevated calcium and increased pH have been associated with increased plant growth, but increased calcium is also associated with reduced fine root growth.

To help unravel this complexity, Marinos and Bernhardt experimentally tested the effects of increasing pH and increased calcium on soil organic carbon (SOC) solubility, microbial activity and plant growth.  They collected acidic soil from Hubbard Brook Experimental Forest, which formed three distinct layers: leaf litter on top, organic horizon below the leaf litter, and mineral soil below the organic horizon.

soil_excavation.jpg

Soil excavation site at Hubbard Brook. Credit: Richard Marinos.

The researchers then filled 100 2.5-liter pots with these three soil layers (in correct sequence) and planted 50 pots with sugar maple saplings, leaving 50 pots unplanted.  Pots were moved to a greenhouse, and that November given one of five treatments: calcium chloride addition (Ca treatment), potassium hydroxide addition (alkalinity treatment), Ca + alkalinity treatment combined, a deionized water control, and a potassium chloride control. The potassium chloride control had no effect, so we won’t discuss it further.

plants_outside

Potted sugar maple saplings used for the experiments. Credit: Richard Marinos.

The following July, Marinos and Bernhardt harvested all of the pots, carefully separating plant roots from the soil, and analyzing the organic horizon and mineral soil levels separately (there wasn’t enough leaf litter remaining for analysis). The researchers measured SOC by mixing soil from each pot with deionized water, centrifuging at high speed to extract the water-soluble material, combusting the material at high temperature and measuring how much CO2 was generated. The result is termed water extractable organic carbon (WEOC).

Remember that previous studies had shown that higher calcium levels decreased carbon solubility, while higher alkalinity increased carbon solubility. Surprisingly, Marinos and Bernhardt found that in unplanted pots, the Ca treatment reduced WEOC in both soil layers, while the alkalinity treatment decreased WEOC in the organic horizon, but not in mineral soil. In pots planted with maple saplings, the Ca treatment had no effect on WEOC, while the alkalinity treatment, and the Ca + alkalinity treatment, increased WEOC markedly.

marinosfig1

Water-Extractable Organic Carbon in soil without plants (left column) and with plants (right column). Top graphs are organic horizon soils and bottom graphs are mineral horizon soils. Error bars are 1 standard error.

The next question was how soil microorganisms might fit into these plant-soil dynamics.

marinosfig2b

Soil respiration rates (top) over the short term (days 1-7 post-harvest) and (bottom) over the long term (days 8-75 post-harvest). Error bars are 1 standard error.

Soil microorganisms use carbon products for cellular respiration, so the researchers expected that soils with more SOC would have higher respiration rates.  They measured soil respiration 1, 2, 4, 8, 16, 35 and 72 days after the harvest, so they could evaluate both short-term and long-term effects. In unplanted pots, soil respiration rates were unaffected by treatment.  But in planted pots, the alkalinity treatment increased soil respiration rates considerably in the short term (top graphs), but much less so in the long term (bottom graphs). Putting the WEOC data together with the respiration data in the two figures above, you can see that in pots with plants, increased alkalinity was associated with more soluble carbon (WEOC) and higher respiration rates.

The researchers weighed the saplings after harvest and discovered that the sugar maples grew best in soils treated with calcium. Two previous studies had treated fields with calcium silicate and found better sugar maple growth in the treated fields.  Marinos and Bernhardt argue that their study provides evidence that it was the Ca enrichment, and not the increased pH, that caused the increased growth in both of those studies.

Perhaps the most surprising finding is that higher alkalinity increased soil microbial activity only in pots with plants, and had no effect on soil microbial activity in pots without plants. Somehow, the plants in an alkaline environment are increasing the rate of microbial respiration, perhaps by releasing carbohydrates produced by photosynthesis into the soil, which could then stimulate decomposition of SOC by the microorganisms. The finding that this effect largely disappeared a few days after harvest (bottom graph above) supports the idea that the plants release a substance that helps microorganisms carry on cellular respiration. But this idea awaits further study. In the meantime, we have a better understanding of how forest recovery from acid rain affects one aspect of the carbon cycle, though many other human inputs may interact with this recovery process.

note: the paper that describes this research is from the journal Ecology. The reference is Marinos, R. E. and Bernhardt, E. S. (2018), Soil carbon losses due to higher pH offset vegetation gains due to calcium enrichment in an acid mitigation experiment. Ecology, 99: 2363-2373. doi:10.1002/ecy.2478. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

Sweltering ants seek salt

Like humans, ants need salt and sugar.  Salt is critical for a functioning nervous system and for maintaining muscle activity, while sugar is a ready energy source. In ectotherms such as ants, body temperature is influenced primarily by the external environment, with higher environmental temperatures leading to higher body temperatures.  When ants get hot, their metabolic rates rise, so they can go out and do energetically demanding activities such as foraging for essential resources like salt and sugar. On the down side, hot ants excrete more salt and burn more sugar.  In addition, as in humans, very high body temperatures can be lethal, so ants are forced to seek shelter during extreme heat.  As a beginning graduate student, Rebecca Prather wanted to know whether ants adjust their foraging for salt and sugar in response to the conflicting demands that elevated temperatures place on ants’ physiological systems.

Prather at field site

Rebecca Prather at her field site in Oklahoma, USA. Credit: Rebecca Prather.

Prather and her colleagues studied two different field sites: Centennial Prairie is home to 16 ant species, while Pigtail Alley Prairie has nine species.  For their first experiment, the researchers established three transects with 100 stations baited with vials containing cotton balls and either a 0.5% salt (NaCl) solution or a 1% sucrose solution.  The bait stations were 1 meter apart.  After 1 hour, they collected the vials (with or without ants), and counted and identified every ant in each vial.  They also measured soil temperature at the surface and at a depth of 10 cm, and repeated these experiments at 9 AM, 1 PM and 5 PM, April through October, four times each month.

AntsinVial.jpg

Ants recruited to vials with 0.5% salt solution.  Credit: Rebecca Prather.

Sugar is easily stored in the body, so while sugar consumption increases with temperature (due to increased ant metabolic rate), sugar excretion is relatively stable across temperatures.  In contrast, salt cannot be stored effectively, so salt excretion increases at high body temperatures.  Consequently, Prather and her colleagues expected that ant demand for salt would increase with temperature more rapidly than ant demand for sugar.

PratherFig1

Ant behavior in response to vials with 0.5% salt (dark circles) and 1% sucrose (white circles) at varying soil temperatures at 9 AM, 1 PM (13:00) and 5 PM (17:00). The three left graphs show the number of vials discovered (containing at least one ant), while the three right graphs show the number of ants recruited per vial.  The Q10 value = the rate of discovery or recruitment at 30°C divided by the rate of discovery or recruitment at 20°C. * indicates that the two curves have statistically significantly different slopes.
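For readers unfamiliar with Q10, the standard definition used throughout thermal ecology compares rates measured at any two temperatures; the ratio in the figure caption is simply the special case in which the two temperatures are exactly 10°C apart. In general,

$$Q_{10} = \left(\frac{R_2}{R_1}\right)^{10/(T_2 - T_1)},$$

where R1 and R2 are the rates (here, discovery or recruitment) measured at temperatures T1 and T2. When T2 − T1 = 10°C this reduces to Q10 = R2/R1, so, for example, a discovery rate twice as high at 30°C as at 20°C corresponds to a Q10 of 2.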

The researchers discovered that ants foraged more at high temperatures. However, when surface temperatures were too high (most commonly at 1 PM during summer months), ants could not forage and remained in their nests.  At all three times of day, ants discovered more salt vials at higher soil temperatures. Ants also discovered more sugar vials at higher temperatures in the morning and evening, but not during the 1 PM surveys. Most interesting, the slope of the curve was much steeper for salt discovery than it was for sugar discovery, indicating that higher temperature increased salt discovery rate more than it increased sugar discovery rate (three graphs on left).

When ants discover a high quality resource, they will recruit nestmates to the resource to help with the harvest.  Ant recruitment to salt, but not to sugar, increased with temperature, indicating that ant demand for 0.5% salt increased more rapidly than ant demand for 1% sugar (three graphs above on the right).

The researchers were concerned that the sugar concentrations were too low to excite much recruitment, so they replicated the experiments the following year using four different sugar concentrations.  Ant recruitment was substantially greater to higher sugar concentrations, but was still two to three times lower than it was to 0.5% salt.

PratherFig2

Ant recruitment (y-axis) to different sugar concentrations at a range of soil temperatures (x-axis). Q10 values are to the left of each line of best fit.

Three of the four most common ant species showed the salt and sugar preferences described above, but the other common species, Formica pallidefulva, actually decreased foraging at higher temperatures.  The researchers suggest that this species is outcompeted by the other, more dominant species at high temperatures, and is forced to forage at lower temperatures when fewer competitors are present.

In a warming world, ant performance will increase as temperatures increase up to ants’ thermal maximum, at which point ant performance will crash.  Ants are critical to ecosystems, playing important roles as consumers and as seed dispersers. Thus many ecosystems in which ants are common (and there are many such ecosystems!) may function more or less efficiently depending on how changing temperatures influence ants’ abilities to consume and conserve essential nutrients such as salt.

note: the paper that describes this research is from the journal Ecology. The reference is Prather, R. M., Roeder, K. A., Sanders, N. J. and Kaspari, M. (2018), Using metabolic and thermal ecology to predict temperature dependent ecosystem activity: a test with prairie ants. Ecology, 99: 2113-2121. doi:10.1002/ecy.2445. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

What grows up must go down: plant species richness and soils below

Almost 20 years ago, Dorota Porazinska was a postdoctoral researcher investigating whether plant diversity influenced the diversity of organisms that live in the soil below those plants, including bacteria, protists, fungi and nematodes (collectively known as soil biota).  Surprisingly, she and her colleagues discovered no linkages between aboveground and belowground species diversity.  She suspected that two issues were responsible for this lack of linkage. First, the early study lumped related species into functional groups – for example, nematodes that eat bacteria, or nematodes that eat fungi.  Lumping simplifies data collection but loses a lot of information, because individual species are not distinguished.  Back in those days, identifying species with DNA analysis was time-consuming, expensive, and often impractical. The second issue was that even if aboveground and belowground diversity were linked, the linkage might be difficult to detect.  Ecosystems are very complex, and many belowground species make a living off legacies of carbon or other nutrients that are the remains of organisms that lived many generations ago.  These legacy nutrient pools allow for indirect (and thus more difficult to detect) linkages between aboveground and belowground species.

Porazinska and her colleagues reasoned that if there were aboveground/belowground relationships, they would be easiest to detect in the simplest ecosystems, which lack significant pools of legacy nutrients. They also used molecular techniques that were not readily available for earlier studies to identify distinct species based on DNA analysis. The researchers established 98 circular plots of 1-m radius at the Niwot Ridge Long Term Ecological Research Site in the Rocky Mountains of Colorado, USA. At each plot, they identified and counted each vascular plant, and recorded the presence of moss and lichen.  They also censused the soil biota, using a variety of DNA amplification and isolation techniques that allowed them to identify bacteria, archaea, protists, fungi and nematodes to species.

PorazinskaOpening9256 Photo

Field assistant Jarred Huxley surveys plants in a high species richness plot. Credit: Dorota L. Porazinska.

As expected in this alpine environment, plant species richness was quite low, averaging only 8 species per plot (range = 0 – 27).  In contrast to what had been found in other ecosystems, high plant diversity was associated with high diversity of soil biota.

PorazinskaEcologyFig1

Relationship between plant richness (x-axis) and soil biota richness (y-axis) for (A) bacteria, (B) eukaryotes (excluding fungi and nematodes), (C) fungi, and (D) nematodes.  OTUs are operational taxonomic units, which represent organisms with very similar or identical DNA sequences on a marker gene.  For our purposes, they represent distinct species.

Looking at the graphs above, you can see that different groups responded to different degrees; nematodes had the strongest response to increases in plant richness, while fungi had the weakest response.  When viewed at a finer level, some groups of soil organisms, including photosynthetic microorganisms such as cyanobacteria and green algae, actually decreased, presumably in response to competition with aboveground plants for light and possibly nutrients.

Given the strong relationship between plant species richness and soil biota richness, Porazinska and her colleagues next explored whether high plant richness was associated with soil nutrient levels (nutrient pools).  In general, there was a strong correlation between plant species richness and nutrient pools (see the graphs below).  But soil moisture and the ability of the soil to hold moisture were the two most important factors associated with nutrient pools.

PorazinskaEcologyFig2

Amount (micrograms per gram of soil) of carbon (left graph) and nitrogen (right graph) in relation to plant species richness.

Ecologists studying soil processes can measure the rates at which microorganisms are metabolizing nutrients such as carbon, phosphorus and nitrogen.  The expectation was that if high plant species richness was associated with higher soil biota richness, and larger soil nutrient pools, then the activity of enzymes that metabolize soil nutrients should proportionally increase with these factors.  The researchers found that enzyme activity was very low where plants were absent or rare, and greatest in complex plant communities.  But the most important factors influencing enzyme activity were the amount of organic carbon present within the soil, and the ability of the soil to hold water.

PorazinskaClosing4427

Patchy vegetation at the field site. Credit: Cliffton P. Bueno de Mesquita.

Porazinska and her colleagues hypothesize that the relationships between plant species richness, soil biota richness, nutrient pools, and soil processes such as enzyme activity exist in most ecosystems, but are obscured by indirect linkages between these different levels, which makes them difficult to observe in ecosystems such as grasslands and forests.  In these more complex ecosystems, carbon inputs into the soil form large legacy carbon pools. These carbon pools, and the ability of the soil to hold nutrients, fundamentally influence the abundance and richness of soil biota. In contrast, in nutrient-poor soils, such as high Rocky Mountain alpine meadows, legacy carbon pools are rare and small. Consequently, plants and soil biota interact more directly, and correlations between plant species diversity and soil biota diversity are much easier to detect.

note: the paper that describes this research is from the journal Ecology. The reference is Porazinska, D. L., Farrer, E. C., Spasojevic, M. J., Bueno de Mesquita, C. P., Sartwell, S. A., Smith, J. G., White, C. T., King, A. J., Suding, K. N. and Schmidt, S. K. (2018), Plant diversity and density predict belowground diversity and function in an early successional alpine ecosystem. Ecology, 99: 1942-1952. doi:10.1002/ecy.2420. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.

 

Mangroves partner with rats in China

Many of us have seen firsthand the havoc that invasive plants can wreak on ecosystems.  We are accustomed to thinking of native plants as unable to defend themselves, much like a skinny little kid surrounded by a group of playground bullies. ‘Not so fast,’ says Yihui Zhang.  As it turns out, many native plants can defend themselves against invasions, and they do so with the help of unlikely allies.

In southern China, mangrove marshes are being invaded by the salt marsh cordgrass, Spartina alterniflora, which is native to the eastern USA coastline. Cordgrass seeds can float into light gaps among the mangroves, and then germinate and choke out mangrove seedlings.  However, intact mangrove forests can resist cordgrass invasion.  Zhang and his colleagues wanted to know how they resist.

mangrove-Spartina ecotone

Cordgrass (pale green) meets mangrove (bright green) as viewed from space. Credit: Yihui Zhang.

Cordgrass was introduced into China in 1979 to reduce coastal erosion.  It proved up to the task, quickly transforming mudflats into dense cordgrass stands, and choking out much of the native plant community.  Dense mangrove forests grow near river channels that enter the ocean, and are considerably taller than their cordgrass competitors.  The last player in this interaction is a native rat, Rattus losea, which often nests on mangrove canopies above the high tide level. At the research site (Yunxiao), many rat nests were built on mangroves, using cordgrass leaves and stems as the building material.

zhangnest.png

Rat nest constructed from cordgrass shoots rests upon a mangrove tree.  Credit: Yihui Zhang.

Zhang and his colleagues suspected that cordgrass invasion into the mangrove forest was prevented by both competition from mangroves and herbivory by rats on cordgrass.

Baby rat in the nest

Baby rats in their nest. Credit: Yihui Zhang.

 

To test this hypothesis, they built cages to exclude rats from three different habitats: open mudflats (primarily pure stands of cordgrass), the forest edge, and the mangrove forest understory (with almost no cordgrass). They also set up control plots that had cages but still allowed rats to enter.

zhangregenshoot

Arrow points to resprouting cordgrass. Credit: Yihui Zhang.

The researchers planted 6 cordgrass ramets (genetically identical pieces of live plant) in each plot and then monitored rodent grazing, resprouting of original shoots following grazing, and shoot survival over the next 70 days.

They discovered that the cages worked; no rats grazed inside the cages.  But in the control plots, grazing was highest in the forest understory and lowest in the mudflats (Top figure below).  Most important, both habitat type and exposure to grazing influenced cordgrass survival.  In the understory, rodent grazing was very important; only one ramet survived in the control plots, while 46.7% of ramets survived if rats were excluded.  In the other two habitats, grazing did not affect ramet survival, which was very high with or without grazing (Middle figure). Rodent grazing effectively eliminated resprouting of ramets in the understory, but not in the other two habitats (Bottom figure).

Zhangfig2

Impact of rat grazing on cordgrass in the field study in three different habitats.  Top figure is % of stems grazed, middle figure is transplant survival, and bottom figure is resprouting after grazing (there was no grazing in the rodent exclusion plots). Error bars are 1 standard error. Different letters above bars indicate significant differences between treatments.

The researchers suspected that low light levels in the understory were preventing cordgrass from resprouting after rat grazing. This was most easily tested in the greenhouse, where light conditions could be effectively controlled.  High light was 80% of the intensity of outdoor sunlight, medium light was 33% (about what strikes the forest edge) and low light was 10% of the intensity of outdoor sunlight (similar to mangrove understory light).  Rat grazing was simulated by cutting semi-circles into the stem base, peeling back the leaf sheath, and digging out the leaf tissue. Cordgrass ramets were planted in large pots, exposed to the different light and grazing treatments, and monitored for survival, growth and resprouting following grazing.

Greenhouse setup

Cordgrass growing in greenhouse under different light treatments. Credit: Yihui Zhang.

Zhang and his colleagues found that simulated grazing sharply reduced cordgrass survival from 85% to 7% at low light intensity, but had no impact on survival at medium or high light intensities.  Cordgrass did not resprout after simulated grazing at low light intensity, in contrast to approximately 50% resprouting at medium and high light intensity.

ZhangFig4

Survival (top) and resprouting (bottom) of cordgrass following simulated grazing in the greenhouse experiment.

The researchers conclude that grazing by rats and shading by mangroves are two critical factors that make mangroves resistant to cordgrass invasion. Rats tend to build their nests near the mangrove forest edge, so it is not clear how far into the forest the rat effect extends. Rats do prefer to forage in the understory (rather than right along the edge), presumably because the understory helps to protect them from predators.  In essence, mangroves compete directly with cordgrass by shading them out, and also indirectly by attracting cordgrass-eating rats. Conservation biologists need to be aware of both direct and indirect effects when designing management programs for protecting endangered ecosystems such as mangrove forests.

note: the paper that describes this research is from the journal Ecology. The reference is Zhang, Y. , Meng, H. , Wang, Y. and He, Q. (2018), Herbivory enhances the resistance of mangrove forest to cordgrass invasion. Ecology. Accepted Author Manuscript. doi:10.1002/ecy.2233. Thanks to the Ecological Society of America for allowing me to use figures from the paper. Copyright © 2018 by the Ecological Society of America. All rights reserved.