Tuesday, October 26, 2021

Scientists reveal genetic secrets of stress-tolerant mangrove trees


Mangrove trees use changes in gene activity, including the activity of parasitic ‘jumping genes’, to increase their resilience to stress, a new study finds.


Peer-Reviewed Publication

OKINAWA INSTITUTE OF SCIENCE AND TECHNOLOGY (OIST) GRADUATE UNIVERSITY

Mangrove trees at the oceanside and riverside

IMAGE: (Left) Mangrove trees growing near the ocean experience high levels of salinity and are small in stature. (Right) Mangrove trees growing upriver have less saline, brackish conditions and grow taller, with thicker trunks and larger leaves. These trees were surveyed by Dr. Matin Miryeganeh (pictured) and her colleagues as part of a new study featured in the press release, “Scientists reveal genetic secrets of stress-tolerant mangrove trees.”

CREDIT: OIST

  • Mangrove trees live in harsh environments and have evolved a remarkable resilience to stress
  • Researchers have now decoded the genome of the mangrove tree Bruguiera gymnorhiza, which contains 309 million base pairs with an estimated 34,403 genes
  • The genome is larger than those of other known mangrove trees, with a quarter of it composed of parasitic ‘jumping genes’ called transposons
  • The researchers also compared gene activity between mangrove trees grown in environments with high salinity and those grown in conditions with low salinity
  • Mangrove trees grown in more stressful, highly saline conditions suppressed the activity of the transposons and increased the activity of stress-response genes

Mangrove trees straddle the boundary between land and ocean, in harsh environments characterized by rapidly changing levels of salinity and low oxygen. For most plants, these conditions would mark a death sentence, but mangroves have evolved a remarkable resistance to the stresses of these hostile locations.

Now, researchers from the Okinawa Institute of Science and Technology Graduate University (OIST) have decoded the genome of the mangrove tree, Bruguiera gymnorhiza, and revealed how this species regulates its genes in order to cope with stress. Their findings, published recently in New Phytologist, could one day be used to help other plants be more tolerant to stress.

“Mangroves are an ideal model system for studying the molecular mechanism behind stress tolerance, as they naturally cope with various stress factors,” said Dr. Matin Miryeganeh, first author of the study and a researcher in the Plant Epigenetics Unit at OIST.

Mangroves are an important ecosystem for the planet, protecting coastlines from erosion, filtering out pollutants from water and serving as a nursery for fish and other species that support coastal livelihoods. They also play a crucial role in combating global warming, storing up to four times as much carbon in a given area as a rainforest.

Despite their importance, mangroves are being deforested at an unprecedented rate, and due to human pressure and rising seas, are forecast to disappear in as little as 100 years. And genomic resources that could help scientists try to conserve these ecosystems have so far been limited.

The mangrove project, which was initially suggested by Sydney Brenner, one of the founding fathers of OIST, began in 2016, with a survey of mangrove trees in Okinawa. The scientists noticed that the mangrove tree, Bruguiera gymnorhiza, showed striking differences between individuals rooted in the oceanside, with high salinity, and those in the upper riverside, where the waters were more brackish.

“The trees were amazingly different; near the ocean, the height of the trees was about one to two meters, whereas further up the river, the trees grew as high as seven meters,” said senior author, Professor Hidetoshi Saze, who leads the Plant Epigenetics Unit. “But the shorter trees were not unhealthy – they flowered and fruited normally – so we think this modification is adaptive, perhaps allowing the salt-stressed plant to invest more resources into coping with its harsh environment.”

Unlike long-term evolutionary adaptation, which involves changes to the genetic sequence, adaptations to the environment that take place over an organism’s lifespan occur via epigenetic changes. These are chemical modifications to DNA that affect the activity of different genes, adjusting how the genome responds to different environmental stimuli and stresses. Organisms like plants, which can’t move to a more comfortable environment, rely heavily on epigenetic changes to survive.

Before focusing on how the genome was regulated, the research team first extracted DNA from the mangrove tree, Bruguiera gymnorhiza, and decoded the genome for this species. They found that the genome contained 309 million base pairs, with a predicted 34,403 genes – a much larger genome than those of other known mangrove tree species. The large size was mostly due to repetitive sequences, which make up almost half of the DNA.

When the research team examined the type of repetitive DNA, they found that over a quarter of the genome consisted of genetic elements called transposons, or ‘jumping genes.’

Prof. Saze explained: “Active transposons are parasitic genes that can ‘jump’ position within the genome, like cut-and-paste or copy-and-paste computer functions. As more copies of themselves are inserted into the genome, repetitive DNA can build up.”
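To get a feel for the arithmetic behind that repeat statistic, here is a minimal sketch (not the authors' annotation pipeline) of how the fraction of a genome covered by annotated transposons can be computed from a list of intervals; the intervals and genome size below are invented.

    # Minimal sketch: fraction of a genome covered by annotated transposon
    # intervals. Real input would come from a repeat-annotation tool; the
    # toy intervals and genome size here are hypothetical.
    def covered_fraction(intervals, genome_size):
        covered, last_end = 0, -1
        for start, end in sorted(intervals):
            if start > last_end:        # disjoint interval: count it fully
                covered += end - start
                last_end = end
            elif end > last_end:        # overlap: count only the new part
                covered += end - last_end
                last_end = end
        return covered / genome_size

    transposon_intervals = [(0, 500), (400, 900), (2000, 2600)]
    print(f"{covered_fraction(transposon_intervals, 10000):.1%}")  # -> 15.0%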

Transposons are a big driver of genome evolution, introducing genetic diversity, but they are a double-edged sword. Disruptions to the genome through the movement of transposons are more likely to cause harm than provide a benefit, particularly when a plant is already stressed, so mangrove trees generally have smaller genomes than other plants, with suppressed transposons.

However, this isn’t the case for Bruguiera gymnorhiza; the scientists speculate that because this mangrove species is more ancestral than others, it may not have evolved an efficient means of suppressing transposons.


CAPTION

Researchers from the OIST Plant Epigenetics Unit grew mangrove trees under controlled conditions in the laboratory to see the effect of different salinity levels. Their findings were part of a new study featured in the press release “Scientists reveal genetic secrets of stress-tolerant mangrove trees.”

CREDIT

OIST

The team then examined how activity of the genes, including the transposons, varied between individuals in the oceanside location with high salinity, and individuals in the less saline, brackish waters upriver. They also compared gene activity for mangrove trees grown in the lab, under two different conditions that replicated the oceanside and upriver salinity levels.

Overall, in both the oceanside individuals and those grown in high salinity conditions in the lab, genes involved in suppressing transposon activity showed higher expression, while genes that normally promote transposon activity showed lower expression. In addition, when the team looked specifically into transposons, they found evidence of chemical modifications on their DNA that lowered their activity.

“This shows that an important means of coping with saline stress involves silencing transposons,” said Dr. Miryeganeh.

The researchers also saw increases in the activity of genes involved in stress responses in plants, including those that activate when plants are water-deprived. Gene activity also suggested the stressed plants have lower levels of photosynthesis.
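The press release reports these comparisons qualitatively; purely as an illustration (toy numbers, not the study's RNA-seq pipeline), per-gene differences between two conditions are commonly summarized as log2 fold changes, as in the sketch below.

    import math

    # Toy example: log2 fold change of expression, high vs. low salinity.
    # Gene names and counts are invented for illustration.
    def log2_fold_change(expr_high, expr_low, pseudocount=1.0):
        return math.log2((expr_high + pseudocount) / (expr_low + pseudocount))

    genes = {                                # (high salinity, low salinity)
        "transposon_silencer": (900, 300),   # more active under high salinity
        "transposon_promoter": (120, 480),   # less active under high salinity
        "stress_response":     (640, 80),    # more active under high salinity
    }
    for name, (high, low) in genes.items():
        print(f"{name:>20}: log2FC = {log2_fold_change(high, low):+.2f}")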

In future research, the team plans to study how seasonal changes in temperature and rainfall also affect the activity of the mangrove tree genome.

“This study acts as a foundation, providing new insights into how mangrove trees regulate their genome in response to extreme stresses,” said Prof. Saze. “More research is needed to understand how these changes in gene activity impact molecular processes within the plant cells and tissues and could one day help scientists create new plant strains that can better cope with stress.”

Tiny microscopic hunters could be a crystal ball for climate change


Simple measurements of these obscure organisms can help predict future CO2 emissions for warming ecosystems, study finds

Peer-Reviewed Publication

DUKE UNIVERSITY

Protists such as this Euplotes are common in water, soil, even moss.

IMAGE: Scientists say a few simple measures of a protist’s cell size and shape can be powerful predictors of how they might respond to global warming.

CREDIT: Courtesy of Dan Wieczynski.

DURHAM, N.C. -- It’s hard to know what climate change will mean for Earth’s interconnected and interdependent webs of life. But one team of researchers at Duke University says we might begin to get a glimpse of the future from just a few ounces of microbial soup.

Every drop of pond water and teaspoon of soil is teeming with tens of thousands of tiny unicellular creatures called protists. They’re so abundant that they are estimated to weigh twice as much as all the animals on Earth combined.

Neither animals nor plants nor fungi, the more than 200,000 known species of protists are often overlooked. But as temperatures warm, they could play a big role in buffering the effects of climate change, said Jean Philippe Gibert, an assistant professor of biology at Duke.

That’s because of what protists like to eat. They gobble up bacteria, which release carbon dioxide into the air when they respire, just like we do when we breathe out. But because bacteria account for more of the planet's biomass than any other living thing besides plants, they are among the largest natural emitters of carbon dioxide — the greenhouse gas most responsible for global warming.

In a study published Oct. 19 in Proceedings of the National Academy of Sciences, Gibert, postdoctoral researcher Dan Wieczynski and colleagues tested the effects of warming on bacteria-eating protists by creating mini ecosystems -- glass flasks each containing 10 different species of protists going about the business of eating and competing and reproducing.

The flasks were kept at five temperatures ranging from 60 degrees to 95 degrees Fahrenheit. Two weeks later, the researchers looked to see which species had survived at each temperature and measured how much CO2 they gave off during respiration.

“To me, the question was a simple one in nature,” Gibert said. “Is there something to be measured on living organisms, today, that may allow us to predict their response to increasing temperature, tomorrow?”

The answer was yes. The researchers were surprised to find that each species’ response to temperature could be predicted from just a few simple measurements of their size, shape and cell contents. And together, these factors in turn influenced respiration rates for the community as a whole.

They also found that by taking measurements such as cell size and shape and plugging them into a mathematical model, they could closely predict how things actually played out in their mini ecosystems.
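The article does not give the model's form; as a hedged sketch only, trait-based models of this kind often follow the metabolic theory of ecology, in which respiration scales with cell mass and rises with temperature through a Boltzmann factor. The parameter values below are generic textbook choices, not the study's fitted values.

    import math

    BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV per kelvin

    # Generic trait-based respiration model (NOT the PNAS study's model):
    # rate ~ r0 * mass^(3/4) * exp(-E / (k * T))
    def respiration_rate(mass, temp_c, r0=1.0, activation_energy_ev=0.65):
        temp_k = temp_c + 273.15
        return r0 * mass**0.75 * math.exp(-activation_energy_ev / (BOLTZMANN_EV * temp_k))

    # Warming from ~16 C (about 60 F) to 35 C (95 F) raises the predicted
    # per-cell rate roughly five-fold under these assumptions.
    print(respiration_rate(1.0, 35.0) / respiration_rate(1.0, 16.0))  # ~5.0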

“We can actually use what we know about the relationship between traits and temperature responses at the species level, and scale it all the way up to a whole ecosystem level,” Wieczynski said.

The work is important because it sheds light on “how climate change will alter microbial communities and how this will feed back to influence the pace of climate change,” Wieczynski said.

This research was supported by a grant from the U.S. Department of Energy (DE-SC0020362).

CITATION: “Linking Species Traits and Demography to Explain Complex Temperature Responses Across Levels of Organization,” Daniel J. Wieczynski, Pranav Singla, Adrian Doan, Alexandra Singleton, Zeyi Han, Samantha Votzke, Andrea Yammine, Jean P. Gibert. Proceedings of the National Academy of Sciences, Oct. 19, 2021. DOI: 10.1073/pnas.2104863118

That primate’s got rhythm!


Peer-Reviewed Publication

MAX PLANCK INSTITUTE FOR PSYCHOLINGUISTICS

VIDEO: Researchers from the universities of Turin, Lyon/Saint-Étienne and the Max Planck Institute for Psycholinguistics in Nijmegen studied indris, the ‘singing primates’ from Madagascar.

CREDIT: Andrea Ravignani

Songbirds share the human sense of rhythm, but it is a rare trait in non-human mammals. An international research team led by senior investigators Marco Gamba from the University of Turin and MPI’s Andrea Ravignani set out to look for musical abilities in primates. “There is longstanding interest in understanding how human musicality evolved, but musicality is not restricted to humans”, says Ravignani. “Looking for musical features in other species allows us to build an ‘evolutionary tree’ of musical traits, and understand how rhythm capacities originated and evolved in humans.”

To find out whether non-human mammals have a sense of rhythm, the team decided to study one of the few ‘singing’ primates, the critically endangered lemur Indri indri. The researchers wanted to know whether indri songs have categorical rhythm, a ‘rhythmic universal’ found across human musical cultures. Rhythm is categorical when intervals between sounds have exactly the same duration (1:1 rhythm) or doubled duration (1:2 rhythm). This type of rhythm makes a song easily recognisable, even if it is sung at different speeds. Would indri songs show this “uniquely human” rhythm?
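As a rough illustration (not the study's analysis code), categorical rhythm can be checked from a song's note onset times: each pair of adjacent inter-onset intervals is reduced to the ratio r = t1 / (t1 + t2), so a 1:1 rhythm gives r near 0.5 and a 1:2 rhythm gives r near 0.33 or 0.67. The onset times and tolerance below are invented.

    # Illustrative sketch: classify adjacent inter-onset intervals into
    # rhythmic categories. Onset times (in seconds) are hypothetical.
    def interval_ratios(onsets):
        intervals = [b - a for a, b in zip(onsets, onsets[1:])]
        return [t1 / (t1 + t2) for t1, t2 in zip(intervals, intervals[1:])]

    def categorize(r, tol=0.05):
        for label, target in (("1:1", 1 / 2), ("1:2", 1 / 3), ("2:1", 2 / 3)):
            if abs(r - target) < tol:
                return label
        return "off-integer"

    onsets = [0.0, 0.4, 0.8, 1.6, 2.0]
    print([categorize(r) for r in interval_ratios(onsets)])  # ['1:1', '1:2', '2:1']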

CAPTION

Indri songs recorded in the wild have rhythmic categories similar to those found in human music.

CREDIT

Filippo Carugati

Ritardando in the rainforest

Over a period of twelve years, the researchers from Turin visited the rainforest of Madagascar to collaborate with a local primate study group. The investigators recorded songs from twenty indri groups (39 animals), living in their natural habitat. Members of an indri family group tend to sing together, in harmonised duets and choruses. The team found that indri songs had the classic rhythmic categories (both 1:1 and 1:2), as well as the typical ‘ritardando’ or slowing down found in several musical traditions. Male and female songs had a different tempo but showed the same rhythm.

According to first author Chiara de Gregorio and her colleagues, this is the first evidence of a ‘rhythmic universal’ in a non-human mammal. But why should another primate produce categorical ‘music-like’ rhythms? The ability may have evolved independently among ‘singing’ species, as the last common ancestor between humans and indri lived 77.5 million years ago. Rhythm may make it easier to produce and process songs, or even to learn them.

CAPTION

Finding common musical traits across species may shed light on the biology and evolution of rhythm and music.

CREDIT

Filippo Carugati


Endangered species

“Categorical rhythms are just one of the six universals that have been identified so far”, explains Ravignani. “We would like to look for evidence of others, including an underlying ‘repetitive’ beat and a hierarchical organisation of beats—in indri and other species.” The authors encourage other researchers to gather data on indri and other endangered species, “before it is too late to witness their breath-taking singing displays.”

Aquatic fungus has already wiped amphibians off the map and now threatens survival of terrestrial frogs


A study detected unprecedented mortality in the Atlantic Rainforest among tiny frogs that live on land, with signs of infection by chytrid fungus. The episode coincided with an atypical period of drought.

Peer-Reviewed Publication

FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Mortality in the Atlantic Rainforest among tiny frogs

IMAGE: The episode coincided with an atypical period of drought, which may have forced the animals to seek water in streams where the pathogen is abundant.

CREDIT: Diego Moura-Campos/UNICAMP

A water-borne fungus that has led to the extinction of several species of amphibians that spend all or part of their life cycle in water is also threatening terrestrial amphibians. In Brazil, researchers supported by FAPESP detected unprecedented mortality among a genus of tiny frogs known as pumpkin toadlets that live in the Atlantic Rainforest far from any aquatic environments. The animals were severely infected by chytrid fungus (Batrachochytrium dendrobatidis), which causes chytridiomycosis.

The study, published in the journal Biological Conservation, shows that the fungus is also a threat to terrestrial-breeding amphibians with important ecological functions, which include controlling insects that transmit diseases such as dengue, yellow fever, and Zika.

“The fungus attacks the amphibian’s skin, which is where it exchanges gas with the external environment. Infection causes a physiological imbalance, and the animal eventually dies from a heart attack,” said Diego Moura-Campos, first author of the article. The study was conducted during his master’s research at the University of Campinas’s Institute of Biology (IB-UNICAMP) in the state of São Paulo, with a scholarship from the Brazilian Ministry of Education’s Coordination for the Improvement of Higher Education Personnel (CAPES).

The investigation was conducted under the aegis of the project Chytrid fungus in Brazil: origin and consequences, linked to the FAPESP Research Program on Biodiversity Characterization, Conservation, Restoration and Sustainable Use (BIOTA-FAPESP) and coordinated by Luís Felipe Toledo, a professor at IB-UNICAMP and a co-author of the article.

“We’ve studied the fungus from several angles, but have rarely had the unhappy opportunity to see animals dying from fungal infection in the wild. This is the first study to show the phenomenon in Brazil. If an amphibian dies and is infected, that doesn’t mean the fungus caused its death. It might be coexisting with the pathogen without developing the disease. In this case, we were sure it was the cause of death because the animals had the right symptoms, such as weight loss, heavily sloughing skin, and very high infection loads,” said Toledo, who is also principal investigator for another project that focuses on understanding how the fungus spreads in nature.

The researchers believe direct-developing species (which reproduce on land and lack a tadpole, with terrestrial eggs hatching as fully formed miniature adults) are even less adapted to the fungus. Aquatic species have been in contact with the pathogen for longer and may have developed a degree of resistance to infection.

Moura-Campos observed morbidity and mortality in infected frogs during a field survey conducted on the Serra do Japi Biological Reserve in Jundiaí, São Paulo, between May 2018 and May 2019. Curiously, dead and dying individuals of the species Brachycephalus rotenbergae were found after an atypical period of drought.

“These animals are very small and hard to find. After dying, they decompose quickly. Finding nine of them dead or heavily diseased in a short period, as we did, suggests others probably died as well,” said Guilherme Becker, a professor at the University of Alabama in the United States and last author of the article.

According to Becker, who is also a visiting professor at UNICAMP under its Graduate Program in Ecology, the study shows that accelerating global climate change in the coming decades will increase the frequency of this type of disease, with causative agents that may become more virulent as hybrids emerge, as already shown in an earlier study by the group.

“Lack of soil moisture in the forest where they live may have led these animals to seek hydration in streams and become more contaminated than normal by the fungus,” he said.

Another hypothesis raised by the researchers is that periods of drought may compromise the frogs’ immune system so that they become more vulnerable to the fungus.

Cosmopolitan pathogen

The fungus originated in Asia and has probably spread around the world as a result of the trade in frog meat. Species consumed by humans for this purpose, such as the American bullfrog (Rana catesbeiana), are resistant to the fungus and can carry it without developing the disease.

According to a paper published in 2018 in the journal Science with Toledo as a co-author, the fungus originated on the Korean peninsula and spread to other parts of the world in the early twentieth century.

Another study to which Toledo contributed also found that the fungus has caused a decline in the populations of at least 501 species of amphibians worldwide. In Brazil alone, at least 50 species or populations have been affected, 12 have become extinct, and 38 have undergone decline (more at: agencia.fapesp.br/30127/). 

“Amphibians are very important to the functioning of many ecosystems. Their biomass in forests is enormous. They serve as food for a wide array of other animals, eat arthropods in the wild, and control communities of invertebrates,” Becker said. “In the case of aquatic species, most are herbivorous in the tadpole stage and consume phytoplankton, which could overwhelm aquatic environments if it were not for tadpoles. These animals cross aquatic and terrestrial ecosystems, so when outbreaks of chytridiomycosis occur, the impact is significant.”

As an example, Becker recalled a recent study in which scientists affiliated with institutions in the US and Panama showed that amphibian population collapses due to infection by B. dendrobatidis were linked to an increase in outbreaks of malaria in the 1990s and 2000s in Panama and Costa Rica.

According to Becker, Toledo and collaborators, more observation is required over a period of years to reach a more precise estimate of the global impact of chytridiomycosis on amphibian populations.

###


Research inspects planetary nebula NGC 6905 and its central star

NOT ALFOSC color-composite image of NGC 6905. Credit: Gómez-González et al., 2021.

Using the Nordic Optical Telescope (NOT), astronomers have investigated a planetary nebula known as NGC 6905 and its central star. Results of the study, presented in a paper published October 18 on the arXiv pre-print server, provide more insights into the nature of this object.

Planetary nebulae (PNe) are expanding shells of gas and dust that have been ejected from a star during the process of its evolution from a main sequence star into a red giant or white dwarf. They are relatively rare, but are important for astronomers studying the chemical evolution of stars and galaxies.

At a distance of about 8,800 light years from Earth, NGC 6905, also known as the "Blue Flash Nebula" for its characteristic colors, is a high-excitation PN with a clearly clumpy morphology. It is composed of a central roundish cavity with an angular radius of some 0.81 and a pair of extended V-shaped structures extending in two opposite directions. The central star of this PN, designated HD 193949, is a Wolf-Rayet-type star with a radius of about 0.15 solar radii, a mass of approximately 0.6 solar masses, and an effective temperature in the range of 150,000–165,000 K.

A team of astronomers led by Víctor Mauricio Alfonso Gómez-González of the National Autonomous University of Mexico has recently conducted a multi-wavelength study of NGC 6905 and HD 193949, aiming to shed more light on the properties and structure of this object. The research is based mainly on the data from NOT's Alhambra Faint Object Spectrograph and Camera (ALFOSC), but also on archival infrared images obtained from telescopes such as NASA's Spitzer and WISE.

"We present a multi-wavelength characterisation of the  (PN) NGC 6905 and its [Wolf-Rayet]-type ([WR]) central star (CSPN) HD 193949. Our Nordic Optical Telescope (NOT) Alhambra Faint Object Spectrograph and Camera (ALFOSC) spectra and images unveil in unprecedented detail the high-ionization structure of NGC 6905," the researchers wrote in the paper.

The observations allowed the team to detect three broad WR bumps, the so-called O-bump, blue bump and red bump, confirming that HD 193949 belongs to the [WO]-class of Wolf-Rayet stars. They also detected 21 WR features, which suggest that the spectral type of this CSPN cannot be later than a [WO2]-subtype star. The effective temperature of HD 193949 was measured to be around 140,000 K, lower than previously thought.

Based on the data, the astronomers investigated the physical properties and chemical abundances of different regions of NGC 6905. They found that the low-ionization knots located at the northwest and southeast regions of this PN exhibit neither a different electron density nor a different electron temperature compared with its other regions. The average electron density was calculated to be about 500 per cubic centimeter, while the electron temperature was estimated at a level of 13,000 K.

The researchers noted that NGC 6905 has abundances similar to those of other WRPNe, but a slightly smaller nitrogen-to-oxygen ratio.

"In particular, comparing the N/O ratio versus the N abundance following previous studies suggests that the CSPN of NGC 6905 had a relatively low initial mass of about 1 solar mass. This makes NGC 6905 one of the WRPN with the less massive central star," the scientists explained.

The study also allowed the astronomers to conclude that there is no anomalous carbon enrichment within NGC 6905, which suggests that no very late thermal pulse (VLTP) has been involved in its formation or the production of its central star. Additionally, the team reproduced the nebular and dust properties of NGC 6905 and found that the total mass of gas in this PN is in the range of 0.31 to 0.47 solar masses, while the mass of dust was estimated to be between 0.00169 and 0.00224 solar masses.


More information: V. M. A. Gómez-González et al, Planetary nebulae with Wolf-Rayet-type central stars—III. A detailed view of NGC 6905 and its central star. arXiv:2110.09551v1 [astro-ph.SR], arxiv.org/abs/2110.09551

© 2021 Science X Network

 

Fossil dental exams reveal how tusks first evolved

Life reconstruction of the dicynodont Dicynodon. Aside from the tusks in the upper jaw, most dicynodonts possessed a turtle-like beak that they used to chew their food. Credit: Marlene Hill Donnelly

A wide variety of animals have tusks, from elephants and walruses to five-pound, guinea pig-looking critters called hyraxes. But one thing tusked animals have in common is that they're all mammals—there are no known fish, reptiles, or birds with tusks. In a new study in Proceedings of the Royal Society B, paleontologists traced the first tusks back to ancient mammal relatives that lived before the dinosaurs, and to do so, they had to define what makes a tusk a tusk in the first place.

"Tusks are this very famous anatomy, but until I started working on this study, I never really thought about how tusks are restricted to mammals," says Megan Whitney, a researcher at Harvard University and the lead author of the study.

"We were able to show that the first tusks belonged to animals that came before modern mammals, called dicynodonts," says Ken Angielczyk, a curator at Chicago's Field Museum and an author of the paper. "They're very weird animals."

The dicynodonts mostly lived before the time of the dinosaurs, from about 270 to 201 million years ago, and they ranged from rat-sized to elephant-sized. Modern mammals are their closest living relatives, but they looked more reptilian, with turtle-like beaks. And since their discovery 176 years ago, one of their defining features has been the pair of protruding tusks in their upper jaws. The name dicynodont even means "two dog teeth."

The researchers got the idea to study the origin of tusks while taking a lunch break on a paleontological dig. "We were sitting in the field in Zambia, and there were dicynodont fossils everywhere," recalls Whitney. "I remember Ken picking them up and asking how come they were called tusks, because they had features that tusks don't have."

Angielczyk had hit upon a crucial distinction: not all protruding teeth are technically tusks, and the teeth's makeup and growth patterns tell us whether they count. "For this paper, we had to define a tusk, because it's a surprisingly ambiguous term," says Whitney. The researchers decided that for a tooth to be a tusk, it has to extend out past the mouth, it has to keep growing throughout the animal's life, and unlike most mammals' teeth (including ours), tusks' surfaces are made of dentine rather than hard enamel.
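Purely to make that working definition concrete (this encodes the criteria summarized above, not anything from the paper's methods, and the field names are invented), the three tests can be written as a simple predicate:

    from dataclasses import dataclass

    # Toy encoding of the tusk definition: protrudes past the mouth, grows
    # throughout life, and has a dentine rather than enamel surface.
    @dataclass
    class Tooth:
        protrudes_past_mouth: bool
        ever_growing: bool
        surface: str  # "dentine" or "enamel"

    def is_tusk(tooth: Tooth) -> bool:
        return (tooth.protrudes_past_mouth
                and tooth.ever_growing
                and tooth.surface == "dentine")

    # An ever-growing, protruding tooth with an enamel surface still fails
    # the test -- the rodent-incisor case discussed below.
    print(is_tusk(Tooth(True, True, "enamel")))   # False
    print(is_tusk(Tooth(True, True, "dentine")))  # True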

Left side of the skull of the dicynodont Dolichuranus (NMT RB554) from Tanzania. The large tusk is visible at the lower left of the specimen. Credit: Ken Angielczyk

Under these parameters, elephants, walruses, warthogs, and hyraxes all have tusks. Other big teeth in the animal kingdom don't make the cut, though. For instance, rodent teeth, even though they sometimes stick out and are ever-growing, have an enamel band on the front of the tooth, so they don't count.

Some of the dicynodont tusks that the team observed in Zambia didn't seem to fit the definition of a tusk either—they were coated in enamel instead of dentine.

The different makeup of teeth versus tusks also gives scientists insights into an animal's life. "Enamel-coated teeth are a different evolutionary strategy than dentine-coated tusks, it's a trade-off," says Whitney. Enamel teeth are tougher than dentine, but because of the geometry of how teeth grow in the jaw, if you want teeth that keep growing throughout your life, you can't have a complete enamel covering.

Animals like humans made an evolutionary investment in durable but hard-to-fix teeth—once our adult teeth grow in, we're out of luck if they get broken. Tusks are less durable than our enamel-coated teeth, but they grow continuously, even if they get damaged. It's like the compromise of getting a car that's very reliable but very difficult to get repaired when it does have trouble, versus driving a beater that needs frequent repairs but is a model that's cheap and easy for any mechanic to fix.

The different kinds of teeth animals have evolved can tell scientists about the pressures those animals faced that could have produced those teeth. Animals with tusks might use them for fighting or for rooting in the ground, exposing them to little injuries that would be risky for enamel teeth that don't grow continuously.

To study whether dicynodont tusks really were tusks, the researchers cut paper-thin slices out of the fossilized teeth of 19 dicynodont specimens, representing ten different species, and examined their structure with a microscope. They also used micro-CT scans to examine how the teeth were attached to the skull, and whether their roots showed evidence of continuous growth. The scientists found that some dicynodont teeth are indeed tusks, while others, particularly those of some of the earlier species, were just large teeth. It wasn't a strict progression from non-tusks to tusks, though—different members of the dicynodont family evolved tusks independently.

Isolated tusk fragments found in Zambia by field teams in 2018. Credit: Ken Angielczyk

Whitney says she was surprised by the finding. "I kind of expected there to be one point in the family tree where all the dicynodonts started having tusks, so I thought it was pretty shocking that we actually see tusks evolve convergently," she says.

"Dicynodont tusks can tell us a lot about mammalian tusk evolution in general," says Angielczyk. "For instance, this study shows that reduced rates of tooth replacement and a flexible ligament attaching the tooth to the jaw are needed for true tusks to evolve. It all ladders up to giving us a better understanding of the tusks we see in mammals today."

"Dicynodonts were the most abundant and diverse vertebrates on land just before dinosaur times, and they're famous for their 'tusks.' The fact that in reality only a few have true tusks, and the rest have big teeth, is a beautiful example of evolution we can document. We can see how to build a tusk!" says Brandon Peecook, a curator at the Idaho Museum of Natural History and one of the paper's authors.

The researchers say that the study, which shows the earliest known instance of true tusks, could help scientists better understand how evolution works.

"Tusks have evolved a number of times, which makes you wonder how—and why? We now have good data on the anatomical changes that needed to happen for dicynodonts to evolve tusks. For other groups, like warthogs or walruses, the jury is still out," says Christian Sidor, a curator at the University of Washington Burke Museum and one of the paper's authors.

"Despite being extremely weird animals, there are some things about dicynodonts, like the evolution of , that inform us about the mammals around us today," says Angielczyk. "Plus, anytime you can say mammals aren't that special, dicynodonts did it first, that's a good day."Thailand seizes large elephant tusks worth over $450,000

More information: The evolution of the synapsid tusk: insights from dicynodont therapsid tusk histology, Proceedings of the Royal Society B: Biological Sciences (2021). DOI: 10.1098/rspb.2021.1670

Journal information: Proceedings of the Royal Society B 

Provided by Field Museum 

 

How does 'normal' Internet browsing look today? Now we know


It's 7:15 am on a Friday morning, and Jordan wants to download an application to their laptop. They know the app by name, or so they think; they open a new tab in their Internet browser and mistype the app's name. The error brings them to a malicious website that looks like a legitimate site but isn't, and Jordan downloads an app containing malware. Jordan's computer is now infected.

Jordan is a real person, although their name isn't really Jordan. They were a participant in a new study by CyLab researchers that aimed to learn what "normal" Internet browsing looks like. Such datasets didn't previously exist, but now that one does, researchers can better understand how people like "Jordan" are led to download malicious content and come up with ways to prevent that from happening again.

Their study, titled "How Do Home Computer Users Browse the Web?" was published in the latest issue of ACM Transactions on the Web.

"The goal for this paper was to be a foundation that other researchers could use," says CyLab's Kyle Crichton, a Ph.D. student in Engineering and Public Policy and the study's lead author. "Now that we know what normal  looks like, we can start to identify anomalous behavior and begin to address any number of security challenges."

To create their dataset, the authors of the study observed the browsing behavior of 257 willing participants through the Security Behavior Observatory (SBO), a group of participants consenting to have their daily computing behaviors observed. One might think that consenting to being monitored would lead people to act a bit differently than they normally would, but Crichton says he doesn't believe that happened here.

"In general, there was a substantial number of visits to potentially pirated streaming websites, pornographic websites, and gambling websites," Crichton says. "Therefore, we assume that they were generally behaving as they normally do."

So what does "normal" browsing look like? Lots of browser tab usage—some use just a few and some use a ton—and most time is spent on the top 1% of websites.

"People spend most of their time on a small number of websites," says Crichton. "Fifty percent of people's browsing time is spent on roughly 30 websites, among millions of websites."

Occasionally, Crichton says, people end up at what he refers to as "the periphery" of the Internet—relatively low traffic websites that are commonly associated with riskier content. These sites are often adware, gambling, pornography, and potentially illegal streaming websites.

"We observed a lot of people who started out at a popular streaming service like Netflix or Hulu, and they must not have found what they wanted, then they'd jump out to the periphery," Crichton says.

While the study may serve as a foundation for other researchers to use, it'll do so only until people's browsing behavior evolves enough to necessitate recording a new baseline, which Crichton says is inevitable.

"When Google came out in the late 90s, people's way of finding content quickly changed," he says. "People's browsing behavior shifted again when tabbed browsing was introduced in the mid-2000s. It's these game changers that are introduced, and things rapidly evolve.Misconceptions plague security and privacy tools

More information: Kyle Crichton et al, How Do Home Computer Users Browse the Web?, ACM Transactions on the Web (2021). DOI: 10.1145/3473343

Provided by Carnegie Mellon University 

 

Action video games make players better learners of visual and memory tasks


Playing video games that are heavy on action can make you better at some new tasks. New research reveals that these games are helping by teaching players to be quicker learners.

"Imagine taking an American and putting them through physical training to boost their athleticism," says C. Shawn Green, a psychology professor at the University of Wisconsin-Madison who studies how people learn. "If you then have them try to play rugby for the first time ever, they might not look that good. After all, even a good athlete who is stepping into a rugby match for the first time will have to learn a lot of new rules. However, their increased athleticism will mean that they'll tend to be in slightly better positions initially than people with lesser athleticism, and thus will learn to play rugby more quickly."

That idea is similar for action video games—typically those in a genre called first-person shooters—in which players are rewarded for swiftly and accurately tracking and reacting to features of the game that appear and move quickly.

"If you're increasing the equivalent of athleticism for perceptual —like visual attention or speed of processing—that should allow you to learn faster when you've got a new  that calls on those abilities," Green says.

The results will help researchers understand how gaming—which is used to train laparoscopic surgeons and drone pilots, and to help people with amblyopia (sometimes called "lazy eye") and attention deficit disorders—creates some of its well-documented positive effects.

"Games are really powerful, complex experiences," says Green, whose work is supported by the Office of Naval Research. "We know they produce interesting changes in behavior, but their level of complexity makes them hard to study."

The study also shows that not all training activities are equal. Some types of practice can make you perfect, but only at one thing. Types of training that make trainees better at quickly learning to perform a broad range of tasks—or at least more than one—have obvious advantages.

"If you have people who are new to a basketball court shoot nothing but free throws over and over and over, they'll probably get much better at shooting straight on from 15 feet," Green says. "But then if you have them shoot from somewhere else on the court, they're probably going to go right back to where they started. We call that a failure of transfer or a failure of generalization."

In a pair of experiments described recently in the journal Communications Biology, 25 participants at the University of Rochester in New York and then 52 participants at the University of Geneva in Switzerland were separated into roughly equal groups assigned to play 45 hours of either action video games (such as those from the Call of Duty series) or other popular video games that unfold at a different pace without relying so much on visual attention and reaction speed (games such as Sims and Zoo Tycoon).

Before the players began their gaming assignments, they were tested with tasks that measured their visual perception and working memory. The perception tasks required participants to use a brief glance to pick out the direction of movement of an object or orientation of stripes running across a shape. The working memory tests were more challenging, asking players to listen and watch for pairs of letters read aloud and shapes appearing in different locations on a screen, and report when one matched a sound or placement from a certain number of turns back.
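That final test is a dual n-back design. As a hedged sketch of the core matching logic only (the stimulus streams are invented, and this is not the study's software), the rule "report when the current item matches the one from n turns back" looks like this:

    # Return the indices at which stream[i] equals stream[i - n].
    def nback_matches(stream, n):
        return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

    letters = ["C", "A", "C", "B", "C", "D"]  # hypothetical spoken letters
    positions = [(0, 1), (2, 2), (1, 0), (1, 0), (1, 0), (0, 2)]  # shape locations

    print(nback_matches(letters, 2))    # [2, 4] -> 'C' repeats two turns apart
    print(nback_matches(positions, 2))  # [4] -> location (1, 0) repeats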

Both groups came out relatively even in the initial tests. But after their contrasting gaming experiences, the action game players were different.

"They had a slight advantage right away, after playing the action games. But the bigger effect was that they improved faster at these orientation and memory tasks than the people who played other games," says Green, who collaborated on the study with researchers from Rochester, Geneva, New York University and the University of California, Irvine.

The tests were chosen because the simple movements and orientation of basic shapes engage parts of the brain involved in very rudimentary visual processing and working memory. "Constantly having to manage new stuff versus old stuff as information comes in" is a common factor in tackling new tasks, Green says.

Teasing the significant aspects out of the complexity will help future game designers who are focused on training as much as—or more than—entertainment.

"There are issues with action games—they tend to be violent, for example, and it's unlikely that is necessary to cause the effects we want to see in players," Green says. "Before you can start designing games with the goal of maximizing the benefits, you need to know what's helping and what's not.



More information: Ru-Yuan Zhang et al, Action video game play facilitates "learning to learn", Communications Biology (2021). DOI: 10.1038/s42003-021-02652-7
Journal information: Communications Biology