It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Thursday, April 28, 2022
Wildfires in US, Canadian boreal forests could release sizable amount of remaining global carbon budget
New Science Advances study finds reducing these emissions should be factored into fire management decisions, budgets
WASHINGTON (April 27, 2022)—A paper by U.S. scientists published in the peer-reviewed journal Science Advances today finds that fires occurring in U.S. and Canadian boreal forests between now and 2050 could release about 3% of the remaining global carbon budget unless greater investments are made to limit fire size in these carbon-rich forests. The first-of-its-kind study was led by Dr. Carly Phillips, a fellow with the Western States Climate Team at the Union of Concerned Scientists (UCS), and co-authored with a team of researchers from the Woodwell Climate Research Center, Tufts University, Harvard University, the University of California, and Hamilton College.
The latest scientific report by the Intergovernmental Panel on Climate Change (IPCC) makes clear that countries have a quickly narrowing window to rein in heat-trapping emissions. To meet the Paris Agreement’s principal goal of limiting warming to 1.5 degrees Celsius to avoid some of the worst climate change impacts, nations need to drastically reduce heat-trapping emissions during this consequential decade and reach net-zero carbon emissions by 2050.
“Wildfires in boreal forests can be especially harmful in terms of the amount of emissions they release into the atmosphere, since these forests store about two-thirds of the world’s forest carbon, most of which is contained in the soil and has accumulated over hundreds or even thousands of years,” said Dr. Phillips. “If not properly contained, heat-trapping emissions from wildfires in boreal forests could dramatically increase, jeopardizing nations’ ability to limit warming in line with the Paris Agreement.”
The study found that by midcentury, burned area in Alaskan and Canadian boreal forests is projected to increase as much as 169% and 150%, respectively, releasing nearly 12 gigatons of net carbon emissions—equivalent to the annual emissions of 2.6 billion cars—which represents about 3% of the remaining global carbon budget. These estimates are conservative, as the study did not assess the potential for boreal forest wildfires to accelerate permafrost thaw and other ecosystem processes that could further increase net carbon emissions.
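As a rough plausibility check on the car-equivalence figure above, the projected ~12 gigatons of emissions can be combined with a commonly cited estimate of about 4.6 metric tons of CO2 per passenger car per year. Both the CO2 accounting and the per-car figure are assumptions for this sketch, not numbers taken from the study itself:

```python
# Back-of-envelope check of the "2.6 billion cars" comparison.
# Assumptions (not from the study): the ~12 gigatons are counted as CO2,
# and a typical passenger car emits about 4.6 t of CO2 per year.
projected_emissions_t = 12e9      # ~12 Gt of CO2, cumulative through 2050
car_emissions_t_per_year = 4.6    # assumed t CO2 per car per year

car_equivalents_billion = projected_emissions_t / car_emissions_t_per_year / 1e9
print(f"~{car_equivalents_billion:.1f} billion cars")  # ≈ 2.6 billion
```

The result lands at roughly 2.6 billion, consistent with the comparison quoted in the release.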
“Governments rightly prioritize rapid suppression of wildfires that occur near heavily populated areas and crucial infrastructure, but allow other areas that hold large amounts of carbon to burn, a practice hazardous to the health and safety of communities in Alaska, Canada and beyond,” said Dr. Peter Frumhoff, a research scientist at Harvard University’s Center for the Environment and a co-author of the study. “Expanding fire management to keep wildfires near historical levels across boreal North America would provide multiple benefits and leave us far better positioned to meet the goals of the Paris Agreement.”
Despite contributing an outsized share of carbon emissions, U.S. and Canadian boreal forests receive disproportionately little funding for fire suppression efforts. Alaska, for example, accounts for roughly 20% of burned land area and half of U.S. fire emissions annually, yet receives, on average, only about 4% of federal fire management funding. The study found the average cost of avoiding the emission of 1 ton of carbon dioxide was about $12, comparable to or below the cost of other measures to mitigate climate change. In Alaska, that would mean investing an average of $696 million per year over the next decade to keep the state’s wildfire emissions at historical levels.
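The two dollar figures quoted above can be checked against each other. Assuming the averages combine directly (the study itself may use more detailed accounting), $696 million per year at roughly $12 per avoided ton implies the scale of avoided emissions in Alaska:

```python
# Implied tons of CO2 avoided per year in Alaska, from the quoted averages.
# Assumption: the two quoted averages can simply be divided.
annual_investment_usd = 696e6   # average annual investment quoted for Alaska
cost_per_ton_usd = 12           # average cost of avoiding 1 t of CO2

avoided_tons_per_year = annual_investment_usd / cost_per_ton_usd
print(f"~{avoided_tons_per_year / 1e6:.0f} million t CO2 per year")  # ≈ 58 million
```

That is, the quoted investment corresponds to keeping on the order of 58 million tons of CO2 per year out of the atmosphere.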
“Reducing boreal forest fires to near-historical levels and keeping carbon in the ground will require additional investments,” said Dr. Brendan Rogers, an associate scientist at Woodwell Climate Research Center and co-author of the study. “These funds are comparatively low and pale in comparison to the costs countries will face to cope with the growing health consequences exacerbated by worsening air quality and the more frequent and intense climate impacts that are expected if emissions continue to rise unabated. They can also ensure wildlife, tourism, jobs, and many other facets of our society can persevere in a warming world.”
More information can be found online in the Science Advances press package at eurekalert.org. Reporters will need their user ID and password to access the information.
###
The Union of Concerned Scientists puts rigorous, independent science to work to solve our planet's most pressing problems. Joining with people across the country, we combine technical analysis and effective advocacy to create innovative, practical solutions for a healthy, safe, and sustainable future. For more information, go to www.ucsusa.org.
Human-caused climate change will make strong tropical cyclones twice as frequent by the middle of the century, putting large parts of the world at risk, according to a new study published in Science Advances. The analysis also projects that maximum wind speeds associated with these cyclones could increase by around 20%.
Despite being amongst the world’s most destructive extreme weather events, tropical cyclones are relatively rare. In a given year, only around 80 to 100 tropical cyclones form globally, most of which never make landfall. In addition, accurate global historical records are scarce, making it hard to predict where they will occur and what actions governments should take to prepare.
To overcome this limitation, an international group of scientists involving Ivan Haigh from the University of Southampton developed a new approach that combined historical data with global climate models to generate hundreds of thousands of “synthetic tropical cyclones”.
Dr. Nadia Bloemendaal from the Institute for Environmental Studies, Vrije Universiteit Amsterdam, who led the study, said:
“Our results can help identify the locations prone to the largest increase in tropical cyclone risk. Local governments can then take measures to reduce risk in their region, so that damage and fatalities can be reduced.”
“With our publicly available data, we can now analyse tropical cyclone risk more accurately for every individual coastal city or region.”
By creating a very large dataset with these computer-generated cyclones, which have similar features to natural cyclones, the researchers were able to much more accurately project the occurrence and behaviour of tropical cyclones around the world over the next decades in the face of climate change, even in regions where tropical cyclones hardly ever occur today.
The team’s analysis found that the frequency of the most intense cyclones, those of Category 3 or higher, will double globally due to climate change, while weaker tropical cyclones and tropical storms will become less common in most of the world’s regions. The exception will be the Bay of Bengal, where the researchers found a decrease in the frequency of intense cyclones.
Many of the most at-risk locations will be in low-income countries. Countries where tropical cyclones are relatively rare today will see an increased risk in the coming years, including Cambodia, Laos, Mozambique and many Pacific island nations, such as the Solomon Islands and Tonga. Globally, Asia will see the largest increase in the number of people exposed to tropical cyclones, with additional millions exposed in China, Japan, South Korea and Vietnam.
Dr. Ivan Haigh, Associate Professor at the University of Southampton, said:
“Of particular concern is that the results of our study highlight that some regions that don’t currently experience tropical cyclones are likely to in the near future with climate change.”
“The new tropical cyclone dataset we have produced will greatly aid the mapping of changing flood risk in tropical cyclone regions.”
The study could help governments and organisations better assess the risk from tropical cyclones, thereby supporting the development of risk mitigation strategies to minimise impacts and loss of life.
IMAGE: ARTIST IMPRESSION OF A SUPERCONDUCTING CHIP
CREDIT: TU DELFT
Associate professor Mazhar Ali and his research group at TU Delft have discovered one-way superconductivity without magnetic fields, something that had been thought impossible ever since superconductivity’s discovery in 1911, until now. The discovery, published in Nature, makes use of 2D quantum materials and paves the way towards superconducting computing. Superconductors can make electronics hundreds of times faster, all with zero energy loss. Ali: “If the 20th century was the century of semiconductors, the 21st can become the century of the superconductor.”
During the 20th century many scientists, including Nobel Prize winners, puzzled over the nature of superconductivity, which was discovered by Dutch physicist Kamerlingh Onnes in 1911 (read more about this in the frame below). In superconductors, a current goes through a wire without any resistance, which means inhibiting this current or even blocking it is hardly possible, let alone getting the current to flow only one way and not the other. That Ali’s group managed to make superconductivity one-directional, necessary for computing, is remarkable: one can compare it to inventing a special type of ice that gives you zero friction when skating one way, but insurmountable friction the other way.
Superconductor: super-fast, super-green
The advantages of applying superconductors to electronics are twofold. Superconductors can make electronics hundreds of times faster, and implementing superconductors into our daily lives would make IT much greener: if you were to string a superconducting wire from here to the moon, it would transport the energy without any loss. For instance, according to NWO, the use of superconductors instead of regular semiconductors might save up to 10% of all Western energy reserves.
The (im)possibility of applying superconductivity
In the 20th century and beyond, no one could tackle the barrier of making superconducting electrons go in just one direction, a fundamental property needed for computing and other modern electronics (consider, for example, diodes, which likewise conduct in only one direction). In normal conduction the electrons fly around as separate particles; in superconductors they move in pairs, without any loss of electrical energy. In the 70s, scientists at IBM tried out the idea of superconducting computing but had to stop their efforts: in their papers on the subject, IBM mentions that without non-reciprocal superconductivity, a computer running on superconductors is impossible.
Interview with corresponding author Mazhar Ali
Q: Why, when one-way direction works with normal semi-conduction, has one-way superconductivity never worked before?
Mazhar Ali: “Electrical conduction in semiconductors, like Si, can be one-way because of a fixed internal electric dipole, that is, a net built-in potential they can have. The textbook example is the famous "pn junction", where we slap together two semiconductors: one has extra electrons (-) and the other has extra holes (+). The separation of charge makes a net built-in potential that an electron flying through the system will feel. This breaks symmetry and can result in "one-way" properties because forward vs backwards, for example, are no longer the same. There is a difference in going in the same direction as the dipole vs going against it, similar to if you were swimming with the river or swimming up the river.”
“Superconductors never had an analog of this one-directional idea without magnetic field; since they are more related to metals (i.e. conductors, as the name says) than semiconductors, they always conduct in both directions and don't have any built-in potential. Similarly, Josephson Junctions (JJs), which are sandwiches of two superconductors with non-superconducting, classical barrier materials in-between, also haven't had any particular symmetry-breaking mechanism that resulted in a difference between "forward" and "backwards".”
Q: How did you manage to do what first seemed impossible?
Ali: “It was really the result of one of my group's fundamental research directions. In what we call "Quantum Material Josephson Junctions" (QMJJs), we replace the classical barrier material in JJs with a quantum material barrier, where the quantum material's intrinsic properties can modulate the coupling between the two superconductors in novel ways. The Josephson Diode was an example of this: we used the quantum material Nb3Br8, which is a 2D material like graphene that has been theorized to host a net electric dipole, as our quantum material barrier of choice and placed it between two superconductors.”
“We were able to peel off just a couple atomic layers of this Nb3Br8 and make a very, very thin sandwich - just a few atomic layers thick - which was needed for making the Josephson diode, and was not possible with normal 3D materials. Nb3Br8 is part of a group of new quantum materials being developed by our collaborators, Professor Tyrel McQueen and his group at Johns Hopkins University in the USA, and was a key piece in us realizing the Josephson diode for the first time.”
Q: What does this discovery mean in terms of impact and applications?
Ali: “Many technologies are based on old versions of JJ superconductors, for example MRI technology. Also, quantum computing today is based on Josephson Junctions. Technology that was previously only possible using semiconductors can now potentially be made with superconductors using this building block. This includes faster computers, as in computers with up to terahertz speed, which is 300 to 400 times faster than the computers we are now using. This will influence all sorts of societal and technological applications. If the 20th century was the century of semiconductors, the 21st can become the century of the superconductor.”
“The first research direction we have to tackle for commercial application is raising the operating temperature. Here we used a very simple superconductor that limited the operating temperature. Now we want to work with the known so-called "High Tc Superconductors", and see whether we can operate Josephson diodes at temperatures above 77 K, since this will allow for liquid nitrogen cooling. The second thing to tackle is scaling of production. While it’s great that we proved this works in nanodevices, we only made a handful. The next step will be to investigate how to scale production to millions of Josephson diodes on a chip.”
Q: How sure are you of your case?
Ali: “There are several steps which all scientists need to take to maintain scientific rigor. The first is to make sure their results are repeatable. In this case we made many devices, from scratch, with different batches of materials, and found the same properties every time, even when measured on different machines in different countries by different people. This told us that the Josephson diode result was coming from our combination of materials and not some spurious result of dirt, geometry, machine or user error or interpretation.”
“We also carried out "smoking gun" experiments that dramatically narrow the possibilities for interpretation. In this case, to be sure that we had a superconducting diode effect we actually tried "switching" the diode; as in we applied the same magnitude of current in both forward and reverse directions and showed that we actually measured no resistance (superconductivity) in one direction and real resistance (normal conductivity) in the other direction.”
“We also measured this effect while applying magnetic fields of different magnitudes and showed that the effect was clearly present at 0 applied field and gets killed by an applied field. This is also a smoking gun for our claim of having a superconducting diode effect at zero-applied field, a very important point for technological applications. This is because magnetic fields at the nanometer scale are very difficult to control and limit, so for practical applications, it is generally desired to operate without requiring local magnetic fields.”
Q: Is it realistic for ordinary computers (or even the supercomputers of KNMI and IBM) to make use of superconducting?
Ali: “Yes it is! Not for people at home, but for server farms or for supercomputers, it would be smart to implement this. Centralized computation is really how the world works nowadays. Any and all intensive computation is done at centralized facilities where localization adds huge benefits in terms of power management, heat management, etc. The existing infrastructure could be adapted without too much cost to work with Josephson diode based electronics. There is a very real chance, if the challenges discussed in the other question are overcome, that this will revolutionize centralized computing and supercomputing!”
More information On May 18th – 19th, Professor Mazhar Ali and his collaborators Prof. Valla Fatemi (Cornell University) and Dr. Heng Wu (TU Delft) are hosting a “Superconducting Diode Effects Workshop” on the Virtual Science Forum, in which 12 international experts in the field will be giving recorded talks online (to be published on YouTube) about the current state of the field as well as future research and development directions.
Associate professor Mazhar Ali studied at UC Berkeley and Princeton, did his postdoc at IBM, and won the Sofia Kovalevskaja Award from the Alexander von Humboldt Foundation in Germany before joining the faculty of Applied Sciences in Delft.
IMAGE: ARTIST’S CONCEPTUAL IMAGE OF THE 25 EXOPLANETS EXAMINED IN THIS STUDY.
CREDIT: ESA/HUBBLE, N. BARTMANN
An international team of researchers examined data for 25 exoplanets and found some links among the properties of the atmospheres, including the thermal profiles and chemical abundances in them. This marks the first time exoplanet atmospheres have been studied as populations, rather than individually. These findings will help establish a generalized theory of planet formation which will improve our understanding of all planets, including the Earth.
Today there are more than 3000 confirmed exoplanets, planets orbiting stars other than the Sun. Because they are far away from Earth, it is difficult to study them in detail. Determining the characteristics of even one exoplanet has been a noteworthy accomplishment.
In this research, astronomers used archival data for 25 hot Jupiters, gas giant planets that orbit close to their host stars. The data included 600 hours of observations from the Hubble Space Telescope and more than 400 hours of observations from the Spitzer Space Telescope.
One of the characteristics investigated by the team was the presence or absence of a “thermal inversion.” Planetary atmospheres trap heat, so in general the temperature increases as you probe deeper into the atmosphere. But some planets show a thermal inversion where an upper layer of the atmosphere is warmer than the layer beneath it. On Earth, the presence of ozone causes a thermal inversion. The team found that almost all of the hot Jupiters with a thermal inversion also showed evidence for hydrogen anion (H-) and metallic species such as titanium oxide (TiO), vanadium oxide (VO), or iron hydride (FeH). Conversely, exoplanets without these chemicals almost never had thermal inversions. It is difficult to draw conclusions based on correlation alone, but since these metallic species are efficient absorbers of stellar light, one theory holds that when these chemicals are present in the upper atmosphere, they absorb light from the host star and cause the temperature to increase.
Masahiro Ikoma at the National Astronomical Observatory of Japan, a co-investigator in this study, explains, “The theory of gas giant formation proposed by my students and me predicted diversity in the composition of hot Jupiter atmospheres, and helped to motivate this systematic survey of atmospheric characteristics.”
This new study, identifying populations of similar exoplanet atmospheres, will help refine the theoretical models, bringing us closer to a comprehensive understanding of planet formation. In the coming decade, new data from next-generation space telescopes, including the James Webb Space Telescope, Twinkle, and Ariel, will provide data for thousands of exoplanets, both enabling and necessitating new categories for classifying exoplanets beyond the methods explored in this research.
IMAGE: THE HABITAT AND ANIMALS THAT WERE FOUND TOGETHER WITH THE GIANT ICHTHYOSAURS
CREDIT: HEINZ FURRER
Paleontologists have discovered sets of fossils representing three new ichthyosaurs that may have been among the largest animals to have ever lived, reports a new paper in the peer-reviewed Journal of Vertebrate Paleontology.
Unearthed in the Swiss Alps between 1976 and 1990, the discovery includes the largest ichthyosaur tooth ever found. The width of its root is twice that of any known aquatic reptile, the previous largest belonging to a 15-meter-long ichthyosaur.
Other incomplete skeletal remains include the largest trunk vertebra in Europe, which points to another ichthyosaur rivaling the largest marine reptile fossil known today, the 21-meter-long Shastasaurus sikanniensis from British Columbia, Canada.
Dr Heinz Furrer, a co-author of the study, was among the team who recovered the fossils during geological mapping in the Kössen Formation of the Alps. More than 200 million years ago, these rock layers still covered the seafloor; with the folding of the Alps, however, they ended up at an altitude of 2,800 meters!
Now a retired curator at the University of Zurich’s Paleontological Institute and Museum, Dr Furrer said he was delighted to have uncovered “the world's longest ichthyosaur; with the thickest tooth found to date and the largest trunk vertebra in Europe!"
And lead author P. Martin Sander, of the University of Bonn, hopes “maybe there are more remains of the giant sea creatures hidden beneath the glaciers.”
“Bigger is always better,” he says. “There are distinct selective advantages to large body size. Life will go there if it can. There were only three animal groups that had masses greater than 10–20 metric tonnes: long-necked dinosaurs (sauropods); whales; and the giant ichthyosaurs of the Triassic.”
These monstrous, 80-ton reptiles patrolled Panthalassa, the World's ocean surrounding the supercontinent Pangea during the Late Triassic, about 205 million years ago. They also made forays into the shallow seas of the Tethys on the eastern side of Pangea, as shown by the new finds.
Ichthyosaurs first emerged in the wake of the Permian extinction some 250 million years ago, when some 95 percent of marine species died out. The group reached its greatest diversity in the Middle Triassic and a few species persisted into the Cretaceous. Most were much smaller than S. sikanniensis and the similarly-sized species described in the paper.
Roughly the shape of contemporary whales, ichthyosaurs had elongated bodies and erect tail fins. Fossils are concentrated in North America and Europe, but ichthyosaurs have also been found in South America, Asia, and Australia. Giant species have mostly been unearthed in North America, with scanty finds from the Himalaya and New Caledonia, so the discovery of further behemoths in Switzerland represents an expansion of their known range.
However, so little is known about these giants that they are mere ghosts. Tantalizing evidence from the UK, consisting of an enormous toothless jaw bone, and from New Zealand suggests that some of them were the size of blue whales. An 1878 paper credibly describes a New Zealand ichthyosaur vertebra 45 cm in diameter, but the fossil never made it to London and may have been lost at sea. Sander notes that "it amounts to a major embarrassment for paleontology that we know so little about these giant ichthyosaurs despite the extraordinary size of their fossils. We hope to rise to this challenge and find new and better fossils soon."
CAPTION
Heinz Furrer with the largest ichthyosaur vertebra.
These new specimens probably represent the last of the leviathans. “In Nevada, we see the beginnings of true giants, and in the Alps the end,” says Sander, who also co-authored a paper last year about an early giant ichthyosaur from Nevada’s Fossil Hill. “Only the medium- to large-sized dolphin- and orca-like forms survived into the Jurassic.”
While the smaller ichthyosaurs typically had teeth, most of the known gigantic species appear to have been toothless. One hypothesis suggests that rather than grasping their prey, they fed by suction. “The bulk feeders among the giants must have fed on cephalopods. The ones with teeth likely fed on smaller ichthyosaurs and large fish,” Sander suggests.
The tooth described by the paper is only the second instance of a giant ichthyosaur with teeth—the other being the 15-meter-long Himalayasaurus. These species likely occupied similar ecological roles to modern sperm whales and killer whales. Indeed, the teeth are curved inwards like those of their mammalian successors, indicating a grasping mode of feeding conducive to capturing prey such as giant squid.
“It is hard to say if the tooth is from a large ichthyosaur with giant teeth or from a giant ichthyosaur with average-sized teeth,” Sander wryly acknowledges. Because the tooth described in the paper was broken off at the crown, the authors were not able to confidently assign it to a particular taxon. Still, a peculiarity of dental anatomy allowed the researchers to identify it as belonging to an ichthyosaur.
“Ichthyosaurs have a feature in their teeth that is nearly unique among reptiles: the infolding of the dentin in the roots of their teeth,” explains Sander. “The only other group to show this are monitor lizards.”
The two sets of skeletal remains, which consist of a vertebra and ten rib fragments, and seven associated vertebrae, have been assigned to the family Shastasauridae, which contains the giants Shastasaurus, Shonisaurus, and Himalayasaurus. Comparison of the vertebrae from one set suggests that the animal may have been the same size as, or slightly smaller than, S. sikanniensis. These measurements are slightly skewed by the fact that the fossils have been tectonically deformed—that is, they have literally been squashed by the movements of the tectonic plates whose collision led to their movement from a former sea floor to the top of a mountain.
Known as the Kössen Formation, the rocks from which these fossils derive were once at the bottom of a shallow coastal area—a very wide lagoon or shallow basin.
This adds to the uncertainty surrounding the habits of these animals, whose size indicates their suitability to deeper reaches of the ocean. “We think that the big ichthyosaurs followed schools of fish into the lagoon. The fossils may also derive from strays that died there,” suggests Furrer.
“You have to be kind of a mountain goat to access the relevant beds,” Sander laughs. “They have the vexing property of not occurring below about 8,000 feet, way above the treeline.”
“About 95 million years ago, the northeastern part of Gondwana, the African plate (which the Kössen Formation was part of), started to push against the European plate, ending with the formation of the very complex piles of different rock units (called "nappes") in the Alpine orogeny about 30–40 million years ago,” relates Furrer. So it is that these intrepid researchers found themselves picking through the frozen rocks of the Alps and hauling pieces of ancient marine monsters nearly down to sea level once again for entry into the scientific record.
The first ichthyosaurs swam through the primordial oceans in the early Triassic period about 250 million years ago. They had an elongated body and a relatively small head. But shortly before most of them became extinct some 200 million years ago (only the familiar dolphin-like species survived until 90 million years ago), they evolved into gigantic forms. With an estimated weight of 80 tons and a length of more than 20 meters, these prehistoric giants would have rivaled a sperm whale. However, they left scarcely any fossil remains - "why that is remains a great mystery to this day," stresses Prof. Dr. Martin Sander from the Section Paleontology at the Institute of Geosciences at the University of Bonn.
Folding of the Alps brought up fossils from the bottom of the sea
The finds now examined come from the Grisons (canton of Graubünden). Sander's colleague Dr. Heinz Furrer of the University of Zurich had recovered them together with students between 1976 and 1990 during geological mapping in the Kössen Formation. More than 200 million years ago, the rock layers with the fossils still covered the seafloor; with the folding of the Alps, however, they ended up at an altitude of 2,800 meters. "Maybe there are more remains of the giant sea creatures hidden beneath the glaciers," Sander hopes.
The paleontologist first held the fossilized bones in his hands three decades ago. At that time, he was still a doctoral student at the University of Zurich. In the meantime, the material had been somewhat forgotten. "Recently, though, more remains of giant ichthyosaurs have appeared," the researcher explains. "So it seemed worthwhile to us to analyze the Swiss finds again in more detail as well."
According to the study, the fossils come from three different animals that lived about 205 million years ago. From one of the ichthyosaurs, a vertebra is preserved together with ten rib fragments. Their sizes suggest that the reptile was probably 20 meters in length. In contrast, only a series of vertebrae were excavated from a second ichthyosaur. Comparison with better preserved skeletal finds suggests a length of about 15 meters.
"From our point of view, however, the tooth is particularly exciting," explains Sander. "Because it is huge by ichthyosaur standards: its root was 60 millimeters in diameter - the largest specimen to date still set in a complete skull was 20 millimeters and came from an ichthyosaur that was nearly 18 meters long." His colleague Heinz Furrer is delighted with the belated appreciation of the spectacular remains from the Swiss Alps: "The publication has confirmed that our finds at the time belonged to the world's longest ichthyosaur, with the thickest tooth found to date and the largest trunk vertebra in Europe!"
However, it is unlikely that the animals that populated the primordial oceans 205 million years ago were much longer than previously thought. "The tooth diameter cannot be used to directly infer the length of its owner," emphasizes paleontologist Martin Sander from Bonn. "Still, the find naturally raises questions."
Root of the tooth (IMAGE)
UNIVERSITY OF BONN
CAPTION
The root of the tooth found has a diameter of 60 millimeters. This makes it the thickest ichthyosaur tooth found so far.
Predators larger than a sperm whale are not really possible
This is because research assumes that extreme gigantism and a predatory lifestyle (which requires teeth) are incompatible. There is a reason why the largest animal of our time is toothless: the blue whale, which can be up to 30 meters long and weighs 150 tons. Next to it, the teeth-bearing sperm whale (20 meters and 50 tons) looks like an adolescent. While the blue whale filters tiny creatures from the water, the sperm whale is a perfect hunter. This means it requires a larger portion of the calories it consumes to fuel its muscles. "Marine predators therefore probably can't get much bigger than a sperm whale," Sander says.
It is thus possible that the tooth did not come from a particularly gigantic ichthyosaur - but from an ichthyosaur with particularly gigantic teeth.
Participating institutions:
The Section Paleontology of the Institute of Geosciences of the University of Bonn, the Paleontological Institute and Museum of the University of Zurich, and the Institute of Anatomy of the University of Bonn were involved in the study.
UNIVERSITY OF MIAMI ROSENSTIEL SCHOOL OF MARINE & ATMOSPHERIC SCIENCE
IMAGE: RED TIDES, CAUSED BY MASSIVE BLOOMS OF THE ALGA KARENIA BREVIS FUELED IN PART BY EXCESS NUTRIENTS IN THE OCEAN, ARE BECOMING A NEAR-ANNUAL OCCURRENCE OFF THE WEST COAST OF FLORIDA.
CREDIT: NASA EARTH OBSERVATORY IMAGES BY JOSHUA STEVENS, USING LANDSAT DATA, ACQUIRED ON JULY 14, 2021, BY THE OPERATIONAL LAND IMAGER (OLI) ON LANDSAT 8.
MIAMI—A new study found that when red tides began in early summer and continued into the fall, low-oxygen areas—or dead zones—were more likely to also occur. The study, by scientists at the University of Miami Rosenstiel School of Marine and Atmospheric Science and NOAA collaborators, is the first to link low oxygen—or hypoxia—to red tides across the west coast of Florida, and offers new information to better understand the conditions favorable for combined events, which are expected to increase as Earth continues to warm.
Red tides, caused by massive blooms of the alga Karenia brevis fueled in part by excess nutrients in the ocean, are becoming a near-annual occurrence off the west coast of Florida. These algal blooms turn the ocean surface red and produce toxins that are harmful to marine mammals, sharks, seabirds and humans, causing problems ranging from respiratory irritation and localized fish kills to large-scale mortality of marine life. Hypoxic areas are typically referred to as ‘dead zones’.
“These events are so disruptive they are being incorporated in population assessments of some grouper species for use in fishery management decisions. During the 2005 red tide that also had hypoxia, it was estimated that about 30% of the red grouper population was killed,” said Brendan Turley, an assistant scientist at the UM Rosenstiel School and NOAA’s Cooperative Institute of Marine and Atmospheric Studies. “There are also concerns that the conditions favorable for combined red tide and hypoxia events will increase with climate change projections into the future.”
The study, conducted as part of NOAA’s Gulf of Mexico Integrated Ecosystem Assessment Program, examined nearly 20 years of oceanographic data that included temperature, salinity, and dissolved oxygen from the surface to the seafloor across the West Florida Shelf to determine the frequency of hypoxia and association with known red tides. The researchers found that hypoxia was present in five of the 16 years examined, three of which occurred concurrently with extreme red tides in 2005, 2014, and 2018. There is an ongoing effort to collaborate with commercial fishermen in Southwest Florida to monitor for red tide blooms and formation of hypoxia, which incorporates data collected during various NOAA surveys conducted in the region annually.
The study, titled “Relationships between blooms of Karenia brevis and hypoxia across the West Florida Shelf,” will appear in the May issue of the journal Harmful Algae, which is currently online. The study’s authors include: Brendan Turley from the UM NOAA Cooperative Institute for Marine and Atmospheric Studies; Mandy Karnauskas, Matthew Campbell, David Hanisko from NOAA’s Southeast Fisheries Science Center and Christopher Kelble from NOAA’s Atlantic Oceanographic and Meteorological Laboratory.
This research was carried out, in part, under the auspices of the Cooperative Institute for Marine and Atmospheric Studies and the National Oceanic and Atmospheric Administration, cooperative agreement # NA20OAR4320472.