Monday, February 07, 2022

Dark matter travelling through stars could produce potentially detectable shock waves

Illustration of the shock formation. A dark asteroid traveling supersonically through a star 
creates a strong shock wave near it. The shock wave travels to the surface of the star, 
where it releases its energy as heat. Credit: Das et al

Dark matter, a hypothetical material that does not absorb, emit or reflect light, is thought to account for over 80 percent of the matter in the universe. While many studies have indirectly hinted at its existence, so far, physicists have been unable to directly detect dark matter and thus to confidently determine what it consists of.

One factor that makes searching for dark matter particularly challenging is that very little is known about its possible mass and composition. This means that dark matter searches are based in great part on hypotheses and theoretical assumptions.

Researchers at SLAC National Accelerator Laboratory and Université Paris Saclay have recently carried out a theoretical study that could introduce a new way of searching for dark matter. Their paper, published in Physical Review Letters, shows that when macroscopic dark matter travels through a star, it could produce shock waves that might reach the star's surface. These waves could in turn lead to distinctive and transient optical, UV and X-ray emissions that might be detectable by sophisticated telescopes.

"Most experiments have searched for dark matter made of separate particles, each about as heavy as an , or clumps about as massive as planets or stars," Kevin Zhou, one of the researchers who carried out the study, told Phys.org. "We were interested in the intermediate case of asteroid-sized dark matter, which had been thought to be hard to test experimentally, since dark asteroids would be too rare to impact Earth, but too small to see in space."

Initially, Zhou and his colleagues started exploring the possibility that the heat produced during the impact between a dark matter asteroid and an ordinary star could cause the star to explode. This hypothesis was based on past studies suggesting that energy deposition can sometimes trigger supernovae in white dwarfs. After a few weeks of calculations and discussions, however, the team realized that the impact between a dark matter asteroid and an ordinary star would most likely not lead to an explosion, as ordinary stars are more stable than white dwarfs.

"We had a hunch that the energy produced by such a collision should be visible somehow, so we brainstormed for a few months, trying and tossing out idea after idea," Zhou explained. "Finally, we realized that the shock waves generated by the dark asteroid's travel through the star were the most promising signature."

Illustration of the detection method. In a traditional search for particle dark matter (left), individual dark matter particles collide with nuclei in a detector on Earth. The resulting recoil energy can be seen by sensitive detectors. Analogously, dark asteroids can collide with stars (right), leading to shock waves that heat up their surfaces. The resulting UV emission can be seen by telescopes on Earth. Credit: Das et al

Shock waves are sharp signals that are produced when an object moves faster than the speed of sound. For instance, a supersonic jet produces a sonic boom, which can be heard from the Earth's surface even when the aircraft is flying miles above it.

Similarly, Zhou and his colleagues predicted that the shock waves produced by dark asteroids deep inside a star could reach the star's surface. This would in turn result in a short-lived hot spot that could be detected using telescopes that can examine the UV spectrum.
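
For a rough sense of the energy budget involved (a back-of-the-envelope illustration, not a figure from the paper), note that anything falling onto a Sun-like star arrives at roughly the stellar escape speed of about 620 km/s, so even an asteroid-scale mass carries an enormous amount of kinetic energy. The Python sketch below assumes a dark asteroid mass of 10^15 kg, comparable to a rocky asteroid about 10 kilometers across; only a fraction of this energy would ultimately emerge at the surface as the transient hot spot.

    # Back-of-the-envelope numbers only; the mass below is an assumption for scale,
    # not a value taken from Das et al.
    G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
    M_star = 1.989e30      # mass of a Sun-like star, kg
    R_star = 6.957e8       # radius of a Sun-like star, m
    m_asteroid = 1e15      # assumed dark asteroid mass, kg (~10 km rocky asteroid)

    # An object falling in from far away hits the surface at about the escape speed.
    v_escape = (2 * G * M_star / R_star) ** 0.5

    kinetic_energy = 0.5 * m_asteroid * v_escape ** 2
    solar_luminosity = 3.8e26   # W, shown only for comparison

    print(f"impact speed   ~ {v_escape / 1e3:.0f} km/s")
    print(f"kinetic energy ~ {kinetic_energy:.1e} J "
          f"(~{kinetic_energy / solar_luminosity:.1f} s of the Sun's total output)")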

"We're excited that we identified a powerful new way to search for a kind of dark matter thought to be hard to test, using telescopes that we already have in an unexpected way," Zhou said. "The most powerful UV telescope is the Hubble space telescope, but since stellar shock events are transients, it helps to be able to monitor more of the sky at once."

The recent study follows a growing trend within the astrophysics community to use astronomical objects as enormous dark matter detectors. This promising approach to searching for dark matter unites the fields of particle physics and astrophysics, bringing the two communities closer together.

In the future, the recent work by this team of researchers could inspire engineers to build new and smaller UV telescopes that can monitor wider areas of the sky. One such telescope, dubbed ULTRASAT, is already set to launch in 2024. Using this telescope, physicists could try searching for dark matter by examining stellar surfaces. In their next studies, the researchers themselves plan to try to detect potential dark asteroid impact events using UV telescope data.

"The ideal case would be to use the Hubble space  to monitor a large globular cluster in the UV," Zhou said. "It would also be interesting to consider dark asteroids impacting other astronomical objects. Since our work, there have been papers by others considering impacts on neutron  and red giants, but there are probably even more promising ideas in this direction that nobody has thought of yet."Physicist seeks to understand dark matter with Webb Telescope

More information: Anirban Das et al, Stellar Shocks from Dark Matter Asteroid Impacts, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.128.021101

© 2022 Science X Network

NASA Proposes a Way That Dark Matter’s Influence Could Be Directly Observed

Milky Way Galaxy and Central Bar Viewed From Above

This artist’s rendering shows a view of our own Milky Way Galaxy and its central bar as it might appear if viewed from above. Credit: NASA/JPL-Caltech/R. Hurt (SSC)

How Dark Matter Could Be Measured in the Solar System

Pictures of the Milky Way show billions of stars arranged in a spiral pattern radiating out from the center, with illuminated gas in between. But our eyes can only glimpse the surface of what holds our galaxy together. About 95 percent of the mass of our galaxy is invisible and does not interact with light. It is made of a mysterious substance called dark matter, which has never been directly measured.

Now, a new study calculates how dark matter’s gravity affects objects in our solar system, including spacecraft and distant comets. It also proposes a way that dark matter’s influence could be directly observed with a future experiment. The article is published in the Monthly Notices of the Royal Astronomical Society.

“We’re predicting that if you get out far enough in the solar system, you actually have the opportunity to start measuring the dark matter force,” said Jim Green, study co-author and advisor to NASA’s Office of the Chief Scientist. “This is the first idea of how to do it and where we would do it.”

Dark matter in our backyard

Here on Earth, our planet’s gravity keeps us from flying out of our chairs, and the Sun’s gravity keeps our planet orbiting on a 365-day schedule. But the farther from the Sun a spacecraft flies, the less it feels the Sun’s gravity, and the more it feels a different source of gravity: that of the matter from the rest of the galaxy, which is mostly dark matter. The mass of our galaxy’s 100 billion stars is minuscule compared to estimates of the Milky Way’s dark matter content.

To understand the influence of dark matter in the solar system, lead study author Edward Belbruno calculated the “galactic force,” the overall gravitational force of normal matter combined with dark matter from the entire galaxy. He found that in the solar system, about 45 percent of this force is from dark matter and 55 percent is from normal, so-called “baryonic matter.” This suggests a roughly half-and-half split between the mass of dark matter and normal matter in the solar system.

“I was a bit surprised by the relatively small contribution of the galactic force due to dark matter felt in our solar system as compared to the force due to the normal matter,” said Belbruno, mathematician and astrophysicist at Princeton University and Yeshiva University. “This is explained by the fact most of dark matter is in the outer parts of our galaxy, far from our solar system.”
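
To put that galactic force in perspective, the total gravitational acceleration toward the galactic center at the Sun's position can be estimated from the Sun's circular speed and then split using the roughly 45/55 dark-to-normal ratio described above. The short sketch below uses standard approximate values for the circular speed and the Sun's galactocentric distance; they are textbook numbers chosen for illustration, not values quoted from the study.

    # Rough illustration of the galactic acceleration at the Sun's location and the
    # ~45/55 dark/normal split described in the article. The circular speed and
    # distance are standard approximate values, not taken from the paper.
    KPC = 3.086e19        # meters per kiloparsec
    v_circ = 230e3        # Sun's circular speed around the galaxy, m/s (approx.)
    R_sun = 8.0 * KPC     # Sun's distance from the galactic center, m (approx.)

    a_galactic = v_circ**2 / R_sun   # centripetal acceleration = total galactic pull per unit mass
    a_dark = 0.45 * a_galactic       # ~45% from dark matter (per Belbruno and Green)
    a_normal = 0.55 * a_galactic     # ~55% from normal (baryonic) matter

    print(f"total galactic acceleration ~ {a_galactic:.1e} m/s^2")
    print(f"  dark matter part          ~ {a_dark:.1e} m/s^2")
    print(f"  normal matter part        ~ {a_normal:.1e} m/s^2")

The total works out to a few times 10^-10 m/s^2, tens of billions of times weaker than the gravity we feel at Earth's surface, which is why its influence only becomes noticeable in the outer reaches of the solar system.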

A large region called a “halo” of dark matter encircles the Milky Way and represents the greatest concentration of the dark matter of the galaxy. There is little to no normal matter in the halo. If the solar system were located at a greater distance from the center of the galaxy, it would feel the effects of a larger proportion of dark matter in the galactic force because it would be closer to the dark matter halo, the authors said.

NASA Voyager 1 Spacecraft Solar System

In this artist’s conception, NASA’s Voyager 1 spacecraft has a bird’s-eye view of the solar system. The circles represent the orbits of the major outer planets: Jupiter, Saturn, Uranus, and Neptune. Launched in 1977, Voyager 1 visited the planets Jupiter and Saturn. The spacecraft is now more than 14 billion miles from Earth, making it the most distant human-made object from Earth. In fact, Voyager 1 is now zooming through interstellar space, the region between the stars that is filled with gas, dust, and material recycled from dying stars. Credit: NASA, ESA, and G. Bacon (STScI)

How dark matter may influence spacecraft

According to the new study, Green and Belbruno predict that dark matter’s gravity ever so slightly tugs on all of the spacecraft that NASA has sent on paths leading out of the solar system.

“If spacecraft move through the dark matter long enough, their trajectories are changed, and this is important to take into consideration for mission planning for certain future missions,” Belbruno said.

Such spacecraft may include the retired Pioneer 10 and 11 probes that launched in 1972 and 1973, respectively; the Voyager 1 and 2 probes that have been exploring for more than 40 years and have entered interstellar space; and the New Horizons spacecraft that has flown by Pluto and Arrokoth in the Kuiper Belt.

But it’s a tiny effect. After traveling billions of miles, the path of a spacecraft like Pioneer 10 would only deviate by about 5 feet (1.6 meters) due to the influence of dark matter. “They do feel the effect of dark matter, but it’s so small, we can’t measure it,” Green said.

Where does the galactic force take over?

At a certain distance from the Sun, the galactic force becomes more powerful than the pull of the Sun, which is made of normal matter. Belbruno and Green calculated that this transition happens at around 30,000 astronomical units, or 30,000 times the distance from Earth to the Sun. That is well beyond the distance of Pluto, but still inside the Oort Cloud, a swarm of millions of comets that surrounds the solar system and extends out to 100,000 astronomical units.
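
The inverse-square falloff of the Sun's own gravity is what makes such a distant crossover possible. The sketch below simply evaluates the solar gravitational acceleration at 1 AU, 30,000 AU and 100,000 AU to show how feeble the Sun's pull becomes at Oort Cloud distances; the 30,000 AU crossover figure itself comes from Belbruno and Green's more detailed treatment of the galactic force, not from this naive comparison.

    # How weak the Sun's pull becomes at Oort Cloud distances (an illustration of
    # the inverse-square falloff; the 30,000 AU crossover is the paper's result,
    # based on its own definition of the galactic force, and is not reproduced here).
    G = 6.674e-11     # m^3 kg^-1 s^-2
    M_sun = 1.989e30  # kg
    AU = 1.496e11     # m

    for r_au in (1, 30_000, 100_000):
        a_sun = G * M_sun / (r_au * AU) ** 2
        print(f"solar gravitational acceleration at {r_au:>7,} AU: {a_sun:.1e} m/s^2")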

This means that dark matter’s gravity could have played a role in the trajectory of objects like ‘Oumuamua, the cigar-shaped comet or asteroid that came from another star system and passed through the inner solar system in 2017. Its unusually fast speed could be explained by dark matter’s gravity pushing on it for millions of years, the authors say.

If there is a giant planet in the outer reaches of the solar system, a hypothetical object called Planet 9 or Planet X that scientists have been searching for in recent years, dark matter would also influence its orbit. If this planet exists, dark matter could perhaps even push it away from the area where scientists are currently looking for it, Green and Belbruno write. Dark matter may have also caused some of the Oort Cloud comets to escape the orbit of the Sun altogether.

Could dark matter’s gravity be measured?

To measure the effects of dark matter in the solar system, a spacecraft wouldn’t necessarily have to travel that far. At a distance of 100 astronomical units, a spacecraft with the right experiment could help astronomers measure the influence of dark matter directly, Green and Belbruno said.

Specifically, a spacecraft equipped with radioisotope power, a technology that has allowed Pioneer 10 and 11, the Voyagers, and New Horizons to fly very far from the Sun, may be able to make this measurement. Such a spacecraft could carry a reflective ball and drop it at an appropriate distance. The ball would feel only galactic forces, while the spacecraft would experience a thermal force from the decaying radioactive element in its power system, in addition to the galactic forces. Subtracting out the thermal force, researchers could then look at how the galactic force relates to deviations in the respective trajectories of the ball and the spacecraft. Those deviations would be measured with a laser as the two objects fly parallel to one another.
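
The reason such a delicate measurement is even conceivable is that a tiny, constant acceleration builds up into a sizable displacement when it acts for long enough. The sketch below uses an assumed residual acceleration of 10^-10 m/s^2, a made-up round number of roughly the right order for the galactic force, not a value taken from the study.

    # How a tiny constant acceleration accumulates into a measurable displacement.
    # The 1e-10 m/s^2 figure is an assumed round number for illustration only.
    a_residual = 1e-10    # m/s^2, assumed unmodeled acceleration
    year = 3.156e7        # seconds in a year

    for years in (1, 5, 10):
        t = years * year
        displacement = 0.5 * a_residual * t**2   # constant-acceleration kinematics
        print(f"after {years:>2} yr: displacement ~ {displacement / 1e3:.0f} km")

Drifts of tens of kilometers per year are, in principle, well within reach of precision tracking; the hard part is separating them from the much larger thermal push on the spacecraft itself, which is exactly what pairing the probe with a passive reflective ball is meant to address.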

A proposed mission concept called Interstellar Probe, which aims to travel to about 500 astronomical units from the Sun to explore that uncharted environment, is one possibility for such an experiment.

Galaxy Cluster Cl 0024+17

Two views from Hubble of the massive galaxy cluster Cl 0024+17 (ZwCl 0024+1652) are shown. To the left is the view in visible-light with odd-looking blue arcs appearing among the yellowish galaxies. These are the magnified and distorted images of galaxies located far behind the cluster. Their light is bent and amplified by the immense gravity of the cluster in a process called gravitational lensing. To the right, a blue shading has been added to indicate the location of invisible material called dark matter that is mathematically required to account for the nature and placement of the gravitationally lensed galaxies that are seen. Credit: NASA, ESA, M.J. Jee and H. Ford (Johns Hopkins University)

More about dark matter

Dark matter as a hidden mass in galaxies was first proposed in the 1930s by Fritz Zwicky. But the idea remained controversial until the 1960s and 1970s, when Vera C. Rubin and colleagues confirmed that the motions of stars around their galactic centers would not follow the laws of physics if only normal matter were involved. Only a gigantic hidden source of mass can explain why stars at the outskirts of spiral galaxies like ours move as quickly as they do.

Today, the nature of dark matter is one of the biggest mysteries in all of astrophysics. Powerful observatories like the Hubble Space Telescope and the Chandra X-Ray Observatory have helped scientists begin to understand the influence and distribution of dark matter in the universe at large. Hubble has explored many galaxies whose dark matter contributes to an effect called “lensing,” where gravity bends space itself and magnifies images of more distant galaxies.

Astronomers will learn more about dark matter in the cosmos with the newest set of state-of-the-art telescopes. NASA’s James Webb Space Telescope, which launched Dec. 25, 2021, will contribute to our understanding of dark matter by taking images and other data of galaxies and observing their lensing effects. NASA’s Nancy Grace Roman Space Telescope, set to launch in the mid-2020s, will conduct surveys of more than a billion galaxies to look at the influence of dark matter on their shapes and distributions.

The European Space Agency’s forthcoming Euclid mission, which has a NASA contribution, will also target dark matter and dark energy, looking back in time about 10 billion years to a period when dark energy began hastening the universe’s expansion. And the Vera C. Rubin Observatory, a collaboration of the National Science Foundation, the Department of Energy, and others, which is under construction in Chile, will add valuable data to this puzzle of dark matter’s true essence.

But these powerful tools are designed to look for dark matter’s strong effects across large distances, and much farther afield than in our solar system, where dark matter’s influence is so much weaker.

“If you could send a spacecraft out there to detect it, that would be a huge discovery,” Belbruno said.

Reference: “When leaving the Solar system: Dark matter makes a difference” by Edward Belbruno and James Green, 4 January 2022, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stab3781

Sunday, February 06, 2022

Biofuels Have a Big, Big Problem Nobody Talks About

2 Feb 2022


Think of economically viable alternatives to traditional fuels like gasoline and diesel. Nothing makes more immediate sense than biofuels. They can use waste, produce fewer emissions, and reduce the amount of conventional fuel needed everywhere else.

The United States alone is a powerhouse in the cultivation of corn – an ingredient that has now become extremely important for biofuels. When driving across the Upper Midwest, you will be met by vast corn fields stretching to the horizon in every direction. This agricultural giant was once the bedrock of the American diet. Corn flour, corn bread, corn meal – all staples of traditional American food – are now being actively forgotten as farmers and companies pivot toward the energy industry. Today, nearly half of all corn production is used for making ethanol, and blending this liquid with gasoline reduces the carbon footprint of everyone who commutes.

For most Americans today, the gasoline they use contains up to 10% ethanol, and farmers are earning more than ever because of the government mandate. Moreover, the current administration is being lobbied to raise the percentage of ethanol in fuels.

Why this is happening

There is nothing inherently wrong with burning fuel as long as we develop a method to capture its products. Unfortunately, we have become like the yeast trapped in a brewer's vat: we are using the fuel provided and releasing toxic byproducts. At some point, the alcohol produced reaches a concentration that is extremely harmful, and survival is no longer an option. Moreover, nobody is interested in investing in a dying economic sector like hydrocarbons. But, for us humans, biofuels try to address this imbalance before it's too late or, you know, before EVs completely take over.

The theoretical cycle of biofuels rests on the idea that, unlike fossil fuels, they can be replenished over a short period of time. You take the corn, you ferment it and, voila, you have fuel that can be burned; the CO2 it releases will all be taken back up by next year's crop.

According to the U.S. Energy Information Administration, biomass supplied only about 5% of total U.S. energy in 2019, and almost half of that was ethanol. The primary method of producing this liquid relies on yeast: deprived of oxygen, it ferments sugar anaerobically, extracting energy for itself and releasing ethanol as a byproduct. The U.S. is the largest producer of ethanol, and the quantity made grows every year.
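
For readers curious about the chemistry, yeast fermentation is the classic reaction that splits each glucose molecule into two molecules of ethanol and two of carbon dioxide. The minimal mass-balance sketch below uses textbook stoichiometry and is included only to illustrate the yield limit; these are not EIA figures.

    # Mass balance of glucose fermentation, C6H12O6 -> 2 C2H5OH + 2 CO2
    # (textbook stoichiometry, shown only to illustrate the theoretical yield limit).
    M_GLUCOSE = 180.16   # g/mol
    M_ETHANOL = 46.07    # g/mol
    M_CO2 = 44.01        # g/mol

    glucose_kg = 1.0
    mol = glucose_kg * 1000 / M_GLUCOSE
    ethanol_kg = 2 * mol * M_ETHANOL / 1000
    co2_kg = 2 * mol * M_CO2 / 1000

    print(f"1 kg of glucose -> {ethanol_kg:.2f} kg ethanol + {co2_kg:.2f} kg CO2")

In other words, nearly half the mass of the fermented sugar leaves as CO2 before a drop of the fuel is ever burned.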

What we will face

If the requested doubling of the ethanol percentage in fuels goes ahead, the land needed to grow corn for ethanol alone would exceed 30,000,000 hectares. One fifth of U.S. farmland would be used solely for producing ethanol.

Farmers are getting more cash for their crops, and the demand for larger quantities of corn only encourages more land to be used this way. Croplands are expanding at an explosive rate of over 1,000,000 acres per year, and natural habitats across the U.S. are suffering as a result.

Food prices have also risen because of this growing interest in ethanol. Corn feeds chickens, cows, and other livestock, so, naturally, the prices of eggs and milk are higher than ever before. This vicious cycle will continue, and the average U.S. citizen will pay even more.

On top of all that, one liter of ethanol contains only 5,130 kcal, yet producing it takes 6,600 kcal of energy. That means a net loss of 22.3% from the start; the process is energy-negative. Nor is it a particularly green technology, because photosynthesis is a very inefficient way to turn sunlight into usable energy: on average, plants capture and convert about 1% of incoming sunlight, while photovoltaic cells can use up to 20%. Corn is even worse at this than the average plant, at roughly 0.25%.
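
The arithmetic behind that 22.3% figure is easy to check, as the short sketch below shows. It simply reuses the two numbers quoted above; published energy-balance estimates for corn ethanol vary widely between studies, so treat this as the article's own accounting rather than a settled value.

    # Net energy balance of corn ethanol using the figures quoted above
    # (5,130 kcal out per liter vs. 6,600 kcal in); purely the article's arithmetic.
    energy_out_kcal = 5130.0   # energy content of one liter of ethanol
    energy_in_kcal = 6600.0    # energy spent producing that liter

    net_loss = (energy_in_kcal - energy_out_kcal) / energy_in_kcal
    energy_return = energy_out_kcal / energy_in_kcal

    print(f"net energy loss     : {net_loss:.1%}")
    print(f"energy return ratio : {energy_return:.2f} units out per unit in")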

Growing plants for food is a necessity, growing plants for power is irrational

Corn also needs a lot of water to grow properly and is, as stated above, inefficient. Water is incredibly scarce, and its absence is already causing havoc in parts of the world, not only the U.S. Using it to grow crops that end up as biofuel is hard to justify. The water footprint of biomass energy is 72 times higher than that of fossil fuels and 240 times higher than that of solar. Moreover, over 80% of freshwater in the States is already used for agriculture; increasing crops for biofuels would push this percentage to an unsustainable level.

That is one of the reasons why the E.U. has put a partial brake on crop-based biofuels since 2014.

Nonetheless, the government will keep subsidizing this sector simply because it represents over 300,000 jobs, which cannot be lost under any administration that wants a second term.

Viable alternatives

Batteries are still too heavy and too short-ranged for heavy-duty vehicles or planes.

Precious freshwater can be preserved by using algae instead. This, however, is expensive: the cost currently sits somewhere between $300 and $2,600 per barrel. The approach has great potential, which is why researchers such as those at the Pacific Northwest National Laboratory are working to make it as cheap as possible. They are trying to create new strains of algae with a higher fat content, which can be turned into green crude oil and eventually into transportation fuels.

Finally, we need alternative technologies, and we should invest in them. Unfortunately, we don't. Ethanol-based biofuels, like fossil fuels, are not at all green or sustainable.
Visualized: a third of Americans already face above-average warming
An unofficial thermometer reads 55C (132F) at Death Valley
 National Park, California, in 2021.
 Photograph: David Becker/Getty Images

Temperature increases in 499 counties across the west, northeast and upper midwest of the US have already breached 1.5C (2.7F)


by Oliver Milman, graphics by Andrew Witherspoon

Sun 6 Feb 2022 

More than a third of the American population is currently experiencing rapid, above-average rates of temperature increase, with 499 counties already breaching 1.5C (2.7F) of heating, a Guardian review of climate data shows.

The US as a whole has heated up over the past century due to the release of planet-warming gases from burning fossil fuels, and swathes of the US west, northeast and upper midwest – representing more than 124.6 million people – have recorded soaring increases since federal government temperature records began in 1895.

Though the climate crisis is convulsing the US, it is doing so unevenly. Hotspots of extreme warming have emerged in many of America’s largest cities and in places as diverse as California’s balmy coast and the previously frigid northern reaches of Minnesota, while other places, particularly in the south, have barely seen their temperatures budge.

“The warming isn’t distributed evenly,” said Brian Brettschneider, an Alaska-based climate scientist who collated the county temperature data from the National Oceanic and Atmospheric Administration (Noaa). “Many places have seen dramatic changes, but there are always some places below the average who will think, ‘It didn’t seem that 

Ventura county in California has heated up more than any other county in the contiguous US, according to the Noaa data, experiencing a 2.6C (4.75F) increase in total warming in the period from 1895 to 2021. Meanwhile, counties that include many of America’s largest cities, including New York, Los Angeles, Miami, Philadelphia, San Francisco and Boston, have all seen their average temperatures rise far beyond the national average, which stands at around a 1C (1.8F) increase on pre-industrial times.

Mark Jackson, a meteorologist at the National Weather Service based in Oxnard in Ventura county, said the county’s temperature increase is “a remarkable number, it’s a scary number when you consider the pace we are looking at”. Jackson said the county has seen a large increase in heatwaves, including a spell above 37C (100F) last summer that “really stressed” the local community.

Ventura county, which hugs the Californian coast northwest of Los Angeles, is known for a pleasant Mediterranean climate cooled slightly by the proximity of the ocean. But Jackson said that recent heatwaves have seen warm air flow down from mountains in the nearby Los Padres National Forest to the coast, while the ocean itself is being roiled by escalating temperatures. “It’s been really remarkable to see it get that hot right up to the coast,” he said.

California is in the grip of its most severe drought in 1,200 years and scientists say this is fueling the heat seen in many places in the state – Los Angeles has warmed by 2.3C (4.2F) since 1895, while Santa Barbara has jumped by 2.4C (4.38F) – by reducing moisture in soils, which then bake more quickly.

Higher temperatures are also worsening the risk of wildfires in the state. “We lost everything,” said Tyler Suchman, founder of online marketing firm Tribal Core who in 2017 fled with his wife to escape a huge wildfire that razed their home in Ojai, in Ventura county. “It was harrowing. The winds were blowing like crazy and the hills lining the highway were all on fire, I had never seen anything like it.”

Just 11 months later, a separate wildfire destroyed the couple’s next home, in Malibu, as their neighbor scooped up water from his hot tub in a desperate attempt to tackle the flames. “No one wants us to move next to them now,” Suchman said. “You can see how the area has changed over the 18 years since we moved to Ojai. It’s a beautiful place but regretfully we can’t live there now, the risk is too great.”

Hotspots of above-average warming are found across the US. Grand county in Utah, a place of sprawling deserts, cliffs and plateaus, is the second fastest warming county in the lower 48 states, while every county in New Jersey, Massachusetts and Connecticut has warmed by more than 1.5C (2.7F) since 1895.

It’s the more northern latitudes that have experienced the most extreme recent heat, however, with counties in Alaska making up all of the top six fastest warming places since 1970 (comparable temperature data for Alaska does not go back further than the 1920s). Alaska’s North slope, situated within the rapidly warming Arctic, has heated up by an enormous 3.7C (6.6F) in just the past 50 years.

“There really is a climate shift underway in Alaska, everyone can see things are different than they used to be and everyone is concerned about what the future here will look like,” said Brettschneider, who said even his teenage children have noticed the retreat of sea ice, an elongating fire season and a dearth of cold days.

The warmth is also melting frozen soils, known as permafrost, causing buildings to subside and roads to buckle. “If you drive on the roads near Fairbanks you better have a strong stomach because it feels like you’re riding a rollercoaster,” said Katharine Hayhoe, a climate scientist at Texas Tech University and chief scientist at the Nature Conservancy.

Other locations traditionally accustomed to severe cold have also seen sharp temperature increases. Roseau and Kittson counties, in northern Minnesota, are both in the top five fastest warming counties in the lower 48 states, with their warming driven by winters that have heated up by around 3.8C (7F) in the state since modern record keeping began.

Winters are warming more quickly than summers because more heat usually escapes the land during the colder months, but it is now being trapped by greenhouse gases. “Some might say ‘well I like warmer winters’ but people are noticing negative impacts, such as changes to the growing season and the loss of cultural practices such as cross-country skiing races,” said Heidi Roop, a climate scientist at the University of Minnesota. “Even small temperature changes have big consequences.”

Globally, governments set a goal in the 2015 Paris climate agreement to avoid a temperature rise of 1.5C (2.7F) above the pre-industrial era. Beyond this point, scientists say, the world will face increasingly punishing heatwaves, storms, flooding and societal unrest.

While certain areas of the US have already passed 1.5C, the important metric is still the global average, Hayhoe said. “In some places a 2C increase is fine but 2.5C is when the wheels fall off the bus, some locations are OK with 5ft of sea level rise because of their elevation while others can’t cope with 5 in because they are low-lying,” she said. “Local vulnerability is very customized. What’s relevant for communities is whether the world meets its targets or not, it’s a collective target for the world.”

That global threshold is in severe peril, with some forecasts warning that 1.5C (2.7F) could be breached within a decade without drastic cuts to carbon emissions. Communities will need to brace themselves for the consequences of this, according to Roop.

“The warming we are seeing is pushing at the bounds of lived human experience, of what we thought was possible,” she said. “We are paying the costs for that and we need to prepare for the changes already set in motion, as well as to prevent further warming.”

HOMICIDE & ECOCIDE

FPSO Explodes and Sinks off Nigeria with 10 Aboard 

oil storage vessel explodes and catches fire off Nigeria
Trinity Spirit caught fire and sank off the Nigerian coast

PUBLISHED FEB 3, 2022 3:07 PM BY THE MARITIME EXECUTIVE

 

An aging tanker that was operating as a floating production, storage and offloading vessel (FPSO) exploded and caught fire before sinking off the coast of Nigeria on Wednesday, February 2. Ten people working aboard the vessel were reported missing and presumed killed.

The Trinity Spirit, which was 274,774 dwt and measured 1,105 feet, was built in 1976. Shebah Exploration and Production Company Limited (SEPCOL), the operator of the Nigerian oil field, reports that an offshore company purchased the FPSO Trinity Spirit from ConocoPhillips and leased the vessel to SEPCOL on a bareboat basis. The FPSO was serving as the primary production facility for OML 108 in Nigeria's offshore Ukpokiti oil field, located near the Niger Delta.

The company indicated that the Trinity Spirit could process up to 22,000 barrels of oil per day and inject up to 40,000 barrels of water per day. Its storage capacity was 2 million barrels of oil, although it is not known how much was aboard at the time. The Nigerian oil ministry indicated that the Ukpokiti oil field was not in production in 2020 and 2021, and some reports indicate that the company was in financial trouble and that the Nigerian authorities revoked the production license in 2019.

Details about the incident are scarce. Local media outlets report that there were one or more explosions aboard the vessel early on Wednesday, followed by a raging fire. Pictures from the scene show the vessel settling amidships with the fire concentrated near the accommodation block and bridge.

SEPCOL issued a brief statement saying that the vessel had caught fire after an explosion and confirming that 10 people were working aboard. It said that attempts to contain the situation were being made with help from local communities and other oil field companies working in the area.

"The cause of the explosion is currently being investigated and we are working with necessary parties to contain the situation," said the company's chief executive Ikemefuna Okafor in the written statement. "We appreciate the assistance provided us by the Clean Nigeria Associates, the Chevron team operating in the nearby Escravos facility, and our community stakeholders as well as fishermen, who have been of tremendous assistance since the incident happened.”

Nigerian officials said efforts were underway to contain the environmental damage from the vessel and that they planned to conduct a full investigation into the cause of the fire.



Design Efforts Proceeding for Mobile Test Vessel for Tidal Energy

mobile test vessel for tidal energy research
Preliminary conceptual design for the mobile test vessel for tidal energy research (IDOM)

PUBLISHED FEB 4, 2022 6:00 PM BY THE MARITIME EXECUTIVE

 

The U.S. Department of Energy, in collaboration with engineering firm IDOM and Florida Atlantic University’s Southeast National Marine Renewable Energy Center, is moving forward with plans to develop a vessel designed to test tidal energy technologies in a range of real-world environments. The project, which was selected by DOE in December 2020, has issued a request for information soliciting feedback from developers of current energy converters (CECs) on how the mobile test vessel can best support the development of these technologies.

“As the marine energy industry continues to advance technologies towards commercialization, there is an ongoing need for testing at all levels of technological development,” they write in the solicitation for input. “The slow pace of design and in-water testing cycles is further exacerbated by the limited availability of testing infrastructure at various scales, complex and time-consuming permitting processes, and expensive environmental monitoring.”

The vessel concept was proposed by IDOM and Florida Atlantic University in response to the DOE’s effort to support foundational research and development and expand testing capacity to advance the marine energy industry. DOE believes the vessel can address the gap in the testing capabilities as part of its broader program to accelerate research in the field.

Existing testing infrastructure in the U.S. can only accommodate small-scale current energy converters with rotors two to three meters in diameter. DOE believes there is a need for a mobile testing capability that can accommodate CECs with rotors up to eight meters in diameter, allowing turbines to be tested under a wide range of flow conditions.

The concept for the mobile test vessel (MTV) is that it would accommodate a wide range of turbine types and sizes, current speeds, depths, wave conditions, and seabed types. The MTV would support anchoring and mooring for the testing of CECs in rivers, tidal channels, and the open sea. It would also potentially incorporate an onboard power generation system to support test setup, maintenance, inspection, testing, and data acquisition services.

The RFI seeks information from developers and others involved in the research to understand technologies that would utilize the MTV as well as how the MTV can best support the development of CEC technologies and testing. The responses will be used for strategic planning to ensure the MTV’s capabilities are aligned with industry needs. Based on the input, the project seeks to define the MTV’s main requirements and its final configuration.

The effort to develop the designs for the test vessel comes as the DOE announced a new round of grants to support testing of wave energy projects at an offshore facility. DOE awarded $25 million in funding to support eight projects that will make up the first round of open-water testing at the PacWave South test site located off the Oregon coast. Oregon State University is targeting the summer of 2023 for the first tests at its offshore facility.
 

IMPERIALISM AT SEA

Ocean Shipping Reform Act to Strengthen FMC Reaches U.S. Senate

reform of FMC reaches US Senate
(file photo)

PUBLISHED FEB 4, 2022 2:25 PM BY THE MARITIME EXECUTIVE

 

The U.S. Senate is set to take up reform of the federal regulations for the global shipping industry focusing on the challenges that American exporters have been experiencing due to port congestion and disruptions in the global supply chain.

On Thursday, Senators John Thune of South Dakota and Amy Klobuchar of Minnesota introduced the Ocean Shipping Reform Act following the U.S. House of Representatives passage of its version of the bill in December 2021. The bill, which represents the first reform of the Federal Maritime Commission and its authorities in decades, has already met with strong criticism from the shipping industry.

The Senate version of the bill, which has wide support from a broad coalition from the agricultural sector as well as manufacturers and related businesses such as trucking, is similar to the House version that passed with a strong bipartisan vote. According to the senators, the bill would level the playing field for American exporters by making it harder for ocean carriers to unreasonably refuse goods ready to export at ports, and it would give the FMC greater rulemaking authority to regulate the carriers’ business practices.

“Congestion at ports and increased shipping costs pose unique challenges for U.S. exporters, who have seen the price of shipping containers increase four-fold in just two years. Meanwhile, ocean carriers have reported record profits,” said Klobuchar. “This legislation will level the playing field by giving the Federal Maritime Commission greater authority to regulate harmful practices by carriers and set rules on what fees carriers can reasonably charge shippers and transporters.”

Among the specific provisions contained in the draft are rules to prohibit ocean carriers from “unreasonably” declining U.S. exports. The FMC would be given greater authority to oversee the export practices as well as to register the shipping exchanges focusing on contract negotiations. The FMC would also be able to self-initiate investigations into the business practices of the carriers as well as enforcement actions. To create greater transparency, carriers would be required to report quarterly to the FMC on their import and export volumes and the number of containers brought to the U.S. on each vessel.

“Producers across America are paying the price,” said Thune arguing that ocean carriers do not operate under fair and transparent rules. “The improvements made by this bill would provide the FMC with the tools necessary to address unreasonable practices by ocean carriers, holding them accountable for their bad-faith efforts that disenfranchise American producers.”

The World Shipping Council lobbying on behalf of ocean carriers was quick to renew its criticism of the efforts. “The deeply flawed bill passed by the House at the end of last year would place government officials in the role of second-guessing commercially negotiated service contracts and dictating how carriers operate ship networks – an approach that would make the existing congestion worse and stifle innovation,” said John Butler, President & CEO of the WSC.

Butler contends that “when ships cannot get into port to discharge and load cargo because of landside logistics breakdowns, it is clear that further regulating ocean carriers will not solve the deeper challenges in U.S. supply chains.” He argues that carriers have deployed every available ship and container to move the record levels of cargo. The World Shipping Council says it hopes to work with Congress to seek real solutions that further strengthen the ocean transportation system.

Is it Time to Amend the Law of the Sea?

UNEP
Illustration courtesy UNEP

PUBLISHED FEB 4, 2022 3:21 PM BY BRIAN GICHERU KINYUA

 

“The dark oceans were the womb of life: from the protecting oceans life emerged.” These were the words of a prophetic speech by Arvid Pardo to a UN meeting in 1967, one of the first sessions convened to deliberate the creation of a body of laws to govern the global oceans. The process culminated in the modern United Nations Convention on the Law of the Sea (UNCLOS), adopted by the UN in 1982, and Arvid Pardo would go on to earn credit as the father of UNCLOS.

Since its adoption, UNCLOS has remained the most far-reaching treaty ever negotiated under the auspices of the UN, and the foundation of the global attempt to regulate the maritime domain.

While UNCLOS is fairly settled on many questions of governance, contemporary challenges such as climate change, protection of high seas fisheries and management of strategic ocean spaces, like the South China Sea, are prompting a debate over reviewing the established law.

For example, climate change poses considerable challenges for the future application and interpretation of UNCLOS.

“Whilst this is not a completely new dynamic - in that the changing ocean and coastline conditions have always had to be addressed by the law of the sea - rapid climate change and its impact upon the oceans has the potential to impact upon nearly all aspects of ocean activity. Particularly, its unpredictable consequences for many coastal states,” write Donald Rothwell and Tim Stephens in their book, The International Law of the Sea.

This concern has jolted Pacific States from Kiribati to Tuvalu to map their remote islands in a bid to claim permanent exclusive economic zones (EEZs), stretching 200 nautical miles, irrespective of future sea level rise. As global warming leads to rising seas, Pacific nations fear their islands could eventually be flooded, shrinking their EEZs and their rights to fishing and seabed resources within their boundaries. Therefore, they are trying to lock in the existing zones now.

A 2018 resolution by The International Law Association supported the vulnerable islands, arguing that any maritime zones determined under UNCLOS “should not be required to be recalculated should the sea level change affect the geographical reality of the coastline.”

The enforcement and dispute resolution mechanism of UNCLOS has been brought into question in the wake of rising tensions in the South China Sea. China has been a party to UNCLOS since ratifying the treaty in 1996. Notwithstanding this, China has refused to accept a major ruling by the Permanent Court of Arbitration in The Hague, which found its claims in the South China Sea inconsistent with UNCLOS. The five-judge tribunal hearing the case was established under the compulsory dispute settlement provisions of UNCLOS, and its ruling should be final and legally binding on all parties concerned. Unfortunately, due to the lack of an enforcement mechanism, China can still assert and pursue its claims in the South China Sea, even if the legal basis for such activity is untenable. In light of this, international law analysts say a dilemma arises over whether China should withdraw from UNCLOS.

The effectiveness of the UNCLOS dispute resolution mechanism is also covered in an ongoing House of Lords inquiry in the UK, which is currently examining whether UNCLOS is fit for purpose in the 21st century.

Meanwhile, in view of the ever-increasing human rights violations at sea, especially for seafarers and fishermen, does UNCLOS provide for their protection?

This is a relatively new question for legal scholars. Usually, whenever a discussion of human rights at sea and its connection to UNCLOS arises, it is dismissed on the grounds that the issue is already addressed in discrete sections of the treaty (specific segments addressing modern slavery, human trafficking, and search and rescue at sea), supplemented by the 2006 Maritime Labour Convention and IMO guidelines.

Elizabeth Mavropoulou, head of Research at Human Rights at Sea, suggests that this perspective reflects a narrow understanding of human rights at sea, incorrectly equating it to minimum labor and welfare standards onboard vessels. She recommends that human rights at sea should be a central theme of UNCLOS, alongside other pertinent questions on ocean governance.

Indeed, amending UNCLOS to address some of the emergent issues could be a genuine challenge. However, we must also consider that UNCLOS as drafted may not be up to date with our fast-changing world, given the modern realities of climate change and geopolitical rivalry.

Video: Oil Tanker Taking on Water After Hitting Breakwater 

tanker listing after hitting breakwater in Taiwan
Torm Emilie struck the breakwater arriving in Taiwan

PUBLISHED FEB 4, 2022 4:28 PM BY THE MARITIME EXECUTIVE

 

[Brief]  A Danish-registered product tanker clipped the breakwater while arriving in Kaohsiung, Taiwan earlier this week. The vessel took a severe list before authorities cleared it to dock and begin an emergency unloading.

The 74,999 dwt Torm Emilie was entering the port of Kaohsiung when it apparently misjudged the harbor entrance and hit the southern breakwater on February 1. The tanker, which was built in 2004, reportedly breached its hull in the ballast tanks; while no oil was spilled, the vessel took on a list reported to be as much as 14 degrees.

Inbound from Kuwait and the UAE after a stop in Singapore, the 748-foot tanker was carrying a load of an oil-derivative product known as naphtha. Authorities reported that there was no danger of an oil leak, but they made immediate arrangements for the vessel to come alongside at the Intercontinental Wharf. An oil boom was strung around the vessel while crews began pumping off both the cargo and the water. Later reports indicated that the list had been reduced to 9 degrees. It was expected to take at least two days to unload the tanker before repairs could begin on its hull.