Monday, July 05, 2021

Astronomers discover an oversized black hole population in the star cluster Palomar 5

UNIVERSITY OF BARCELONA

Research News

"The number of black holes is roughly three times larger than expected from the number of stars in the cluster, and it means that more than 20% of the total cluster mass is made up of black holes. They each have a mass of about 20 times the mass of the Sun, and they formed in supernova explosions at the end of the lives of massive stars, when the cluster was still very young" says Prof Mark Gieles, from the Institute of Cosmos Sciences of the University of Barcelona (ICCUB) and lead author of the paper.

Tidal streams are streams of stars that were ejected from disrupting star clusters or dwarf galaxies. In the last few years, nearly thirty thin streams have been discovered in the Milky Way halo. "We do not know how these streams form, but one idea is that they are disrupted star clusters. However, none of the recently discovered streams have a star cluster associated with them, so we cannot be sure. So, to understand how these streams formed, we need to study one with a stellar system associated with it. Palomar 5 is the only case, making it a Rosetta Stone for understanding stream formation, and that is why we studied it in detail," explains Gieles.

The authors simulated the orbits and the evolution of each star from the formation of the cluster until its final dissolution, varying the initial properties of the cluster until they found a good match with observations of the stream and the cluster. The team found that Palomar 5 formed with a lower black hole fraction, but stars escaped more efficiently than black holes, so the black hole fraction gradually increased. The black holes dynamically puffed up the cluster through gravitational slingshot interactions with stars, which led to even more escaping stars and the formation of the stream. Just before it completely dissolves, roughly a billion years from now, the cluster will consist entirely of black holes. "This work has helped us understand that even though the fluffy Palomar 5 cluster has the brightest and longest tails of any cluster in the Milky Way, it is not unique. Instead, we believe that many similarly puffed-up, black hole-dominated clusters have already disintegrated in the Milky Way's tides to form the recently discovered thin stellar streams," says co-author Dr. Denis Erkal at the University of Surrey.

Gieles points out that in this paper "we have shown that the presence of a large black hole population may have been common in all the clusters that formed the streams". This is important for our understanding of globular cluster formation, the initial masses of stars and the evolution of massive stars. This work also has important implications for gravitational waves. "It is believed that a large fraction of binary black hole mergers form in star clusters. A big unknown in this scenario is how many black holes there are in clusters, which is hard to constrain observationally because we cannot see black holes. Our method gives us a way to learn how many BHs there are in a star cluster by looking at the stars they eject," says Dr. Fabio Antonini from Cardiff University, a co-author of the paper.

Palomar 5 is a globular cluster discovered in 1950 by Walter Baade. It is in the Serpens constellation at a distance of about 80,000 light-years, and it is one of the roughly 150 globular clusters that orbit around the Milky Way. It is older than 10 billion years, like most other globular clusters, meaning that it formed in the earliest phases of galaxy formation. It is about 10 times less massive and 5 times more extended than a typical globular cluster, and it is in the final stages of dissolution.

###

 

Psychedelic spurs growth of neural connections lost in depression

YALE UNIVERSITY

Research News

The psychedelic drug psilocybin, a naturally occurring compound found in some mushrooms, has been studied as a potential treatment for depression for years. But exactly how it works in the brain and how long beneficial results might last is still unclear.

In a new study, Yale researchers show that a single dose of psilocybin given to mice prompted an immediate and long-lasting increase in connections between neurons. The findings are published July 5 in the journal Neuron.

"We not only saw a 10% increase in the number of neuronal connections, but also they were on average about 10% larger, so the connections were stronger as well," said Yale's Alex Kwan, associate professor of psychiatry and of neuroscience and senior author of the paper.

Previous laboratory experiments had shown promise that psilocybin, as well as the anesthetic ketamine, can decrease depression. The new Yale research found that these compounds increase the density of dendritic spines, small protrusions found on nerve cells which aid in the transmission of information between neurons. Chronic stress and depression are known to reduce the number of these neuronal connections.

Using a laser-scanning microscope, Kwan and first author Ling-Xiao Shao, a postdoctoral associate in the Yale School of Medicine, imaged dendritic spines in high resolution and tracked them for multiple days in living mice. They found increases in the number of dendritic spines and in their size within 24 hours of administration of psilocybin. These changes were still present a month later. Also, mice subjected to stress showed behavioral improvements and increased neurotransmitter activity after being given psilocybin.

For some people, psilocybin, an active compound in "magic mushrooms," can produce a profound mystical experience. The psychedelic was a staple of religious ceremonies among indigenous populations of the New World and is also a popular recreational drug.

It may be the novel psychological effects of psilocybin itself that spur the growth of neuronal connections, Kwan said.

"It was a real surprise to see such enduring changes from just one dose of psilocybin," he said. "These new connections may be the structural changes the brain uses to store new experiences."

###

 

Global BECCS potential is largely constrained by sustainable irrigation

NATIONAL INSTITUTE FOR ENVIRONMENTAL STUDIES

Research News

A new collaborative study led by researchers from the National Institute for Environmental Studies, Potsdam Institute for Climate Impact Research, Ritsumeikan University, and Kyoto University found that although unlimited irrigation could increase global BECCS potential (via increased bioenergy production) by 60-71% by the end of this century, sustainably constrained irrigation would increase it by only 5-6%. The study was published in Nature Sustainability on July 5.

Bioenergy with carbon capture and storage (BECCS) is a process that extracts bioenergy from biomass, then captures the carbon and stores it in a geological reservoir. It is a negative emission technology because the plants that produce the biomass take up carbon dioxide from the atmosphere through photosynthesis. Many previous studies assumed that large-scale deployment of BECCS would be prominent in achieving the 2°C or 1.5°C climate goal. However, this has raised concerns about the demands placed on water and land resources to grow the bioenergy crops. For example, existing studies have shown that the irrigation needed for bioenergy crop production at the scale the 2°C or 1.5°C climate goal requires would lead to even more severe water stress than climate change itself.

In this context, where and to what extent irrigation can enhance the global BECCS potential under sustainable water use remains unknown. "Here, we define it as water use securing the local and downstream water availability for conventional water use and environmental flow requirements, suppressing nonrenewable water resources withdrawal, and preventing additional water stress," explains lead author Zhipin Ai from the National Institute for Environmental Studies, Japan.

The study was based on simulations with a spatially explicit representation of bioenergy crop plantations and the water cycle in an internally consistent model framework. To quantify the constraints of irrigation water resources, the researchers designed distinct irrigation scenarios (unlimited irrigation, sustainable irrigation, and no irrigation), with bioenergy crops planted under land-use scenarios with strict land protections to prevent adverse effects on biodiversity, food production, land degradation, and desertification due to large-scale land conversion.

The study found that, under the rainfed condition, the average global BECCS potential in 2090 was 0.82-1.99 Gt C yr-1. The BECCS potential reached 1.32-3.42 Gt C yr-1 (60-71% increases compared to the rainfed condition) under full irrigation, whereas under sustainable irrigation it was 0.88-2.09 Gt C yr-1 (5-6% increases compared to the rainfed condition). The BECCS potential under sustainable irrigation is close to the lower limit of 1.6-4.1 Gt C yr-1, the amount of BECCS required in 2100 to be consistent with the 1.5°C or 2°C climate goal as documented in the IPCC Special Report on Global Warming of 1.5°C.
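The percentage increases quoted above can be checked directly from the reported ranges. A minimal sketch (the figures are the ranges quoted in the article; the helper and variable names are mine):

```python
# Quick check of the reported increases in global BECCS potential in 2090.
# Figures are the ranges quoted in the article, in Gt C per year.
rainfed = (0.82, 1.99)
full_irrigation = (1.32, 3.42)
sustainable_irrigation = (0.88, 2.09)

def pct_increase(base, new):
    """Percentage increase of `new` over `base`."""
    return 100 * (new - base) / base

full = [pct_increase(b, n) for b, n in zip(rainfed, full_irrigation)]
sust = [pct_increase(b, n) for b, n in zip(rainfed, sustainable_irrigation)]

# Close to the reported 60-71% and 5-6% increases; small differences
# come from rounding of the quoted input ranges.
print([round(p) for p in full], [round(p) for p in sust])
```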

Given the many negative environmental impacts of large-scale deployment of BECCS, the researchers suggest that comprehensive assessments of the BECCS potential that consider both potential benefits and adverse effects are necessary for simultaneously achieving the multiple sustainable development goals on climate, water, land, etc. "In addition, considering the relatively low biophysically constrained BECCS potential under sustainable water and land use scenarios, a critical reexamination of the contribution of BECCS towards achieving the Paris Agreement goal is needed." says co-author Vera Heck from the Potsdam Institute for Climate Impact Research.

###

This study was supported by the Environment Research and Technology Development Fund (JPMEERF20202005, JPMEERF15S11418, and JPMEERF20211001) of the Environmental Restoration and Conservation Agency of Japan.

#STOPWOLFHUNTS

Hunting and hidden deaths led to 30% reduction in WI wolf population

UNIVERSITY OF WISCONSIN-MADISON

Research News

MADISON, Wis. -- About 100 additional wolves died over the winter in Wisconsin as a result of the delisting of grey wolves under the Endangered Species Act, alongside the 218 wolves killed by licensed hunters during Wisconsin's first public wolf hunt, according to new research.

The combined loss of 313 to 323 wolves represents a decline in the state's wolf population of between 27% and 33% between April 2020 and April 2021. Researchers estimate that a majority of these additional, uncounted deaths are due to something called cryptic poaching, where poachers hide evidence of illegal killings.

The findings are the first estimate of Wisconsin's wolf population since the public hunt in February, which ended early after hunters exceeded the quota of 119 wolves within a few days. These population estimates can help the Wisconsin Department of Natural Resources (DNR) prepare for the next legally mandated wolf hunt this fall.

They also provide guidance to other states planning wolf hunts following the removal of federal protections announced in November 2020 and effective January 2021.

University of Wisconsin-Madison environmental studies scientists Adrian Treves, Francisco Santiago-Ávila and Karann Putrevu performed the research, which was published July 5 in the journal PeerJ.

Under a variety of population growth scenarios, the researchers estimate that Wisconsin now hosts between 695 and 751 wolves, compared with at least 1,034 wolves last year. The scientists say this likely represents the maximum current wolf population, because they incorporated optimistic assumptions about population growth and low poaching rates into their models.
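The reported decline follows directly from these counts. A back-of-envelope sketch (numbers from the article; variable names are mine):

```python
# Population decline implied by the article's counts.
april_2020 = 1034          # minimum wolf count, April 2020
april_2021 = (695, 751)    # estimated range, April 2021

declines = [100 * (april_2020 - n) / april_2020 for n in april_2021]
# 695 survivors implies a roughly 33% decline and 751 a roughly 27%
# decline, matching the reported 27-33% range.
print([round(d) for d in declines])
```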

This decline is despite the hunting quota of 119 wolves for non-native hunters, set with the goal of helping maintain but not reduce the state's wolf population. Ojibwe Tribes were granted a quota of 81 wolves, but they did not conduct a hunt.

"Although the DNR is aiming for a stable population, we estimate the population actually dropped significantly," says Treves, a professor in the Nelson Institute for Environmental Studies and director of the Carnivore Coexistence Lab at UW-Madison.

The new study suggests that about one-third of the population decline is due to hidden deaths in the wolf population, resulting from relaxed legal protections.

Previous research by the Treves lab showed that wolf population growth declined in Wisconsin and Michigan when legal protections were relaxed, regardless of the number of wolves legally killed. And Santiago-Ávila led research that found that Wisconsin's wolves and the heavily monitored Mexican wolves of the American Southwest disappeared at greater rates when lethal control methods were allowed.

Other studies by the lab on attitudes toward wolves suggest that when governments allow lethal management, would-be poachers are inclined to kill more wolves because the relaxed policies signal that predators are less valued.

Those previous findings helped Santiago-Ávila, Putrevu and Treves model the uncounted deaths in Wisconsin since last November.

"During these periods, we see an effect on poaching, both reported and cryptic. Those wolves disappear and you never find them again," says Santiago-Ávila, a postdoctoral researcher in the lab. "Additional deaths are caused simply by the policy signal, and the wolf hunt adds to that."

Treves and his team estimate that the population could recover in one to two years without hunting. Wisconsin law requires a wolf hunt between November and February when hunting is not prohibited by federal protections.

Following the federal delisting of wolves that became effective in January 2021, the DNR initially planned to conduct the first hunt in November 2021. But after a lawsuit, the DNR immediately implemented a wolf hunt at the end of February.

The research team hopes that the Wisconsin DNR and other states' natural resource agencies take advantage of their methods to develop a more complete assessment of the effect of new policies on predator populations.

"These methods and models are freely available to these agencies," says Putrevu, a doctoral student who also researches tiger populations in the Russian Far East. "They should take advantage of the best available science to meet their stated goals."

###

--Eric Hamilton, (608) 263-1986, eshamilton@wisc.edu

 

Being clean and hygienic need not impair childhood immunity

Peer-reviewed | opinion | people

UNIVERSITY COLLEGE LONDON

Research News

The theory that modern society is too clean, leading to defective immune systems in children, should be swept under the carpet, according to a new study by researchers at UCL and the London School of Hygiene & Tropical Medicine.

In medicine, the 'hygiene hypothesis' states that early childhood exposure to particular microorganisms protects against allergic diseases by contributing to the development of the immune system.

However, a prevailing public narrative holds that Western 21st-century society is too hygienic, meaning toddlers and children are less exposed to germs in early life and so become more susceptible to allergies.

In this paper, published in the Journal of Allergy and Clinical Immunology, researchers point to four significant reasons which, they say, disprove this theory and conclude we are not "too clean for our own good".

Lead author, Emeritus Professor of Medical Microbiology Graham Rook (UCL Infection & Immunity), said: "Exposure to microorganisms in early life is essential for the 'education' of the immune and metabolic systems.

"Organisms that populate our guts, skin and airways also play an important role in maintaining our health right into old age: so throughout life we need exposure to these beneficial microorganisms, derived mostly from our mothers, other family members and the natural environment.

"But for more than 20 years there has been a public narrative that hand and domestic hygiene practices, that are essential for stopping exposure to disease-causing pathogens, are also blocking exposure to the beneficial organisms.

"In this paper, we set out to reconcile the apparent conflict between the need for cleaning and hygiene to keep us free of pathogens, and the need for microbial inputs to populate our guts and set up our immune and metabolic systems."

In a review of evidence, the researchers point to four factors.

  • Firstly, the microorganisms found in a modern home are, to a significant degree, not the ones that we need for immunity.
  • Secondly, vaccines, in addition to protecting us from the infection that they target, do a lot more to strengthen our immune systems*, so we now know that we do not need to risk death by being exposed to the pathogens.
  • Thirdly, we now have concrete evidence that the microorganisms of the natural green environment are particularly important for our health; domestic cleaning and hygiene have no bearing on our exposure to the natural environment.
  • Finally, recent research** demonstrates that when epidemiologists find an association between cleaning the home and health problems such as allergies, this is often not caused by the removal of organisms, but rather by exposure of the lungs to cleaning products that cause a type of damage that encourages the development of allergic responses.

Professor Rook added: "So cleaning the home is good, and personal cleanliness is good, but, as explained in some detail in the paper, to prevent spread of infection it needs to be targeted to hands and surfaces most often involved in infection transmission. By targeting our cleaning practices, we also limit direct exposure of children to cleaning agents.

"Exposure to our mothers, family members, the natural environment, and vaccines can provide all the microbial inputs that we need. These exposures are not in conflict with intelligently targeted hygiene or cleaning."

###

* Vaccinology: time to change the paradigm? The Lancet Infectious Diseases 2020

** Food allergy as a biological food quality control system. Cell 2021

** Does the epithelial barrier hypothesis explain the increase in allergy, autoimmunity and other chronic conditions? Nature Reviews Immunology 2021

The 1936 North American heat wave hit Toronto hard — temperatures reach 40 °C
Randi Mann
Listen to The Weather Network's This Day in Weather History podcast on this topic.
This Day In Weather History is a daily podcast by The Weather Network that features stories about people, communities, and events and how weather impacted them.

--

On Sunday, July 5, 1936, one of Canada's deadliest heat waves hit Manitoba and Ontario. It was part of the 1936 North American heat wave. It took place during the Great Depression and Dust Bowl.

© Provided by The Weather Network / City of Toronto Archives

In North America, the heat wave killed more than 5,000 people and destroyed a vast number of crops. The weather event set many record highs that held until the 2012 North American heat wave. The 1936 heat wave was also followed by one of the continent's coldest winters.


In late June, temperatures started to exceed 38 °C across the United States. The Midwest faced some of its hottest temperatures on record. In the Northeast, temperatures reached approximately 35 °C.

In July, North Dakota reached a record 49 °C; still the hottest temperature in the state's history. Many other states set record highs during the month.

In Canada, Ontario and Manitoba reached 43 °C, tying previous heat records. By July 5, Ontario was in a drought. The area along what is now the QEW corridor, from Hamilton to Niagara and Lake Erie, was described as "parched waste" in the Toronto Daily Star.

© Provided by The Weather Network / Courtesy of The Toronto Daily Star

By July 9, temperatures surpassed 40 °C. Areas in Toronto were referred to as “downtown slums” and “districts of torture.” Drivers were lined up on Fleet Street in hopes of getting some lake breeze.


By July 15, temperatures finally dropped out of the 40s and 30s and sat in the high 20s. By then, the heat wave had killed more than 200 people in Toronto. The overall death toll in Canada was around 1,180.

To learn more about the 1936 heat wave, listen to today's episode of "This Day In Weather History."

This Day In Weather History is a daily podcast by The Weather Network that features unique and informative stories from host Chris Mei.

Thumbnail: Courtesy of City of Toronto Archives

Chocolate fix: How the cocoa industry could end deforestation in West Africa

Sophia Carodenuto, Assistant Professor of Geography, University of Victoria 

Despite a significant uptick in corporate sustainability efforts in the cocoa sector, it is nearly impossible for most chocolate consumers to know the amount of tropical deforestation associated with their sweet luxury.
© (AP Photo/Rebecca Blackwell) A farmer walks past cocoa pods growing on a tree on a cocoa farm in Ivory Coast.

The cocoa bean is the fundamental and irreplaceable ingredient in chocolate. Cocoa beans come from trees that require specific climates and pollination systems. These conditions are found in and around tropical forest ecosystems.

As global demand for chocolate increases due to increasing awareness of the potential health benefits of dark chocolate and rising disposable incomes in emerging economies, cocoa farms are replacing the last remaining biodiversity hotspots.

Cocoa production is highly concentrated in a few countries in West Africa. Côte d’Ivoire and Ghana together produce around 62 per cent of cocoa globally.

Despite recent media attention exposing illegal deforestation from cocoa farming in critically protected ecosystems, these countries have recently experienced the world's highest rates of increase in deforestation.

In 2018, Ghana saw a 60 per cent increase in forest loss compared to 2017, the largest annual increase in the world. Côte d’Ivoire was second at 26 per cent. Deforestation has irreversible negative impacts on biodiversity, soil health and the adaptive capacity of ecosystems in the face of climate change.

With my team at the University of Victoria, my research project Follow the Bean: Tracing Zero Deforestation Cocoa identified three of the main challenges to halting the deforestation embedded in global cocoa supply chains.
The “first mile” of supply chain traceability

The first challenge lies in the need to know the precise origins of cocoa beans in order to determine whether the farm where they were grown replaced primary forest. Tracing cocoa beans to the farm level is uniquely difficult in the West African cocoa sector because production is spread among millions of smallholder farms of roughly three to five hectares.

Cocoa is generally produced on small farms because it's difficult to introduce machines to do the work. Cocoa trees require regular pruning and chemical treatments to combat pests and disease. In addition, the fruit that produces cocoa beans ripens intermittently, so farmers harvest by hand.
© (Sophia Carodenuto) A farmer dries recently harvested cocoa beans under the sun.

Nobody knows precisely how many cocoa farmers are operating in the West African region. Past estimates suggest two million farmers depend on cocoa in the region, which is likely an underestimate considering a recent study found 1.5 million children working on cocoa farms.

Due to the complexities of local land tenure, farm boundaries are generally not publicly registered. The lack of a public map of smallholder cocoa farmers makes it difficult to know precisely where cocoa originates.

Some open-source maps are now tracing cocoa to the cooperative level. A major challenge moving forward will be to trace cocoa from farm to the first point of aggregation in the supply chain, also known as “first mile” traceability. This is important because not all cocoa goes through co-operatives.
The indirect supply chain

The US$140 billion chocolate industry’s most recent response to the deforestation challenge has been the creation of sustainability programs for their direct supply chains. However, estimates suggest that at least half of the cocoa supply in Côte d’Ivoire is sourced indirectly.

Indirect sourcing means that cocoa is bought through local traders, many of whom operate informally with limited public oversight. Very little is known about these local traders, although they often have a bad reputation for taking advantage of vulnerable farmers in need of immediate cash.

Read more: Ghana's cocoa production relies on the environment, which needs better protection

Sustainability programs often aim to eliminate these intermediaries. However, given the prevalence of the indirect supply chain, this solution might result in significant unemployment and related socio-economic implications in areas where the rural economy depends entirely on cocoa production and trade.

Together with Janina Grabs, our collaborative research aims to explore whether and how traders, including informal traders, might help to roll out sustainability programs. Because traders are often the farmer’s only point of contact with the supply chain, their role in relaying information and providing incentives for improved production practices may prove critical.
Power and accountability

Cocoa is one of the most consolidated sectors in the world, with three companies controlling 60 per cent of the cocoa traded globally. These companies are not consumer-facing brands, and for many people they might be "the largest company you've never heard of," according to Ian Welsh from Innovation Forum.

© (Shutterstock) Chocolate bars line the shelves at a supermarket in Russia.

This situation carries opportunities but also risks. On the one hand, companies are increasingly partnering with governments, think tanks and civil society organizations to meet sustainability targets. On the other hand, there is a growing tendency for the big players in the cocoa industry to abandon third-party sustainability standards such as Fair Trade, instead opting for designing and implementing their own sustainable sourcing programs.

In this situation, companies choose for themselves which information to disclose to their customers. Corporate sustainability reporting often portrays happy cocoa farmers who have benefited from their sustainability programs, but customers are not informed about the complexity of issues such as indirect sourcing.
What to do?

Although there are artisan chocolatiers making small batches of bean-to-bar chocolate products, the vast majority of chocolate consumers will remain in the dark and unable to determine the impacts of their purchases until governments, industry and consumers demand more accountability.

For most conventional chocolate products, it remains impossible to trace cocoa origins to the extent required to determine deforestation. However, independent reviews such as the Chocolate Scorecard are making great strides in providing transparency to consumers interested in buying “ethical” chocolate.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Sophia Carodenuto receives funding from the Social Sciences and Humanities Research Council of Canada (SSHRC).
CANADA
Tory MP fighting gay 'blood ban' says Parliament could have worked 'better' on C-6



OTTAWA — Eric Duncan has spoken more about his sexuality in the past two years than he says he has in his entire 33 years.
© Provided by The Canadian Press

The Conservative member of Parliament has talked openly in the House of Commons about being gay while pushing the Liberal government to fulfil a two-time campaign pledge to end the ban on blood donations from men who have sex with other men.


This session of Parliament has taken on a highly partisan atmosphere that Prime Minister Justin Trudeau has recently described as toxic, intensifying speculation that Canadian voters may head to the polls before MPs return from the summer break.

In a recent interview with The Canadian Press, though, Duncan described being able to shine a light on an issue that affects him deeply, with some encouragement from a caucus colleague, as an example of how Parliament can work.

Canada introduced a lifetime ban for gay men in 1992 and in 2013 changed it so blood would be accepted from a man who abstained from sex with another man for at least five years.

The waiting period then dropped to one year, and became three months in 2019.

That year, Duncan became a rookie MP — the first openly gay man to be elected under the Conservative Party of Canada banner.


Duncan is not the first MP who has called for an end to the ban, but he did so in a particularly personal way last November. He repeatedly pressed Health Minister Patty Hajdu at a committee hearing in the House of Commons on whether she would accept his blood, which he said he couldn't donate despite the need created by the COVID-19 pandemic.

Hajdu agreed the policy was discriminatory, but said it was ultimately under the control of the independent Canadian Blood Services and Hema-Quebec.

Duncan said afterwards a few Liberal MPs thanked him for raising it.

He credits the idea to the person who sat behind him: longtime Conservative MP and health critic Michelle Rempel Garner.

Improving equality for those in the LGBTQ community is something Rempel Garner says she has long worked to do.

She also spoke out in defense of Wiccans in the House at Samhain!

"I said to Eric, would you be willing to push (Hajdu) on this? Because I know you feel passionately about it. Would you push her on it and push her hard, like, and I'm talking about asking her if she would take your blood," she said.

"And he's like, 'Wow.'"

Duncan says initially he didn't feel confident, being a rookie MP set to go toe-to-toe with a senior cabinet minister. He says he also wasn't sure he wanted to share his personal story.

Rempel Garner said the fact he did took courage.
#BDS
Nordic fund KLP excludes 16 companies over links to Israeli settlements in West Bank


By Gwladys Fouche and Simon Jessop
© Reuters/AMMAR AWAD FILE PHOTO: A Jewish settler walks past Israeli settlement construction sites around Givat Zeev and Ramat Givat Zeev in the Israeli-occupied West Bank

OSLO (Reuters) - Norway's largest pension fund KLP said on Monday it would no longer invest in 16 companies including Alstom and Motorola because of their links to Israeli settlements in the West Bank.

Along with a number of other countries, Norway considers the settlements a breach of international law. A 2020 United Nations report said it had found 112 companies that have operations linked to the region, home to around 650,000 Israelis.

The companies, which span telecoms, banking, energy and construction, all help facilitate Israel's presence and therefore risk being complicit in breaches of international law, which runs counter to KLP's ethical guidelines, it said in a statement.

"In KLP's assessment, there is an unacceptable risk that the excluded companies are contributing to the abuse of human rights in situations of war and conflict through their links with the Israeli settlements in the occupied West Bank," KLP said.

The move by KLP follows a decision by Norway's sovereign wealth fund in May to exclude two companies linked to construction and real estate in the Palestinian territories.

KLP said it had sold shares in the companies worth 275 million Norwegian crowns ($31.81 million) and as of June had completed the process. In Motorola and Alstom, it had also sold its bond holdings.

Selling Motorola Solutions was "a very straightforward decision", KLP said, as its video security and software were used in border surveillance.

Motorola and Alstom did not immediately reply to requests for comment.

A senior member of the Palestine Liberation Organization (PLO) welcomed KLP's move.

"The Norwegian step is significant to stop dealing with companies that support settlements on Palestinian land. We welcome it, and we urge other countries to take similar steps," Wasel Abu Youssef told Reuters.

"After the United Nations announced its blacklist of companies that operate in settlements, all countries must either suspend the work of these companies or boycott them."

Telecoms companies including Bezeq and Cellcom Israel were removed as the services they provide help make the settlements more attractive residential areas, KLP said, while banks including Leumi helped finance the infrastructure.

In a similar vein, construction and engineering groups such as Alstom and local peers Ashtrom and Electra were responsible for building the infrastructure, while Paz Oil helped power them.

The other companies to be excluded were: Bank Hapoalim, Israel Discount Bank, Mizrahi Tefahot Bank, Delek Group, Energix Renewable Energies, First International Bank of Israel and Partner Communications.

Mizrahi Tefahot Bank and Partner Communications declined to comment. The other companies did not immediately reply to requests for comment.

Telecoms company Altice, which was listed until January 2021, was also excluded.

Altice did not immediately reply to a request for comment.

($1 = 8.6460 Norwegian crowns)

(Reporting by Gwladys Fouche in Oslo and Simon Jessop in London, Steven Scheer and Maayan Lubell in Jerusalem and Ali Sawafta in Ramallah, editing by Louise Heavens)

Astronomers Use Artificial Intelligence to Reveal the Actual Shape of the Universe

Using AI-driven data analysis to peel back the noise and find the actual shape of the Universe. Credit: The Institute of Statistical Mathematics

Japanese astronomers have developed a new artificial intelligence (AI) technique to remove noise in astronomical data due to random variations in galaxy shapes. After extensive training and testing on large mock data created by supercomputer simulations, they then applied this new tool to actual data from Japan’s Subaru Telescope and found that the mass distribution derived from using this method is consistent with the currently accepted models of the Universe. This is a powerful new tool for analyzing big data from current and planned astronomy surveys.

Wide area survey data can be used to study the large-scale structure of the Universe through measurements of gravitational lensing patterns. In gravitational lensing, the gravity of a foreground object, like a cluster of galaxies, can distort the image of a background object, such as a more distant galaxy. Some examples of gravitational lensing are obvious, such as the “Eye of Horus”. The large-scale structure, consisting mostly of mysterious “dark” matter, can distort the shapes of distant galaxies as well, but the expected lensing effect is subtle. Averaging over many galaxies in an area is required to create a map of foreground dark matter distributions.
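The averaging step can be illustrated with a toy calculation. All numbers below (the shear value, the intrinsic scatter, the galaxy count) are illustrative assumptions, not figures from the study; the point is only that the per-galaxy uncertainty shrinks as 1/sqrt(N) when many galaxies in a patch are averaged:

```python
import numpy as np

rng = np.random.default_rng(0)

true_shear = 0.02         # hypothetical lensing distortion in one sky patch
shape_noise_sigma = 0.26  # assumed intrinsic ellipticity scatter per galaxy

# Each observed galaxy shape = lensing shear + intrinsic shape ("shape noise"),
# so a single galaxy tells us almost nothing about the subtle lensing signal.
n_galaxies = 100_000
observed = true_shear + rng.normal(0.0, shape_noise_sigma, n_galaxies)

# Averaging over many galaxies beats the noise down by ~1/sqrt(N).
estimate = observed.mean()
error = shape_noise_sigma / np.sqrt(n_galaxies)

print(f"estimated shear: {estimate:.4f} +/- {error:.4f}")
```

With one galaxy the noise (0.26) swamps the signal (0.02); with 100,000 galaxies the statistical error falls to about 0.0008, small enough to map the foreground mass.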

But this technique of looking at many galaxy images runs into a problem; some galaxies are just innately a little funny looking. It is difficult to distinguish between a galaxy image distorted by gravitational lensing and a galaxy that is actually distorted. This is referred to as shape noise and is one of the limiting factors in research studying the large-scale structure of the Universe.

To compensate for shape noise, a team of Japanese astronomers first used ATERUI II, the world’s most powerful supercomputer dedicated to astronomy, to generate 25,000 mock galaxy catalogs based on real data from the Subaru Telescope. They then added realistic noise to these perfectly known artificial data sets, and trained an AI to statistically recover the lensing signal of the foreground dark matter from the mock data.
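The team's actual tool is a generative adversarial network (see the reference below). As a much simpler stand-in for the same train-on-mocks idea, the sketch below fits a plain linear denoiser to pairs of clean and noisy mock signals and applies it to held-out mock data; the data sizes, noise level, and the least-squares model are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock "clean" 1-D lensing signals and noisy observations (a toy stand-in for
# the 25,000 simulated catalogs; sizes and noise level are illustrative).
n_mocks, n_pix = 2000, 32
clean = np.cumsum(rng.normal(size=(n_mocks, n_pix)), axis=1)  # correlated signals
noisy = clean + rng.normal(0.0, 2.0, size=(n_mocks, n_pix))   # add shape-like noise

# Learn a linear denoiser W by least squares: minimize ||noisy @ W - clean||^2.
W, *_ = np.linalg.lstsq(noisy, clean, rcond=None)

# Evaluate on held-out mock data the model never saw during training.
test_clean = np.cumsum(rng.normal(size=(500, n_pix)), axis=1)
test_noisy = test_clean + rng.normal(0.0, 2.0, size=(500, n_pix))
denoised = test_noisy @ W

err_raw = np.mean((test_noisy - test_clean) ** 2)  # error before denoising
err_dn = np.mean((denoised - test_clean) ** 2)     # error after denoising
print(f"raw MSE: {err_raw:.2f}, denoised MSE: {err_dn:.2f}")
```

Because the denoiser is trained only on simulations where the truth is known, it can then be applied to real observations where it is not; this is the same logic behind training the GAN on supercomputer mocks before turning it on the Subaru data.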

After training, the AI was able to recover previously unobservable fine details, helping to improve our understanding of the cosmic dark matter. Then using this AI on real data covering 21 square degrees of the sky, the team found a distribution of foreground mass consistent with the standard cosmological model.

“This research shows the benefits of combining different types of research: observations, simulations, and AI data analysis,” comments Masato Shirasaki, the leader of the team. “In this era of big data, we need to step across traditional boundaries between specialties and use all available tools to understand the data. If we can do this, it will open new fields in astronomy and other sciences.”

Reference: “Noise reduction for weak lensing mass mapping: an application of generative adversarial networks to Subaru Hyper Suprime-Cam first-year data” by Masato Shirasaki, Kana Moriwaki, Taira Oogi, Naoki Yoshida, Shiro Ikeda and Takahiro Nishimichi, 9 April 2021, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stab982