Tuesday, March 02, 2021

Neanderthal and early modern human stone tool culture co-existed for over 100,000 years

Research from the University of Kent's School of Anthropology and Conservation has discovered that one of the earliest stone tool cultures, known as the Acheulean, likely persisted for tens of thousands of years longer than previously thought.

UNIVERSITY OF KENT

Research News

The Acheulean was estimated to have died out around 200,000 years ago but the new findings suggest it may have persisted for much longer, creating over 100,000 years of overlap with more advanced technologies produced by Neanderthals and early modern humans.

The research team, led by Dr Alastair Key (Kent) alongside Dr David Roberts (Kent) and Dr Ivan Jaric (Biology Centre of the Czech Academy of Sciences), made the discovery whilst studying stone tool records from different regions across the world. Using statistical techniques new to archaeological science, the archaeologists and conservation experts were able to reconstruct the end of the Acheulean period and re-map the archaeological record.

Previously, researchers assumed a rapid shift between the earlier Acheulean stone tool designs, often associated with Homo heidelbergensis - the common ancestor of modern humans and Neanderthals - and the more advanced 'Levallois' technologies created by early modern humans and Neanderthals. However, the study sheds new light on this transition, suggesting substantial overlap between the two technologies.

Acheulean stone tool technologies are the longest-lived cultural tradition practiced by early humans. Originating in East Africa 1.75 million years ago, handaxes and cleavers - the stone tool types which characterise the period - went on to be used across Africa, Europe and Asia by several different species of early human. Prior to this discovery, it was widely assumed that the Acheulean period ended between 300,000 and 150,000 years ago. However, the record was lacking in specific dates, and the timing of its demise has been heavily debated. The Kent and Czech team discovered that the tradition likely ended at different times around the world, varying from as early as 170,000 years ago in Sub-Saharan Africa through to as late as 57,000 years ago in Asia.

To understand when the Acheulean ended, the team collected information on different archaeological sites from around the world to find the latest known stone tool assemblages. A statistical technique known as optimal linear estimation - commonly used in conservation studies to estimate species extinctions - was used to predict how much longer the stone tool tradition continued after the most recent known sites. In effect, the technique was able to model the portion of the archaeological record yet to be discovered.
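The study's optimal linear estimation technique weights the youngest known records to project the unobserved endpoint. As a rough illustration of the underlying idea - a simplified stand-in, not the model actually used in the paper - one can project past the youngest known site by the average gap between dated sites, on the assumption that sites are spread roughly uniformly in time (the example site ages below are hypothetical):

```python
def estimated_end_ka(site_ages_ka):
    """Project how far beyond the youngest known site a tradition
    likely persisted, assuming sites are roughly uniform in time.
    Ages are in thousands of years before present (ka BP).
    A simplified stand-in for optimal linear estimation."""
    ages = sorted(site_ages_ka, reverse=True)  # oldest first
    n = len(ages)
    if n < 2:
        raise ValueError("need at least two dated sites")
    # Average spacing between dated sites across the observed range
    mean_gap = (ages[0] - ages[-1]) / (n - 1)
    # The tradition likely ended about one average gap more recently
    # than the youngest known assemblage
    return ages[-1] - mean_gap

# Hypothetical late-Acheulean site ages (ka BP), for illustration only
print(estimated_end_ka([300, 260, 240, 214, 190]))  # → 162.5
```

In effect, the sparser the record of dated sites, the further beyond the youngest known site the tradition is inferred to have continued.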

Dr Alastair Key, a Palaeolithic Archaeologist and the lead author of the study, said: 'The earliest archaeological record will always be an incomplete picture of early human behaviour, so we know that the youngest known Acheulean sites are unlikely to actually represent the final instances of these technologies being produced. By allowing us to reconstruct these missing portions of the archaeological record, this technique not only gives us a more accurate understanding of when the tradition ended, but it gives us an indication of where we can expect to find new archaeological discoveries in the future.'

Dr Roberts added: 'This technique was originally developed by myself and a colleague to date extinctions, as the last sighting of a species is unlikely to be the date when it actually became extinct. It is exciting to see it applied in a new context.'

###

Their research paper 'Modelling the end of the Acheulean at global and continental levels suggests widespread persistence into the Middle Palaeolithic' is published by Humanities & Social Sciences Communications. doi:10.1057/s41599-021-00735-8


Watch: Recycled cotton becomes new fabric

LUND UNIVERSITY


A lot of us recycle our old textiles, but few of us know that they are very difficult to re-use, and often end up in landfills anyway. Now, researchers at Lund University in Sweden have developed a method that converts cotton into sugar, which in turn can be turned into spandex, nylon or ethanol.

WATCH: New method transforms old cotton into glucose
https://www.youtube.com/watch?v=B1V--prLs08

Every year, an estimated 25 million tonnes of cotton textiles are discarded around the world. In total, 100 million tonnes of textiles are thrown out. In Sweden, most of the material goes straight into an incinerator and becomes district heating. In other places, it is even worse, as clothes usually end up in landfills.

"Considering that cotton is a renewable resource, this is not particularly energy-efficient", says Edvin Ruuth, researcher in chemical engineering at Lund University.

"Some fabrics still have such strong fibres that they can be re-used. This is done today and could be done even more in future. But a lot of the fabric that is discarded has fibres that are too short for re-use, and sooner or later all cotton fibres become too short for the process known as fibre regeneration."

At the Department of Chemical Engineering in Lund where Edvin Ruuth works, there is a great deal of accumulated knowledge about using micro-organisms and enzymes, among other things, to transform the "tougher" carbohydrates in biomass into simpler molecules. This means that everything from biological waste and black liquor to straw and wood chips can become bioethanol, biogas and chemicals.

Now the researchers have also succeeded in breaking down the plant fibre in cotton - the cellulose - into smaller components. However, no micro-organisms or enzymes are involved this time; instead, the process involves soaking the fabrics in sulphuric acid. The result is a clear, dark, amber-coloured sugar solution.

"The secret is to find the right combination of temperature and sulphuric acid concentration", explains Ruuth, who fine-tuned the 'recipe' together with doctoral student Miguel Sanchis-Sebastiá and professor Ola Wallberg.

Glucose is a very flexible molecule and has many potential uses, according to Ruuth.

"Our plan is to produce chemicals which in turn can become various types of textiles, including spandex and nylon. An alternative use could be to produce ethanol."

From a normal sheet, they extract five litres of sugar solution, with each litre containing the equivalent of 33 sugar cubes. However, you couldn't turn the liquid into a soft drink as it also contains corrosive sulphuric acid.
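The arithmetic behind that comparison is straightforward; the sketch below assumes a typical sugar-cube mass of about 4 g, a figure not given in the release:

```python
SUGAR_CUBE_G = 4.0        # assumed mass of one sugar cube (not from the study)
litres_per_sheet = 5.0    # sugar solution extracted from one sheet
cubes_per_litre = 33.0    # sugar-cube equivalent per litre, per the release

glucose_g_per_litre = cubes_per_litre * SUGAR_CUBE_G          # ~132 g/L
glucose_g_per_sheet = glucose_g_per_litre * litres_per_sheet  # ~660 g per sheet
```

Under that assumption, a single sheet yields on the order of 600-700 g of glucose in solution.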

One of the challenges is to overcome the complex structure of cotton cellulose.

"What makes cotton unique is that its cellulose has a high crystallinity. This makes it difficult to break down chemically and reuse its components. In addition, there are a lot of surface treatment substances, dyes and other pollutants which must be removed. And structurally, a terrycloth towel and an old pair of jeans are very different", says Ruuth.

"It is thus a very delicate process to find the right acid concentration, the right number of treatment stages and the right temperature."

The concept of hydrolysing pure cotton is nothing new per se, explains Ruuth; it was discovered in the 1800s. The difficulty has been to make the process effective, economically viable and attractive.

"Many people who tried ended up not utilising much of the cotton, while others did better but at an unsustainable cost and environmental impact", says Ruuth.

When he started making glucose out of fabrics a year ago, the return was a paltry three to four per cent. Now he and his colleagues have reached as much as 90 per cent.

Once the recipe formulation is complete, it will be both relatively simple and cheap to use.

However, for the process to become a reality, the logistics must work. There is currently no established way of managing and sorting various textiles that are not sent to ordinary clothing donation points.

Fortunately, a recycling centre unlike any other in the world is currently under construction in Malmö, where clothing is sorted automatically using a sensor. Some clothing will be donated, rags can be used in industry and textiles with sufficiently coarse fibres can become new fabrics. The rest will go to district heating.

Hopefully, the proportion of fabrics going to district heating will be significantly smaller once the technology from Lund is in place.

###

Assessing hemp-containing foodstuff

The BfR recommends acute reference dose as basis for assessing hemp-containing foodstuff

BFR FEDERAL INSTITUTE FOR RISK ASSESSMENT


In order to avoid undesirable effects from the psychoactive substance tetrahydrocannabinol (THC) contained in hemp, the Federal Institute for Health Protection of Consumers and Veterinary Medicine (BgVV) recommended guidance values for maximum THC levels in various food groups in 2000. The guidance value for beverages was set at 0.005 mg/kg, for edible oils at 5 mg/kg and for all other foods at 0.150 mg/kg. In 2018, the BfR came to the conclusion that these values no longer correspond to current scientific knowledge.

Instead, the BfR recommends that the toxicological assessment of hemp-containing foods be carried out on the basis of the acute reference dose (ARfD) of 1 microgram Δ9-THC/kg body weight derived by the European Food Safety Authority (EFSA) in 2015. The ARfD specifies the estimated maximum quantity of a substance that can be consumed with food in the course of one day - either during one meal or during several meals - without a detectable risk to health. From the point of view of the BfR, whether the ARfD can possibly be exceeded should be checked on a case-by-case basis for each product under assessment. The measured THC levels and the estimated consumption quantities are used for this assessment. Information on the latter can be found in the "EFSA Comprehensive European Food Consumption Database".
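The case-by-case check the BfR describes is a simple intake calculation: measured THC level times the quantity consumed, compared against the ARfD scaled to body weight. A minimal sketch - the 70 kg default body weight and the example portions are illustrative assumptions, not BfR figures:

```python
ARFD_UG_PER_KG_BW = 1.0  # EFSA acute reference dose: 1 µg THC per kg body weight

def exceeds_arfd(thc_mg_per_kg_food, portion_g, body_weight_kg=70.0):
    """Check whether one day's consumption of a hemp-containing food
    could exceed the ARfD. THC level is in mg per kg of food."""
    thc_mg = thc_mg_per_kg_food * (portion_g / 1000.0)  # mg/kg food × kg food
    intake_ug_per_kg_bw = (thc_mg * 1000.0) / body_weight_kg  # mg → µg, per kg bw
    return intake_ug_per_kg_bw > ARFD_UG_PER_KG_BW

# Hemp oil at the old 5 mg/kg guidance value (illustrative portions):
print(exceeds_arfd(5.0, portion_g=10.0))  # 10 g → ~0.71 µg/kg bw → False
print(exceeds_arfd(5.0, portion_g=20.0))  # 20 g → ~1.43 µg/kg bw → True
```

Real assessments would draw the consumption quantities from the EFSA consumption database rather than assumed portions.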

###

Link to BfR Opinion No 006/2021 issued 17 February 2021:
https://www.bfr.bund.de/cm/349/the-bfr-recommends-acute-reference-dose-as-basis-for-assessing-hemp-containing-foodstuff.pdf

About the BfR

The German Federal Institute for Risk Assessment (BfR) is a scientifically independent institution within the portfolio of the Federal Ministry of Food and Agriculture (BMEL) in Germany. It advises the German federal government and German federal states ("Laender") on questions of food, chemical and product safety. The BfR conducts its own research on topics that are closely linked to its assessment tasks.

This text version is a translation of the original German text which is the only legally binding version.

The selection of leaders of political parties through primary elections penalizes women

A study by researchers at the UPF Department of Political and Social Sciences, published in the journal Party Politics, concludes that the odds of women candidates winning in primary processes fall relative to other selection methods

UNIVERSITAT POMPEU FABRA - BARCELONA


IMAGE: PREDICTED PROBABILITIES OF WINNING A MIXED-GENDER CONTEST ACROSS GENDER AND TYPE OF SELECTION METHOD.

CREDIT: UPF

A study by two researchers at the UPF Department of Political and Social Sciences (DCPIS) has examined the effect of selecting party leaders by direct vote of the entire membership (a process known in southern Europe as "primaries" and in English-speaking countries as "one-member-one-vote", OMOV) on the likelihood of a woman winning a leadership competition against male rivals.

Javier Astudillo and Andreu Paneque, respectively a tenured lecturer and a PhD researcher at the DCPIS, and members of the Institutions and Political Actors Research Group, are the authors of the article published recently in the journal Party Politics: "Our statistical analysis shows that, controlling for other influential factors, while the probability of a male candidate winning hardly varies according to the type of selection method, in the case of a female candidate, her likelihood of winning falls significantly in primaries compared to other methods", they assert.

In the case of primaries, a female candidate has a 14 percentage points lower likelihood of winning than a male candidate.

The researchers determine that, while under other, less inclusive selection systems men and women have practically the same likelihood of winning the contest (the difference is not statistically significant, although it slightly favours women), in primaries a woman candidate has a 14-percentage-point lower chance of winning than a male candidate.

Contributions to the debate around the primaries selection method

In their study, the authors re-examine the debate currently existing in the specialized literature concerning the possible effects of this system of selecting leaders and candidates in comparison with traditional methods (primarily via congresses of delegates).

One of the most recent proposals, developed by the researchers Rahat, Hazan and Katz, proposes that the selection system by means of primaries involves having to choose between two democratic values: on the one hand, the "inclusion" of members that form a group in their decision-making, and, on the other, the "representation" of social groups that have traditionally been excluded, such as women ("descriptive representation").

In its current format, the system for selecting leaders through primaries implies choosing between "inclusion" and "representation."

To subject this proposal to a more robust empirical test than those conducted to date, Javier Astudillo and Andreu Paneque constructed a database of leadership contests held in the main centre-right and centre-left parties, both national and regional, in eight western democracies (Australia, Austria, Belgium, Canada, Spain, Portugal and the UK) between 1985 and the present. This database contained 608 male and female candidates who contested 168 mixed competitions.

From their analysis, the authors show that female candidates perform worse than male candidates in party primaries, and they argue that in its current format this leadership selection mechanism effectively involves a trade-off between "inclusion" and "representation": "Our study confirms that selecting party leaders through primaries, in its current form, poses a handicap for women trying to break the glass ceiling", they state.

Influence of the type of people competing

The study authors have also considered whether the difference might be due to the fact that primaries differ from other selection systems not only with regard to who participates in the voting (whether rank and file members or party delegates and leaders), but also the type of candidate who stands, whether male or female.

"Our study indicates that even when controlling for 'type' of candidate (measured by age and political experience), women achieve worse results under primaries".

According to the researchers, it may be that in congresses and other more restricted selection processes, the only women who take the step to stand are those already highly confident of their chances of winning, for example because they have extensive experience within the party, whereas in primaries a wider variety of women are encouraged to compete, in the belief that selection by the membership removes the parties' traditional "gatekeeper" barriers.

However, following their analysis, the authors note that their "study indicates that even when controlling for 'type' of candidate (measured by age and political experience), women achieve worse results under primaries".

A problem of "demand" and a proposal for improvement

These results lead the researchers to believe that there is really a problem of "demand" (that is, in the selectors) and not so much a problem of "supply" (of those standing): "Further studies should clarify whether this is so due to a simple matter of party membership underestimating women's leadership, or if it hides another type of problem", they posit.

However, Javier Astudillo and Andreu Paneque maintain that these results need not lead to the conclusion that primaries should be avoided, and argue that other mechanisms can be explored to make the democratic values of inclusion and representation compatible within an institution: "One possible way could be the combination of mixed co-leaderships, as introduced in several parties, and selection by means of primaries, with one contest for men and another for women. Such a 'quota' system would ensure the presence of women in party leadership", they conclude.

###

Reference work: Astudillo, J., Paneque, A. (January 2021). "Do Party Primaries Punish Women? Revisiting the trade-off between the inclusion of party members and the selection of women as party leaders". Party Politics

The missing trillions

Establishing the true cost of the planet's energy and transport systems

UNIVERSITY OF SUSSEX


IMAGE: BENJAMIN K SOVACOOL, PROFESSOR OF ENERGY POLICY IN THE SCIENCE POLICY RESEARCH UNIT (SPRU) AT THE UNIVERSITY OF SUSSEX BUSINESS SCHOOL.

CREDIT: UNIVERSITY OF SUSSEX

The hidden social, environmental and health costs of the world's energy and transport sectors are equal to more than a quarter of the globe's entire economic output, new research from the University of Sussex Business School and Hanyang University reveals.

According to analysis carried out by Professor Benjamin K. Sovacool and Professor Jinsoo Kim, the combined externalities for the energy and transport sectors worldwide are an estimated average of $24.662 trillion - equivalent to 28.7% of global Gross Domestic Product.

The study found that the true cost of coal should be more than twice as high as current prices when factoring in the currently unaccounted financial impact of externalities such as climate change, air pollution and land degradation.

The study authors say the research highlights the market failure of the world's energy systems. Factoring in their true costs by including social costs almost equal to production costs, would make many fossil fuelled and nuclear power stations economically unviable, the research published in Energy Research & Social Science found. Even wind, solar, hydro, and other renewable energy systems have their own hidden costs.

Benjamin K. Sovacool, Professor of Energy Policy in the Science Policy Research Unit (SPRU) at the University of Sussex Business School, said: "Our research has identified immense hidden costs that are almost never factored into the true expense of driving a car or operating a coal-powered power station. Including these social costs would dramatically change least-cost planning processes and integrated resource portfolios that energy suppliers and others depend upon.

"It is not that these costs are never paid by society, they are just not reflected in the costs of energy. And unfortunately these hidden costs are not distributed equally or fairly. The most affected parties are under-represented in the marketplace, and have external costs imposed upon them, whether that be the families forced to live in areas of highest air pollution and toxicity because they have no other choice to the inhabitants of low-lying island states such as the Maldives or Vanuatu who are threatened most immediately by rising sea levels."

Professor Jinsoo Kim, from the Department of Earth Resources and Environmental Engineering at Hanyang University, said: "Our study clearly reveals oil, coal, and waste in electricity portfolios generate far more externalities than alternative sources of supply. If you factored in the true cost of fossil fuels, the multi-national giants that dominate this sector would be huge loss-making operations. Instead, it is left to society and government to pick up the considerable bill."

The researchers sought to establish the range and scope of externalities - that is, the costs or benefits of economic activity that fall on people other than those engaged in it, and for which no proper compensation is paid - associated with electricity supply, energy efficiency, and transport.

To do so, they carried out a meta-analysis and research synthesis of 139 studies with 704 distinct estimates of externalities: 83 studies (with 318 observations) for electricity supply, 13 studies (with 13 observations) for energy efficiency, and 43 studies (with 373 observations) for transport.

They found that coal accounts for by far the largest share of energy externalities ($4.78 trillion, or 59%) followed by oil (more than $2 trillion, 26%) and gas ($552 billion, or 7%) across the four largest energy markets of China, Europe, India, and the United States.

The study found coal to have about three times as many negative externalities as solar PV, five times as many as wind energy, and 155 times as many as geothermal energy.

The researchers found that the externalities of coal amounted to 14.5 ¢/kWh, compared to its levelized cost of energy (LCOE) of between 6.6 and 15.2 ¢/kWh. Similarly, natural gas combined cycle turbines have externalities of 3.5 ¢/kWh and an LCOE of 4.4 to 6.8 ¢/kWh.
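Those figures imply that pricing in externalities would roughly double or triple the delivered cost of coal power while adding comparatively little to gas. A quick comparison, using only the ranges quoted above:

```python
def true_cost_range(lcoe_low, lcoe_high, externality):
    """Add the external cost (¢/kWh) to a levelized-cost range."""
    return (lcoe_low + externality, lcoe_high + externality)

# Ranges quoted in the study (¢/kWh)
coal = true_cost_range(6.6, 15.2, 14.5)  # → roughly 21.1 to 29.7 ¢/kWh
gas = true_cost_range(4.4, 6.8, 3.5)     # → roughly 7.9 to 10.3 ¢/kWh

# At the low end of coal's LCOE, externalities more than triple its cost
ratio_low = (6.6 + 14.5) / 6.6  # ≈ 3.2×
```

This is the sense in which the study calls coal's market price less than half its true cost: the external 14.5 ¢/kWh is comparable to, or larger than, the entire production cost.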

Prof Sovacool said: "The challenge is for policymakers, regulators, and planners to ensure that electricity and transport markets function as they should and accurately price the trillions of dollars in external costs that the energy and mobility industries surreptitiously shift to society currently.

"At the moment, consumers have become shielded from the true costs of energy extraction, conversion, supply, distribution or use, which means the immense ecological or community impacts of our existing systems becomes far less discernible. The fundamental policy question is whether we want global markets that manipulate the presence of externalities to their advantage, or a policy regime that attempts to internalize them."

Prof Kim said: "Our findings are timely and we hope it will help inform the design of Green New Deals or post-pandemic Covid-19 recovery packages around the world.

"Some of the most important commonalities of many stimulus packages have been bailouts for the fossil fuel, automotive and aeronautic industries but a global and national recovery may not be sustainable if the true cost of these industries is not correctly factored in."


The time is ripe! An innovative contactless method for the timely harvest of soft fruits

Scientists develop convenient technique to measure ripeness using laser-induced plasma shockwaves and the ensuing vibrations on the fruit's surface

SHIBAURA INSTITUTE OF TECHNOLOGY


IMAGE: A PULSED LASER IS FOCUSED BY A LENS ONTO A POINT CLOSE TO THE SURFACE OF THE FRUIT. THE LASER-INDUCED PLASMA CREATES A SHOCKWAVE THAT EXCITES RAYLEIGH WAVES ON THE...

CREDIT: SHIBAURA INSTITUTE OF TECHNOLOGY

Most people are probably familiar with the unpleasant feeling of eating overripe or underripe fruit. Those who work in agriculture are tasked with ensuring a timely harvest so that ripeness is at an optimal point when the fruit is sold, both to minimize the amount of fruit that goes to waste and maximize the quality of the final product. To this end, a number of techniques to assess fruit ripeness have been developed, each with their respective advantages and disadvantages depending on the type of produce.

Although biochemical and optical methods exist, mechanical techniques are the most widely used. They indirectly assess ripeness based on the fruit's firmness. In turn, firmness is quantified by observing the vibrations that occur on the fruit when mechanical energy is accurately delivered through devices such as hammers, pendulums, or speakers. Unfortunately, these approaches fall short for softer fruits, which are more readily damaged by the contact devices used.

In a recent study published in Foods, a team of scientists from Shibaura Institute of Technology (SIT), Japan, addressed this issue with an innovative method for measuring the firmness of soft fruits using laser-induced plasma (LIP). This work is a follow-up to a previous study in which LIP was used to quantify the firmness of harder fruits.

But what is LIP and how is it used? Plasma is a state of matter similar to the gaseous state but in which most particles have an electric charge. This energetic state can be produced in normal air by focusing a high-intensity laser beam onto a small volume. Because the generated plasma "bubble" is unstable, it immediately expands, sending out shockwaves at ultrasonic speeds. Professor Naoki Hosoya and colleagues at SIT had successfully used LIP shockwaves generated close to the surface of apples to excite a type of vibration called 0S2 mode, colloquially referred to as "football mode vibration" because of how the resulting deformation looks on spherical bodies. They then verified that the frequency of the 0S2 mode vibrations was correlated with the firmness of the fruit.

However, soft fruits do not exhibit 0S2 mode vibrations, so the team had to analyze an alternative type of oscillation: Rayleigh waves. These are waves that occur exclusively on the surface of bodies without penetrating far into the interior. Using Kent mangoes, a setup for generating LIP, and commercially available laser-based vibrometers, the scientists verified that the velocity at which Rayleigh waves propagate is directly related to the firmness of the mangoes. Because the propagation velocity markedly decreases with storage time, it provides a reliable way to indirectly assess ripeness.
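The core measurement is a time-of-flight estimate: the wave's arrival is recorded at two points a known distance apart, and the delay between the two signals gives the propagation velocity. A minimal sketch of that idea using cross-correlation to find the delay (the sampling rate, spacing, and synthetic wave packets below are illustrative assumptions, not the study's actual setup):

```python
import numpy as np

def surface_wave_speed(sig_a, sig_b, spacing_m, fs_hz):
    """Estimate surface-wave speed from two vibrometer signals.
    The lag maximising the cross-correlation gives the travel time
    between the two measurement points."""
    lags = np.arange(-len(sig_a) + 1, len(sig_b))
    xcorr = np.correlate(sig_b, sig_a, mode="full")
    travel_time_s = lags[np.argmax(xcorr)] / fs_hz
    return spacing_m / travel_time_s

fs = 100_000                      # 100 kHz sampling (illustrative)
t = np.arange(0, 0.01, 1 / fs)
# Synthetic wave packet passing point A, then point B 2 ms later
at_a = np.exp(-((t - 0.002) / 0.0002) ** 2)
at_b = np.exp(-((t - 0.004) / 0.0002) ** 2)

v = surface_wave_speed(at_a, at_b, spacing_m=0.02, fs_hz=fs)  # 2 cm apart
print(v)  # → 10.0 m/s for this synthetic example
```

A falling propagation velocity over storage time would then indicate softening, and hence ripening, of the fruit.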

The team went further and looked for the best position on the mangoes' surface to determine the velocity of Rayleigh waves. Mangoes, as well as other soft fruits, have large seeds inside, which can alter the propagation of surface waves in ways that are detrimental to measurements. "The results of our experiments indicate that Rayleigh waves along the 'equator' of the mango are better for firmness assessment compared to those along the 'prime meridian'," explains Hosoya. The experiments also revealed that cavities within the fruit's flesh or decay can greatly affect the results of the measurements. Thus, as Hosoya adds, they will keep investigating which is the best area to measure firmness in mangoes using their novel approach.

In short, the team at SIT has engineered an innovative strategy to assess the ripeness of soft fruits from outside. "Our system," remarks Hosoya, "is suitable for non-contact and non-destructive firmness assessment in mangoes and potentially other soft fruits that do not exhibit the usual 0S2 mode vibrations." Further refinement of such firmness assessment methods will hopefully make them more reliable and accessible for the agricultural industry. With any luck, their widespread adoption will ensure that fruits reach your plate only when the time is ripe!

###

Reference

Title of original paper: Soft Mango Firmness Assessment Based on Rayleigh Waves Generated by a Laser-Induced Plasma Shock Wave Technique
Journal: Foods
DOI: 10.3390/foods10020323

About Shibaura Institute of Technology (SIT), Japan

Shibaura Institute of Technology (SIT) is a private university with campuses in Tokyo and Saitama. Since the establishment of its predecessor, Tokyo Higher School of Industry and Commerce, in 1927, it has maintained "learning through practice" as its philosophy in the education of engineers. SIT was the only private science and engineering university selected for the Top Global University Project sponsored by the Ministry of Education, Culture, Sports, Science and Technology and will receive support from the ministry for 10 years starting from the 2014 academic year. Its motto, "Nurturing engineers who learn from society and contribute to society," reflects its mission of fostering scientists and engineers who can contribute to the sustainable growth of the world by exposing their over 8,000 students to culturally diverse environments, where they learn to cope, collaborate, and relate with fellow students from around the world.

Website: https://www.shibaura-it.ac.jp/en/

About Professor Naoki Hosoya from SIT, Japan

Dr. Naoki Hosoya currently leads the Mechanical Dynamics Laboratory at SIT, where he performs research on various topics within the field of mechanical engineering, including vibration analysis and engineering, structural dynamics, soft actuators, non-destructive tests, and acoustic and modal analysis. He has authored over 140 published papers, some of which have been among the top 10 most read in their respective journals.

Funding Information

This study was partly supported by the Tojuro Iijima Foundation for Food Science and Technology, grant number 25, and by the Japan Society for the Promotion of Science under its Grants-in-Aid for Scientific Research Programs (Grants-in-Aid for Scientific Research (B), Project No. JP 19H02088).


Reinforced by policies, charters segregate schools

CORNELL UNIVERSITY


ITHACA, N.Y. - The expansion of charter schools in the 2000s led to an increase in school segregation and a slight decline in residential segregation, according to new research from Cornell University providing the first national estimates of the diverging trends.

According to the study, the average district that expanded charter school enrollment between 2000 and 2010 experienced a 12% increase in white-Black school segregation and a 2% decrease in white-Black residential segregation.

The patterns moved in opposite directions, the research found, because charter schools - which receive public funds but operate independently - weaken the traditional link between neighborhood and school assignment, allowing families to choose more racially homogenous schools regardless of where they live.

The findings highlight education policy's influence beyond schools and offer a "cautionary lesson" about continued charter expansion without efforts to limit racial sorting by families, according to lead author Peter Rich.

Understanding charter schools' effects on segregation is critical, because they represent an increasingly popular educational reform, the researchers said. Charter school enrollment has quadrupled since 2000, serving nearly 6% of students in 2015-2016, and is expected to continue growing and gaining influence.

The researchers analyzed more than 1,500 metropolitan school districts to examine what happened when school choice decoupled neighborhood and school options, using data from the census and the National Center for Education Statistics' Common Core of Data.

The researchers said their findings reveal school and residential segregation as "more like eddies in a stream, circling and reinforcing each other via policies and preferences."

The analysis did not find that charter school expansion affected white-Hispanic school segregation, because Hispanic students on average attend more diverse charter schools. White-Hispanic residential segregation, however, did fall as charter enrollment grew.

Though the reductions in residential segregation were "nontrivial," the researchers said, policy makers should not see school choice as a tool for achieving residential diversity, given how it exacerbated school segregation.

###

The study, "Segregated Neighborhoods, Segregated Schools: Do Charters Break a Stubborn Link?", was published March 1 in the journal Demography.


-30-

Natural product isolated from sea sponge tested against cancer cells

FAR EASTERN FEDERAL UNIVERSITY


IMAGE: FEFU LAB FOR DNA DIAGNOSTICS, EQUIPMENT

CREDIT: FEFU PRESS OFFICE

Scientists from Far Eastern Federal University (FEFU), together with Russian and German colleagues, continue studying antitumor compounds synthesized on the basis of bioactive molecules isolated from a sea sponge. One of them fights cancer cells resistant to standard chemotherapy and, at the same time, has an interesting dual mechanism of action. A related article appears in Marine Drugs.

Scientists have tested the biological effect of the marine alkaloid 3,10-dibromofascaplysin on various prostate cancer cells, including those resistant to standard docetaxel-based chemotherapy. The compound was first isolated from the sea sponge Fascaplysinopsis reticulata and subsequently chemically synthesized at FEFU. The substance forces tumor cells to die via a programmed cell death mechanism. This process, called "apoptosis," is considered the most favorable mode of action for anticancer drugs.

"The examined compound, while killing cancer cells, even ones resistant to standard chemotherapy, simultaneously activates an enzyme (so-called «kinase») protecting these tumor cells. However, it can't be considered as a "good" or "bad" effect. This is just a mechanism of action, an understanding of which suggests us to apply 3,10-dibromofascaplysin together with inhibitors of these enzymes," says Dr. Sergey Dyshlovoy, from the Laboratory of biologically active substances of FEFU School of Natural Sciences, senior researcher in the laboratory of the pharmacology of National Scientific Centre of Marine Biology (Vladivostok, Russia).

According to the scientist, the synthesized compound, in addition to its own activity, works well in combination with several already approved anticancer drugs, enhancing their antitumor effect.

Next, the scientists plan to examine how 3,10-dibromofascaplysin affects non-cancer cells. They are already running a related project, supported by the Russian Foundation for Basic Research, and aim to report the outcomes during 2021.

"Fascaplysins are rather toxic to non-cancer cells. In our laboratory, we are trying to modify the structure of these compounds in order to reduce their cytotoxic effect on normal cells, while maintaining the necessary antitumor effect. The goal is to create a substance for targeted therapy, with a minimum of side effects for healthy cells of the body," says Dr. Maxim Zhidkov, Head of Department of Organic Chemistry, FEFU School of Natural Sciences.

Commenting on the time needed to develop a market-ready drug, the scientists estimate a horizon of 10-15 years, given the need for lengthy preclinical and subsequent clinical trials.

###

The study involved specialists from the Far Eastern Federal University; the A.V. Zhirmunsky National Scientific Center of Marine Biology (RAS, Vladivostok); the Martini Clinic (Germany); the University Medical Center Hamburg-Eppendorf (Germany); the V.N. Orekhovich Research Institute of Biomedical Chemistry (Russian Academy of Sciences); and the V.A. Engelhardt Institute of Molecular Biology (RAS, Moscow).

The World Ocean is one of FEFU's priority research areas. University scientists study bioactive molecules of marine origin and develop engineering solutions for Arctic ice platforms (exploration and production of minerals), underwater robotics, under-ice communications, and more.

On calm days, sunlight warms the ocean surface and drives turbulence

OREGON STATE UNIVERSITY

Research News

IMAGE: Clouds form over the Indian Ocean as the sun sets. A new study has found that in tropical oceans, a combination of sunlight and weak winds drives up surface temperatures...

CREDIT: Derek Coffman, NOAA

CORVALLIS, Ore. - In tropical oceans, a combination of sunlight and weak winds drives up surface temperatures in the afternoon, increasing atmospheric turbulence, unprecedented new observational data collected by an Oregon State University researcher shows.

The new findings could have important implications for weather forecasting and climate modeling, said Simon de Szoeke, a professor in OSU's College of Earth, Ocean, and Atmospheric Sciences and the lead author of the study.

"The ocean warms in the afternoon by just a degree or two, but it is an effect that has largely been ignored," said de Szoeke. "We would like to know more accurately how often this is occurring and what role it may play in global weather patterns."

The findings were just published in the journal Geophysical Research Letters. Co-authors are Tobias Marke and W. Alan Brewer of the NOAA Chemical Sciences Laboratory in Boulder, Colorado.

Over land, afternoon warming can lead to atmospheric convection and turbulence and often produces thunderstorms. Over the ocean, the afternoon convection also draws water vapor from the ocean surface to moisten the atmosphere and form clouds. The warming over the ocean is more subtle and gets stronger when the wind is weak, said de Szoeke.

De Szoeke's study of ocean warming began during a research trip in the Indian Ocean several years ago. The research vessel was equipped with Doppler lidar, a remote sensing technology similar to radar that uses a laser pulse to measure air velocity. That allowed researchers to collect measurements of the height and strength of the turbulence generated by the afternoon warming for the first time.

Previous observations of the turbulence over the ocean had been made only by aircraft, de Szoeke said.

"With lidar, we have the ability to profile the turbulence 24 hours a day, which allowed us to capture how these small shifts in temperature lead to air turbulence," he said. "No one has done this kind of measurement over the ocean before."

Researchers gathered data from the lidar around the clock for about two months. At one point, surface temperatures warmed each afternoon for four straight days with calm wind speeds, giving researchers the right conditions to observe a profile of the turbulence created in this type of sea surface warming event.

It took a "perfect storm" of conditions, including round-the-clock sampling by the lidar and a long ocean deployment, to capture these unprecedented observations, de Szoeke said.

When sunlight warms the ocean surface in the afternoon, surface temperatures go up by a degree Celsius or more. This warming occurs on roughly 5% of days in the world's tropical oceans. Those oceans represent about 2% of the Earth's surface, roughly equivalent to the size of the United States.
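The area comparison above can be checked with back-of-the-envelope arithmetic. The sketch below uses approximate reference figures (Earth's total surface area and the area of the United States are assumptions, not values from the study):

```python
# Rough check that 2% of Earth's surface is about the size of the United States.
# Approximate figures (assumed, not from the study):
EARTH_SURFACE_KM2 = 510e6   # total surface area of Earth, ~510 million km^2
US_AREA_KM2 = 9.8e6         # total area of the United States, ~9.8 million km^2

tropical_ocean_km2 = 0.02 * EARTH_SURFACE_KM2  # "about 2% of the Earth's surface"
ratio = tropical_ocean_km2 / US_AREA_KM2

print(f"Tropical oceans in question: {tropical_ocean_km2 / 1e6:.1f} million km^2")
print(f"Ratio to US area: {ratio:.2f}")  # close to 1, matching the comparison
```

The ratio comes out near 1, consistent with the article's "about the equivalent of the size of the United States."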

The calm wind and warming air conditions occur in different parts of the ocean in response to weather conditions, including monsoons and Madden-Julian Oscillation, or MJO, events, which are ocean-scale atmospheric disturbances that occur regularly in the tropics.

To determine the role these changing temperatures play in weather conditions in the tropics, weather models need to include the effects of surface warming, de Szoeke said.

"There are a lot of subtle effects that people are trying to get right in climate modeling," de Szoeke said. "This research gives us a more precise understanding of what happens when winds are low."

###

The research was supported by NOAA and the Office of Naval Research.

Socioeconomic status plays a major role in cognitive outcomes

Childhood cancer and its treatment can result in cognitive struggles. St. Jude scientists are studying the risk factors.

ST. JUDE CHILDREN'S RESEARCH HOSPITAL

Research News

IMAGE: Heather Conklin, PhD, of St. Jude Psychology, contributed to research that studied risk factors of certain cancer treatments in children.

CREDIT: St. Jude Children's Research Hospital

Childhood cancer and its treatment can result in cognitive struggles. Scientists at St. Jude Children's Research Hospital are studying the risk factors. They looked at social and economic factors in children with brain tumors treated with radiation.

These patients have the greatest risk of cognitive problems. Scientists followed a group of St. Jude patients for 10 years. The children all had conformal radiation therapy.

For each patient, researchers looked at certain factors. These included the parent's job, education level, and whether it was a single parent home. The children were from different backgrounds.

The findings show social and economic status is linked to IQ, academics, attention and self-care skills before treatment. The study also shows that this gap widens over time.

"What was most surprising was that for some measures, the contribution of socioeconomic status was even greater than age at treatment, which has typically been the biggest risk factor," said Heather Conklin, PhD, of St. Jude Psychology.

###

Neuro-Oncology published a report on this work.