Tuesday, October 17, 2023

Close connectivity within the North Atlantic Current system identified


Long-term comparative study reveals parallels between time series from Florida and Newfoundland

Peer-Reviewed Publication

MARUM - CENTER FOR MARINE ENVIRONMENTAL SCIENCES, UNIVERSITY OF BREMEN

An acoustic current meter, built into a mooring buoy being prepared for deployment in the Atlantic. Ocean currents are measured with these instruments. Photo: MARUM – Center for Marine Environmental Sciences, University of Bremen; D. Kieke




In addition to wind, temperature and salinity are the driving forces of ocean currents in the North Atlantic. They transport warm, salty water from the subtropical regions northward to the colder, lower-salinity areas. Like the wind-driven Gulf Stream, these currents are components of the Atlantic Meridional Overturning Circulation (AMOC). Because of the immense amount of heat it transports, the AMOC is an important part of the global climate system. For example, it influences regional precipitation as well as strong tropical storms, and acts as a remote heating system for Europe’s climate.

Long-term studies help to document changes and refine the mathematical models of future climate change. Modelling studies indicate that, in a warming world, the expected weakening of the AMOC could have an impact on regional temperature or precipitation patterns. Long-term observation systems have been in place, for example, at 26 degrees north latitude between the east side of Florida and the west coast of Africa. For the first time, researchers from Bremen and Hamburg have compared these data, covering a period of 25 years, with those from an observation system to the north at 47 degrees north between Newfoundland and France.

“We see a statistical connection,” says first author Simon Wett of MARUM – Center for Marine Environmental Sciences and the Institute of Environmental Physics at the University of Bremen. “There are parallels between the data from the two observation systems. Parts of what we observe in the data from our system during this extended time period, we also see a few months later at the 26-degree stations."

The long-term study, extending over the entire span of the Atlantic basin, is based on data from arrays of moorings that collect measurement parameters both near the surface of the ocean as well as at greater depths within the water column. These include, for example, the salinity, water temperature, and strength of the current. The moorings at 47°N were installed as a cooperative project between the University of Bremen and the Federal Maritime and Hydrographic Agency (BSH).

The researchers are convinced that a long time series provides a better foundation for future models to realistically simulate the AMOC. “Of course, we need to continue to monitor the AMOC in order to make long-term assessments and future prognoses,” says Simon Wett. The study by Wett and his colleagues, which covers the past quarter of a century, did not identify a long-term trend of strengthening or weakening of the current.

 

More information:

Participating institutes:

  • Institute for Environmental Physics, University of Bremen, Germany
  • MARUM – Center for Marine Environmental Sciences, University of Bremen, Germany
  • Federal Maritime and Hydrographic Agency, Germany

MARUM produces fundamental scientific knowledge about the role of the ocean and the seafloor in the total Earth system. The dynamics of the oceans and the seabed significantly impact the entire Earth system through the interaction of geological, physical, biological and chemical processes. These influence both the climate and the global carbon cycle, resulting in the creation of unique biological systems. MARUM is committed to fundamental and unbiased research in the interests of society, the marine environment, and in accordance with the sustainability goals of the United Nations. It publishes its quality-assured scientific data to make it publicly available. MARUM informs the public about new discoveries in the marine environment and provides practical knowledge through its dialogue with society. MARUM cooperation with companies and industrial partners is carried out in accordance with its goal of protecting the marine environment.

  

Schematic representation of the most important North Atlantic currents. Red (blue) arrows show the upper (deep) circulation paths. The acronyms indicate the positions of the North Atlantic Current (NAC) and the Eastern Boundary Current (EBC). The black lines show the transport lines of the observatory arrays. Graphic: MARUM – Center for Marine Environmental Sciences, University of Bremen; S. Wett

 

Proof-of-concept method advances bioprocess engineering for a smoother transition to biofuels


Japanese researchers propose green industrial strategy that employs bacterial cells to relay feedback and optimize production

Peer-Reviewed Publication

FUJITA HEALTH UNIVERSITY

Bioprocess optimization can be improved by addressing the problem of process-model mismatch (PMM). An in silico model-based controller and an in-cell feedback controller complement each other's limitations, thus offering an effective solution to the problem of PMM. Credit: the author(s), Nara Institute of Science and Technology, Kyushu University, and Fujita Health University




One of the primary goals of bioprocess engineering is to increase the yield of the desired material while maintaining high production rates and low raw material use. This optimization is usually accomplished by controlling the behavior of the microorganisms used in the process so that their biological capabilities are fully utilized. Such control may be computerized (in silico feedforward control, which predicts the optimal operating conditions from a process model) or autonomous (in-cell feedback control). However, a process-model mismatch (PMM) occurs when there is a discrepancy between the predicted and the actual production process.

A recent paper published in Scientific Reports demonstrates a proof-of-concept method that effectively addresses the issue of PMM. The paper was made available online on September 4, 2023, and was published in Volume 13 of Scientific Reports. “To address the PMM issue with in silico controllers, we propose a hybrid control strategy that combines a high-level in silico feedforward controller and a low-level in-cell controller,” remark doctoral student Tomoki Ohkubo from the Graduate School of Science and Technology at Nara Institute of Science and Technology and Senior Assistant Professor Katsuyuki Kunida from Fujita Health University. The hybrid in silico/in-cell controller (HISICC) proposed in this study combines model-based optimization with synthetic genetic circuits integrated into the cells.

According to the study, PMM can be addressed by integrating two types of controllers into industrial bioprocesses: a feedforward controller run on a monitoring computer and a feedback controller implemented by genetically engineered living cells. A computer, for instance, may suggest the optimal temperature and pH for an industrial bioprocess as a feedforward prediction. Meanwhile, in-cell feedback control provides valuable corrections by detecting the actual intracellular levels of nucleic acids, enzymes, and metabolites, parameters that are difficult to determine using computer-based controllers alone.
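As a rough illustration of this hybrid idea, the sketch below gates a model-derived switching plan with a measured cell-density signal. It is not the authors' implementation; the function names, the switching time, and the density threshold are hypothetical.

```python
# Minimal sketch of a hybrid feedforward / in-cell feedback controller.
# All models, thresholds, and numbers are hypothetical illustrations and are
# not taken from the Scientific Reports paper.

def feedforward_plan(t_hours):
    """High-level in silico controller: a switching time computed offline
    from a (possibly imperfect) process model."""
    planned_switch_time = 8.0           # model-optimal activation time (hypothetical)
    return t_hours >= planned_switch_time

def in_cell_feedback(cell_density):
    """Low-level in-cell controller: the engineered circuit only permits
    switching once the culture itself reports sufficient cell density."""
    density_threshold = 5.0             # hypothetical density threshold
    return cell_density >= density_threshold

def metabolic_switch_on(t_hours, cell_density):
    """The switch activates only when the model-based plan AND the
    in-cell density signal agree, which buffers model errors (PMM)."""
    return feedforward_plan(t_hours) and in_cell_feedback(cell_density)

# Example: if real growth is slower than the model assumed, the in-cell
# feedback keeps the switch off at t = 8 h until density catches up.
print(metabolic_switch_on(8.0, 6.2))   # True: plan and cells agree
print(metabolic_switch_on(8.0, 2.1))   # False: cells report low density, switch stays off
```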

To validate their postulate, the research team demonstrates the use of two genetically modified bacterial (Escherichia coli) strains—TA1415 and TA2445—for the optimized production of isopropanol—a versatile cleaning agent, industrial solvent, and chemical intermediate. While TA2445 harbors an in-cell feedback controller, TA1415 does not. More specifically, the in-cell feedback controller in TA2445 comprises a metabolic toggle switch (MTS)—a genetic circuit that exerts control on isopropanol production by responding to a chemical reagent called isopropyl β-d-1-thiogalactopyranoside (IPTG)—and a specialized genetic circuit that allows it to relay real-time feedback on cell density. Although TA1415 has an “MTS,” it is unable to provide any feedback on cell density due to the complete absence of the specialized genetic circuitry.

The team found that including the TA2445 strain in the bioproduction process resulted in optimal regulation of the MTS circuit under the multiple PMM conditions that were intentionally introduced. This enhanced regulation, based on the detection of cell density, prevented the drop in isopropanol yield caused by PMM, resulting in a higher yield. In other words, the proposed hybrid control system can efficiently compensate for PMM and robustly maintain the efficiency of microbial material production.

It is a common practice in the field of control to combine a high-level controller that uses a mathematical model with a low-level controller that does not use one. In this study, the authors adopt a novel approach where engineered bacterial cells themselves relay feedback as low-level controllers.

This landmark study has several key implications. The demonstrated method, for instance, could be adapted for the cost-effective and eco-friendly production of chemicals and fuels. Such timely intervention, in tandem with other global initiatives and efforts, could help reverse or mitigate the deadly effects of global warming and climate change in the long term.

“The results show that the in-cell feedback controller in TA2445 effectively compensates for PMM by modifying MTS activation timing. The HISICC system presents a promising solution to the PMM problem in bioprocess engineering, paving the way for more efficient and reliable microbial bioprocess optimization,” explain doctoral student Tomoki Ohkubo and Senior Assistant Professor Kunida.

Hats off to the Japanese researchers for their exemplary contribution to bioprocess optimization, which not only advances the field but also offers promising prospects for carbon footprint reduction!

 

***

 

Reference

DOI: https://doi.org/10.1038/s41598-023-40469-y

 

About Fujita Health University

Fujita Health University is a private university situated in Toyoake, Aichi, Japan. It was founded in 1964 and houses one of the largest teaching university hospitals in Japan in terms of the number of beds. With over 900 faculty members, the university is committed to providing various academic opportunities to students internationally. Fujita Health University has been ranked eighth among all universities and second among all private universities in Japan in the 2020 Times Higher Education (THE) World University Rankings. THE University Impact Rankings 2019 visualized university initiatives for sustainable development goals (SDGs). For the “good health and well-being” SDG, Fujita Health University was ranked second among all universities and number one among private universities in Japan. The university became the first Japanese university to host the "THE Asia Universities Summit" in June 2021. The university’s founding philosophy is “Our creativity for the people (DOKUSOU-ICHIRI),” which reflects the belief that, as with the university’s alumni and alumnae, current students also unlock their future by leveraging their creativity.

Website: https://www.fujita-hu.ac.jp/en/index.html

 

About Dr. Katsuyuki Kunida from Fujita Health University

Dr. Katsuyuki Kunida works as a Lecturer in the Department of Computational Biology at Fujita Health University’s School of Medicine. He obtained his Ph.D. in Medicine from Kyoto University. Dr. Kunida’s research primarily focuses on system-theoretical understanding, prediction and control of cell fate decision making, cell motility, and metabolism related to regenerative medicine and prediction of pathological condition. Dr. Kunida hopes to make significant contributions to preventive medicine by conducting research on precise prediction and control of cell systems. He has won several research awards and he also has more than 20 publications to his credit.  

 

About Mr. Tomoki Ohkubo from Nara Institute of Science and Technology

Mr. Tomoki Ohkubo is a PhD candidate in Computational Biology at Nara Institute of Science and Technology (NAIST). He received his B.S. in Precision Engineering and M.S. in Bioengineering from the University of Tokyo. He is also a biomedical engineer at Shimadzu Corporation, where he develops cell culture systems using microfluidics. 

 

The effects of preheating on vehicle fuel consumption and emissions appear minimal


Peer-Reviewed Publication

UNIVERSITY OF EASTERN FINLAND




Published in Applied Energy, a new study by the University of Eastern Finland and Tampere University found that the benefits of car preheating for both fuel economy and emissions are minimal. The researchers focused on vehicle fuel consumption and emissions under cold winter conditions. Of particular interest were cold start emissions and their relation to preheating.

The results show that cold starts are challenging, especially for diesel-powered vehicles under cold winter conditions. During the measurement campaign conducted in Finland, outdoor temperatures dropped as low as -28 degrees Celsius, and this was reflected in vehicle emissions. The route driven in the measurement campaign aimed to replicate typical commuting scenarios, including both urban and highway driving, as well as stops at intersections and traffic lights.

Vehicles were driven on the same route under three conditions: after a cold start, preheated, and with the engine already warmed up by driving. After a cold start, the engine coolant needed nearly the entire drive (13.8 km, about 19 minutes) to reach its optimal operating temperature (>60°C). The studied vehicles were equipped with either electric or fuel-powered preheaters.

“Efficient preheaters (at least 1 kW) helped in warming the engine coolant before starting, but they didn’t significantly speed up reaching the optimal operating temperature. Higher starting temperatures primarily improved vehicle comfort by providing a warmer cabin and preventing window frost. Additionally, according to car manufacturers, preheating can reduce engine wear during cold starts,” Postdoctoral Researcher Ville Leinonen of the University of Eastern Finland says.

The study found slightly lower overall fuel consumption (10–20%) when the vehicle was driven after being warmed up compared to after a cold start. Only two out of the six vehicles studied, both equipped with fuel-powered auxiliary heaters, showed small fuel savings due to preheating. Even in these vehicles, preheating before starting only helped reduce fuel consumption by less than 4% compared to cold starts.

“The calculated fuel savings did not account for the fuel or electricity consumption of the auxiliary heaters during preheating. When considering the fuel consumption during preheating, the post-preheating drive resulted in 26–37% higher overall fuel consumption than after a cold start. Preheating also had an impact on overall emissions,” Research Director Santtu Mikkonen of the University of Eastern Finland points out.
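A back-of-the-envelope calculation makes the point concrete. The figures below are hypothetical illustrations chosen only to fall within the reported ranges; they are not measurements from the study.

```python
# Hypothetical worked example (not the study's data): how accounting for the
# fuel burned by a fuel-operated auxiliary heater during preheating can turn
# a small per-drive saving into higher overall consumption.

cold_start_drive_l = 1.30   # litres used on the 13.8 km route after a cold start (hypothetical)
preheated_drive_l  = 1.25   # litres used on the same route after preheating (< 4% saving)
preheater_fuel_l   = 0.45   # litres burned by the auxiliary heater while preheating (hypothetical)

saving_drive_only = (cold_start_drive_l - preheated_drive_l) / cold_start_drive_l
total_preheated   = preheated_drive_l + preheater_fuel_l
excess_vs_cold    = (total_preheated - cold_start_drive_l) / cold_start_drive_l

print(f"Saving during the drive alone: {saving_drive_only:.1%}")        # ~3.8%
print(f"Total incl. preheating vs. cold start: +{excess_vs_cold:.1%}")  # ~30.8% higher
```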

These findings reinforce the notion that the use of fuel-powered auxiliary heaters cannot be justified by better fuel economy or reduced emissions in actual cold temperature driving. Nevertheless, when considering the entire lifespan of a vehicle, the benefits of preheating may become apparent through extended engine oil life and longer engine durability, although these factors were not examined in this study.

No evidence of a positive effect of preheating on particulate emissions

Assistant Professor Panu Karjalainen of Tampere University points out that, as a natural consequence of the observations on total fuel consumption, preheating did not significantly affect particulate emissions either. The number concentration of particles exceeded regulatory limits for new vehicles by up to a hundredfold. This may be partly explained by the fact that regulations only take into account solid particles larger than 23 nanometres in size and apply to emissions measured under warm conditions. The measurements conducted under cold winter conditions showed high concentrations of smaller particles, some of which could be liquid.

Even though diesel particulate filters are supposed to capture nearly all particles from emissions, significant particulate emissions were observed during driving in diesel vehicles equipped with fuel-powered auxiliary heaters. This can be attributed to the emissions produced by these heaters during operation, as they automatically provide additional heat to the engine or cabin while driving. The effect of auxiliary heaters on in-use emissions is more pronounced because there is no emissions aftertreatment for heaters as there is for engine emissions.

In contrast to particle number emissions, there were differences in emissions of particle mass, black carbon and nitrogen oxides in different driving situations. Particle mass and black carbon emissions were lower when driving with a warm engine compared to cold starts, especially in gasoline-powered vehicles. The largest reductions in particle mass were observed to be 85% over the entire route and 99% during the initial idle and early route sections.

Nitrogen oxide emissions were up to 90% lower when driving with a warm engine compared to after a cold start, depending on the vehicle. However, the benefits of preheating for reducing particle mass emissions were only observed in one gasoline vehicle, where these emissions decreased by 72%, and in one diesel vehicle, where the reduction was 24%. For black carbon emissions, preheating showed only minimal benefits. Regarding nitrogen oxide emissions, significant benefits from preheating were observed in only one gasoline vehicle, where the reduction was 41%. It is noteworthy that, when considering the emissions from auxiliary heaters as well, the reduction in nitrogen oxide emissions is only 15%. Importantly, electric preheaters were not found to provide significant benefits in terms of fuel consumption or emissions reduction.

For diesel vehicles, the role of auxiliary heaters in total nitrogen oxide emissions was not as significant as in gasoline vehicles, as diesel vehicles had significantly higher in-use nitrogen oxide emissions. Although all studied vehicles exceeded nitrogen oxide emission limits, the largest exceedances were observed in diesel vehicles, even up to 21 times the limit. It’s worth noting that nitrogen oxide emissions from one diesel vehicle exceeded the regulatory limit by a factor of 12, despite being equipped with a selective catalytic reduction (SCR) system. This suggests that the SCR system did not function properly in cold temperatures.

Not all benefits and downsides of preheating are known

In summary, using preheating solely for the purpose of improving fuel economy and reducing emissions does not find support in actual cold-temperature driving conditions. However, in different driving scenarios and considering the entire lifespan of vehicles, emissions reduction might lead to somewhat different conclusions. The studied emissions, including particulate and nitrogen oxide emissions, indicated some reductions for vehicles with fuel-powered auxiliary heaters. Nevertheless, these benefits diminish significantly or even become negated when accounting for the fuel consumption and emissions of the auxiliary heater during the preheating cycle.

Furthermore, it’s important to note that preheating may have benefits regarding emissions that were not measured in this study, such as emissions of specific hydrocarbons. In addition, emissions from fuel-operated auxiliary heaters may be higher due to limited aftertreatment, potentially offsetting any advantages in terms of these emissions as well.

The study was funded by the Jane and Aatos Erkko Foundation and is part of research conducted within the Academy of Finland’s ACCC Flagship.

 

Toward a global scientific consensus: identifying vulnerable marine ecosystems through imagery


Peer-Reviewed Publication

PEERJ

Dr. Amy R. Baco, lead author. Credit: Dr. Amy R. Baco




The scientific community is taking a significant step towards establishing a consensus on the designation of Vulnerable Marine Ecosystems (VMEs) from imagery data, as highlighted in the new article titled "Towards a scientific community consensus on designating Vulnerable Marine Ecosystems from imagery," authored by Dr. Amy R. Baco and colleagues, and published in PeerJ Life & Environment. 

 

“Many scientists around the world were working independently on a similar question: Given the UN Food and Agriculture Organization (FAO) regulations for deep-sea Vulnerable Marine Ecosystems on the high seas, how can VMEs be identified from seafloor images?  We recognized after many of our presentations at a conference that we were all working towards the same goal, and that working together would allow us to produce guidelines that could be used more consistently across regions.  This publication represents the discussions of that large group of scientists and managers, and the first-level answers to how we would determine if an image showed a VME,” said  Dr. Amy R. Baco.


 

Management of deep-sea fisheries in international waters demands the identification of areas containing VMEs, a critical task undertaken by Regional Fisheries Management Organizations/Arrangements (RFMO/As). Currently, fisheries data, including trawl and longline bycatch data, serve as the primary source for informing VME identification. However, the collection of such data can have negative ecological consequences, underscoring the urgent need for non-invasive alternatives for VME assessment and monitoring. 

 

Imagery data from scientific surveys offer a promising solution, but until now, there has been no established framework for identifying VMEs from images. Dr. Baco, a Professor in the Department of Earth, Ocean and Atmospheric Sciences at Florida State University, spearheaded an international collaboration with the aim of determining existing VME assessment protocols and establishing preliminary global consensus guidelines for VME identification from images.

 

The study's initial assessment revealed a lack of consistency among RFMO/A regions in defining VME indicator taxa, resulting in variability in how VMEs are defined. In certain cases, experts concurred that a single image could identify a VME, particularly in areas with scleractinian reefs, dense octocoral gardens, multiple VME species' co-occurrence, and chemosynthetic ecosystems. The study introduces a practical decision flow chart for interpreting the FAO criteria for single images.
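As an illustration of what image-by-image screening logic of this kind might look like, here is a minimal sketch based only on the categories named in this release. It is not the decision flow chart published by Baco and colleagues; the habitat labels, taxa, and threshold handling are placeholders.

```python
# Illustrative screening logic for a single seafloor image, based only on the
# categories mentioned in this press release. This is NOT the decision flow
# chart from the PeerJ paper; taxa, thresholds, and structure are placeholders.

# Habitat types for which, per the release, experts agreed a single image can suffice
SINGLE_IMAGE_VME_HABITATS = {
    "scleractinian_reef",
    "dense_octocoral_garden",
    "chemosynthetic_ecosystem",
}

def screen_image(habitat, vme_indicator_taxa_present, density_per_m2, density_threshold):
    """Return a provisional call for one image.

    habitat: habitat label assigned by an annotator (placeholder categories)
    vme_indicator_taxa_present: set of VME indicator taxa seen in the image
    density_per_m2: measured density of the dominant indicator taxon
    density_threshold: region- and taxon-specific threshold (still under discussion)
    """
    if habitat in SINGLE_IMAGE_VME_HABITATS:
        return "VME"
    if len(vme_indicator_taxa_present) >= 2:     # co-occurrence of multiple VME taxa
        return "VME"
    if vme_indicator_taxa_present and density_per_m2 >= density_threshold:
        return "possible VME (density above threshold)"
    return "not a VME on this image alone"

print(screen_image("soft_sediment", {"octocoral_sp"}, 3.2, density_threshold=1.0))
```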

 

To further evaluate the density-related aspects of the flow chart, data were compiled to assess regional perceptions of density thresholds. The findings indicate that consistent thresholds could be developed across regions on the basis of expert consensus, because there was a statistically significant difference between the mean densities experts considered a VME and those they did not. However, densities vary significantly between species, so the same thresholds will not work across all taxa. Work is ongoing to develop an areal extent index, incorporate a measure of confidence, and deepen understanding of the density and diversity levels that correlate with key ecosystem functions for VME indicator taxa.

 

Based on these results, the study proposes several important recommendations.

1. Establish a global consensus on which taxa should serve as VME indicators.

2. Encourage RFMO/As to adopt guidelines that incorporate imagery surveys as an alternative or complement to bycatch and trawl surveys for VME designation. 

3. Include imagery surveys in Impact Assessments both for fisheries and for other industries that impact the seafloor.

 

The research not only advances our understanding of VME identification but also underscores the importance of sustainable deep-sea management practices. The development of standardized guidelines for VME assessment from imagery will contribute significantly to the preservation of vulnerable marine ecosystems in international waters.

 

 

Surprising discovery shows electron beam radiation can repair nanostructures


Self-healing crystals could improve materials used in today’s electronics


Peer-Reviewed Publication

UNIVERSITY OF MINNESOTA

These electron microscope images show how the crack in a crystal of titanium dioxide begins to "heal" with increasing electron doses. Credit: Mkhoyan Group, University of Minnesota




MINNEAPOLIS / ST. PAUL (10/12/2023)—In a surprising new study, researchers at the University of Minnesota Twin Cities have found that the electron beam radiation that they previously thought degraded crystals can actually repair cracks in these nanostructures. 

The groundbreaking discovery provides a new pathway to create more perfect crystal nanostructures, a process that is critical to improving the efficiency and cost-effectiveness of materials that are used in virtually all electronic devices we use every day.

“For a long time, researchers studying nanostructures were thinking that when we put the crystals under electron beam radiation to study them that they would degrade,” said Andre Mkhoyan, a University of Minnesota chemical engineering and materials science professor and lead researcher in the study. “What we showed in this study is that when we took a crystal of titanium dioxide and irradiated it with an electron beam, the naturally occurring narrow cracks actually filled in and healed themselves.”

The researchers accidentally stumbled upon the discovery when using the University of Minnesota’s state-of-the-art electron microscope to study the crystals for a completely different reason.

“I was studying the cracks in the crystals under the electron microscope and these cracks kept filling in,” said Silu Guo, a University of Minnesota chemical engineering and materials science Ph.D. student. “This was unexpected, and our team realized that maybe there was something even bigger that we should be studying.”

In the self-healing process, several atoms of the crystal moved in tandem, met in the middle, and formed a bridge that filled the crack. For the first time, the researchers showed that electron beams could be used constructively to engineer novel nanostructures atom-by-atom.

"Whether it's atomically sharp cracks or other types of defects in a crystal, I believe it's inherent in the materials we've grown, but it's truly astonishing to see how Professor Mkhoyan's group can mend these cracks using an electron beam," said University of Minnesota Chemical Engineering and Materials Science Professor Bharat Jalan, a collaborator on the research.

The researchers say the next step is to introduce new factors, such as changing the electron beam conditions or the temperature of the crystal, to find ways to improve or speed up the process.

“First we discovered, now we want to find more ways to engineer the process,” Mkhoyan said. 

In addition to Mkhoyan, Guo, and Jalan, the research team included University of Minnesota Chemical Engineering and Materials Science Ph.D. student Sreejith Nair, and former graduate student Hwanhui Yun.

This work was supported primarily by the National Science Foundation (NSF). Parts of this work were carried out at the UMN Characterization Facility. Computational resources were provided by the Minnesota Supercomputing Institute (MSI) and film growth was supported by the Department of Energy (DOE).

To read the entire research paper entitled “Mending cracks atom-by-atom in rutile TiO2 with electron beam radiolysis,” visit the Nature Communications website.

 

Self-correcting quantum computers within reach?


Harvard team’s method of reducing errors tackles major barrier to scaling up technology


Peer-Reviewed Publication

HARVARD UNIVERSITY





Quantum computers promise to reach speeds and efficiencies impossible for even the fastest supercomputers of today. Yet the technology hasn’t seen much scale-up and commercialization largely due to its inability to self-correct. Quantum computers, unlike classical ones, cannot correct errors by copying encoded data over and over. Scientists had to find another way.

Now, a new paper in Nature illustrates a Harvard quantum computing platform’s potential to solve the longstanding problem known as quantum error correction.

Leading the Harvard team is quantum optics expert Mikhail Lukin, the Joshua and Beth Friedman University Professor in physics and co-director of the Harvard Quantum Initiative. The work reported in Nature was a collaboration among Harvard, MIT, and Boston-based QuEra Computing. Also involved was the group of Markus Greiner, the George Vasmer Leverett Professor of Physics.

An effort spanning the last several years, the Harvard platform is built on an array of very cold, laser-trapped rubidium atoms. Each atom acts as a bit — or a “qubit” as it’s called in the quantum world — which can perform extremely fast calculations.

The team’s chief innovation is configuring their “neutral atom array” to be able to dynamically change its layout by moving and connecting atoms — this is called “entangling” in physics parlance — mid-computation. Operations that entangle pairs of atoms, called two-qubit logic gates, are units of computing power.

Running a complicated algorithm on a quantum computer requires many gates. However, these gate operations are notoriously error-prone, and a buildup of errors renders the algorithm useless.

In the new paper, the team reports near-flawless performance of its two-qubit entangling gates with extremely low error rates. For the first time, they demonstrated the ability to entangle atoms with error rates below 0.5 percent. In terms of operation quality, this puts their technology’s performance on par with other leading types of quantum computing platforms, like superconducting qubits and trapped-ion qubits.

However, Harvard’s approach has major advantages over these competitors due to its large system sizes, efficient qubit control, and ability to dynamically reconfigure the layout of atoms.

“We’ve established that this platform has low enough physical errors that you can actually envision large-scale, error-corrected devices based on neutral atoms,” said first author Simon Evered, a Harvard Griffin Graduate School of Arts and Sciences student in Lukin’s group. “Our error rates are low enough now that if we were to group atoms together into logical qubits — where information is stored non-locally among the constituent atoms — these quantum error-corrected logical qubits could have even lower errors than the individual atoms.”
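One way to see why sub-threshold error rates matter is a common rule-of-thumb scaling for surface-code logical qubits, in which the logical error rate falls roughly as p_L ≈ A × (p / p_th)^((d+1)/2) once the physical error rate p is below the threshold p_th. The choice of code, the threshold, and the prefactor below are illustrative assumptions, not figures from the Nature paper.

```python
# Illustrative scaling only: a common rule of thumb for surface-code logical
# error rates, p_L ~ A * (p / p_th) ** ((d + 1) / 2). The threshold p_th, the
# prefactor A, and the choice of code are assumptions for illustration and
# are not taken from the Nature paper.

def logical_error_rate(p_physical, distance, p_threshold=0.01, prefactor=0.1):
    """Rule-of-thumb logical error rate for a distance-`distance` surface code."""
    return prefactor * (p_physical / p_threshold) ** ((distance + 1) / 2)

p = 0.005  # physical two-qubit error rate of 0.5%, as reported in the release
for d in (3, 5, 7, 9, 11):
    print(f"d = {d:2d}: logical error ~ {logical_error_rate(p, d):.2e}")
# Because p is below the assumed threshold, each increase in code distance
# suppresses the logical error further, eventually well below the physical rate.
```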

The Harvard team’s advances are reported in the same issue of Nature as other innovations led by former Harvard graduate student Jeff Thompson, now at Princeton University, and former Harvard postdoctoral fellow Manuel Endres, now at California Institute of Technology. Taken together, these advances lay the groundwork for quantum error-corrected algorithms and large-scale quantum computing. All of this means quantum computing on neutral atom arrays is showing the full breadth of its promise.

“These contributions open the door for very special opportunities in scalable quantum computing and a truly exciting time for this entire field ahead,” Lukin said.

The research was supported by the U.S. Department of Energy’s Quantum Systems Accelerator Center; the Center for Ultracold Atoms; the National Science Foundation; the Army Research Office Multidisciplinary University Research Initiative; and the DARPA Optimization with Noisy Intermediate-Scale Quantum Devices program.

 

The advantage of digital-native brands setting up physical brand stores—and the challenge of preventing sales losses in existing channels


News from the Journal of Marketing

Peer-Reviewed Publication

AMERICAN MARKETING ASSOCIATION




Researchers from Erasmus School of Economics at Erasmus University Rotterdam, KU Leuven, Universität zu Lübeck, Christian-Albrechts-Universität zu Kiel, and FoodLabs published a new Journal of Marketing article that investigates the multichannel impact of brand stores by digital-native FMCG brands.

The study, forthcoming in the Journal of Marketing, is titled “Assessing the Multichannel Impact of Brand Store Entry by a Digital-Native Grocery Brand” and is authored by Michiel Van Crombrugge, Els Breugelmans, Florian Breiner, and Christian W. Scheiner.

Multichannel retailing has become crucial to the sales strategy of any brand, including digital-native brands that started retailing as online-only. Digital-native brands like Quip in the U.S. and Myprotein in Europe have partnered with independent retailers to offer consumers an in-person retail option. But some brands—especially those in the fast-moving consumer goods (FMCG) category—have opened their own brand stores to create a bigger physical footprint.

Brand stores are brick-and-mortar stores owned and operated by the manufacturer. They carry only the brand’s products and are designed to sell them profitably in a brand-centric environment. Van Crombrugge explains that “these stores offer physical exposure, which digital-native brands might struggle to attain on supermarket shelves given the steep competition from mass-market brands.” Brand stores increase brand awareness, which in turn can increase sales in the company-owned online channel and independent supermarkets. “Brand stores can also spark distributor interest and prompt supermarkets to distribute more of the brand on their shelves. Since the number of brand stores that a digital-native FMCG brand can open is limited, increasing breadth and depth of supermarket distribution can further drive brand sales,” adds Breugelmans.

Yet brand stores also entail risks. Sales in this channel may cannibalize sales in the incumbent channels if consumers migrate to the newly opened brand store. If brand stores signal the manufacturer’s encroachment, supermarkets might reduce their distribution of the brand. Finally, opening and operating brand stores is expensive and these substantial operational costs put pressure on profits.

The Supermarket Effect

This research uncovers a substantially different impact of brand store entry on own-online channel sales than on sales in independent supermarkets. In areas near brand stores, the brand’s online channel sales decreased, yet its supermarket sales increased. For customers seeking a more elevated consumption experience, brand stores offer an attractive alternative, which cannibalizes the brand’s own online channel. Supermarket shoppers, on the other hand, are mainly concerned with price and convenience; for them, brand stores offer an opportunity to discover a digital-native brand that would otherwise have remained anonymous among bigger mass-market brands, which in turn increases supermarket sales.

The research team also discovers that brand stores spark distributor interest and prompt supermarkets to start distributing the brand on their shelves. Indeed, part of the supermarket sales increase that brand stores bring about is driven by brand stores’ positive effect on the number of supermarkets that carry the brand. This increase in distribution breadth is an important component to drive sales since brands cannot open brand stores everywhere.

“We find that brand stores generate an influx of own brand store sales that more than make up for any online losses. This is not necessarily surprising because their strong local visibility, typically in locations with high foot traffic, and their appeal to customers who lack opportunities or motivations to visit the online channel or supermarket make brand stores an attractive sales channel on their own,” Scheiner says. Despite the cannibalizing impact on their own online channel, brand stores are an effective means to increase a brand’s top-line sales. Digital natives in startup or growth markets that aim to draw investors’ attention can try to improve their valuation through brand stores and the corresponding sales growth.

However, opening and running brand stores is a capital-intensive operation due to factors such as store rental costs and sales staff wages. Breiner warns that “our analyses show that nearly half of the brand stores under study were not able to turn a profit. Brands therefore need to carefully weigh brand stores’ top-line gains against their high operational expenses to justify the investment financially.”

These findings offer important insights and caveats to digital-native brands that consider opening brand stores to increase their physical footprint beyond supermarkets. The upside is that brand stores can help digital natives reach potential consumers and gain additional physical exposure that FMCG brands especially require. Yet brand stores are not without risks: they may hurt the brand’s sales in the online channel where the digital native started and further impact brand profitability if the influx of new sales is not great enough to cover those online losses and the brand stores’ own substantial operating costs.

Full article and author contact information available at: https://doi.org/10.1177/00222429231193371

About the Journal of Marketing 

The Journal of Marketing develops and disseminates knowledge about real-world marketing questions useful to scholars, educators, managers, policy makers, consumers, and other societal stakeholders around the world. Published by the American Marketing Association since its founding in 1936, JM has played a significant role in shaping the content and boundaries of the marketing discipline. Shrihari (Hari) Sridhar (Joe Foster ’56 Chair in Business Leadership, Professor of Marketing at Mays Business School, Texas A&M University) serves as the current Editor in Chief.
https://www.ama.org/jm

About the American Marketing Association (AMA) 

As the largest chapter-based marketing association in the world, the AMA is trusted by marketing and sales professionals to help them discover what is coming next in the industry. The AMA has a community of local chapters in more than 70 cities and 350 college campuses throughout North America. The AMA is home to award-winning content, PCM® professional certification, premiere academic journals, and industry-leading training events and conferences.
https://www.ama.org