
Sunday, July 25, 2021

RIP
UT Austin Mourns Death of World-Renowned Physicist Steven Weinberg


Physicist Steven Weinberg, January 28, 2008. Credit: Larry Murphy, The University of Texas at Austin

Jul 24, 2021

AUSTIN, Texas — Nobel laureate Steven Weinberg, a professor of physics and astronomy at The University of Texas at Austin, has died. He was 88.

One of the most celebrated scientists of his generation, Weinberg was best known for helping to develop a critical part of the Standard Model of particle physics, which significantly advanced humanity’s understanding of how everything in the universe — its various particles and the forces that govern them — relates. A faculty member for nearly four decades at UT Austin, he was a beloved teacher and researcher, revered not only by the scientists who marveled at his concise and elegant theories but also by science enthusiasts everywhere who read his books and sought him out at public appearances and lectures.

“The passing of Steven Weinberg is a loss for The University of Texas and for society. Professor Weinberg unlocked the mysteries of the universe for millions of people, enriching humanity’s concept of nature and our relationship to the world,” said Jay Hartzell, president of The University of Texas at Austin. “From his students to science enthusiasts, from astrophysicists to public decision makers, he made an enormous difference in our understanding. In short, he changed the world.”

“As a world-renowned researcher and faculty member, Steven Weinberg has captivated and inspired our UT Austin community for nearly four decades,” said Sharon L. Wood, provost of the university. “His extraordinary discoveries and contributions in cosmology and elementary particles have not only strengthened UT’s position as a global leader in physics, they have changed the world.”

Weinberg held the Jack S. Josey – Welch Foundation Chair in Science at UT Austin and was the winner of multiple scientific awards including the 1979 Nobel Prize in physics, which he shared with Abdus Salam and Sheldon Lee Glashow; a National Medal of Science in 1991; the Lewis Thomas Prize for the Scientist as Poet in 1999; and, just last year, the Breakthrough Prize in Fundamental Physics. He was a member of the National Academy of Sciences, Britain’s Royal Society, the American Academy of Arts and Sciences and the American Philosophical Society, which presented him with the Benjamin Franklin Medal in 2004.

Queen Beatrix of the Netherlands receives Nobel laureates: Paul Berg, Christian de Duve, Steven Weinberg, Queen Beatrix, Manfred Eigen, Nicolaas Bloembergen. Photo taken on 31 August 1983. Credit: Rob C. Croes / Anefo. Creative Commons Netherlands license.

In 1967, Weinberg published a seminal paper laying out how two of the universe’s four fundamental forces — electromagnetism and the weak nuclear force — relate as part of a unified electroweak force. “A Model of Leptons,” at barely three pages, predicted properties of elementary particles that had never been observed at the time (the W, Z and Higgs bosons) and theorized that “neutral weak currents” dictated how elementary particles interact with one another. Later experiments, including the 2012 discovery of the Higgs boson at the Large Hadron Collider (LHC) in Switzerland, would bear out each of his predictions.
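The structure of that unification can be summarized in a few standard textbook relations (not spelled out in the press release): once the electroweak symmetry is broken, the masses of the W and Z bosons are locked together by a single parameter, the weak mixing angle, often called the Weinberg angle.

```latex
% Standard electroweak relations (textbook form; g and g' are the SU(2) and
% U(1) gauge couplings, v the vacuum expectation value of the Higgs field):
\tan\theta_W = \frac{g'}{g}, \qquad
m_W = \tfrac{1}{2}\, g\, v, \qquad
m_Z = \frac{m_W}{\cos\theta_W}
```

With the measured value sin²θ_W ≈ 0.23, the ratio m_W/m_Z ≈ 0.88 follows, in good agreement with the masses of about 80 and 91 GeV measured at CERN when the W and Z were discovered in 1983.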

Weinberg leveraged his renown and his science for causes he cared deeply about. He had a lifelong interest in curbing nuclear proliferation and served briefly as a consultant for the U.S. Arms Control and Disarmament Agency. He advocated for a planned superconducting supercollider in the United States with capabilities that would have surpassed the LHC’s — a project that ultimately failed to receive funding in the 1990s after having been planned for a site near Waxahachie, Texas. He remained an ambassador for science throughout his life, teaching UT Austin students and participating in events such as the 2021 Nobel Prize Inspiration Initiative in April and the Texas Science Festival in February.

“When we talk about science as part of the culture of our times, we’d better make it part of that culture by explaining what we’re doing,” Weinberg explained in a 2015 interview published by Third Way. “I think it’s very important not to write down to the public. You have to keep in mind that you’re writing for people who are not mathematically trained but are just as smart as you are.”

By showing the unifying links between the weak force and electromagnetism, which were previously believed to be completely different, Weinberg delivered the first pillar of the Standard Model, the half-century-old theory that explains particles and three of the four fundamental forces in the universe (the fourth being gravity). As critical as the model is in helping physical scientists understand the order driving everything from the first minutes after the Big Bang to the world around us, Weinberg continued to pursue, alongside other scientists, dreams of a “final theory” that would concisely and effectively explain current unknowns about the forces and particles in the universe, including gravity.

Weinberg wrote hundreds of scientific articles about general relativity, quantum field theory, cosmology and quantum mechanics, as well as numerous popular articles, reviews and books. His books include “To Explain the World,” “Dreams of a Final Theory,” “Facing Up,” and “The First Three Minutes.” Weinberg often was asked in media interviews to reflect on his atheism and how it related to the scientific insights he described in his books.

“If there is no point in the universe that we discover by the methods of science, there is a point that we can give the universe by the way we live, by loving each other, by discovering things about nature, by creating works of art,” he once told PBS. “Although we are not the stars in a cosmic drama, if the only drama we’re starring in is one that we are making up as we go along, it is not entirely ignoble that faced with this unloving, impersonal universe we make a little island of warmth and love and science and art for ourselves.”

Weinberg was a native of New York, and his childhood love of science began with a gift of a chemistry set and continued through teaching himself calculus while a student at the Bronx High School of Science. The first in his family to attend college, he received a bachelor’s degree from Cornell University and a doctorate from Princeton University. He held research positions at Columbia University and the University of California, Berkeley, before serving on the faculties of Harvard University, the Massachusetts Institute of Technology and, from 1982, UT Austin.

He is survived by his wife, UT Austin law professor Louise Weinberg, and their daughter, Elizabeth.


With Steven Weinberg’s death, physics loses a titan

He advanced the theory of particles and forces, and wrote insightfully for a wider public



By Tom Siegfried
Contributing Correspondent


Steven Weinberg in his office at the University of Texas at Austin in 2018.

Mythology has its titans. So do the movies. And so does physics. Just one fewer now.

Steven Weinberg died July 23, at the age of 88. He was one of the key intellectual leaders in physics during the second half of the 20th century, and he remained a leading voice and active contributor and teacher through the first two decades of the 21st.

On lists of the greats of his era he was always mentioned along with Richard Feynman, Murray Gell-Mann and … well, just Feynman and Gell-Mann.

Among his peers, Weinberg was one of the most respected figures in all of physics, or perhaps all of science. He exuded intelligence and dignity. As news of his death spread through Twitter, other physicists expressed their sorrow at the loss: “One of the most accomplished scientists of our age,” one commented, “a particularly eloquent spokesman for the scientific worldview.” And another: “One of the best physicists we had, one of the best thinkers of any variety.”



Weinberg’s Nobel Prize, awarded in 1979, was for his role in developing a theory unifying electromagnetism and the weak nuclear force. That was an essential contribution to what became known as the standard model of physics, a masterpiece of explanation for phenomena rooted in the math describing subatomic particles and forces. It’s so successful at explaining experimental results that physicists have long pursued every opportunity to find the slightest deviation, in hopes of identifying “new” physics that further deepens human understanding of nature.

Weinberg did important technical work in other realms of physics as well, and wrote several authoritative textbooks on such topics as general relativity and cosmology and quantum field theory. He was an early advocate of superstring theory as a promising path in the continuing quest to complete the standard model by unifying it with general relativity, Einstein’s theory of gravity.

Early on, Weinberg also realized a desire to communicate more broadly. His popular book The First Three Minutes, published in 1977, introduced a generation of physicists and physics fans to the Big Bang, the explosive birth of the universe, and the fundamental science underlying that metaphor. Later he wrote deeply insightful examinations of the nature of science and its intersection with society. And he was a longtime contributor of thoughtful essays in such venues as the New York Review of Books.

In his 1992 book Dreams of a Final Theory, Weinberg expressed his belief that physics was on the verge of finding the true fundamental explanation of reality, the “final theory” that would unify all of physics. Progress toward that goal seemed to be impeded by the apparent incompatibility of general relativity with quantum mechanics, the math underlying the standard model. But in a 1997 interview, Weinberg averred that the difficulty of combining relativity and quantum physics in a mathematically consistent way was an important clue. “When you put the two together, you find that there really isn’t that much free play in the laws of nature,” he said. “That’s been an enormous help to us because it’s a guide to what kind of theories might possibly work.”

Attempting to bridge the relativity-quantum gap, he believed, “pushed us a tremendous step forward toward being able to develop realistic theories of nature on the basis of just mathematical calculations and pure thought.”

Experiment had to come into play, of course, to verify the validity of the mathematical insights. But the standard model worked so well that finding deviations implied by new physics required more powerful experimental technology than physicists possessed. “We have to get to a whole new level of experimental competence before we can do experiments that reveal the truth beneath the standard model, and this is taking a long, long time,” he said. “I really think that physics in the style in which it’s being done … is going to eventually reach a final theory, but probably not while I’m around and very likely not while you’re around.”

He was right that he would not be around to see the final theory. And perhaps, as he sometimes acknowledged, nobody ever will. Perhaps it’s not experimental power that is lacking, but rather intellectual power. “Humans may not be smart enough to understand the really fundamental laws of physics,” he wrote in his 2015 book To Explain the World, a history of science up to the time of Newton.

Weinberg studied the history of science thoroughly, wrote books and taught courses on it. To Explain the World was explicitly aimed at assessing ancient and medieval science in light of modern knowledge. For that he incurred the criticism of historians and others who claimed he did not understand the purpose of history, which is to understand the human endeavors of an era on its own terms, not with anachronistic hindsight.

But Weinberg understood the viewpoint of the historians perfectly well. He just didn’t like it. For Weinberg, the story of science that was meaningful to people today was how the early stumblings toward understanding nature evolved into a surefire system for finding correct explanations. And that took many centuries. Without the perspective of where we are now, he believed, and an appreciation of the lessons we have learned, the story of how we got here “has no point.”

Future science historians will perhaps insist on assessing Weinberg’s own work in light of the standards of his times. But even if viewed in light of future knowledge, there’s no doubt that Weinberg’s achievements will remain in the realm of the Herculean. Or the titanic.



Tom Siegfried is a contributing correspondent. He was editor in chief of Science News from 2007 to 2012.











Monday, August 22, 2022

A New Cold War Could Slow the Advance of Science
CERN, the European Organization for Nuclear Research laboratory for particle physics.
Credit: Leslye Davis for The New York Times

OPINION
GUEST ESSAY
By Michael Riordan
Aug. 22, 2022
Dr. Riordan is a physicist who writes about science, technology and public policy. He is the author of “The Hunting of the Quark” and a co-author of “Tunnel Visions: The Rise and Fall of the Superconducting Super Collider.”

ORCAS ISLAND, Wash. — One of the many unfortunate consequences of Russia’s invasion of Ukraine is the collateral damage to international scientific cooperation. The past two decades may have represented the apex of this cooperation. Now it appears to be coming to at least a pause, if not an end.

In the years immediately after the Cold War ended in 1991, Russian scientists turned increasingly to Europe and the United States to remain involved in frontier research. Through the efforts of Presidents George H.W. Bush and Bill Clinton, Space Station Freedom became the International Space Station, which included major contributions from Canada, Japan, European nations and Russia as partners.

Between 1993 and 1996, the Russian agency responsible for atomic energy signed agreements with the European Laboratory for Particle Physics, known as CERN, and contributed money, equipment and brainpower to the Large Hadron Collider Project. That project led to the discovery in 2012 of the Higgs boson, a heavy subatomic particle that imbues other elementary particles with mass. Its existence had been predicted a half-century earlier.

And during the 1990s, Russian scientists from Lomonosov Moscow State University joined the international LIGO Scientific Collaboration, which in 2016 announced the first direct detection of gravitational waves, produced by the merger of two massive black holes about 1.3 billion light-years away. The discovery confirmed the prediction of Einstein’s general theory of relativity that such cataclysmic events create ripples in space-time.

But Russia recently decided to terminate its participation in the space station after 2024, and CERN will no longer allow Russian institutes to participate in collider experiments after its contracts with Russia expire that year. What’s more, the European Space Agency has excluded Russia from its planned ExoMars rover project, despite the yearslong delays that will likely result. And notwithstanding Russia’s efforts in support of the X-ray laser project known as European XFEL in Germany, which has opened new opportunities for research in materials science, biology and physics, scientists and institutions based in Russia cannot (at least for now) perform new experiments at this facility.

Scientific research has advanced to such an extent since the end of the Cold War that such large, expensive international projects are the only way to push back the frontiers in many disciplines. Individual nations no longer have sufficient financial and intellectual resources to pursue the science unilaterally. The current retreat from Russian involvement in these big projects can in this way easily curtail scientific progress — as well as impair international relations more broadly.

CERN was established in a suburb of Geneva in the early 1950s to promote peaceful cooperation among European nations, which had experienced two disastrous wars during the previous 40 years. Organizers viewed nuclear and high-energy physics as promising disciplines that invited cooperation. And it succeeded. With the discovery in the early 1980s of the W and Z bosons, which together are responsible for one of the four fundamental forces that govern the behavior of matter in the universe, CERN established itself as the world’s premier laboratory for high-energy physics. To many European leaders, it had become the highest expression of continental unity — reason enough to approve its multibillion-euro LHC project in the 1990s.

After the Soviet Union dissolved in 1991, the funding of many of its institutes for scientific research collapsed. CERN became the principal venue where Russian high-energy physicists could continue doing cutting-edge research. And CERN had begun to seek additional LHC funding from well beyond its European member nations. Physicists from Russia’s Joint Institute for Nuclear Research joined the gargantuan Compact Muon Solenoid experiment on this collider, contributing to its design and supplying sophisticated equipment. They could take due credit for their part in the breakthrough Higgs boson discovery — perhaps the pinnacle of international scientific achievement. Russia became an important player in a “world laboratory” knit together by the internet and Web, which now includes Canada, China, India, Japan, the United States and many other non-European nations.

Part of the rationale for establishing CERN was to promote international understanding among researchers working toward common scientific goals. It has proved a wonderful polyglot place. Although English and French dominate conversations in labs, offices and the cafeteria, national differences seem to melt away amid vigorous technical exchanges and good food.

But this scientific camaraderie begins to dissolve when one of the participant nations savagely attacks another. During the first month of the Russian invasion of Ukraine, thousands of Russian scientists signed a petition opposing the attack, taking great risks to their careers and livelihoods. In contrast, Russian scientific institutes have toed the Kremlin line — dependent as they are on its continued support.

Collaborations on the basis of individual relationships may continue with some Russian scientists. This intellectual exchange is certainly valuable. But one can easily imagine that pullbacks and withdrawals will continue on other large scientific projects, if they haven’t already, to the detriment of international relations generally. That would be an unfortunate aspect of a renewed bifurcation of the world order much like what happened during the Cold War. But I sincerely hope that the strong scientific bonds established during the last three decades will survive and help re-establish broader East-West relations.

Tuesday, February 06, 2024

Plans for collider ‘to smash particles together to unveil Universe’s mysteries’

Nina Massey, PA Science Correspondent
Mon, 5 February 2024 



Researchers are developing plans for a new collider that could smash particles together at a greater force than currently possible in a bid to shed light on some of the Universe’s biggest mysteries.

The European Organisation for Nuclear Research’s (Cern) Large Hadron Collider (LHC) will complete its mission around 2040, and experts are looking at what could replace it.

Early estimates suggest the new machine, called the Future Circular Collider (FCC), would cost around £13.7 billion (15 billion Swiss francs).

It is expected to be installed in a tunnel measuring some 91 kilometres in circumference at a depth of between 100 and 400 metres on French and Swiss territory.

Using the highest energies, it will smash particles together in the hope that new findings will change the world of physics, and understanding of how the Universe works.

On Monday, Cern announced that a mid-term feasibility study did not find a “technical showstopper”.

Among other things, the review was also able to identify the ideal location for the infrastructure of the project, and the size of the proposed tunnel.

In 2012, the LHC detected a new particle called the Higgs boson, which provides a new way to look at the Universe.

However, dark matter and dark energy have remained elusive, and researchers hope the new collider will be able to answer some of science’s greatest unanswered questions.

Cern’s director general, Professor Fabiola Gianotti, said: “The FCC will be an unprecedented instrument to explore the laws of physics and of nature, at the smallest scales and at the highest energies.”

She added: “[It] will allow us to address some of the outstanding questions in fundamental physics today in our knowledge of the fundamental constituents of matter and the structure and evolution of the Universe.”

Addressing critics who suggest the project is very expensive, and there are no guarantees it will answer outstanding questions about the Universe, Eliezer Rabinovici, president of the Cern council, said the aim was to build “discovery machines”, and not “confirmation machines”.

Prof Gianotti added: “We build the facility, and experimental facilities not to run behind the prediction, [or] correct calculation.

“Our goal is to address open questions, then of course, theories develop, and ideas on how to answer those questions.

“But nature may have chosen a completely different path. So our goal is to look at the open question and try to find an answer, whichever answer, nature has decided out there.

“It’s true that at the moment, we do not have a clear theoretical guidance on what we should look for, but it is exactly at times where we lack theoretical guidance – which means we do not have a clear idea of how nature may answer the open question – that we need to build instruments.

“Because the instruments will allow us to make a big step forward towards addressing the question, or also telling us what are the right questions to ask.”

If approved, the FCC could be running by the early to mid 2040s.

Professor Tim Gershon, of the elementary particle physics group at the University of Warwick, said: “The so-called Future Circular Collider is Cern’s proposal to address this challenge.

“It will provide the ability to measure the properties of the Higgs boson with unprecedented precision, and in so doing to look at the Universe in new ways.

“It is hoped that this will provide answers to some of the most important fundamental questions about the Universe, such as what happened in its earliest moments.

“The latest report on the ongoing FCC feasibility studies is encouraging – in the most optimistic scenario the new collider could start to produce data in just over two decades from now.

“But there is still a very long way to go.”

Monday, September 05, 2022

Europe’s Energy Crisis Could Force The Large Hadron Collider To Be Idled

  • A combination of factors is feeding into a major energy crisis in Europe at the moment, forcing households to ration their power and industrial companies to shut plants.
  • Now, the Large Hadron Collider, the world’s largest and most powerful particle accelerator may have to be idled to ensure grid stability in France and Switzerland.
  • The European Organization for Nuclear Research, CERN, will shut down other accelerators first, claiming that it could reduce its power use by 25% without idling the LHC.

The energy crisis in Europe is not only disrupting businesses and household finances, but it’s also hitting at the heart of crucial scientific research and experiments. 

The European Organization for Nuclear Research, CERN, the world’s largest particle physics lab and home of the Large Hadron Collider, could shut down some accelerators and could even idle the LHC to ensure grid stability in the nearby French and Swiss regions amid the severe energy crisis in Europe, Serge Claudet, chair of the CERN energy management panel, told The Wall Street Journal.

Europe is experiencing an unprecedented energy crisis amid halted Russian gas supply via the Nord Stream pipeline, low nuclear power generation in France, a power crisis in Switzerland, and sky-high gas and power prices.    

Large European industrial companies have already announced plant or production line closures due to soaring gas and energy prices, while governments in Europe are drafting plans to potentially ration gas supply to industries according to their specific priorities.  

The crisis became much worse at the end of last week, when Russian gas giant Gazprom said after three-day maintenance on Friday that Nord Stream would remain shut until “operational defects in the equipment are eliminated”, upping the ante in its gas war against Europe. 

For most governments in Europe, the indefinite suspension of Russian gas flows through the main pipeline to Germany wasn’t a surprise; they had expected such a move from Putin. But this doesn’t make the EU’s task of keeping the lights and heating on this winter any easier. Switzerland and France – whose grids CERN uses to power its supercollider, seven other particle accelerators for studying matter and two decelerators for studying antimatter – are among the worst hit.

Switzerland has admitted that the country might have to resort to using oil for electricity generation this winter as Europe is dealing with low levels of Russian natural gas supply, which could be cut even further or cut off altogether.

In France, year-ahead power prices surged past 1,000 euros ($1,001) per megawatt-hour for the first time ever last month. French power prices have now soared tenfold over the past year, as drought and hot weather this summer have added to France’s nuclear power generation problems at the worst possible moment. EDF will restart all its nuclear reactors in the country this winter, French Energy Transition Minister Agnès Pannier-Runacher said last week. Currently, more than half of EDF’s reactors are out of operation either because of maintenance or technical issues.

One of EDF’s largest clients is none other than CERN, which uses 1.3 terawatt-hours of electricity annually. That’s enough electricity to power 300,000 UK homes for a year. At peak consumption, usually from May to mid-December, CERN draws about 200 MW, roughly a third of the power consumed by the nearby city of Geneva in Switzerland.
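Those figures are easy to sanity-check. The sketch below uses only the numbers quoted in the article; the roughly 230-day run length is my own estimate of the May to mid-December window.

```python
# Quick sanity check of the CERN electricity figures quoted in the article.
# Assumptions: 1.3 TWh/year, 300,000 homes and 200 MW peak come from the
# text; the ~230-day length of the May to mid-December run is an estimate.

ANNUAL_TWH = 1.3        # CERN's stated annual electricity use
UK_HOMES = 300_000      # homes the article says that energy could supply
PEAK_MW = 200           # stated peak draw during the running period
RUN_DAYS = 230          # approx. length of May to mid-December

# Implied consumption per home: 1.3e9 kWh spread over 300,000 homes
kwh_per_home = ANNUAL_TWH * 1e9 / UK_HOMES
print(f"{kwh_per_home:,.0f} kWh per home per year")

# Energy if the 200 MW peak were sustained for the whole run (MWh -> TWh)
run_twh = PEAK_MW * RUN_DAYS * 24 / 1e6
print(f"{run_twh:.2f} TWh over the run")
```

The division gives roughly 4,300 kWh per home per year, in line with historical UK household electricity consumption, and a 200 MW draw sustained over the run accounts for about 1.1 TWh of the stated 1.3 TWh annual total, so the quoted numbers hang together.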

May to mid-December is the period of active work at the Large Hadron Collider, the world’s largest and most powerful particle accelerator, which ten years ago discovered the Higgs boson, the particle that gives other elementary particles their mass. The collider was restarted just this July after a three-and-a-half-year hiatus for upgrades.

However, due to the energy crisis, CERN is now considering how it could idle the world’s most powerful collider. 

“Our concern is really grid stability, because we do all we can to prevent a blackout in our region,” Claudet told the Journal.  

CERN and its power supplier, EDF, are now discussing daily warnings of power-grid instability that would tell the research complex when it needs to conserve energy and use less electricity, Claudet told the WSJ. The organization will shut down other accelerators first, before possibly resorting to a shutdown of the world’s largest particle accelerator, he added. Shutting down some of the other accelerators could lower CERN’s power use by 25%.

By Charles Kennedy for Oilprice.com

Sunday, December 11, 2022

Nuclear theorists collaborate to explore 'heavy flavor' particles

Leading US researchers will develop framework for describing exotic particles' behavior at various stages in the evolution of hot nuclear matter

Grant and Award Announcement

DOE/BROOKHAVEN NATIONAL LABORATORY


Tracking Heavy Quarks 

IMAGE: Collisions at the Relativistic Heavy Ion Collider (RHIC) produce a hot soup of quarks and gluons (center), and ultimately thousands of new particles. A new theory collaboration seeks to understand how heavy quarks (Q) and antiquarks (Q-bar) interact with this quark-gluon plasma (QGP) and transform into composite particles that strike the detector. Tracking these "heavy flavor" particles can help scientists unravel the underlying microscopic processes that drive the properties of the QGP.

CREDIT: BROOKHAVEN NATIONAL LABORATORY

UPTON, NY—Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory will participate in a new Topical Theory Collaboration funded by DOE’s Office of Nuclear Physics to explore the behavior of so-called “heavy flavor” particles. These particles are made of quarks of the “charm” and “bottom” varieties, which are heavier and rarer than the “up” and “down” quarks that make up the protons and neutrons of ordinary atomic nuclei. By understanding how these exotic particles form, evolve, and interact with the medium created during powerful particle collisions, scientists will gain a deeper understanding of a unique form of matter known as a quark-gluon plasma (QGP) that filled the early universe.

These experiments take place at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven Lab and the Large Hadron Collider (LHC) at Europe’s CERN laboratory. Scientists accelerate and smash together the nuclei of heavy atoms at energies high enough to set free the quarks and gluelike “gluons” that hold ordinary matter together. These collisions create a soup of quarks and gluons much like the matter that existed just after the Big Bang, some 14 billion years ago.

A powerful theory, known as quantum chromodynamics (QCD), describes very accurately how the plasma’s quarks and gluons interact. But understanding how those fundamental interactions lead to the complex characteristics of the plasma—a trillion-degree, dense medium that flows like a fluid with no resistance—remains a great challenge in modern research.

The Heavy-Flavor Theory (HEFTY) for QCD Matter Topical Theory Collaboration, which will be led by Ralf Rapp from Texas A&M University, seeks to close that gap in understanding by developing a rigorous and comprehensive theoretical framework for describing how heavy-flavor particles interact with the QGP.

“With a heavy-flavor framework in place, experiments tracking these particles can be used to precisely probe the plasma’s properties,” said Peter Petreczky, a theorist at Brookhaven Lab, who will serve as co-spokesperson for the collaboration along with Ramona Vogt from DOE’s Lawrence Livermore National Laboratory. “Our framework will also provide a foundation for using heavy-flavor particles as a probe at the future Electron-Ion Collider (EIC). Future experiments at the EIC will probe different forms of cold nuclear matter which are the precursors of the QGP in the laboratory,” Petreczky said.

In heavy ion collisions at RHIC and the LHC, heavy charm and bottom quarks are produced upon initial impact of the colliding nuclei. Their large masses cause a diffusive motion that can serve as a marker of the interactions in the QGP, including the fundamental process of quarks binding together to form composite particles called hadrons.
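That diffusive motion is commonly formalized (a standard approach in heavy-flavor theory, though not spelled out in the announcement) as a Langevin process: a heavy quark of mass M moving through a plasma at temperature T feels a drag force plus random thermal kicks.

```latex
% Langevin sketch of heavy-quark diffusion in the QGP (textbook form):
% p is the quark's momentum, \eta_D the drag coefficient, and \xi(t) a
% random force whose strength \kappa measures the thermal kicks.
\frac{dp}{dt} = -\eta_D\, p + \xi(t),
\qquad
\langle \xi(t)\, \xi(t') \rangle = \kappa\, \delta(t - t')

% The fluctuation-dissipation (Einstein) relation fixes the drag in terms
% of \kappa, and the spatial diffusion coefficient D_s follows:
\eta_D = \frac{\kappa}{2 M T},
\qquad
D_s = \frac{2 T^2}{\kappa}
```

A small D_s means the heavy quark is kicked around strongly, that is, it couples strongly to the plasma; pinning down transport coefficients of this kind is the sort of quantitative comparison a heavy-flavor framework makes possible.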

“The framework needs to describe these particles from their initial production when the nuclei first collide, through their subsequent diffusion through the QGP and hadronization,” Petreczky said. “And these descriptions need to be embedded into realistic numerical simulations that enable quantitative comparisons to experimental data.”

Swagato Mukherjee of Brookhaven Lab will be a co-principal investigator in the collaboration, responsible for lattice QCD computations. These calculations require some of the world’s most powerful supercomputers to handle the complex array of variables involved in quark-gluon interactions.

“Recently there has been significant progress in lattice QCD calculations related to heavy flavor probes of QGP,” Mukherjee said. “We are in an exciting time when the exascale computing facilities and the support provided by the topical collaboration will enable us to perform realistic calculations of the key quantities needed for theoretical interpretation of experimental results on heavy flavor probes.”

In addition to lattice QCD, the collaboration will use a variety of theoretical approaches, including rigorous statistical data analysis, to obtain the transport properties of the QGP.

“The resulting framework will help us unravel the underlying microscopic processes that drive the properties of the QGP, thereby providing unprecedented insights into the inner workings of nuclear matter based on QCD,” said Rapp of Texas A&M, the principal investigator of the project.

The HEFTY collaboration will receive $2.5 million from the DOE Office of Science, Office of Nuclear Physics, over five years. That funding will provide partial support for six graduate students and three postdoctoral fellows at 10 institutions, as well as a senior staff position at one of the national laboratories. It will also establish a bridge junior faculty position at Kent State University.

Partnering institutions include Brookhaven National Laboratory, Duke University, Florida State University, Kent State University, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Massachusetts Institute of Technology, Texas A&M University, and Thomas Jefferson National Accelerator Facility.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.


Tuesday, July 07, 2020

New collection of stars, not born in our galaxy, discovered in Milky Way

Caltech researchers use deep learning and supercomputing to identify Nyx, a product of a long-ago galaxy merger
UNIVERSITY OF TEXAS AT AUSTIN, TEXAS ADVANCED COMPUTING CENTER
Image: Still from a simulation of individual galaxies forming, starting at a time when the universe was just a few million years old. Credit: Hopkins Research Group, Caltech
Astronomers can go their whole career without finding a new object in the sky. But for Lina Necib, a postdoctoral scholar in theoretical physics at Caltech, the discovery of a cluster of stars in the Milky Way, but not born of the Milky Way, came early - with a little help from supercomputers, the Gaia space observatory, and new deep learning methods.
Writing in Nature Astronomy this week, Necib and her collaborators describe Nyx, a vast new stellar stream in the vicinity of the Sun that may provide the first indication that a dwarf galaxy had merged with the Milky Way disk. Such stellar streams are thought to be globular clusters or dwarf galaxies that were stretched out along their orbits by tidal forces before being completely disrupted.
The discovery of Nyx took a circuitous route, but one that reflects the multifaceted way astronomy and astrophysics are studied today.
FIRE in the Cosmos
Necib studies the kinematics -- or motions -- of stars and dark matter in the Milky Way. "If there are any clumps of stars that are moving together in a particular fashion, that usually tells us that there is a reason that they're moving together."
Since 2014, researchers from Caltech, Northwestern University, UC San Diego and UC Berkeley, among other institutions, have been developing highly-detailed simulations of realistic galaxies as part of a project called FIRE (Feedback In Realistic Environments). These simulations include everything scientists know about how galaxies form and evolve. Starting from the virtual equivalent of the beginning of time, the simulations produce galaxies that look and act much like our own.
Mapping the Milky Way
Concurrent to the FIRE project, the Gaia space observatory was launched in 2013 by the European Space Agency. Its goal is to create an extraordinarily precise three-dimensional map of about one billion stars throughout the Milky Way galaxy and beyond.
"It's the largest kinematic study to date. The observatory provides the motions of one billion stars," she explained. "A subset of it, seven million stars, have 3D velocities, which means that we can know exactly where a star is and its motion. We've gone from very small datasets to doing massive analyses that we couldn't do before to understand the structure of the Milky Way."
The discovery of Nyx involved combining these two major astrophysics projects and analyzing them using deep learning methods.
Among the questions that both the simulations and the sky survey address is: How did the Milky Way become what it is today?
"Galaxies form by swallowing other galaxies," Necib said. "We've assumed that the Milky Way had a quiet merger history, and for a while it was concerning how quiet it was because our simulations show a lot of mergers. Now, with access to a lot of smaller structures, we understand it wasn't as quiet as it seemed. It's very powerful to have all these tools, data and simulations. All of them have to be used at once to disentangle this problem. We're at the beginning stages of being able to really understand the formation of the Milky Way."
Applying Deep Learning to Gaia
A map of a billion stars is a mixed blessing: so much information, but nearly impossible to parse by human perception.
"Before, astronomers had to do a lot of looking and plotting, and maybe use some clustering algorithms. But that's not really possible anymore," Necib said. "We can't stare at seven million stars and figure out what they're doing. What we did in this series of projects was use the Gaia mock catalogues."
The Gaia mock catalogue, developed by Robyn Sanderson (University of Pennsylvania), essentially asked: 'If the FIRE simulations were real and observed with Gaia, what would we see?'
Necib's collaborator, Bryan Ostdiek (formerly at University of Oregon, and now at Harvard University), who had previously been involved in the Large Hadron Collider (LHC) project, had experience dealing with huge datasets using machine and deep learning. Porting those methods over to astrophysics opened the door to a new way to explore the cosmos.
"At the LHC, we have incredible simulations, but we worry that machines trained on them may learn the simulation and not real physics," Ostdiek said. "In a similar way, the FIRE galaxies provide a wonderful environment to train our models, but they are not the Milky Way. We had to learn not only what could help us identify the interesting stars in simulation, but also how to get this to generalize to our real galaxy."
The team developed a method of tracking the movements of each star in the virtual galaxies and labelling the stars as either born in the host galaxy or accreted as the products of galaxy mergers. The two types of stars have different signatures, though the differences are often subtle. These labels were used to train the deep learning model, which was then tested on other FIRE simulations.
After they built the catalogue, they applied it to the Gaia data. "We asked the neural network, 'Based on what you've learned, can you label if the stars were accreted or not?'" Necib said.
The model ranked how confident it was that a star was born outside the Milky Way on a range from 0 to 1. The team created a cutoff with a tolerance for error and began exploring the results.
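That final selection step can be sketched in a few lines of Python. This is not the authors' actual pipeline; the scores and the 0.95 cutoff below are invented for illustration:

```python
# Toy sketch of selecting "accreted" candidates from classifier output.
# Each score is the model's confidence (0 to 1) that a star was born
# outside the Milky Way; the cutoff trades purity against completeness.

def select_accreted(scores, cutoff=0.95):
    """Return indices of stars whose accretion score passes the cutoff."""
    return [i for i, s in enumerate(scores) if s >= cutoff]

scores = [0.12, 0.97, 0.55, 0.99, 0.96]   # hypothetical per-star confidences
print(select_accreted(scores))            # prints [1, 3, 4]
```

Raising the cutoff yields a cleaner but smaller sample, which is why the team explicitly chose a cutoff "with a tolerance for error" before exploring the results.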
This approach of training a model on one dataset and applying it to a different but related one is called transfer learning, and it can be fraught with challenges. "We needed to make sure that we're not learning artificial things about the simulation, but really what's going on in the data," Necib said. "For that, we had to give it a little bit of help and tell it to reweigh certain known elements to give it a bit of an anchor."
They first checked to see if it could identify known features of the galaxy. These include "the Gaia sausage" -- the remains of a dwarf galaxy that merged with the Milky Way about six to ten billion years ago and that has a distinctive sausage-like orbital shape.
"It has a very specific signature," she explained. "If the neural network worked the way it's supposed to, we should see this huge structure that we already know is there."
The Gaia sausage was there, as was the stellar halo -- background stars that give the Milky Way its tell-tale shape -- and the Helmi stream, another known dwarf galaxy that merged with the Milky Way in the distant past and was discovered in 1999.
First Sighting: Nyx
The model identified another structure in the analysis: a cluster of 250 stars, rotating with the Milky Way's disk, but also going toward the center of the galaxy.
"Your first instinct is that you have a bug," Necib recounted. "And you're like, 'Oh no!' So, I didn't tell any of my collaborators for three weeks. Then I started realizing it's not a bug, it's actually real and it's new."
But what if it had already been discovered? "You start going through the literature, making sure that nobody has seen it and luckily for me, nobody had. So I got to name it, which is the most exciting thing in astrophysics. I called it Nyx, the Greek goddess of the night. This particular structure is very interesting because it would have been very difficult to see without machine learning."
The project required advanced computing at many different stages. The FIRE and updated FIRE-2 simulations are among the largest computer models of galaxies ever attempted. Each of the nine main simulations -- three separate galaxy formations, each with a slightly different starting point for the sun -- took months to compute on the largest, fastest supercomputers in the world. These included Blue Waters at the National Center for Supercomputing Applications (NCSA), NASA's High-End Computing facilities, and most recently Stampede2 at the Texas Advanced Computing Center (TACC).
The researchers used clusters at the University of Oregon to train the deep learning model and to apply it to the massive Gaia dataset. They are currently using Frontera, the fastest system at any university in the world, to continue the work.
"Everything about this project is computationally very intensive and would not be able to happen without large-scale computing," Necib said.
Future Steps
Necib and her team plan to explore Nyx further using ground-based telescopes. This will provide information about the chemical makeup of the stream, and other details that will help them date Nyx's arrival into the Milky Way, and possibly provide clues on where it came from.
The next data release of Gaia in 2021 will contain additional information about 100 million stars in the catalogue, making more discoveries of accreted clusters likely.
"When the Gaia mission started, astronomers knew it was one of the largest datasets that they were going to get, with lots to be excited about," Necib said. "But we needed to evolve our techniques to adapt to the dataset. If we didn't change or update our methods, we'd be missing out on physics that are in our dataset."
The successes of the Caltech team's approach may have an even bigger impact. "We're developing computational tools that will be available for many areas of research and for non-research related things, too," she said. "This is how we push the technological frontier in general."
###

Tuesday, March 23, 2021

Cern experiment hints at new force of nature

MAGICK BY ANY OTHER NAME
IT'S A QUANTUM UNIVERSE ANYTHING CAN HAPPEN


Experts reveal ‘cautious excitement’ over unstable particles that fail to decay as standard model suggests

Ian Sample Science editor 
THE GUARDIAN
@iansample
Tue 23 Mar 2021 


Scientists at the Large Hadron Collider near Geneva have spotted an unusual signal in their data that may be the first hint of a new kind of physics.

The LHCb collaboration, one of four main teams at the LHC, analysed 10 years of data on how unstable particles called B mesons, created momentarily in the vast machine, decayed into more familiar matter such as electrons.

The mathematical framework that underpins scientists’ understanding of the subatomic world, known as the standard model of particle physics, firmly maintains that the particles should break down into products that include electrons at exactly the same rate as they do into products that include a heavier cousin of the electron, a particle called a muon.
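In essence, the test reduces to comparing decay counts between the two channels. The sketch below is illustrative only, not an LHCb analysis, and the counts are invented; it just shows that the standard model's prediction of lepton universality corresponds to a ratio near 1:

```python
# Toy lepton-universality check: the standard model predicts B mesons
# decay to muon and electron final states at (nearly) the same rate,
# so the ratio of counts should be close to 1. Counts here are made up.

def decay_ratio(n_muon, n_electron):
    """Ratio of muon-channel to electron-channel decay counts."""
    return n_muon / n_electron

print(decay_ratio(846, 1000))  # prints 0.846; a value well below 1 would hint at new physics
```

A real measurement would of course correct for detection efficiencies and estimate statistical and systematic uncertainties before interpreting any deviation from 1.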

A man rides his bicycle along the beam line of the Large Hadron Collider.
Photograph: Valentin Flauraud/AFP via Getty Images

But results released by Cern on Tuesday suggest that something unusual is happening. The B mesons are not decaying in the way the model says they should: instead of producing electrons and muons at the same rate, nature appears to favour the route that ends with electrons.

“We would expect this particle to decay into the final state containing electrons and the final state containing muons at the same rate as each other,” said Prof Chris Parkes, an experimental particle physicist at the University of Manchester and spokesperson for the LHCb collaboration. “What we have is an intriguing hint that maybe these two processes don’t happen at the same rate, but it’s not conclusive.”

In physics parlance, the result has a significance of 3.1 sigma, meaning the chance of it being a fluke is about one in 1,000. While that may sound like convincing evidence, particle physicists tend not to claim a new discovery until a result reaches a significance of five sigma, where the chance of it being a statistical quirk is reduced to about one in a few million.
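The conversion from sigma to a fluke probability quoted above is just the tail of a Gaussian distribution, which can be checked with a couple of lines of standard-library Python:

```python
from math import erfc, sqrt

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a given significance in sigma."""
    return 0.5 * erfc(sigma / sqrt(2))

# 3.1 sigma: roughly one chance in a thousand of being a fluke
print(one_sided_p(3.1))
# 5.0 sigma: roughly one chance in a few million
print(one_sided_p(5.0))
```

This matches the article's figures: about one in 1,000 at 3.1 sigma, and about one in a few million at the five-sigma discovery threshold.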

“It’s an intriguing hint, but we have seen sigmas come and go before. It happens surprisingly frequently,” Parkes said.

The standard model of particle physics describes the particles and forces that govern the subatomic world. Constructed over the past half century, it defines how elementary particles called quarks build protons and neutrons inside atomic nuclei, and how these, usually combined with electrons, make up all known matter. The model also explains three of the four fundamental forces of nature: electromagnetism; the strong force, which holds atomic nuclei together; and the weak force, which causes nuclear reactions in the sun.

But the standard model does not describe everything. It does not explain the fourth force, gravity, and perhaps more strikingly, says nothing about the 95% of the universe that physicists believe is not constructed from normal matter.


Much of the cosmos, they believe, consists of dark energy, a force that appears to be driving the expansion of the universe, and dark matter, a mysterious substance that seems to hold the cosmic web of matter in place like an invisible skeleton.

 ONCE UPON A TIME IT WAS KNOWN AS AETHER TO SCIENTISTS 

“If it turns out, with extra analysis of additional processes, that we were able to confirm this, it would be extremely exciting,” Parkes said. “It would mean there is something wrong with the standard model and that we require something extra in our fundamental theory of particle physics to explain how this would happen.”

Despite the uncertainties over this particular result, Parkes said when combined with other results on B mesons, the case for something unusual happening became more convincing.

“I would say there is cautious excitement. We’re intrigued because not only is this result quite significant, it fits the pattern of some previous results from LHCb and other experiments worldwide,” he said.

Ben Allanach, a professor of theoretical physics at the University of Cambridge, agrees that taken together with other findings, the latest LHCb result is exciting. “I really think this will turn into something,” he said.

If the result turns out to be true, it could be explained by so-far hypothetical particles called Z primes or leptoquarks that bring new forces to bear on other particles.

“There could be a new quantum force that makes the B mesons break up into muons at the wrong rate. It’s sticking them together and stopping them decaying into muons at the rate we’d expect,” Allanach said. “This force could help explain the peculiar pattern of different matter particles’ masses.”

B mesons contain elementary particles called beauty quarks, also known as bottom quarks.

Scientists will collect more data from the LHC and other experiments around the world, such as Belle II in Japan, in the hope of confirming what is happening.