Learning to speak quantum
Particle physicists are studying ways to harness the power of the quantum realm to further their research.
In a 1981 lecture, the famed physicist Richard Feynman wondered if a computer could ever simulate the entire universe. The difficulty with this task is that, on the smallest scales, the universe operates under strange rules: Particles can be here and there at the same time; objects separated by immense distances can influence each other instantaneously; the simple act of observing can change the outcome of reality.
“Nature isn’t classical, dammit,” Feynman told his audience, “and if you want to make a simulation of nature, you’d better make it quantum mechanical.”
Quantum computers
Feynman was imagining a quantum computer, a computer with bits that acted like the particles of the quantum world. Today, nearly 40 years later, such computers are starting to become a reality, and they pose a unique opportunity for particle physicists.
“The systems that we deal with in particle physics are intrinsically quantum mechanical systems,” says Panagiotis Spentzouris, head of Fermilab’s Scientific Computing Division. “Classical computers cannot simulate large entangled quantum systems. You have plenty of problems that we would like to be able to solve accurately without making approximations that we hope we will be able to do on the quantum computer.”
Quantum computers allow for a more realistic representation of quantum processes. They take advantage of a phenomenon known as superposition, in which a particle such as an electron exists in a probabilistic state spread across multiple locations at once.
Unlike a classical computer bit, which can be either on or off, a quantum bit—or qubit—can be on, off, or a superposition of both on and off, allowing for computations to be performed simultaneously instead of sequentially.
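As a loose illustration, here is how a single qubit’s state and readout probabilities might be represented (a toy sketch in Python, not tied to any real quantum hardware):

```python
import math

# A loose illustration (not any real hardware): a qubit's state as two
# complex amplitudes (alpha, beta); their squared magnitudes give the
# probabilities of measuring "off" (0) or "on" (1).
def measurement_probabilities(alpha, beta):
    norm = abs(alpha) ** 2 + abs(beta) ** 2
    return abs(alpha) ** 2 / norm, abs(beta) ** 2 / norm

# A classical bit corresponds to a definite state:
p0, p1 = measurement_probabilities(1, 0)        # certainly "off"

# An equal superposition is "both at once" until measured:
s = 1 / math.sqrt(2)
q0, q1 = measurement_probabilities(s, s)        # 50/50 on readout
```

Until the superposed qubit is read out, both outcomes coexist, which is what lets a register of qubits explore many possibilities at once.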
This not only speeds up computations; it makes currently impossible ones possible. A problem that would bog down a classical computer for an impractically long time as it tested possibility after possibility could, in principle, be solved far more quickly by a quantum computer. This processing power could be key for particle physicists, who wade through enormous amounts of data generated by detectors.
In the first demonstration of this potential, a team at Caltech recently used a type of quantum computer called a quantum annealer to “rediscover” the Higgs boson, the particle that, according to the Standard Model of particle physics, gives mass to every other fundamental particle.
Scientists originally discovered the Higgs boson in 2012 using particle detectors at the Large Hadron Collider at CERN, the European particle physics research center. They created Higgs bosons by temporarily converting the energy of particle collisions into matter. Those temporary Higgs bosons quickly decayed, converting their energy into other, more common particles, which the detectors were able to measure.
Scientists identified the mass of the Higgs boson by adding up the masses of those less massive particles, the decay products. But to do so, they needed to pick out which of those particles came from the decay of Higgs bosons, and which ones came from something else. To a detector, a Higgs boson decay can look remarkably similar to other, much more common decays.
LHC scientists trained a machine learning algorithm to find the Higgs signal against the decay background—the needle in the haystack. This training process required a huge amount of simulated data.
Physicist Maria Spiropulu, who was on the team that discovered the Higgs the first time around, wanted to see if she could improve the process with quantum computing. The group she leads at Caltech used a quantum computer from a company called D-Wave to train a similar machine learning algorithm. They found that the quantum computer trained the machine learning algorithm on a significantly smaller amount of data than the classical method required. In theory, this would give the algorithm a head start, like giving someone looking for the needle in the haystack expert training in spotting the glint of metal before turning their eyes to the hay.
“The machine cannot learn easily,” Spiropulu says. “It needs huge, huge data. In the quantum annealer, we have a hint that it can learn with small data, and if you learn with small data you can use it as initial conditions later.”
Some scientists say it may take a decade or more before quantum computers are used regularly in particle physics, but in the meantime, researchers will continue to make advances that support their work.
Quantum sensors
Quantum mechanics is also disrupting another technology used in particle physics: the sensor, the part of a particle detector that picks up the energy from a particle interaction.
In the quantum world, energy is discrete. The noun quantum means “a specific amount” and is used in physics to mean “the smallest quantity of energy.” Classical sensors generally do not make precise enough measurements to pick up individual quanta of energy, but a new type of quantum sensor can.
“A quantum sensor is one that is able to sense these individual packets of energy as they arrive,” says Aaron Chou, a scientist at Fermilab. “A non-quantum sensor would not be able to resolve the individual arrivals of each of these little packets of energy, but would instead measure a total flow of the stuff.”
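The size of these packets follows from Planck’s relation, E = hf. As a rough illustration (the 10 GHz frequency is an assumed value typical of microwave experiments, not a figure from Chou’s detector):

```python
# Illustrative numbers (not from the article): the energy of a single
# quantum of light is E = h * f. A quantum sensor must resolve packets
# this small as they arrive one by one.
h = 6.62607015e-34          # Planck constant in joule-seconds (exact SI value)

def photon_energy(freq_hz):
    return h * freq_hz

E = photon_energy(10e9)     # an assumed 10 GHz microwave photon
# E is roughly 6.6e-24 joules: an almost unimaginably small packet.
```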
Chou is taking advantage of these quantum sensors to probe the nature of dark matter. Using technology originally developed for quantum computers, Chou and his team are building ultrasensitive detectors for a type of theorized dark matter particle known as an axion.
“We’re taking one of the qubit designs that was previously created for quantum computing and we’re trying to use those to sense the presence of photons that came from the dark matter,” Chou says.
For Spiropulu, these applications of quantum computers represent an elegant feedback system in the progression of technology and scientific application. Basic research in physics led to the initial transistors that fed the computer science revolution, which is now on the edge of transforming basic research in physics.
“You want to disrupt computing, which was initially a physics advance,” Spiropulu says. “Now we are using physics configurations and physics systems themselves to assist computer science to solve any problem, including physics problems.”
Sterile neutrino sleuths
Meet the detectors of Fermilab’s Short-Baseline Neutrino Program, hunting for signs of a possible fourth type of neutrino.
Neutrinos are not a sociable bunch. Every second, trillions upon trillions of the tiny particles shoot down to Earth from space, but the vast majority don’t stop in to pay a visit—they continue on their journey, almost completely unaffected by any matter they come across.
Their reluctance to hang around is what makes it such a challenge to study them. But the Short-Baseline Neutrino (SBN) Program at the US Department of Energy’s Fermilab is doing just that: further unraveling the mysteries of neutrinos with three vast detectors filled with ultrapure liquid argon.
Argon is an inert substance normally found in the air around us—and, once isolated, an excellent medium for studying neutrinos. A neutrino colliding with an argon nucleus leaves behind a signature track and a spray of new particles such as electrons or photons, which can be picked up inside a detector.
SBN uses three detectors along a straight line in the path of a specially designed neutrino source called the Booster Neutrino Beamline (BNB) at Fermilab. Scientists calculated the exact positions that would yield the most interesting and useful results from the experiment.
The detectors study a property of neutrinos that scientists have known about for a while but do not have a complete grasp on: oscillations, the innate ability of neutrinos to change their form as they travel. Neutrinos come in three known types, or “flavors”: electron, muon and tau. But oscillations mean each of those types is interchangeable with the others, so a neutrino that begins life as a muon neutrino can naturally transform into an electron neutrino by the end of its journey.
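In the usual two-flavor approximation, the chance that a neutrino has changed flavor depends on the distance it has traveled, its energy and two nature-given parameters. A small sketch (with illustrative placeholder parameters, not measured values) shows why detector placement matters:

```python
import math

# A hedged sketch using the standard two-flavor approximation for the
# probability that a neutrino produced as one flavor is detected as
# another after traveling a distance L. The mixing strength and
# mass splitting below are placeholders, not measured numbers.
def oscillation_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    # P = sin^2(2 theta) * sin^2(1.27 * dm^2 * L / E)
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# With assumed parameters, almost nothing has oscillated close to the
# source, and the probability grows farther down the beamline:
p_near = oscillation_probability(0.11, 0.8, 0.9, 1.2)   # 110 m from source
p_far = oscillation_probability(0.60, 0.8, 0.9, 1.2)    # 600 m from source
```

Comparing measurements at different distances along the same beam is exactly how an oscillation experiment isolates the effect.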
Some experiments, however, have come up with intriguing results that suggest there could be a fourth type of neutrino that interacts even less than the three types that have already been documented. An experiment at Los Alamos National Laboratory in 1995 showed the first evidence that a fourth neutrino might exist. It was dubbed the “sterile” neutrino because it appears to be unaffected by anything other than gravity. In 2007, MiniBooNE, a previous experiment at Fermilab, showed possible hints of its existence, too, but neither experiment was powerful enough to say if their results definitively demonstrated the existence of a new type of neutrino.
That’s why it’s crucial to have these three more powerful detectors. Carefully comparing the findings from all three should allow the best measurement yet of whether a sterile neutrino is lurking out of sight. And finding the sterile neutrino would be evidence of new, intriguing physics—something that doesn’t fit our current picture of the world.
These three detectors are international endeavors, funded in part by DOE’s Office of Science, the National Science Foundation, the Science and Technology Facilities Council in the UK, CERN, the National Institute for Nuclear Physics (INFN) in Italy, the Swiss National Science Foundation and others. Each helps further develop the technologies, training and expertise needed to design, build and operate another experiment that has been under construction since July: the Deep Underground Neutrino Experiment (DUNE). This international megascience collaboration hosted by Fermilab will send neutrinos 800 miles from Illinois to the massive DUNE detectors, which will be installed a mile underground at the Sanford Underground Research Facility in South Dakota.
Meet each of the SBN detectors below:
Short-Baseline Near Detector
Closest to the BNB source at just 110 meters, the Short-Baseline Near Detector (SBND) provides a benchmark for the whole experiment, studying the neutrinos just after they leave the source and before they have a chance to oscillate between flavors. Almost a cube shape, the detecting part of the SBND is four meters tall and wide, five meters long and weighs around 260 tons in total—with a 112-ton active liquid argon volume.
With a CERN-designed state-of-the-art membrane design for its cooling cryostat—which keeps the argon in a liquid state—SBND is a pioneering detector in the field of neutrino research. It will test new technologies and techniques that will be used in later neutrino projects such as DUNE.
Due to its proximity to the neutrino source, SBND will collect a colossal amount of interaction data. A secondary, long-term goal of SBND will be to work through this cache to precisely study the physics of these neutrino interactions and even to search for other signs of new physics.
“After a few years of running, we will have recorded millions of neutrino interactions in SBND, which will be a treasure trove of data that we can use to make many measurements,” says David Schmitz, physicist at the University of Chicago and co-spokesperson for the experiment. “Studying these neutrino interactions in this particular type of detector will have long-term value, especially in the context of DUNE, which will use the same detection principles.”
The SBND is well on its way to completion; its groundbreaking took place in April 2016 and its components are being built in Switzerland, the UK, Italy and at CERN.
MicroBooNE
The middle detector, MicroBooNE, was the first of the three detectors to come online. When it did so in 2015, it was the first detector ever to collect data on neutrino interactions in argon at the energies provided by the BNB. The detector sits 360 meters past SBND, nestled as close as possible to its predecessor, MiniBooNE. This proximity is on purpose: MicroBooNE, a more advanced detector, is designed to get a better look at the intriguing results from MiniBooNE.
In all, MicroBooNE weighs 170 tons (with an active liquid argon volume of 89 tons), making it currently the largest operating neutrino detector of its kind—a Liquid Argon Time Projection Chamber (LArTPC)—in the United States. That title will transfer to the far detector, ICARUS (see below), upon its installation in 2018.
While following up on MiniBooNE’s anomaly, MicroBooNE has another important job: providing scientists at Fermilab with useful experience of operating a liquid argon detector, which contributes to the development of new technology for the next generation of experiments.
“We’ve never in history had more than one liquid argon detector on any beamline, and that’s what makes the SBN Program exciting,” says Fermilab’s Sam Zeller, co-spokesperson for MicroBooNE. “It’s the first time we will have at least two detectors studying neutrino oscillations with liquid argon technology.”
Techniques used to fill MicroBooNE with argon will pave the way for the gargantuan DUNE far detector in the future, which will hold more than 400 times as much liquid argon as MicroBooNE. Neutrino detectors rely on the liquid inside being extremely pure, and to achieve this goal, all the air normally has to be pumped out before liquid is put in. But MicroBooNE scientists used a different technique: They pumped argon gas into the detector—which pushed all the air out—and then cooled the gas until it condensed into liquid. This new approach will eliminate the need to evacuate the air from DUNE’s six-story-tall detectors.
Along with contributing to the next generation of detectors, MicroBooNE also contributes to training the next generation of neutrino scientists from around the world. Over half of the collaboration in charge of running MicroBooNE are students and postdocs who bring innovative ideas for analyzing its data.
ICARUS (Imaging Cosmic And Rare Underground Signals)
The largest of SBN’s detectors, ICARUS, is also the most distant from the neutrino source—600 meters down the line. Like SBND and MicroBooNE, ICARUS uses liquid argon to detect neutrinos, with over 700 tons of the dense liquid split between two symmetrical modules. These colossal tanks of liquid argon, together with excellent imaging capabilities, will allow extremely sensitive detection of neutrino interactions when the detector comes online at Fermilab in 2018.
The positioning of ICARUS along the neutrino beamline is crucial to its mission. The detector will measure the proportion of both electron and muon neutrinos that collide with argon nuclei as the intense beam of neutrinos passes through it. By comparing this data with that from SBND, scientists will be able to see if the results match with those from previous experiments and explore whether they could be explained by the existence of a sterile neutrino.
ICARUS, along with MicroBooNE, is also positioned on the Fermilab site close to another neutrino beam, called Neutrinos at the Main Injector (NuMI), which provides neutrinos for the existing experiments at Fermilab and in Minnesota. Unlike the main BNB beam, the NuMI beam will hit ICARUS at an angle through the detector. The goal will be to measure neutrino cross-sections—a measure of their interaction likelihood—rather than their oscillations. The energy of the NuMI beam is similar to that which will be used for DUNE, so ICARUS will provide excellent knowledge and experience to work out the kinks for the huge experiment.
The detector’s journey has been a long one. From its groundbreaking development, construction and operation in Italy at INFN’s Gran Sasso Laboratory under the leadership of Nobel laureate Carlo Rubbia, ICARUS traveled to CERN in Switzerland in 2014 for some renovation and upgrades. Equipped with new observing capabilities, it was then shipped across the Atlantic to Fermilab in 2017, where it is currently being installed in its future home. Scientists intend to begin taking data with ICARUS in 2018.
“ICARUS unlocked the potential of liquid argon detectors, and now it’s becoming a crucial part of our research,” says Peter Wilson, head of Fermilab’s SBN program. “We’re excited to see the data coming out of our short-baseline neutrino detectors and apply the lessons we learn to better understand neutrinos with DUNE.”
Brown University animates science communication
The SciToons program pairs students with different levels of scientific expertise to create animated science explainers.
Once a week at Brown University, professors and students with backgrounds ranging from neuroscience to literary arts come together to collaborate. They're participating in a program called "SciToons," created in 2011 by Oludurotimi Adetunji, an adjunct assistant professor of physics and Brown's associate dean of undergraduate research and inclusive science. The program pairs "experts" with "novices" to search for the best way to combine scientific concepts and animation in a three- to five-minute YouTube video.
"Many people have this perception that those who don't attend college or have a background in science can't understand things like quantum physics," says Aisha Keown-Lang, a senior at Brown double-majoring in biology and political science. "But it is really about the ability to communicate properly."
SciToons aims to reach an audience of students in high school and above, so the video teams must take into account that the viewers will not be aware of all the specialized language scientists use. This is where the mesh between expert and novice comes in; they communicate back and forth to make sure all information is not only scientifically accurate but also understandable for the presumably inexperienced audience. "They fill in each other's gaps," Adetunji says.
Creating a SciToons video takes about six months. After brainstorming a topic, the teams must write a script. "This takes several months because every word is thoroughly thought through," says Torrey Truszkowski, a neuroscience graduate student at Brown. Both the expert and the novice must approve the script before the project moves into the next phase.
After developing the script, the writers hand the project over to the animators, who develop a storyboard and visuals. The experts and novices reconvene to discuss and give feedback on the result.
Before uploading the final product to YouTube, all participants in the program—animators, professors, writers and students from all types of educational backgrounds—come together to ask the final question: "Does this work?"
If their answer is yes, they hit publish and wait to see if their audience agrees. Some videos have struck a definite chord. The most popular SciToons video so far—"How do we see color?"—has reached over 145,000 views.
"I was thinking narrowly with neuroscience, but now I see how I can apply myself in many different ways," Truszkowski says. She is now planning to pursue a career in science communications.
SciToons has also created an atmosphere for scholars to discuss and collaborate on topics outside their own fields of study, Adetunji says. This has allowed both novices and experts to step outside the isolation of their individual career paths.
Currently all members of SciToons are either students or professors at Brown, but Adetunji hopes to eventually include high school students and collaborators in the process as well.
Neural networks for neutrinos
Scientists are using cutting-edge machine-learning techniques to analyze physics data
Particle physics and machine learning have long been intertwined.
One of the earliest examples of this relationship dates back to the 1960s, when physicists were using bubble chambers to search for particles invisible to the naked eye. These vessels were filled with a clear liquid that was heated to just below its boiling point so that even the slightest boost in energy—for example, from a charged particle crashing into it—would cause it to bubble, an event that would trigger a camera to take a photograph.
Female scanners often took on the job of inspecting these photographs for particle tracks. Physicist Paul Hough handed that task over to machines when he developed the Hough transform, a pattern recognition algorithm, to identify them.
The computer science community later adapted the Hough transform for applications such as computer vision, the effort to train computers to replicate the complex function of the human eye.
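A minimal version of that algorithm can be sketched in a few lines (the grid size and voting scheme here are simplified illustrations, not Hough’s original implementation):

```python
import math
from collections import Counter

# A toy sketch of the Hough transform for straight-line tracks, the
# kind of pattern recognition once applied to bubble-chamber photos.
# Every "on" pixel votes for each line, written in the normal form
# rho = x*cos(theta) + y*sin(theta), that could pass through it;
# a peak in the vote accumulator reveals a track.
def hough_lines(points, n_theta=180):
    accumulator = Counter()
    for x, y in points:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            accumulator[(rho, i)] += 1
    return accumulator

# Ten pixels lying on the line y = x (theta = 135 degrees, rho = 0):
track = [(i, i) for i in range(10)]
acc = hough_lines(track)
peak_votes = max(acc.values())   # all ten points vote for the same line
```

Scattered noise pixels spread their votes thinly, so a genuine track stands out as a sharp peak in the accumulator.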
“There’s always been a little bit of back and forth” between these two communities, says Mark Messier, a physicist at Indiana University.
Since then, the field of machine learning has rapidly advanced. Deep learning, a form of artificial intelligence modeled after the human brain, has been implemented for a wide range of applications such as identifying faces, playing video games and even synthesizing life-like videos of politicians.
Over the years, algorithms that help scientists pick interesting aberrations out of background data have been used in physics experiments such as BaBar at SLAC National Accelerator Laboratory and experiments at the Large Electron-Positron Collider at CERN and the Tevatron at Fermi National Accelerator Laboratory. More recently, algorithms that learn to recognize patterns in large datasets have been handy for physicists studying hard-to-catch particles called neutrinos.
This includes scientists on the NOvA experiment, who study a beam of neutrinos created at the US Department of Energy’s Fermilab near Chicago. The neutrinos stream straight through Earth to a 14,000-metric-ton detector filled with liquid scintillator sitting near the Canadian border in Minnesota.
When a neutrino strikes the liquid scintillator, it releases a burst of particles. The detector collects information about the pattern and energy of those particles. Scientists use that information to figure out what happened in the original neutrino event.
“Our job is almost like reconstructing a crime scene,” Messier says. “A neutrino interacts and leaves traces in the detector—we come along afterward and use what we can see to try and figure out what we can about the identity of the neutrino.”
Over the last few years, scientists have started to use algorithms called convolutional neural networks (CNNs) to take on this task instead.
CNNs, which are modeled after the mammalian visual cortex, are widely used in the technology industry—for example, to improve computer vision for self-driving cars. These networks are composed of multiple layers that act somewhat like filters: They contain densely interconnected nodes that possess numerical values, or weights, that are adjusted and refined as inputs pass through.
“The ‘deep’ part comes from the fact that there are many layers to it,” explains Adam Aurisano, an assistant professor at the University of Cincinnati. “[With deep learning] you can take nearly raw data, and by pushing it through these stacks of learnable filters, you wind up extracting nearly optimal features.”
For example, these algorithms can extract details associated with particle interactions of varying complexity from the “images” collected by recording different patterns of energy deposits in particle detectors.
“Those stacks of filters have sort of sliced and diced the image and extracted physically meaningful bits of information that we would have tried to reconstruct before,” Aurisano says.
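The filter idea can be sketched concretely. In this toy example (the kernel is hard-coded for illustration; in a real CNN its weights would be learned from data), a small kernel slides across a tiny image and responds where a vertical edge appears:

```python
# Illustrative sketch of the "filter" in a CNN layer: a small kernel
# slides over an image and responds strongly where the pattern it
# encodes appears. Here the kernel is fixed; in a real CNN the
# weights would be learned during training.
def convolve2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(len(image) - kh + 1):
        row = []
        for c in range(len(image[0]) - kw + 1):
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# A tiny "image" with a bright vertical stripe in the middle column...
image = [[0, 0, 1, 0, 0] for _ in range(5)]
# ...and a kernel that responds to vertical edges:
kernel = [[-1, 1], [-1, 1]]
response = convolve2d(image, kernel)
# The output is large (positive or negative) exactly at the stripe's
# left and right edges, and zero in the flat regions.
```

Stacking many such learned filters, layer upon layer, is what lets a deep network pull physically meaningful features out of nearly raw detector data.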
Although they can be used to classify events without recreating them, CNNs can also be used to reconstruct particle interactions using a method called semantic segmentation.
When applied to an image of a table, for example, this method would reconstruct the object by tagging each pixel associated with it, Aurisano explains. In the same way, scientists can label each pixel associated with characteristics of neutrino interactions, then use algorithms to reconstruct the event.
Physicists are using this method to analyze data collected from the MicroBooNE neutrino detector.
“The nice thing about this process is that you might find a cluster that’s made by your network that doesn’t fit in any interpretation in your model,” says Kazuhiro Terao, a scientist at SLAC National Accelerator Laboratory. “That might be new physics. So we could use these tools to find stuff that we might not understand.”
Scientists working on other particle physics experiments, such as those at the Large Hadron Collider at CERN, are also using deep learning for data analysis.
“All these big physics experiments are really very similar at the machine learning level,” says Pierre Baldi, a computer scientist at the University of California, Irvine. “It's all images associated with these complex, very expensive detectors, and deep learning is the best method for extracting signal against some background noise.”
Although most of the information currently flows from computer scientists to particle physicists, other communities may also gain new tools and insights from these experimental applications.
For example, according to Baldi, one question that’s currently being discussed is whether scientists can write software that works across all these physics experiments with a minimal amount of human tuning. If this goal were achieved, it could benefit other fields, such as biomedical imaging, that use deep learning as well. “[The algorithm] would look at the data and calibrate itself,” he says. “That’s an interesting challenge for machine learning methods.”
Another future direction, Terao says, would be to get machines to ask questions—or, more simply, to be able to identify outliers and try to figure out how to explain them.
“If the AI can form a question and come up with a logical sequence to solve it, then that replaces a human,” he says. “To me, the kind of AI you want to see is a physics researcher—one that can do scientific research.”
First cryomodule for ultrapowerful X-ray laser arrives
A Fermilab team built and tested the first new superconducting accelerator cryomodule for SLAC’s LCLS-II project.
Earlier this week, scientists and engineers at the US Department of Energy’s Fermi National Accelerator Laboratory in Illinois loaded one of the most advanced superconducting radio-frequency cryomodules ever created onto a truck and sent it heading west.
Today, that cryomodule arrived at SLAC National Accelerator Laboratory in California, where it will become the first of 37 powering a 3-mile-long machine that will revolutionize atomic X-ray imaging. The modules are the product of many years of innovation in accelerator technology, and the first cryomodule Fermilab developed for this project set a world record in energy efficiency.
These modules, when lined up end to end, will make up the bulk of the accelerator that will power a massive upgrade to the capabilities of the Linac Coherent Light Source at SLAC, a unique X-ray microscope that will use the brightest X-ray pulses ever made to provide unprecedented details of the atomic world. Fermilab will provide 22 of the cryomodules, with the rest built and tested at Thomas Jefferson National Accelerator Facility in Virginia.
The quality factor achieved in these components is unprecedented for superconducting radio-frequency cryomodules. The higher the quality factor, the lower the cryogenic load and the more efficiently the cavity imparts energy to the particle beam. Fermilab’s record-setting cryomodule doubled the quality factor compared to the previous state of the art.
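The link between quality factor and cryogenic load can be made concrete with a rough sketch (the stored-energy and Q values below are illustrative assumptions, not LCLS-II specifications):

```python
import math

# Back-of-the-envelope sketch with assumed numbers: for a cavity
# storing energy U and driven at frequency f, the power dissipated in
# the cavity walls is P = (2*pi*f) * U / Q. Doubling the quality
# factor Q therefore halves the heat the cryoplant must remove.
def dissipated_power(freq_hz, stored_energy_j, quality_factor):
    return 2 * math.pi * freq_hz * stored_energy_j / quality_factor

p_before = dissipated_power(1.3e9, 10.0, 1.0e10)   # assumed older Q
p_after = dissipated_power(1.3e9, 10.0, 2.0e10)    # Q doubled
ratio = p_before / p_after                          # heat load halved
```

Because that waste heat must be carried away at 2 Kelvin, where refrigeration is extremely expensive, every factor of two in Q matters enormously for a machine with dozens of cryomodules.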
“LCLS-II represents an important technological step which demonstrates that we can build more efficient and more powerful accelerators,” says Fermilab Director Nigel Lockyer. “This is a major milestone for our accelerator program, for our productive collaboration with SLAC and Jefferson Lab and for the worldwide accelerator community.”
Today’s arrival is merely the first. From now into 2019, the teams at Fermilab and Jefferson Lab will build the remaining cryomodules, including spares, and scrutinize them from top to bottom, sending them to SLAC only after they pass a rigorous review.
“It’s safe to say that this is the most advanced machine of its type,” says Elvin Harms, a Fermilab accelerator physicist working on the project. “This upgrade will boost the power of LCLS, allowing it to deliver X-ray laser beams that are 10,000 times brighter than it can give us right now.”
With short, ultrabright pulses that will arrive up to a million times per second, LCLS-II will further sharpen our view of how nature works at the smallest scales and help advance transformative technologies of the future, including novel electronics, life-saving drugs and innovative energy solutions. Hundreds of scientists use LCLS each year to catch a glimpse of nature’s fundamental processes.
To meet the machine’s standards, each Fermilab-built cryomodule must be tested under conditions nearly identical to those in the actual accelerator. Each large metal cylinder—up to 40 feet in length and 4 feet in diameter—contains accelerating cavities through which electrons zip at nearly the speed of light. But the cavities, made of superconducting metal, must be kept at a temperature of 2 Kelvin (minus 456 degrees Fahrenheit).
To achieve this, ultracold liquid helium flows through pipes in the cryomodule, and keeping that temperature steady is part of the testing process.
“The difference between room temperature and a few Kelvin creates a problem, one that manifests as vibrations in the cryomodule,” says Genfa Wu, a Fermilab scientist working on LCLS-II. “And vibrations are bad for linear accelerator operation.”
In initial tests of the prototype cryomodule, scientists found vibration levels that were higher than specification. To diagnose the problem, they used geophones—the same kind of equipment that can detect earthquakes—to rule out external vibration sources. They determined that the cause was inside the cryomodule and made a number of changes, including adjusting the path of the flow of liquid helium. The changes worked, substantially reducing vibration levels to a tenth of what they were originally, and have been successfully applied to subsequent cryomodules.
Fermilab scientists and engineers are also ensuring that unwanted magnetic fields in the cryomodule are kept to a minimum, since excessive magnetic fields reduce the operating efficiency.
“At Fermilab, we are building this machine from head to toe,” Lockyer says. “From nanoengineering the cavity surface to the integration of thousands of complex components, we have come a long way to the successful delivery of LCLS-II’s first cryomodule.”
Fermilab has tested seven cryomodules, plus one built and previously tested at Jefferson Lab, with great success. Each of those, along with the modules yet to be built and tested, will get its own cross-country trip in the months and years to come.
Editor's note: This article is based on a Fermilab press release.
The biggest little detectors
The ProtoDUNE detectors for the Deep Underground Neutrino Experiment are behemoths in their own right.
In one sense, the two ProtoDUNE detectors are small. As prototypes of the much larger planned Deep Underground Neutrino Experiment, they are only representative slices, each measuring about 1 percent of the size of the final detector. But in all other ways, the ProtoDUNE detectors are simply massive.
Once they are complete later this year, these two test detectors will be larger than any detector ever built that uses liquid argon as its active material. The international project involves dozens of experimental groups coordinating around the world. And most critically, the ProtoDUNE detectors, which are being installed and tested at the European particle physics laboratory CERN, are the rehearsal spaces in which physicists, engineers and technicians will hammer out nearly every engineering problem confronting DUNE, the biggest international science project ever conducted in the United States.
Gigantic detector, tiny neutrino
DUNE’s mission, when it comes online in the mid-2020s, will be to pin down the nature of the neutrino, the most ubiquitous particle of matter in the universe. Despite neutrinos’ omnipresence—they fill the universe, and trillions of them stream through us every second—they are a pain in the neck to capture. Neutrinos are vanishingly small, fleeting particles that, unlike other members of the subatomic realm, are heedless of the matter through which they fly, never stopping to interact.
Well, almost never.
Once in a while, scientists can catch one. And when they do, it might tell them a bit about the origins of the universe and why matter predominates over antimatter—and thus how we came to be here at all.
A global community of more than 1000 scientists from 31 countries is building DUNE, a megascience experiment hosted by the Department of Energy’s Fermi National Accelerator Laboratory. The researchers’ plan is to observe neutrinos using two detectors separated by 1300 kilometers—one at Fermilab outside Chicago and a second one a mile underground in South Dakota at the Sanford Underground Research Facility. Having a detector at each end enables scientists to see how neutrinos transform as they travel over a long distance.
The DUNE collaboration is going all-in on the bigger-is-better strategy; after all, the bigger the detector, the more likely scientists are to snag a neutrino. The detector located in South Dakota, called the DUNE far detector, will hold 70,000 metric tons (equivalent to about 525,000 bathtubs) of liquid argon to serve as the neutrino fishing net. It comprises four large modules. Each will stand four stories high and, not including the structures that house the utilities, occupy a footprint roughly equal to a soccer field.
In short, DUNE is giant.
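The bathtub comparison above is easy to check with back-of-envelope arithmetic. A quick sketch, using assumed round numbers (liquid argon at about 1,400 kilograms per cubic meter and a bathtub holding roughly 95 liters; neither figure comes from the collaboration):

```python
# Back-of-envelope check of the "525,000 bathtubs" comparison.
# Density and bathtub capacity are assumed round numbers.

mass_kg = 70_000 * 1_000        # 70,000 metric tons of liquid argon
density_kg_per_m3 = 1_400       # liquid argon is ~1.4x denser than water
bathtub_liters = 95             # assumed bathtub capacity

volume_liters = mass_kg / density_kg_per_m3 * 1_000
print(round(volume_liters / bathtub_liters))  # 526316, close to 525,000
```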
Small Particles, Big Science: The International LBNF/DUNE Project
Lots of room in ProtoDUNE
The ProtoDUNE detectors are small only when compared to the giant DUNE detector. If each of the four DUNE modules is a 20-room building, then each ProtoDUNE detector is one room.
But one room large enough to envelop a small house.
As one repeatable unit of the ultimate detector, the ProtoDUNE detectors are necessarily big. Each is an enormous cube—about two stories high and about as wide—and contains about 800 metric tons of liquid argon.
Why two prototypes? Researchers are investigating two ways to use argon and so are constructing two slightly different but equally sized test beds. The single-phase ProtoDUNE uses only liquid argon, while the dual-phase ProtoDUNE uses argon as both a liquid and a gas.
“They’re the largest liquid-argon particle detectors that have ever been built,” says Ed Blucher, DUNE co-spokesperson and a physicist at the University of Chicago.
As DUNE’s test bed, the ProtoDUNE detectors also have to offer researchers a realistic picture of how the liquid-argon detection technology will work in DUNE, so the instrumentation inside the detectors is also at full, giant scale.
“If you’re going to build a huge underground detector and invest all of this time and all of these resources into it, that prototype has to work properly and be well-understood,” says Bob Paulos, director of the University of Wisconsin–Madison Physical Sciences Lab and a DUNE engineer. “You need to understand all the engineering problems before you proceed to build literally hundreds of these components and try to transport them all underground.”
Partners in ProtoDUNE
ProtoDUNE is a rehearsal for DUNE not only in its technical orchestration but also in the coordination of human activity.
When scientists were planning their next-generation neutrino experiment around 2013, they realized that it could succeed only by bringing the international scientific community together to build the project. They also saw that even the prototyping would require an effort of global proportions—both geographically and professionally. As a result, DUNE and ProtoDUNE actively invite students, early-career scientists and senior researchers from all around the world to contribute.
“The scale of ProtoDUNE, a global collaboration at CERN for a US-based megaproject, is a paradigm change in the way neutrino science is done,” says Christos Touramanis, a physicist at the University of Liverpool and one of the co-coordinators of the single-phase detector. For both DUNE and ProtoDUNE, funding comes from partners around the world, including the Department of Energy's Office of Science and CERN.
The successful execution of ProtoDUNE’s assembly and testing by international groups requires a unity of purpose from parties that could hardly be farther apart, geographically speaking.
Scientists say the effort is going smoothly.
“I’ve been doing neutrino physics and detector technology for the last 20 or 25 years. I’ve never seen such an effort go up so nicely and quickly. It’s astonishing,” says Fermilab scientist Flavio Cavanna, who co-coordinates the single-phase ProtoDUNE project. “We have a great collaboration, great atmosphere, great willingness to make it. Everybody is doing his or her best to contribute to the success of this big project. I used to say that ProtoDUNE was mission impossible because, in the short time we were given to make the two detectors, it looked that way in the beginning. But looking at where we are now, and all the progress made so far, it starts turning out to be mission possible.”
Inside the liquid-argon test bed
So how do neutrino liquid-argon detectors work? Most of the space inside serves as the arena of particle interaction, where neutrinos can smash into an argon atom and create secondary particles. Surrounding this interaction space is the instrumentation that records these rare collisions, like a camera committing the scene to film. DUNE collaborators are developing and constructing the recording instruments that will capture the evidence of these interactions.
One signal is ionization charge: A neutrino interaction generates other particles that propagate through the detector’s vast argon pool, kicking electrons—called ionization electrons—off atoms as they go. The second signal is light.
Animation: Neutrino Detection in Liquid-Argon Time Projection Chamber
The first signal emerges as a streak of ionization electrons.
To record the signal, scientists will use something called an anode plane array, or APA. An APA is a screen created using 24 kilometers of precisely tensioned, closely spaced, continuously wound wire. This wire screen is positively charged, so it attracts the negatively charged electrons.
Much the way a wave front approaches the beach’s shore, the particle track—a string of the ionization electrons—will head toward the positively charged wires inside the ProtoDUNE detectors. The wires will send information about the track to computers, which will record its properties and thus information about the original neutrino interaction.
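The reconstruction described above boils down to simple arithmetic: the time it takes the ionization electrons to reach the wires, multiplied by their known drift velocity in liquid argon, gives the distance of the track from the wire plane. A minimal sketch, using an illustrative drift velocity of 1.6 millimeters per microsecond (a typical ballpark for liquid-argon TPCs, not a DUNE specification):

```python
# Illustrative sketch of drift-coordinate reconstruction in a
# liquid-argon TPC. The drift velocity is an assumed round number,
# not a DUNE specification.

DRIFT_VELOCITY_MM_PER_US = 1.6  # ~1.6 mm/us at typical drift fields

def drift_distance_mm(t0_us: float, arrival_us: float) -> float:
    """Distance of the ionization from the wire plane, from the time
    between the interaction (t0) and the charge arriving on the wires."""
    drift_time = arrival_us - t0_us
    if drift_time < 0:
        raise ValueError("charge cannot arrive before the interaction")
    return drift_time * DRIFT_VELOCITY_MM_PER_US

# A hit arriving 1250 microseconds after the interaction sat about
# 2 meters from the wire plane:
print(drift_distance_mm(t0_us=0.0, arrival_us=1250.0))  # 2000.0
```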
A group in the University of Wisconsin–Madison Physical Sciences Lab led by Paulos designed the single-phase ProtoDUNE wire arrays. The Wisconsin group, Daresbury Laboratory in the UK and several UK universities are building APAs for the same detector. The first APA from Wisconsin arrived at CERN last year; the first from Daresbury Lab arrived earlier this week.
“These are complicated to build,” Paulos says, noting that it currently takes about three months to build just one. “Building these 6-meter-tall anode planes with continuously wound wire—that’s something that hasn't been done before.”
ProtoDUNE Anode Plane Assembly
The anode planes attract the electrons. Pushing away the electrons will be a complementary set of panels, called the cathode plane. Together, the anode and cathode planes behave like battery terminals, with one repelling electron tracks and the other drawing them in. A group at CERN designed and is building the cathode plane.
The dual-phase detector will operate on the same principle but with a different configuration of wire arrays. A special layer of electronics near the cathode will allow for the amplification of faint electron tracks in a layer of gaseous argon. Groups at institutions in France, Germany and Switzerland are designing those instruments. Once complete, they will also send their arrays to be tested at CERN.
Then there’s the business of observing light.
The flash of light comes from the energy released as electrons are knocked off argon atoms. The appearance of light is like the signal to start a stopwatch; it marks the moment the neutrino interaction in a detector takes place. This enables scientists to reconstruct in three dimensions the picture of the interaction and resulting particles.
On the other side of the equator, a group at the University of Campinas in Brazil is coordinating the installation of instruments that will capture the flashes of light resulting from particle interactions in the single-phase ProtoDUNE detector.
Two of the designs for the single-phase prototype—one by Indiana University, the other by Fermilab and MIT—are of a type called guiding bars. These long, narrow strips work like fiber optic cables: they capture the light, convert it into light in the visible spectrum and finally guide it to an external sensor.
A third design, called ARAPUCA, was developed by three Brazilian universities and Fermilab and is being partially produced at Colorado State University. Named for the Guaraní word for a bird trap, the efficient ARAPUCA design will be able to “trap” even very low light signals and transmit them to its sensors.
“The ARAPUCA technology is totally new,” says University of Campinas scientist Ettore Segreto, who is co-coordinating the installation of the light detection systems in the single-phase prototype. “We might be able to get more information from the light detection—for example, greater energy resolution.”
Groups from France, Spain and the Swiss Federal Institute of Technology are developing the light detection system for the dual-phase prototype, which will comprise 36 photomultiplier tubes, or PMTs, situated near the cathode plane. A PMT works by picking up the light from the particle interaction and converting it into electrons, multiplying their number and so amplifying the signal’s strength as the electrons travel down the tube.
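The multiplication a PMT performs is geometric: if each stage of the tube releases several electrons for every electron that strikes it, the overall gain is that factor raised to the number of stages. A toy illustration with assumed numbers (4 secondary electrons per stage over 10 stages; not the specification of the dual-phase prototype’s PMTs):

```python
# Toy model of photomultiplier gain: each stage multiplies the electron
# count by a fixed secondary-emission factor. The numbers here are
# illustrative, not the specification of the ProtoDUNE PMTs.

def pmt_gain(secondary_per_stage: float, n_stages: int) -> float:
    """Overall multiplication factor of an n-stage photomultiplier."""
    return secondary_per_stage ** n_stages

# With 4 secondaries per stage across 10 stages, one initial electron
# becomes about a million -- a measurable pulse of charge:
print(pmt_gain(4, 10))  # 1048576
```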
With two tricked-out detectors, the DUNE collaboration can test their picture-taking capabilities and prepare DUNE to capture in exquisite detail the fleeting interactions of neutrinos.
Bringing instruments into harmony
But even if they’re instrumented to the nines inside, two isolated prototypes do not a proper test bed make. Both ProtoDUNE detectors must be hooked up to computing systems so particle interaction signals can be converted into data. Each detector must be contained in a cryostat, which functions like a thermos, for the argon to be cold enough to maintain a liquid state. And the detectors must be fed particles in the first place.
CERN is addressing these key areas by providing particle beams, innovative cryogenics and computing infrastructure, and by connecting the prototype detectors with the DUNE experimental environment.
DUNE’s neutrinos will be provided by the Long-Baseline Neutrino Facility, or LBNF, which held its groundbreaking ceremony, a mile underground, in July. LBNF, led by Fermilab, will provide the construction, beamline and cryogenics for the mammoth DUNE detector; Fermilab’s chain of particle accelerators will deliver the world’s most intense neutrino beam to the experiment.
CERN is helping simulate that environment as closely as possible with the scaled-down ProtoDUNE detectors, furnishing them with particle beams so researchers can characterize how the detectors respond. Under the leadership of scientist Marzio Nessi, last year the CERN group built a new facility for the test beds, where CERN is now constructing two new particle beamlines that extend the lab’s existing network.
In addition, CERN built the ProtoDUNE cryostats—the largest ever constructed for a particle physics experiment—which also will serve as prototypes for those used in DUNE. Scientists will be able to gather and interpret the data generated from the detectors with a CERN computing farm and software and hardware from several UK universities.
“The very process of building these prototype detectors provides a stress test for building them in DUNE,” Blucher says.
CERN’s beam schedule sets the schedule for testing. In December, the European laboratory will temporarily shut off beam to its experiments for upgrades to the Large Hadron Collider. DUNE scientists aim to position the ProtoDUNE detectors in the CERN beam before then, testing the new technologies pioneered as part of the experiment.
“ProtoDUNE is a necessary and fundamental step towards LBNF/DUNE,” Nessi says. “Most of the engineering will be defined there and it is the place to learn and solve problems. The success of the LBNF/DUNE project depends on it.”
Voyage into the dark sector
A hidden world of particles awaits.
We don’t need extra dimensions or parallel universes to have an alternate reality superimposed right on top of our own. Invisible matter is everywhere.
For example, take neutrinos generated by the sun, says Jessie Shelton, a theorist at the University of Illinois at Urbana-Champaign who works on dark sector physics. “We are constantly bombarded with neutrinos, but they pass right through us. They share the same space as our atoms but almost never interact.”
As far as scientists can tell, neutrinos are solitary particles. But what if there is a whole world of particles that interact with one another but not with ordinary atoms? This is the idea behind the dark sector: a theoretical world of matter existing alongside our own but invisible to the detectors we use to study the particles we know.
“Dark sectors are, by their very definition, built out of particles that don't interact strongly with the Standard Model,” Shelton says.
The Standard Model is a physicist’s field guide to the 17 particles and forces that make up all visible matter. It explains how atoms can form and why the sun shines. But it cannot explain gravity, the cosmic imbalance of matter and antimatter, or the disparate strengths of nature's four forces.
On its own, an invisible world of dark sector particles cannot solve all these problems. But it certainly helps.
The main selling point for the dark sector is that the theories comprehensively confront the problem of dark matter. Dark matter is a term physicists coined to explain bizarre gravitational effects they observe in the cosmos. Distant starlight appears to bend around invisible objects on its way to us, and galaxies spin as if they had five times more mass than their visible matter can explain. Even the ancient light preserved in the cosmic microwave background seems to suggest that there is an invisible scaffolding on which galaxies formed.
Some theories suggest that dark matter is simple cosmic debris that adds mass—but little else—to the complexity of our cosmos. But after decades of searching, physicists have yet to find dark matter in a laboratory experiment. Maybe the reason scientists haven’t been able to detect it is that they’ve been underestimating it.
“There is no particular reason to expect that whatever is going on in the dark sector has to be as simple as our most minimal models,” Shelton says. “After all, we know that our visible world has a lot of rich physics: Photons, electrons, protons, nuclei and neutrinos are all critically important for understanding the cosmology of how we got here. The dark sector could be a busy place as well.”
According to Shelton, dark matter could be the only surviving particle out of a similarly complicated set of dark particles.
“It could even be something like the proton, a bound state of particles interacting via a very strong dark force. Or it could even be something like a hydrogen atom, a bound state of particles interacting via a weaker dark force,” she says.
Even if terrestrial experiments cannot see these stable dark matter particles directly, they might be sensitive to other kinds of dark particles, such as dark photons or short-lived dark particles that interact strongly with the Higgs boson.
“The Higgs is one of the easiest ways for the Standard Model particles to talk to the dark sector,” Shelton says.
As far as scientists know, the Higgs boson is not picky. It may very well interact with all sorts of massive particles, including those invisible to ordinary atoms. If the Higgs boson interacts with massive dark sector particles, scientists should find that its properties deviate slightly from the Standard Model’s predictions. Scientists at the Large Hadron Collider are precisely measuring the properties of the Higgs boson to search for unexpected quirks that could open a gateway to new physics.
At the same time, scientists are also using the LHC to search for dark sector particles directly. One theory is that at extremely high temperatures, dark matter and ordinary matter are not so different and can transform into one another through a dark force. In the hot and dense early universe, this would have been quite common.
“But as the universe expanded and cooled, this interaction froze out, leaving some relic dark matter behind,” Shelton says.
The energetic particle collisions generated by the LHC imitate the conditions that existed in the early universe and could unlock dark sector particles. If scientists are lucky, they might even catch dark sector particles metamorphosing into ordinary matter, an event that could materialize in the experimental data as particle tracks that suddenly appear from no apparent source.
But there are also several feasible scenarios in which any interactions between the dark sector and our Standard Model particles are so tiny that they are out of reach of modern experiments, according to Shelton.
“These ‘nightmare’ scenarios are completely logical possibilities, and in this case, we will have to think very carefully about astrophysical and cosmological ways to look for the footprints of dark particle physics,” she says.
Even if the dark sector is inaccessible to particle detectors, dark matter will always be visible through the gravitational fingerprint it leaves on the cosmos.
“Gravity tells us a lot about how much dark matter is in the universe and the kinds of particle interactions dark sector particles can and cannot have,” Shelton says. “For instance, more sensitive gravitational-wave experiments will give us the possibility to look back in time and see what our universe looked like at extremely high energies, and could maybe reveal more about this invisible matter living in our cosmos.”
Rivers in the sky
Local communities named newly discovered stellar streams for bodies of water close to home.
Most of the time, the Dark Energy Camera in Chile stares out into the deepest regions of space, measuring light from distant galaxies. But this gigantic eye sometimes discovers things closer to home—like the 11 newly found stellar streams that the Dark Energy Survey announced today. For a few lucky groups in Chile and Australia, this meant an extraordinary opportunity: getting to name an object in space.
“The people were very enthusiastic,” says Kyler Kuehn, a scientist with the Dark Energy Survey who coordinated the outreach effort in Australia. “I don’t know if they are aware how rarely people get to name things that are newly discovered in space—or anywhere, for that matter—but I was pretty excited about it.”
Stellar streams are ribbons of stars orbiting a galaxy (in this case, our own Milky Way). These faint filaments are the remnants of dwarf galaxies or star clusters that have been ripped apart by the gravity of their monster neighbor. Unlike some celestial objects that have very specific naming conventions according to the International Astronomical Union, stellar streams have a bit of flexibility.
Previously discovered stellar streams were often named after constellations in the sky near their location—but with many streams often appearing close to one another and other objects such as dwarf galaxies using the same convention, things became messy. Carl Grillmair, a Caltech astronomer studying stellar streams, proposed using the names of rivers in Greek mythology, like the River Styx. From there, naming expanded into real-world rivers.
DES decided to go the terrestrial route. One set of stellar streams, located in the sky near the Indus constellation, received names of Indian rivers: Indus, Jhelum, Chenab and Ravi. The collaboration decided to name the other two groups of streams after native words relating to water or rivers in Chile, where the Dark Energy Camera is located at the Cerro Tololo Inter-American Observatory, and Australia, where the Anglo-Australian Telescope is often used to follow up on those DECam discoveries.
In Chile, DES worked with students in the nearby town of Vicuña. High school students Dánae Rojas and Emerson Carvajal researched words from the native Quechua and Aymara cultures that were related to water, then presented several options to about 90 kindergarten and first-grade students. Their final selections were the Aymara name Aliqa Una, meaning Quiet Water, and two Quechua names, Palca, meaning Crossing Rivers, and Willka Yaku, or Sacred Water. Two Spanish names for local rivers near Vicuña, Elqui and Turbio, rounded out the set.
“It was absolutely wonderful to get the community involved in this process,” says Alfredo Zenteno, a DES scientist who, along with Kathy Vivas, led the outreach effort in Chile. “It is a way to make these new discoveries, which were made with a telescope in the region, close to them. For us, the astronomers, it is a way to thank the region that hosts the telescope and allows us to investigate the sky.”
In Australia, Kuehn worked with an Aboriginal storyteller and tribal elders to pick culturally sensitive and appropriate names in native languages.
“I wanted to honor the long history of Aboriginal Australians doing astronomy,” Kuehn says. “Today’s Aboriginal populations are the caretakers of some of the oldest continuous cultures on the planet, and their collective knowledge—including astronomical observations—dates back tens of thousands of years.”
With a list of half a dozen names, Kuehn presented to a group of about 100 raucous adults at the Sydney Royal Botanic Gardens and 40 polite preschoolers, asking them to cheer to select their favorites. The Australian-named stellar streams are Wambelong, meaning Crazy Water in the Gamilaraay language, and Turranburra, the Dharug name for the Lane Cove River that runs near the headquarters of the Australian Astronomical Observatory. Scientists hope the names build connections between the nations that host the observatories and the discoveries they make about the universe that hosts us all.
“It was wonderful to see the community have a chance to write in the sky,” Zenteno says.
Editor's note: You can learn more about the stellar streams and the accompanying release of three years of data from the Dark Energy Survey's lead lab, Fermilab.