Machine learning will become an even more important tool when scientists upgrade to the High-Luminosity Large Hadron Collider.
When do a few scattered dots become a line? And when does that line become a particle track? For decades, physicists have been asking these kinds of questions. Today, so are their machines.
Machine learning is the process by which the task of pattern recognition is outsourced to a computer algorithm. Humans are naturally very good at finding and processing patterns. That’s why you can instantly recognize a song from your favorite band, even if you’ve never heard it before.
Machine learning takes this very human process and puts computing power behind it. Whereas a human might be able to recognize a band based on a variety of attributes such as the vocal tenor of the lead singer, a computer can process other subtle features a human might miss. The music-streaming platform Pandora categorizes every piece of music in terms of 450 different auditory qualities.
“Machines can handle a lot more information than our brains can,” says Eduardo Rodrigues, a physicist at the University of Cincinnati. “It’s why they can find patterns that are sometimes invisible to us.”
Machine learning started to become commonplace in computing during the 1980s, and LHC physicists have been using it routinely to help manage and process raw data since 2012. Now, with upgrades to what is already the world’s most powerful particle accelerator looming on the horizon, physicists are implementing new applications of machine learning to help them with the imminent data deluge.
“The high-luminosity upgrade to the LHC is going to increase our amount of data by a factor of 100 relative to that used to discover the Higgs,” says Peter Elmer, a physicist at Princeton University. “This will help us search for rare particles and new physics, but if we’re not prepared, we risk being completely swamped with data.”
Only a small fraction of the LHC’s collisions are interesting to scientists. For instance, Higgs bosons are produced in only about one out of every 2 billion proton-proton collisions. Machine learning is helping scientists to sort through the noise and isolate what’s truly important.
“It’s like mining for rare gems,” Rodrigues says. “Keeping all the sand and pebbles would be ridiculous, so we use algorithms to help us single out the things that look interesting. With machine learning, we can purify the sample even further and more efficiently.”
LHC physicists use a kind of machine learning called supervised learning. The principle behind supervised learning is nothing new; in fact, it’s how most of us learn how to read and write. Physicists start by training their machine-learning algorithms with data from collisions that are already well-understood. They tell them, “This is what a Higgs looks like. This is what a particle with a bottom quark looks like.”
After giving an algorithm all of the information they already know about hundreds of examples, physicists then pull back and task the computer with identifying the particles in collisions without labels. They monitor how well the algorithm performs and give corrections along the way. Eventually, the computer needs only minimal guidance and can become even better than humans at analyzing the data.
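The train-then-classify workflow described above can be sketched in a few lines. This is a toy illustration only: the features (a mass-like and a momentum-like quantity), the labels, and the nearest-centroid rule are all invented for the example and are far simpler than the algorithms LHC experiments actually use.

```python
# Toy sketch of supervised learning: train on labeled "events",
# then classify unlabeled ones. All numbers here are illustrative
# assumptions, not real LHC data.

def centroid(events):
    """Average feature vector of a list of events."""
    n = len(events)
    return tuple(sum(e[i] for e in events) / n for i in range(len(events[0])))

def train(labeled):
    """labeled: {label: [feature vectors]} -> {label: centroid}."""
    return {label: centroid(evts) for label, evts in labeled.items()}

def classify(model, event):
    """Assign the label whose centroid is nearest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], event))

# Training phase: collisions whose origin is already well understood.
labeled = {
    "higgs_candidate": [(125.0, 40.0), (124.5, 38.0), (125.8, 42.0)],
    "background":      [(90.0, 15.0), (95.0, 18.0), (88.0, 12.0)],
}
model = train(labeled)

# Inference phase: an unlabeled event is classified automatically.
print(classify(model, (125.2, 39.0)))  # -> higgs_candidate
```

Real analyses replace the centroid rule with boosted decision trees or neural networks, but the division of labor is the same: humans supply labeled examples, the machine learns the boundary.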
“This is saving the LHCb experiment a huge amount of time,” Rodrigues says. “In the past, we needed months to make sense of our raw detector data. With machine learning, we can now process and label events within the first few hours after we record them.”
Not only is machine learning helping physicists understand their real data, but it will soon help them create simulations to test their predictions from theory as well.
Using conventional algorithms, without machine learning, scientists have created virtual versions of their detectors with all the known laws of physics pre-programmed.
“The virtual experiment follows the known laws of physics to a T,” Elmer says. “We simulate proton-proton collisions and then predict how the byproducts will interact with every part of our detector.”
If scientists find a consistent discrepancy between the virtual data generated by their simulations and the real data recorded by their detectors, it could mean that the particles in the real world are playing by a different set of rules than the ones physicists already know.
A weakness of scientists’ current simulations is that they’re too slow. They use a series of algorithms to precisely calculate how a particle will interact with every detector part it bumps into while moving through the many layers of a particle detector.
Even though it takes only a few minutes to simulate a collision this way, scientists need to simulate trillions of collisions to cover the possible outcomes of the 600 million collisions per second they will record with the HL-LHC.
“We don’t have the time or resources for that,” Elmer says.
With machine learning, on the other hand, they can generalize. Instead of calculating every single particle interaction with matter along the way, they can estimate its overall behavior based on its typical paths through the detector.
“It’s a matter of balancing quality with quantity,” Elmer says. “We’ll still use the very precise calculations for some studies. But for others, we don’t need such high-resolution simulations for the physics we want to do.”
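The trade-off between the two approaches can be sketched as follows. The layer-by-layer loop, the per-layer interaction fractions, and the calibration constants in the fast version are all invented for illustration; real fast simulations fit their parameterizations to detailed full-simulation output.

```python
# Contrast between a "full" step-by-step simulation and a fast
# parameterized one. All detector-response numbers are assumptions
# made up for this sketch.
import random

def full_sim(energy, n_layers=100):
    """Step a particle through every detector layer, computing the
    energy it deposits in each one (slow but detailed)."""
    deposits = []
    remaining = energy
    for _ in range(n_layers):
        frac = random.uniform(0.005, 0.015)  # per-layer interaction
        deposits.append(remaining * frac)
        remaining -= deposits[-1]
    return sum(deposits)

def fast_sim(energy):
    """Skip the layers entirely: sample the overall measured energy
    from a parameterization (hypothetically) fitted to full_sim."""
    mean_frac, resolution = 0.63, 0.02   # assumed calibration constants
    return random.gauss(energy * mean_frac, energy * resolution)
```

One call to `fast_sim` replaces a hundred per-layer calculations, at the cost of washing out detail that some precision studies still need, which is the quality-versus-quantity balance Elmer describes.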
Machine learning is helping scientists process more data faster. With the planned upgrades to the LHC, it could play an even larger role in the future. But it is not a silver bullet, Elmer says.
“We still want to understand why and how all of our analyses work so that we can be completely confident in the results they produce,” he says. “We’ll always need a balance between shiny new technologies and our more traditional analysis techniques.”
Making new discoveries may require writing new software first.
Scientists conceived of the Large Hadron Collider and its experiments in 1992. Back then, Apple was starting to figure out the laptop computer, and a CERN fellow named Tim Berners-Lee had just released the code for a pet project called the World Wide Web.
“In the days when we started, there was no Google. There was no Facebook. There was no Twitter. All of these companies that tackle big data did not exist,” says Graeme Stewart, a CERN software specialist working on the ATLAS experiment. “Big data didn’t exist.”
The LHC experiments grew up with computing and have been remarkable in their ability to adapt to the evolving technology. Over the last 15 years, researchers have written more than 20 million lines of code that govern everything from data acquisition to final analysis. But physicists are anxious that the continually accumulating code has begun to pose a problem.
“This software is not sustainable,” says Peter Elmer, a physicist at Princeton. “Many of the original authors have left physics. Given the complex future demands on the software, it will be very difficult to evolve.”
Back when Stewart and his computer engineering colleagues were designing the computing structure for the LHC research program, they were focused on making their machines perform a single task faster and faster.
“And then in the mid-2000s, hardware manufacturers hit a wall, and it was impossible to get a computer to do one single thing any more quickly,” Stewart says, “so instead they started to do something which we call concurrency: the ability of a computer to do more than one thing at the same time. And that was sort of unfortunate timing for us. If it had happened five or 10 years earlier, we would have built that concurrent paradigm into the software framework for the LHC startup, but it came a little bit too late.”
Thanks to concurrency, today’s personal laptops can perform roughly four tasks at the same time, and the processors in CERN’s computing clusters can perform around 30 tasks at once. But graphics cards—such as the GPUs used in gaming—are now able to process up to 500 tasks at once.
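The shift from doing one thing faster to doing many things at once can be sketched with a thread pool. The chunked "analysis" workload here is a stand-in invented for the example. (Note one honest caveat: in CPython, threads illustrate the concurrent scheduling model but do not speed up CPU-bound Python code; true CPU parallelism would use processes, native code, or GPUs.)

```python
# Minimal sketch of concurrency: the same workload run serially,
# then spread across a pool of workers. "process_chunk" is a
# stand-in for processing one chunk of detector data.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    """Pretend analysis step: sum of squares over one data chunk."""
    return sum(x * x for x in chunk)

chunks = [range(i * 1000, (i + 1) * 1000) for i in range(8)]

# Serial: one task at a time.
serial = [process_chunk(c) for c in chunks]

# Concurrent: up to 4 tasks in flight at once, as on a typical laptop.
with ThreadPoolExecutor(max_workers=4) as pool:
    concurrent = list(pool.map(process_chunk, chunks))

assert serial == concurrent  # same answers, different scheduling
```

Software written before this shift assumes the serial pattern throughout, which is why retrofitting concurrency into 20 million lines of LHC code is such a challenge.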
“It’s critical that we take advantage of these new architectures to get the most out of the LHC research program,” Stewart says. “At the same time, adapting to that kind of hardware is a tremendous challenge.”
The experiments will need these hardware advancements. In eight years, a turbocharged version of the LHC will turn on with a proton beam four times more intense than it is today. This transformation will provide scientists with the huge volume of data they need to search for new physics and study rare processes. But according to Stewart, today’s software won’t be able to handle it.
“The volume of data anticipated jumps by an order of magnitude, and the complexity goes up by an order of magnitude,” he says. “Those are tremendous computing challenges, and the best way of succeeding is if we work in common.”
Stewart and Elmer are part of a huge community initiative that is planning how they will meet the enormous computing challenges of the four big LHC experiments and prepare the program for another two decades of intensive data collection.
According to a white paper recently published by the High Energy Physics Software Foundation, the software and computing power will be the biggest limiting factor to the amount of data the LHC experiments can collect and process, and so “the physics reach during HL-LHC will be limited by how efficiently these resources can be used.”
So the HEP Software Foundation has set out to adapt the LHC software to modern computing hardware so that the entire system can run more effectively and efficiently. “It’s like engineering a car,” Stewart says. “You might design something with really great tires, but if it doesn’t fit the axle, then the final result will not work very well.”
Instead of building custom solutions for each experiment—which would be time-consuming and costly—the community is coming together to identify where their computing needs overlap.
“Ninety percent of what we do is the same, so if we can develop a common system which all the experiments can use, that saves us a lot on time and computing resources,” Stewart says. “We’re creating tool kits and libraries that protect the average physicist from the complexity of the hardware and give them good signposts and guidelines as to how they actually write their code and integrate it into the larger system.”
These incremental changes will gradually modernize LHC computing and help maintain continuity with all the earlier work. They will also enable the system to remain flexible and adaptable to future advancements in computing.
“The discovery of the Higgs is behind us,” says Elmer. “The game is changing, and we need to be prepared.”
A pair of results bring neutrinos into the new era of multi-messenger astronomy.
On September 22, 2017, a tiny but energetic particle pierced Earth’s atmosphere and smashed into the planet near the Amundsen-Scott South Pole Station in Antarctica. The collision set loose a second particle, which lit a blue streak through the clear ice.
Luckily for science, the first particle was a neutrino, and the strike occurred within the cubic kilometer that makes up the IceCube neutrino experiment. Its detectors recorded the hit and sent out a public alert. The trail of light left behind by the second particle, a muon, pointed back to an intriguing object 4 billion light-years away: a violent, particle-flinging galaxy called a blazar, powered by a supermassive black hole.
This is the first time scientists have found evidence of the birthplace of an ultra-high-energy cosmic neutrino. And it is the second big result—after an August 2017 blockbuster from the Laser Interferometer Gravitational-Wave Observatory and a band of additional experiments—in what scientists are calling a new era of multi-messenger astronomy.
In multi-messenger astronomy, experiments work together to study two or more different kinds of signals, such as gamma rays and neutrinos, from a single highly energetic event in a galaxy far away, says Regina Caputo, analysis coordinator for the Fermi Gamma-ray Space Telescope’s Large Area Telescope instrument.
“When telescopes first came out, you could only use optical light—what you see,” she says. “The next step was seeing other wavelengths of light: infrared, ultraviolet, X-ray, gamma ray. Now, seeing neutrinos and gravitational waves from a single source tells us even more.”
Because none of these multi-messenger astronomical signals has an electromagnetic charge, all of them travel through the universe unaffected by magnetic fields. Their paths are predictable, meaning scientists can trace them back to where they came from.
Out of this world
A computer cluster at the South Pole sends out an automatic public alert whenever the IceCube detector catches a neutrino with so much energy that it must have come from outside Earth’s solar system. IceCube found its first two such particles—nicknamed “Bert” and “Ernie”—in 2013.
Before then, “nobody took neutrinos seriously as an astronomical messenger,” says Naoko Kurahashi Neilson, an assistant professor at Drexel University and co-convener of the group that searches for neutrino sources on IceCube. For about a century, scientists had been collecting detections of ultra-high-energy particles from space—but all of them were charged particles called cosmic rays, not neutrinos.
“Before IceCube, we had never seen a high-energy neutrino emission from the universe,” Kurahashi Neilson says.
Now “we have about eight of those per year,” says IceCube physicist Ignacio Taboada, an associate professor at Georgia Tech and Kurahashi Neilson’s fellow co-convener.
IceCube was finally detecting ultra-high-energy neutrinos, but until September, no experiment had been able to figure out where they were coming from.
“My opinion is that most of the sources of these neutrinos are so far away that it’s really difficult to observe their light or gamma rays,” Taboada says. “In this case, we got lucky and got one that is close enough.”
Using data collected by the Fermi Gamma-ray Space Telescope, Hiroshima University scientist Yasuyuki Tanaka was the first to link the neutrino to the blazar. The blazar in question had recently attracted the attention of the Fermi scientists by producing a higher flux of gamma rays than any of the 2000-some blazars they’ve seen in the last decade.
Next, scientists on the MAGIC telescope, located on the Canary Islands, made the same connection. “That triggered an avalanche of follow-ups,” Taboada says.
IceCube scientists followed up as well, searching back through 9.5 years of data to see if they had seen signs of the blazar before.
“It turns out, this part of the sky is special,” Kurahashi Neilson says. For four or five months starting in December 2014, IceCube detected an unusually large flux of neutrinos coming from that same area. They were also high-energy, but not nearly as energetic as the September 22 neutrino.
“The thing I find interesting about these two observations is how differently the blazar was behaving in each case,” Taboada says. In one, scientists caught one high-energy neutrino and copious gamma rays; in the other, the neutrinos were plentiful but the gamma rays were few.
The two findings, published in Science, reach a statistical significance of about 3 and 3.5 sigma, and together they paint a compelling picture.
IceCube scientists expect their participation in multi-messenger astronomy has only just begun. “It’s fascinating to think that just five years ago, we didn’t know for sure whether we would ever be able to detect gravitational waves or ultra-high-energy neutrinos,” Kurahashi Neilson says. “Now we’re talking about the possibility of finding a source in all three messengers.”
LGBT+ scientists offer advice for promoting inclusivity in a guide written for the physics and astronomy community.
Traveling to an important research conference or collaboration meeting is the kind of experience that can help launch a scientist’s career. It’s also the kind of experience that can be uniquely tense for someone from the lesbian, gay, bisexual, transgender and queer community.
“These types of opportunities are central to a person’s development as a scientist,” says Diana Parno, an assistant professor of physics at Carnegie Mellon University. “But it can be especially stressful and fraught when you aren’t certain if your identity is welcomed in a place.”
For years, Parno has witnessed people struggling for guidance on dealing with these types of situations through an LGBT+ physicists email listserv. A transgender graduate student is accepted to a summer school and wonders what to do about a rooming situation: Are they expected to put down their birth gender? If they put down their actual lived gender, will there be a problem if it is not listed on their driver’s license? Will their identity cause a problem with roommates?
“If there is an issue, and they bring it up with the organizers, maybe they’ll be very supportive, or maybe it will start a very poisonous gossip mill,” Parno says.
To make their fellow physicists aware of these situations and give them the tools they need to prevent or manage them, a group of LGBT+ physicists came together in 2014 to create a guidebook for the field. An updated version was published in April. Parno worked on both iterations.
The document includes best practices for physics institutions, as well as individuals, interested in creating an inclusive environment for their LGBT+ co-workers and students.
Monica Plisch, the director of education and diversity at the American Physical Society, says she is unaware of any other resource like this guide at the intersection of physics and the LGBT+ community.
One of the obstacles to LGBT+ inclusivity that Plisch sees in physics is a mindset that physics research exists in a realm of objectivity, separate from other human concerns. It doesn’t.
“Physics is fundamentally a human endeavor,” Plisch says. “Individuals need to be well supported, or the physics won’t be as strong. We need to talk about inclusivity, or we risk losing some of the best minds.”
A grassroots movement
The beginnings of the guide go back to 2010, when Elena Long, a graduate student in physics, noticed there were no LGBT+ resources available within the discipline. She began organizing informal gatherings of LGBT+ physicists at APS meetings in search of community and support.
The high level of participation underscored the critical need for these types of discussions in physics, says Tanmoy Laskar, a Jansky postdoctoral fellow of the National Radio Astronomy Observatory at the University of California, Berkeley and co-editor of the guide. “Human beings are messy, nonlinear entities with experiences accumulated over a lifetime,” he says. When complex people get together to do complex science, which has its own demands, “then sometimes things clash.”
Parno notes that one of the main takeaways from the APS session was that people were hungry for support and inclusion within their academic departments. This informed the initial planning for the guide. “We decided that the text would target people at the department level, specifically department chairs, who would like to make sure that people are included but aren’t quite sure what the issues are or what they should do.”
Meanwhile, APS organized a task force to gain a better understanding of barriers to LGBT+ inclusion in the broader physics community. The group authored the 2016 “LGBT Climate in Physics Report” based on a survey of more than 300 individuals.
When the APS posted its report to Facebook, the majority of the comments they received expressed support, Plisch says. The negative comments sprinkled among them had two general messages: “Why should certain people get special attention?” and “Physics should only be about physics.”
But building a career in physics is about more than just understanding physics, as the report documented, Plisch says. According to the report, feelings of exclusion led one in three LGBT+ physicists surveyed to consider leaving a position.
APS has continued to host roundtable discussions at professional meetings moderated by LGBT+ physicists to offer space to talk about these issues. Allies in the physics community attend these discussions, as well.
“I’ve talked to many physicists who identify as straight or cisgender throughout this process, and who really want to try and do right by their LGBT+ colleagues,” Parno says. “I’ve been very impressed and heartened by that.”
Plisch says, “These efforts show what self-organized groups can do to change the culture and practice of physics.”
A living guide
In 2016 Caltech undergraduate Adrian Ray Avalani went searching on the internet for resources for LGBT+ academics, just as Elena Long had. Thanks to Long and the others responsible for the APS guide, he found what he was looking for.
Ray Avalani decided to join the effort. He took part in the update of the guide, working primarily on sections about issues that transgender students face, such as the best way to handle the need for a name or pronoun change. He gave advice on how to notify a department and handle classroom dynamics.
“It surprised me how much better some departments can be at addressing these concerns compared to others,” Ray Avalani says. “Some physics departments at universities rely on some pretty outdated information.”
But the team of authors understands that guides like these can become outdated fast. The year after the original APS LGBT+ guide was published, the Supreme Court of the United States ruled that same-sex couples have a constitutional right to marry.
“One of the most exciting things we were able to include in the second edition, or rather exclude, was that we could strike everything about marriage laws being different in different states,” Parno says. “That was tremendously exciting.”
In the next iteration of the guide, the authors would like to add the perspectives of physicists working outside of academia—including those in industry and at national labs—as well as physicists who are based outside of the United States.
“A guide like this is a living document,” Laskar says. “Culture changes, people change, and laws change. Someday the resulting transformation will make guides like this obsolete. The faster that happens, the better.”
Meet four physicists who have found different ways to apply the skills they learned through their studies of the Higgs boson.
The discovery of the Higgs boson, considered the missing piece of the Standard Model of particle physics, was the result of a marathon of methodical work performed by hundreds of scientists, many of them graduate students and postdocs.
Six years later, Symmetry catches up with four scientists who followed their curiosity from Higgs research to brand new enterprises.
When Joanna Weng was a kid, there was never any question about whether she would pursue science.
“I had a chemistry experimental kit as a kid, and I wanted to understand how things work,” she says. “Also, MacGyver was my favorite TV series, and he was a physicist as well.”
In college, Weng’s physics professor told stories about CERN. Her uncle, also a physics professor, told her about the streets there named after famous physicists.
“It seemed like such a magical place,” she says.
As a master’s student, she had the opportunity to see CERN for herself, and she was hooked. She stayed at the European research center, working on experiments at the Large Hadron Collider, through her PhD and then afterward as a postdoc. She was surprised by how well the gigantic collaborations worked and loved the international environment.
“Flat hierarchies and nationality don’t really matter if everyone is pursuing the same goal,” she says. “Working with these people was truly amazing. I’m still friends with many of them today.”
By the time the LHC first started on September 10, 2008, Weng was in her first year as a postdoc and had already spent five years working to commission the CMS detector and creating simulations of what new discoveries such as the long-sought Higgs boson would look like in the CMS experiment. She was ready. But less than 10 days later, she and her colleagues were hit with a major setback: A faulty electrical bus connection led to extensive damage to several LHC magnets, and the accelerator was temporarily out of commission.
“It was a frustrating time for my generation of physicists,” she says.
While engineers repaired the LHC, Weng and her colleagues continued to test and optimize their software and data transfers. (“Looking back,” she says, “I should have taken more holidays.”)
The accelerator restarted on November 20, 2009. In 2010, Weng finally got her hands on her first batch of LHC data. On July 4, 2012, the CMS and ATLAS collaborations declared the discovery of the Higgs boson.
“It was great,” Weng says, “but I was also a bit disappointed we did not discover something completely unexpected, something new. It felt like we just confirmed something everybody was expecting anyway.”
She considered her options. “I didn’t want to keep moving from postdoc to postdoc, and trying to become a professor didn’t seem like a good option either, due to the very limited number of fixed positions,” she says. “After 10 years of fundamental research, I decided to do something more applied.”
After taking some time off, Weng found a job as a risk analyst for a Swiss nuclear power plant. Her responsibilities drew heavily on her ability to simulate and evaluate data as a means of predicting the future.
“The Fukushima nuclear disaster in Japan happened in part because the strength and risks of tsunamis were underestimated,” she says. “To evaluate the safety of the nuclear power plant, we looked at everything from the likelihood of a plane crashing into the power plant to simulating and evaluating what would happen during a natural disaster, such as a flooding. It was very exciting, and I learned a lot in new areas like engineering and reliability predictions.”
After the plant was scheduled to be shut down in 2019, Weng started teaching at a university in Zürich. “They were looking for someone with experience evaluating risks and also knowledge in particle accelerators,” she says. “It was a very specific profile which matched my experience perfectly.”
Today, Weng’s career has come nearly full circle. She works on safety analysis for particle accelerators, and she collaborates with colleagues at CERN and the European Spallation Source in Sweden.
“Looking back, I made the right choice of both doing particle physics and then leaving particle physics,” she says. “I enjoyed my life at CERN, and I’m enjoying my life after.”
In 2012, Andrew Hard had just started his second year of graduate school when his adviser, University of Wisconsin Professor Sau Lan Wu, asked him to get a head start analyzing a fresh set of data collected by the ATLAS experiment at the LHC.
“I was young and eager, so I agreed,” he says.
For the next several months, Hard worked around the clock in what turned out to be the final sprint in a decades-long marathon to find the Higgs boson.
“There was no work-life balance during the discovery,” he says. “I was even questioned by the Swiss police one time as I left CERN because they didn't believe that anyone would work until 3 a.m.”
Now Hard spends his hours analyzing data for Google—and usually gets home in time for dinner with his wife. “The days are still a bit long, but it beats the late nights and Sunday meetings in graduate school,” he says.
Like most physicists, Hard was drawn to particle physics because of his fondness for problem solving. “I love a challenge,” he says. “Understanding the nature of the universe is one of the hardest problems I can imagine.”
The start-up of the LHC offered budding researchers like Hard the opportunity to tackle what had been one of the most enduring questions in physics: How do fundamental particles get their mass? According to Hard, this was both exciting and all-consuming.
He was working on a search for two photons coming from the decay of a single, massive particle—a rare (but clear and clean) predicted sign of the Higgs. On June 22, 2012, he and another colleague ran the data through their code, and a prominent bump immediately arose right around a mass range where the ATLAS scientists thought they might find it.
“We soon got confirmation that other ATLAS analysts were seeing the same thing,” Hard says.
Not long after, the ATLAS and CMS experiments announced the discovery of the Higgs.
“I actually went through a miniature existential crisis following the public announcement of the results,” Hard says. “Everything had been focused on this one thing: discovery. Once that was over, I didn't know what to do with myself.”
After earning his PhD, Hard decided to pursue a new kind of search.
“During my first interview with Google, I mentioned that CERN and Google are both in the business of search, and the primary difference is the dataset,” he says. “The interviewer seemed to like that.”
Today, Hard is a software engineer using machine learning to improve GBoard, a virtual keyboard for mobile devices with features such as hand-drawn emoji recognition and voice dictation. He says his training as a physicist prepared him to work across domains such as computing, statistics, mathematics and electronics, and his regular research presentations made him more confident as a public speaker.
He says he’s proud of his time working at CERN and his contributions to the discovery of the Higgs.
“I think that CERN plays an invaluable role in growing the collective knowledge of our species, and I was glad to contribute to that endeavor. My personal focus has just switched to increasing access to that knowledge.”
Géraldine Conti may have found her career path wishing on a star.
“During high school I started taking astronomy classes,” she says. Her fascination with astronomy led to a fascination with particle physics, which eventually led to her current job at the Walt Disney Company.
“When I discovered that you could do machine learning at Disney, which for me is the most magical industry, I was so excited,” she says.
Conti earned her PhD in 2010 working on the LHCb experiment at the Large Hadron Collider at CERN. The hunt for the Higgs boson was heating up, so for her postdoc she decided to switch experiments and join in on the excitement. “The LHC and its experiments were designed to look for the Higgs,” she says. “I thought that if the Higgs would be discovered at the LHC, I should definitely be part of its discovery.”
She started a postdoctoral research position at Harvard University, where she developed models for the ATLAS experiment that forecast background events mimicking the signals predicted for the Higgs. She honed her ability to sift through large and complicated data sets and find clandestine patterns.
In 2012, the ATLAS and CMS experiments announced the discovery of the Higgs boson. In 2016, Conti used her experience with large data sets to land a job at Disney. “I’m working with our Disney parks partners, applying the latest developments in machine learning to help them find ways to offer even better, more personalized guest experiences,” she says.
She says she finds surprisingly little difference between her current work at Disney and previous career as a physicist. “In both, we have research projects, develop code to do the analyses, and work in an international environment,” she says. “Only the nature of the data is fundamentally different.”
In addition to teaching her to draw meaning and insight from large and complex data sets, Conti found that her work as a physicist made her more dynamic and adaptable.
“As a physicist, you work on hardware, software and physics analyses, often simultaneously,” she says. “You get a broad field of competences, but most importantly, you get the confidence that you can adapt to new topics quickly and with success.”
Manuel Olmedo remembers looking through used books as a 15-year-old at a cafe in Tijuana in 2004 and finding a copy of A Brief History of Time by Stephen Hawking in Spanish.
“It had nice pictures of the fission process,” he says. “I showed it to my dad, and he got it for me as a gift.”
Nine years later, he found himself in a setting straight out of Hawking’s book: working at CERN, the European research center, studying the recently discovered Higgs boson.
Olmedo was born in Mexico and moved with his family to the US when he was 13. His school placed him in the second lowest level of their English-as-a-second-language courses due to his reading comprehension skills.
“So I went to the library, grabbed a book that looked hard and started reading,” he says. “As I got better, I started reading Harry Potter.”
He remembers reading a science textbook around that time and seeing a short segment that stuck with him: It was about the CDF and DZero experiments at Fermi National Accelerator Laboratory, where scientists had discovered the top quark in 1995. “I didn’t really know English,” he says, “but I remember seeing this sidebar about a woman who studied quarks on the DZero experiment and thinking, ‘What’s a quark?’”
By the 9th grade, Olmedo had transitioned out of the ESL program. After high school, he went to the University of California, Santa Barbara to study physics.
“I had this amazing professor,” Olmedo says. “I would go to his office hours and we would ponder the world together. He taught me that understanding reality meant looking beyond what our eyes could see.”
In spring 2013, Olmedo moved to CERN to analyze a rare decay of the Higgs boson—into a pair of particles of light—as a part of the CMS experiment.
“The CMS data sets were too big to do this analysis efficiently,” he says. “My main project was to identify and isolate the pertinent information so that the entire collaboration could move much faster.”
Olmedo’s work helped finalize the so-called CMS Higgs legacy paper, which showed the collaboration’s first detailed measurements of the particle it had co-discovered.
After completing his PhD, Olmedo wanted to see what else particle physics had to offer. “I felt like particle physics is this huge tree, and I was just hanging onto a little twig. There’s so many branches and so many ways to look for new physics.”
Olmedo is still looking beyond what his eyes can see, studying invisible particles called neutrinos as a postdoc at the University of Hawaii and a member of the Antarctic Impulsive Transient Antenna collaboration. The ANITA experiment floats huge balloons carrying radio antennas above Antarctica to look for signs of high-energy cosmic neutrinos skimming the ice and emitting radio waves.
“It’s a much smaller experiment and collaboration than CMS, but I find myself using a lot of the same skills,” Olmedo says. “It’s also a lot of fun, which I think is a very important quality in scientific research.”
Nothing beats a small experiment for the breadth of experience it gives the scientist.
In March, a team of astronomers made a shocking announcement: They appeared to have found a galaxy with a severe lack of dark matter. The finding, published in Nature, challenged some of astronomers’ most fundamental knowledge about galaxy formation.
Perhaps as surprising as the result was the technology behind it. While many major astronomical discoveries are made by collaborations of hundreds or thousands of people with access to large, multimillion-dollar telescopes, the bizarre galaxy was observed by a team of five using a telescope two scientists constructed in their spare time from off-the-shelf Canon camera lenses.
The scope, called Dragonfly, is just one example of the small-scale projects that are making waves in fields such as astronomy and particle physics.
“Government agencies are uniquely positioned to be able to conduct the important big experiments,” says Fermilab scientist Aaron Chou. “But it shouldn’t be a requirement that the experiment has to be big to be interesting. The requirement should be, and is, that if the science is interesting, you should just go ahead and do it.”
The particle physics community is increasingly recognizing this. In 2014, an organizing group called the Particle Physics Project Prioritization Panel released its goals for particle physics over the coming decade, often referred to as the P5 report. One of its nine recommendations is to “maintain a program of projects at all scales, from the largest international projects to mid- and small-scale projects,” highlighting benefits of small-scale projects, such as exposing physicists to new techniques, allowing young scientists to lead projects, and facilitating partnerships between universities and national laboratories.
“It’s good to have a power-law kind of range spanning small, medium and large,” says Kate Scholberg, a neutrino physicist at Duke University. “I think that’s the healthiest thing for the community.”
Running the show
Small experiments differ from their “big science” counterparts in more ways than just size. Big experiments have their own bureaucracies, with working groups, meetings, mandatory shifts and authorship policies. The smaller the experiment, the less of this machinery is required.
The fast pace and open culture of small experiments can make them ideal for graduate students. Students who work on large telescopes or accelerators such as the Large Hadron Collider get the excitement of contributing to a major project, but they experience only one small part of it, such as analyzing a portion of data collected by someone else. Phases of experiments can last more than the five or so years it takes to get a PhD, meaning a student can leave school without ever seeing an experimental process from beginning to end.
In contrast, students on smaller experiments can actively contribute to—or even oversee—everything from hardware conception and design to data acquisition, analysis and publication in the course of their degrees.
Deborah Lokhorst, a University of Toronto graduate student working with Dragonfly’s co-creator Bob Abraham, has experienced this firsthand. Lokhorst has already developed both software and instrumentation for the telescope. She’s currently working on the instrumentation to hold filters that will be placed in front of the lenses—generally impossible on larger scopes—to image the gas that permeates the space between galaxies, known as the intergalactic medium, or cosmic web. The new filter location should allow for clearer images in a narrower band of wavelengths. Once the filter technology is produced, she’ll take the telescope to New Mexico to test it.
“For now, we really have to treat it like a physics experiment, which is very unusual in astronomy, to be doing an experiment,” Lokhorst says. “This is all untested so far.”
For his part, Chou has worked on several small-science experiments, including Fermilab’s Holometer, a miniature version of the twin interferometers that made the groundbreaking first detection of gravitational waves. The Holometer searched for evidence of the quantization of space-time and went from idea to published result in seven years. Chou’s four students on the project were in great demand when they graduated, he says, in part because they had intense experience in all aspects of experimental physics.
Scholberg has seen students move from large to small collaborations for this reason. She served as a spokesperson for the COHERENT experiment, which studies neutrinos bouncing off atomic nuclei. It first formed in 2014, and by 2017 had built its 31-pound detector, the smallest ever to successfully detect a neutrino. Scholberg also works on large neutrino experiments, including the international, Fermilab-hosted Deep Underground Neutrino Experiment, a collaboration of over 1000 physicists.
Graduate students contribute to DUNE, in part because it’s a unique opportunity to help shape a huge, decades-long experiment. But some also work on other experiments such as COHERENT to complete their doctorates—US institutions typically require PhD theses be based on experimental data, and data collected by the DUNE prototypes will not be available until this fall.
Eventually, the COHERENT team hopes to scale the project up. But starting at the largest scale was too expensive, and the team couldn’t secure funding. So they created a small version instead. “The strategy was, ‘Let’s start with something really small and modest and see if we can do it,’” Scholberg says. “That, I think, was actually the right strategy.”
In that spirit, Chou and several researchers organized a workshop last year to gather ideas for small-scale dark matter experiments, many of which could serve as prototypes for future large experiments looking for alternative types of dark matter.
“We don’t know much about dark matter, and so maybe it’s something very different than what we thought,” says University of Minnesota physicist Priscilla Cushman, one of the workshop’s organizers. “If that’s the case, then large regions of unexplored parameter space become available, and small experiments with new ideas and technologies can make a big impact at the beginning.”
The proposed small projects could serve simultaneously as experiments unto themselves and as tests for new technologies that might become the “big science” projects of the future. And because they allow for deep involvement of scientists at any stage of their career, they also help create a pipeline of well-trained, enthusiastic physicists who will keep the field strong for decades to come.
“[Dragonfly] really opened my eyes to the fact that these kinds of projects exist, where you can build your own telescope from ground zero and then do cutting-edge science,” Lokhorst says. “Just knowing that something like this actually worked would inspire me to try and think outside the box in the future, to explore and think about ways to image things that people haven’t seen before.”
These are the event displays of Large Hadron Collider physicists’ dreams.
Our current understanding of particle physics, known as the Standard Model, has been tremendously successful. Over the past 40 years, it has pointed experimentalists toward many a discovery, such as the Higgs boson in 2012. However, despite all of its triumphs, there are a few things that the Standard Model does not explain.
One major missing piece, for example, is gravity. Although gravity is one of the four fundamental forces of nature, and we understand it well on the scale of stars, planets and galaxies, there is no working theory that can combine this understanding with the laws of particle physics.
Another missing piece—quite a big one—is dark matter, thought to make up about 80 percent of the matter in the universe.
Theorists have come up with a variety of ways to fill in the missing pieces, predicting the existence of several undiscovered particles and processes along the way. Scientists at the Large Hadron Collider are looking for signs of physics beyond the Standard Model, and they have ideas about what they might look like if they appear.
The LHC produces new particles by colliding protons at very high energies. The energy from the collisions converts momentarily into new particles, including ones that you don’t typically see floating about in nature. Scientists call these collisions “events,” and visual representations of them are called “event displays.”
The trained eye can spot when an event display is different from the norm, a clue that something unexpected and interesting is afoot. Let’s have a look at some theories of physics beyond the Standard Model and find out what they might look like in the detectors at the LHC.
By observing the gravitational properties of stars and galaxies, scientists can tell that some invisible thing must be influencing our universe. Scientists call it dark matter, which many theories predict must be made up of particles that interact with other matter only through gravity. If scientists were able to find signs of dark matter particles at the LHC, it would be a huge step forward in our understanding.
Here, we see a simulated event display showing what dark matter might look like in the ATLAS experiment at the LHC. The event display offers two views: one from the side and the other head-on. The intersecting orange lines show where the particle collision took place, and the other colored blocks show where particles deposited energy in different layers of the ATLAS detector.
Courtesy of the ATLAS Collaboration
If we look at the head-on view on the right, we can see particles with a large amount of energy moving up and slightly to the right. The conservation of energy requires that there must be an equal amount of energy traveling away from the collision on the other side, but this event is unbalanced.
This uneven event display suggests that something that we don’t see must be carrying away the extra energy (represented on the right by a white dashed line). The energy thief could be dark matter, passing through the detector without interacting.
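The bookkeeping behind a search like this is simple enough to sketch. In a minimal Python illustration (the particle momenta below are invented for the example), summing the transverse momenta of everything the detector saw and taking the magnitude of the imbalance gives the “missing” momentum an invisible particle would have carried away:

```python
import math

def missing_pt(visible):
    """Magnitude of the missing transverse momentum, in GeV, given the
    (px, py) of every visible particle. Momentum conservation says the
    transverse components should sum to zero; any imbalance points to
    something the detector did not see."""
    px = sum(p[0] for p in visible)
    py = sum(p[1] for p in visible)
    return math.hypot(px, py)

# Toy event: a jet and a lepton, transverse momenta in GeV (invented numbers)
print(round(missing_pt([(60.0, 20.0), (10.0, 5.0)]), 1))  # 74.3
```

A perfectly balanced event returns zero; a large value is exactly the kind of imbalance described above.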
It could be that dark matter is made up of a single kind of particle. Or it could be that dark matter, like regular matter, consists of a whole collection of different types of particles. Theories that describe dark matter in this way are called “hidden sector” or “hidden valley” theories of dark matter.
Hidden sector dark matter particles, like the dark matter particle simulated above, would pass right through a particle detector unnoticed. But if they traveled away from the collision where they were born and then decayed into regular matter particles, those particles would show up in interesting tracks on the event display.
Courtesy of the CMS Collaboration
In the simulated event display from the CMS detector above, we see a signature that could indicate the existence of hidden sector particles.
On the left, we can see the tracks of two particles called muons (in green). (Muons are the more massive cousins of electrons.) Their point of origin (where the lines cross) is a significant distance from the site of the original particle collision (the orange dot at the center). On the right side of the event we can see two electrons (the shorter green lines), which also seem to be coming from slightly to the side of the particle collision. This suggests that they were produced by a new unseen particle that traveled some distance away from the original collision before decaying into the observed particles.
Each of the forces in the Standard Model has at least one associated force-carrying particle: The electromagnetic force has the photon; the strong force has the gluon; and the weak force has the W and Z bosons. But as mentioned previously, the force of gravity isn’t included in the Standard Model.
If it were, it could be represented by a theoretical particle called a graviton. Very little is known about gravitons, but one thing scientists can predict is how they would decay. When a graviton is produced, it should immediately transform into two Z bosons, which in turn should decay into other pairs of particles.
The picture below shows a simulated graviton event from the CMS experiment at the LHC. In it, one Z boson decays into a pair of muons (the two red lines on the left), and the other decays into a spray of particles known as a “jet” (the several red and blue bars on the right). From these decay products, we can reconstruct the mass of the original particle that produced the two Z bosons.
Courtesy of the CMS Collaboration
An observation of a single event like this would not by itself be evidence of a graviton, because other known processes could produce two Z bosons. However, if scientists saw a large number of events like this and could reconstruct them to show that they all came from a particle with the same mass, that would be strong evidence for a new particle like a graviton.
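That reconstruction is special-relativity arithmetic: add up the energies and momenta of the decay products and compute the invariant mass of the parent. A minimal Python sketch, using toy numbers for a Z boson decaying to two back-to-back muons (and ignoring the muons’ own tiny masses):

```python
import math

def invariant_mass(particles):
    """Invariant mass in GeV of a set of decay products, each given as
    (E, px, py, pz) in GeV: m^2 = E_total^2 - |p_total|^2."""
    E  = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two back-to-back 45.6-GeV muons reconstruct a ~91.2-GeV parent,
# the mass of the Z boson (toy numbers, massless-muon approximation).
muons = [(45.6, 45.6, 0.0, 0.0), (45.6, -45.6, 0.0, 0.0)]
print(round(invariant_mass(muons), 1))  # 91.2
```

A pile-up of events at one reconstructed mass, rather than a smooth spread, is what would signal a new particle.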
Microscopic black holes
Gravity is tricky for particle physicists. It’s just so different from the other forces scientists know. Compared to electromagnetism, the strong force (which holds protons and neutrons together in our atoms) and even the weak force (which mediates radioactive decay), gravity is incredibly weak.
One explanation why gravity is so strange is that it could be operating in more than our three known dimensions of space. If gravity is actually stronger than it seems and is simply being dissipated over unseen additional dimensions, scientists predict that they should be able to use the LHC to create miniature black holes—tiny objects with enough concentrated mass or energy to have an escape velocity higher than the speed of light. Like the particles produced in LHC collisions, these would quickly evaporate—in this case into a burst of smaller particles.
Courtesy of the ATLAS Collaboration
The above image is a simulated black hole decay in a collision in the ATLAS detector. What makes this distinctive is that there is so much going on: The event display is an eruption of electrons, muons and particle jets. The sheer volume of particles is a strong indication that something new might be happening.
Another way to infer the existence of extra dimensions would be to see particles leaking from them into ours.
The simulated event display from the CMS experiment below looks deceptively simple: The main features of interest are the two muons (the red lines) passing through the detector on the left and right of the display. However, what makes this display interesting is that when scientists reconstruct the mass of the parent particle that decayed into these muons, it adds up to more than 10 times the mass of any known particle.
Courtesy of the CMS Collaboration
Particles leaking in from extra dimensions could produce more muon pairs with a higher reconstructed mass than regular proton collisions in the LHC could. If we saw a significant number of events like this with a larger mass than expected, it would be a strong indication that new physics is at work.
Just a sample
This gallery only scratches the surface of what new physics might look like at the LHC. Every day, scientists propose new theories that could produce even stranger looking events. And of course, a single event alone is not enough to confirm or disprove a new theory; scientists would generally need to see many such events to make the case that something unexpected is going on.
That’s why the LHC continues to collect mountains of data and will soon undergo upgrades that will have it smashing particles at a rate 10 times higher than originally designed. Every collision is a chance to see something new.
Some scientists spend decades trying to catch a glimpse of a rare process. But with good experimental design and a lot of luck, they often need only a handful of signals to make a discovery.
In 2009, University of Naples physicist Giovanni de Lellis had a routine. Almost every day, he would sit at a microscope to examine the data from his experiment, the Oscillation Project with Emulsion-tRacking Apparatus, or OPERA, located in Gran Sasso, Italy. He was seeking the same thing he had been looking for since 1996, when he was with the CHORUS experiment at CERN: a tau neutrino.
More specifically, he was looking for evidence of a muon neutrino oscillating into a tau neutrino.
Neutrinos come in three flavors: electron, muon and tau. At the time, scientists knew that they oscillated, changing flavors as they traveled at close to the speed of light. But they had never seen a muon neutrino transform into a tau neutrino.
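In the standard two-flavor approximation, the probability of one flavor appearing as another depends on the mixing angle, the mass-squared splitting, the distance traveled and the neutrino’s energy. A rough Python sketch (the beam numbers below are illustrative round figures, not OPERA’s published values):

```python
import math

def oscillation_probability(sin2_2theta, dm2_ev2, length_km, energy_gev):
    """Two-flavor appearance probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * length_km / energy_gev) ** 2

# Illustrative numbers for a 730-km baseline and a ~17-GeV beam:
p = oscillation_probability(1.0, 2.5e-3, 730.0, 17.0)
print(f"{p:.1%}")  # about 1.8% -- one reason these searches demand patience
```

With only a percent-level oscillation probability, and tau neutrinos themselves being hard to identify, the years-long wait described below is no surprise.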
Until November 30, 2009. On that day, de Lellis and the rest of the OPERA collaboration spotted their first tau neutrino in a beam of muon neutrinos coming from the CERN research center 730 kilometers away.
“Normally, what you would do is look and look, and nothing comes,” says de Lellis, now spokesperson for the OPERA collaboration. “So it's quite an exciting moment when you spot your event.”
For physicists seeking rare events, patience is key. Experiments like these often involve many years of waiting for a signal to appear. Some phenomena, such as neutrinoless double-beta decay, proton decay and dark matter, continue to elude researchers, despite decades of searching.
Still, scientists hope that after the lengthy wait, there will be a worthwhile reward. Finding neutrinoless double-beta decay would let researchers know that neutrinos are actually their own antiparticles and help explain why there’s more matter than antimatter. Discovering proton decay would test several grand unified theories—and let us know that one of the key components of atoms doesn’t last forever. And discovering dark matter would finally tell us what makes up about a quarter of the mass and energy in the universe.
“These are really hard experiments,” says Reina Maruyama, a physicist at Yale University working on neutrinoless double-beta decay experiment CUORE (Cryogenic Underground Observatory for Rare Events) as well as a number of direct dark matter searches. “But they will help answer really fundamental questions that have implications for how the universe was put together.”
Seeking signs, cutting noise
For the OPERA collaboration, finding a likely tau neutrino candidate was just the beginning. Hours of additional work, including further analyses and verification from other scientists, were required to confirm that the signal didn’t originate from another source.
Luckily, the first signal passed all the checks, and the team was able to observe four more candidate events in the following years. By 2015, the team had gathered enough data to confidently confirm that muon neutrinos had transformed into tau neutrinos. More specifically, they achieved a 5-sigma result, the gold standard of discovery in particle physics, meaning that if the signal were just a statistical fluctuation, there would be only about a 1 in 3.5 million chance of seeing one at least that strong.
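The “1 in 3.5 million” comes straight from the Gaussian tail: five standard deviations corresponds to a one-sided probability of about 2.9 × 10⁻⁷. A quick check in Python:

```python
import math

def one_sided_p_value(n_sigma):
    """Probability that a pure Gaussian fluctuation exceeds n_sigma
    standard deviations (the one-sided tail)."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

p = one_sided_p_value(5.0)
print(f"p = {p:.1e}")        # about 2.9e-07
print(f"1 in {1 / p:,.0f}")  # about 1 in 3.5 million
```

By the same measure, the 3-sigma threshold for “evidence” corresponds to roughly 1 in 740, which is why physicists insist on five sigma before claiming a discovery.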
For some experiments, seeing as few as two or three events could be enough to make a discovery, says Tiziano Camporesi, a physicist working on the CMS experiment at CERN. This was true when scientists at CERN’s Super Proton Synchrotron discovered the Z boson, a neutral elementary particle carrying the weak force, in 1983. “The Z boson discovery was basically made looking at three events,” Camporesi says, “but these three events were so striking that no other kind of particle being produced at the accelerator at the time could fake it.”
There are a number of ways scientists can improve their odds of catching an elusive event. In general, they can boost signals by making their detectors bigger and by improving the speed and precision with which they record incoming events.
But a lot depends on background noise: How prevalent are other phenomena that could create a false signal that looks like the one the scientists are searching for?
When it comes to rare events, scientists often have to go to great lengths to eliminate—or at least reduce—all sources of potential background noise. “Designing an experiment that is immune to background is challenging,” says Augusto Ceccucci, spokesperson for NA62, an experiment searching for an extremely rare kaon decay.
For their part, NA62 scientists remove background noise by, for example, studying only the decay products that coincide in time with the passage of incoming particles from the kaon beam, and by carefully identifying the characteristics of signals that could mimic what they’re looking for so they can eliminate them.
The Super Cryogenic Dark Matter Search experiment, or SuperCDMS, led by SLAC National Accelerator Laboratory, goes to great lengths to protect its detectors from cosmic rays, particles that regularly rain down on Earth from space. To eliminate this source of background, scientists shield the detectors with iron, ship them by ground and sea, and operate them deep underground. “So it would not take many dark matter particles detected to satisfy the 5-sigma detection rule,” says Fermilab’s Dan Bauer, spokesperson for SuperCDMS.
At particle accelerators, the search for rare phenomena looks a little different. Rather than simply waiting for particles to show up in a detector, physicists try to create them in particle collisions. The more elusive a phenomenon is, the more collisions it takes to find. Thus, at the Large Hadron Collider, “in order to achieve smaller and smaller probability of production, we're getting more and more intense beams,” Camporesi says.
Triangulating the results of different experiments can help scientists build a picture of the particles or processes they’re looking for without actually finding them. For example, by understanding what dark matter is not, physicists can constrain what it could be. “You take combinations of different experiments and you start rejecting different hypotheses,” Maruyama says.
Only time will tell whether scientists will be able to detect neutrinoless double-beta decay, proton decay, dark matter or other rare events that have yet to be spotted at physicists’ detectors. But once they do—and once scientists know what specific signatures to look for—Maruyama says, “it becomes a lot easier to look for these things, and you can go ahead and study the heck out of them.”
Halina Abramowicz leads the group effort to decide the future of European particle physics.
Physics projects are getting bigger, more global, more collaborative and more advanced than ever—with long lead times for complex physics machines. That translates into more international planning to set the course for the future.
In 2014, the United States particle physics community set its priorities for the coming years using recommendations from the Particle Physics Project Prioritization Panel, or P5. In 2020, the European community will refresh its vision with the European Strategy Update for Particle Physics.
The first European strategy launched in 2006 and was revisited in 2013. In 2019, teams will gather input through planning meetings in preparation for the next refresh.
Halina Abramowicz, a physicist who works on the ATLAS experiment at CERN’s Large Hadron Collider and the FCAL research and development collaboration through Tel Aviv University, is the chair of the massive undertaking. During a visit to Fermilab to provide US-based scientists with an overview of the process, she sat down with Symmetry writer Lauren Biron to discuss the future of physics in Europe.
What do you hope to achieve with the next European Strategy Update for Particle Physics?
Europe is a very good example of the fact that particle physics is very international, because of the size of the infrastructure that we need to progress, and because of the financial constraints.
The community of physicists working on particle physics is very large; Europe has probably about 10,000 physicists. They have different interests, different expertise, and somehow, we have to make sure to have a very balanced program, such that the community is satisfied, and that at the same time it remains attractive, dynamic, and pushing the science forward. We have to take into account the interests of various national programs, universities, existing smaller laboratories, CERN, and make sure that there is a complementarity, a spread of activities—because that’s the way to keep the field attractive, that is, to be able to answer more questions faster.
How do you decide when to revisit the European plan for particle physics?
Once the Higgs was discovered, it became clear that it was time to revisit the strategy, and the first update happened in 2013. The recommendation was to vigorously pursue the preparations for the high-luminosity upgrade of the [Large Hadron Collider]. The high-luminosity LHC program was formally approved by the CERN Council in September 2016. By the end of 2018, the LHC experiments will have collected almost a factor of 10 more data. It will be a good time to reflect on the latest results, to think about mid-term plans, to discuss what are the different options to consider next and their possible timelines, and to ponder what would make sense as we look into the long-term future.
The other aspect which is very important is the fact that the process is called “strategy,” rather than “roadmap,” because it is a discussion not only of the scientific goals and associated projects, but also of how to achieve them. The strategy basically is about everything that the community should be doing in order to achieve the roadmap.
What’s the difference between a strategy and a roadmap?
The roadmap is about prioritizing the scientific goals and about the way to address them, while the strategy covers also all the different aspects to consider in order to make the program a success. For example, outreach is part of the strategy. We have to make sure we are doing something that society knows about and is interested in. Education: making sure we share our knowledge in a way which is understandable. Detector developments. Technology transfer. Work with industry. Making sure the byproducts of our activities can also be used for society. It’s a much wider view.
What is your role in this process?
The role of the secretary of the strategy is to organize the process and to chair the discussions so that there is an orderly process. At this stage, we have one year to prepare all the elements of the process that are needed—i.e. to collect the input. In the near future we will have to nominate people for the physics preparatory group that will help us organize the open symposium, which is basically the equivalent of a town-hall meeting.
The hope is that if it’s well organized and we can reach a consensus, especially on the most important aspects, the outcome will come from the community. We have to make sure through interaction with the European community and the worldwide community that we aren’t forgetting anything. The more inputs we have, the better. It is very important that the process be open.
The first year we debate the physics goals and try to organize the community around a possible plan. Then comes the process that is maybe a little shorter than a year, during which the constraints related to funding and interests of various national communities have to be integrated. I’m of course also hoping that we will get, as an input to the strategy discussions, some national roadmaps. It’s the role of the chair to keep this process flowing.
Can you tell us a little about your background and how you came to serve as the chair for European Strategy Update?
That’s a good question. I really don’t know. I did my PhD in 1978; I was one of the youngest PhDs of Warsaw University, thus I’ve spent 40 years in the field. That means that I have participated in at least five large experiments and at least two or three smaller projects. I have a very broad view—not necessarily a deep view—but a broad view of what’s happening.
There are major particle physics projects going on around the world, like DUNE in the US and Belle II in Japan. How much will the panel look beyond Europe to coordinate activities, and how will it incorporate feedback from scientists on those projects?
This is one of the issues that was very much discussed during my visit. We shouldn’t try to organize the whole world—in fact, a little bit of competition is very healthy. And complementarity is also very important.
At the physics-level discussions, we’ll make sure that we have representatives from the United States and other countries so we are provided with all the information. As I was discussing with many people here, if there are ideas, experiments or existing collaborations which already include European partners, then of course, there is no issue [because the European partners will provide input to the strategy].
How do you see Europe working with Asia, in particular China, which has ambitions for a major collider?
Collaboration is very important, and at the global level we have to find the right balance between competition, which is stimulating, and complementarity. So we’re very much hoping to have one representative from China in the physics preparatory group, because China seems to have ambitions to realize some of the projects which have been discussed. And I’m not talking only about the equivalent of [the Future Circular Collider]; they are also thinking about an [electron-positron] circular collider, and there are also other projects that could potentially be realized in China. I also think that if the Chinese community decides on one of these projects, it may need contributions from around the world. Funding is an important aspect for any future project, but it is also important to reach a critical mass of expertise, especially for large research infrastructures.
This is a huge effort. What are some of the benefits and challenges of meeting with physicists from across Europe to come up with a single plan?
The benefits are obvious. The more input we have, the fuller the picture we have, and the more likely we are to converge on something that satisfies maybe not everybody, but at least the majority—which I think is very important for a good feeling in the community.
The challenges are also obvious. On one hand, we rely very much on individuals and their creative ideas. These are usually the people who also happen to be the big pushers and tend to generate most controversies. So we will have to find a balance to keep the process interesting but constructive. There is no doubt that there will be passionate and exciting discussions that will need to happen; this is part of the process. There would be no point in only discussing issues on which we all agree.
The various physics communities, in the ideal situation, get organized. We have the neutrino community, [electron-positron collider] community, precision measurements community, the axion community—and here you can see all kinds of divisions. But if these communities can get organized and come up with what one could call their own white paper, or what I would call a 10-page proposal, of how various projects could be lined up, and what would be the advantages or disadvantages of such an approach, then the job will be very easy.
And that input is what you’re aiming to get by December 2018?
How far does the strategy look out?
It doesn’t have an end date. This is why one of the requests for the input is for people to estimate the time scale—how much time would be needed to prepare and to realize the project. This will allow us to build a timeline.
We have at present a large project that is approved: the high-luminosity LHC. This will keep an important part of our community busy for the next 10 to 20 years. But will the entire community remain fully committed for the whole duration of the program if there are no major discoveries?
I’m not sure that we can be fed intellectually by one project. I think we need more than one. There’s a diversity program—diversity in the sense of trying to maximize the physics output by asking questions which can be answered with the existing facilities. Maybe this is the time to pause and diversify while waiting for the next big step.
Do you see any particular topics that you think are likely to come up in the discussion?
There are many questions on the table. For example, should we go for a proton-proton or an [electron-positron] program? There are, for instance, voices advocating for a dedicated Higgs factory, which would allow us to make measurements of the Higgs properties to a precision that would be extremely hard to achieve at the LHC. So we will have to discuss if the next machine should be an [electron-positron] machine and check whether it is realistic and on what time scale.
One of the subjects that I’m pretty sure will come up as well is about pushing the accelerating technologies. Are we getting to the limit of what we can do with the existing technologies, and is it time to think about something else?
Work has begun on an upgrade to the Facility for Advanced Accelerator Experimental Tests at SLAC National Accelerator Laboratory.
The Department of Energy’s SLAC National Accelerator Laboratory has started to assemble a new facility for revolutionary accelerator technologies that could make future accelerators 100 to 1000 times smaller and boost their capabilities.
The project is an upgrade to the Facility for Advanced Accelerator Experimental Tests (FACET), a DOE Office of Science user facility that operated from 2011 to 2016. FACET-II will produce beams of highly energetic electrons like its predecessor, but with even better quality. These beams will primarily be used to develop plasma acceleration techniques, which could lead to next-generation particle colliders that enhance our understanding of nature’s fundamental particles and forces, and novel X-ray lasers that provide us with unparalleled views of ultrafast processes in the atomic world around us.
FACET-II will be a unique facility that will help keep the US at the forefront of accelerator science, says SLAC’s Vitaly Yakimenko, project director. “Its high-quality beams will enable us to develop novel acceleration methods. In particular, those studies will bring us close to turning plasma acceleration into actual scientific applications.”
The DOE has now approved the $26 million project. The new facility, which is expected to be completed by the end of 2019, will also operate as an Office of Science user facility—a federally sponsored facility for advanced accelerator research available on a competitive, peer-reviewed basis to scientists from around the world.
“As a strategically important national user facility, FACET-II will allow us to explore the feasibility and applications of plasma-driven accelerator technology,” says James Siegrist, associate director of the High Energy Physics program of DOE’s Office of Science, which stewards advanced accelerator R&D in the US for the development of applications in science and society. “We’re looking forward to seeing the groundbreaking science in this area that FACET-II promises, with the potential for significant reduction of the size and cost of future accelerators, including free-electron lasers and medical accelerators.”
Bruce Dunham, head of SLAC’s Accelerator Directorate, says, “Our lab was built on accelerator technology and continues to push innovations in the field. We’re excited to see FACET-II move forward.”
Surfing the plasma wake
The new facility will build on the successes of FACET, where scientists already demonstrated that the plasma technique can very efficiently boost the energy of electrons and their antimatter particles, positrons. In this method, researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy.
In conventional accelerators, particles draw energy from a radio-frequency field inside metal structures. However, these structures can only support a limited energy gain per distance before breaking down, so accelerators that reach very high energies become very long and very expensive. The plasma wakefield approach promises to break new ground. Future plasma accelerators could, for example, deliver the same acceleration as SLAC’s historic 2-mile-long copper accelerator in just a few meters.
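The scale of that difference can be sketched with rough numbers. The figures below are illustrative assumptions, not values from the article: SLAC’s copper linac reached roughly 50 GeV over its ~2-mile (~3.2 km) length, while plasma wakefield experiments have demonstrated gradients on the order of tens of GeV per meter.

```python
# Back-of-the-envelope comparison of accelerating gradients.
# All figures are illustrative assumptions, not measured values.

linac_energy_gev = 50.0   # approximate final energy of the historic SLAC linac
linac_length_m = 3200.0   # ~2 miles in meters

# Average gradient of the conventional copper linac, in GeV per meter
conventional_gradient = linac_energy_gev / linac_length_m

# Assumed plasma wakefield gradient, in GeV per meter
plasma_gradient = 50.0

# Plasma length needed to match the full linac's energy gain
plasma_length_m = linac_energy_gev / plasma_gradient

print(f"Conventional gradient: {conventional_gradient * 1000:.1f} MeV/m")
print(f"Plasma length for the same energy gain: {plasma_length_m:.1f} m")
```

Under these assumed numbers, the conventional machine averages around 16 MeV per meter, while the plasma stage reaches the same total energy in about a meter — a gap of roughly three orders of magnitude, which is what motivates the "100 to 1000 times smaller" framing above.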
Researchers will use FACET-II for crucial developments before plasma accelerators can become a reality. “We need to show that we’re able to preserve the quality of the beam as it passes through plasma,” says SLAC’s Mark Hogan, FACET-II project scientist. “High-quality beams are an absolute requirement for future applications in particle and X-ray laser physics.”
The FACET-II facility is currently funded to operate with electrons, but its design allows adding the capability to produce and accelerate positrons later—a step that would enable the development of plasma-based electron-positron particle colliders for particle physics experiments.
Another important objective is the development of novel electron sources that could lead to next-generation light sources, such as brighter-than-ever X-ray lasers. These powerful discovery machines provide scientists with unprecedented views of the ever-changing atomic world and open up new avenues for research in chemistry, biology and materials science.
Other science goals for FACET-II include compact wakefield accelerators that use certain electrical insulators instead of plasma, as well as diagnostics and computational tools that will accurately measure and simulate the physics of the new facility’s powerful electron beams. Science goals are being developed with regular input from the FACET user community.
“The approval for FACET-II is an exciting milestone for the science community,” says Chandrashekhar Joshi, a researcher from the University of California, Los Angeles, and longtime collaborator of SLAC’s plasma acceleration team. “The facility will push the boundaries of accelerator science, discover new and unexpected physics and substantially contribute to the nation’s coordinated effort in advanced accelerator R&D.”
Editor's note: This article is based on a press release issued by SLAC National Accelerator Laboratory.