**Amara McCune, PhD Student**

**University of California, Santa Barbara**

“So, what do you do all day?” is a typical response after I tell anybody, whether a friend or a stranger sitting next to me on an airplane, that I study theoretical physics. I have to admit that I haven’t formulated a concrete answer to this yet, even two years into my PhD. Sometimes the hours slip by and I ask myself, “what *do* I do all day?” The most truthful response is that I sit down at my desk, read some papers, work on or outline a calculation, troubleshoot a concept I’m stuck on, attend talks and meetings, and consume 1-2 cups of coffee along the way. But I know this question digs deeper into the heart of what theoretical physics really is.

I am a graduate student at UC Santa Barbara, where I study a particular subfield of theoretical physics called particle phenomenology. This means I am a model builder: I investigate different extensions of the Standard Model (SM) to see where they might fit in our parameter space of data, working between the realms of mathematical definitions and experimental observations. The SM encapsulates our current understanding of particle physics: matter consists of twelve fermion (spin-1/2) particles (six quarks and six leptons) and five force-carrying boson (spin-1) particles, which interact through the strong nuclear force, the weak nuclear force, and the electromagnetic force. The latter two can be unified, under a symmetry-breaking scheme, into the electroweak force. The model has been extensively tested, most notably at the Large Hadron Collider (a 27 km-long circular particle accelerator located on the border between Switzerland and France), but also continually in tabletop experiments, in smaller circular and linear accelerators around the world, and in cosmological probes and datasets. It has been largely successful at making predictions about the world around us.
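The particle tally above can be sketched in a few lines (an illustrative count only; the W⁺ and W⁻ are counted separately to match the five spin-1 force carriers, and the spin-0 Higgs boson sits outside this count):

```python
# A minimal tally of the Standard Model particle content described above.
quarks = ["up", "down", "charm", "strange", "top", "bottom"]
leptons = ["electron", "muon", "tau",
           "electron neutrino", "muon neutrino", "tau neutrino"]
fermions = quarks + leptons  # the twelve spin-1/2 matter particles

# Spin-1 force carriers, counting W+ and W- separately:
gauge_bosons = ["gluon", "photon", "W+", "W-", "Z"]

print(len(fermions), len(gauge_bosons))  # → 12 5
```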

However, we know that the SM is incomplete. It only includes three fundamental forces — the fourth, gravity, has not been successfully incorporated. We also have extensive evidence for the existence of dark matter and dark energy, neither of which is described within the realm of the SM. Then there are more subtle problems: there are a large number of free parameters in the theory, each of which has to be input directly into our models. There are a number of fine-tuning problems, whereby the values of certain constants must be precisely adjusted to fit experimental data or predictions. This means that the theory does not provide explanations for key questions, such as why the mass of the electron is what it is, falling short of a desired first-principles approach.

The majority of current work in theoretical particle physics revolves around these ideas. But how does this process actually transpire? There are a few basic ingredients that go into a particle-based theory: a description of the fields, their interactions, and which symmetries they obey. These are bundled together into a mathematical depiction, a Lagrangian, as the endpoint of model building. With any given number of parameters, a theory can often be adjusted to create several potential models aiming to describe some specified phenomenon. If this seems vague, it’s because it is — there’s no cookie-cutter method for coming up with a new model because you have to see what works. Creating avenues for new quantum fields and interactions is very much an art form, and one that is inherently speculative. The vast majority of theories turn out to have little experimental support, but many of them spur new ideas and keep our collective knowledge trekking forward in the constant pursuit of an all-encompassing theory, which may, or more likely may not, come within our lifetimes.
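To make the idea of a Lagrangian concrete, here is a textbook example (an illustration from any quantum field theory course, not one of my models): a single real scalar field $\phi$ with mass $m$ and a self-interaction of strength $\lambda$,

$$\mathcal{L} = \frac{1}{2}\,\partial_\mu \phi\,\partial^\mu \phi \;-\; \frac{1}{2} m^2 \phi^2 \;-\; \frac{\lambda}{4!}\,\phi^4 .$$

The first term describes how the field propagates, the second gives it a mass, and the third encodes its interactions; the whole expression respects a symmetry under $\phi \to -\phi$. Model building amounts to choosing which fields, interaction terms, and symmetries to write down.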

But we do not despair. Physicists love encountering a new finding that does not fit into predictions derived from our current understanding of physics, because it means figuring out a new puzzle, and such puzzles are sometimes decades or centuries in the making. This is the heartbeat that drives the field forward. When confronted with an unusual experimental result, or anomaly, our job as theorists is to figure out what the broader implications of such an anomaly might be. Maybe an excess signal was detected — to figure out what it could be, we might check constraints from dark matter parameters, or look into rare particle interactions. We often have a long list of experimental constraints to work within, but every once in a while, some explanation fits.

There are a few general principles that have produced promising results in the past, although they have not proven to be consistently reliable. One of these, naturalness, is a principle which roughly states that there should be no parameters in the theory that are either incredibly large or unusually small, misaligned with the magnitude of the other parameters. While it may seem a strange idea, a violation of naturalness hints at new physics above the energy scale we can currently probe. And there is definitely unknown physics at a very small distance scale, and hence a large energy scale, since we do not have a working theory of quantum gravity describing our universe.
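The classic illustration of naturalness (a standard textbook example, not specific to this essay) is the Higgs boson mass: quantum corrections tend to drag it up toward the highest energy scale $\Lambda$ at which the theory remains valid,

$$\delta m_H^2 \sim \frac{\lambda^2}{16\pi^2}\,\Lambda^2,$$

where $\lambda$ is a generic coupling. Keeping the observed mass near 125 GeV while $\Lambda$ is enormous requires a delicate cancellation between large numbers, unless new physics appears not far above the electroweak scale.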

As an example, for decades physicists have been looking for supersymmetry, a theory that would nicely align the strengths of the SM forces at a particular high energy scale via a relationship between fermions and bosons. However, we have found no evidence for this theory to date — in testing those potential predictions, we continually come up empty-handed. While this is a disappointment, science is often not so straightforward, and we march forward by coming up with new theories. These might be some less-elegant modification of supersymmetry, string theory, or something completely different. It is not always the case that the prettiest or most compelling solutions end up being the answers.

To further illustrate this process, let’s walk through an example that I am currently working on: a parity solution to the strong CP problem. The strong CP problem is one of those subtleties of the SM, and it has to do with the fundamental symmetries of charge conjugation (C) and parity (P). The former refers to a symmetry in which positive charges are exchanged for negative ones and vice versa, while the latter refers to a symmetry in which we flip the sign of all spatial coordinates. The CP transformation refers to the combination of these individual symmetries. The “problem” arises because, while this symmetry can be violated in the case of weak interactions (involving the weak force), it is seemingly conserved in the case of strong interactions (involving the strong force). This at first glance might not seem to be a problem, yet we know of no reason why CP should not be violated in the strong interaction. Further, it seemingly sets the value of a parameter in the theory, known as the CP-violating phase, to a tiny value of no more than about one part in 10^10 (that is, less than 10^-10). This is an example of fine-tuning.
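In equations (standard notation from the literature, not specific to my own work), the offending term in the strong-interaction Lagrangian is

$$\mathcal{L}_{\bar\theta} = \bar\theta\,\frac{g_s^2}{32\pi^2}\, G^a_{\mu\nu}\tilde{G}^{a\,\mu\nu},$$

where $G^a_{\mu\nu}$ is the gluon field strength, $\tilde{G}$ is its dual, and $\bar\theta$ is the CP-violating phase. Measurements of the neutron’s electric dipole moment bound it at $|\bar\theta| \lesssim 10^{-10}$, with nothing in the SM explaining why it should be so small.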

The most widely studied solution to the strong CP problem is the axion solution, which sets the CP-violating phase to zero via the introduction of a new dynamical field, corresponding to a particle that has not been discovered but is the target of several current and upcoming experiments. Yet this is not the only option. We first notice that the symmetry group of the SM does not obey parity; when “left-handed” particles and symmetries are exchanged with their “right-handed” counterparts, you have fundamentally changed the theory. This is encompassed in the idea of chirality, in which certain phenomena are not identical to their corresponding mirror images. But if the SM symmetry group were extended such that parity is obeyed, this would have the effect of also setting the CP-violating phase to zero. Hence, a solution!
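Schematically (in the standard notation of left-right symmetric models; the details of any particular model are beyond this essay), the idea is to extend the SM gauge group as

$$SU(3)_c \times SU(2)_L \times U(1)_Y \;\longrightarrow\; SU(3)_c \times SU(2)_L \times SU(2)_R \times U(1)_{B-L},$$

so that every left-handed interaction acquires a right-handed mirror. Parity then becomes a good symmetry of the theory at high energies, and imposing it forces the CP-violating phase to vanish.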

Here’s an important nuance: when I say this “solves” the strong CP problem, I don’t mean that I’ve definitively found the answer. I mean that this model extends our list of potential answers, because determining the true solution relies on experimental evidence for verification, and we are very stringent in requiring a 5-sigma result — an effect five standard deviations away from the background-only expectation — in order to confirm a discovery. This means there is roughly a one in 3.5 million chance that a random fluctuation alone would produce such a result, an incredible level of stringency. Only once this has been achieved do we declare the “eureka” of discovery.
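That one-in-3.5-million figure is just the one-sided tail probability of a normal distribution at five standard deviations, which can be checked in a few lines (a quick sanity check using only the Python standard library):

```python
import math

# One-sided tail probability of a standard normal distribution at 5 sigma:
# p = P(Z > 5) = erfc(5 / sqrt(2)) / 2
p = math.erfc(5 / math.sqrt(2)) / 2

print(f"p = {p:.3e}")        # roughly 2.87e-07
print(f"one in {1/p:,.0f}")  # roughly one in 3.5 million
```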

Yet we do not know how long it will take to reach such a moment, or even if one will come at all. Physicists often work on extended time scales, as theories can take decades or longer to be properly verified or debunked, and many theorists might never see the fruits of their labor. Current technologies we know and love, like smartphones and GPS, are built on relatively old ideas in fundamental physics, and they proliferated long after their theoretical foundations were developed. But the majority of us are driven instead by the process of continual scientific questioning and discovery that comes only with exploring the deepest questions the universe has to offer. It’s not for everyone, but it certainly works for me.

So, I suppose that is what I do with my days.

Wiki - Standard Model Image Credit: By MissMJ, Cush - Own work by uploader, PBS NOVA [1], Fermilab, Office of Science, United States Department of Energy, Particle Data Group, Public Domain, https://commons.wikimedia.org/w/index.php?curid=4286964
