We eagerly await data from the Large Hadron Collider (LHC), anticipating that it will provide hints of physics beyond the Standard Model (BSM). Studies that explore how accurately the LHC can determine the parameters of a specific BSM scenario find that the LHC can often achieve a remarkable level of precision.
However, this seems to contradict the well-known fact that there are only a limited number of observables that the LHC can measure. In particular, if a BSM scenario predicts the production of dark matter candidates at the LHC, then these dark matter particles pass invisibly through the detector, resulting in a fundamental loss of information.
How is it then that we can measure the parameters of a given theory very accurately even though there are relatively few measurements we can make? The answer seems to be that the loss of information at the LHC results in discrete ambiguities in mapping LHC data back to theoretical models. That is, even if one model fits the data very well with small error bars, there will be a (possibly large) number of other models that fit the data equally well. Moreover, precisely because the error bars are small, these other “degenerate” models are hard to find through brute-force scanning.
The presence of discrete ambiguities in interpreting LHC data is known as the LHC Inverse Problem. It is an “inverse problem” because it is well known how to solve the forward problem of determining the potential LHC signatures of a particular model, while in real life we are trying to invert this process and determine the underlying model from a set of measured LHC signatures. Because these ambiguities are discrete, it seems likely that simple observables might be able to resolve at least some of these degeneracies. This inspires us to search for new methods to analyze LHC data.
Spring 2008
The top quark was discovered at the Tevatron in 1995, and it has very interesting properties compared to the other five quarks in the Standard Model. The top quark is almost 200 times heavier than a proton (even though a proton is composed of three light quarks). This puts the mass of a top quark close to the mass of a gold atom. Unlike gold, though, the top quark is very unstable, and it has a distinctive three-body decay.
The top quark will be copiously produced at the LHC, with upwards of 100 million top pairs a year produced from Standard Model processes alone. The properties of the top quark will be measured to unprecedented accuracy using the three-body decay mode. But because of the novel top properties, it is natural to also expect BSM sources of top quarks. If the BSM source is heavy enough, the top will be created at high velocities, and the three-body decay of the top will be obscured by collimation effects, creating a narrow top jet.
How can we identify these so-called boosted tops? Though the three-body decay of the top is obscured, it is not entirely absent. With Lian-Tao Wang, we studied the energy distribution within top jets and found characteristics that could distinguish between top jets and ordinary jets formed by radiation. This technique will help increase the purity for BSM signals with boosted tops.
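The underlying idea can be illustrated with a toy calculation (a simplified sketch for illustration only, not the analysis described above): the invariant mass of a jet's constituents is one simple substructure observable. A jet from a boosted top retains the imprint of its three-body decay, so its constituents carry momentum transverse to the jet axis and reconstruct to a large jet mass, while a collimated light-parton jet reconstructs to a small mass. The four-vectors below are made-up numbers chosen only to show the contrast.

```python
import math

def jet_mass(constituents):
    """Invariant mass of a list of (E, px, py, pz) four-vectors (GeV):
    m^2 = (sum E)^2 - |sum p|^2."""
    E = sum(c[0] for c in constituents)
    px = sum(c[1] for c in constituents)
    py = sum(c[2] for c in constituents)
    pz = sum(c[3] for c in constituents)
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))

# Two collimated massless constituents (a light-parton-like jet):
# the jet mass is essentially zero.
light_jet = [(100.0, 100.0, 0.0, 0.0), (100.0, 100.0, 0.0, 0.0)]

# Constituents at a wide relative angle (a boosted-top-like jet):
# the jet mass is large, reflecting the obscured three-body decay.
top_like_jet = [(100.0, 100.0, 0.0, 0.0), (100.0, 0.0, 100.0, 0.0)]
```

In a real analysis one would cluster calorimeter cells into jets and study the full energy distribution, but even this crude observable shows why substructure carries discriminating power.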
Fall 2006 - Spring 2007
What is the minimal parameterization of new physics signals that does not suffer from the problem of ambiguities? In studying the LHC Inverse Problem, we found that discrete ambiguities arise when two models have similar particle masses, similar particle production modes, and similar particle decay modes. Therefore, we proposed the idea of an “On-Shell Effective Theory” (OSET), which summarizes new physics models directly in terms of these three properties: masses, cross sections, and branching ratios.
In the language of OSETs, the problem of discrete ambiguities is mitigated because it is less likely for two OSETs to look similar since OSETs are directly related to the actual LHC signals that are observed. Moreover, OSETs are parameterized in terms of quantities that can be easily calculated in a fundamental theory. In this way, OSETs are a useful intermediary between LHC data and theoretical models.
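A minimal sketch of what an OSET parameterization might look like as a data structure (the class names, particle labels, and numerical values here are all hypothetical, chosen only to illustrate the three ingredients: masses, cross sections, and branching ratios):

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class OSETParticle:
    """A new-physics state summarized by its mass and decay table."""
    name: str
    mass_gev: float
    # Branching ratios keyed by decay products; fractions should sum to 1.
    branchings: Dict[Tuple[str, ...], float] = field(default_factory=dict)

@dataclass
class OSET:
    """An on-shell effective theory: particles plus production rates."""
    particles: List[OSETParticle]
    # Production cross sections in pb, keyed by the produced pair.
    cross_sections_pb: Dict[Tuple[str, str], float] = field(default_factory=dict)

def branchings_valid(particle, tol=1e-6):
    """Check that a particle's branching ratios sum to unity."""
    return abs(sum(particle.branchings.values()) - 1.0) < tol

# A made-up heavy colored state "G" decaying to jets plus an invisible "N",
# pair-produced with an illustrative 5 pb cross section.
g_state = OSETParticle("G", 600.0, {("q", "q", "N"): 0.7, ("b", "b", "N"): 0.3})
example_oset = OSET([g_state], {("G", "G"): 5.0})
```

The point of such a structure is that its entries map directly onto observable signal rates, while any fundamental theory can be projected onto it by computing the same masses, cross sections, and branching ratios.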
To provide a concrete implementation of the OSET philosophy, we created a program MARMOSET (Mass and Ratio Modeling in On-Shell Effective Theories) that allows the user to create LHC pseudo-data based on OSET input parameters.
Summer/Fall 2005
The Minimal Supersymmetric Standard Model (MSSM) is one of the leading candidates for a theory beyond the Standard Model. The MSSM predicts a new supersymmetric partner particle for every Standard Model particle, effectively doubling the known number of fundamental states.
In principle, the spectrum of these partner states can yield information about the ultraviolet structure of the standard model. However, we found that given a generic strategy for analyzing LHC data, there are often discrete ambiguities in trying to determine the pattern of supersymmetric states. Moreover, these ambiguities are difficult to identify, because they correspond to drastically rearranging the MSSM spectrum.
This study offered an explicit confirmation of the LHC Inverse Problem.
Summer 2005 - Summer 2007
The LHC Olympics were a series of four workshops where theorists attempted to determine the underlying TeV scale model from pseudo-LHC “blackbox” data. Using crude but semi-realistic data analysis methods, we learned firsthand the challenges of interpreting ambiguous data, and developed an intuition for the LHC Inverse Problem.
I was a member of the Harvard team, and we created a U(1)_{B-L} gauge-mediated supersymmetric blackbox for the 2nd LHC Olympics. We were able to solve (or solve up to degeneracies) several blackboxes created by other teams. As part of my participation with the LHC Olympics, I helped improve the user interface to John Conway's PGS (Pretty Good Simulation) detector simulation.