In Medias Res…


Topics: Applied Physics, Astrophysics, Computer Modeling, Einstein, High Energy Physics, Particle Physics, Theoretical Physics

In the search for new physics, a new kind of scientist is bridging the gap between theory and experiment.

Traditionally, many physicists have divided themselves into two tussling camps: the theorists and the experimentalists. Albert Einstein theorized general relativity, and Arthur Eddington observed it in action as the “bending” of starlight; Murray Gell-Mann and George Zweig thought up the idea of quarks, and Henry Kendall, Richard Taylor, Jerome Friedman and their teams detected them.

In particle physics especially, the divide is stark. Consider the Higgs boson, proposed in 1964 and discovered in 2012. Since then, physicists have sought to scrutinize its properties, but theorists and experimentalists don’t share Higgs data directly, and they’ve spent years arguing over what to share and how to format it. (There’s now some consensus, although the going was rough.)

But there’s a missing player in this dichotomy. Who, exactly, is facilitating the flow of data between theory and experiment?

Traditionally, the experimentalists filled this role, running the machines and looking at the data — but in high-energy physics and many other subfields, there’s too much data for this to be feasible. Researchers can’t just eyeball a few events in the accelerator and come to conclusions; at the Large Hadron Collider, for instance, about a billion particle collisions happen per second; sensors detect them, and vast computing systems process and store the results. And it’s not just quantity. All this data is outrageously complex, made more so by simulation.

In other words, these experiments produce more data than anyone could possibly analyze with traditional tools. And those tools are imperfect anyway, requiring researchers to boil down many complex events into just a handful of attributes — say, the number of photons at a given energy. A lot of science gets left out.

In response to this conundrum, a growing movement in high-energy physics and other subfields, like nuclear physics and astrophysics, seeks to analyze data in its full complexity — to let the data speak for itself. Experts in this area are using cutting-edge data science tools to decide which data to keep and which to discard and to sniff out subtle patterns.


Opinion: The Rise of the Data Physicist, Benjamin Nachman, APS News
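As a rough illustration of what “deciding which data to keep and which to discard” can look like in practice, here is a minimal sketch, not taken from the article, that uses scikit-learn’s IsolationForest to flag unusual events among synthetic, summarized collisions. Every feature name and number below is invented for the example; real analyses use far richer data and purpose-built tools.

```python
# Illustrative only: a toy "keep or discard" pass over summarized collision events.
# All feature names and numbers are invented; this is not the analysis from the article.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Pretend each event was boiled down to a few attributes,
# e.g. photon count, total transverse energy, missing energy.
background = rng.normal(loc=[2.0, 100.0, 20.0], scale=[1.0, 15.0, 5.0], size=(10000, 3))
signal_like = rng.normal(loc=[6.0, 180.0, 60.0], scale=[1.0, 15.0, 5.0], size=(20, 3))
events = np.vstack([background, signal_like])

# Fit an anomaly detector on the bulk of events and flag the outliers
# worth keeping for closer inspection (-1 = anomalous, +1 = typical).
detector = IsolationForest(contamination=0.005, random_state=0)
labels = detector.fit_predict(events)

kept = events[labels == -1]
print(f"Flagged {len(kept)} of {len(events)} events for further study")
```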

Published by reginaldgoodwin

Engineering Physics, Bachelor of Science, December 1984
Microelectronics & Photonics, Graduate Certificate, February 2016
Nanoengineering, Master’s, December 2019
Nanoengineering, Ph.D., Summer 2022
