Climate CERN…

Worrying trend Reliable climate models are needed so that societies can adapt to the impact of climate change. (Courtesy: Shutterstock/Migel)

Topics: Applied Physics, Atmospheric Science, CERN, Civilization, Climate Change

It was a scorcher last year. Land and sea temperatures were up to 0.2 °C (0.36 °F) higher every single month in the second half of 2023, with these warm anomalies continuing into 2024. We know the world is warming, but the sudden heat spike had not been predicted. As NASA climate scientist Gavin Schmidt wrote in Nature recently: “It’s humbling and a bit worrying to admit that no year has confounded climate scientists’ predictive capabilities more than 2023 has.”
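A quick note on that parenthetical conversion: the anomaly quoted is a temperature *difference*, so converting it to Fahrenheit uses only the 9/5 scale factor, not the +32 offset that applies to absolute temperatures. A minimal sketch (the function name is my own):

```python
# Converting a temperature *anomaly* (a difference) between scales:
# unlike an absolute temperature, no +32 offset applies.
def anomaly_c_to_f(delta_c):
    """Convert a temperature difference from degrees C to degrees F."""
    return delta_c * 9.0 / 5.0

anomaly_c_to_f(0.2)   # roughly 0.36 F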

As Schmidt went on to explain, a spell of record-breaking warmth had been deemed “unlikely” despite 2023 being an El Niño year, where the relatively cool waters in the central and eastern equatorial Pacific Ocean are replaced with warmer waters. Trouble is, the complex interactions between atmospheric deep convection and equatorial modes of ocean variability, which lie behind El Niño, are poorly resolved in conventional climate models.

Our inability to simulate El Niño properly with current climate models (J. Climate 10.1175/JCLI-D-21-0648.1) is symptomatic of a much bigger problem. In 2011 I argued that contemporary climate models were not good enough to simulate the changing nature of weather extremes such as droughts, heat waves and floods (see “A CERN for climate change” March 2011 p13). With grid-point spacings typically around 100 km, these models provide a blurred, distorted vision of the future climate. For variables like rainfall, the systematic errors associated with such low spatial resolution are larger than the climate-change signals that the models attempt to predict.

Reliable climate models are vital if societies are to adapt to climate change, assess the urgency of reaching net-zero, or implement geoengineering solutions if things get really bad. Yet how is it possible to adapt if we don’t know whether droughts, heat waves, storms or floods pose the greater threat? How do we assess the urgency of net-zero if models cannot simulate “tipping points”? How can we agree on potential geoengineering solutions if it is not possible to reliably assess whether spraying aerosols into the stratosphere will weaken the monsoons or reduce the moisture supply to the tropical rainforests? Climate modelers have to take the issue of model inadequacy much more seriously if they wish to provide society with reliable, actionable information about climate change.

I concluded in 2011 that we needed to develop global climate models with a spatial resolution of around 1 km (with compatible temporal resolution), and that the only way to achieve this is to pool human and computer resources to create one or more internationally federated institutes. In other words, we need a “CERN for climate change” – an effort inspired by the particle-physics facility near Geneva, which has become an emblem of international collaboration and progress.

Why we still need a CERN for climate change, Tim Palmer, Physics World

Methane on Mars…

Filled with briny lakes, the Quisquiro salt flat in South America’s Altiplano region represents the kind of landscape that scientists think may have existed in Gale Crater on Mars, which NASA’s Curiosity Rover is exploring. Credit: Maksym Bocharov

Topics: Astrobiology, Astrophysics, Atmospheric Science, Mars, NASA, Planetary Science

The most surprising revelation from NASA’s Curiosity Mars Rover—that methane is seeping from the surface of Gale Crater—has scientists scratching their heads.

Living creatures produce most of the methane on Earth. But scientists haven’t found convincing signs of current or ancient life on Mars, and thus didn’t expect to find methane there. Yet the portable chemistry lab aboard Curiosity, known as SAM (Sample Analysis at Mars), has continually sniffed out traces of the gas near the surface of Gale Crater, the only place on Mars where methane has been detected thus far. Its likely source, scientists assume, is geological mechanisms involving water and rocks deep underground.

If that were the whole story, things would be easy. However, SAM has found that methane behaves in unexpected ways in Gale Crater. It appears at night and disappears during the day. It fluctuates seasonally, and sometimes spikes to levels 40 times higher than usual. Surprisingly, the methane also isn’t accumulating in the atmosphere: the European Space Agency’s (ESA) ExoMars Trace Gas Orbiter, sent to Mars specifically to study the gas in the atmosphere, has detected no methane.

Why is methane seeping on Mars? NASA scientists have new ideas, Lonnie Shekhtman, NASA, Phys.org.

AI and the Great Filter…

Two researchers have revised the Drake equation, a mathematical formula for the probability of finding life or advanced civilizations in the universe.

University of Rochester. Are We Alone in the Universe? Revisiting the Drake Equation, NASA

Topics: Astrobiology, Astrophysics, Artificial Intelligence, Civilization, SETI

See: Britannica.com/The-Fermi-Paradox/Where-Are-All-The-Aliens

Abstract
This study examines the hypothesis that the rapid development of Artificial Intelligence (AI), culminating in the emergence of Artificial Superintelligence (ASI), could act as a “Great Filter” that is responsible for the scarcity of advanced technological civilizations in the universe. It is proposed that such a filter emerges before these civilizations can develop a stable, multiplanetary existence, suggesting the typical longevity (L) of a technical civilization is less than 200 years. Such estimates for L, when applied to optimistic versions of the Drake equation, are consistent with the null results obtained by recent SETI surveys, and other efforts to detect various techno-signatures across the electromagnetic spectrum. Through the lens of SETI, we reflect on humanity’s current technological trajectory – the modest projections for L suggested here underscore the critical need to quickly establish regulatory frameworks for AI development on Earth and the advancement of a multiplanetary society to mitigate against such existential threats. The persistence of intelligent and conscious life in the universe could hinge on the timely and effective implementation of such international regulatory measures and technological endeavors.
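The abstract’s argument, that a small L collapses the expected number of detectable civilizations, can be sketched with the Drake equation itself. The parameter values below are illustrative assumptions of my own, not figures from the paper:

```python
# A sketch of the Drake equation, N = R* . fp . ne . fl . fi . fc . L,
# showing how the lifetime term L dominates the result.
# All parameter values here are invented for illustration.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, L):
    """Estimated number of detectable civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * L

# Fairly optimistic inputs for every factor except L, capped at the
# paper's suggested upper bound of 200 years:
n = drake(r_star=3.0, f_p=1.0, n_e=0.2, f_l=1.0, f_i=1.0, f_c=0.2, L=200)
```

With these inputs N comes out around two dozen civilizations galaxy-wide; shrink L further and N falls linearly with it, which is the filter argument in miniature.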

Is artificial intelligence the great filter that makes advanced technical civilizations rare in the universe? Michael A. Garrett, Acta Astronautica, Volume 219, June 2024, Pages 731-735

Seventy Years Ago…

Topics: Civics, Civil Rights, Civilization, Existentialism, Fascism, History

“On May 17, 1954, U.S. Supreme Court Justice Earl Warren delivered the unanimous ruling in the landmark civil rights case Brown v. Board of Education of Topeka, Kansas. State-sanctioned segregation of public schools was a violation of the 14th Amendment and was, therefore, unconstitutional. This historic decision marked the end of the “separate but equal” precedent set by the Supreme Court nearly 60 years earlier in Plessy v. Ferguson. It served as a catalyst for the expanding civil rights movement during the decade of the 1950s.”

Source: https://www.archives.gov/milestone-documents/brown-v-board-of-education

Dr. Martin Luther King Jr was assassinated on April 4, 1968, a Thursday. My graduating kindergarten class at Bethlehem Community Center in Winston-Salem, North Carolina, was told by our teachers on Friday, who cried with us and reassured us that the men outside with Confederate flags, shooting in the air, reveling “that n—-r’s dead” would not harm us, or prevent our celebration. We slept as well as we could through nap time and dressed for our day and our parents. Not one child in my photo of the event is smiling. Not one.

I attended segregated schools in Winston-Salem, NC, until my fourth grade year in 1971. “All deliberate speed” had some considerable foot-dragging.

I was bused across town to Rural Hall for 4th grade only, and their kids were bused across town to Fairview Elementary for 5th and 6th grade. I was bused to Mineral Springs Middle School for 7th–8th grade. ALL the former Black High Schools, like Atkins, Carver, Hanes, and Paisley, had to be “9th and 10th grades only,” as North, East, Parkland, and West were reserved for the higher grades and high school graduation.

Dr. King confessed to his friend, the Civil Rights activist and entertainer Harry Belafonte: “We have fought hard and long for integration, as I believe we should have, and I believe that we will win, but I have come to believe that we are integrating into a burning house. I’m afraid that America has lost the moral vision she may have had,” as the nation is not deeply concerned “with the plight of the poor and disenfranchised.” This failure, King argued, would only further stoke “the anger and violence that tears the soul of this nation. I fear I am integrating my people into a burning house.”

I am sixty-one years old, a grandfather, and a late entrant to the ranks of a Ph.D. I am sad to say that despite the optimism of the movement, NOTHING has changed.

Ecclesiastes 1:9 “The thing that hath been, it is that which shall be and that which is done is that which shall be done: and there is no new thing under the sun.”

“What’s past is prologue.” William Shakespeare, The Tempest.

Matrix…

(a) Schematic of the word INFORMATION written on a material in binary code using magnetic recording. Red denotes magnetization pointing out of the plane, and blue denotes magnetization pointing into the plane. (b)–(d) Time evolution of the digital magnetic recording information states, simulated using micromagnetic Monte Carlo. (b) Initial random state. (c) INFORMATION is written (t = 0 s). (d) Iteration 930 (t = 1395 s), showing the degradation of the information states. Reproduced with permission from M. M. Vopson and S. Lepadatu, AIP Adv. 12, 075310 (2022). Copyright 2022 AIP Publishing.

Topics: Chemistry, DNA, General Relativity, Genetics, Nucleotides, Thermodynamics

Reference: Electronic Orbitals, Chem Libre Text dot org

As Morpheus describes, “You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. You take the red pill, you stay in Wonderland. And I show you how deep the rabbit hole goes.” Neo takes the red pill and wakes up in the real world. Source: Britannica Online: Red Pill and Blue Pill Symbolism

The simulation hypothesis is a philosophical theory in which the entire universe and our objective reality are just simulated constructs. Despite the lack of evidence, this idea is gaining traction in scientific circles as well as in the entertainment industry. Recent scientific developments in the field of information physics, such as the publication of the mass-energy-information equivalence principle, appear to support this possibility. In particular, the 2022 discovery of the second law of information dynamics (infodynamics) facilitates new and interesting research tools at the intersection between physics and information. In this article, we re-examine the second law of infodynamics and its applicability to digital information, genetic information, atomic physics, mathematical symmetries, and cosmology, and we provide scientific evidence that appears to underpin the simulated universe hypothesis.

Introduction

In 2022, a new fundamental law of physics was proposed and demonstrated, called the second law of information dynamics, or simply the second law of infodynamics.1 Its name is an analogy to the second law of thermodynamics, which requires the physical entropy of an isolated system to remain constant or to increase over time. In contrast, the second law of infodynamics states that the information entropy of systems containing information states must remain constant or decrease over time, reaching a certain minimum value at equilibrium. This surprising observation has massive implications for all branches of science and technology. With the ever-increasing importance of information systems such as digital information storage or biological information stored in DNA/RNA genetic sequences, this powerful new physics law offers an additional tool for examining these systems and their time evolution.2
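The quantity at stake here is the Shannon information entropy of the stored states. A minimal sketch of computing it, my own illustration rather than the paper’s method, with invented bit strings:

```python
# Shannon information entropy of a sequence of stored states.
# Per the second law of infodynamics as stated above, this quantity
# should remain constant or decrease over time. Data are invented.
from collections import Counter
from math import log2

def shannon_entropy(states):
    """Shannon entropy, in bits per symbol, of a sequence of states."""
    n = len(states)
    return -sum((c / n) * log2(c / n) for c in Counter(states).values())

fresh = "01101001"        # freshly written data: evenly mixed bits
decayed = "00001000"      # hypothetical degraded state: mostly zeros

h_fresh = shannon_entropy(fresh)      # 1.0 bit per symbol
h_decayed = shannon_entropy(decayed)  # lower entropy after decay
```

An eight-bit string with four of each symbol gives exactly 1 bit per symbol; the skewed string gives about 0.54, illustrating the entropy decrease the law describes.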

The second law of infodynamics and its implications for the simulated universe hypothesis, Melvin M. Vopson, AIP Advances

The Checkbook of Space Travel…

An illustration of NASA’s Orion spacecraft in orbit around the moon. (Image credit: Lockheed Martin)

Topics: Astronautics, History, NASA, Space Exploration, Spaceflight

Between 1969 and 1972, the Apollo missions sent a total of a dozen astronauts to the surface of the moon — and that was before the explosion of modern technology. So why does it seem like our current efforts, as embodied by NASA’s Artemis program, are so slow, halting and complex? 

There isn’t one easy answer, but it comes down to money, politics and priorities.

Let’s start with the money. Yes, the Apollo missions were enormously successful — and enormously expensive. At its peak, NASA was consuming around 5% of the entire federal budget, and more than half of that was devoted to the Apollo program. Accounting for inflation, the entire Apollo program would cost over $260 billion in today’s dollars. If you include project Gemini and the robotic lunar program, which were necessary precursors to Apollo, that figure reaches over $280 billion.

In comparison, today NASA commands less than half a percent of the total federal budget, with a much broader range of priorities and directives. Over the past decade, NASA has spent roughly $90 billion on the Artemis program. Naturally, with less money going to a new moon landing, we’re likely to make slower progress, even with advancements in technology.
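A back-of-envelope comparison of the inflation-adjusted figures quoted above (every dollar amount comes from the article text):

```python
# Spending figures from the article, all in today's dollars.
apollo = 260e9            # Apollo program, inflation-adjusted
apollo_plus = 280e9       # including Gemini and the robotic lunar program
artemis = 90e9            # Artemis spending over the past decade

# Apollo outspent a decade of Artemis roughly three to one.
ratio = apollo / artemis
```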

Why is it so hard to send humans back to the moon? Paul Sutter, Space.com.

Spectral Molecule…

Scientists detected 2-Methoxyethanol in space for the first time using radio telescope observations of the star-forming region NGC 6334I. Credit: Massachusetts Institute of Technology

Topics: Astronomy, Chemistry, Instrumentation, Interstellar, Research, Spectrographic Analysis

New research from the group of MIT Professor Brett McGuire has revealed the presence of a previously unknown molecule in space. The team’s open-access paper, “Rotational Spectrum and First Interstellar Detection of 2-Methoxyethanol Using ALMA Observations of NGC 6334I,” was published in the April 12 issue of The Astrophysical Journal Letters.

Zachary T.P. Fried, a graduate student in the McGuire group and the lead author of the publication, worked to assemble a puzzle composed of pieces collected from across the globe, extending beyond MIT to France, Florida, Virginia, and Copenhagen, to achieve this exciting discovery.

“Our group tries to understand what molecules are present in regions of space where stars and solar systems will eventually take shape,” explains Fried. “This allows us to piece together how chemistry evolves alongside the process of star and planet formation. We do this by looking at the rotational spectra of molecules, the unique patterns of light they give off as they tumble end-over-end in space.

“These patterns are fingerprints (barcodes) for molecules. To detect new molecules in space, we first must have an idea of what molecule we want to look for, then we can record its spectrum in the lab here on Earth, and then finally we look for that spectrum in space using telescopes.”
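The “barcode” matching Fried describes can be caricatured in a few lines. The function name, the tolerance, and all the frequencies below are invented for illustration; real detections involve far more careful spectral modeling:

```python
# Toy illustration of the barcode-matching idea described above:
# compare a laboratory line list against observed spectral features,
# keeping lab lines that coincide within a tolerance. All numbers
# are hypothetical.
def match_lines(lab_lines_mhz, observed_mhz, tol_mhz=0.5):
    """Return lab lines that coincide with an observed feature."""
    return [f for f in lab_lines_mhz
            if any(abs(f - o) <= tol_mhz for o in observed_mhz)]

lab = [101202.3, 130056.8, 150973.1]   # hypothetical lab spectrum
obs = [101202.1, 142000.0, 150973.4]   # hypothetical sky features
hits = match_lines(lab, obs)           # two of three lab lines match
```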

Researchers detect a new molecule in space, Danielle Randall Doughty, Massachusetts Institute of Technology, Phys.org.

Goldene…

Researchers have synthesized sheets of gold that are one atom thick. Credit: imaginima/Getty

Topics: Graphene, Materials Science, Nanoengineering, Nanomaterials, Solid-State Physics

It is the world’s thinnest gold leaf: a gossamer sheet of gold just one atom thick. Researchers have synthesized1 the long-sought material, known as goldene, which is expected to capture light in ways that could be useful in applications such as sensing and catalysis.

Goldene is a gilded cousin of graphene, the iconic atom-thin material made of carbon that was discovered in 2004. Since then, scientists have identified hundreds more of these 2D materials. But it has been particularly difficult to produce 2D sheets of metals, because their atoms tend to cluster together into nanoparticles instead.

Researchers have previously reported single-atom-thick layers of tin2 and lead3 stuck to various substances, and they have produced gold sheets sandwiched between other materials. But “we submit that goldene is the first free-standing 2D metal, to the best of our knowledge”, says materials scientist Lars Hultman at Linköping University in Sweden, who is part of the team behind the new research. Crucially, the simple chemical method used to make goldene should be amenable to larger-scale production, the researchers reported in Nature Synthesis on 16 April1.

“I’m very excited about it,” says Stephanie Reich, a solid-state physicist and materials scientist at the Free University of Berlin, who was not involved in the work. “People have been thinking for quite some time how to take traditional metals and make them into really well-ordered 2D monolayers.”

In 2022, researchers at New York University Abu Dhabi (NYUAD) said that they had produced goldene, but the Linköping team contends that the prior material probably contained multiple atomic layers, on the basis of the electron microscopy images and other data that were published in ACS Applied Materials and Interfaces4. Reich agrees that the 2022 study failed to prove that the material was single-layer goldene. The principal authors of the NYUAD study did not respond to Nature’s questions about their work.

Meet ‘goldene’: this gilded cousin of graphene is also one atom thick, Mark Peplow, Nature

Swift Particles and Dark Matter…

Source: Same source for the Dark Matter definition below.

Topics: Astronomy, Astrophysics, Cosmology, Dark Matter, Einstein, General Relativity

Notes: Your “secret decoder ring” for reading the Abstract.

Dark matter: Makes up about 85% of the universe, is invisible, and doesn’t interact with matter except for gravitational effects. See: Center for Astrophysics, Harvard

“Tachyonic”: Of, or referring to, tachyons (from the Greek for “swift”), hypothetical particles that travel faster than light, and backwards in time. Their rest mass, m0, is assumed to be imaginary. As a tachyon loses energy, it speeds up, becoming arbitrarily fast as its energy approaches zero, so you can see why it’s a favorite science fiction trope, along with dark matter: literal tabula rasas.

ΛCDM assumes that the universe is composed of photons, neutrinos, ordinary matter (baryons, electrons) and cold (non-relativistic) dark matter, which only interacts gravitationally, plus “dark energy,” which is responsible for the observed acceleration in the Hubble expansion. Source: Goddard Spaceflight Center: Lambda

H0 denotes the Hubble constant, the rate at which the universe is expanding, determined by Hubble in the way-back year of 1929 to be 500 km/s/Mpc (modern measurements put it near 70). I’m going to defer to Wikipedia for this one.

km/s/Mpc = kilometers/second/megaparsec. A megaparsec is 1 million parsecs = 3,260,000 light-years, or 3.26 × 10^6 light-years.

t0 = the present age of the universe. In a matter-dominated universe, t0 = 2tH/3, where tH = 1/H0 is the Hubble time. Observationally, t0 is roughly 13.7 × 10^9 years, or 4.32 × 10^17 seconds.

Gyr = gigayears; 1 billion years = 1 × 10^9 years (a lot).
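The decoder-ring arithmetic can be run directly. A sketch, using standard unit constants; the H0 values tried are the two fits quoted in the abstract that follows. Note that the simple matter-dominated formula t0 = 2tH/3 gives roughly 9.8 Gyr here, not the paper’s ~8.3 Gyr, because the tachyonic model’s expansion history is not purely matter-dominated:

```python
# Hubble time tH = 1/H0 and the matter-dominated age t0 = 2*tH/3,
# with H0 given in km/s/Mpc. Unit constants are standard.
KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16      # seconds in one gigayear (10^9 yr)

def hubble_time_gyr(h0):
    """Hubble time tH = 1/H0, in Gyr, for H0 in km/s/Mpc."""
    h0_per_sec = h0 / KM_PER_MPC        # H0 converted to 1/s
    return 1.0 / (h0_per_sec * SEC_PER_GYR)

def matter_dominated_age_gyr(h0):
    """Age t0 = 2*tH/3 for a flat, matter-dominated universe."""
    return 2.0 * hubble_time_gyr(h0) / 3.0

for h0 in (66.6, 69.6):                 # fitted H0 values from the abstract
    print(h0, round(hubble_time_gyr(h0), 2),
          round(matter_dominated_age_gyr(h0), 2))
```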

Abstract

An open or hyperbolic Friedmann-Robertson-Walker spacetime dominated by tachyonic dark matter can exhibit an “inflected” expansion—initially decelerating, later accelerating—similar but not identical to that of now-standard ΛCDM models dominated by dark energy. The features of the tachyonic model can be extracted by fitting the redshift-distance relation of the model to data obtained by treating Type Ia supernovae as standard candles. Here such a model is fitted to samples of 186 and 1048 Type Ia supernovae from the literature. The fits yield values of H0 = (66.6±1.5) km/s/Mpc and H0 = (69.6±0.4) km/s/Mpc, respectively, for the current-time Hubble parameter, and t0 = (8.35 ± 0.68) Gyr and t0 = (8.15 ± 0.36) Gyr, respectively, for the comoving-time age of the Universe. Tests of the model against other observations will be undertaken in subsequent works.

Subject headings: cosmology, dark matter, tachyons, distance-redshift relation, supernovae

Testing Tachyon-Dominated Cosmology with Type Ia Supernovae, Samuel H. Kramer, Ian H. Redmount, Physics arXiv

Esse Quam Videri…

Credit: Menno Schaefer/Adobe

Starlings flock in a so-called murmuration, a collective behavior of interest in biological physics — one of many subfields that did not always “belong” in physics.

Topics: Applied Physics, Cosmology, Einstein, History, Physics, Research, Science

“To be rather than to seem.” Translated from the Latin Esse Quam Videri, which also happens to be the state motto of North Carolina. It is from the treatise on Friendship by the Roman statesman Cicero, a reminder of the beauty and power of being true to oneself. Source: National Library of Medicine: Neurosurgery

If you’ve been in physics long enough, you’ve probably left a colloquium or seminar and thought to yourself, “That talk was interesting, but it wasn’t physics.”

If so, you’re one of many physicists who muse about the boundaries of their field, perhaps with colleagues over lunch. Usually, it’s all in good fun.

But what if the issue comes up when a physics faculty makes decisions about hiring or promoting individuals to build, expand, or even dismantle a research effort? The boundaries of a discipline bear directly on the opportunities departments can offer students. They also influence those students’ evolving identities as physicists, and how they think about their own professional futures and the future of physics.

So, these debates — over physics and “not physics” — are important. But they are also not new. For more than a century, physicists have been drawing and redrawing the borders around the field, embracing and rejecting subfields along the way.

A key moment for “not physics” occurred in 1899 at the second-ever meeting of the American Physical Society. In his keynote address, the APS president Henry Rowland exhorted his colleagues to “cultivate the idea of the dignity” of physics.

“Much of the intellect of the country is still wasted in the pursuit of so-called practical science which ministers to our physical needs,” he scolded, “[and] not to investigations in the pure ethereal physics which our Society is formed to cultivate.”

Rowland’s elitism was not unique — a fact that first-rate physicists working at industrial laboratories discovered at APS meetings, when no one showed interest in the results of their research on optics, acoustics, and polymer science. It should come as no surprise that, between 1915 and 1930, physicists were among the leading organizers of the Optical Society of America (now Optica), the Acoustical Society of America, and the Society of Rheology.

That acousticians were given a cold shoulder at early APS meetings is particularly odd. At the time, acoustics research was not uncommon in American physics departments. Harvard University, for example, employed five professors who worked extensively in acoustics between 1919 and 1950. World War II motivated the U.S. Navy to sponsor a great deal of acoustics research, and many physics departments responded quickly. In 1948, the University of Texas hired three acousticians as assistant professors of physics. Brown University hired six physicists between 1942 and 1952, creating an acoustics powerhouse that ultimately trained 62 physics doctoral students.

The acoustics landscape at Harvard changed abruptly in 1946, when all teaching and research in the subject moved from the physics department to the newly created department of engineering sciences and applied physics. In the years after, almost all Ph.D. acoustics programs in the country migrated from physics departments to “not physics” departments.

The reason for this was explained by Cornell University professor Robert Fehr at a 1964 conference on acoustics education. Fehr pointed out that engineers like himself exploited the fundamental knowledge of acoustics learned from physicists to alter the environment for specific applications. Consequently, it made sense that research and teaching in acoustics passed from physics to engineering.

It took less than two decades for acoustics to go from being physics to “not physics.” But other fields have gone the opposite direction — a prime example being cosmology.

Albert Einstein applied his theory of general relativity to the cosmos in 1917. However, his work generated little interest because there was no empirical data to which it applied. Edwin Hubble’s work on extragalactic nebulae appeared in 1929, but for decades, there was little else to constrain mathematical speculations about the physical nature of the universe. The theoretical physicists Freeman Dyson and Steven Weinberg have both used the phrase “not respectable” to describe how cosmology was seen by physicists around 1960. The subject was simply “not physics.”

This began to change in 1965 with the discovery of thermal microwave radiation throughout the cosmos — empirical evidence of the nearly 20-year-old Big Bang model. Physicists began to engage with cosmology, and the percentage of U.S. physics departments with at least one professor who published in the field rose from 4% in 1964 to 15% in 1980. In the 1980s, physicists led the satellite mission to study the cosmic microwave radiation, and particle physicists — realizing that the hot early universe was an ideal laboratory to test their theories — became part-time cosmologists. Today, it’s hard to find a medium-to-large sized physics department that does not list cosmology as a research specialty.

Opinion: That’s Not Physics, Andrew Zangwill, APS