Science

20-years-after-katrina,-new-orleans-remembers

20 years after Katrina, New Orleans remembers


20 years ago, Ivor Van Heerden warned of impending disaster in New Orleans. Are his warnings still going unheeded?

A man is stranded on a rooftop in the aftermath of Hurricane Katrina in 2005. Credit: Wickes Helmboldt

Next month marks the 20th anniversary of one of the most devastating natural disasters in US history: Hurricane Katrina, a Category 3 storm that made landfall on August 29, 2005. The storm itself was bad enough, but the resulting surge of water wreaked havoc on New Orleans in particular when the city’s protective levees failed, flooding much of the city and killing 1,392 people. National Geographic is marking the occasion with a new documentary series: Hurricane Katrina: Race Against Time.

The five-part documentary is directed by Oscar nominee Traci A. Curry (Attica) and co-produced by Ryan Coogler’s Proximity Media, in conjunction with Lightbox. The intent was to go beyond the headlines of yesteryear and re-examine the many systemic failures that occurred while also revealing “stories of survival, heroism, and resilience,” Proximity’s executive producers said in a statement. “It’s a vital historical record and a call to witness, remember, and reckon with the truth of Hurricane Katrina’s legacy.”

Race Against Time doesn’t just rehash the well-worn narrative of the disaster; it centers the voices of the people who were there on the ground: residents, first responders, officials, and so forth. Among those interviewed for the documentary is geologist/marine scientist Ivor Van Heerden, author of The Storm: What Went Wrong and Why During Hurricane Katrina: The Inside Story from One Louisiana Scientist (2006).

Around 1998, Van Heerden set up Louisiana State University’s (LSU) fledgling Hurricane Center with his colleague Marc Levitan, developing the first computer modeling efforts for local storm surges. They had a supercomputer for the modeling and LiDAR data for accurate digital elevation models, and since there was no way to share data among the five major parishes, they created a networked geographical information system (GIS) to link them. Part of Van Heerden’s job involved driving all over New Orleans to inspect the levees, and he didn’t like what he saw: levees bowing and sinking under their own weight, for example, and others with large cracks.

Van Heerden also participated in the 2004 Hurricane Pam mock scenario, designed as a test run for hurricane planning for the 13 parishes of southeastern Louisiana, including New Orleans. It was essentially a worst-case scenario for the conditions of Hurricane Betsy, assuming that the whole city would be flooded. “We really had hoped that the exercise would wake everybody up, but quite honestly we were laughed at a few times during the exercise,” Van Heerden told Ars. He recalled telling one woman from FEMA that they should be thinking about using tents to house evacuees: “She said, ‘Americans don’t live in tents.'”

Stormy weather

Mayor Ray Nagin orders a mandatory evacuation of New Orleans. ABC News Videosource

The tens of thousands of stranded New Orleans residents in the devastating aftermath of Katrina could have used those tents. Van Heerden still vividly recalls his frustration over the catastrophic failures that occurred on so many levels. “We knew the levees had failed, we knew that there had been catastrophic structural failure, but nobody wanted to hear it initially,” he said. He and his team were out in the field in the immediate aftermath, measuring water levels and sampling the water for pathogens and toxic chemicals. Naturally they came across people in need of rescue and were able to radio locations to the Louisiana State University police.

“An FBI agent told me, ‘If you find any bodies, tie them with a piece of string to something so they don’t float away and give us the lats and longs,'” Van Heerden recalled. The memories haunt him still. Some of the bodies were drowned children, which he found particularly devastating since he had a young daughter of his own at the time.

How did it all go so wrong? After 1965’s Hurricane Betsy flooded most of New Orleans, the federal government started a levee building program with the US Army Corps of Engineers (USACE) in charge. “Right at the beginning, the Corps used very old science in terms of determining how high to make the levees,” said Van Heerden. “They had access to other very good data, but they chose not to use it for some reason. So they made the levees way too low.”

“They also ignored some of their own geotechnical science when designing the levees,” he continued. “Some were built in sand with very shallow footings, so the water just went underneath and blew out the levee. Some were built on piles of earth, again with very shallow footings, and they just fell over. The 17th Street Canal, the whole levee structure actually slid 200 feet.”

There had also been significant alterations to the local landscape since Hurricane Betsy. In the past, the wetlands, especially the cypress tree swamps, provided some protection from storm surges. In 1992, for example, the Category 5 Hurricane Andrew made landfall on the Atchafalaya Delta, where healthy wetlands reduced its energy by 50 percent between the coast and Morgan City, per Van Heerden. But other wetlands in the region changed drastically with the dredging of a canal called the Mississippi River-Gulf Outlet, running from New Orleans to the Gulf of Mexico.

“It was an open conduit for surge to get into New Orleans,” said Van Heerden. “The saltwater got into the wetlands and destroyed it, especially the cypress trees. This canal had opened up, in some places, to five times its width, allowing waves to build on the surface. The earthen levees weren’t armored in any way, so they just collapsed. They blew apart. That’s why parts of St. Bernard saw a wave of water 10 feet high.”

Just trying to survive

Stranded New Orleans residents gather in a shelter during Hurricane Katrina. KTVT-TV

Add in drastic cuts to FEMA under then-President George W. Bush—who inherited “a very functional, very well-organized” version of the agency from his predecessor, Bill Clinton, per Van Heerden—and the stage was set for a disaster like Katrina’s harrowing aftermath. It didn’t help that New Orleans Mayor Ray Nagin delayed issuing a mandatory evacuation order until some 24 hours before the storm hit, leaving residents little time to comply.

There were also delays in conveying the vital information that the levees had failed. “We now know that the USACE had a guy in a Coast Guard helicopter who actually witnessed the London Avenue Canal failure, at 9:06 AM on Day One,” said Van Heerden. “That guy went to Baton Rouge and he didn’t tell a soul other than the Corps. So the Corps knew very early what was going on and they did nothing about it. They had a big megaphone and millions of dollars in public relations and kept saying it was an act of God. It took until the third week of September for us to finally get the media to realize that this was a catastrophic failure of the levees.”

The USACE has never officially apologized for what happened, per Van Heerden. “Not one of them lost their job after Katrina,” he said. But LSU fired Van Heerden in 2009, sparking protests from faculty and students. The university gave no reason for his termination, but it was widely speculated at the time that Van Heerden’s outspoken criticism of the USACE was a factor, with LSU fearing it might jeopardize funding. Van Heerden sued, and the university settled. But he hasn’t worked in academia since and now consults with various nonprofit organizations on flooding and storm surge impacts.

The widespread reports of looting and civil war further exacerbated the situation as survivors swarmed the Superdome and the nearby convention center. The city had planned for food and water for 12,000 people housed at the Superdome for 48 hours. The failure of the levees swelled that number to 30,000 people stranded for several days, waiting in vain for the promised cavalry to arrive.

Van Heerden acknowledges the looting but insists most of that was simply due to people trying to survive in the absence of any other aid. “How did they get water on the interstate?” said Van Heerden. “They went to a water company, broke in and hot-wired a truck, then went around and gave water to everyone.”

As for the widespread belief outside the city that there was unchecked violence and a brewing civil war, “That doesn’t happen in a catastrophe,” he said. The rumors were driven by reports of shots being fired, but “there are a lot of hunters in Louisiana, and the hunter’s SOS is to fire three shots in rapid succession,” he said. “One way to say ‘I’m here!’ is to fire a gun. But everybody bought into that civil war nonsense.”

“Another ticking time bomb”

LSU Hurricane Center co-founder Ivor Van Heerden working at his desk in 2005. Australian Broadcasting Corporation

The levees have since been rebuilt, and Van Heerden acknowledges that some of the repairs are robust. “They used more concrete, they put in protection pads and deeper footings,” he said. “But they didn’t take into account—and they admitted this a few years ago—subsidence in Louisiana, which is two to two-and-a-half feet every century. And they didn’t take into account global climate change and the associated rising sea levels. Within the next 70 years, sea level in Louisiana is going to rise four feet over millions of square miles. If you’ve got a levee with a [protective] marsh in front of it, before too long that marsh is no longer going to exist, so the water is going to move further and further in-shore.”

Then there’s the fact that hurricanes are now bigger in diameter than they were 30 years ago, thanks to the extra heat. “They get up to a Category 5 a lot quicker,” said Van Heerden. “The frequency also seems to be creeping up. It’s now four times as likely you will experience hurricane-force winds.” Van Heerden has run storm surge models assuming a 3-foot rise in sea level. “What we saw was the levees wouldn’t be high enough in New Orleans,” he said. “I hate to say it, but it looks like another ticking time bomb. Science is a quest for the truth. You ignore the science at your folly.”

Assuming there was sufficient public and political will, how should the US be preparing for future tropical storms? “In many areas we need to retreat,” said Van Heerden. “We need to get the houses and buildings out and rebuild the natural vegetation, rebuild the wetlands. On the Gulf Coast, sea level is really going to rise, and we need to rethink our infrastructure. This belief that, ‘Oh, we’re going to put up a big wall’—in the long run it’s not going to work. The devastation from tropical storms is going to spread further inland through very rapid downpours, and that’s something we’re going to have to plan mitigations for. But I just don’t see any movement in that direction.”

Perhaps documentaries like Race Against Time can help turn the tide; Van Heerden certainly hopes so. He also hopes the documentary can correct several public misconceptions of what happened—particularly the tendency to blame the New Orleans residents trying to survive in appalling conditions, rather than the government that failed them.

“I think this is a very good documentary in showing the plight of the people and what they suffered, which was absolutely horrendous,” said Van Heerden. “I hope people watching will realize that yes, this is a piece of our history, but sometimes the past is the key to the present. And ask themselves, ‘Is this a foretaste of what’s to come?'”

Hurricane Katrina: Race Against Time premieres on July 27, 2025, on National Geographic. It will be available for streaming starting July 28, 2025, on Disney+ and Hulu.

Photo of Jennifer Ouellette

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

20 years after Katrina, New Orleans remembers Read More »

robots-eating-other-robots:-the-benefits-of-machine-metabolism

Robots eating other robots: The benefits of machine metabolism


If you define “metabolism” loosely enough, these robots may have one.

For decades, we’ve been trying to make robots smarter and more physically capable by mimicking biological intelligence and movement. “But in doing so, we’ve been just replicating the results of biological evolution—I say we need to replicate its methods,” argues Philippe Wyder, a developmental robotics researcher at Columbia University. Wyder led a team that demonstrated a machine with a rudimentary form of what they’re calling a metabolism.

He and his colleagues built a robot that could consume other robots to physically grow, become stronger and more capable, and continue functioning.

Nature’s methods

The idea of robotic metabolism combines various concepts in AI and robotics. The first is artificial life, which Wyder describes as “a field where people study the evolution of organisms through computer simulations.” Then there is the idea of modular robots: reconfigurable machines that can change their architecture by rearranging collections of basic modules, an approach pioneered in the US by researchers including Daniela Rus and Mark Yim in the 1990s.

Finally, there is the idea that we need to shift from the goal-oriented design we’ve traditionally implemented in our machines to the survivability-oriented design found in living organisms, a change Magnus Egerstedt proposed in his book Robot Ecology.

Wyder’s team took all these ideas, merged them, and prototyped a robot that could “eat” other robots. “I kind of came at this from many different angles,” Wyder says.

The key source of inspiration, though, was the way nature builds its organisms. There are 20 standard amino acids universally used by life, and they can be combined into trillions of proteins, forming the building blocks of countless life forms. Wyder started his project by designing a basic robotic module intended to play a role roughly equivalent to that of a single amino acid. This module, called a Truss Link, looks like a rod: 16 centimeters long and containing batteries, electronic controllers, and servomotors that let it expand, contract, and crawl in a straight line. Each rod has permanent magnets at both ends, which let it connect to other rods and form lightweight lattices.

Wyder’s idea was to throw a number of these modules into a confined space to see if they would assemble into more complex structures by bumping into each other. The process might be analogous to how simple organic molecules like amino acids spontaneously assembled into more complex structures roughly 4 billion years ago.

Robotic growth

The first stage of Wyder’s experiment was set up in a space with a few terrain features, like a drop, a few obstacles, and a standing cylinder. The robots were operated by the team, which directed them to form various structures. Three Truss Links connected by their magnets at one center point formed a three-pointed star. Other structures they formed included a triangle; a diamond with a tail, which was a triangle connected to a three-pointed star; and a tetrahedron, a 3D structure that looks like a triangular pyramid. The robots had to find other Truss Links and make them part of their bodies to grow into more complex forms.

As they were growing, they were also becoming more capable. A single Truss Link could only move in a straight line, a triangle could turn left and right, a diamond with a tail could traverse small bumps, while a tetrahedron could move itself over small walls. Finally, a tetrahedron with a ratchet—an additional Truss Link the robot could use a bit like a walking stick—could assist other robots in forming tetrahedrons, which was a difficult, risky maneuver that took multiple attempts even for the skilled operators.

Still, all this growth in size and capability was orchestrated by the researchers controlling the hardware. The question was whether these self-assembly processes could work with no human overlords around.

“We wanted to know if the Truss Links would meet on their own,” Wyder says. “If the Truss Links are exactly parallel, they will never connect. But being parallel is just one configuration, and there are infinite configurations where they are not parallel.” To check how this would play out, the team used computer simulations of six randomly spawned and randomly moving Truss Links in a walled environment. In 2,000 runs, each 20 minutes long, the modules had a 64 percent chance of forming two three-pointed star shapes, a roughly 8.4 percent chance of assembling into two triangles, and a nearly 45 percent chance of ending up as a diamond with a tail. (Some of these configurations were intermediates on the pathway to others, so the numbers add up to more than 100 percent.)
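To get a feel for how numbers like these are estimated, here is a deliberately simplified Monte Carlo sketch, not the team’s actual physics simulator: it scatters six rod-like modules in a walled square, lets each crawl randomly along its own axis, and counts how often endpoints drift within an assumed “magnet capture” distance. The arena size, capture radius, and step parameters are illustrative assumptions, and the toy only tallies connection events rather than classifying the resulting shapes (stars, triangles, diamonds with tails), which would require analyzing the connection graph.

```python
# Toy Monte Carlo sketch (not the team's simulator): scatter six rod-like
# modules in a walled square, let each crawl randomly along its own axis,
# and count how often rod endpoints drift within "magnet capture" range.
# Arena size, capture radius, and step sizes below are assumptions chosen
# only to illustrate how repeated random runs yield connection statistics.
import math
import random

ROD_LEN = 0.16      # 16 cm Truss Link length (from the article)
ARENA = 1.5         # assumed side of the walled arena, meters
CAPTURE = 0.02      # assumed magnetic capture distance, meters
STEPS = 1200        # assumed number of crawl steps per run
STEP = 0.005        # assumed distance per crawl step, meters
N_RODS, N_RUNS = 6, 100

def spawn():
    """Random center position and heading inside the arena."""
    return [random.uniform(ROD_LEN, ARENA - ROD_LEN),
            random.uniform(ROD_LEN, ARENA - ROD_LEN),
            random.uniform(0, 2 * math.pi)]

def endpoints(rod):
    x, y, th = rod
    dx, dy = 0.5 * ROD_LEN * math.cos(th), 0.5 * ROD_LEN * math.sin(th)
    return (x - dx, y - dy), (x + dx, y + dy)

def run_once():
    rods = [spawn() for _ in range(N_RODS)]
    links = set()                        # rod pairs that ever came within capture range
    for step in range(STEPS):
        for rod in rods:
            d = random.choice((-1, 1))   # crawl forward or backward along the rod's axis
            rod[0] += d * STEP * math.cos(rod[2])
            rod[1] += d * STEP * math.sin(rod[2])
            # keep the rod inside the walls; pick a new heading if it hits one
            if not (ROD_LEN <= rod[0] <= ARENA - ROD_LEN and
                    ROD_LEN <= rod[1] <= ARENA - ROD_LEN):
                rod[0] = min(max(rod[0], ROD_LEN), ARENA - ROD_LEN)
                rod[1] = min(max(rod[1], ROD_LEN), ARENA - ROD_LEN)
                rod[2] = random.uniform(0, 2 * math.pi)
        if step % 20 == 0:               # periodically check for near-touching endpoints
            for i in range(N_RODS):
                for j in range(i + 1, N_RODS):
                    if any(math.dist(a, b) < CAPTURE
                           for a in endpoints(rods[i])
                           for b in endpoints(rods[j])):
                        links.add((i, j))
    return len(links)

if __name__ == "__main__":
    results = [run_once() for _ in range(N_RUNS)]
    frac = sum(1 for n in results if n >= 2) / N_RUNS
    print(f"Runs where at least two rod pairs met: {frac:.0%}")
```

The point of the sketch is the statistical workflow: many randomized runs of the same setup, with the fraction of runs producing a given outcome serving as the probability estimate, just as the paper reports percentages over its 2,000 simulated trials.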

When moving randomly, Truss Links could also repair structures after their magnets got disconnected and even replace a malfunctioning Truss Link in the structure with a new one. But did they really metabolize anything?

Searching for purpose

The name “metabolism” comes from the Greek word “metabolē” which means “change.” Wyder’s robots can assemble, grow, reconfigure, rebuild, and, to a limited extent, sustain themselves, which definitely qualifies as change.

But metabolism, as it’s commonly understood, involves consuming materials in ways that extract energy and transform their chemicals. The Truss Links are limited to using prefabricated, compatible modules—they can’t consume some plastic and old lithium-ion batteries and metabolize them into brand-new Truss Links. Whether this qualifies as metabolism depends more on how far we want to stretch the definition than on what the actual robots can do.

And stretching definitions, so far, may be their strongest use case. “I can’t give you a real-world use case,” Wyder acknowledges. “We tried to make the truss robots carry loads from one point to another, but it’s not even included in our paper—it’s a research platform at this point.” The first thing he thinks the robotic metabolism platform is missing is a wider variety of modules. The team used homogeneous modules in this work but is already thinking about branching out. “Life uses around 20 different amino acids to work, so we’re currently focusing on integrating additional modules with various sensors,” Wyder explains. But the robots are also lacking something way more fundamental: a purpose.

Life evolves to improve the chances of survival. It does so in response to pressures like predators or a challenging environment. A living thing is usually doing its best to avoid dying.

Egerstedt, in Robot Ecology, argues that we should build and program robots the same way, with “survivability constraints” in mind. Wyder, in his paper, also claims we need to develop a “self-sustained robot ecology” in the future. But he also thinks we shouldn’t take this life analogy too far. His goal is not creating a robotic ecosystem where robots would hunt and feed on other robots, constantly improving their own designs.

“We would give robots a purpose. Let’s say a purpose is to build a lunar colony,” Wyder says. Survival should be the first objective, because if the platform doesn’t survive on the Moon, it won’t build a lunar colony. Multiple small units would first disperse to explore the area and then assemble into a bigger structure like a building or a crane. “And this large structure would absorb, recycle, or eat, if you will, all these smaller robots to integrate and make use of them,” Wyder claims.

A robotic platform like this, Wyder thinks, should adapt to unexpected circumstances even better than life itself. “There may be a moment where having a third arm would really save your life, but you can’t grow one. A robot, given enough time, won’t have that problem,” he says.

Science Advances, 2025. DOI: 10.1126/sciadv.adu6897

Photo of Jacek Krywko

Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.

Robots eating other robots: The benefits of machine metabolism Read More »

this-aerogel-and-some-sun-could-make-saltwater-drinkable

This aerogel and some sun could make saltwater drinkable

About 71 percent of Earth’s surface is covered by water. An overwhelming 97 percent of that water is found in the oceans, leaving us with only 3 percent in the form of freshwater—and much of that is frozen in glaciers. That leaves just 0.3 percent of that freshwater on the surface in lakes, swamps, springs, and our main sources of drinking water: rivers and streams.
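Taken at face value, those rounded percentages make the scarcity easy to quantify; here is a quick back-of-the-envelope calculation using only the figures quoted above.

```python
# Back-of-the-envelope arithmetic using the article's rounded figures:
# what fraction of all water on Earth sits in lakes, rivers, and streams?
fresh_fraction = 0.03        # ~3 percent of Earth's water is freshwater
surface_fraction = 0.003     # ~0.3 percent of that freshwater is on the surface
surface_share_of_all = fresh_fraction * surface_fraction
print(f"Surface freshwater: about {surface_share_of_all:.3%} of all water on Earth")
# prints roughly 0.009%, i.e., less than one ten-thousandth of the planet's water
```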

Despite our planet’s famously blue appearance from space, thirsty aliens would be disappointed. Drinkable water is actually pretty scarce.

As if that doesn’t already sound unsettling, what little water we have is also threatened by climate change, urbanization, pollution, and a global population that continues to expand. Over 2 billion people live in regions where their only source of drinking water is contaminated. Pathogenic microbes in the water can cause cholera, diarrhea, dysentery, polio, and typhoid, which could be fatal in areas without access to vaccines or medical treatment.

Desalination of seawater is a possible solution, and one approach involves porous materials absorbing water that evaporates when heated by solar energy. The problem with most existing solar-powered evaporators is that they are difficult to scale up for larger populations. Performance decreases with size, because less water vapor can escape from materials with tiny pores and thick boundaries—but there is a way to overcome this.

Feeling salty

Researcher Xi Shen of the Hong Kong Polytechnic University wanted to figure out a way to improve these types of systems. He and his team have now created an aerogel that is far more efficient at producing fresh water than previous desalination approaches.

“The key factors determining the evaporation performance of porous evaporators include heat localization, water transport, and vapor transport,” Shen said in a study recently published in ACS Energy Letters. “Significant advancements have been made in the structural design of evaporators to realize highly efficient thermal localization and water transport.”

Solar radiation is the only energy used to evaporate the water, which is why many attempts have been made to develop what are called photothermal materials. When sunlight hits these materials, they absorb the light and convert it into heat, which can be used to speed up evaporation. Photothermal materials can be made of substances including polymers, metals, alloys, ceramics, or cements. Hydrogels have been used to successfully decontaminate and desalinate water before, but they are polymers designed to retain water, which hurts their efficiency and stability. Aerogels, by contrast, are made of polymers that hold air, which is why Shen and his team decided to create a photothermal aerogel.

This aerogel and some sun could make saltwater drinkable Read More »

widely-panned-arsenic-life-paper-gets-retracted—15-years-after-brouhaha

Widely panned arsenic life paper gets retracted—15 years after brouhaha

In all, the astronomic hype was met with earth-shaking backlash in 2010 and 2011. In 2012, Science published two studies refuting the claim that GFAJ-1 incorporates arsenic atoms into its DNA. Outside scientists concluded that it is an arsenic-tolerant extremophile, but not a profoundly different life form.

Retraction

But now, in 2025, it is once again spurring controversy; on Thursday, Science announced that it is retracting the study.

Some critics, such as Redfield, cheered the move. Others questioned the timing, noting that 15 years had passed, but only a few months had gone by since The New York Times published a profile of Wolfe-Simon, who is now returning to science after being perceived as a pariah. Wolfe-Simon and most of her co-authors, meanwhile, continue to defend the original paper and protest the retraction.

In a blog post on Thursday, Science’s executive editor, Valda Vinson, and Editor-in-Chief Holden Thorp explained the retraction by saying that Science’s criteria for issuing a retraction have evolved since 2010. At the time, it was reserved for claims of misconduct or fraud but now can include serious flaws. Specifically, Vinson and Thorp referenced the criticism that the bacterium’s genetic material was not properly purified of background arsenic before it was analyzed. While emphasizing that there has been no suggestion of fraud or misconduct on the part of the authors, they wrote that “Science believes that the key conclusion of the paper is based on flawed data,” and it should therefore be retracted.

Jonathan Eisen, an evolutionary biologist at the University of California, Davis, criticized the move. Speaking with Science’s news team, which is independent from the journal’s research-publishing arm, Eisen said that despite being a critic of the 2010 paper, he thought the discussion of controversial studies should play out in the scientific literature and not rely on subjective decisions by editors.

In an eLetter attached to the retraction notice, the authors dispute the retraction, too, saying, “While our work could have been written and discussed more carefully, we stand by the data as reported. These data were peer-reviewed, openly debated in the literature, and stimulated productive research.”

One of the co-authors, Ariel Anbar, a geochemist at Arizona State University, told Nature that the study had no mistakes but that the data could be interpreted in different ways. “You don’t retract because of a dispute about data interpretation,” he said. If that were the case, “you’d have to retract half the literature.”

Widely panned arsenic life paper gets retracted—15 years after brouhaha Read More »

rocket-report:-channeling-the-future-at-wallops;-spacex-recovers-rocket-wreckage

Rocket Report: Channeling the future at Wallops; SpaceX recovers rocket wreckage


China’s Space Pioneer seems to be back on track a year after an accidental launch.

A SpaceX Falcon 9 rocket carrying a payload of 24 Starlink Internet satellites soars into space after launching from Vandenberg Space Force Base, California, shortly after sunset on July 18, 2025. This image was taken in Santee, California, approximately 250 miles (400 kilometers) away from the launch site. Credit: Kevin Carter/Getty Images

Welcome to Edition 8.04 of the Rocket Report! The Pentagon’s Golden Dome missile defense shield will be a lot of things. Along with new sensors, command and control systems, and satellites, Golden Dome will require a lot of rockets. The pieces of the Golden Dome architecture operating in orbit will ride to space on commercial launch vehicles. And Golden Dome’s space-based interceptors will essentially be designed as flying fuel tanks with rocket engines. This shouldn’t be overlooked, and that’s why we include a couple of entries discussing Golden Dome in this week’s Rocket Report.

As always, we welcome reader submissions. If you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets, as well as a quick look ahead at the next three launches on the calendar.

Space-based interceptors are a real challenge. The newly installed head of the Pentagon’s Golden Dome missile defense shield knows the clock is ticking to show President Donald Trump some results before the end of his term in the White House, Ars reports. Gen. Michael Guetlein identified command-and-control and the development of space-based interceptors as two of the most pressing technical challenges for Golden Dome. He believes the command-and-control problem can be “overcome in pretty short order.” The space-based interceptor piece of the architecture is a different story.

Proven physics, unproven economics … “I think the real technical challenge will be building the space-based interceptor,” Guetlein said. “That technology exists. I believe we have proven every element of the physics that we can make it work. What we have not proven is, first, can I do it economically, and then second, can I do it at scale? Can I build enough satellites to get after the threat? Can I expand the industrial base fast enough to build those satellites? Do I have enough raw materials, etc.?” Military officials haven’t said how many space-based interceptors will be required for Golden Dome, but outside estimates put the number in the thousands.

The easiest way to keep up with Eric Berger’s and Stephen Clark’s reporting on all things space is to sign up for our newsletter. We’ll collect their stories and deliver them straight to your inbox.


One big defense prime is posturing for Golden Dome. Northrop Grumman is conducting ground-based testing related to space-based interceptors as part of a competition for that segment of the Trump administration’s Golden Dome missile-defense initiative, The War Zone reports. Kathy Warden, Northrop Grumman’s CEO, highlighted the company’s work on space-based interceptors, as well as broader business opportunities stemming from Golden Dome, during a quarterly earnings call this week. Warden identified Northrop’s work in radars, drones, and command-and-control systems as potentially applicable to Golden Dome.

But here’s the real news … “It will also include new innovation, like space-based interceptors, which we’re testing now,” Warden continued. “These are ground-based tests today, and we are in competition, obviously, so not a lot of detail that I can provide here.” Warden declined to respond directly to a question about how the space-based interceptors Northrop Grumman is developing now will actually defeat their targets. (submitted by Biokleen)

Trump may slash environmental rules for rocket launches. The Trump administration is considering slashing rules meant to protect the environment and the public during commercial rocket launches, changes that companies like Elon Musk’s SpaceX have long sought, ProPublica reports. A draft executive order being circulated among federal agencies, and viewed by ProPublica, directs Secretary of Transportation Sean Duffy to “use all available authorities to eliminate or expedite” environmental reviews for launch licenses. It could also, in time, require states to allow more launches or even more launch sites along their coastlines.

Getting political at the FAA … The order is a step toward the rollback of federal oversight that Musk, who has fought bitterly with the Federal Aviation Administration over his space operations, and others have pushed for. Commercial rocket launches have grown exponentially more frequent in recent years. In addition to slashing environmental rules, the draft executive order would make the head of the FAA’s Office of Commercial Space Transportation a political appointee. This is currently a civil servant position, but the last head of the office took a voluntary separation offer earlier this year.

There’s a SPAC for that. An unproven small launch startup is partnering with a severely depleted SPAC trust to do the impossible: go public in a deal they say will be valued at $400 million, TechCrunch reports. Innovative Rocket Technologies Inc., or iRocket, is set to merge with a Special Purpose Acquisition Company, or SPAC, founded by former Commerce Secretary Wilbur Ross. But the most recent regulatory filings by this SPAC showed it was in a tenuous financial position last year, with just $1.6 million held in trust. Likewise, iRocket isn’t flooded with cash. The company has raised only a few million in venture funding, a fraction of what would be needed to develop and test the company’s small orbital-class rocket, named Shockwave.

SpaceX traces a path to orbit for NASA. Two NASA satellites soared into orbit from California aboard a SpaceX Falcon 9 rocket Wednesday, commencing a $170 million mission to study a phenomenon of space physics that has eluded researchers since the dawn of the Space Age, Ars reports. The twin spacecraft are part of the NASA-funded TRACERS mission, which will spend at least a year measuring plasma conditions in narrow regions of Earth’s magnetic field known as polar cusps. As the name suggests, these regions are located over the poles. They play an important but poorly understood role in creating colorful auroras as plasma streaming out from the Sun interacts with the magnetic field surrounding Earth. The same process drives geomagnetic storms capable of disrupting GPS navigation, radio communications, electrical grids, and satellite operations.

Plenty of room for more … The TRACERS satellites are relatively small, each about the size of a washing machine, so they filled only a fraction of the capacity of SpaceX’s Falcon 9 rocket. Three other small NASA tech demo payloads hitched a ride to orbit with TRACERS, kicking off missions to test an experimental communications terminal, demonstrate an innovative scalable satellite platform made of individual building blocks, and study the link between Earth’s atmosphere and the Van Allen radiation belts. In addition to those missions, the European Space Agency launched its own CubeSat to test 5G communications from orbit. Five smallsats from an Australian company rounded out the group. Still, the Falcon 9 rocket’s payload shroud was filled with less than a quarter of the payload mass it could have delivered to the TRACERS mission’s targeted Sun-synchronous orbit.

Tianlong launch pad ready for action. Chinese startup Space Pioneer has completed a launch pad at Jiuquan spaceport in northwestern China for its Tianlong 3 liquid-propellant rocket ahead of a first orbital launch, Space News reports. Space Pioneer said the launch pad passed an acceptance test, and ground crews raised a full-scale model of the Tianlong 3 rocket on the launch pad. “The rehearsal test was successfully completed,” said Space Pioneer, one of China’s leading private launch companies. The activation of the launch pad followed a couple of weeks after Space Pioneer announced the completion of static loads testing on Tianlong 3.

More to come … While this is an important step forward for Space Pioneer, construction of the launch pad is just one element the company needs to finish before Tianlong 3 can lift off for the first time. In June 2024, the company ignited Tianlong 3’s nine-engine first stage on a test stand in China. But the rocket broke free of its moorings on the test stand and unexpectedly climbed into the sky before crashing in a fireball nearby. Space Pioneer says the “weak design of the rocket’s tail structure was the direct cause of the failure” last year. The company hasn’t identified next steps for Tianlong 3, or when it might be ready to fly. Tianlong 3 is a kerosene-fueled rocket with nine main engines, similar in design architecture and payload capacity to SpaceX’s Falcon 9. Also, like Falcon 9, Tianlong 3 is supposed to have a recoverable and reusable first stage booster.

Dredging up an issue at Wallops. Rocket Lab has asked regulators for permission to transport oversized Neutron rocket structures through shallow waters to a spaceport off the coast of Virginia as it races to meet a September delivery deadline, TechCrunch reports. The request, which was made in July, is a temporary stopgap while the company awaits federal clearance to dredge a permanent channel to the Wallops Island site. Rocket Lab plans to launch its Neutron medium-lift rocket from the Mid-Atlantic Regional Spaceport (MARS) on Wallops Island, Virginia, a lower-traffic spaceport that’s surrounded by shallow channels and waterways. Rocket Lab has a sizable checklist to tick off before Neutron can make its orbital debut, like mating the rocket stages, performing a “wet dress” rehearsal, and getting its launch license from the Federal Aviation Administration. Before any of that can happen, the rocket hardware needs to make it onto the island from Rocket Lab’s factory on the nearby mainland.

Kedging bets … Access to the channel leading to Wallops Island is currently available only at low tides. So, Rocket Lab submitted an application earlier this year to dredge the channel. The dredging project was approved by the Virginia Marine Resources Commission in May, but the company has yet to start digging because it’s still awaiting federal sign-off from the Army Corps of Engineers. As the company waits for federal approval, Rocket Lab is seeking permission to use a temporary method called “kedging” to ensure the first five hardware deliveries can arrive on schedule starting in September. We don’t cover maritime issues in the Rocket Report, but if you’re interested in learning a little about kedging, here’s a link.

Any better ideas for an Exploration Upper Stage? Not surprisingly, Congress is pushing back against the Trump administration’s proposal to cancel the Space Launch System, the behemoth rocket NASA has developed to propel astronauts back to the Moon. But legislation making its way through the House of Representatives includes an interesting provision that would direct NASA to evaluate alternatives for the Boeing-built Exploration Upper Stage, an upgrade for the SLS rocket set to debut on its fourth flight, Ars reports. Essentially, the House Appropriations Committee is telling NASA to look for cheaper, faster options for a new SLS upper stage.

CYA EUS? The four-engine Exploration Upper Stage, or EUS, is an expensive undertaking. Last year, NASA’s inspector general reported that the new upper stage’s development costs had ballooned from $962 million to $2.8 billion, and the project had been delayed more than six years. That’s almost a year-for-year delay since NASA and Boeing started development of the EUS. So, what are the options if NASA went with a new upper stage for the SLS rocket? One possibility is a modified version of United Launch Alliance’s dual-engine Centaur V upper stage that flies on the Vulcan rocket. It’s no longer possible to keep flying the SLS rocket’s existing single-engine upper stage because ULA has shut down the production line for it.

Raising Super Heavy from the deep. For the second time, SpaceX has retrieved an engine section from one of its Super Heavy boosters from the Gulf of Mexico, NASASpaceflight.com reports. Images posted on social media showed the tail end of a Super Heavy booster being raised from the sea off the coast of northern Mexico. Most of the rocket’s 33 Raptor engines appear to still be attached to the lower section of the stainless steel booster. Online sleuths who closely track SpaceX’s activities at Starbase, Texas, have concluded the rocket recovered from the Gulf is Booster 13, which flew on the sixth test flight of the Starship mega-rocket last November. The booster ditched in the ocean after aborting an attempted catch back at the launch pad in South Texas.

But why? … SpaceX recovered the engine section of a different Super Heavy booster from the Gulf last year. The company’s motivation for salvaging the wreckage is unclear. “Speculated reasons include engineering research, environmental mitigation, or even historical preservation,” NASASpaceflight reports.

Next three launches

July 26: Vega C | CO3D & MicroCarb | Guiana Space Center, French Guiana | 02:03 UTC

July 26: Falcon 9 | Starlink 10-26 | Cape Canaveral Space Force Station, Florida | 08:34 UTC

July 27: Falcon 9 | Starlink 17-2 | Vandenberg Space Force Base, California | 03:55 UTC

Photo of Stephen Clark

Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.

Rocket Report: Channeling the future at Wallops; SpaceX recovers rocket wreckage Read More »

lawmakers-writing-nasa’s-budget-want-a-cheaper-upper-stage-for-the-sls-rocket

Lawmakers writing NASA’s budget want a cheaper upper stage for the SLS rocket


Eliminating the Block 1B upgrade now would save NASA at least $500 million per year.

Artist’s illustration of the Boeing-developed Exploration Upper Stage, with four hydrogen-fueled RL10 engines. Credit: NASA

Not surprisingly, Congress is pushing back against the Trump administration’s proposal to cancel the Space Launch System, the behemoth rocket NASA has developed to propel astronauts back to the Moon.

Spending bills making their way through both houses of Congress reject the White House’s plan to wind down the SLS rocket after two more launches, but the text of a draft budget recently released by the House Appropriations Committee suggests an openness to making some major changes to the program.

The next SLS flight, called Artemis II, is scheduled to lift off early next year to send a crew of four astronauts around the far side of the Moon. Artemis III will follow a few years later on a mission to attempt a crewed lunar landing at the Moon’s south pole. These missions follow Artemis I, a successful unpiloted test flight in 2022.

After Artemis III, the official policy of the Trump administration is to terminate the SLS program, along with the Orion crew capsule designed to launch on top of the rocket. The White House also proposed canceling NASA’s Gateway, a mini-space station to be placed in orbit around the Moon. NASA would instead procure commercial launches and commercial spacecraft to ferry astronauts between the Earth and the Moon, while focusing the agency’s long-term gaze toward Mars.

CYA EUS?

House and Senate appropriations bills would preserve SLS, Orion, and the Gateway. However, the House version of NASA’s budget has an interesting paragraph directing NASA to explore cheaper, faster options for a new SLS upper stage.

NASA has tasked Boeing, which also builds SLS core stages, to develop an Exploration Upper Stage for debut on the Artemis IV mission, the fourth flight of the Space Launch System. This new upper stage would have large propellant tanks and carry four engines instead of the single engine used on the rocket’s interim upper stage, which NASA is using for the first three SLS flights.

The House version of NASA’s fiscal year 2026 budget raises questions about the long-term future of the Exploration Upper Stage. In one section of the bill, House lawmakers would direct NASA to “evaluate alternatives to the current Exploration Upper Stage (EUS) design for SLS.” The committee members wrote the evaluation should focus on reducing development and production costs, shortening the schedule, and maintaining the SLS rocket’s lift capability.

“NASA should also evaluate how alternative designs could support the long-term evolution of SLS and broader exploration goals beyond low-Earth orbit,” the lawmakers wrote. “NASA is directed to assess various propulsion systems, stage configurations, infrastructure compatibility, commercial and international collaboration opportunities, and the cost and schedule impacts of each alternative.”

The SLS rocket is expensive, projected to cost at least $2.5 billion per launch, not counting development costs or expenses related to the Orion spacecraft and the ground systems required to launch it at Kennedy Space Center in Florida. Those figures bring the total cost of an Artemis mission using SLS and Orion to more than $4 billion, according to NASA’s inspector general.

NASA’s Block 1B version of the SLS rocket will be substantially larger than Block 1. Credit: NASA

The EUS is likewise an expensive undertaking. Last year, NASA’s inspector general reported that the new upper stage’s development costs had ballooned from $962 million to $2.8 billion, and the Boeing-led project had been delayed more than six years. The version of the SLS rocket with the EUS, known as Block 1B, is supposed to deliver a 40 percent increase in performance over the Block 1 configuration used on the first three Space Launch System flights. Overall, NASA’s inspector general projected Block 1B’s development costs to total $5.7 billion.

Eliminating the Block 1B upgrade now would save NASA at least $500 million per year, and perhaps more if NASA could also end work on a costly mobile launch tower specifically designed to support SLS Block 1B missions.

NASA can’t go back to the interim upper stage, which is based on the design of the upper stage that flew on United Launch Alliance’s (ULA’s) now-retired Delta IV Heavy rocket. ULA has shut down its Delta production line, so there’s no way to build any more. What ULA does have is a new high-energy upper stage called Centaur V. This upper stage is sized for ULA’s new Vulcan rocket, with more capability than the interim upper stage but with lower performance than the larger EUS.

A season of compromise, maybe

Ars’ Eric Berger wrote last year about the possibility of flying the Centaur V upper stage on SLS missions.

Incorporating the Centaur V wouldn’t maintain the SLS rocket’s lift capability, as the House committee calls for in its appropriations bill. The primary reason for improving the rocket’s performance is to give SLS Block 1B enough oomph to carry “co-manifested” payloads, meaning it can launch an Orion crew capsule and equipment for NASA’s Gateway lunar space station on a single flight. The lunar Gateway is also teed up for cancellation in Trump’s budget proposal, but both congressional appropriations bills would save it, too. If the Gateway escapes cancellation, there are ways to launch its modules on commercial rockets.

Blue Origin also has an upper stage that could conceivably fly on the Space Launch System. But the second stage for Blue Origin’s New Glenn rocket would be a more challenging match for SLS for several reasons, chiefly its 7-meter (23-foot) diameter—too wide to be a drop-in replacement for the interim upper stage used on Block 1. ULA’s Centaur V is much closer in size to the existing upper stage.

The House budget bill has passed a key subcommittee vote but won’t receive a vote from the full appropriations committee until after Congress’s August recess. A markup of the bill by the House Appropriations Committee scheduled for Thursday was postponed after Speaker Mike Johnson announced an early start to the recess this week.

Ars reported last week on the broad strokes of how the House and Senate appropriations bills would affect NASA. Since then, members of the House Appropriations Committee released the text of the report attached to their version of the NASA budget. The report, which includes the paragraph on the Exploration Upper Stage, provides policy guidance and more detailed direction on where NASA should spend its money.

The House’s draft budget includes $2.5 billion for the Space Launch System, close to this year’s funding level and $500 million more than the Trump administration’s request for the next fiscal year, which begins October 1. The budget would continue development of SLS Block 1B and the Exploration Upper Stage while NASA completes a six-month study of alternatives.

The report attached to the Senate appropriations bill for NASA has no specific instructions regarding the Exploration Upper Stage. But like the House bill, the Senate’s draft budget directs NASA to continue ordering spares and long-lead parts for SLS and Orion missions beyond Artemis III. Both versions of the NASA budget require the agency to continue with SLS and Orion until a suitable commercial, human-rated rocket and crew vehicle are proven ready for service.

In a further indication of Congress’ position on the SLS and Orion programs, lawmakers set aside more than $4 billion for the procurement of SLS rockets for the Artemis IV and Artemis V missions in the reconciliation bill signed into law by President Donald Trump earlier this month.

Congress must pass a series of federal appropriations bills by October 1, when funding for the current fiscal year runs out. If Congress doesn’t act by then, it could pass a continuing resolution to maintain funding at levels close to this year’s budget or face a government shutdown.

Lawmakers will reconvene in Washington, DC, in early September in hopes of finishing work on the fiscal year 2026 budget. The section of the budget that includes NASA still must go through a markup hearing by the House Appropriations Committee and pass floor votes in the House and Senate. Then the two chambers will have to come to a compromise on the differences in their appropriations bills. Only then can the budget be put to another vote in each chamber and go to the White House for Trump’s signature.

Photo of Stephen Clark

Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.

Lawmakers writing NASA’s budget want a cheaper upper stage for the SLS rocket Read More »

spacex-launches-a-pair-of-nasa-satellites-to-probe-the-origins-of-space-weather

SpaceX launches a pair of NASA satellites to probe the origins of space weather


“This is going to really help us understand how to predict space weather in the magnetosphere.”

This artist’s illustration of the Earth’s magnetosphere shows the solar wind (left) streaming from the Sun, and then most of it being blocked by Earth’s magnetic field. The magnetic field lines seen here fold in toward Earth’s surface at the poles, creating polar cusps. Credit: NASA/Goddard Space Flight Center

Two NASA satellites rocketed into orbit from California aboard a SpaceX Falcon 9 rocket Wednesday, commencing a $170 million mission to study a phenomenon of space physics that has eluded researchers since the dawn of the Space Age.

The twin spacecraft are part of the NASA-funded TRACERS mission, which will spend at least a year measuring plasma conditions in narrow regions of Earth’s magnetic field known as polar cusps. As the name suggests, these regions are located over the poles. They play an important but poorly understood role in creating colorful auroras as plasma streaming out from the Sun interacts with the magnetic field surrounding Earth.

The same process drives geomagnetic storms capable of disrupting GPS navigation, radio communications, electrical grids, and satellite operations. These outbursts are usually triggered by solar flares or coronal mass ejections that send blobs of plasma out into the Solar System. If one of these flows happens to be aimed at Earth, we are treated with auroras but vulnerable to the storm’s harmful effects.

For example, an extreme geomagnetic storm last year degraded GPS navigation signals, resulting in more than $500 million in economic losses in the agriculture sector as farms temporarily suspended spring planting. In 2022, a period of elevated solar activity contributed to the loss of 40 SpaceX Starlink satellites.

“Understanding our Sun and the space weather it produces is more important to us here on Earth, I think, than most realize,” said Joe Westlake, director of NASA’s heliophysics division.

NASA’s two TRACERS satellites launched Wednesday aboard a SpaceX Falcon 9 rocket from Vandenberg Space Force Base, California. Credit: SpaceX

The launch of TRACERS was delayed 24 hours after a regional power outage disrupted air traffic control over the Pacific Ocean near the Falcon 9 launch site on California’s Central Coast, according to the Federal Aviation Administration. SpaceX called off the countdown Tuesday less than a minute before liftoff, then rescheduled the flight for Wednesday.

TRACERS, short for Tandem Reconnection and Cusp Electrodynamics Reconnaissance Satellites, will study a process known as magnetic reconnection. As particles in the solar wind head out into the Solar System at up to 1 million mph, they bring along pieces of the Sun’s magnetic field. When the solar wind reaches our neighborhood, it begins interacting with Earth’s magnetic field.

The high-energy collision breaks and reconnects magnetic field lines, flinging solar wind particles across Earth’s magnetosphere at speeds that can approach the speed of light. Earth’s field draws some of these particles into the polar cusps, down toward the upper atmosphere. This is what creates dazzling auroral light shows and potentially damaging geomagnetic storms.

Over our heads

But scientists still aren’t sure how it all works, despite the fact that it’s happening right over our heads, within reach of countless satellites in low-Earth orbit. A single spacecraft won’t do the job, though. Scientists need at least two spacecraft, each in a bespoke polar orbit and specially instrumented to measure magnetic fields, electric fields, electrons, and ions.

That’s because magnetic reconnection is a dynamic process, and a single satellite would provide just a snapshot of conditions over the polar cusps every 90 minutes. By the time the satellite comes back around on another orbit, conditions will have changed, but scientists wouldn’t know how or why, according to David Miles, principal investigator for the TRACERS mission at the University of Iowa.

“You can’t tell, is that because the system itself is changing?” Miles said. “Is that because this magnetic reconnection, the coupling process, is moving around? Is it turning on and off, and if it’s turning on and off, how quickly can it do it? Those are fundamental things that we need to understand… how the solar wind arriving at the Earth does or doesn’t transfer energy to the Earth system, which has this downstream effect of space weather.”

This is why the tandem part of the TRACERS name is important. The novel part of this mission is that it features two identical spacecraft, each about the size of a washing machine, flying at an altitude of 367 miles (590 kilometers). Over the course of the next few weeks, the TRACERS satellites will drift into a formation with one trailing the other by about two minutes as they zip around the world at nearly five miles per second. This positioning will allow the satellites to sample the polar cusps one right after the other, instead of forcing scientists to wait another 90 minutes for a data refresh.
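Those figures hang together under the standard circular-orbit formulas; here is a quick sanity check in Python. The Earth constants (GM and mean radius) are textbook values rather than numbers from the article, and the two-minute spacing is treated as exact for simplicity.

```python
# Quick sanity check of the orbit numbers quoted above for a circular orbit
# at TRACERS' reported altitude. Earth's GM and radius are standard constants.
import math

MU = 3.986e14          # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6      # mean Earth radius, m
ALTITUDE = 590e3       # reported TRACERS altitude (~367 miles), m

a = R_EARTH + ALTITUDE                     # orbital radius
speed = math.sqrt(MU / a)                  # circular orbital speed, m/s
period = 2 * math.pi * math.sqrt(a**3 / MU)
gap = 120 * speed                          # along-track distance covered in the two-minute lag

print(f"orbital speed : {speed / 1609:.1f} miles/s ({speed / 1000:.2f} km/s)")
print(f"orbital period: {period / 60:.0f} minutes")
print(f"two-minute trailing gap: about {gap / 1609:.0f} miles ({gap / 1000:.0f} km)")
# -> about 4.7 miles/s, a roughly 96-minute orbit, and a ~560-mile gap between the pair
```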

With TRACERS, scientists hope to pick apart smaller, fast-moving changes with each satellite pass. Within a year, TRACERS should collect 3,000 measurements of magnetic reconnections, a sample size large enough to start identifying why some space weather events evolve differently than others.

“Not only will it get a global picture of reconnection in the magnetosphere, but it’s also going to be able to statistically study how reconnection depends on the state of the solar wind,” said John Dorelli, TRACERS mission scientist at NASA’s Goddard Space Flight Center. “This is going to really help us understand how to predict space weather in the magnetosphere.”

One of the two TRACERS satellites undergoes launch preparations at Millennium Space Systems, the spacecraft’s manufacturer. Credit: Millennium Space Systems

“If we can understand these various different situations, whether it happens suddenly if you have one particular kind of event, or it happens in lots of different places, then we have a better way to model that and say, ‘Ah, here’s the likelihood of seeing a certain kind of effect that would affect humans,'” said Craig Kletzing, the principal investigator who led the TRACERS science team until his death in 2023.

There is broader knowledge to be gained with a mission like TRACERS. Magnetic reconnection is ubiquitous throughout the Universe, and the same physical processes produce solar flares and coronal mass ejections from the Sun.

Hitchhiking to orbit

Several other satellites shared the ride to space with TRACERS on Wednesday.

These secondary payloads included a NASA-sponsored mission named PExT, a small technology demonstration satellite carrying an experimental communications package capable of connecting with three different networks: NASA’s government-owned Tracking and Data Relay Satellites (TDRS) and commercial satellite networks owned by SES and Viasat.

What’s unique about the Polylingual Experimental Terminal, or PExT, is its ability to roam across multiple satellite relay networks. The International Space Station and other satellites in low-Earth orbit currently connect to controllers on the ground through NASA’s TDRS satellites. But NASA will retire its TDRS satellites in the 2030s and begin purchasing data relay services using commercial satellite networks.

The space agency expects to have multiple data relay providers, so radios on future NASA satellites must be flexible enough to switch between networks mid-mission. PExT is a pathfinder for these future missions.

Another NASA-funded tech demo named Athena EPIC was also aboard the Falcon 9 rocket. Led by NASA’s Langley Research Center, this mission uses a scalable satellite platform developed by a company named NovaWurks, which pieces together modular building blocks to form everything a spacecraft needs to operate in space.

Athena EPIC hosts a single science instrument to measure how much energy Earth radiates into space, an important data point for climate research. But the mission’s real goal is to showcase how an adaptable satellite design, such as this one using NovaWurks’ building block approach, might be useful for future NASA missions.

A handful of other spacecraft rounded out the payload list for Wednesday’s launch. They included REAL, a NASA-funded CubeSat project to investigate the Van Allen radiation belts and space weather, and LIDE, an experimental 5G communications satellite backed by the European Space Agency. Five commercial spacecraft from the Australian company Skykraft also launched to join a constellation of small satellites that provides tracking and voice communications between air traffic controllers and aircraft over remote parts of the world.

Photo of Stephen Clark

Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.

SpaceX launches a pair of NASA satellites to probe the origins of space weather Read More »

conspiracy-theorists-don’t-realize-they’re-on-the-fringe

Conspiracy theorists don’t realize they’re on the fringe


Gordon Pennycook: “It might be one of the biggest false consensus effects that’s been observed.”

Credit: Aurich Lawson / Thinkstock

Belief in conspiracy theories is often attributed to some form of motivated reasoning: People want to believe a conspiracy because it reinforces their worldview, for example, or doing so meets some deep psychological need, like wanting to feel unique. However, it might also be driven by overconfidence in their own cognitive abilities, according to a paper published in the Personality and Social Psychology Bulletin. The authors were surprised to discover that not only are conspiracy theorists overconfident, they also don’t realize their beliefs are on the fringe, massively overestimating by as much as a factor of four how much other people agree with them.

“I was expecting the overconfidence finding,” co-author Gordon Pennycook, a psychologist at Cornell University, told Ars. “If you’ve talked to someone who believes conspiracies, it’s self-evident. I did not expect them to be so ready to state that people agree with them. I thought that they would overestimate, but I didn’t think that there’d be such a strong sense that they are in the majority. It might be one of the biggest false consensus effects that’s been observed.”

In 2015, Pennycook made headlines when he co-authored a paper demonstrating how certain people interpret “pseudo-profound bullshit” as deep observations. Pennycook et al. were interested in identifying individual differences between those who are susceptible to pseudo-profound BS and those who are not and thus looked at conspiracy beliefs, their degree of analytical thinking, religious beliefs, and so forth.

They presented several randomly generated statements, containing “profound” buzzwords, that were grammatically correct but made no sense logically, along with a 2014 tweet by Deepak Chopra that met the same criteria. They found that the less skeptical participants were less logical and analytical in their thinking and hence much more likely to consider these nonsensical statements as being deeply profound. That study was a bit controversial, in part for what was perceived to be its condescending tone, along with questions about its methodology. But it did snag Pennycook et al. a 2016 Ig Nobel Prize.

Last year we reported on another Pennycook study, presenting results from experiments in which an AI chatbot engaged in conversations with people who believed at least one conspiracy theory. That study showed that the AI interaction significantly reduced the strength of those beliefs, even two months later. The secret to its success: the chatbot, with its access to vast amounts of information across an enormous range of topics, could precisely tailor its counterarguments to each individual. “The work overturns a lot of how we thought about conspiracies, that they’re the result of various psychological motives and needs,” Pennycook said at the time.

Miscalibrated from reality

Pennycook has been working on this new overconfidence study since 2018, perplexed by observations indicating that people who believe in conspiracies also seem to have a lot of faith in their cognitive abilities—contradicting prior research finding that conspiracists are generally more intuitive. To investigate, he and his co-authors conducted eight separate studies that involved over 4,000 US adults.

The assigned tasks were designed in such a way that participants’ actual performance and how they perceived their performance were unrelated. For example, in one experiment, they were asked to guess the subject of an image that was largely obscured. The subjects were then asked direct questions about their belief (or lack thereof) in several key conspiracy claims: that the Apollo Moon landings were faked, for example, or that Princess Diana’s death wasn’t an accident. Four of the studies focused on testing how subjects perceived others’ beliefs.
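
To make that design concrete, here is a minimal illustrative sketch of how such a measure could be scored. This is my own reconstruction of the general idea, not the authors’ instrument or analysis code, and the names (Participant, overconfidence_score) and numbers are hypothetical: because accuracy on a guess-based task hovers near chance for everyone, a participant’s self-assessed performance mostly reflects trait overconfidence rather than skill.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Participant:
    """One respondent on a guess-based task (hypothetical fields)."""
    correct: list[bool]        # whether each obscured-image guess was right
    self_rating: list[float]   # self-assessed chance of being right, 0-1

def overconfidence_score(p: Participant) -> float:
    """Perceived performance minus actual performance.

    On a task designed so that accuracy is roughly at chance for
    everyone, this gap is driven almost entirely by how good people
    *think* they are, which is the trait the studies try to isolate.
    """
    return mean(p.self_rating) - mean(map(float, p.correct))

# Illustrative only: someone who got 2 of 10 guesses right but rated
# themselves about 70 percent likely to be right on each item.
example = Participant(
    correct=[True, False, False, True, False, False,
             False, False, False, False],
    self_rating=[0.7] * 10,
)
print(f"Overconfidence score: {overconfidence_score(example):+.2f}")  # +0.50
```

In the studies themselves, a score along these lines would then be related to participants’ endorsement of conspiracy claims; the sketch only shows how the measure can be decoupled from real ability.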

The results showed a marked association between subjects’ tendency to be overconfident and belief in conspiracy theories. And while a conspiracy’s claims were believed by a majority of participants just 12 percent of the time, believers thought they were in the majority 93 percent of the time. This suggests that overconfidence is a primary driver of belief in conspiracies.

It’s not that believers in conspiracy theories are massively overconfident; there is no data on that, because the studies didn’t set out to quantify the degree of overconfidence, per Pennycook. Rather, “They’re overconfident, and they massively overestimate how much people agree with them,” he said.

Ars spoke with Pennycook to learn more.

Ars Technica: Why did you decide to investigate overconfidence as a contributing factor to believing conspiracies?

Gordon Pennycook: There’s a popular sense that people believe conspiracies because they’re dumb and don’t understand anything, they don’t care about the truth, and they’re motivated by believing things that make them feel good. Then there’s the academic side, where that idea molds into a set of theories about how needs and motivations drive belief in conspiracies. It’s not someone falling down the rabbit hole and getting exposed to misinformation or conspiratorial narratives. They’re strolling down: “I like it over here. This appeals to me and makes me feel good.”

Believing things that no one else agrees with makes you feel unique. Then there’s various things I think that are a little more legitimate: People join communities and there’s this sense of belongingness. How that drives core beliefs is different. Someone may stop believing but hang around in the community because they don’t want to lose their friends. Even with religion, people will go to church when they don’t really believe. So we distinguish beliefs from practice.

What we observed is that they do tend to strongly believe these conspiracies despite the fact that there’s counter evidence or a lot of people disagree. What would lead that to happen? It could be their needs and motivations, but it could also be that there’s something about the way that they think where it just doesn’t occur to them that they could be wrong about it. And that’s where overconfidence comes in.

Ars Technica: What makes this particular trait such a powerful driving force?

Gordon Pennycook: Overconfidence is one of the most important core underlying components, because if you’re overconfident, it stops you from really questioning whether the thing that you’re seeing is right or wrong, and whether you might be wrong about it. You have an almost moral purity of complete confidence that the thing you believe is true. You cannot even imagine what it’s like from somebody else’s perspective. You couldn’t imagine a world in which the things that you think are true could be false. Having overconfidence is that buffer that stops you from learning from other people. You end up not just going down the rabbit hole, you’re doing laps down there.

Overconfidence doesn’t have to be learned, parts of it could be genetic. It also doesn’t have to be maladaptive. It’s maladaptive when it comes to beliefs. But you want people to think that they will be successful when starting new businesses. A lot of them will fail, but you need some people in the population to take risks that they wouldn’t take if they were thinking about it in a more rational way. So it can be optimal at a population level, but maybe not at an individual level.

Ars Technica: Is this overconfidence related to the well-known Dunning-Kruger effect?

Gordon Pennycook: It’s because of Dunning-Kruger that we had to develop a new methodology to measure overconfidence, because the people who are the worst at a task are the worst at knowing that they’re the worst at the task. But that’s because the same things that you use to do the task are the things you use to assess how good you are at the task. So if you were to give someone a math test and they’re bad at math, they’ll appear overconfident. But if you give them a test of assessing humor and they’re good at that, they won’t appear overconfident. That’s about the task, not the person.

So we have tasks where people essentially have to guess, and it’s transparent. There’s no reason to think that you’re good at the task. In fact, people who think they’re better at the task are not better at it, they just think they are. They just have this underlying kind of sense that they can do things, they know things, and that’s the kind of thing that we’re trying to capture. It’s not specific to a domain. There are lots of reasons why you could be overconfident in a particular domain. But this is something that’s an actual trait that you carry into situations. So when you’re scrolling online and come up with these ideas about how the world works that don’t make any sense, it must be everybody else that’s wrong, not you.

Ars Technica: Overestimating how many people agree with them seems to be at odds with conspiracy theorists’ desire to be unique.  

Gordon Pennycook: In general, people who believe conspiracies often have contrary beliefs. We’re working with a population where coherence is not to be expected. They say that they’re in the majority, but it’s never a strong majority. They just don’t think that they’re in a minority when it comes to the belief. Take the case of the Sandy Hook conspiracy, where adherents believe it was a false flag operation. In one sample, 8 percent of people thought that this was true. That 8 percent thought 61 percent of people agreed with them.

So they’re way off. They really, really miscalibrated. But they don’t say 90 percent. It’s 60 percent, enough to be special, but not enough to be on the fringe where they actually are. I could have asked them to rank how smart they are relative to others, or how unique they thought their beliefs were, and they would’ve answered high on that. But those are kind of mushy self-concepts. When you ask a specific question that has an objectively correct answer in terms of the percent of people in the sample that agree with you, it’s not close.

Ars Technica: How does one even begin to combat this? Could last year’s AI study point the way?

Gordon Pennycook: The AI debunking effect works better for people who are less overconfident. In those experiments, very detailed, specific debunks had a much bigger effect than people expected. After eight minutes of conversation, a quarter of the people who believed the thing didn’t believe it anymore, but 75 percent still did. That’s a lot. And some of them, not only did they still believe it, they still believed it to the same degree. So no one’s cracked that. Getting any movement at all in the aggregate was a big win.

Here’s the problem. You can’t have a conversation with somebody who doesn’t want to have the conversation. In those studies, we’re paying people, but they still get out what they put into the conversation. If you don’t really respond or engage, then our AI is not going to give you good responses because it doesn’t know what you’re thinking. And if the person is not willing to think. … This is why overconfidence is such an overarching issue. The only alternative is some sort of propagandistic sit-them-downs with their eyes open and try to de-convert them. But you can’t really convert someone who doesn’t want to be converted. So I’m not sure that there is an answer. I think that’s just the way that humans are.

Personality and Social Psychology Bulletin, 2025. DOI: 10.1177/01461672251338358  (About DOIs).

Photo of Jennifer Ouellette

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

Conspiracy theorists don’t realize they’re on the fringe Read More »

marine-biologist-for-a-day:-ars-goes-shark-tagging

Marine biologist for a day: Ars goes shark tagging


We did not need a bigger boat

Go shark fishing on the RV Garvin, get hooked on the ideas behind it.

Image of three people kneeling over a large brown shark, as two others look on.

Field School staff made sure the day out was a safe and satisfying experience.

MIAMI—We were beginning to run out of bait, and the sharks weren’t cooperating.

Everybody aboard the Research Vessel Garvin had come to Miami for the sharks—to catch them, sample them, and tag them, all in the name of science. People who once wanted to be marine biologists, actual marine biologists, shark enthusiasts, the man who literally wrote the book Why Sharks Matter, and various friends and family had spent much of the day sending fish heads set with hooks over the side of the Garvin. But each time the line was hauled back in, it came in slack, with nothing but half-eaten bait or an empty hook at the end.

And everyone was getting nervous.

I: “No assholes”

The Garvin didn’t start out as a research vessel. Initially, it was a dive boat that took people to wrecks on the East Coast. Later, owner Hank Garvin used it to take low-income students from New York City and teach them how to dive, getting them scuba certified. But when Garvin died, his family put the boat, no longer in prime condition, on the market.

A thousand miles away in Florida, Catherine MacDonald was writing “no assholes” on a Post-it note.

At the time, MacDonald was the coordinator of a summer internship program at the University of Miami, where she was a PhD student. And even at that stage in her career, she and her colleagues had figured out that scientific field work had a problem.

“Science in general does not have a great reputation of being welcoming and supportive and inclusive and kind,” said David Shiffman, author of the aforementioned book and a grad school friend of MacDonald’s. “Field science is perhaps more of a problem than that. And field science involving what are called charismatic megafauna, the big animals that everyone loves, is perhaps worse than that. It’s probably because a lot of people want to do this, which means if we treat someone poorly and they quit, it’s not going to be long before someone else wants to fill the spot.”

MacDonald and some of her colleagues—Christian Pankow, Jake Jerome, Nick Perni, and Julia Wester (a lab manager and some fellow grad students at the time)—were already doing their best to work against these tendencies at Miami and help people learn how to do field work in a supportive environment. “I don’t think that you can scream abuse at students all day long and go home and publish great science,” she said, “because I don’t think that the science itself escapes the process through which it was generated.”

So they started to think about how they might extend that to the wider ocean science community. The “no assholes” Post-it became a bit of a mission statement, one that MacDonald says now sits in a frame in her office. “We decided out the gate that the point of doing this in part was to make marine science more inclusive and accessible and that if we couldn’t do that and be a successful business, then we were just going to fail,” she told Ars. “That’s kind of the plan.”

But to do it properly, they needed a boat. And that meant they needed money. “We borrowed from our friends and family,” MacDonald said. “I took out a loan on my house. It was just our money and all of the money that people who loved us were willing to sink into the project.”

Even that might not have been quite enough to afford a badly run-down boat. But the team made a personal appeal to Hank Garvin’s family. “They told the family who was trying to offload the boat, ‘Maybe someone else can pay you more for it, but here’s what we’re going to use it for, and also we’ll name the boat after your dad,'” Shiffman said. “And they got it.”

For the day, everybody who signed up had the chance to do most of the work that scientists normally would. Julia Saltzman

But it wasn’t enough to launch what would become the Field School. The Garvin was in good enough shape to navigate to Florida, but it needed considerable work before it could receive all the Coast Guard certifications required to get a Research Vessel designation. And given the team’s budget, that mostly meant the people launching the Field School had to learn to do the work themselves.

“One of [co-founder] Julia’s good friends was a boat surveyor, and he introduced us to a bunch of people who taught us skills or introduced us to someone else who could fix the alignment of our propellers or could suggest this great place in Louisiana that we could send the transmissions for rebuilding or could help us figure out which paints to use,” MacDonald said.

“We like to joke that we are the best PhD-holding fiberglassers in Miami,” she told Ars. “I don’t actually know if that’s true. I couldn’t prove it. But we just kind of jumped off the cliff together in terms of trying to make it work. Although we certainly had to hire folks to help us with a variety of projects, including building a new fuel tank because we are not the best PhD-holding welders in Miami for certain.”

II: Fishing for sharks

On the now fully refurbished Garvin, we were doing drum-line fishing. This involved a 16 kg (35-pound) weight connected to some floats by an extremely thick piece of rope. Also linked to the weight was a significant amount of 800-pound test line (meaning a monofilament polymer that won’t snap until something exerts over 800 lbs/360 kg of force on it) with a hook at the end. Most species of sharks need to keep swimming to force water over their gills or else suffocate; the length of the line allows them to swim in circles around the weight. The hook is also shaped to minimize damage to the fish during removal.

To draw sharks to the drum line, each of the floats had a small metal cage to hold chunks of fish that would release odorants. A much larger piece—either a head or cross-section of the trunk of a roughly foot-long fish—was set on the hook.

Deploying all of this was where the Garvin‘s passengers, none of whom had field research experience, came in. Under the tutelage of the people from the Field School, we’d lower the drum from a platform at the stern of the Garvin to the floor of Biscayne Bay, within sight of Miami’s high rises. A second shark enthusiast would send the float overboard as the Garvin‘s crew logged its GPS coordinates. After that, it was simply a matter of gently releasing the monofilament line from a large hand-held spool.

From right to left, the floats, the weight, and the bait all had to go into the water through an organized process. Julia Saltzman

One by one, we set 10 drums in a long row near one of the exits from Biscayne Bay. With the last one set, we went back to the first and reversed the process: haul in the float, use the rope to pull in the drum, and then let a Field School student test whether the line had a shark at the end. If not, it and the spool were handed over to a passenger, accompanied by tips on how to avoid losing fingers if a shark goes after the bait while being pulled back in.

Rebait, redeploy, and move on. We went down the line of 10 drums once, then twice, then thrice, and the morning gave way to afternoon. The routine became far less exciting, and getting volunteers for each of the roles in the process seemed to require a little more prodding. Conversations among the passengers and Field School people started to become the focus, the fishing a distraction, and people started giving the bait buckets nervous looks.

And then, suddenly, a line went tight while it was being hauled in, and a large brown shape started moving near the surface in the distance.

III: Field support

Mortgaging your home is not a long-term funding solution, so over time, the Field School has developed a bit of a mixed model. Most of the people who come to learn there pay the costs for their time on the Garvin. That includes some people who sign up for one of the formal training programs. Shiffman also uses them to give undergraduates in the courses he teaches some exposure to actual research work.

“Over spring break this year, Georgetown undergrads flew down to Miami with me and spent a week living on Garvin, and we did some of what you saw,” he told Ars. “But also mangrove, snorkeling, using research drones, and going to the Everglades—things like that.” They also do one-day outings with some local high schools.

Many of the school’s costs, however, are covered by groups that pay to get the experience of being an ocean scientist for a day. These have included everything from local Greenpeace chapters to companies signing up for a teamwork-building experience. “The fundraiser rate [they pay] factors in not only the cost of taking those people out but also the cost of taking a low-income school group out in the future at no cost,” Shiffman said.

And then there are groups like the one I was joining—paying the fundraiser rate but composed of random collections of people brought together by little more than meeting Shiffman, either in person or online. In these cases, the Garvin is filled with small groups, each nucleated by a shark fan, someone who once wanted to be a marine biologist, or someone with a general interest in science. They’ll then recruit one or more friends or family members to join them, with varying degrees of willingness.

For a day, they all get to contribute to research. A lot of what we know about most fish populations comes from the fishing industry. And that information is often biased by commercial considerations, changing regulations, and more. The Field School trips, by contrast, give an unbiased sampling of whatever goes for its bait.

“The hardest part about marine biology research is getting to the animals—it’s boat time,” Shiffman said. “And since they’re already doing that, often in the context of teaching people how to do field skills, they reached out to colleagues all over the place and said, ‘Hey, here’s where we’re going. Here’s what we’re doing, here’s what we’re catching. Can we get any samples for you?’ So they’re taking all kinds of biological samples from the animals, and depending on what we catch, it can be for up to 15 different projects, with collaborators all over the country.”

And taking those samples is the passengers’ job. So shortly after leaving the marina on Garvin, we were divided up into teams and told what our roles would be once a shark was on board. One team member would take basic measurements of the shark’s dimensions. A second would scan the shark for parasites and place them in a sample jar, while another would snip a small piece of fin off to get a DNA sample. Finally, someone would insert a small tag at the base of the shark’s dorsal fin using a tool similar to a hollow awl. Amid all that, one of the Field School staff members would use a syringe to get a blood sample.

All of this would happen while members of the Field School staff were holding the shark in place—larger ones on a platform at the stern of the Garvin, smaller ones brought on board. The staff were the only ones who were supposed to get close to what Shiffman referred to as “the bitey end” of the shark. For most species, this would involve inserting one of three PVC tubes (sized for different sharks) through which seawater would be pumped to keep the shark breathing and give it something to chomp down on. Other staff members held down the “slappy end.”

For a long time, all of this choreography seemed abstract. But there was finally a shark on the other end of the line, slowly being hauled toward the boat.

IV: Pure muscle and rage?

The size and brown color were an immediate tip-off to those in the know: We had a nurse shark, one that Shiffman described as being “pure muscle and rage.” Despite that, a single person was able to haul it in using a hand spool. Once restrained, the shark largely remained a passive participant in what came next. Nurse sharks are one of the few species that can force water over their gills even when stationary, and the shark’s size—it would turn out to be over 2 meters long—meant that it would need to stay partly submerged on the platform in the back.

So one by one, the first team splashed onto the platform and got to work. Despite their extremely limited training, it took just over five minutes for them to finish the measurements and get all the samples they needed. Details like the time, location, and basic measurements were all logged by hand on paper, although the data would be transferred to a spreadsheet once it was back on land. And the blood sample had some preliminary work done on the Garvin itself, which was equipped with a small centrifuge. All of that data would eventually be sent off to many of the Field School’s collaborators.

Shark number two, a blacktip, being hauled to the Garvin. Julia Saltzman

Since the shark was showing no signs of distress, all the other teams were allowed to step onto the platform and pet it, partly due to the fear that this would be the only one we caught that day. Sharks have a skin that’s smooth in one direction but rough if stroked in the opposite orientation, and their cartilaginous skeleton isn’t as solid as the bone most other vertebrates rely on. It was very much not like touching any other fish I’d encountered.

After we had all literally gotten our feet wet, the shark, now bearing the label UM00229, was sent on its way, and we went back to checking the drum lines.

A short time later, we hauled in a meter-long blacktip shark. This time, we set it up on an ice chest on the back of the boat, with a PVC tube firmly inserted into its mouth. Again, once the Field School staff restrained the shark, the team of amateurs got to work quickly and efficiently, with the only mishap being a person who rubbed their fingers the wrong way against the shark skin and got an abrasion that drew a bit of blood. Next up would be team three, the final group—and the one I was a part of.

V: The culture beyond science

I’m probably the perfect audience for an outing like this. Raised on a steady diet of Jacques Cousteau documentaries, I was also drawn to the idea of marine biology at one point. And having spent many of my years in molecular biology labs, I found myself jealous of the amazing things the field workers I’d met had experienced. The idea of playing shark scientist for a day definitely appealed to me.

A shark swims away from the camera.

Once processed, the sharks seemed content to get back to the business of being a shark. Credit: Julia Saltzman

But I probably came away as impressed by the motivation behind the Field School as I was with the sharks. I’ve been in science long enough to see multiple examples of the sort of toxic behaviors that the school’s founders wanted to avoid, and I wondered how science would ever change when there’s no obvious incentive for anyone to improve their behavior. In the absence of those incentives, MacDonald’s idea is to provide an example of better behavior—and that might be the best option.

“Overall, the thing that I really wanted at the end of the day was for people to look at some of the worst things about the culture of science and say, ‘It doesn’t have to be like that,'” she told Ars.

And that, she argues, may have an impact that extends well beyond science. “It’s not just about training future scientists, it’s about training future people,” she said. “When science and science education hurts people, it affects our whole society—it’s not that it doesn’t matter to the culture of science, because it profoundly does, but it matters more broadly than that as well.”

With motivations like that, it would have felt small to be upset that my career as a shark tagger ended up in the realm of unfulfilled potential, since I was on tagging team three, and we never hooked shark number three. Still, I can’t say I wasn’t a bit annoyed when I bumped into Shiffman a few weeks later, and he gleefully informed me they caught 14 of them the day after.

If you have a large enough group, you can support the Field School by chartering the Garvin for an outing. For smaller groups, you need to get in touch with David Shiffman.

Listing image: Julia Saltzman

Photo of John Timmer

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

Marine biologist for a day: Ars goes shark tagging Read More »

win-for-chemical-industry-as-epa-shutters-scientific-research-office

Win for chemical industry as EPA shutters scientific research office


Deregulation runs rampant

Companies feared rules and lawsuits based on Office of Research and Development assessments.

Soon after President Donald Trump took office in January, a wide array of petrochemical, mining, and farm industry coalitions ramped up what has been a long campaign to limit use of the Environmental Protection Agency’s assessments of the health risks of chemicals.

That effort scored a significant victory Friday when EPA Administrator Lee Zeldin announced his decision to dismantle the agency’s Office of Research and Development (ORD).

The industry lobbyists didn’t ask for hundreds of ORD staff members to be laid off or reassigned. But the elimination of the agency’s scientific research arm goes a long way toward achieving the goal they sought.

In a January 27 letter to Zeldin organized by the American Chemistry Council, more than 80 industry groups—including leading oil, refining, and mining associations—asked him to end regulators’ reliance on ORD assessments of the risks that chemicals pose for human health. The future of that research, conducted under EPA’s Integrated Risk Information System program, or IRIS, is now uncertain.

“EPA’s IRIS program within ORD has a troubling history of being out of step with the best available science and methods, lacking transparency, and being unresponsive to peer review and stakeholder recommendations,” said an American Chemistry Council spokesperson in an email when asked about the decision to eliminate ORD. “This results in IRIS assessments that jeopardize access to critical chemistries, undercut national priorities, and harm American competitiveness.”

The spokesperson said the organization supports EPA evaluating its resources to ensure tax dollars are being used efficiently and effectively.

Christopher Frey, an associate dean at North Carolina State University who served as EPA assistant administrator in charge of ORD during the Biden administration, defended the quality of the science done by the office, which he said is “the poster case study of what it means to do science that’s subject to intense scrutiny.”

“There’s industry with a tremendous vested interest in the policy decisions that might occur later on” based on the assessments made by ORD, Frey said. “What the industry does is try to engage in a proxy war over the policy by attacking the science.”

Among the IRIS assessments that stirred the most industry concern were those outlining the dangers of formaldehyde, ethylene oxide, arsenic, and hexavalent chromium. Regulatory actions had begun or were looming on all of them during the Biden administration.

The Biden administration also launched a lawsuit against Denka Performance Elastomer, whose LaPlace, Louisiana, plant had been the only US manufacturer of neoprene, based in part on the IRIS assessment of one of its air pollutants, chloroprene, as a likely human carcinogen. Denka, a spinoff of DuPont, announced in May that it was ceasing production because of the cost of pollution controls.

Public health advocates charge that eliminating the IRIS program, or shifting its functions to other offices in the agency, will rob the EPA of the independent expertise to inform its mission of protection.

“They’ve been trying for years to shut down IRIS,” said Darya Minovi, a senior analyst with the Union of Concerned Scientists and lead author of a new study on Trump administration actions that the group says undermine science. “The reason why is because when IRIS conducts its independent scientific assessments using a great amount of rigor… you get stronger regulations, and that is not in the best interest of the big business polluters and those who have a financial stake in the EPA’s demise.”

The UCS report tallied more than 400 firings, funding cuts, and other attacks on science in the first six months of the Trump administration, resulting in 54 percent fewer grants for research on topics including cancer, infectious disease, and environmental health.

EPA’s press office did not respond to a query on whether the IRIS controversy helped inform Zeldin’s decision to eliminate ORD, which had been anticipated since staff were informed of the potential plan at a meeting in March. In the agency’s official announcement Friday afternoon, Zeldin said the elimination of the office was part of “organizational improvements” that would deliver $748.8 million in savings to taxpayers. The reduction in force, combined with previous departures and layoffs, has reduced the agency’s workforce by 23 percent, to 12,448, the EPA said.

With the cuts, the EPA’s workforce will be at its lowest level since fiscal year 1986.

“Under President Trump’s leadership, EPA has taken a close look at our operations to ensure the agency is better equipped than ever to deliver on our core mission of protecting human health and the environment while Powering the Great American Comeback,” Zeldin said in the prepared statement. “This reduction in force will ensure we can better fulfill that mission while being responsible stewards of your hard-earned tax dollars.”

The agency will be creating a new Office of Applied Science and Environmental Solutions; a report by E&E News said an internal memo indicated the new office would be much smaller than ORD, and would focus on coastal areas, drinking water safety, and methodologies for assessing environmental contamination.

Zeldin’s announcement also said that scientific expertise and research efforts will be moved to “program offices”—for example, those concerned with air pollution, water pollution, or waste—to tackle “statutory obligations and mission essential functions.” That phrase has a particular meaning: The chemical industry has long complained that Congress never passed a law creating IRIS. Congress did, however, pass many laws requiring that the agency carry out its actions based on the best available science, and the IRIS program, established during President Ronald Reagan’s administration, has been the agency’s way of assessing the science on chemicals since 1985.

Justin Chen, president of the American Federation of Government Employees Council 238, the union representing 8,000 EPA workers nationwide, said the organizational structure of ORD put barriers between the agency’s researchers and the agency’s political decision-making, enforcement, and regulatory teams—even though they all used ORD’s work.

“For them to function properly, they have to have a fair amount of distance away from political interference, in order to let the science guide and develop the kind of things that they do,” Chen said.

“They’re a particular bugbear for a lot of the industries which are heavy donors to the Trump administration and to the right wing,” Chen said. “They’re the ones, I believe, who do all the testing that actually factors into the calculation of risk.”

ORD also was responsible for regularly doing assessments that the Clean Air Act requires on pollutants like ozone and particulate matter, which result from the combustion of fossil fuels.

Frey said a tremendous amount of ORD work has gone into ozone, which is the result of complex interactions of precursor pollutants in the atmosphere. The open source computer modeling on ozone transport, developed by ORD researchers, helps inform decision-makers grappling with how to address smog around the country. The Biden administration finalized stricter standards for particulate matter in its final year based on ORD’s risk assessment, and the Trump administration is now undoing those rules.

Aidan Hughes contributed to this report.

This story originally appeared on Inside Climate News.

Photo of Inside Climate News

Win for chemical industry as EPA shutters scientific research office Read More »

nearly-3,000-people-are-leaving-nasa,-and-this-director-is-one-of-them

Nearly 3,000 people are leaving NASA, and this director is one of them

You can add another name to the thousands of employees leaving NASA as the Trump administration primes the space agency for a 25 percent budget cut.

On Monday, NASA announced that Makenzie Lystrup will leave her post as director of the Goddard Space Flight Center on Friday, August 1. Lystrup has held the top job at Goddard since April 2023, overseeing a staff of more than 8,000 civil servants and contractor employees and a budget last year of about $4.7 billion.

These figures make Goddard the largest of NASA’s 10 field centers primarily devoted to scientific research and development of robotic space missions, with a budget and workforce comparable to NASA’s human spaceflight centers in Texas, Florida, and Alabama. Officials at Goddard manage the James Webb and Hubble telescopes in space, and Goddard engineers are assembling the Nancy Grace Roman Space Telescope, another flagship observatory scheduled for launch late next year.

“We’re grateful to Makenzie for her leadership at NASA Goddard for more than two years, including her work to inspire a Golden Age of explorers, scientists, and engineers,” Vanessa Wyche, NASA’s acting associate administrator, said in a statement.

Cynthia Simmons, Goddard’s deputy director, will take over as acting chief at the space center. Simmons started work at Goddard as a contract engineer 25 years ago.

Lystrup came to NASA from Ball Aerospace, now part of BAE Systems, where she managed the company’s work on civilian space projects for NASA and other federal agencies. Before joining Ball Aerospace, Lystrup earned a doctorate in astrophysics from University College London and conducted research as a planetary astronomer.

Formal dissent

The announcement of Lystrup’s departure from Goddard came hours after the release of an open letter to NASA’s interim administrator, Transportation Secretary Sean Duffy, signed by hundreds of current and former agency employees. The letter, titled “The Voyager Declaration,” identifies what the signatories call “recent policies that have or threaten to waste public resources, compromise human safety, weaken national security, and undermine the core NASA mission.”

Nearly 3,000 people are leaving NASA, and this director is one of them Read More »

southwestern-drought-likely-to-continue-through-2100,-research-finds

Southwestern drought likely to continue through 2100, research finds

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.

The drought in the Southwestern US is likely to last for the rest of the 21st century and potentially beyond as global warming shifts the distribution of heat in the Pacific Ocean, according to a study published last week led by researchers at the University of Texas at Austin.

Using sediment cores collected in the Rocky Mountains, paleoclimatology records, and climate models, the researchers found that warming driven by greenhouse gas emissions can alter patterns of atmospheric and marine heat in the North Pacific Ocean in a way that resembles the negative phase of the Pacific Decadal Oscillation (PDO), a fluctuation in sea surface temperatures that results in decreased winter precipitation in the American Southwest. But in this case, the phenomenon can last far longer than the PDO’s usual 30-year cycle.

“If the sea surface temperature patterns in the North Pacific were just the result of processes related to stochastic [random] variability in the past decade or two, we would have just been extremely unlucky, like a really bad roll of the dice,” said Victoria Todd, the lead author of the study and a PhD student in geosciences at University of Texas at Austin. “But if, as we hypothesize, this is a forced change in the sea surface temperatures in the North Pacific, this will be sustained into the future, and we need to start looking at this as a shift, instead of just the result of bad luck.”

Currently, the Southwestern US is experiencing a megadrought resulting in the aridification of the landscape, a decades-long drying of the region brought on by climate change and the overconsumption of the region’s water. That’s led to major rivers and their basins, such as the Colorado and Rio Grande rivers, seeing reduced flows and a decline of the water stored in underground aquifers, which is forcing states and communities to reckon with a sharply reduced water supply. Farmers have cut back on the amount of water they use. Cities are searching for new water supplies. And states, tribes, and federal agencies are engaging in tense negotiations over how to manage declining resources like the Colorado River going forward.

Southwestern drought likely to continue through 2100, research finds Read More »