

Rocket Report: Channeling the future at Wallops; SpaceX recovers rocket wreckage


China’s Space Pioneer seems to be back on track a year after an accidental launch during a ground test.

A SpaceX Falcon 9 rocket carrying a payload of 24 Starlink Internet satellites soars into space after launching from Vandenberg Space Force Base, California, shortly after sunset on July 18, 2025. This image was taken in Santee, California, approximately 250 miles (400 kilometers) away from the launch site. Credit: Kevin Carter/Getty Images

Welcome to Edition 8.04 of the Rocket Report! The Pentagon’s Golden Dome missile defense shield will be a lot of things. Along with new sensors, command and control systems, and satellites, Golden Dome will require a lot of rockets. The pieces of the Golden Dome architecture operating in orbit will ride to space on commercial launch vehicles. And Golden Dome’s space-based interceptors will essentially be designed as flying fuel tanks with rocket engines. This shouldn’t be overlooked, and that’s why we include a couple of entries discussing Golden Dome in this week’s Rocket Report.

As always, we welcome reader submissions. If you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets, as well as a quick look ahead at the next three launches on the calendar.

Space-based interceptors are a real challenge. The newly installed head of the Pentagon’s Golden Dome missile defense shield knows the clock is ticking to show President Donald Trump some results before the end of his term in the White House, Ars reports. Gen. Michael Guetlein identified command-and-control and the development of space-based interceptors as two of the most pressing technical challenges for Golden Dome. He believes the command-and-control problem can be “overcome in pretty short order.” The space-based interceptor piece of the architecture is a different story.

Proven physics, unproven economics … “I think the real technical challenge will be building the space-based interceptor,” Guetlein said. “That technology exists. I believe we have proven every element of the physics that we can make it work. What we have not proven is, first, can I do it economically, and then second, can I do it at scale? Can I build enough satellites to get after the threat? Can I expand the industrial base fast enough to build those satellites? Do I have enough raw materials, etc.?” Military officials haven’t said how many space-based interceptors will be required for Golden Dome, but outside estimates put the number in the thousands.


One big defense prime is posturing for Golden Dome. Northrop Grumman is conducting ground-based testing related to space-based interceptors as part of a competition for that segment of the Trump administration’s Golden Dome missile-defense initiative, The War Zone reports. Kathy Warden, Northrop Grumman’s CEO, highlighted the company’s work on space-based interceptors, as well as broader business opportunities stemming from Golden Dome, during a quarterly earnings call this week. Warden identified Northrop’s work in radars, drones, and command-and-control systems as potentially applicable to Golden Dome.

But here’s the real news … “It will also include new innovation, like space-based interceptors, which we’re testing now,” Warden continued. “These are ground-based tests today, and we are in competition, obviously, so not a lot of detail that I can provide here.” Warden declined to respond directly to a question about how the space-based interceptors Northrop Grumman is developing now will actually defeat their targets. (submitted by Biokleen)

Trump may slash environmental rules for rocket launches. The Trump administration is considering slashing rules meant to protect the environment and the public during commercial rocket launches, changes that companies like Elon Musk’s SpaceX have long sought, ProPublica reports. A draft executive order being circulated among federal agencies, and viewed by ProPublica, directs Secretary of Transportation Sean Duffy to “use all available authorities to eliminate or expedite” environmental reviews for launch licenses. It could also, in time, require states to allow more launches or even more launch sites along their coastlines.

Getting political at the FAA … The order is a step toward the rollback of federal oversight that Musk, who has fought bitterly with the Federal Aviation Administration over his space operations, and others have pushed for. Commercial rocket launches have grown exponentially more frequent in recent years. In addition to slashing environmental rules, the draft executive order would make the head of the FAA’s Office of Commercial Space Transportation a political appointee. This is currently a civil servant position, but the last head of the office took a voluntary separation offer earlier this year.

There’s a SPAC for that. An unproven small launch startup is partnering with a severely depleted SPAC trust to do the impossible: go public in a deal they say will be valued at $400 million, TechCrunch reports. Innovative Rocket Technologies Inc., or iRocket, is set to merge with a Special Purpose Acquisition Company, or SPAC, founded by former Commerce Secretary Wilbur Ross. But the most recent regulatory filings by this SPAC showed it was in a tenuous financial position last year, with just $1.6 million held in trust. Likewise, iRocket isn’t flooded with cash. The company has raised only a few million in venture funding, a fraction of what would be needed to develop and test the company’s small orbital-class rocket, named Shockwave.

SpaceX traces a path to orbit for NASA. Two NASA satellites soared into orbit from California aboard a SpaceX Falcon 9 rocket Wednesday, commencing a $170 million mission to study a phenomenon of space physics that has eluded researchers since the dawn of the Space Age, Ars reports. The twin spacecraft are part of the NASA-funded TRACERS mission, which will spend at least a year measuring plasma conditions in narrow regions of Earth’s magnetic field known as polar cusps. As the name suggests, these regions are located over the poles. They play an important but poorly understood role in creating colorful auroras as plasma streaming out from the Sun interacts with the magnetic field surrounding Earth. The same process drives geomagnetic storms capable of disrupting GPS navigation, radio communications, electrical grids, and satellite operations.

Plenty of room for more … The TRACERS satellites are relatively small, each about the size of a washing machine, so they filled only a fraction of the capacity of SpaceX’s Falcon 9 rocket. Three other small NASA tech demo payloads hitched a ride to orbit with TRACERS, kicking off missions to test an experimental communications terminal, demonstrate an innovative scalable satellite platform made of individual building blocks, and study the link between Earth’s atmosphere and the Van Allen radiation belts. In addition to those missions, the European Space Agency launched its own CubeSat to test 5G communications from orbit. Five smallsats from an Australian company rounded out the group. Still, TRACERS and its co-passengers added up to less than a quarter of the payload mass the Falcon 9 could have delivered to the mission’s targeted Sun-synchronous orbit.

Tianlong launch pad ready for action. Chinese startup Space Pioneer has completed a launch pad at Jiuquan spaceport in northwestern China for its Tianlong 3 liquid-propellant rocket ahead of a first orbital launch, Space News reports. Space Pioneer said the launch pad passed an acceptance test, and ground crews raised a full-scale model of the Tianlong 3 rocket on the pad. “The rehearsal test was successfully completed,” said Space Pioneer, one of China’s leading private launch companies. The activation of the launch pad followed a couple of weeks after Space Pioneer announced the completion of static loads testing on Tianlong 3.

More to come … While this is an important step forward for Space Pioneer, construction of the launch pad is just one element the company needs to finish before Tianlong 3 can lift off for the first time. In June 2024, the company ignited Tianlong 3’s nine-engine first stage on a test stand in China. But the rocket broke free of its moorings on the test stand and unexpectedly climbed into the sky before crashing in a fireball nearby. Space Pioneer says the “weak design of the rocket’s tail structure was the direct cause of the failure” last year. The company hasn’t identified next steps for Tianlong 3, or when it might be ready to fly. Tianlong 3 is a kerosene-fueled rocket with nine main engines, similar in design architecture and payload capacity to SpaceX’s Falcon 9. Also, like Falcon 9, Tianlong 3 is supposed to have a recoverable and reusable first stage booster.

Dredging up an issue at Wallops. Rocket Lab has asked regulators for permission to transport oversized Neutron rocket structures through shallow waters to a spaceport off the coast of Virginia as it races to meet a September delivery deadline, TechCrunch reports. The request, which was made in July, is a temporary stopgap while the company awaits federal clearance to dredge a permanent channel to the Wallops Island site. Rocket Lab plans to launch its Neutron medium-lift rocket from the Mid-Atlantic Regional Spaceport (MARS) on Wallops Island, Virginia, a lower-traffic spaceport that’s surrounded by shallow channels and waterways. Rocket Lab has a sizable checklist to tick off before Neutron can make its orbital debut, like mating the rocket stages, performing a “wet dress” rehearsal, and getting its launch license from the Federal Aviation Administration. Before any of that can happen, the rocket hardware needs to make it onto the island from Rocket Lab’s factory on the nearby mainland.

Kedging bets … Access to the channel leading to Wallops Island is currently available only at low tides. So, Rocket Lab submitted an application earlier this year to dredge the channel. The dredging project was approved by the Virginia Marine Resources Commission in May, but the company has yet to start digging because it’s still awaiting federal sign-off from the Army Corps of Engineers. As the company waits for federal approval, Rocket Lab is seeking permission to use a temporary method called “kedging” to ensure the first five hardware deliveries can arrive on schedule starting in September. We don’t cover maritime issues in the Rocket Report, but if you’re interested in learning a little about kedging, here’s a link.

Any better ideas for an Exploration Upper Stage? Not surprisingly, Congress is pushing back against the Trump administration’s proposal to cancel the Space Launch System, the behemoth rocket NASA has developed to propel astronauts back to the Moon. But legislation making its way through the House of Representatives includes an interesting provision that would direct NASA to evaluate alternatives for the Boeing-built Exploration Upper Stage, an upgrade for the SLS rocket set to debut on its fourth flight, Ars reports. Essentially, the House Appropriations Committee is telling NASA to look for cheaper, faster options for a new SLS upper stage.

CYA EUS? The four-engine Exploration Upper Stage, or EUS, is an expensive undertaking. Last year, NASA’s inspector general reported that the new upper stage’s development costs had ballooned from $962 million to $2.8 billion, and the project had been delayed more than six years. That’s almost a year-for-year delay since NASA and Boeing started development of the EUS. So, what are NASA’s options for a new SLS upper stage? One possibility is a modified version of United Launch Alliance’s dual-engine Centaur V upper stage that flies on the Vulcan rocket. It’s no longer possible to keep flying the SLS rocket’s existing single-engine upper stage because ULA has shut down the production line for it.

Raising Super Heavy from the deep. For the second time, SpaceX has retrieved an engine section from one of its Super Heavy boosters from the Gulf of Mexico, NASASpaceflight.com reports. Images posted on social media showed the tail end of a Super Heavy booster being raised from the sea off the coast of northern Mexico. Most of the rocket’s 33 Raptor engines appear to still be attached to the lower section of the stainless steel booster. Online sleuths who closely track SpaceX’s activities at Starbase, Texas, have concluded the rocket recovered from the Gulf is Booster 13, which flew on the sixth test flight of the Starship mega-rocket last November. The booster ditched in the ocean after aborting an attempted catch back at the launch pad in South Texas.

But why? … SpaceX recovered the engine section of a different Super Heavy booster from the Gulf last year. The company’s motivation for salvaging the wreckage is unclear. “Speculated reasons include engineering research, environmental mitigation, or even historical preservation,” NASASpaceflight reports.

Next three launches

July 26: Vega C | CO3D & MicroCarb | Guiana Space Center, French Guiana | 02:03 UTC

July 26: Falcon 9 | Starlink 10-26 | Cape Canaveral Space Force Station, Florida | 08:34 UTC

July 27: Falcon 9 | Starlink 17-2 | Vandenberg Space Force Base, California | 03:55 UTC


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.



Lawmakers writing NASA’s budget want a cheaper upper stage for the SLS rocket


Eliminating the Block 1B upgrade now would save NASA at least $500 million per year.

Artist’s illustration of the Boeing-developed Exploration Upper Stage, with four hydrogen-fueled RL10 engines. Credit: NASA

Not surprisingly, Congress is pushing back against the Trump administration’s proposal to cancel the Space Launch System, the behemoth rocket NASA has developed to propel astronauts back to the Moon.

Spending bills making their way through both houses of Congress reject the White House’s plan to wind down the SLS rocket after two more launches, but the text of a draft budget recently released by the House Appropriations Committee suggests an openness to making some major changes to the program.

The next SLS flight, called Artemis II, is scheduled to lift off early next year to send a crew of four astronauts around the far side of the Moon. Artemis III will follow a few years later on a mission to attempt a crewed lunar landing at the Moon’s south pole. These missions follow Artemis I, a successful unpiloted test flight in 2022.

After Artemis III, the official policy of the Trump administration is to terminate the SLS program, along with the Orion crew capsule designed to launch on top of the rocket. The White House also proposed canceling NASA’s Gateway, a mini-space station to be placed in orbit around the Moon. NASA would instead procure commercial launches and commercial spacecraft to ferry astronauts between the Earth and the Moon, while focusing the agency’s long-term gaze toward Mars.

CYA EUS?

House and Senate appropriations bills would preserve SLS, Orion, and the Gateway. However, the House version of NASA’s budget has an interesting paragraph directing NASA to explore cheaper, faster options for a new SLS upper stage.

NASA has tasked Boeing, which also builds SLS core stages, to develop an Exploration Upper Stage for debut on the Artemis IV mission, the fourth flight of the Space Launch System. This new upper stage would have large propellant tanks and carry four engines instead of the single engine used on the rocket’s interim upper stage, which NASA is using for the first three SLS flights.

The House version of NASA’s fiscal year 2026 budget raises questions about the long-term future of the Exploration Upper Stage. In one section of the bill, House lawmakers would direct NASA to “evaluate alternatives to the current Exploration Upper Stage (EUS) design for SLS.” The committee members wrote the evaluation should focus on reducing development and production costs, shortening the schedule, and maintaining the SLS rocket’s lift capability.

“NASA should also evaluate how alternative designs could support the long-term evolution of SLS and broader exploration goals beyond low-Earth orbit,” the lawmakers wrote. “NASA is directed to assess various propulsion systems, stage configurations, infrastructure compatibility, commercial and international collaboration opportunities, and the cost and schedule impacts of each alternative.”

The SLS rocket is expensive, projected to cost at least $2.5 billion per launch, not counting development costs or expenses related to the Orion spacecraft and the ground systems required to launch it at Kennedy Space Center in Florida. Those figures bring the total cost of an Artemis mission using SLS and Orion to more than $4 billion, according to NASA’s inspector general.

NASA’s Block 1B version of the SLS rocket will be substantially larger than Block 1. Credit: NASA

The EUS is likewise an expensive undertaking. Last year, NASA’s inspector general reported that the new upper stage’s development costs had ballooned from $962 million to $2.8 billion, and the Boeing-led project had been delayed more than six years. The version of the SLS rocket with the EUS, known as Block 1B, is supposed to deliver a 40 percent increase in performance over the Block 1 configuration used on the first three Space Launch System flights. Overall, NASA’s inspector general projected Block 1B’s development costs to total $5.7 billion.

Eliminating the Block 1B upgrade now would save NASA at least $500 million per year, and perhaps more if NASA could also end work on a costly mobile launch tower specifically designed to support SLS Block 1B missions.

NASA can’t go back to the interim upper stage, which is based on the design of the upper stage that flew on United Launch Alliance’s (ULA’s) now-retired Delta IV Heavy rocket. ULA has shut down its Delta production line, so there’s no way to build any more. What ULA does have is a new high-energy upper stage called Centaur V. This upper stage is sized for ULA’s new Vulcan rocket, with more capability than the interim upper stage but with lower performance than the larger EUS.

A season of compromise, maybe

Ars’ Eric Berger wrote last year about the possibility of flying the Centaur V upper stage on SLS missions.

Incorporating the Centaur V wouldn’t maintain the SLS rocket’s lift capability, as the House committee calls for in its appropriations bill. The primary reason for improving the rocket’s performance is to give SLS Block 1B enough oomph to carry “co-manifested” payloads, meaning it can launch an Orion crew capsule and equipment for NASA’s Gateway lunar space station on a single flight. The lunar Gateway is also teed up for cancellation in Trump’s budget proposal, but both congressional appropriations bills would save it, too. If the Gateway escapes cancellation, there are ways to launch its modules on commercial rockets.

Blue Origin also has an upper stage that could conceivably fly on the Space Launch System. But the second stage for Blue Origin’s New Glenn rocket would be a more challenging match for SLS for several reasons, chiefly its 7-meter (23-foot) diameter—too wide to be a drop-in replacement for the interim upper stage used on Block 1. ULA’s Centaur V is much closer in size to the existing upper stage.

The House budget bill has passed a key subcommittee vote but won’t receive a vote from the full appropriations committee until after Congress’s August recess. A markup of the bill by the House Appropriations Committee scheduled for Thursday was postponed after Speaker Mike Johnson announced an early start to the recess this week.

Ars reported last week on the broad strokes of how the House and Senate appropriations bills would affect NASA. Since then, members of the House Appropriations Committee released the text of the report attached to their version of the NASA budget. The report, which includes the paragraph on the Exploration Upper Stage, provides policy guidance and more detailed direction on where NASA should spend its money.

The House’s draft budget includes $2.5 billion for the Space Launch System, close to this year’s funding level and $500 million more than the Trump administration’s request for the next fiscal year, which begins October 1. The budget would continue development of SLS Block 1B and the Exploration Upper Stage while NASA completes a six-month study of alternatives.

The report attached to the Senate appropriations bill for NASA has no specific instructions regarding the Exploration Upper Stage. But like the House bill, the Senate’s draft budget directs NASA to continue ordering spares and long-lead parts for SLS and Orion missions beyond Artemis III. Both versions of the NASA budget require the agency to continue with SLS and Orion until a suitable commercial, human-rated rocket and crew vehicle are proven ready for service.

In a further indication of Congress’ position on the SLS and Orion programs, lawmakers set aside more than $4 billion for the procurement of SLS rockets for the Artemis IV and Artemis V missions in the reconciliation bill signed into law by President Donald Trump earlier this month.

Congress must pass a series of federal appropriations bills by October 1, when funding for the current fiscal year runs out. If Congress doesn’t act by then, it could pass a continuing resolution to maintain funding at levels close to this year’s budget or face a government shutdown.

Lawmakers will reconvene in Washington, DC, in early September in hopes of finishing work on the fiscal year 2026 budget. The section of the budget that includes NASA still must go through a markup hearing by the House Appropriations Committee and pass floor votes in the House and Senate. Then the two chambers will have to come to a compromise on the differences in their appropriations bills. Only then can the budget be put to another vote in each chamber and go to the White House for Trump’s signature.




SpaceX launches a pair of NASA satellites to probe the origins of space weather


“This is going to really help us understand how to predict space weather in the magnetosphere.”

This artist’s illustration of the Earth’s magnetosphere shows the solar wind (left) streaming from the Sun, and then most of it being blocked by Earth’s magnetic field. The magnetic field lines seen here fold in toward Earth’s surface at the poles, creating polar cusps. Credit: NASA/Goddard Space Flight Center

Two NASA satellites rocketed into orbit from California aboard a SpaceX Falcon 9 rocket Wednesday, commencing a $170 million mission to study a phenomenon of space physics that has eluded researchers since the dawn of the Space Age.

The twin spacecraft are part of the NASA-funded TRACERS mission, which will spend at least a year measuring plasma conditions in narrow regions of Earth’s magnetic field known as polar cusps. As the name suggests, these regions are located over the poles. They play an important but poorly understood role in creating colorful auroras as plasma streaming out from the Sun interacts with the magnetic field surrounding Earth.

The same process drives geomagnetic storms capable of disrupting GPS navigation, radio communications, electrical grids, and satellite operations. These outbursts are usually triggered by solar flares or coronal mass ejections that send blobs of plasma out into the Solar System. If one of these flows happens to be aimed at Earth, we are treated to auroras but left vulnerable to the storm’s harmful effects.

For example, an extreme geomagnetic storm last year degraded GPS navigation signals, resulting in more than $500 million in economic losses in the agriculture sector as farms temporarily suspended spring planting. In 2022, a period of elevated solar activity contributed to the loss of 40 SpaceX Starlink satellites.

“Understanding our Sun and the space weather it produces is more important to us here on Earth, I think, than most realize,” said Joe Westlake, director of NASA’s heliophysics division.

NASA’s two TRACERS satellites launched Wednesday aboard a SpaceX Falcon 9 rocket from Vandenberg Space Force Base, California. Credit: SpaceX

The launch of TRACERS was delayed 24 hours after a regional power outage disrupted air traffic control over the Pacific Ocean near the Falcon 9 launch site on California’s Central Coast, according to the Federal Aviation Administration. SpaceX called off the countdown Tuesday less than a minute before liftoff, then rescheduled the flight for Wednesday.

TRACERS, short for Tandem Reconnection and Cusp Electrodynamics Reconnaissance Satellites, will study a process known as magnetic reconnection. As particles in the solar wind head out into the Solar System at up to 1 million mph, they bring along pieces of the Sun’s magnetic field. When the solar wind reaches our neighborhood, it begins interacting with Earth’s magnetic field.

The high-energy collision breaks and reconnects magnetic field lines, flinging solar wind particles across Earth’s magnetosphere at speeds that can approach the speed of light. Earth’s field draws some of these particles into the polar cusps, down toward the upper atmosphere. This is what creates dazzling auroral light shows and potentially damaging geomagnetic storms.

Over our heads

But scientists still aren’t sure how it all works, despite the fact that it’s happening right over our heads, within reach of countless satellites in low-Earth orbit. And a single spacecraft won’t do the job. Scientists need at least two spacecraft, each in a bespoke polar orbit and specially instrumented to measure magnetic fields, electric fields, electrons, and ions.

That’s because magnetic reconnection is a dynamic process, and a single satellite would provide just a snapshot of conditions over the polar cusps every 90 minutes. By the time the satellite comes back around on another orbit, conditions will have changed, but scientists wouldn’t know how or why, according to David Miles, principal investigator for the TRACERS mission at the University of Iowa.

“You can’t tell, is that because the system itself is changing?” Miles said. “Is that because this magnetic reconnection, the coupling process, is moving around? Is it turning on and off, and if it’s turning on and off, how quickly can it do it? Those are fundamental things that we need to understand… how the solar wind arriving at the Earth does or doesn’t transfer energy to the Earth system, which has this downstream effect of space weather.”

This is why the tandem part of the TRACERS name is important. The novel part of this mission is it features two identical spacecraft, each about the size of a washing machine flying at an altitude of 367 miles (590 kilometers). Over the course of the next few weeks, the TRACERS satellites will drift into a formation with one trailing the other by about two minutes as they zip around the world at nearly five miles per second. This positioning will allow the satellites to sample the polar cusps one right after the other, instead of forcing scientists to wait another 90 minutes for a data refresh.
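The numbers in that paragraph hang together, and it’s easy to check them. The sketch below is a rough back-of-envelope calculation, assuming a circular orbit and standard values for Earth’s radius and gravitational parameter, that reproduces the quoted orbital speed, the roughly 90-minute revisit time, and the distance implied by a two-minute trailing separation:

```python
import math

MU_EARTH = 398_600.4418  # km^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371.0        # km, mean Earth radius

altitude_km = 590.0                # TRACERS altitude (367 miles)
a = R_EARTH + altitude_km          # orbital radius, assuming a circular orbit

# Circular-orbit speed and period
v_km_s = math.sqrt(MU_EARTH / a)                             # ~7.6 km/s (~4.7 mi/s)
period_min = 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60   # ~96 minutes

# Distance between the two satellites with a two-minute trailing gap
separation_km = v_km_s * 120                                 # ~900 km

print(f"speed:      {v_km_s:.2f} km/s")
print(f"period:     {period_min:.1f} min")
print(f"separation: {separation_km:.0f} km")
```

The period works out to about 96 minutes, consistent with the article’s "every 90 minutes" revisit figure, and the two-minute gap puts the trailing satellite roughly 900 kilometers behind its twin.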

With TRACERS, scientists hope to pick apart smaller, fast-moving changes with each satellite pass. Within a year, TRACERS should collect 3,000 measurements of magnetic reconnections, a sample size large enough to start identifying why some space weather events evolve differently than others.

“Not only will it get a global picture of reconnection in the magnetosphere, but it’s also going to be able to statistically study how reconnection depends on the state of the solar wind,” said John Dorelli, TRACERS mission scientist at NASA’s Goddard Space Flight Center. “This is going to really help us understand how to predict space weather in the magnetosphere.”

One of the two TRACERS satellites undergoes launch preparations at Millennium Space Systems, the spacecraft’s manufacturer. Credit: Millennium Space Systems

“If we can understand these various different situations, whether it happens suddenly if you have one particular kind of event, or it happens in lots of different places, then we have a better way to model that and say, ‘Ah, here’s the likelihood of seeing a certain kind of effect that would affect humans,'” said Craig Kletzing, the principal investigator who led the TRACERS science team until his death in 2023.

There is broader knowledge to be gained with a mission like TRACERS. Magnetic reconnection is ubiquitous throughout the Universe, and the same physical processes produce solar flares and coronal mass ejections from the Sun.

Hitchhiking to orbit

Several other satellites shared the ride to space with TRACERS on Wednesday.

These secondary payloads included a NASA-sponsored mission named PExT, a small technology demonstration satellite carrying an experimental communications package capable of connecting with three different networks: NASA’s government-owned Tracking and Data Relay Satellites (TDRS) and commercial satellite networks owned by SES and Viasat.

What’s unique about the Polylingual Experimental Terminal, or PExT, is its ability to roam across multiple satellite relay networks. The International Space Station and other satellites in low-Earth orbit currently connect to controllers on the ground through NASA’s TDRS satellites. But NASA will retire its TDRS satellites in the 2030s and begin purchasing data relay services using commercial satellite networks.

The space agency expects to have multiple data relay providers, so radios on future NASA satellites must be flexible enough to switch between networks mid-mission. PExT is a pathfinder for these future missions.

Another NASA-funded tech demo named Athena EPIC was also aboard the Falcon 9 rocket. Led by NASA’s Langley Research Center, this mission uses a scalable satellite platform developed by a company named NovaWurks, using building blocks to piece together everything a spacecraft needs to operate in space.

Athena EPIC hosts a single science instrument to measure how much energy Earth radiates into space, an important data point for climate research. But the mission’s real goal is to showcase how an adaptable satellite design, such as this one using NovaWurks’ building block approach, might be useful for future NASA missions.

A handful of other payloads rounded out the payload list for Wednesday’s launch. They included REAL, a NASA-funded CubeSat project to investigate the Van Allen radiation belts and space weather, and LIDE, an experimental 5G communications satellite backed by the European Space Agency. Five commercial spacecraft from the Australian company Skykraft also launched to join a constellation of small satellites to provide tracking and voice communications between air traffic controllers and aircraft over remote parts of the world.


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.


Conspiracy theorists don’t realize they’re on the fringe


Gordon Pennycook: “It might be one of the biggest false consensus effects that’s been observed.”

Credit: Aurich Lawson / Thinkstock

Belief in conspiracy theories is often attributed to some form of motivated reasoning: People want to believe a conspiracy because it reinforces their worldview, for example, or doing so meets some deep psychological need, like wanting to feel unique. However, it might also be driven by overconfidence in their own cognitive abilities, according to a paper published in the Personality and Social Psychology Bulletin. The authors were surprised to discover that not only are conspiracy theorists overconfident, they also don’t realize their beliefs are on the fringe, massively overestimating by as much as a factor of four how much other people agree with them.

“I was expecting the overconfidence finding,” co-author Gordon Pennycook, a psychologist at Cornell University, told Ars. “If you’ve talked to someone who believes conspiracies, it’s self-evident. I did not expect them to be so ready to state that people agree with them. I thought that they would overestimate, but I didn’t think that there’d be such a strong sense that they are in the majority. It might be one of the biggest false consensus effects that’s been observed.”

In 2015, Pennycook made headlines when he co-authored a paper demonstrating how certain people interpret “pseudo-profound bullshit” as deep observations. Pennycook et al. were interested in identifying individual differences between those who are susceptible to pseudo-profound BS and those who are not, so they also looked at participants’ conspiracy beliefs, degree of analytical thinking, religious beliefs, and so forth.

They presented several randomly generated statements, containing “profound” buzzwords, that were grammatically correct but made no sense logically, along with a 2014 tweet by Deepak Chopra that met the same criteria. They found that the less skeptical participants were less logical and analytical in their thinking and hence much more likely to consider these nonsensical statements as being deeply profound. That study was a bit controversial, in part for what was perceived to be its condescending tone, along with questions about its methodology. But it did snag Pennycook et al. a 2016 Ig Nobel Prize.

Last year we reported on another Pennycook study, presenting results from experiments in which an AI chatbot engaged in conversations with people who believed at least one conspiracy theory. That study showed that the AI interaction significantly reduced the strength of those beliefs, even two months later. The secret to its success: the chatbot, with its access to vast amounts of information across an enormous range of topics, could precisely tailor its counterarguments to each individual. “The work overturns a lot of how we thought about conspiracies, that they’re the result of various psychological motives and needs,” Pennycook said at the time.

Miscalibrated from reality

Pennycook has been working on this new overconfidence study since 2018, perplexed by observations indicating that people who believe in conspiracies also seem to have a lot of faith in their cognitive abilities—contradicting prior research finding that conspiracists are generally more intuitive. To investigate, he and his co-authors conducted eight separate studies that involved over 4,000 US adults.

The assigned tasks were designed in such a way that participants’ actual performance and how they perceived their performance were unrelated. For example, in one experiment, they were asked to guess the subject of an image that was largely obscured. The subjects were then asked direct questions about their belief (or lack thereof) concerning several key conspiracy claims: that the Apollo Moon landings were faked, for example, or that Princess Diana’s death wasn’t an accident. Four of the studies focused on testing how subjects perceived others’ beliefs.

The results showed a marked association between subjects’ tendency to be overconfident and belief in conspiracy theories. And while a conspiracy’s claims were believed by a majority of participants just 12 percent of the time, believers thought they were in the majority 93 percent of the time. This suggests that overconfidence is a primary driver of belief in conspiracies.

It’s not that believers in conspiracy theories are massively overconfident; there is no data on that, because the studies didn’t set out to quantify the degree of overconfidence, per Pennycook. Rather, “They’re overconfident, and they massively overestimate how much people agree with them,” he said.
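The miscalibration being described, actual agreement versus believers’ perceived agreement, is simple to compute from survey responses. Here is a brief, hypothetical Python sketch; the function name and the numbers (patterned on the figures quoted in this article) are made up for illustration and are not the paper’s data or code.

```python
# Toy illustration of a false-consensus gap: compare how many people
# actually endorse a claim with how much agreement endorsers *think* they have.

def consensus_gap(believes, perceived_agreement):
    """believes: list of bools (does each respondent endorse the claim?).
    perceived_agreement: parallel list of each respondent's estimate of
    the percentage of people who agree with them (0-100)."""
    actual_pct = 100 * sum(believes) / len(believes)
    # Average perceived agreement among believers only
    believer_estimates = [p for b, p in zip(believes, perceived_agreement) if b]
    perceived_pct = sum(believer_estimates) / len(believer_estimates)
    return actual_pct, perceived_pct

# Made-up sample echoing the Sandy Hook example quoted below: 8 of 100
# respondents endorse the claim, and those 8 each guess that 61 percent
# of people agree with them (non-believers' estimates are ignored).
believes = [True] * 8 + [False] * 92
perceived = [61] * 8 + [50] * 92
actual, perceived_mean = consensus_gap(believes, perceived)
print(actual, perceived_mean)  # → 8.0 61.0
```

The gap between the two numbers is the false-consensus effect the study quantifies.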

Ars spoke with Pennycook to learn more.

Ars Technica: Why did you decide to investigate overconfidence as a contributing factor to believing conspiracies?

Gordon Pennycook: There’s a popular sense that people believe conspiracies because they’re dumb and don’t understand anything, they don’t care about the truth, and they’re motivated by believing things that make them feel good. Then there’s the academic side, where that idea molds into a set of theories about how needs and motivations drive belief in conspiracies. It’s not someone falling down the rabbit hole and getting exposed to misinformation or conspiratorial narratives. They’re strolling down: “I like it over here. This appeals to me and makes me feel good.”

Believing things that no one else agrees with makes you feel unique. Then there’s various things I think that are a little more legitimate: People join communities and there’s this sense of belongingness. How that drives core beliefs is different. Someone may stop believing but hang around in the community because they don’t want to lose their friends. Even with religion, people will go to church when they don’t really believe. So we distinguish beliefs from practice.

What we observed is that they do tend to strongly believe these conspiracies despite the fact that there’s counter evidence or a lot of people disagree. What would lead that to happen? It could be their needs and motivations, but it could also be that there’s something about the way that they think where it just doesn’t occur to them that they could be wrong about it. And that’s where overconfidence comes in.

Ars Technica: What makes this particular trait such a powerful driving force?

Gordon Pennycook: Overconfidence is one of the most important core underlying components, because if you’re overconfident, it stops you from really questioning whether the thing that you’re seeing is right or wrong, and whether you might be wrong about it. You have an almost moral purity of complete confidence that the thing you believe is true. You cannot even imagine what it’s like from somebody else’s perspective. You couldn’t imagine a world in which the things that you think are true could be false. Having overconfidence is that buffer that stops you from learning from other people. You end up not just going down the rabbit hole, you’re doing laps down there.

Overconfidence doesn’t have to be learned, parts of it could be genetic. It also doesn’t have to be maladaptive. It’s maladaptive when it comes to beliefs. But you want people to think that they will be successful when starting new businesses. A lot of them will fail, but you need some people in the population to take risks that they wouldn’t take if they were thinking about it in a more rational way. So it can be optimal at a population level, but maybe not at an individual level.

Ars Technica: Is this overconfidence related to the well-known Dunning-Kruger effect?

Gordon Pennycook: It’s because of Dunning-Kruger that we had to develop a new methodology to measure overconfidence, because the people who are the worst at a task are the worst at knowing that they’re the worst at the task. But that’s because the same things that you use to do the task are the things you use to assess how good you are at the task. So if you were to give someone a math test and they’re bad at math, they’ll appear overconfident. But if you give them a test of assessing humor and they’re good at that, they won’t appear overconfident. That’s about the task, not the person.

So we have tasks where people essentially have to guess, and it’s transparent. There’s no reason to think that you’re good at the task. In fact, people who think they’re better at the task are not better at it, they just think they are. They just have this underlying kind of sense that they can do things, they know things, and that’s the kind of thing that we’re trying to capture. It’s not specific to a domain. There are lots of reasons why you could be overconfident in a particular domain. But this is something that’s an actual trait that you carry into situations. So when you’re scrolling online and come up with these ideas about how the world works that don’t make any sense, it must be everybody else that’s wrong, not you.

Ars Technica: Overestimating how many people agree with them seems to be at odds with conspiracy theorists’ desire to be unique.  

Gordon Pennycook: In general, people who believe conspiracies often have contrary beliefs. We’re working with a population where coherence is not to be expected. They say that they’re in the majority, but it’s never a strong majority. They just don’t think that they’re in a minority when it comes to the belief. Take the case of the Sandy Hook conspiracy, where adherents believe it was a false flag operation. In one sample, 8 percent of people thought that this was true. That 8 percent thought 61 percent of people agreed with them.

So they’re way off. They really, really miscalibrated. But they don’t say 90 percent. It’s 60 percent, enough to be special, but not enough to be on the fringe where they actually are. I could have asked them to rank how smart they are relative to others, or how unique they thought their beliefs were, and they would’ve answered high on that. But those are kind of mushy self-concepts. When you ask a specific question that has an objectively correct answer in terms of the percent of people in the sample that agree with you, it’s not close.

Ars Technica: How does one even begin to combat this? Could last year’s AI study point the way?

Gordon Pennycook: The AI debunking effect works better for people who are less overconfident. In those experiments, very detailed, specific debunks had a much bigger effect than people expected. After eight minutes of conversation, a quarter of the people who believed the thing didn’t believe it anymore, but 75 percent still did. That’s a lot. And some of them, not only did they still believe it, they still believed it to the same degree. So no one’s cracked that. Getting any movement at all in the aggregate was a big win.

Here’s the problem. You can’t have a conversation with somebody who doesn’t want to have the conversation. In those studies, we’re paying people, but they still get out what they put into the conversation. If you don’t really respond or engage, then our AI is not going to give you good responses because it doesn’t know what you’re thinking. And if the person is not willing to think. … This is why overconfidence is such an overarching issue. The only alternative is some sort of propagandistic sit-them-downs with their eyes open and try to de-convert them. But you can’t really convert someone who doesn’t want to be converted. So I’m not sure that there is an answer. I think that’s just the way that humans are.

Personality and Social Psychology Bulletin, 2025. DOI: 10.1177/01461672251338358  (About DOIs).


Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.


Marine biologist for a day: Ars goes shark tagging


We did not need a bigger boat

Go shark fishing on the RV Garvin, get hooked on the ideas behind it.


Field School staff made sure the day out was a safe and satisfying experience.


MIAMI—We were beginning to run out of bait, and the sharks weren’t cooperating.

Everybody aboard the Research Vessel Garvin had come to Miami for the sharks—to catch them, sample them, and tag them, all in the name of science. People who once wanted to be marine biologists, actual marine biologists, shark enthusiasts, the man who literally wrote the book Why Sharks Matter, and various friends and family had spent much of the day sending fish heads set with hooks over the side of the Garvin. But each time the line was hauled back in, it came in slack, with nothing but half-eaten bait or an empty hook at the end.

And everyone was getting nervous.

I: “No assholes”

The Garvin didn’t start out as a research vessel. Initially, it was a dive boat that took people to wrecks on the East Coast. Later, owner Hank Garvin used it to take low-income students from New York City and teach them how to dive, getting them scuba certified. But when Garvin died, his family put the boat, no longer in prime condition, on the market.

A thousand miles away in Florida, Catherine MacDonald was writing “no assholes” on a Post-it note.

At the time, MacDonald was the coordinator of a summer internship program at the University of Miami, where she was a PhD student. And even at that stage in her career, she and her colleagues had figured out that scientific field work had a problem.

“Science in general does not have a great reputation of being welcoming and supportive and inclusive and kind,” said David Shiffman, author of the aforementioned book and a grad school friend of MacDonald’s. “Field science is perhaps more of a problem than that. And field science involving what are called charismatic megafauna, the big animals that everyone loves, is perhaps worse than that. It’s probably because a lot of people want to do this, which means if we treat someone poorly and they quit, it’s not going to be long before someone else wants to fill the spot.”

MacDonald and some of her colleagues—Christian Pankow, Jake Jerome, Nick Perni, and Julia Wester (a lab manager and some fellow grad students at the time)—were already doing their best to work against these tendencies at Miami and help people learn how to do field work in a supportive environment. “I don’t think that you can scream abuse at students all day long and go home and publish great science,” she said, “because I don’t think that the science itself escapes the process through which it was generated.”

So they started to think about how they might extend that to the wider ocean science community. The “no assholes” Post-it became a bit of a mission statement, one that MacDonald says now sits in a frame in her office. “We decided out the gate that the point of doing this in part was to make marine science more inclusive and accessible and that if we couldn’t do that and be a successful business, then we were just going to fail,” she told Ars. “That’s kind of the plan.”

But to do it properly, they needed a boat. And that meant they needed money. “We borrowed from our friends and family,” MacDonald said. “I took out a loan on my house. It was just our money and all of the money that people who loved us were willing to sink into the project.”

Even that might not have been quite enough to afford a badly run-down boat. But the team made a personal appeal to Hank Garvin’s family. “They told the family who was trying to offload the boat, ‘Maybe someone else can pay you more for it, but here’s what we’re going to use it for, and also we’ll name the boat after your dad,'” Shiffman said. “And they got it.”

For the day, everybody who signed up had the chance to do most of the work that scientists normally would. Julia Saltzman

But it wasn’t enough to launch what would become the Field School. The Garvin was in good enough shape to navigate to Florida, but it needed considerable work before it could receive all the Coast Guard certifications required to get a Research Vessel designation. And given the team’s budget, that mostly meant the people launching the Field School had to learn to do the work themselves.

“One of [co-founder] Julia’s good friends was a boat surveyor, and he introduced us to a bunch of people who taught us skills or introduced us to someone else who could fix the alignment of our propellers or could suggest this great place in Louisiana that we could send the transmissions for rebuilding or could help us figure out which paints to use,” MacDonald said.

“We like to joke that we are the best PhD-holding fiberglassers in Miami,” she told Ars. “I don’t actually know if that’s true. I couldn’t prove it. But we just kind of jumped off the cliff together in terms of trying to make it work. Although we certainly had to hire folks to help us with a variety of projects, including building a new fuel tank because we are not the best PhD-holding welders in Miami for certain.”

II: Fishing for sharks

On the now fully refurbished Garvin, we were doing drum-line fishing. This involved a 16 kg (35-pound) weight connected to some floats by an extremely thick piece of rope. Also linked to the weight was a significant amount of 800-pound test line (meaning a monofilament polymer that won’t snap until something exerts over 800 lbs/360 kg of force on it) with a hook at the end. Most species of sharks need to keep swimming to force water over their gills or else suffocate; the length of the line allows them to swim in circles around the weight. The hook is also shaped to minimize damage to the fish during removal.

To draw sharks to the drum line, each of the floats had a small metal cage to hold chunks of fish that would release odorants. A much larger piece—either a head or cross-section of the trunk of a roughly foot-long fish—was set on the hook.

Deploying all of this was where the Garvin’s passengers, none of whom had field research experience, came in. Under the tutelage of the people from the Field School, we’d lower the drum from a platform at the stern of the Garvin to the floor of Biscayne Bay, within sight of Miami’s high rises. Another passenger would send the float overboard as the Garvin’s crew logged its GPS coordinates. After that, it was simply a matter of gently releasing the monofilament line from a large hand-held spool.

From right to left, the floats, the weight, and the bait all had to go into the water through an organized process. Julia Saltzman

One by one, we set 10 drums in a long row near one of the exits from Biscayne Bay. With the last one set, we went back to the first and reversed the process: haul in the float, use the rope to pull in the drum, and then let a Field School student test whether the line had a shark at the end. If not, it and the spool were handed over to a passenger, accompanied by tips on how to avoid losing fingers if a shark goes after the bait while being pulled back in.

Rebait, redeploy, and move on. We went down the line of 10 drums once, then twice, then thrice, and the morning gave way to afternoon. The routine became far less exciting, and getting volunteers for each of the roles in the process seemed to require a little more prodding. Conversations among the passengers and Field School people started to become the focus, the fishing a distraction, and people started giving the bait buckets nervous looks.

And then, suddenly, a line went tight while it was being hauled in, and a large brown shape started moving near the surface in the distance.

III: Field support

Mortgaging your home is not a long-term funding solution, so over time, the Field School has developed a bit of a mixed model. Most of the people who come to learn there pay the costs for their time on the Garvin. That includes some people who sign up for one of the formal training programs. Shiffman also uses them to give undergraduates in the courses he teaches some exposure to actual research work.

“Over spring break this year, Georgetown undergrads flew down to Miami with me and spent a week living on Garvin, and we did some of what you saw,” he told Ars. “But also mangrove, snorkeling, using research drones, and going to the Everglades—things like that.” They also do one-day outings with some local high schools.

Many of the school’s costs, however, are covered by groups that pay to get the experience of being an ocean scientist for a day. These have included everything from local Greenpeace chapters to companies signing up for a teamwork-building experience. “The fundraiser rate [they pay] factors in not only the cost of taking those people out but also the cost of taking a low-income school group out in the future at no cost,” Shiffman said.

And then there are groups like the one I was joining—paying the fundraiser rate but composed of random collections of people brought together by little more than meeting Shiffman, either in person or online. In these cases, the Garvin is filled with a combination of small groups nucleated by one shark fan or people who wanted to be a marine biologist at some point or those who simply have a general interest in science. They’ll then recruit one or more friends or family members to join them, with varying degrees of willingness.

For a day, they all get to contribute to research. A lot of what we know about most fish populations comes from the fishing industry. And that information is often biased by commercial considerations, changing regulations, and more. The Field School trips, by contrast, give an unbiased sampling of whatever goes for its bait.

“The hardest part about marine biology research is getting to the animals—it’s boat time,” Shiffman said. “And since they’re already doing that, often in the context of teaching people how to do field skills, they reached out to colleagues all over the place and said, ‘Hey, here’s where we’re going. Here’s what we’re doing, here’s what we’re catching. Can we get any samples for you?’ So they’re taking all kinds of biological samples from the animals, and depending on what we catch, it can be for up to 15 different projects, with collaborators all over the country.”

And taking those samples is the passengers’ job. So shortly after leaving the marina on Garvin, we were divided up into teams and told what our roles would be once a shark was on board. One team member would take basic measurements of the shark’s dimensions. A second would scan the shark for parasites and place them in a sample jar, while another would snip a small piece of fin off to get a DNA sample. Finally, someone would insert a small tag at the base of the shark’s dorsal fin using a tool similar to a hollow awl. Amid all that, one of the Field School staff members would use a syringe to get a blood sample.

All of this would happen while members of the Field School staff were holding the shark in place—larger ones on a platform at the stern of the Garvin, smaller ones brought on board. The staff were the only ones who were supposed to get close to what Shiffman referred to as “the bitey end” of the shark. For most species, this would involve inserting one of three different-sized PVC tubes (for different-sized sharks) that seawater would be pumped through to keep the shark breathing and give them something to chomp down on. Other staff members held down the “slappy end.”

For a long time, all of this choreography seemed abstract. But there was finally a shark on the other end of the line, slowly being hauled toward the boat.

IV: Pure muscle and rage?

The size and brown color were an immediate tip-off to those in the know: We had a nurse shark, one that Shiffman described as being “pure muscle and rage.” Despite that, a single person was able to haul it in using a hand spool. Once restrained, the shark largely remained a passive participant in what came next. Nurse sharks are one of the few species that can force water over their gills even when stationary, and the shark’s size—it would turn out to be over 2 meters long—meant that it would need to stay partly submerged on the platform in the back.

So one by one, the first team splashed onto the platform and got to work. Despite their extremely limited training, it took just over five minutes for them to finish the measurements and get all the samples they needed. Details like the time, location, and basic measurements were all logged by hand on paper, although the data would be transferred to a spreadsheet once it was back on land. And the blood sample had some preliminary work done on the Garvin itself, which was equipped with a small centrifuge. All of that data would eventually be sent off to many of the Field School’s collaborators.

Shark number two, a blacktip, being hauled to the Garvin. Julia Saltzman

Since the shark was showing no signs of distress, all the other teams were allowed to step onto the platform and pet it, partly due to the fear that this would be the only one we caught that day. Sharks have a skin that’s smooth in one direction but rough if stroked in the opposite orientation, and their cartilaginous skeleton isn’t as solid as the bone most other vertebrates rely on. It was very much not like touching any other fish I’d encountered.

After we had all literally gotten our feet wet, the shark, now bearing the label UM00229, was sent on its way, and we went back to checking the drum lines.

A short time later, we hauled in a meter-long blacktip shark. This time, we set it up on an ice chest on the back of the boat, with a PVC tube firmly inserted into its mouth. Again, once the Field School staff restrained the shark, the team of amateurs got to work quickly and efficiently, with the only mishap being a person who rubbed their fingers the wrong way against the shark skin and got an abrasion that drew a bit of blood. Next up would be team three, the final group—and the one I was a part of.

V: The culture beyond science

I’m probably the perfect audience for an outing like this. Raised on a steady diet of Jacques Cousteau documentaries, I was also drawn to the idea of marine biology at one point. And having spent many of my years in molecular biology labs, I found myself jealous of the amazing things the field workers I’d met had experienced. The idea of playing shark scientist for a day definitely appealed to me.


Once processed, the sharks seemed content to get back to the business of being a shark. Credit: Julia Saltzman

But I probably came away as impressed by the motivation behind the Field School as I was with the sharks. I’ve been in science long enough to see multiple examples of the sort of toxic behaviors that the school’s founders wanted to avoid, and I wondered how science would ever change when there’s no obvious incentive for anyone to improve their behavior. In the absence of those incentives, MacDonald’s idea is to provide an example of better behavior—and that might be the best option.

“Overall, the thing that I really wanted at the end of the day was for people to look at some of the worst things about the culture of science and say, ‘It doesn’t have to be like that,'” she told Ars.

And that, she argues, may have an impact that extends well beyond science. “It’s not just about training future scientists, it’s about training future people,” she said. “When science and science education hurts people, it affects our whole society—it’s not that it doesn’t matter to the culture of science, because it profoundly does, but it matters more broadly than that as well.”

With motivations like that, it would have felt small to be upset that my career as a shark tagger ended up in the realm of unfulfilled potential, since I was on tagging team three, and we never hooked shark number three. Still, I can’t say I wasn’t a bit annoyed when I bumped into Shiffman a few weeks later, and he gleefully informed me they caught 14 of them the day after.

If you have a large enough group, you can support the Field School by chartering the Garvin for an outing. For smaller groups, you need to get in touch with David Shiffman.

Listing image: Julia Saltzman


John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.


Win for chemical industry as EPA shutters scientific research office


Deregulation runs rampant

Companies feared rules and lawsuits based on Office of Research and Development assessments.

Soon after President Donald Trump took office in January, a wide array of petrochemical, mining, and farm industry coalitions ramped up what has been a long campaign to limit use of the Environmental Protection Agency’s assessments of the health risks of chemicals.

That effort scored a significant victory Friday when EPA Administrator Lee Zeldin announced his decision to dismantle the agency’s Office of Research and Development (ORD).

The industry lobbyists didn’t ask for hundreds of ORD staff members to be laid off or reassigned. But the elimination of the agency’s scientific research arm goes a long way toward achieving the goal they sought.

In a January 27 letter to Zeldin organized by the American Chemistry Council, more than 80 industry groups—including leading oil, refining, and mining associations—asked him to end regulators’ reliance on ORD assessments of the risks that chemicals pose for human health. The future of that research, conducted under EPA’s Integrated Risk Information System program, or IRIS, is now uncertain.

“EPA’s IRIS program within ORD has a troubling history of being out of step with the best available science and methods, lacking transparency, and being unresponsive to peer review and stakeholder recommendations,” said an American Chemistry Council spokesperson in an email when asked about the decision to eliminate ORD. “This results in IRIS assessments that jeopardize access to critical chemistries, undercut national priorities, and harm American competitiveness.”

The spokesperson said the organization supports EPA evaluating its resources to ensure tax dollars are being used efficiently and effectively.

Christopher Frey, an associate dean at North Carolina State University who served as EPA assistant administrator in charge of ORD during the Biden administration, defended the quality of the science done by the office, which he said is “the poster case study of what it means to do science that’s subject to intense scrutiny.”

“There’s industry with a tremendous vested interest in the policy decisions that might occur later on” based on the assessments made by ORD, Frey said. “What the industry does is try to engage in a proxy war over the policy by attacking the science.”

Among the IRIS assessments that stirred the most industry concern were those outlining the dangers of formaldehyde, ethylene oxide, arsenic, and hexavalent chromium. Regulatory actions on all four had begun or were looming during the Biden administration.

The Biden administration also launched a lawsuit against a LaPlace, Louisiana, plant that had been the only US manufacturer of neoprene, Denka Performance Elastomer, based in part on the IRIS assessment of one of its air pollutants, chloroprene, as a likely human carcinogen. Denka, a spinoff of DuPont, announced it was ceasing production in May because of the cost of pollution controls.

Public health advocates charge that eliminating the IRIS program, or shifting its functions to other offices in the agency, will rob the EPA of the independent expertise to inform its mission of protection.

“They’ve been trying for years to shut down IRIS,” said Darya Minovi, a senior analyst with the Union of Concerned Scientists and lead author of a new study on Trump administration actions that the group says undermine science. “The reason why is because when IRIS conducts its independent scientific assessments using a great amount of rigor… you get stronger regulations, and that is not in the best interest of the big business polluters and those who have a financial stake in the EPA’s demise.”

The UCS report tallied more than 400 firings, funding cuts, and other attacks on science in the first six months of the Trump administration, resulting in 54 percent fewer grants for research on topics including cancer, infectious disease, and environmental health.

EPA’s press office did not respond to a query on whether the IRIS controversy helped inform Zeldin’s decision to eliminate ORD, which had been anticipated since staff were informed of the potential plan at a meeting in March. In the agency’s official announcement Friday afternoon, Zeldin said the elimination of the office was part of “organizational improvements” that would deliver $748.8 million in savings to taxpayers. The reduction in force, combined with previous departures and layoffs, has reduced the agency’s workforce by 23 percent, to 12,448, the EPA said.

With the cuts, the EPA’s workforce will be at its lowest level since fiscal year 1986.

“Under President Trump’s leadership, EPA has taken a close look at our operations to ensure the agency is better equipped than ever to deliver on our core mission of protecting human health and the environment while Powering the Great American Comeback,” Zeldin said in the prepared statement. “This reduction in force will ensure we can better fulfill that mission while being responsible stewards of your hard-earned tax dollars.”

The agency will be creating a new Office of Applied Science and Environmental Solutions; a report by E&E News said an internal memo indicated the new office would be much smaller than ORD, and would focus on coastal areas, drinking water safety, and methodologies for assessing environmental contamination.

Zeldin’s announcement also said that scientific expertise and research efforts will be moved to “program offices”—for example, those concerned with air pollution, water pollution, or waste—to tackle “statutory obligations and mission essential functions.” That phrase has a particular meaning: The chemical industry has long complained that Congress never passed a law creating IRIS. Congress did, however, pass many laws requiring that the agency carry out its actions based on the best available science, and the IRIS program, established during President Ronald Reagan’s administration, is how the agency has carried out the task of assessing the science on chemicals since 1985.

Justin Chen, president of the American Federation of Government Employees Council 238, the union representing 8,000 EPA workers nationwide, said the organizational structure of ORD put barriers between the agency’s researchers and the agency’s political decision-making, enforcement, and regulatory teams—even though they all used ORD’s work.

“For them to function properly, they have to have a fair amount of distance away from political interference, in order to let the science guide and develop the kind of things that they do,” Chen said.

“They’re a particular bugbear for a lot of the industries which are heavy donors to the Trump administration and to the right wing,” Chen said. “They’re the ones, I believe, who do all the testing that actually factors into the calculation of risk.”

ORD also was responsible for regularly doing assessments that the Clean Air Act requires on pollutants like ozone and particulate matter, which result from the combustion of fossil fuels.

Frey said a tremendous amount of ORD work has gone into ozone, which is the result of complex interactions of precursor pollutants in the atmosphere. The open source computer modeling on ozone transport, developed by ORD researchers, helps inform decision-makers grappling with how to address smog around the country. The Biden administration finalized stricter standards for particulate matter in its final year based on ORD’s risk assessment, and the Trump administration is now undoing those rules.

Aidan Hughes contributed to this report.

This story originally appeared on Inside Climate News.




Nearly 3,000 people are leaving NASA, and this director is one of them

You can add another name to the thousands of employees leaving NASA as the Trump administration primes the space agency for a 25 percent budget cut.

On Monday, NASA announced that Makenzie Lystrup will leave her post as director of the Goddard Space Flight Center on Friday, August 1. Lystrup has held the top job at Goddard since April 2023, overseeing a staff of more than 8,000 civil servants and contractor employees and a budget last year of about $4.7 billion.

These figures make Goddard the largest of NASA’s 10 field centers primarily devoted to scientific research and development of robotic space missions, with a budget and workforce comparable to NASA’s human spaceflight centers in Texas, Florida, and Alabama. Officials at Goddard manage the James Webb and Hubble telescopes in space, and Goddard engineers are assembling the Nancy Grace Roman Space Telescope, another flagship observatory scheduled for launch late next year.

“We’re grateful to Makenzie for her leadership at NASA Goddard for more than two years, including her work to inspire a Golden Age of explorers, scientists, and engineers,” Vanessa Wyche, NASA’s acting associate administrator, said in a statement.

Cynthia Simmons, Goddard’s deputy director, will take over as acting chief at the space center. Simmons started work at Goddard as a contract engineer 25 years ago.

Lystrup came to NASA from Ball Aerospace, now part of BAE Systems, where she managed the company’s work on civilian space projects for NASA and other federal agencies. Before joining Ball Aerospace, Lystrup earned a doctorate in astrophysics from University College London and conducted research as a planetary astronomer.

Formal dissent

The announcement of Lystrup’s departure from Goddard came hours after the release of an open letter to NASA’s interim administrator, Transportation Secretary Sean Duffy, signed by hundreds of current and former agency employees. The letter, titled “The Voyager Declaration,” identifies what the signatories call “recent policies that have or threaten to waste public resources, compromise human safety, weaken national security, and undermine the core NASA mission.”



Southwestern drought likely to continue through 2100, research finds

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.

The drought in the Southwestern US is likely to last for the rest of the 21st century and potentially beyond as global warming shifts the distribution of heat in the Pacific Ocean, according to a study published last week led by researchers at the University of Texas at Austin.

Using sediment cores collected in the Rocky Mountains, paleoclimatology records, and climate models, the researchers found that warming driven by greenhouse gas emissions can alter patterns of atmospheric and marine heat in the North Pacific Ocean in a way resembling what’s known as the negative phase of the Pacific Decadal Oscillation (PDO), fluctuations in sea surface temperatures that result in decreased winter precipitation in the American Southwest. But in this case, the phenomenon can last far longer than the usual 30-year cycle of the PDO.

“If the sea surface temperature patterns in the North Pacific were just the result of processes related to stochastic [random] variability in the past decade or two, we would have just been extremely unlucky, like a really bad roll of the dice,” said Victoria Todd, the lead author of the study and a PhD student in geosciences at University of Texas at Austin. “But if, as we hypothesize, this is a forced change in the sea surface temperatures in the North Pacific, this will be sustained into the future, and we need to start looking at this as a shift, instead of just the result of bad luck.”

Currently, the Southwestern US is experiencing a megadrought resulting in the aridification of the landscape, a decades-long drying of the region brought on by climate change and the overconsumption of the region’s water. That’s led to major rivers and their basins, such as the Colorado and Rio Grande rivers, seeing reduced flows and a decline of the water stored in underground aquifers, which is forcing states and communities to reckon with a sharply reduced water supply. Farmers have cut back on the amount of water they use. Cities are searching for new water supplies. And states, tribes, and federal agencies are engaging in tense negotiations over how to manage declining resources like the Colorado River going forward.



Local cuisine was on the menu at Cafe Neanderthal

Gazelle prepared “a la Amud,” or “a la Kebara”?

Neanderthals at Kebara had pretty broad tastes in meat. The butchered bones found in the cave were mostly an even mix of small ungulates (largely gazelle) and medium-sized ones (red deer, fallow deer, wild goats, and boar), with just a few larger game animals thrown in. And it looks like the Kebara Neanderthals were “use the whole deer” sorts of hunters because the bones came from all parts of the animals’ bodies.

On the other hand (or hoof), at Amud, archaeologists found that the butchered bones were almost entirely long bone shafts—legs, in other words—from gazelle. Apparently, the Neanderthal hunters at Amud focused more on gazelle than on larger prey like red deer or boar, and they seemingly preferred meat from the legs.

And not too fresh, apparently—the bones at Kebara showed fewer cut marks, and the marks that were there tended to be straighter. Meanwhile, at Amud, the bones were practically cluttered with cut marks, which crisscrossed over each other and were often curved, not straight. According to Jallon and her colleagues, the difference probably wasn’t a skill issue. Instead, it may be a clue that Neanderthals at Amud liked their meat dried, boiled, or even slightly rotten.

That’s based on comparisons to what bones look like when modern hunter-gatherers butcher their game, along with archaeologists’ experiments with stone tool butchery. First, differences in skill between newbie butchers and advanced ones don’t produce the same pattern of cut marks Jallon and her colleagues saw at Amud. But “it has been shown that decaying carcasses tend to be more difficult to process, often resulting in the production of haphazard, deep, and sinuous cut marks,” as Jallon and her colleagues wrote in their recent paper.

So apparently, for reasons unknown to modern archaeologists, the meat on the menu at Amud was, shall we say, a bit less fresh than that at Kebara. Said menu was also considerably less varied. All of that meant that if you were a Neanderthal from Amud and stopped by Kebara for dinner (or vice versa), your meal might seem surprisingly foreign.



How Android phones became an earthquake warning system

Of course, the trick is that you only send out the warning if there’s an actual earthquake, and not when a truck is passing by. Here, the sheer volume of Android phones sold plays a key role. As a first pass, AEA can simply ignore events that aren’t picked up by a lot of phones in the same area. But we also know a lot about the patterns of shaking that earthquakes produce. Different waves travel at different speeds, cause different types of ground motion, and may be produced at different intensities as the earthquake progresses.

So, the people behind AEA also include a model of earthquakes and seismic wave propagation, and check whether the pattern seen in phones’ accelerometers is consistent with that model. It only triggers an alert when there’s widespread phone activity that matches the pattern expected for an earthquake.

Raising awareness

In practical terms, AEA is distributed as part of the core Android software and is enabled by default, so it is active on most Android phones. It starts monitoring when the phone has been stationary for a little while, checking for acceleration data that’s consistent with the P or S waves produced by earthquakes. If it gets a match, it forwards the information along with some rough location data (to preserve privacy) to Google servers. Software running on those servers then performs the positional analysis to see if the waves are widespread enough to have been triggered by an earthquake.

If so, it estimates the size and location, and uses that information to estimate the ground motion that will be experienced in different locations. Based on that, AEA sends out one of two alerts, either “be aware” or “take action.” The “be aware” alert is similar to a standard Android notification, but it plays a distinctive sound and is sent to users further from the epicenter. In contrast, the “take action” warning that’s sent to those nearby will display one of two messages in the appropriate language, either “Protect yourself” or “Drop, cover, and hold on.” It ignores any do-not-disturb settings, takes over the entire screen, and also plays a distinct noise.
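The two-stage logic described above can be sketched in simplified form. Everything here is an illustrative assumption on our part (the class name, thresholds, and grading rule are invented, and the real system replaces the simple peak check with a full seismic wave-propagation model), not Google's actual implementation:

```python
from dataclasses import dataclass
from collections import defaultdict
from typing import List, Optional

@dataclass
class PhoneReport:
    coarse_cell: str     # rough location grid cell (privacy-preserving)
    peak_accel_g: float  # peak acceleration seen while the phone was stationary

# Illustrative thresholds -- not Google's actual values.
MIN_REPORTS_PER_CELL = 100  # a passing truck triggers one phone, not hundreds
TAKE_ACTION_ACCEL_G = 0.05  # shaking strong enough for the full-screen alert

def classify_event(reports: List[PhoneReport]) -> Optional[str]:
    """Server-side aggregation: alert only on widespread, consistent detections."""
    by_cell = defaultdict(list)
    for r in reports:
        by_cell[r.coarse_cell].append(r)
    # First pass: ignore anything not picked up by many phones in the same area.
    active = {c: rs for c, rs in by_cell.items()
              if len(rs) >= MIN_REPORTS_PER_CELL}
    if not active:
        return None
    # Second pass (a stand-in for the full wave-propagation model): grade the
    # strongest shaking to pick one of the two alert tiers.
    peak = max(r.peak_accel_g for rs in active.values() for r in rs)
    return "take action" if peak >= TAKE_ACTION_ACCEL_G else "be aware"
```

The key design point survives even in this toy version: a single phone reporting violent shaking produces no alert at all, while many phones reporting modest shaking in the same cell do.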



Rough road to “energy dominance” after GOP kneecaps wind and solar


Experts argue that Trump’s One Big Beautiful Bill Act will increase costs for consumers.

As the One Big Beautiful Bill Act squeaked its way through Congress earlier this month, its supporters heralded what they described as a new era for American energy and echoed what has become a familiar phrase among President Donald Trump’s supporters.

“Congress has taken decisive action to advance President Trump’s energy dominance agenda,” said American Petroleum Institute President and CEO Mike Sommers in a statement after the House passed the bill.

Republicans concurred, with legislators ranging from Rep. Mariannette Miller-Meeks of Iowa, chair of the Conservative Climate Caucus, to Energy and Commerce Committee Chairman Rep. Brett Guthrie of Kentucky releasing statements after the bill’s passage championing its role in securing “energy dominance.”

The idea and rhetoric of energy dominance has its roots in the first Trump administration, although a formal definition for the phrase is hard to come by. When Trump signed an executive order this February establishing the National Energy Dominance Council, he included expanding energy production, lowering prices and reducing reliance on foreign entities among the council’s goals, while also emphasizing the importance of oil production and liquefied natural gas (LNG) exports.

The phrase has become something of a battle cry among the president’s supporters, with EPA Administrator Lee Zeldin writing in the Washington Examiner on July 8 that “Trump is securing America’s energy future in a modern-day version of how our Founding Fathers secured our freedom.”

“Through American energy dominance, we’re not just powering homes and businesses,” Zeldin said. “We’re Powering the Great American Comeback.”

But despite claims from Republican officials and the fossil fuel industry that the megabill will help secure energy dominance, some experts worry that the legislation’s cuts to wind and solar actually undermine those goals at a time when electricity demand is rising, limiting America’s ability to add new generation capacity, raising prices for consumers and ceding global leadership in the clean energy transition.

Dan O’Brien, a senior modeling analyst at the climate policy think tank Energy Innovation, said the bill will increase domestic production of oil and gas by increasing lease sales for drilling—mostly in the Gulf of Mexico, onshore, and in Alaska.

A January study commissioned by the American Petroleum Institute reported that a legislatively directed offshore oil and natural gas leasing program, which API says is similar to the measures included in the One Big Beautiful Bill Act months later, would increase oil and natural gas production by 140,000 barrels of oil equivalent (BOE) per day by 2034.

That number would rise to 510,000 BOE per day by 2040, the study says.

Losses likely to outweigh the gains

However, O’Brien said the gains America can expect from the fossil fuel industry pale in comparison to losses from renewable energy.

Energy Innovation’s analysis projects that less than 20 gigawatts of additional generation capacity from fossil fuels can be expected by 2035 as a result of the bill, compared to a decrease of more than 360 gigawatts in additional capacity from renewable energy.

The difference between those numbers—a decrease of 344 gigawatts—is roughly equivalent to the energy use of about 100 million homes, O’Brien said.
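That equivalence holds up as a back-of-envelope calculation if one assumes a blended wind/solar capacity factor of roughly 35 percent and average US household consumption near 10,700 kWh per year; both figures are our assumptions for illustration, not numbers from Energy Innovation's analysis:

```python
# Back-of-envelope check of the "roughly 100 million homes" comparison.
# Capacity factor and household usage are assumed values, not from the study.
LOST_CAPACITY_GW = 344
CAPACITY_FACTOR = 0.35           # typical blended wind/solar output vs. nameplate
HOURS_PER_YEAR = 8760
HOME_USE_KWH_PER_YEAR = 10_700   # approximate average US household

# Annual generation forgone, in terawatt-hours.
annual_twh = LOST_CAPACITY_GW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000
# Convert TWh to kWh (1 TWh = 1e9 kWh) and divide by per-home usage.
homes_millions = annual_twh * 1e9 / HOME_USE_KWH_PER_YEAR / 1e6
```

Under those assumptions the forgone capacity works out to about 1,050 TWh per year, or roughly 99 million homes' worth of electricity, consistent with O'Brien's figure.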

According to O’Brien, if the One Big Beautiful Bill had not been passed, the U.S. could have expected to add around 1,000 gigawatts of electricity generation capacity in the next 10 years.

But as a result of the bill, “around a third of that will be lost,” O’Brien said.

Those losses largely stem from the bill’s rollback of incentives for wind and solar projects.

“Solar and wind are subject to different—and harsher—treatment under the OBBB than other technologies,” according to the law firm Latham & Watkins. Tax credits for those projects are now set to phase out on a significantly faster timeline, rolling back some of the commitments promised under the Inflation Reduction Act.

Lucero Marquez, the associate director for federal climate policy at the Center for American Progress, said that removing those incentives undercuts America’s ability to achieve its energy needs.

“America needs affordable, reliable and domestically produced energy, which wind and solar does,” Marquez said. “Gutting clean energy incentives really just does not help meet those goals.”

New projects will also be subject to rules “primarily intended to prevent Chinese companies from claiming the tax credits and to reduce reliance on China for supply chains of clean energy technologies,” the Bipartisan Policy Center wrote in an explainer.

However, those rules are “extremely complex” and could lead to “decreased U.S. manufacturing and increased Chinese dominance in these supply chains, contrary to their goal,” according to the think tank.

Surging energy prices

O’Brien said Energy Innovation’s modeling suggests that, with new wind and solar projects not coming online, existing power plants—which are more expensive to run than those renewable projects would have been—will have to run more frequently to make up the lost generation.

The consequences of that, according to O’Brien, are that energy prices will rise, which also means the amount of energy produced will go down in response to decreased demand for the more expensive supply.

An analysis by the REPEAT Project from the Princeton ZERO Lab and Evolved Energy Research similarly predicted increased energy prices for consumers as a result of the bill.

According to that analysis, average household energy costs will increase by over $280 per year by 2035, a more than 13 percent hike.

One of the authors of that analysis, Princeton University professor Jesse D. Jenkins, did not respond to interview requests for this article but previously wrote in an email to Inside Climate News that Republicans’ claims about securing energy dominance through the bill “don’t hold up.”

In an emailed statement responding to questions about those analyses and how their findings align with the administration’s goals of attaining energy dominance, White House assistant press secretary Taylor Rogers wrote that “since Day One, President Trump has taken decisive steps to unleash American energy, which has driven oil production and reduced the cost of energy.”

“The One, Big, Beautiful Bill will turbocharge energy production by streamlining operations for maximum efficiency and expanding domestic production capacity,” Rogers wrote, “which will deliver further relief to American families and businesses.”

In an emailed statement, Rep. Guthrie said that the bill “takes critical steps toward both securing our energy infrastructure and bringing more dispatchable power online.”

“Specifically, the bill does this by repairing and beginning to refill the Strategic Petroleum Reserve that was drained during the Biden-Harris Administration, and through the creation of the Energy Dominance Financing program to support new investments that unleash affordable and reliable energy,” the Energy and Commerce chairman wrote.

Cullen Hendrix, a senior fellow at the Peterson Institute for International Economics, also said that the bill “advances the administration’s stated goal of energy dominance,” but added that it does so “primarily in sunsetting, last-generation technologies, while ceding the renewable energy future to others.”

“It wants lower energy costs at home and more U.S. energy exports abroad—for both economic and strategic reasons … the OBBB delivers on that agenda,” Hendrix said.

Still, Hendrix added that “the United States that emerges from all this may be a bigger player in a declining sector—fossil fuels—and a massively diminished player in a rapidly growing one: renewable energy.”

“It will help promote the Trump administration’s ambitions of fossil dominance (or at least influence) but on pain of helping build a renewable energy sector for the future,” Hendrix wrote. “That is net-negative globally (and locally) from a holistic perspective.”

Adam Hersh, a senior economist at the Economic Policy Institute, argued that he sees a lot in the bill “that is going to move us in the opposite direction from energy dominance.”

“They should have named this bill the ‘Energy Inflation Act,’ because what it’s going to mean is less energy generated and higher costs for households and for businesses, and particularly manufacturing businesses,” Hersh said.

Hersh also said that even if the bill does lead to increased exports of U.S. produced energy, that would have a direct negative impact on costs for consumers at home.

“That’s only going to increase domestic prices for energy, and this has long been known and why past administrations have been reluctant to expand exports of LNG,” Hersh said. “That increased demand for the products and competition for the resources will mean higher energy prices for U.S. consumers and businesses.”

“Pushing against energy dominance”

Frank Maisano, a senior principal at the lobbying firm Bracewell LLP, said that although the bill creates important opportunities for things such as oil and gas leasing and the expansion of geothermal and hydrogen energy, the bill’s supporters “undercut themselves” by limiting opportunities for growth in wind and solar.

“The Biden folks tried to lean heavily onto the energy transition because they wanted to limit emissions,” Maisano said. “They wanted to push oil and gas out and push renewables in.”

Now, “these guys are doing the opposite, which is to push oil and gas and limit wind and solar,” Maisano said. “Neither of those strategies are good strategies. You need to have a combination of all these strategies and all these generation sources, especially on the electricity side, to make it work and to meet the challenges that we face.”

Samantha Gross, director of the Brookings Institution’s Energy Security and Climate Initiative, said that while she isn’t concerned about whether the U.S. will build enough electricity generation to meet the needs of massive consumers like data centers and AI, she is worried that the bill pushes the next generation of that growth further towards fossil fuels.

“I don’t think energy dominance—not just right this instant, but going forward—is just in fossil fuels,” Gross said.

Even beyond the One Big Beautiful Bill, Gross said that many of the administration’s actions run counter to their stated objectives on energy.

“You hear all this talk about energy dominance, but for me it’s just a phrase, because a lot of things that the administration is actually doing are pushing against energy dominance,” Gross said.

“If you think about the tariff policy, for instance, ‘drill, baby, drill’ and a 50 percent tariff on pipeline steel do not go together. Those are pulling in completely opposite directions.”

Aside from domestic energy needs, Gross also worried that the pullback from renewable energy will harm America’s position on the global stage.

“It’s pretty clear which way the world is going,” Gross said. “I worry that we’re giving up … I don’t like the term ‘energy dominance,’ but future leadership in the world’s energy supply by pulling back from those.”

“We’re sort of ceding those technologies to China in a way that is very frustrating to me.”

Yet even in the wake of the bill’s passage, some experts see hope for the future of renewable energy in the U.S.

Kevin Book, managing director at the research firm ClearView Energy Partners, said that the bill “sets up a slower, shallower transition” toward renewable energy. However, he added that he doesn’t think it represents the end of that transition.

“Most of the capacity we’re adding to our grid in America these days is renewable, and it’s not simply because of federal incentives,” Book said. “So if you take away those federal incentives, there were still economic drivers.”

Still, Book said that the final impacts of the Trump administration’s actions on renewable energy are yet to be seen.

“The One Big Beautiful Bill Act is not the end of the story,” Book said. “There’s more coming, either regulatorily and/or legislatively.”

This story originally appeared on Inside Climate News.




There could be “dark main sequence” stars at the galactic center


Dark matter particle and antiparticle collisions could make some stars immortal.

For a star, its initial mass is everything. It determines how quickly it burns through its hydrogen and how it will evolve once it starts fusing heavier elements. It’s so well understood that scientists have devised a “main sequence” that acts a bit like a periodic table for stars, correlating their mass and age with their properties.

The main sequence, however, is based on an assumption that’s almost always true: All of the energy involved comes from the gravity-driven fusion of lighter elements into heavier ones. But three astrophysicists have considered an alternative source of energy that may apply at the very center of our galaxy: energy released when dark matter particles and antiparticles collide and annihilate. While we don’t even know that dark matter can do that, it’s a hypothetical with some interesting consequences, like seemingly immortal stars and others that move backward along the main sequence path.

Dark annihilations

We haven’t figured out what dark matter is, but there are lots of reasons to think that it is composed of elementary particles. And, if those behave like all of the particles we understand well, then there will be both regular and antimatter versions. Should those collide, they should annihilate each other, releasing energy in the process. Given dark matter’s general propensity not to interact with anything, these collisions will be extremely rare except in locations with very high dark matter concentrations.

The only place that’s likely to happen is at the very center of our galaxy. And, for a while, there was an excess of radiation coming from the galactic core that people thought might be due to dark matter annihilations, although it eventually turned out to have a more mundane explanation.

At the extreme densities found within a light year of the supermassive black hole at the center of our galaxy, concentrations are high enough that these collisions could be a major source of energy. And so astronomers have considered what all that energy might do to stars that end up in a black hole’s orbit, finding that under the right circumstances, dark matter destruction could provide more energy to a star than fusion.

That prompted three astrophysicists (Isabelle John, Rebecca Leane, and Tim Linden) to try to look at things in an organized fashion, modeling a “dark main sequence” of stars as they might exist in close proximity to the Milky Way’s center.

The intense gravity and radiation found near the galaxy’s core mean that stars can’t form there. So, anything in a tight orbit formed somewhere else before gravitational interactions pushed it into the gravitational grasp of the galaxy’s central black hole. The researchers used a standard model of star evolution to build a collection of moderate-sized stars, from one to 20 solar masses at 0.05 solar mass intervals. These are allowed to ignite fusion at their cores and then shift into a dark-matter-rich environment.

Since we have no idea how often dark matter particles might run into each other, John, Leane, and Linden use two different collision frequencies. These determine how much energy is imparted into these stars by dark matter, which the researchers simply add as a supplement to the amount of fusion energy the stars are producing. Then, the stars are allowed to evolve forward in time.

(The authors note that stars that are thrown into the grasp of a supermassive black hole tend to have very eccentric orbits, so they spend a lot of time outside the zone where dark matter collisions take place with a significant frequency. So, what they’ve done is the equivalent of having these stars experience the energy input given their average orbital distance from the galaxy’s core. In reality, a star would spend some years with higher energy input and some years with lower input as it moves about its orbit.)
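As a toy illustration of the setup (emphatically not the authors' actual stellar-evolution code), the modification amounts to adding a density-independent luminosity term to the star's energy balance; the scaling law, relaxation scheme, and all numbers below are placeholders of our own invention:

```python
def fusion_luminosity(core_density: float) -> float:
    """Toy fusion rate: rises steeply with core density (placeholder scaling)."""
    return core_density ** 2

def relax_star(core_density: float, dark_matter_lum: float,
               radiated_lum: float, steps: int = 5000) -> float:
    """Nudge the core density until total energy output balances what the
    star radiates. The dark matter term does not depend on density, so a
    large value drives the equilibrium toward near-zero fusion, leaving a
    star that looks 'younger' than its fusion history implies."""
    for _ in range(steps):
        total = fusion_luminosity(core_density) + dark_matter_lum
        # Over-luminous -> the star expands and core density drops;
        # under-luminous -> gravity compresses it. Small multiplicative step.
        core_density *= (radiated_lum / total) ** 0.05
    return core_density
```

With no dark matter input, the toy star balances where fusion alone supplies its radiated luminosity; feed in a constant dark matter luminosity covering 90 percent of that output, and the equilibrium core density drops until fusion supplies only the remaining 10 percent, the backward-along-the-main-sequence behavior the paper describes.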

Achieving immortality

The physics of what happens is based on the same balance of forces that governs fusion-powered stars, but it produces some very strange results. Given only fusion power, a star will exist at a balance point. If gravity compresses it, fusion speeds up, more energy is released, and that energy causes the star to expand outward again. That causes the density to drop, slowing fusion back down again.

The dark matter annihilations essentially provide an additional source of energy that stays constant regardless of what happens to the star’s density. At the low end of the mass range the researchers considered, this can cause the star to nearly shut off fusion, essentially looking like a far younger star than it actually is. That has the effect of causing the star to move backward along the main sequence diagram.

The researchers note that even lighter stars could essentially get so much additional energy that they can’t hold together and end up dissipating, something that’s been seen in models run by other researchers.

As the mass gets higher, stars reach the point where they essentially give up on fusion and get by with nothing but dark matter annihilations. They have enough mass to hold together gravitationally but end up too diffuse for fusion to continue. And they’ll stay that way as long as they continue to get additional injections of energy. “A star like this might look like a young, still-forming star,” the authors write, “but has features of a star that has undergone nuclear fusion in the past and is effectively immortal.”

John, Leane, and Linden find that the higher-mass stars remain dense enough for fusion to continue even in proximity to the galaxy’s black hole. But the additional energy keeps that fusion happening at only a moderate rate. These stars proceed through the main sequence at an exceptionally slow pace, so slow that running the simulation for a total of 10 billion years didn’t see them change significantly.

The other strange thing here is that all of this is very sensitive to how much dark matter annihilation is taking place. A star that’s “immortal” at one average distance will progress slowly through the main sequence if its average distance is a light year further out. Similarly, stars that are too light to survive at one location will hold together if they are a bit further from the supermassive black hole.
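That steep sensitivity to location follows from the collision physics: annihilation involves two dark matter particles meeting, so the heating rate scales with the square of the local dark matter density. With a density profile falling as r^-1.5 (an illustrative choice, not the paper's), heating drops as the cube of distance.

```python
# Why a light-year matters: annihilation heating ~ (dark matter density)**2.
# With an assumed profile rho ~ r**-1.5, heating scales as r**-3, so halving
# a star's distance from the core boosts its heating roughly eightfold.

def annihilation_heating(r, gamma=1.5):
    rho = r ** -gamma          # assumed dark matter density profile
    return rho ** 2            # two-particle collision rate ~ density squared

ratio = annihilation_heating(0.5) / annihilation_heating(1.0)
print(round(ratio, 6))   # heating boost from halving the distance
```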

Is there anything to this?

The big caution is that this work only looks at the average input from dark matter annihilation. In reality, a star that might be immortal at its average distance will likely spend a few years too hot to hold together, and then several years cooling off in conditions that should allow fusion to reignite. It would be nice to see a model run with this sort of pulsed input, perhaps basing it on the orbits of some of the stars we’ve seen that get close to the Milky Way’s central black hole.

In the meantime, John, Leane, and Linden write that their results are consistent with some of the oddities that are apparent in the stars we’ve observed at the galaxy’s center. These have two distinctive properties: They appear heavier than the average star in the Milky Way, and all seem to be quite young. If there is a “dark main sequence,” then the unusual heft can be explained simply by the fact that lower mass stars end up dissipating due to the additional energy. And the model would suggest that these stars simply appear to be young because they haven’t undergone much fusion.

The researchers suggest that we could have a clearer picture if we were able to spend enough time observing the stars at our galaxy’s core with a large enough telescope, allowing us to understand their nature and orbits.

Physical Review D, 2025. DOI: Not yet available  (About DOIs).

Photo of John Timmer

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

There could be “dark main sequence” stars at the galactic center