
SpaceX’s most-flown reusable rocket will go for its 20th launch tonight

File photo of a Falcon 9 rocket rolling out of its hangar at Cape Canaveral Space Force Station, Florida.

For the first time, SpaceX will launch one of its reusable Falcon 9 boosters for a 20th time Friday night on a flight to deliver 23 more Starlink Internet satellites to orbit.

This milestone mission is scheduled to lift off at 9:22 pm EDT Friday (01:22 UTC Saturday) from Space Launch Complex 40 (SLC-40) at Cape Canaveral Space Force Station, Florida. Forecasters from the US Space Force predict “excellent” weather for the primetime launch.

Falcon 9 will blaze a familiar trail into space, following the same profile as dozens of past Starlink missions.

The rocket’s first-stage booster will shut off its nine kerosene-fueled Merlin engines about two-and-a-half minutes into the flight, reaching a top speed of more than 5,000 mph (8,000 km per hour). The first stage will detach from the Falcon 9’s upper stage, which will continue firing into orbit. The 15-story-tall Falcon 9 booster, meanwhile, will follow an arcing trajectory before braking for a vertical landing on a drone ship floating in the Atlantic Ocean near the Bahamas.

The 23 flat-packed Starlink spacecraft will deploy from the upper stage a little more than an hour after liftoff, bringing the total number of Starlinks in low-Earth orbit to more than 5,800 spacecraft.

A hunger for launch

Pretty much every day, SpaceX is either launching a rocket or rolling one out of the hangar to the launch pad. At this pace, SpaceX is redefining what is routine in the space industry, but the rapid-fire launch rate also means the company is continually breaking records, mostly its own.

Friday night’s launch will break another one of those records. This first-stage booster, designated by the tail number B1062, has flown 19 times since its first flight in November 2020. The booster will now be the first in SpaceX’s inventory to go for a 20th flight, breaking a tie with three other rockets as the company’s fleet leader.

When SpaceX debuted the latest version of its Falcon 9 rocket, the Falcon 9 Block 5, officials said the reusable first stage could fly 10 times with minimal refurbishment and perhaps additional flights with a more extensive overhaul. Now, SpaceX is certifying Falcon 9 boosters for 40 flights.

This particular rocket has not undergone any extended maintenance or long-term grounding. It has flown an average of once every two months since debuting three-and-a-half years ago. So the 20-flight milestone SpaceX will achieve Friday night means this rocket has doubled its original design life and, at the same time, has reached the halfway point of its extended service life.

In its career, this booster has launched eight people and 530 spacecraft, mostly Starlinks. The rocket’s first two flights launched GPS navigation satellites for the US military, then it launched two commercial human spaceflight missions with Dragon crew capsules. These were the all-private Inspiration4 mission and Axiom Mission 1, the first fully commercial crew flight to the International Space Station.

A SpaceX Falcon 9 rocket lifts off Sunday, April 7, on the Bandwagon 1 rideshare mission.

Remarkably, this will be the sixth Falcon 9 launch in less than eight days, more flights than SpaceX’s main US rival, United Launch Alliance, has launched in 17 months.

It will be the 38th Falcon 9 launch of the year and the 111th flight of a Falcon 9 or Falcon Heavy rocket—the 114th launch by SpaceX overall—in the last 365 days. More than a third of all Falcon 9 and Falcon Heavy missions to date—a total that will stand at 332 after Friday night’s flight—have launched in the past year.

This month, for the first time, SpaceX demonstrated it could launch two Falcon 9 rockets in less than five days from the company’s launch pad at Vandenberg Space Force Base, California. SpaceX has also cut the turnaround time between Falcon 9 rockets at Launch Complex 39A at NASA’s Kennedy Space Center. The company’s most-used launch pad, SLC-40, can handle two Falcon 9 flights in less than four days.

It’s not just launch pad turnaround. SpaceX uses its drone ships—two based in Florida and one in California—for most Falcon 9 landings. In order to meet the appetite for Falcon 9 launches, SpaceX is getting rockets back to port and re-deploying drone ships back to sea at a faster rate.


The Space Force is planning what could be the first military exercise in orbit

Artist’s illustration of two satellites performing rendezvous and proximity operations in low-Earth orbit.

The US Space Force announced Thursday it is partnering with two companies, Rocket Lab and True Anomaly, for a first-of-its-kind mission to demonstrate how the military might counter “on-orbit aggression.”

On this mission, a spacecraft built and launched by Rocket Lab will chase down another satellite made by True Anomaly, a Colorado-based startup. “The vendors will exercise a realistic threat response scenario in an on-orbit space domain awareness demonstration called Victus Haze,” the Space Force’s Space Systems Command said in a statement.

This threat scenario could involve a satellite performing maneuvers that approach a US spacecraft or a satellite doing something else unusual or unexpected. In such a scenario, the Space Force wants to have the capability to respond, either to deter an adversary from taking action or to defend a US satellite from an attack.

Going up to take a look

“When another nation puts an asset up into space and we don’t quite know what that asset is, we don’t know what its intent is, we don’t know what its capabilities are, we need the ability to go up there and figure out what this thing is,” said Gen. Michael Guetlein, the Space Force’s vice chief of space operations.

This is what the Space Force wants to demonstrate with Victus Haze. For this mission, True Anomaly’s spacecraft will launch first, posing as a satellite from a potential adversary, like China or Russia. Rocket Lab will have a satellite on standby to go up and inspect True Anomaly’s spacecraft and will launch it when the Space Force gives the launch order.

“Pretty sporty,” said Even Rogers, co-founder and CEO of True Anomaly.

Then, if all goes according to plan, the two spacecraft will switch roles, with True Anomaly’s Jackal satellite actively maneuvering around Rocket Lab’s satellite. According to the Space Force, True Anomaly and Rocket Lab will deliver their spacecraft no later than the fall of 2025.

“If a near-peer competitor makes a movement, we need to have it in our quiver to make a counter maneuver, whether that be go up and do a show of force or go up and do space domain awareness or understand the characterization of the environment—what’s going on?” Guetlein said.

Victus Haze is the next in a series of military missions dedicated to validating Tactically Responsive Space (TacRS) capabilities. With these efforts, the Space Force and its commercial partners have shown how they can compress the time it takes to prepare and launch a satellite.

Last year, the Space Force partnered with Firefly Aerospace and Millennium Space Systems on the Victus Nox mission. The Victus Nox satellite was built and tested in less than a year and then readied for launch in less than 60 hours. Firefly successfully launched the spacecraft on its Alpha rocket 27 hours after receiving launch orders from the Space Force, a remarkable achievement in an industry where satellites take years to build and launch campaigns typically last weeks or months.

One of True Anomaly’s first two Jackal “autonomous orbital vehicles,” which launched in March on a SpaceX rideshare mission.

“We no longer have the luxury of time to wait years, even 10 or 15 years, to deliver some of these capabilities,” Guetlein said in a discussion in January hosted by the Center for Strategic and International Studies. “A tactically relevant timeline is a matter of weeks, days, or even hours.”

“Victus Haze is about continuing to break those paradigms and to show how we would rapidly put up a space domain awareness capability and operate it in real time against a threat,” Guetlein said.

The Victus Haze mission is more complicated than Victus Nox, involving two prime contractors, two spacecraft, and two rocket launches from different spaceports, all timed to occur with short timelines “to keep the demonstration as realistic as possible,” a Space Force spokesperson told Ars.

“This demonstration will ultimately prepare the United States Space Force to provide future forces to combatant commands to conduct rapid operations in response to adversary on-orbit aggression,” Space Systems Command said in a statement.


Researchers find a new organelle evolving

A photo of Braarudosphaera bigelowii with the nitroplast indicated by an arrowhead.

The complex cells that underlie animals and plants have a large collection of what are called organelles—compartments surrounded by membranes that perform specialized functions. Two of these were formed through a process called endosymbiosis, in which a once free-living organism is incorporated into a cell. These are the mitochondrion, where a former bacterium now handles the task of converting chemical energy into useful forms, and the chloroplast, where photosynthesis happens.

The fact that there are only a few cases of organelles that evolved through endosymbiosis suggests that it’s an extremely rare event. Yet researchers may have found a new case, in which an organelle devoted to fixing nitrogen from the atmosphere appears to be evolving right now. The resulting organelle, termed a nitroplast, is still in the process of specialization.

Getting nitrogen

Nitrogen is one of the elements central to life. Every DNA base, every amino acid in a protein contains at least one, and often several, nitrogen atoms. But nitrogen is remarkably difficult for life to get ahold of. N2 molecules might be extremely abundant in our atmosphere, but they’re extremely difficult to break apart. The enzymes that can, called nitrogenases, are only found in bacteria, and they don’t work in the presence of oxygen. Other organisms have to get nitrogen from their environment, which is one of the reasons we use so much energy to supply nitrogen fertilizers to many crops.

Some plants (notably legumes), however, can obtain nitrogen via a symbiotic relationship with bacteria. These plants form specialized nodules that provide a habitat for the nitrogen-fixing bacteria. This relationship is a form of endosymbiosis, where microbes take up residence inside an organism’s body or cells, with each organism typically providing chemicals that the other needs.

In more extreme cases, endosymbiosis can become obligatory, with neither organism able to survive without the other. In many insects, endosymbionts are passed on to offspring during the production of eggs, and the microbes themselves often lack key genes that would allow them to live independently.

But even states like this fall short of the situation found in mitochondria and chloroplasts. These organelles are thoroughly integrated into the cell, being duplicated and distributed when cells divide. They also have minimal genomes, with most of their proteins made by the cell and imported into the organelles. This level of integration is the product of over a billion years of evolution since the endosymbiotic relationship first started.

It’s also apparently a difficult process, based on its apparent rarity. Beyond mitochondria and chloroplasts, there’s only one confirmed example of a more recent endosymbiosis between eukaryotes and a bacterial species. (There are a number of cases where eukaryotic algae have been incorporated by other eukaryotes. Because these cells have compatible genetics, this occurs with a higher frequency.)

That’s why finding another example is such an exciting prospect.


Sketchy Botox shots spark multistate outbreak of botulism-like condition

Yikes —

So far at least six people in two states have fallen ill; four of them were hospitalized.

A woman in New Jersey receiving a Botox treatment at a Botox party in a New Jersey salon hosted by a radio station.

Sketchy cosmetic injections of what seem to be counterfeit Botox are behind a multistate outbreak of botulism-like illnesses, state health officials report.

So far, at least six people have fallen ill in two states: four in Tennessee and two in Illinois. Four of the six people required hospitalization for their condition (two in Tennessee and both cases in Illinois).

The Centers for Disease Control and Prevention is reportedly planning to issue a nationwide alert to notify clinicians of the potentially counterfeit Botox and advise them to be on the lookout for botulism-like illnesses. The agency did not immediately respond to Ars’ request for information.

Botox is a regulated drug product that contains purified, controlled quantities of the botulinum neurotoxin, which is made by certain Clostridium bacterial species, especially Clostridium botulinum. The toxin causes muscle paralysis by blocking the release of a neurotransmitter. When people are exposed to the toxin from wound infections or by accidentally eating contaminated foods, it can lead to full paralysis, including in muscles used for breathing. But the toxin can also be used safely for cosmetic procedures to smooth facial wrinkles—when approved, well-regulated doses are administered by licensed medical professionals.

Those conditions for safe use do not appear to have been met in the cases identified so far. Tennessee reported that its four cases were linked to injections given in “non-medical settings such as homes or cosmetic spas.” Investigators found that the injections were of “products with unclear origin” and that information collected so far suggests the products were counterfeit.

The two people sickened in Illinois, meanwhile, both received injections from a nurse in LaSalle County who was “performing work outside her authority.” State officials said the injections were of Botox or a similar, possibly counterfeit product.

The early symptoms of botulism can include double or blurred vision, drooping eyelids, slurred speech, difficulty swallowing, dry mouth, and difficulty breathing, Tennessee health officials noted. After that, people may suffer descending, symmetric muscle weakness that progresses over hours to days, requiring hospitalization and treatment with an anti-toxin.

Illinois officials said the two patients experienced similar symptoms, such as blurred or double vision, a droopy face, fatigue, shortness of breath, difficulty breathing, and a hoarse voice, after getting their injections.

“Illinois residents should exercise caution when considering cosmetic treatment,” Illinois Department of Public Health Director Sameer Vohra said in a statement. “Receiving these treatments in unlicensed, unapproved settings can put you or your loved ones at serious risk for health problems. Please only seek cosmetic services under the care of licensed professionals trained to do these procedures and who use FDA approved products. If you are experiencing any health problems after a recent cosmetic treatment, please contact your healthcare provider immediately for help and assistance.”


Computer scientist wins Turing Award for seminal work on randomness

Foundational questions —

Avi Wigderson helped prove that randomness is not required for efficient computation.

Avi Wigderson of the Institute for Advanced Study in Princeton is the recipient of the 2023 A.M. Turing Award.

Andrea Kane/Institute for Advanced Study

Computational scientist and mathematician Avi Wigderson of the Institute for Advanced Study (IAS) in Princeton, New Jersey, has won the 2023 A.M. Turing Award. The prize, which is given annually by the Association for Computing Machinery (ACM) to a computer scientist for their contributions to the field, comes with $1 million thanks to Google. It is named in honor of the British mathematician Alan Turing, who helped develop a theoretical foundation for understanding machine computation.

Wigderson is being honored “for foundational contributions to the theory of computation, including reshaping our understanding of the role of randomness in computation and for his decades of intellectual leadership in theoretical computer science.” He also won the prestigious Abel Prize (essentially the Nobel for mathematics) in 2021 for his work in theoretical computer science—the first person to be so doubly honored.

“Avi has made fundamental contributions to the theory of computation from parallel algorithms to cryptography to absolutely all aspects of complexity theory,” said Shafi Goldwasser, director of the Simons Institute for the Theory of Computing, who won the 2012 Turing Award. “His numerous contributions over decades to the areas of derandomization and pseudorandomness have led us to a deep understanding of the deep role of randomness in computing.”

Born in Haifa, Israel, Wigderson was the son of an electrical engineer and a nurse. His father passed his own love of solving puzzles and mathematics to his son. Wigderson was an undergraduate at the Technion (Israeli Institute of Technology) and went on to earn his PhD in computer science from Princeton in 1983. He held a few short-term positions before joining the faculty of Hebrew University three years later. He has been with the IAS since 1999 and a full-time resident since 2003.

Wigderson is also recognized as a mentor to the next generation of promising young researchers.

Andrea Kane/Institute for Advanced Study

While computers are fundamentally deterministic systems, researchers discovered in the 1970s that they could enrich their algorithms by letting them make random choices during computation in hopes of improving their efficiency. And it worked. It was often easier for computer scientists to design a fast randomized algorithm first and then “derandomize” it to obtain an efficient deterministic one.

In 1994, Wigderson co-authored a seminal paper on hardness versus randomness with Noam Nisan, demonstrating that as useful as randomness can be, it is not a necessity. Essentially, “Every probabilistic algorithm that’s efficient can be replaced by a deterministic one, so you don’t really need [randomness],” he said. “The power believed to be in probabilistic algorithms doesn’t exist.” He subsequently coauthored two more highly influential papers further extending that work on randomness, among many others.
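
The flavor of an efficient probabilistic algorithm is easy to demonstrate. Below is a minimal sketch of Freivalds’ algorithm, a classic textbook example chosen purely for illustration (it is not from Wigderson’s paper): it checks a claimed matrix product against random vectors in O(n²) time per trial, where deterministically recomputing the product would cost O(n³).

```python
import random

def freivalds(A, B, C, trials=20):
    """Probabilistically verify that A @ B == C for n x n matrices.

    Each trial tests A(Br) == Cr for a random 0/1 vector r, costing
    O(n^2) instead of the O(n^3) needed to recompute A @ B. A wrong C
    survives each trial with probability <= 1/2, so the error rate
    after `trials` rounds is at most 2**-trials.
    """
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False   # caught a mismatch: definitely A @ B != C
    return True            # almost certainly A @ B == C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
good = [[19, 22], [43, 50]]  # the true product
bad = [[19, 22], [43, 51]]   # one entry off
```

The hardness-versus-randomness result says that, under plausible hardness assumptions, algorithms like this one can be replaced by deterministic algorithms with only a polynomial slowdown.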

Wigderson’s 2019 book, Mathematics and Computation: A Theory Revolutionizing Technology and Science, is available for download on his website. “One central theme is that computation happens everywhere, not just in computers,” Wigderson told Ars. “It is part of the processes in our brain, the way we can talk and the cells in our body, but also trees growing or weather and celestial things. In all these natural processes, there are the laws of nature, which are local, and they evolve systems. Like in a computer, there are very simple rules, and you start with a problem and discover a complex solution to it. So, the methodology is applicable to essentially any science process or study. There are fantastic collaborations with statistical physics, with quantum physics, with computational biology, with economics, with social science—lots of beautiful, extremely fruitful connections.”

Avi Wigderson in conversation with David Nirenberg, director of the Institute for Advanced Study.

Wigderson’s own research is purely theoretical. “I’m not motivated by applications,” he said. “But I know that fundamental work, we find uses. Think about Alan Turing. He wrote a mathematical paper in logic in an obscure journal about Entscheidungsproblem. It was not motivated by application. But this is what starts computer science. He himself recognized the model he was suggesting is so simple, that we can just start building it.”

That said, he does confess to being pleasantly surprised by the eventual application of his work on zero-knowledge interactive proofs in the mid-1980s. With Silvio Micali and Oded Goldreich, Wigderson extended Micali’s earlier work on interactive proofs to NP problems, concluding that the solution to every such problem can also be proved with a zero-knowledge proof.

“Basically, we discovered that everything that can be proved can be proved without revealing to the person who is verifying the proof any knowledge they didn’t know,” said Wigderson. “The motivation came from cryptography, where I want to prove to you that I selected my secret key in the way the protocol requires, but I don’t want to tell you what my secret key is. The result is very general and while very satisfying, it was a theoretical solution that it seemed to me very complicated to implement. But now variants of it are part of blockchains and other crypto systems. So sometimes we are surprised by the diligence of people who really care about practice and really want to see things working.”
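
The idea is easier to feel with a toy interactive proof. The sketch below is a Schnorr-style identification protocol, a simplified stand-in rather than the general construction for NP described above, and the tiny parameters are wildly insecure: the prover convinces a verifier that it knows the discrete log x of y = g^x mod p without ever revealing x.

```python
import random

# Toy Schnorr-style proof of knowledge of a discrete log.
# Parameters are illustrative assumptions; real systems use ~256-bit groups.
p = 100003  # small prime modulus
g = 2       # public base

def run_protocol(x, y):
    """One round: the prover knows x with y = g^x mod p; the verifier
    learns only that the prover knows x, not x itself."""
    k = random.randrange(1, p - 1)     # prover's one-time secret
    t = pow(g, k, p)                   # commitment sent to the verifier
    c = random.randrange(1, p - 1)     # verifier's random challenge
    s = (k + c * x) % (p - 1)          # prover's response
    # Verifier's check: g^s == t * y^c (mod p), which holds exactly when
    # the response is consistent with knowing x.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = 4242              # prover's secret
y = pow(g, x, p)      # public value
```

A dishonest prover who does not know x can only satisfy the check by guessing, which succeeds with negligible probability per round; repeating rounds drives the cheating probability lower still.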

Wigderson remains as actively curious as ever and is particularly excited about getting to collaborate with fresh groups of postdocs every year. One current project concerns convex optimization in non-Euclidean settings. Convex optimization has been broadly applied in machine learning, signal processing, computer vision, and automatic control systems, for example. Wigderson’s project seeks to “generalize the theory to manifolds, to structures that appear in quite a variety of mathematical and physics areas—quantum information theory, invariant theory, and definitely in computer science,” he said. “It also appears in analysis, for proving inequalities, and in algebra for proving identities. It’s pretty broad, and I’m very excited about it.”


The urban-rural death divide is getting alarmingly wider for working-age Americans

Growing divide —

The cause is unclear, but poverty and worsening health care access are likely factors.

Dental students, working as volunteers, attend to patients at a Remote Area Medical (RAM) mobile dental and medical clinic on October 7, 2023 in Grundy, Virginia. More than a thousand people were expected to seek free dental, medical and vision care at the two-day event in the rural and financially struggling area of western Virginia.

In the 1960s and 1970s, people who lived in rural America fared a little better than their urban counterparts. The rate of deaths from all causes was a tad lower outside of metropolitan areas. In the 1980s, though, things evened out, and in the early 1990s, a gap emerged, with rural areas seeing higher death rates—and the gap has been growing ever since. By 1999, the gap was 6 percent. In 2019, just before the pandemic struck, the gap was over 20 percent.

While this news might not be surprising to anyone following mortality trends, a recent analysis by the Department of Agriculture’s Economic Research Service drilled down further, finding a yet more alarming chasm in the urban-rural divide. The report focused on a key indicator of population health: mortality among prime working-age adults (people ages 25 to 54), and only their natural-cause mortality (NCM) rates—deaths per 100,000 residents from chronic and acute diseases—clearing away external causes of death, including suicides, drug overdoses, violence, and accidents. On this metric, rural areas saw dramatically worsening trends compared with urban populations.

Change in age-adjusted, prime working-age, external- and natural-cause mortality rates for metro and nonmetro areas, 1999–2001 to 2017–2019.

The federal researchers compared NCM rates of prime working-age adults in two three-year periods: 1999 to 2001 and 2017 to 2019. In the first period, the NCM rate of 25- to 54-year-olds in rural areas was 6 percent higher than that of the same age group in urban areas. By the second period, the gap had grown to a whopping 43 percent. In fact, prime working-age adults in rural areas were the only age group in the US that saw an increased NCM rate between the two periods. In urban areas, working-age adults’ NCM rate declined.
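
The “gap” figures above are simply the rural rate’s percent excess over the urban rate. A quick sketch of the arithmetic (the example rates are illustrative, not the report’s numbers):

```python
def rural_excess_percent(rural_rate, urban_rate):
    """Percent by which a rural mortality rate (per 100,000 residents)
    exceeds the corresponding urban rate."""
    return (rural_rate - urban_rate) / urban_rate * 100

# Illustrative rates per 100,000 (not from the report): a rural rate of
# 429 against an urban rate of 300 corresponds to a 43 percent gap,
# while 318 against 300 corresponds to a 6 percent gap.
print(round(rural_excess_percent(429, 300)))  # 43
print(round(rural_excess_percent(318, 300)))  # 6
```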

Broken down further, the researchers found that non-Hispanic White people in rural areas had the largest NCM rate increases when compared to their urban counterparts. Among just rural residents, American Indian and Alaska Native (AIAN) and non-Hispanic White people registered the largest increases between the two time periods. In both groups, women had the largest increases. Regionally, rural residents in the South had the highest NCM rate, with the rural residents in the Northeast maintaining the lowest rate. But again, across all regions, women saw larger increases than men.

  • Age-adjusted prime working-age natural-cause mortality rates, metro and nonmetro areas, 1999–2019.

  • Change in natural-cause, crude mortality rates by 5-year age cohorts for metro and nonmetro areas, 1999–2001 to 2017–2019.

Among all rural working-age residents, the leading natural causes of death were cancer and heart disease—which was true among urban residents as well. But in rural residents, these conditions had significantly higher mortality rates than in urban residents. In 2019, for example, women in rural areas had a mortality rate from heart disease that was 69 percent higher than that of their urban counterparts. Otherwise, lung disease- and hepatitis-related mortality saw the largest increases in rural residents relative to urban peers. Broken down by gender, rural working-age women saw a 313 percent increase in mortality from pregnancy-related conditions between the study’s two time periods, the largest increase among the mortality causes. For rural working-age men, the largest increase came from hypertension-related deaths, which rose 132 percent between the two time periods.

Nonmetro age-adjusted, prime working-age mortality rates by sex for 15 leading natural causes of death, 1999–2001 and 2017–2019, as percent above or below corresponding metro rates.

The study, which drew from CDC death certificate and epidemiological data, did not explore the reasons for the increases. But, there are a number of plausible factors, the authors note. Rural areas have higher rates of poverty, which contributes to poor health outcomes and higher probabilities of death from chronic diseases. Rural areas also have differences in health behaviors compared with urban areas, including higher incidences of smoking and obesity. Further, rural areas have less access to health care and fewer health care resources. Both rural hospital closures and physician shortages in rural areas have been of growing concern among health experts, the researchers note. Last, some of the states with higher rural mortality rates, particularly those in the South, have failed to implement Medicaid expansions under the 2010 Affordable Care Act, which could help improve health care access and, thus, mortality rates among rural residents.


EPA’s PFAS rules: We’d prefer zero, but we’ll accept 4 parts per trillion

Approaching zero —

For two chemicals, any presence in water supplies is too much.

A young person drinks from a public water fountain.

Today, the Environmental Protection Agency announced that it has finalized rules for handling water supplies that are contaminated by a large family of chemicals collectively termed PFAS (perfluoroalkyl and polyfluoroalkyl substances). Commonly called “forever chemicals,” these contaminants have been linked to a huge range of health issues, including cancers, heart disease, immune dysfunction, and developmental disorders.

The final rules keep one striking aspect of the initial proposal intact: a goal of completely eliminating exposure to two members of the PFAS family. The new rules require all drinking water suppliers to monitor for the chemicals’ presence, and the EPA estimates that as many as 10 percent of them may need to take action to remove them. While that will be costly, the health benefits are expected to exceed those costs.

Going low

PFAS are a collection of hydrocarbons where some of the hydrogen atoms have been swapped out for fluorine. This swap retains the water-repellent behavior of hydrocarbons while making the molecules highly resistant to breaking down through natural processes—hence the forever chemicals moniker. They’re widely used in water-resistant clothing and non-stick cooking equipment and have found uses in firefighting foam. Their widespread use and disposal have allowed them to get into water supplies in many locations.

They’ve also been linked to an enormous range of health issues. The EPA expects that its new rules will have the following effects: fewer cancers, lower incidence of heart attacks and strokes, reduced birth complications, and a drop in other developmental, cardiovascular, liver, immune, endocrine, metabolic, reproductive, musculoskeletal, and carcinogenic effects. These are not chemicals you want to be drinking.

The striking thing was how far the EPA was willing to go to get them out of drinking water. For two chemicals, perfluorooctanoic acid (PFOA) and perfluorooctanesulfonic acid (PFOS), the Agency’s ideal contamination level is zero, meaning no exposure to these chemicals whatsoever. Since current testing equipment is limited to a sensitivity of four parts per trillion, the new rules settle for using that as the standard. Other family members see limits of 10 parts per trillion, and an additional limit sets a cap on how much total exposure is acceptable when a mixture of PFAS is present.
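
For scale: parts per trillion here is a mass fraction, and since a liter of water weighs roughly a kilogram, 4 ppt works out to about 4 nanograms of contaminant per liter. A quick sketch of the conversion (the 2-liter daily intake is an illustrative assumption, not part of the rule):

```python
WATER_GRAMS_PER_LITER = 1000.0   # ~1 kg of water per liter

def ppt_to_ng_per_liter(ppt):
    """Convert a mass fraction in parts per trillion to nanograms of
    contaminant per liter of water."""
    grams_per_liter = ppt * 1e-12 * WATER_GRAMS_PER_LITER
    return grams_per_liter * 1e9  # grams -> nanograms

limit = ppt_to_ng_per_liter(4)    # the new PFOA/PFOS standard, ~4 ng/L
daily = limit * 2                 # assumed 2 L/day drinking water intake
print(limit)   # ~4 ng per liter
print(daily)   # ~8 ng per day at the limit
```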

Overall, the EPA estimates that there are roughly 66,000 drinking water suppliers that will be subject to these new rules. They’ll be given three years to get monitoring and testing programs set up and provided access to funds from the Bipartisan Infrastructure Law to help offset the costs. All told, over $20 billion will be made available for the testing and improvements to equipment needed for compliance.

The Agency expects that somewhere between 4,000 and 6,500 of those systems will require some form of decontamination. While those represent a relatively small fraction of the total drinking water suppliers, it’s estimated that nearly a third of the US’ population will see its exposure to PFAS drop. Several technologies, including reverse osmosis and exposure to activated carbon, are capable of pulling PFAS from water, and the EPA is leaving it up to each supplier to choose a preferred method.

Cost/benefit

All of that monitoring and decontamination will not come cheap. The EPA estimates that the annual costs will be in the neighborhood of $1.5 billion, which will likely be passed on to consumers via their water suppliers. Those same consumers, however, are expected to see health benefits that outweigh these costs: EPA estimates place the impact of just three of the health improvements (cancer, cardiovascular, and birth complications) at roughly $1.5 billion annually, and adding in the rest of the health improvements should push the total benefits well past the costs.

The problem, of course, is that people will immediately notice the increased cost on their water bills, while the savings from medical problems that never happen are much more abstract.

Overall, the final plan is largely unchanged from the EPA’s original proposal. The biggest differences are that the Agency is giving water suppliers more time to comply, has set somewhat more specific exposure allowances, and will allow suppliers with minimal contamination to go longer between submitting test results.

“People will live longer, healthier lives because of this action, and the benefits justify the costs,” the agency concluded in announcing the new rules.


After a fiery finale, the Delta rocket family now belongs to history

Delta 389 —

“It is bittersweet to see the last one, but there are great things ahead.”

In this video frame from ULA’s live broadcast, three RS-68A engines power the Delta IV Heavy rocket into the sky over Cape Canaveral, Florida.

United Launch Alliance

The final flight of United Launch Alliance’s Delta IV Heavy rocket took off Tuesday from Cape Canaveral, Florida, with a classified spy satellite for the National Reconnaissance Office.

The Delta IV Heavy, one of the world’s most powerful rockets, launched for the 16th and final time Tuesday. It was the 45th and last flight of a Delta IV launcher and the final rocket named Delta to ever launch, ending a string of 389 missions dating back to 1960.

United Launch Alliance (ULA) tried to launch this rocket on March 28 but aborted the countdown about four minutes prior to liftoff due to trouble with nitrogen pumps at an off-site facility at Cape Canaveral. The nitrogen is necessary for purging parts inside the Delta IV rocket before launch, reducing the risk of a fire or explosion during the countdown.

The pumps, operated by Air Liquide, are part of a network that distributes nitrogen to different launch pads at the Florida spaceport. The nitrogen network has caused problems before, most notably during the first launch campaign for NASA’s Space Launch System rocket in 2022. Air Liquide did not respond to questions from Ars.

A flawless liftoff

With a solution in place, ULA gave the go-ahead for another launch attempt Tuesday. After a smooth countdown, the final Delta IV Heavy lifted off from Cape Canaveral Space Force Station at 12:53 pm EDT (16:53 UTC).

Three hydrogen-fueled RS-68A engines made by Aerojet Rocketdyne flashed to life in the final seconds before launch and throttled up to produce more than 2 million pounds of thrust. The ignition sequence was accompanied by a dramatic hydrogen fireball, a hallmark of Delta IV Heavy launches, that singed the bottom of the 235-foot-tall (71.6-meter) rocket, turning a patch of its orange insulation black. Then, 12 hold-down bolts fired and freed the Delta IV Heavy for its climb into space with a top-secret payload for the US government’s spy satellite agency.

Heading east from Florida’s Space Coast, the Delta IV Heavy appeared to perform well in the early phases of its mission. After the rocket faded from the view of ground-based cameras, its two liquid-fueled side boosters jettisoned around four minutes into the flight, a moment captured by onboard video cameras. The core stage’s engine then throttled up to fire for a couple more minutes. Nearly six minutes after liftoff, the core stage was released, and the Delta IV upper stage took over for a series of burns with its RL10 engine.

At that point, ULA cut the public video and audio feeds from the launch control center, and the mission flew into a news blackout. The final portions of rocket launches carrying National Reconnaissance Office (NRO) satellites are usually performed in secret.

In all likelihood, the Delta IV Heavy’s upper stage was expected to fire its engine at least three times to place the classified NRO satellite into a circular geostationary orbit more than 22,000 miles (nearly 36,000 kilometers) over the equator. In this orbit, the spacecraft will move in lock-step with the planet’s rotation, giving the NRO’s newest spy satellite constant coverage over a portion of the Earth.
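
That "more than 22,000 miles" figure falls out of Kepler's third law when the orbital period is set to one sidereal day (the time Earth takes to rotate once relative to the stars). A quick sketch of the derivation, using standard textbook values for Earth's gravitational parameter and equatorial radius:

```python
import math

# Geostationary orbit radius from Kepler's third law: r^3 = GM * T^2 / (4 * pi^2)
GM_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter
T_SIDEREAL = 86164.1       # s, one sidereal day
R_EQUATOR = 6378.137e3     # m, Earth's equatorial radius

r = (GM_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EQUATOR) / 1000
print(f"{altitude_km:,.0f} km above the equator")  # ~35,786 km
```

Any satellite at that altitude over the equator completes one orbit per rotation of the Earth, which is what keeps it fixed over the same spot on the ground.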

It will take about six hours for the rocket’s upper stage to deploy its payload into this high-altitude orbit, and only then will ULA and the NRO declare the launch a success.

Eavesdropping from space

While the payload is classified, experts can glean a few insights from the circumstances of its launch. Only the largest NRO spy satellites require a launch on a Delta IV Heavy, and the payload on this mission is “almost certainly” a type of satellite known publicly as an “Advanced Orion” or “Mentor” spacecraft, according to Marco Langbroek, an expert Dutch satellite tracker.

The Advanced Orion satellites require the combination of the Delta IV Heavy rocket’s lift capability, long-duration upper stage, and huge, 65-foot-long (19.8-meter) trisector payload fairing, the largest payload enclosure of any operational rocket. In 2010, Bruce Carlson, then-director of the NRO, referred to the Advanced Orion platform as the “largest satellite in the world.”

When viewed from Earth, these satellites shine with the brightness of an eighth-magnitude star, making them easily visible with small binoculars despite their distant orbits, according to Ted Molczan, a skywatcher who tracks satellite activity.
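
Astronomical magnitudes run logarithmically, with a difference of five magnitudes defined as a factor of 100 in brightness, which puts "eighth magnitude" in context against the roughly sixth-magnitude naked-eye limit. A small illustrative calculation (the naked-eye limit is a rule of thumb, not a figure from the trackers quoted here):

```python
# The magnitude scale: a 5-magnitude difference = a 100x brightness ratio,
# so each magnitude step is a factor of 100**(1/5), about 2.512.
def brightness_ratio(mag_faint, mag_bright):
    return 100 ** ((mag_faint - mag_bright) / 5)

# An 8th-magnitude satellite vs. the ~6th-magnitude naked-eye limit:
print(f"{brightness_ratio(8, 6):.2f}x too faint for the naked eye")  # ~6.31x
```

A factor of six or so is well within the light-gathering power of small binoculars, which is why these distant satellites are easy targets for amateur trackers.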

“The satellites feature a very large parabolic unfoldable mesh antenna, with estimates of the size of this antenna ranging from 20 to 100 (!) meters,” Langbroek writes on his website, citing information leaked by Edward Snowden.

The purpose of these Advanced Orion satellites, each with mesh antennas that unfurl to a diameter of up to 330 feet (100 meters), is to listen in on communications and radio transmissions from US adversaries, and perhaps allies. Six previous Delta IV Heavy missions also likely launched Advanced Orion or Mentor satellites, giving the NRO a global web of listening posts parked high above the planet.

With the last Delta IV Heavy off the launch pad, ULA has achieved a goal of the corporate strategy it set in motion a decade ago, when the company decided to retire the Delta IV and Atlas V rockets in favor of a new-generation rocket named Vulcan. The first Vulcan rocket successfully launched in January, so the last few months have been a time of transition for ULA, a 50-50 joint venture owned by Boeing and Lockheed Martin.

“This is such an amazing piece of technology: 23 stories tall, half a million gallons of propellant, two and a quarter million pounds of thrust, and the most metal of all rockets, setting itself on fire before it goes to space,” Tory Bruno, ULA’s chief executive, said of the Delta IV Heavy before its final launch. “Retiring it is (key to) the future, moving to Vulcan, a less expensive, higher-performance rocket. But it’s still sad.”

“Everything that Delta has done … is being done better on Vulcan, so this is a great evolutionary step,” said Bill Cullen, ULA’s launch systems director. “It is bittersweet to see the last one, but there are great things ahead.”


RIP Peter Higgs, who laid foundation for the Higgs boson in the 1960s

A particle physics hero —

Higgs shared the 2013 Nobel Prize in Physics with François Englert.

A visibly emotional Peter Higgs was present when CERN announced the Higgs boson discovery in July 2012.

University of Edinburgh

Peter Higgs, the shy, somewhat reclusive physicist who won a Nobel Prize for his theoretical work on how the Higgs boson gives elementary particles their mass, has died at the age of 94. According to a statement from the University of Edinburgh, the physicist passed “peacefully at home on Monday 8 April following a short illness.”

“Besides his outstanding contributions to particle physics, Peter was a very special person, a man of rare modesty, a great teacher and someone who explained physics in a very simple and profound way,” Fabiola Gianotti, director general at CERN and former leader of one of the experiments that helped discover the Higgs particle in 2012, told The Guardian. “An important piece of CERN’s history and accomplishments is linked to him. I am very saddened, and I will miss him sorely.”

The Higgs boson is a manifestation of the Higgs field, an invisible entity that pervades the Universe. Interactions between the Higgs field and particles help provide particles with mass, with particles that interact more strongly having larger masses. The Standard Model of Particle Physics describes the fundamental particles that make up all matter, like quarks and electrons, as well as the particles that mediate their interactions through forces like electromagnetism and the weak force. Back in the 1960s, theorists extended the model to incorporate what has become known as the Higgs mechanism, which provides many of the particles with mass. One consequence of the Standard Model’s version of the Higgs mechanism is that there should be a force-carrying particle, called a boson, associated with the Higgs field.

Despite its central role in the function of the Universe, the road to predicting the existence of the Higgs boson was bumpy, as was the process of discovering it. As previously reported, the idea of the Higgs boson was a consequence of studies on the weak force, which controls the decay of radioactive elements. The weak force only operates at very short distances, which suggests that the particles that mediate it (the W and Z bosons) are likely to be massive. While it was possible to use existing models of physics to explain some of their properties, these predictions had an awkward feature: just like another force-carrying particle, the photon, the resulting W and Z bosons were massless.

Schematic of the Standard Model of particle physics.

Over time, theoreticians managed to craft models that included massive W and Z bosons, but they invariably came with a hitch: a massless partner, which would imply a longer-range force. In 1964, however, a series of papers was published in rapid succession that described a way to get rid of this problematic particle. If a certain symmetry in the models was broken, the massless partner would go away, leaving only a massive one.

The first of these papers, by François Englert and Robert Brout, proposed the new model in terms of quantum field theory; the second, by Higgs (then 35), noted that a single quantum of the field would be detectable as a particle. A third paper, by Gerald Guralnik, Carl Richard Hagen, and Tom Kibble, provided an independent validation of the general approach, as did a completely independent derivation by students in the Soviet Union.

At that time, “There seemed to be excitement and concern about quantum field theory (the underlying structure of particle physics) back then, with some people beginning to abandon it,” David Kaplan, a physicist at Johns Hopkins University, told Ars. “There were new particles being regularly produced at accelerator experiments without any real theoretical structure to explain them. Spin-1 particles could be written down comfortably (the photon is spin-1) as long as they didn’t have a mass, but the massive versions were confusing to people at the time. A bunch of people, including Higgs, found this quantum field theory trick to give spin-1 particles a mass in a consistent way. These little tricks can turn out to be very useful, but also give the landscape of what is possible.”

“It wasn’t clear at the time how it would be applied in particle physics.”

Ironically, Higgs’ seminal paper was rejected by the European journal Physics Letters. He then added a crucial couple of paragraphs noting that his model also predicted the existence of what we now know as the Higgs boson. He submitted the revised paper to Physical Review Letters in the US, where it was accepted. He examined the properties of the boson in more detail in a 1966 follow-up paper.


EPA seeks to cut “Cancer Alley” pollutants

Out of the air —

Chemical plants will have to monitor how much is escaping and stop leaks.

An oil refinery in Louisiana. Facilities such as this have led to a proliferation of petrochemical plants in the area.

On Tuesday, the US Environmental Protection Agency announced new rules that are intended to cut emissions of two chemicals that have been linked to elevated incidence of cancer: ethylene oxide and chloroprene. While production and use of these chemicals takes place in a variety of locations, they’re particularly associated with an area of petrochemical production in Louisiana that has become known as “Cancer Alley.”

The new regulations would require chemical manufacturers to monitor the emissions at their facilities and take steps to repair any problems that result in elevated emissions. Despite extensive evidence linking these chemicals to elevated risk of cancer, industry groups are signaling their opposition to these regulations, and the EPA has seen two previous attempts at regulation set aside by courts.

Dangerous stuff

The two chemicals at issue are primarily used as intermediates in the manufacture of common products. Chloroprene, for example, is used in the production of neoprene, a synthetic rubber-like substance that’s probably familiar from products like insulated sleeves and wetsuits. It’s a four-carbon chain with two double bonds that allow for polymerization, plus an attached chlorine atom that alters its chemical properties.

According to the National Cancer Institute (NCI), chloroprene “is a mutagen and carcinogen in animals and is reasonably anticipated to be a human carcinogen.” Given that cancers are driven by DNA damage, any mutagen would be “reasonably anticipated” to drive the development of cancer. Beyond that, it appears to be pretty nasty stuff, with the NCI noting that “exposure to this substance causes damage to the skin, lungs, CNS, kidneys, liver and depression of the immune system.”

The NCI’s take on ethylene oxide is even more definitive, with the Institute placing it on its list of cancer-causing substances. The chemical is very simple: two carbons that are linked to each other both directly and via an oxygen atom, which makes the molecule look a bit like a triangle. This configuration allows the molecule to participate in a broad range of reactions that break one of the oxygen bonds, making it useful in the production of a huge range of chemicals. Its reactivity also makes it useful for sterilizing items such as medical equipment.

Its sterilization function works through causing damage to DNA, which again makes it prone to causing cancers.

In addition to these two chemicals, the EPA’s new regulations will target a number of additional airborne pollutants, including benzene, 1,3-butadiene, ethylene dichloride, and vinyl chloride, all of which have similar entries at the NCI.

Despite the extensive record linking these chemicals to cancer, The New York Times quotes the US Chamber of Commerce, a pro-industry group, as saying that “EPA should not move forward with this rule-making based on the current record because there remains significant scientific uncertainty.”

A history of exposure

The petrochemical industry is the main source of these chemicals, so their release is associated with areas where the oil and gas industry has a major presence; the EPA notes that the regulations will target sources in Delaware, New Jersey, and the Ohio River Valley. But the primary focus will be on chemical plants in Texas and Louisiana. These include the area that has picked up the moniker Cancer Alley due to a high incidence of the disease in a stretch along the Mississippi River with a large concentration of chemical plants.

As is the case with many examples of chemical pollution, the residents of Cancer Alley are largely poor and belong to minority groups. As a result, the EPA had initially attempted to regulate the emissions under a civil rights provision of the Clean Air Act, but that has been bogged down due to lawsuits.

The new regulations simply set limits on permissible levels of release at what’s termed the “fencelines” of the facilities where these chemicals are made, used, or handled. If levels exceed an annual limit, the owners and operators “must find the source of the pollution and make repairs.” This gets rid of previous exemptions for equipment startup, shutdown, and malfunctions; those exemptions had been held to violate the Clean Air Act in a separate lawsuit.

The EPA estimates that the sites subject to regulation will see their collective emissions of these chemicals drop by nearly 80 percent, which works out to be 54 tons of ethylene oxide, 14 tons of chloroprene, and over 6,000 tons of the other pollutants. That in turn will reduce the cancer risk from these toxins by 96 percent among those subjected to elevated exposures. Collectively, the chemicals subject to these regulations also contribute to smog, so these reductions will have an additional health impact by reducing its levels as well.

While the EPA says that “these emission reductions will yield significant reductions in lifetime cancer risk attributable to these air pollutants,” it was unable to come up with an estimate of the financial benefits that will result from that reduction. By contrast, it estimates that the cost of compliance will end up being approximately $150 million annually. “Most of the facilities covered by the final rule are owned by large corporations,” the EPA notes. “The cost of implementing the final rule is less than one percent of their annual national sales.”

This sort of cost-benefit analysis is a required step during the formulation of Clean Air Act regulations, so it’s worth taking a step back and considering what’s at stake here: the EPA is basically saying that companies that work with significant amounts of carcinogens need to take stronger steps to make sure that they don’t use the air people breathe as a dumping ground for them.

Unsurprisingly, The New York Times quotes a neoprene manufacturer that the EPA is currently suing over its chloroprene emissions as claiming the new regulations are “draconian.”


Moments of totality: How Ars experienced the eclipse

Total eclipse of the Ars —

The 2024 total eclipse is in the books. Here’s how it looked across the US.

Baily’s Beads are visible in this shot taken by Stephen Clark in Athens, Texas.

Stephen Clark

“And God said, Let there be light: and there was light. And God saw the light, that it was good: and God divided the light from the darkness. And God called the light Day, and the darkness he called Night. And the evening and the morning were the first day.”

The steady rhythm of the night-day, dark-light progression is a phenomenon acknowledged in ancient sacred texts as a given. When it’s interrupted, people take notice. In the days leading up to the eclipse, excitement within the Ars Orbiting HQ grew, and plans to experience the last total eclipse in the continental United States until 2045 were made. Here’s what we saw across the country.

Kevin Purdy (watched from Buffalo, New York)

  • 3:19 pm on April 8 in Buffalo, overlooking Richmond Ave. near Symphony Circle.

    Kevin Purdy

  • A view of First Presbyterian Church from Richmond Avenue in Buffalo, NY.

    Kevin Purdy

  • The cloudy, strange skies at 3:12 pm Eastern time in Buffalo on April 8.

    Kevin Purdy

  • A kind of second sunrise at 3:21 pm on April 8 in Buffalo.

    Kevin Purdy

  • A clearer view of the total eclipse from Colden, New York, 30 minutes south of Buffalo on April 8, 2024.

    Sabrina May

Buffalo, New York, is a frequently passed-over city. Super Bowl victories, the shift away from Great Lakes shipping and American-made steel, being the second-largest city in a state that contains New York City: This city doesn’t get many breaks.

So, with Buffalo in the eclipse’s prime path, I, a former resident and booster, wanted to be there. So did maybe a million people, doubling the wider area’s population. With zero hotels, negative Airbnbs, and no flights below trust-fund prices, I arrived early, stayed late, and slept on sofas and air mattresses. I wanted to see if Buffalo’s moment of global attention would go better than last time.

The day started cloudy, as is typical in early April here. With one hour to go, I chatted with Donald Blank. He was filming an eclipse time-lapse as part of a larger documentary on Buffalo: its incredible history, dire poverty, heroes, mistakes, everything. The shot he wanted had the First Presbyterian Church, with its grand spire and Tiffany windows, in the frame. A 200-year-old stone church adds a certain context to a solar event many of us humans will never see again.

The sky darkened. Automatic porch lights flicked on at 3:15 pm, then street lights, then car lights, for those driving to somehow more important things. People on front lawns cheered, clapped, and quietly couldn’t believe it. When it was over, I heard a neighbor say she had forgotten her phone inside. Blank walked over and offered to email her some shots he took. It was very normal in Buffalo, even when it was strange.

Benj Edwards (Raleigh, North Carolina)

  • Benj’s low-tech, but creative way of viewing the eclipse.

    Benj Edwards

  • So many crescents.

    Benj Edwards

I’m in Raleigh, North Carolina, and we were lucky to have a clear day today. We reached peak eclipse at around 3:15 pm (but not total eclipse, sadly), and leading up to that time, the sun slowly began to dim as I looked out my home office window. Around 3 pm, I went outside on the back deck and began crafting makeshift pinhole lenses from cardboard and a steel awl, poking holes so that my kids and I could see the crescent shape of the eclipse projected indirectly onto a dark surface.

My wife had also bought some eclipse glasses from a local toy store, and I very briefly tried them while squinting. I could see the eclipse well, but my eyes still felt a little blurry afterward. I didn’t trust the glasses enough to let the kids use them. For the 2017 eclipse, I had purchased very dark welder’s lenses that I have since lost; even then, I think I got a little bit of eye damage, and a floater that formed in my left eye still plagues me to this day. I have the feeling I’ll never learn this lesson, and the next time an eclipse comes around, I’ll just continue to get progressively more blind. But oh what fun to see the sun eclipsed.

Beth Mole (Raleigh, North Carolina)

Another view from Raleigh.

Beth Mole

It was a perfect day for eclipse watching in North Carolina—crystal clear blue sky and a high of 75. Our peak was at 3:15 pm with 78.6 percent sun coverage. The first hints of the moon’s pass came just before 2 pm. The whole family was out in the backyard (alongside a lot of our neighbors!), ready with pinhole viewers, a couple of the NASA-approved cereal-box viewers, and eclipse glasses. We all watched as the moon progressively slipped in and stole the spotlight. At peak coverage, it was noticeably dimmer, and it got remarkably cooler and quieter. It was not nearly as dramatic as being in the path of totality, but still really neat and fun. My 5-year-old had a blast watching the sun go from circle to bitten cookie to banana and back again.


Teen’s vocal cords act like coin slot in worst-case ingestion accident

What are the chances? —

Luckily his symptoms were relatively mild, but doctors noted ulceration of his airway.


Most of the time, when kids accidentally gulp down a non-edible object, it travels toward the stomach. In the best-case scenarios for these unfortunate events, it’s a small, benign object that safely sees itself out in a day or two. But in the worst-case scenarios, it can go down an entirely different path.

That was the case for a poor teen in California who somehow swallowed a quarter. The quarter didn’t head down the esophagus toward the stomach but veered into the airway, sliding past the vocal cords like they were a vending-machine coin slot.

Radiographs of the chest (Panel A, posteroanterior view) and neck (Panel B, lateral view). Removal with optical forceps (Panel C and Video 1), and reinspection of ulceration (Panel D, asterisks).

In a clinical report published recently in the New England Journal of Medicine, doctors who treated the 14-year-old boy reported how they found—and later retrieved—the quarter from its unusual and dangerous resting place. Once it passed the vocal cords and the glottis, the coin got lodged in the subglottis, a small region between the vocal cords and the trachea.

Luckily, when the boy arrived at the emergency department, his main symptoms were hoarseness and difficulty swallowing. Surprisingly, he was breathing comfortably and wasn’t drooling, they noted. But imaging quickly revealed the danger to his airway when the vertical coin lit up his scans.

“Airway foreign bodies—especially those in the trachea and larynx—necessitate immediate removal to reduce the risk of respiratory compromise,” they wrote in the NEJM report.

The teen was placed under general anesthesia while doctors used long optical forceps, guided by a camera, to pluck the coin from its snug spot. After grabbing the coin, they re-inspected the boy’s airway, noting ulcerations on each side that matched the coin’s ribbed edge.

After the coin’s retrieval, the boy’s symptoms improved and he was discharged home, the doctors reported.
