Science


Simple voltage pulse can restore capacity to Li-Si batteries

The new work, then, is based on a hypothetical: What if we just threw silicon particles in, let them fragment, and then fixed them afterward?

As mentioned, the reason fragmentation is a problem is that it leads to small chunks of silicon that have essentially dropped off the grid—they’re no longer in contact with the system that shuttles charges into and out of the electrode. In many cases, these particles are also partly filled with lithium, which takes that lithium out of circulation, cutting the battery’s capacity even if there’s sufficient electrode material around.

The researchers involved here, all based at Stanford University, decided there was a way to nudge these fragments back into contact with the electrical system and demonstrated it could restore a lot of capacity to a badly degraded battery.

Bringing things together

The idea behind the new work was that it could be possible to attract the fragments of silicon back to an electrode, or at least to some other material connected to the charge-handling network. On their own, the fragments in the anode shouldn’t have a net charge; when the lithium there gives up an electron, it should go back into solution. But the lithium is unlikely to be evenly distributed across a fragment, making it polar—net neutral overall, but with regions of higher and lower electron density. And polar materials will move in an uneven electric field.
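The physics being relied on here is, in standard electrokinetics, called dielectrophoresis: a net-neutral but polarizable particle feels a force proportional to the gradient of the squared field magnitude. For a small sphere, the textbook time-averaged form (included here for orientation; the paper’s own analysis may use a different model) is:

```latex
\mathbf{F}_{\mathrm{DEP}} = 2\pi r^{3}\,\varepsilon_{m}\,\mathrm{Re}\!\left[K(\omega)\right]\,\nabla\left|\mathbf{E}\right|^{2}
```

where r is the particle radius, ε_m the permittivity of the surrounding medium, and K(ω) the Clausius–Mossotti factor. The key point is that the force vanishes when the field is uniform—only a nonuniform field moves the particles.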

And, because of the uneven, chaotic structure of an electrode down at the nano scale, any voltage applied to it will create an uneven electric field. Depending on its local structure, that field may attract or repel some of the particles. But because these are mostly within the electrode’s structure, most of the silicon fragments are likely to bump into some other part of the electrode in short order. And that could potentially re-establish a connection to the electrode’s current-handling system.

To demonstrate that what should happen in theory actually does happen in an electrode, the researchers started by taking a used electrode and brushing some of its surface off into a solution. They then applied a voltage across the solution and confirmed that the small bits of material from the battery started moving toward one of the electrodes used to apply the voltage. So, things worked as expected.



Rocket Report: Bloomberg calls for SLS cancellation; SpaceX hits century mark


All the news that’s fit to lift

“For the first time, Canada will host its own homegrown rocket technology.”

SpaceX’s fifth flight test ended in success. Credit: SpaceX

Welcome to Edition 7.16 of the Rocket Report! Even several days later, it remains difficult to process the significance of what SpaceX achieved in South Texas last Sunday. The moment of seeing a rocket fall out of the sky and be captured by two arms felt historic to me, as historic as the company’s first drone ship landing in April 2016. What a time to be alive.

As always, we welcome reader submissions, and if you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets as well as a quick look ahead at the next three launches on the calendar.

Surprise! Rocket Lab adds a last-minute mission. After signing a launch contract less than two months ago, Rocket Lab says it will launch a customer as early as Saturday from New Zealand on board its Electron launch vehicle. Rocket Lab added that the customer for the expedited mission, to be named “Changes In Latitudes, Changes In Attitudes,” is confidential. This is an impressive turnaround in launch times and will allow Rocket Lab to burnish its credentials for the US Space Force, which has prioritized “responsive” launch in recent years.

Rapid turnaround down under … The basic idea is that if an adversary were to take out assets in space, the military would like to be able to rapidly replace them. “This quick turnaround from contract to launch is not only a showcase of Electron’s capability, but also of the relentless and fast-paced execution by the experienced team behind it that continues to deliver trusted and reliable access to space for our customers,” Rocket Lab Chief Executive Peter Beck said in a statement. (submitted by EllPeaTea and Ken the Bin)

Canadian spaceport and rocket firm link up. A Canadian spaceport developer, Maritime Launch Services, says it has partnered with a Canadian rocket firm, Reaction Dynamics. Initially, Reaction Dynamics will attempt a suborbital launch from the Nova Scotia-based spaceport. This first mission will serve as a significant step toward enabling Canada’s first-ever orbital launch of a domestically developed rocket, Space Daily reports.

A homegrown effort … “For the first time, Canada will host its own homegrown rocket technology, launched from a Canadian-built commercial spaceport, offering launch vehicle and satellite customers the opportunity to reach space without leaving Canadian soil,” said Stephen Matier, president and CEO of Maritime Launch. Reaction Dynamics is developing the Aurora rocket, which uses hybrid-propulsion technology and is projected to have a payload capacity of 200 kg to low-Earth orbit. (submitted by Joey Schwartz and brianrhurley)


Sirius completes engine test campaign. French launch startup Sirius Space Services said Thursday that it had completed a hot fire test campaign of the thrust chamber for its STAR-1 rocket engine, European Spaceflight reports. During the campaign, the prototype completed two 60-second hot fire tests powered by liquid methane and liquid oxygen. The successful completion of the testing validates the design of the STAR-1 thrust chamber. Full-scale engine testing may begin during the second quarter of next year.

A lot of engines needed … Sirius Space Services is developing a range of three rockets that all use a modular booster system. Sirius 1 will be a two-stage single-stick rocket capable of delivering 175 kilograms to low-Earth orbit. Sirius 13 will feature two strap-on boosters and will have the capacity to deliver 600 kilograms. Finally, the Sirius 15 rocket will feature four boosters and will be capable of carrying payloads of up to 1,000 kilograms. (submitted by Ken the Bin)

SpaceX, California commission lock horns over launch rates. Last week the California Coastal Commission rejected a plan agreed to between SpaceX and the US Space Force to increase the number of launches from Vandenberg Space Force Base to as many as 50 annually, the Los Angeles Times reports. The commission voted 6–4 to block the request to increase from a maximum of 36 launches. In rejecting the plan, some members of the commission cited their concerns about Elon Musk, the owner of SpaceX. “We’re dealing with a company, the head of which has aggressively injected himself into the presidential race,” commission Chair Caryl Hart said.

Is this a free speech issue? … SpaceX responded to the dispute quickly, suing the California commission in federal court on Tuesday, Reuters reports. The company seeks an order that would bar the agency from regulating the company’s workhorse Falcon 9 rocket launch program at Vandenberg. The lawsuit claims the commission, which oversees use of land and water within the state’s more than 1,000 miles of coastline, unfairly asserted regulatory powers. Musk’s lawsuit called any consideration of his public statements improper, violating speech rights protected by the US Constitution. (submitted by brianrhurley)

SpaceX launches 100th rocket of the year. SpaceX launched its 100th rocket of the year early Tuesday morning and followed it up with another liftoff just hours later, Space.com reports. SpaceX’s centenary mission of the year lifted off from Florida with a Falcon 9 rocket carrying 23 of the company’s Starlink Internet satellites aloft.

Mostly Falcon 9s … The company followed that milestone with another launch two hours later from the opposite US coast. SpaceX’s 101st liftoff of 2024 saw 20 more Starlinks soar to space from Vandenberg Space Force Base in California. The company has already exceeded its previous record for annual launches, 98, set last year. The company’s tally in 2023 included 91 Falcon 9s, five Falcon Heavies, and two Starships. This year the mix is similar. (submitted by Ken the Bin)

Fifth launch of Starship a massive success. SpaceX accomplished a groundbreaking engineering feat Sunday when it launched the fifth test flight of its gigantic Starship rocket and then caught the booster back at the launch pad in Texas with mechanical arms seven minutes later, Ars reports. This achievement is the first of its kind, and it’s crucial for SpaceX’s vision of rapidly reusing the Starship rocket, enabling human expeditions to the Moon and Mars, routine access to space for mind-bogglingly massive payloads, and novel capabilities that no other company—or country—seems close to attaining.

Catching a rocket by its tail … High over the Gulf of Mexico, the first stage of the Starship rocket used its engines to reverse course and head back toward the Texas coastline. After reaching a peak altitude of 59 miles (96 kilometers), the Super Heavy booster began a supersonic descent before reigniting 13 engines for a final braking burn. The rocket then shifted down to just three engines for the fine maneuvering required to position the rocket in a hover over the launch pad. That’s when the launch pad’s tower, dubbed Mechazilla, ensnared the rocket in its two weight-bearing mechanical arms, colloquially known as “chopsticks.” The engines switched off, leaving the booster suspended perhaps 200 feet above the ground. The upper stage of the rocket, Starship, executed what appeared to be a nominal vertical landing into the Indian Ocean as part of its test flight.

Clipper launches on Falcon Heavy. NASA’s Europa Clipper spacecraft lifted off Monday from Kennedy Space Center in Florida aboard a SpaceX Falcon Heavy rocket, Ars reports, kicking off a $5.2 billion robotic mission to explore one of the most promising locations in the Solar System for finding extraterrestrial life. Delayed several days due to Hurricane Milton, which passed through Central Florida late last week, the launch of Europa Clipper signaled the start of a five-and-a-half-year journey to Jupiter, where the spacecraft will settle into an orbit taking it repeatedly by one of the giant planet’s numerous moons.

Exploring oceans, saving money … There’s strong evidence of a global ocean of liquid water below Europa’s frozen crust, and Europa Clipper is going there to determine if it has the ingredients for life. “This is an epic mission,” said Curt Niebur, Europa Clipper’s program scientist at NASA Headquarters. “It’s a chance for us not to explore a world that might have been habitable billions of years ago, but a world that might be habitable today, right now.” The Clipper mission was originally supposed to launch on NASA’s Space Launch System rocket, but it had to be moved off that vehicle because vibrations from the solid rocket motors could have damaged the spacecraft. The change to Falcon Heavy also saved the agency $2 billion.

ULA recovers pieces of shattered booster nozzle. When the exhaust nozzle on one of the Vulcan rocket’s strap-on boosters failed shortly after liftoff earlier this month, it scattered debris across the beachfront landscape just east of the launch pad on Florida’s Space Coast, Ars reports. United Launch Alliance, the company that builds and launches the Vulcan rocket, is investigating the cause of the booster anomaly before resuming Vulcan flights. Despite the nozzle failure, the rocket continued its climb and ended up reaching its planned trajectory heading into deep space.

Not clear what the schedule impacts will be … The nozzle fell off one of Vulcan’s two solid rocket boosters around 37 seconds after taking off from Cape Canaveral Space Force Station on October 4. A shower of sparks and debris fell away from the Vulcan rocket when the nozzle failed. Julie Arnold, a ULA spokesperson, confirmed to Ars that the company has retrieved some of the debris. “We recovered some small pieces of the GEM 63XL SRB nozzle that were liberated in the vicinity of the launch pad,” Arnold said. “The team is inspecting the hardware to aid in the investigation.” ULA has not publicly said what impacts there might be on the timeline for the next Vulcan launch, USSF-106, which had been due to occur before the end of this year.

Bloomberg calls for cancellation of the SLS rocket. In an op-ed that is critical of NASA’s Artemis Program, billionaire Michael Bloomberg—the founder of Bloomberg News and a former US Presidential candidate—called for cancellation of the Space Launch System rocket. “Each launch will likely cost at least $4 billion, quadruple initial estimates,” Bloomberg wrote. “This exceeds private-sector costs many times over, yet it can launch only about once every two years and—unlike SpaceX’s rockets—can’t be reused.”

NASA is falling behind … Bloomberg essentially is calling for the next administration to scrap all elements of the Artemis Program that are not essential to establishing and maintaining a presence on the surface of the Moon. “A celestial irony is that none of this is necessary,” he wrote. “A reusable SpaceX Starship will very likely be able to carry cargo and robots directly to the moon—no SLS, Orion, Gateway, Block 1B or ML-2 required—at a small fraction of the cost. Its successful landing of the Starship booster was a breakthrough that demonstrated how far beyond NASA it is moving.” None of the arguments that Bloomberg is advancing are new, but it is noteworthy to hear them from such a prominent person who is outside the usual orbit of space policy commentators.

Artemis II likely to be delayed. A new report from the US Government Accountability Office found that NASA’s Exploration Ground Systems program—that is, essentially, the office at Kennedy Space Center in Florida responsible for building ground infrastructure to support the Space Launch System rocket and Orion—is in danger of missing its schedule for Artemis II, according to Ars Technica. The new report, published Thursday, finds that the Exploration Ground Systems program had several months of schedule margin in its work toward a September 2025 launch date at the beginning of the year. But now, the program has allocated all of that margin to technical issues experienced during work on the rocket’s mobile launcher and pad testing.

Heat shield issue also a concern … NASA also has yet to provide any additional information on the status of its review of the Orion spacecraft’s heat shield. During the Artemis I mission that sent Orion beyond the Moon in late 2022, chunks of charred material cracked and chipped away from Orion’s heat shield during reentry into Earth’s atmosphere. Once the spacecraft landed, engineers found more than 100 locations where the stresses of reentry damaged the heat shield. To prepare for the Artemis II launch next September, Artemis officials had previously said they planned to begin stacking operations of the rocket in September of this year. But so far, this activity remains on hold pending a decision on the heat shield issue.

Next three launches

Oct. 18: Falcon 9 | Starlink 8-19 | Cape Canaveral Space Force Station, Fla. | 19:31 UTC

Oct. 19: Electron | Changes In Latitudes, Changes In Attitudes | Māhia Peninsula, New Zealand | 10:30 UTC

Oct. 20: Falcon 9 | OneWeb no. 20 | Vandenberg Space Force Base, Calif. | 05:09 UTC


Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.



Biden administration curtails controls on some space-related exports

The US Commerce Department announced Thursday it is easing restrictions on exports of space-related technology, answering a yearslong call from space companies to reform regulations governing international trade.

This is the most significant update to space-related export regulations in a decade and opens more opportunities for US companies to sell their satellite hardware abroad.

“We are very excited about this rollout,” a senior Commerce official said during a background call with reporters. “It’s been a long time coming, and I think it’s going to be very meaningful for our national security and foreign policy interests and certainly facilitate secure trade with our partners.”

Overdue reform

One of the changes will allow US companies to export more products related to electro-optical and radar remote sensing, as well as space-based logistics, assembly, or servicing spacecraft destined for Australia, Canada, and the United Kingdom.

“They’re easing restrictions on some of the less sensitive space-related technologies and on spacecraft-related items going to our closest allies, like Australia, Canada, and the UK,” the senior Commerce official said. “These changes will offer relief to US companies and they’ll increase innovation without compromising the critical technologies that keep our nation safe.”

Another update to the Commerce Department’s regulations will remove license requirements for exports of “certain spacecraft components” to more than 40 allied nations, including NATO and European Union member states, Argentina, Australia, Canada, India, Israel, Japan, Mexico, New Zealand, Singapore, South Africa, South Korea, and Taiwan. This will also create more license exceptions to support NASA’s cooperative programs with other nations, officials said.

A third change, which hasn’t been finalized and must go through a public comment period, proposes to transfer some space-related items—spacecraft capable of in-space docking, grappling, and refueling; autonomous collision avoidance; and autonomous detection of ground vehicles and aircraft—from the State Department’s highly restrictive US Munitions List to the more flexible Commerce Control List.



Bizarre fish has sensory “legs” it uses for walking and tasting

Finding out what controls the formation of sensory legs meant growing sea robins from eggs. The research team observed that the legs of sea robins develop from the three pectoral fin rays that are around the stomach area of the fish, then separate from the fin as they continue to develop. Among the most active genes in the developing legs is the transcription factor (a protein that binds to DNA and turns genes on and off) known as tbx3a. When genetically engineered sea robins had tbx3a edited out with CRISPR-Cas9, it resulted in fewer legs, deformed legs, or both.

“Disruption of tbx3a results in upregulation of pectoral fin markers prior to leg separation, indicating that leg rays become more similar to fins in the absence of tbx3a,” the researchers said in a second study, also published in Current Biology.

To see whether genes for sensory legs are a dominant feature, the research team also tried creating sea robin hybrids, crossing species with and without sensory legs. This resulted in offspring with legs that had sensory capabilities, indicating that it’s a genetically dominant trait.

Exactly why sea robins evolved the way they did is still unknown, but the research team came up with a hypothesis. They think the legs of sea robin ancestors were originally intended for locomotion, but they gradually started gaining some sensory utility, allowing the animal to search the visible surface of the seafloor for food. Those fish that needed to search deeper for food developed sensory legs that allowed them to taste and dig for hidden prey.

“Future work will leverage the remarkable biodiversity of sea robins to understand the genetic basis of novel trait formation and diversification in vertebrates,” the team also said in the first study. “Our work represents a basis for understanding how novel traits evolve.”

Current Biology, 2024. DOI: 10.1016/j.cub.2024.08.014, 10.1016/j.cub.2024.08.042



Desalination system adjusts itself to work with renewable power


Instead of needing constant power, new system adjusts to use whatever is available.


Mobile desalination plants might be easier to operate with renewable power. Credit: Ismail BELLAOUALI

Fresh water we can use for drinking or agriculture is only about 3 percent of the global water supply, and nearly 70 percent of that is trapped in glaciers and ice caps. So far, that has been enough to keep us going, but severe droughts have left places like Jordan, Egypt, sub-Saharan Africa, Spain, and California with limited access to potable water.

One possible solution is to tap into the remaining 97 percent of the water we have on Earth. The problem is that this water is saline, and we need to get the salt out of it to make it drinkable. Desalination is also an energy-expensive process. But MIT researchers led by Jonathan Bessette might have found an answer to that. They built an efficient, self-regulating water desalination system that runs on solar power alone with no need for batteries or a connection to the grid.

Probing the groundwaters

Oceans are the most obvious source of water for desalination. But they are a good option only for a small portion of people who live in coastal areas. Most of the global population—more or less 60 percent—lives farther than 100 kilometers from the coast, which makes using desalinated ocean water infeasible. So, Bessette and his team focused on groundwater instead.

“In terms of global demand, about 50 percent of low- to middle-income countries rely on groundwater,” Bessette says. This groundwater is trapped in underground reservoirs, abundant, and, in most places, present at depths below 300 meters. It comes mostly from the rain that penetrates the ground and fills empty spaces left by fractured rock formations. Sadly, as the rainwater seeps down it also picks up salts from the soil on its way. As a result, in New Mexico, for example, around 75 percent of groundwater is brackish, meaning less salty than seawater, but still too salty to drink.

Getting rid of the salt

We already have the ability to get the salt back out. “There are two broad categories within desalination technologies. The first is thermal and the other is based on using membranes,” Bessette explains.

Thermal desalination is something we figured out ages ago. You just boil the water and condense the steam, which leaves the salt behind. Boiling, however, needs lots of energy. Bringing 1 liter of room-temperature water to 100° Celsius takes around 330 kilojoules of energy, assuming no heat is lost in the process. If you want a sense of how much energy that is, stop using your electric kettle for a month and see how your bill shrinks.
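That 330-kilojoule figure follows directly from the specific heat of water (about 4.19 kJ per kilogram per degree Celsius); a quick back-of-the-envelope check:

```python
# Energy to heat 1 liter (~1 kg) of water from room temperature (~21 °C)
# to 100 °C, ignoring heat losses—and ignoring the much larger latent
# heat of vaporization needed to actually boil the water off.
SPECIFIC_HEAT_KJ_PER_KG_C = 4.19
mass_kg = 1.0
delta_t_c = 100 - 21  # 79-degree rise

energy_kj = SPECIFIC_HEAT_KJ_PER_KG_C * mass_kg * delta_t_c
print(round(energy_kj))  # → 331, matching the ~330 kJ figure above
```

Note that this is only the sensible heat; actually vaporizing the water costs roughly another 2,260 kJ per liter, which is why thermal desalination is so energy-hungry.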

“So, around 100 years ago we developed reverse osmosis and electrodialysis, which are two membrane-based desalination technologies. This way, we reduced the power consumption by a factor of 10,” Bessette claims.

Reverse osmosis is a pressure-driven process; you push the water through a membrane that works like a very fine sieve that lets the molecules of water pass but stops other things like salts. Technologically advanced implementations of this idea are widely used at industrial facilities such as the Sydney Desalination Plant in Australia. Reverse osmosis today is the go-to technology when you want to desalinate water at scale. But it has its downsides.

“The issue is reverse osmosis requires a lot of pretreatment. We have to treat the water down to a pretty good quality, making sure the physical, chemical, or biological fouling doesn’t end up on the membrane before we do the desalination process,” says Bessette. Another issue is that reverse osmosis relies on pressure, so it requires a steady supply of power to maintain that pressure, which is difficult to achieve in places where the grid is not reliable. Sensitivity to power fluctuations also makes it challenging to use with renewable energy sources like wind or solar. This is why, to make their system work on solar energy alone, Bessette’s team went for electrodialysis.

Synching with the Sun

“Unlike reverse osmosis, electrodialysis is an electrically driven process,” Bessette says. The membranes are arranged in such a way that the water is not pushed through them but flows along them. On both sides of those membranes are positive and negative electrodes that create an electric field, which draws salt ions through the membranes and out of the water.

Off-grid desalination systems based on electrodialysis operate at constant power levels like toasters or other appliances, which means they require batteries to even out renewable energy’s fluctuations. Using batteries, in most cases, made them too expensive for the low-income communities that need them the most. Bessette and his colleagues solved that by designing a clever control system.

The two most important parameters in electrodialysis desalination are the flow rate of the water and the power you apply to the electrodes. To make the process efficient, you need to match those two. The advantage of electrodialysis is that it can operate at different power levels. When you have more available power, you can just pump more water through the system. When you have less power, you can slow the system down by reducing the water flow rate. You’ll produce less freshwater, but you won’t break anything this way.

Bessette’s team simplified the control down to two feedback loops. The first, outer loop tracked the power coming from the solar panels: on a sunny day, when the panels generated plenty of power, it fed more water into the system; when there was less power, it fed less. The second, inner loop tracked the flow rate: when the flow rate was high, it applied more power to the electrodes; when it was low, it applied less. The trick was to apply the maximum available power while avoiding splitting the water into hydrogen and oxygen.
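The nested-loop idea can be sketched in a few lines of Python. This is an illustrative reconstruction, not the team’s actual controller; the function name, gains, and limits are all assumptions:

```python
def control_step(available_power_w, flow_rate_lpm, state):
    """One iteration of the two nested feedback loops (illustrative).

    Outer loop: match the water feed rate to the solar power available.
    Inner loop: match the electrode power to the measured flow rate,
    capped below the level where the water would start to electrolyze.
    """
    # Outer loop: more solar power -> pump more water, and vice versa.
    target_flow = state["flow_per_watt"] * available_power_w
    pump_cmd = flow_rate_lpm + state["k_outer"] * (target_flow - flow_rate_lpm)

    # Inner loop: scale electrode power with flow, never exceeding the
    # (assumed) water-splitting limit for this flow rate.
    max_safe_power = state["power_per_flow"] * flow_rate_lpm
    electrode_power = min(available_power_w, max_safe_power)

    return pump_cmd, electrode_power


# Example with made-up numbers: 500 W available, currently pumping 4 L/min.
state = {"flow_per_watt": 0.01, "k_outer": 0.5, "power_per_flow": 100.0}
print(control_step(500.0, 4.0, state))  # nudges flow toward 5 L/min, caps electrodes at 400 W
```

The design choice worth noticing is that neither loop needs a battery: when clouds cut the solar input, the controller simply slows the water down rather than trying to hold power constant.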

Once Bessette and his colleagues figured out the control system, they built a prototype desalination device. And it worked, with very little supervision, for half a year.

Water production at scale

Bessette’s prototype system, complete with solar panels, pumps, electronics, and an electrodialysis stack with all the electrodes and membranes, was compact enough to fit in a trailer. They took this trailer to the Brackish Groundwater National Research Facility in Alamogordo, New Mexico, and ran it for six months. On average, it desalinated around 5,000 liters of water per day—enough for a community of roughly 2,000 people.

“The nice thing with our technology is it is more of a control method. The concept can be scaled anywhere from this small community treatment system all the way to large-scale plants,” Bessette says. He said his team is now busy building an equivalent of a single water treatment train, a complete water desalination unit designed for big municipal water supplies. “Multiple such [systems] are implemented in such plants to increase the scale of water desalination process,” Bessette says. But he also thinks about small-scale solutions that can be fitted on a pickup truck and deployed rapidly in crisis scenarios like natural disasters.

“We’re also working on building a company. Me, two other staff engineers, and our professor. We’re really hoping to bring this technology to market and see that it reaches a lot of people. Our aim is to provide clean drinking water to folks in remote regions around the world,” Bessette says.

Nature Water, 2024. DOI: 10.1038/s44221-024-00314-6


Jacek Krywko is a freelance science and technology writer who covers space exploration, artificial intelligence research, computer science, and all sorts of engineering wizardry.



How the Malleus maleficarum fueled the witch trial craze


Invention of printing press, influence of nearby cities created perfect conditions for social contagion.

Between 1400 and 1775, a significant upsurge of witch trials swept across early-modern Europe, resulting in the execution of an estimated 40,000–60,000 accused witches. Historians and social scientists have long studied this period in hopes of learning more about how large-scale social changes occur. Some have pointed to the invention of the printing press and the publication of witch-hunting manuals—most notably the highly influential Malleus maleficarum—as a major factor, making it easier for the witch-hunting hysteria to spread across the continent.

The abrupt emergence of the craze and its rapid spread, resulting in a pronounced shift in social behaviors—namely, the often brutal persecution of suspected witches—is consistent with a theory of social change dubbed “ideational diffusion,” according to a new paper published in the journal Theory and Society. Under this theory, new ideas are introduced and reinforced through social networks until they take root and lead to widespread behavioral changes in a society.

The authors had already been thinking about cultural change and the driving forces by which it occurs, including social contagion—especially large cultural shifts like the Reformation and the Counter-Reformation, for example. One co-author, Steve Pfaff, a sociologist at Chapman University, was working on a project about witch trials in Scotland and was particularly interested in the role the Malleus maleficarum might have played.

“Plenty of other people have written about witch trials, specific trials or places or histories,” co-author Kerice Doten-Snitker, a social scientist with the Santa Fe Institute, told Ars. “We’re interested in building a general theory about change and wanted to use that as a particular opportunity. We realized that the printing of the Malleus maleficarum was something we could measure, which is useful when you want to do empirical work, not just theoretical work.”

Ch-ch-ch-changes…

The Witch, No. 1, c. 1892 lithograph by Joseph E. Baker, shows a woman in a courtroom dock, arms outstretched before a judge and jury. Credit: Public domain

Modeling how sweeping cultural change happens has been a hot research topic for decades, hitting the cultural mainstream with the publication of Malcolm Gladwell’s 2000 bestseller The Tipping Point. Researchers continue to make advances in this area. University of Pennsylvania sociologist Damon Centola, for instance, published How Behavior Spreads: The Science of Complex Contagions in 2018, in which he applied new lessons learned in epidemiology—on how viral epidemics spread—to our understanding of how social networks can broadly alter human behavior. But while epidemiological modeling might be useful for certain simple forms of social contagion—people come into contact with something and it spreads rapidly, like a viral meme or hit song—other forms of social contagion are more complicated, per Doten-Snitker.
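The simple-versus-complex distinction Centola draws can be made concrete with a toy simulation: in a simple contagion one exposure is enough, while a complex contagion requires several adopting neighbors before a person adopts. A minimal sketch (the network, threshold values, and step count are all illustrative, not drawn from the paper):

```python
def spread(neighbors, seeds, threshold, steps=20):
    """Threshold contagion on a graph given as {node: [neighbor nodes]}.

    threshold=1 -> simple contagion (one exposure suffices, like a meme);
    threshold>1 -> complex contagion, which needs reinforcement from
    multiple adopting neighbors to spread.
    """
    adopted = set(seeds)
    for _ in range(steps):
        new = {
            n for n in neighbors
            if n not in adopted
            and sum(nb in adopted for nb in neighbors[n]) >= threshold
        }
        if not new:  # contagion has stalled
            break
        adopted |= new
    return adopted


# Toy network: a 10-node ring where each node also touches its
# next-but-one neighbors, giving the clustered ties complex contagions need.
ring = {i: [(i - 1) % 10, (i + 1) % 10, (i - 2) % 10, (i + 2) % 10]
        for i in range(10)}

print(len(spread(ring, {0}, threshold=1)))     # → 10: one seed spreads everywhere
print(len(spread(ring, {0}, threshold=2)))     # → 1: a lone seed can't start a complex contagion
print(len(spread(ring, {0, 1}, threshold=2)))  # → 10: a reinforcing pair can
```

The last two lines capture why clustered social ties matter for ideas that demand real cognitive buy-in: a single isolated adopter stalls, while a small reinforcing cluster can carry the idea through the whole network.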

Doten-Snitker et al.’s ideational diffusion model differs from Centola’s in some critical respects. For cases like the spread of witch trials, “It’s not just that people are coming into contact with a new idea, but that there has to be something cognitively that is happening,” said Doten-Snitker. “People have to grapple with the ideas and undergo some kind of idea adoption. We talk about this as reinterpreting the social world. They have to rethink what’s happening around them in ways that make them think that not only are these attractive new ideas, but also those new ideas prescribe different types of behavior. You have to act differently because of what you’re encountering.”

The authors chose to focus on social networks and trade routes for their analysis of the witch trials, building on prior research that prioritized broader economic and environmental factors. Cultural elites were already exchanging ideas through letters, but published books added a new dimension to those exchanges. Researchers studying 21st century social contagion can download massive amounts of online data from social networks. That kind of data is sparse from the medieval era. “We don’t have the same archives of communication,” said Doten-Snitker. “There’s this dual thing happening: the book itself, and people sharing information, arguing back and forth with each other” about new ideas.

The stages of the ideational diffusion model. Credit: K. Doten-Snitker et al., 2024

So she and her co-authors turned to trade routes to determine which cities were more central and thus more likely to be focal points of new ideas and information. “The places that are more central in these trade networks have more stuff passing through and are more likely to come into contact with new ideas from multiple directions—specifically ideas about witchcraft,” said Doten-Snitker. Then they looked at which of 553 cities in Central Europe held their first witch trials, and when, as well as those where the Malleus maleficarum and similar manuals had been published.

Social contagion

They found that each new published edition of the Malleus maleficarum corresponded with a subsequent increase in witch trials. But that wasn’t the only contributing factor; trends in neighboring cities also influenced the increase, resulting in a slow-moving ripple effect that spread across the continent. “What’s the behavior of neighboring cities?” said Doten-Snitker. “Are they having witch trials? That makes your city more likely to have a witch trial when you have the opportunity.”

In epidemiological models like Centola’s, the pattern of change is a slow start with early adoption that then picks up speed and spreads before slowing down again as a saturation point is reached, because most people have now adopted the new idea or technology. That doesn’t happen with witch trials or other complex social processes such as the spread of medieval antisemitism. “Most things don’t actually spread that widely; they don’t reach complete saturation,” said Doten-Snitker. “So we need to have theories that build that in as well.”
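The saturation contrast Doten-Snitker describes can be illustrated with a toy simulation (a minimal sketch of my own, not the authors' model): on a ring network, a "simple" contagion that spreads from any single contact reaches everyone, while a "complex" contagion requiring reinforcement from two neighbors stalls when its early adopters are scattered, and spreads only when they are clustered.

```python
# Toy model (illustrative only): nodes on a ring, each connected to the
# two neighbors on either side. A node adopts once at least `threshold`
# of its four neighbors have adopted.

def spread(n, seeds, threshold, steps=60):
    adopted = set(seeds)
    for _ in range(steps):
        new = {
            node
            for node in range(n)
            if node not in adopted
            and sum((node + d) % n in adopted for d in (-2, -1, 1, 2)) >= threshold
        }
        if not new:
            break  # the contagion has stalled short of saturation
        adopted |= new
    return len(adopted)

n = 100
scattered = [0, 33, 66]                           # isolated early adopters
simple = spread(n, scattered, threshold=1)        # one exposure suffices
complex_ = spread(n, scattered, threshold=2)      # needs social reinforcement
clustered = spread(n, [0, 1, 2, 3], threshold=2)  # reinforcement available
print(simple, complex_, clustered)  # 100 3 100
```

Scattered seeds saturate the ring under simple contagion but go nowhere under the two-neighbor threshold; the same threshold spreads fine from a clustered block, echoing Centola's point that complex contagions need reinforced, overlapping ties.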

In the case of witch trials, the publication of the Malleus maleficarum helped shift medieval attitudes toward witchcraft, from something that wasn’t viewed as a particularly pressing problem to something evil that was menacing society. The tome also offered practical advice on what should be done about it. “So there’s changing ideas about witchcraft and this gets coupled with, well, you need to do something about it,” said Doten-Snitker. “Not only is witchcraft bad, but it’s a threat. So you have a responsibility as a community to do something about witches.”

The term “witch hunt” gets bandied about frequently in modern times, particularly on social media, and is generally understood to reference a mob mentality unleashed on a given target. But Doten-Snitker emphasizes that medieval witch trials were not “mob justice”; they were organized affairs, with official accusations made to an organized local judiciary that collected and evaluated evidence, using the Malleus maleficarum and similar treatises as a guide. The process, she said, is similar to how today’s governments adopt new policies.

Why conspiracy theories take hold

Cities where witch trials did and did not take place in Central Europe, 1400–1679, as well as those with printed copies of the Malleus maleficarum. Credit: K. Doten-Snitker et al., 2024

The authors developed their model using the witch trials as a useful framework, but there are contemporary implications, particularly with regard to the rampant spread of misinformation and conspiracy theories via social media. These can also lead to changes in real-world behavior, including violent outbreaks like the January 6, 2021, attack on the US Capitol or, more recently, threats aimed at FEMA workers in the wake of Hurricane Helene. Doten-Snitker thinks their model could help identify the emergence of certain telltale patterns, notably the combination of the spread of misinformation or conspiracy theories on social media along with practical guidelines for responding.

“People have talked about the ways that certain conspiracy theories end up making sense to people,” said Doten-Snitker. “It’s because they’re constructing new ways of thinking about their world. This is why people start with one conspiracy theory belief that is then correlated with belief in others. It’s because you’ve already started rebuilding your image of what’s happening in the world around you and that serves as a basis for how you should act.”

On the plus side, “It’s actually hard for something that feels compelling to certain people to spread throughout the whole population,” she said. “We should still be concerned about ideas that spread that could be socially harmful. We just need to figure out where it might be most likely to happen and focus our efforts in those places rather than assuming it is a global threat.”

There was a noticeably sharp decline in both the frequency and intensity of witch trial persecutions from 1679 onward, raising the question of how such cultural shifts eventually run their course. Their model doesn’t directly address that aspect, according to Doten-Snitker, but it does provide a framework for the kinds of things that might signal a similar major shift, such as people starting to push back against extreme responses or practices. At the tail end of the witch trial craze, for instance, there was increased pressure to prioritize clear and consistent judicial practices, ones that ruled out extreme measures such as confessions extracted under torture and excluded dreams as evidence of witchcraft.

“That then supplants older ideas about what is appropriate and how you should behave in the world and you could have a de-escalation of some of the more extremist tendencies,” said Doten-Snitker. “It’s not enough to simply say those ideas or practices are wrong. You have to actually replace it with something. And that is something that is in our model. You have to get people to re-interpret what’s happening around them and what they should do in response. If you do that, then you are undermining a worldview rather than just criticizing it.”

Theory and Society, 2024. DOI: 10.1007/s11186-024-09576-1  (About DOIs).


Jennifer is a senior reporter at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

How the Malleus maleficarum fueled the witch trial craze

us-vaccinations-fall-again-as-more-parents-refuse-lifesaving-shots-for-kids

US vaccinations fall again as more parents refuse lifesaving shots for kids

Measles, whooping cough, polio, tetanus—devastating and sometimes deadly diseases await comebacks in the US as more and more parents are declining routine childhood vaccines that have proved safe and effective.

The vaccination rates among kindergartners have fallen once again, dipping to about 92 percent in the 2023–2024 school year, down from about 93 percent the previous school year and 95 percent in 2019–2020. That’s according to an analysis of the latest vaccination data published today by the Centers for Disease Control and Prevention.

The analysis also found that vaccination exemptions rose to an all-time high of 3.3 percent, up from 3 percent in the previous school year. The rise in exemptions is nearly entirely driven by non-medical exemptions—in other words, religious or philosophical exemptions. Medical exemptions, meanwhile, held steady at just 0.2 percent of kindergartners.

The new stats mean that more parents are choosing to decline lifesaving vaccines and, for the fourth consecutive year, the US has remained below the 95 percent vaccination target that would keep vaccine-preventable diseases from spreading within communities. In fact, the country continues to slip further away from that target.

Based on data from 49 states plus the District of Columbia (Montana did not report data), 80 percent of jurisdictions saw declines in vaccinations of all four key vaccines assessed: MMR, against measles, mumps, and rubella; DTaP, against diphtheria, tetanus, and pertussis (whooping cough); VAR, against chickenpox; and polio.

Vulnerable kids

Coverage for MMR fell to 92.7 percent in 2023–2024, down from 93.1 percent in the previous school year. That means that about 280,000 (7.3 percent) kindergartners in the US are at risk of measles, mumps, and rubella infections. Likewise, DTaP coverage fell to 92.3 percent, down from 92.7 percent. Polio vaccination fell to 92.6 percent from 93.1 percent, and VAR was down to 92.4 percent from 92.9 percent.
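Those MMR figures imply a kindergarten cohort of roughly 3.8 million children, which a quick back-of-the-envelope check confirms (the cohort size is inferred from the article's own numbers, not a figure the CDC analysis states here):

```python
# Sanity-check the MMR figures: 7.3% of kindergartners at risk should
# correspond to the article's ~280,000 children.
mmr_coverage = 0.927               # 92.7% MMR coverage, 2023-2024
at_risk_share = 1 - mmr_coverage   # 7.3% lack MMR protection
at_risk_reported = 280_000         # at-risk kindergartners per the article

implied_cohort = at_risk_reported / at_risk_share
print(round(implied_cohort))  # roughly 3.8 million kindergartners
```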


it’s-increasingly-unlikely-that-humans-will-fly-around-the-moon-next-year

It’s increasingly unlikely that humans will fly around the Moon next year

Don’t book your tickets for the launch of NASA’s Artemis II mission next year just yet.

We have had reason to doubt the official September 2025 launch date for the mission, the first crewed flight into deep space in more than five decades, for a while now. This is principally because NASA is continuing to mull the implications of damage to the Orion spacecraft’s heat shield from the Artemis I mission nearly two years ago.

However, it turns out that there are now other problems with holding to this date as well.

No schedule margin

A new report from the US Government Accountability Office found that NASA’s Exploration Ground Systems program—essentially the office at Kennedy Space Center in Florida responsible for building ground infrastructure to support the Space Launch System rocket and Orion—is in danger of missing its schedule for Artemis II.

During this flight, a crew of four astronauts, commanded by NASA’s Reid Wiseman, will launch inside Orion on a 10-day mission out to the Moon and back. The spacecraft will follow a free-return trajectory, which is important because, if there is a significant problem with the Orion spacecraft’s propulsion system, the trajectory of the vehicle will still carry it back to Earth. At their closest approach, the crew will come within about 6,500 miles (10,400 km) of the surface of the far side of the Moon.

The new report, published Thursday, finds that the Exploration Ground Systems program had several months of schedule margin in its work toward a September 2025 launch date at the beginning of the year. But now, the program has allocated all of that margin to technical issues experienced during work on the rocket’s mobile launcher and pad testing.

“Earlier in 2024, the program was reserving that time for technical issues that may arise during testing of the integrated SLS and Orion vehicle or if weather interferes with planned activities, among other things,” the report states. “Officials said it is likely that issues will arise because this is the first time testing many of these systems. Given the lack of margin, if further issues arise during testing or integration, there will likely be delays to the September 2025 Artemis II launch date.”


there’s-another-massive-meat-recall-over-listeria—and-it’s-a-doozy

There’s another massive meat recall over Listeria—and it’s a doozy

Another nationwide meat recall is underway over Listeria contamination—and it’s far more formidable than the last.

As of October 15, meat supplier BrucePac, of Durant, Oklahoma, is recalling 11.8 million pounds of ready-to-eat meat and poultry products after routine federal safety testing found Listeria monocytogenes, a potentially deadly bacterium, in samples of the company’s poultry. The finding triggered an immediate recall, first issued on October 9. But officials are still working to understand the extent of the contamination—and struggling to identify the hundreds of potentially contaminated products.

“Because we sell to other companies who resell, repackage, or use our products as ingredients in other foods, we do not have a list of retail products that contain our recalled items,” BrucePac said in a statement updated October 15.

Depending on the packaging, the products may have establishment numbers 51205 or P-51205 inside or under the USDA mark of inspection. But, for now, consumers’ best chance of determining whether they’ve purchased any of the affected products is to look through a 342-page list of products identified by the US Department of Agriculture so far.

The unorganized document lists fresh and frozen foods sold at common retailers, including 7-Eleven, Aldi, Amazon Fresh, Giant Eagle, Kroger, Target, Trader Joe’s, Walmart, and Wegmans. Affected products carry well-known brand names, such as Atkins, Boston Market, Dole, Fresh Express, Jenny Craig, Michelina’s, Taylor Farms, and stores’ brands, such as Target’s Good & Gather. The recalled products were made between May 31, 2024 and October 8, 2024.

In the latest update, the USDA noted that some of the recalled products were also distributed to schools, but the agency hasn’t identified the schools that received the products. Restaurants and other institutions also received the products.


amazon-joins-google-in-investing-in-small-modular-nuclear-power

Amazon joins Google in investing in small modular nuclear power


Small nukes is good nukes?

What’s with the sudden interest in nuclear power among tech titans?

Fuel pellets flow down the reactor (left), as gas transfers heat to a boiler (right). Credit: X-energy

On Tuesday, Google announced that it had made a power purchase agreement for electricity generated by a small modular nuclear reactor design that hasn’t even received regulatory approval yet. Today, it’s Amazon’s turn. The company’s Amazon Web Services (AWS) group has announced three different investments, including one targeting a different startup that has its own design for small, modular nuclear reactors—one that has not yet received regulatory approval.

Unlike Google’s deal, which is a commitment to purchase power should the reactors ever be completed, Amazon will lay out some money upfront as part of the agreements. We’ll take a look at the deals and technology that Amazon is backing before analyzing why companies are taking a risk on unproven technologies.

Money for utilities and a startup

Two of Amazon’s deals are with utilities that serve areas where it already has a significant data center footprint. One of these is Energy Northwest, which is an energy supplier that sends power to utilities in the Pacific Northwest. Amazon is putting up the money for Energy Northwest to study the feasibility of adding small modular reactors to its Columbia Generating Station, which currently houses a single, large reactor. In return, Amazon will get the right to purchase power from an initial installation of four small modular reactors. The site could potentially support additional reactors, which Energy Northwest would be able to use to meet demands from other users.

The deal with Virginia’s Dominion Energy is similar in that it would focus on adding small modular reactors to Dominion’s existing North Anna Nuclear Generating Station. But the exact nature of the deal is a bit harder to understand. Dominion says the companies will “jointly explore innovative ways to advance SMR development and financing while also mitigating potential cost and development risks.”

Should either or both of these projects go forward, the reactor designs used will come from a company called X-energy, which is involved in the third deal Amazon is announcing. In this case, it’s a straightforward investment in the company, although the exact dollar amount is unclear (the company says Amazon is “anchoring” a $500 million round of investments). The money will help finalize the company’s reactor design and push it through the regulatory approval process.

Small modular nuclear reactors

X-energy is one of several startups attempting to develop small modular nuclear reactors. The reactors all have a few features that are expected to help them avoid the massive time and cost overruns associated with the construction of large nuclear power stations. In these small reactors, the limited size allows them to be made at a central facility and then be shipped to the power station for installation. This limits the scale of the infrastructure that needs to be built in place and allows the assembly facility to benefit from economies of scale.

This also allows a great deal of flexibility at the installation site, as you can scale the facility to power needs simply by adjusting the number of installed reactors. If demand rises in the future, you can simply install a few more.

The small modular reactors are also typically designed to be inherently safe. Should the site lose power or control over the hardware, the reactor will default to a state where it can’t generate enough heat to melt down or damage its containment. There are various approaches to achieving this.

X-energy’s technology is based on small, self-contained fuel pellets called TRISO (TRi-structural ISOtropic) particles. These contain both the uranium fuel and a graphite moderator and are surrounded by a ceramic shell. They’re structured so that there isn’t sufficient uranium present to generate temperatures that can damage the ceramic, ensuring that the nuclear fuel will always remain contained.

The design is meant to run at high temperatures and extract heat from the reactor using helium, which is used to boil water and generate electricity. Each reactor can produce 80 megawatts of electricity, and the reactors are designed to work efficiently as a set of four, creating a 320 MW power plant. As of yet, however, there are no working examples of this reactor, and the design hasn’t been approved by the Nuclear Regulatory Commission.

Why now?

Why is there such sudden interest in small modular reactors among the tech community? It comes down to growing needs and a lack of good alternatives, even given the highly risky nature of the startups that hope to build the reactors.

It’s no secret that data centers require enormous amounts of energy, and the sudden popularity of AI threatens to raise that demand considerably. Renewables, as the cheapest source of power on the market, would be one way of satisfying that growth, but they’re not ideal. For one thing, the intermittent nature of the power they supply, while possible to manage at the grid level, is a bad match for the around-the-clock demands of data centers.

The US has also benefitted from over a decade of efficiency gains keeping demand flat despite population and economic growth. This has meant that all the renewables we’ve installed have displaced fossil fuel generation, helping keep carbon emissions in check. Should newly installed renewables instead end up servicing rising demand, it will make it considerably more difficult for many states to reach their climate goals.

Finally, renewable installations have often been built in areas without dedicated high-capacity grid connections, resulting in a large and growing backlog of projects (2.6 TW of generation and storage as of 2023) that are stalled as they wait for the grid to catch up. Expanding the pace of renewable installation can’t meet rising server farm demand if the power can’t be brought to where the servers are.

These new projects avoid that problem because they’re targeting sites that already have large reactors and grid connections to use the electricity generated there.

In some ways, it would be preferable to build more of these large reactors based on proven technologies. But not in two very important ways: time and money. The last reactor completed in the US was at the Vogtle site in Georgia, which started construction in 2009 but only went online this year. Costs also increased from $14 billion to over $35 billion during construction. It’s clear that any similar projects would start generating far too late to meet the near-immediate needs of server farms and would be nearly impossible to justify economically.

This leaves small modular nuclear reactors as the least-bad option in a set of bad options. Despite many startups having entered the space over a decade ago, there is still just a single reactor design approved in the US: NuScale’s. But the first planned installation saw the price of its power rise to the point where, given the plunging cost of renewable power, it was no longer economically viable; the project was canceled last year as the utilities that would have bought the power pulled out.

The probability that a different company will manage to get a reactor design approved, move to construction, and manage to get something built before the end of the decade is extremely low. The chance that it will be able to sell power at a competitive price is also very low, though that may change if demand rises sufficiently. So the fact that Amazon is making some extremely risky investments indicates just how worried it is about its future power needs. Of course, when your annual gross profit is over $250 billion a year, you can afford to take some risks.


John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.


drugmakers-can-keep-making-off-brand-weight-loss-drugs-as-fda-backpedals

Drugmakers can keep making off-brand weight-loss drugs as FDA backpedals

The judge in the case, District Judge Mark Pittman, granted the FDA’s request, canceled an October 15 hearing, and ordered the parties to submit a joint status report on November 21.

Drugmakers respond

The move was celebrated by the Outsourcing Facilities Association (OFA), which filed the lawsuit.

“We believe that this is a fair resolution in light of the agency’s rash decision to take the drug off of the list at a time when the agency has acknowledged ‘supply disruptions,’ which immediately created a major access issue for patients everywhere,” OFA Chairperson Lee Rosebush said in a statement. “Most important, should the FDA repeat its removal decision when a shortage still genuinely exists, we will return to court.”

The move is also likely to please patients who have come to rely on cheaper, more readily available compounded versions of the drugs. For some, compounded products may have been their only access to tirzepatide. Still, those drugs are not without risk. The FDA has repeatedly emphasized that compounded drugs are not FDA-approved and do not go through the same safety, efficacy, and quality reviews. And the agency has warned of dosing errors and other safety concerns with compounded versions.

The one party that is certainly unhappy with the FDA’s move is Eli Lilly, which had reportedly sent cease-and-desist letters to compounders. In an emailed statement to Ars, a spokesperson for Lilly said that there was sufficient supply of the company’s drug and continued use of compounded versions is unwarranted. “Nothing changes the fact that, as FDA has recognized, Mounjaro and Zepbound are available and the shortage remains ‘resolved,'” the spokesperson said.

Lilly also noted the FDA’s safety concerns about the compounded versions, adding that its own examination of some compounded products found impurities, bacteria, strange coloring, incorrect potency, puzzling chemical structures, and, in one case, a product that was just sugar alcohol.

“All doses of Lilly’s FDA-approved medicines are available and it is important that patients not be exposed to the risks in taking untested, unapproved knockoffs,” the spokesperson said.

In terms of the supply “disruptions” the FDA mentioned and that some patients and pharmacies have reportedly experienced, Lilly said that the supply chain was complex and there are many reasons why a given pharmacy may not have a specific dose at hand, such as limited refrigerated storage space.


sustainable-building-effort-reaches-new-heights-with-wooden-skyscrapers

Sustainable building effort reaches new heights with wooden skyscrapers


Wood offers architects an alternative to carbon-intensive steel and concrete.

At the University of Toronto, just across the street from the football stadium, workers are putting up a 14-story building with space for classrooms and faculty offices. What’s unusual is how they’re building it — by bolting together giant beams, columns, and panels made of manufactured slabs of wood.

As each wood element is delivered by flatbed, a tall crane lifts it into place and holds it in position while workers attach it with metal connectors. In its half-finished state, the building resembles flat-pack furniture in the process of being assembled.

The tower uses a new technology called mass timber. In this kind of construction, massive, manufactured wood elements that can extend more than half the length of a football field replace steel beams and concrete. Though still relatively uncommon, it is growing in popularity and beginning to pop up in skylines around the world.

Mass timber can lend warmth and beauty to an interior. Pictured is a unit in the eight-story Carbon12 condominium in Portland, Oregon. Credit: KAISER + PATH

Today, the tallest mass timber building is the 25-story Ascent skyscraper in Milwaukee, completed in 2022. As of that year, there were 84 mass timber buildings eight stories or higher either built or under construction worldwide, with another 55 proposed. Seventy percent of the existing and future buildings were in Europe, about 20 percent in North America, and the rest in Australia and Asia, according to a report from the Council on Tall Buildings and Urban Habitat. When you include smaller buildings, at least 1,700 mass timber buildings had been constructed in the United States alone as of 2023.

Mass timber is an appealing alternative to energy-intensive concrete and steel, which together account for almost 15 percent of global carbon dioxide emissions. Though experts are still debating mass timber’s role in fighting climate change, many are betting it’s better for the environment than current approaches to construction. It relies on wood, after all, a renewable resource.

Mass timber also offers a different aesthetic that can make a building feel special. “People get sick and tired of steel and concrete,” says Ted Kesik, a building scientist at the University of Toronto’s Mass Timber Institute, which promotes mass timber research and development. With its warm, soothing appearance and natural variations, timber can be more visually pleasing. “People actually enjoy looking at wood.”

Same wood, stronger structure

Using wood for big buildings isn’t new, of course. Industrialization in the 18th and 19th centuries led to a demand for large factories and warehouses, which were often “brick and beam” construction—a frame of heavy wooden beams supporting exterior brick walls.

As buildings became ever taller, though, builders turned to concrete and steel for support. Wood construction became mostly limited to houses and other small buildings made from the standard-sized “dimensional” lumber you see stacked at Home Depot.

But about 30 years ago, builders in Germany and Austria began experimenting with techniques for making massive wood elements out of this readily available lumber. They used nails, dowels and glue to combine smaller pieces of wood into big, strong and solid masses that don’t require cutting down large old-growth trees.

Engineers including Julius Natterer, a German engineer based in Switzerland, pioneered new methods for building with the materials. And architects including Austria’s Hermann Kaufmann began gaining attention for mass timber projects, including the Ölzbündt apartments in Austria, completed in 1997, and Brock Commons, an 18-story student residence at the University of British Columbia, completed in 2017.

In principle, mass timber is like plywood but on a much larger scale: The smaller pieces are layered and glued together under pressure in large specialized presses. Today, beams up to 50 meters long, usually made of what’s called glue-laminated timber, or glulam, can replace steel elements. Panels up to 50 centimeters thick, typically cross-laminated timber, or CLT, replace concrete for walls and floors.

These wood composites can be surprisingly strong—stronger than steel by weight. But a mass timber element must be bulkier to achieve that same strength. As a building gets higher, the wooden supports must get thicker; at some point, they simply take up too much space. So for taller mass timber buildings, including the Ascent skyscraper, architects often turn to a combination of wood, steel and concrete.

Historically, one of the most obvious concerns with using mass timber for tall buildings was fire safety. Until recently, many building codes limited wood construction to low-rise buildings.

Though they don’t have to be completely fireproof, buildings need to resist collapse long enough to give firefighters a chance to bring the flames under control, and for occupants to get out. Materials used in conventional skyscrapers, for instance, are required to maintain their integrity in a fire for three hours or more.

To demonstrate mass timber’s fire resistance, engineers put the wood elements in gas-fired chambers and monitor their integrity. Other tests set fire to mock-ups of mass timber buildings and record the results.

These tests have gradually convinced regulators and customers that mass timber can resist burning long enough to be fire-safe. That’s partly because a layer of char tends to form early on the outside of the timber, insulating the interior from much of the fire’s heat.

Mass timber got a major stamp of approval in 2021, when the International Code Council changed the International Building Code, which serves as a model for jurisdictions around the world, to allow mass timber construction up to 18 stories tall. With this change, more and more localities are expected to update their codes to routinely allow tall mass timber buildings, rather than requiring them to get special approvals.

There are other challenges, though. “Moisture is the real problem, not fire,” says Steffen Lehmann, an architect and scholar of urban sustainability at the University of Nevada, Las Vegas.

All buildings must control moisture, but it’s absolutely crucial for mass timber. Wet wood is vulnerable to deterioration from fungus and insects like termites. Builders are careful to prevent the wood from getting wet during transportation and construction, and they deploy a comprehensive moisture management plan, including designing heat and ventilation systems to keep moisture from accumulating. For extra protection from insects, wood can be treated with chemical pesticides or surrounded by mesh or other physical barriers where it meets the ground.

Another problem is acoustics, since wood transmits sound so well. Designers use sound insulation materials, leave space between walls and install raised floors, among other methods.

Potential upsides of mass timber

Combating global warming means reducing greenhouse gas emissions from the building sector, which is responsible for 39 percent of emissions globally. Diana Ürge-Vorsatz, an environmental scientist at the Central European University in Vienna, says mass timber and other bio-based materials could be an important part of that effort.

In a 2020 paper in the Annual Review of Environment and Resources, she and colleagues cite an estimate from the lumber industry that the 18-story Brock Commons, in British Columbia, avoided the equivalent of 2,432 metric tons of CO2 emissions compared with a similar building of concrete and steel. Of those savings, 679 metric tons came from the fact that manufacturing wood generates fewer greenhouse gas emissions than manufacturing concrete and steel. Another 1,753 metric tons of CO2 equivalent were locked away in the building's wood.
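The two components of that estimate can be sanity-checked with a quick sum (figures as reported in the lumber-industry estimate cited above; the variable names are just for illustration):

```python
# Reported CO2-equivalent savings for Brock Commons, in metric tons.
avoided_manufacturing = 679   # lower emissions making wood vs. concrete and steel
stored_in_wood = 1753         # carbon locked away in the building's timber

total_savings = avoided_manufacturing + stored_in_wood
print(total_savings)  # 2432, matching the reported 2,432 metric tons
```

Note that the second, larger component counts carbon as stored rather than avoided, which is exactly the assumption questioned later in the article.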

“If you use bio-based material, we have a double win,” Ürge-Vorsatz says.

But a lot of the current enthusiasm over mass timber’s climate benefits is based on some big assumptions. The accounting often assumes, for instance, that any wood used in a mass timber building will be replaced by the growth of new trees, and that those new trees will take the same amount of CO2 out of the atmosphere across time. But if old-growth trees are replaced with new tree plantations, the new trees may never reach the same size as the original trees, some environmental groups argue. There are also concerns that increasing demand for wood could lead to more deforestation and less land for food production.

Studies also tend to assume that once the wood is in a building, the carbon is locked up for good. But not all the wood from a felled tree ends up in the finished product. Branches, roots and lumber mill waste may decompose or get burned. And when the building is torn down, if the wood ends up in a landfill, the carbon can find its way out in the form of methane and other emissions.

“A lot of architects are scratching their heads,” says Stephanie Carlisle, an architect and environmental researcher at the nonprofit Carbon Leadership Forum, wondering whether mass timber always has a net benefit. “Is that real?” She believes climate benefits do exist. But she says understanding the extent of those benefits will require more research.

In the meantime, mass timber is at the forefront of a whole different model of construction called integrated design. In traditional construction, an architect designs a building first and then multiple firms are hired to handle different parts of the construction, from laying the foundation, to building the frame, to installing the ventilation system, and so on.

In integrated design, says Kesik, the design phase is much more detailed and involves the various firms from the beginning. The way different components will fit and work together is figured out in advance. Exact sizes and shapes of elements are predetermined, and holes can even be pre-drilled for attachment points. That means many of the components can be manufactured off-site, often with advanced computer-controlled machinery.

A lot of architects like this because it gives them more control over the building elements. And because so much of the work is done in advance, the buildings tend to go up faster on-site — up to 40 percent faster than other buildings, Lehmann says.

Mass timber buildings tend to be manufactured more like automobiles, Kesik says, with all the separate pieces shipped to a final location for assembly. “When the mass timber building shows up on-site, it’s really just like an oversized piece of Ikea furniture,” he says. “Everything sort of goes together.”

This story originally appeared in Knowable Magazine.
