Science


How the Malleus maleficarum fueled the witch trial craze


Invention of printing press, influence of nearby cities created perfect conditions for social contagion.

Between 1400 and 1775, a significant upsurge of witch trials swept across early-modern Europe, resulting in the execution of an estimated 40,000–60,000 accused witches. Historians and social scientists have long studied this period in hopes of learning more about how large-scale social changes occur. Some have pointed to the invention of the printing press and the publication of witch-hunting manuals—most notably the highly influential Malleus maleficarum—as a major factor, making it easier for the witch-hunting hysteria to spread across the continent.

The abrupt emergence of the craze and its rapid spread, resulting in a pronounced shift in social behaviors—namely, the often brutal persecution of suspected witches—are consistent with a theory of social change dubbed “ideational diffusion,” according to a new paper published in the journal Theory and Society. Under this theory, new ideas are introduced and reinforced through social networks, eventually taking root and leading to widespread behavioral changes in a society.

The authors had already been thinking about cultural change and the driving forces by which it occurs, including social contagion—especially large cultural shifts like the Reformation and the Counter-Reformation. One co-author, Steve Pfaff, a sociologist at Chapman University, was working on a project about witch trials in Scotland and was particularly interested in the role the Malleus maleficarum might have played.

“Plenty of other people have written about witch trials, specific trials or places or histories,” co-author Kerice Doten-Snitker, a social scientist with the Santa Fe Institute, told Ars. “We’re interested in building a general theory about change and wanted to use that as a particular opportunity. We realized that the printing of the Malleus maleficarum was something we could measure, which is useful when you want to do empirical work, not just theoretical work.”

Ch-ch-ch-changes…

The Witch, No. 1, a c. 1892 lithograph by Joseph E. Baker, shows a woman in a courtroom dock, arms outstretched before a judge and jury. Credit: Public domain

Modeling how sweeping cultural change happens has been a hot research topic for decades, hitting the cultural mainstream with the publication of Malcolm Gladwell’s 2000 bestseller The Tipping Point. Researchers continue to make advances in this area. University of Pennsylvania sociologist Damon Centola, for instance, published How Behavior Spreads: The Science of Complex Contagions in 2018, in which he applied lessons from epidemiology—on how viral epidemics spread—to our understanding of how social networks can broadly alter human behavior. But while epidemiological modeling might be useful for certain simple forms of social contagion—people come into contact with something and it spreads rapidly, like a viral meme or hit song—other forms of social contagion are more complicated, per Doten-Snitker.

Doten-Snitker et al.’s ideational diffusion model differs from Centola’s in some critical respects. For cases like the spread of witch trials, “It’s not just that people are coming into contact with a new idea, but that there has to be something cognitively that is happening,” said Doten-Snitker. “People have to grapple with the ideas and undergo some kind of idea adoption. We talk about this as reinterpreting the social world. They have to rethink what’s happening around them in ways that make them think that not only are these attractive new ideas, but also those new ideas prescribe different types of behavior. You have to act differently because of what you’re encountering.”

The authors chose to focus on social networks and trade routes for their analysis of the witch trials, building on prior research that prioritized broader economic and environmental factors. Cultural elites were already exchanging ideas through letters, but published books added a new dimension to those exchanges. Researchers studying 21st-century social contagion can download massive amounts of online data from social networks; that kind of data is sparse for the medieval era. “We don’t have the same archives of communication,” said Doten-Snitker. “There’s this dual thing happening: the book itself, and people sharing information, arguing back and forth with each other” about new ideas.

The stages of the ideational diffusion model. Credit: K. Doten-Snitker et al., 2024

So she and her co-authors turned to trade routes to determine which cities were more central and thus more likely to be focal points of new ideas and information. “The places that are more central in these trade networks have more stuff passing through and are more likely to come into contact with new ideas from multiple directions—specifically ideas about witchcraft,” said Doten-Snitker. Then they looked at which of 553 cities in Central Europe held their first witch trials, and when, as well as those where the Malleus maleficarum and similar manuals had been published.
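For the quantitatively inclined, the intuition behind “centrality” fits in a few lines of code. This toy example is purely illustrative: the city links and the choice of betweenness centrality are assumptions for the sketch, not measures taken from the paper.

```python
import networkx as nx

# Hypothetical trade network: nodes are cities, edges are trade routes.
# The pairings here are invented for illustration.
routes = [
    ("Speyer", "Nuremberg"), ("Nuremberg", "Leipzig"),
    ("Nuremberg", "Augsburg"), ("Augsburg", "Basel"),
    ("Basel", "Speyer"), ("Leipzig", "Magdeburg"),
]
G = nx.Graph()
G.add_edges_from(routes)

# Betweenness centrality scores how often a city sits on the shortest
# paths between other cities -- a proxy for how much "stuff passes through."
centrality = nx.betweenness_centrality(G)
for city, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{city:10s} {score:.2f}")
```

In this toy network, Nuremberg scores highest: more routes pass through it, so under the ideational diffusion model it would encounter new ideas sooner and from more directions.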

Social contagion

They found that each new published edition of the Malleus maleficarum corresponded with a subsequent increase in witch trials. But that wasn’t the only contributing factor; trends in neighboring cities also influenced the increase, resulting in a slow-moving ripple effect that spread across the continent. “What’s the behavior of neighboring cities?” said Doten-Snitker. “Are they having witch trials? That makes your city more likely to have a witch trial when you have the opportunity.”

In epidemiological models like Centola’s, the pattern of change is a slow start with early adoption that then picks up speed and spreads before slowing down again as a saturation point is reached, because most people have now adopted the new idea or technology. That doesn’t happen with witch trials or other complex social processes such as the spread of medieval antisemitism. “Most things don’t actually spread that widely; they don’t reach complete saturation,” said Doten-Snitker. “So we need to have theories that build that in as well.”
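That contrast is easy to see in a toy threshold model, a standard way to simulate complex contagion; this is a sketch of the general idea, not the authors’ actual model. Two tight clusters are joined by a single bridge tie. A simple contagion crosses the bridge and saturates, while a complex one stalls:

```python
import networkx as nx

def cascade(G, seeds, threshold):
    """Spread until stable: a node adopts once at least `threshold`
    (a fraction) of its neighbors have adopted."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node in G.nodes:
            if node not in adopted:
                nbrs = list(G.neighbors(node))
                if nbrs and sum(n in adopted for n in nbrs) / len(nbrs) >= threshold:
                    adopted.add(node)
                    changed = True
    return adopted

# Two tight clusters of three, joined by one bridge tie (a3 -- b1).
G = nx.Graph()
G.add_edges_from([("a1", "a2"), ("a2", "a3"), ("a1", "a3"),
                  ("b1", "b2"), ("b2", "b3"), ("b1", "b3"),
                  ("a3", "b1")])
seeds = {"a1", "a2"}

# Simple contagion: any single adopting neighbor suffices -> saturates.
print(sorted(cascade(G, seeds, threshold=0.01)))  # all six nodes
# Complex contagion: half of one's neighbors must adopt first -> the
# cascade stalls at the bridge, never reaching the second cluster.
print(sorted(cascade(G, seeds, threshold=0.5)))   # ['a1', 'a2', 'a3']
```

The complex cascade’s adoption curve flattens well short of saturation, which is the pattern Doten-Snitker describes.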

In the case of witch trials, the publication of the Malleus maleficarum helped shift medieval attitudes toward witchcraft, from something that wasn’t viewed as a particularly pressing problem to something evil that was menacing society. The tome also offered practical advice on what should be done about it. “So there’s changing ideas about witchcraft and this gets coupled with, well, you need to do something about it,” said Doten-Snitker. “Not only is witchcraft bad, but it’s a threat. So you have a responsibility as a community to do something about witches.”

The term “witch hunt” gets bandied about frequently in modern times, particularly on social media, and is generally understood to reference a mob mentality unleashed on a given target. But Doten-Snitker emphasizes that medieval witch trials were not “mob justice”; they were organized affairs, with official accusations made to an organized local judiciary that collected and evaluated evidence, using the Malleus maleficarum and similar treatises as a guide. The process, she said, is similar to how today’s governments adopt new policies.

Why conspiracy theories take hold

Cities where witch trials did and did not take place in Central Europe, 1400–1679, as well as those that printed the Malleus maleficarum. Credit: K. Doten-Snitker et al., 2024

The authors developed their model using the witch trials as a useful framework, but there are contemporary implications, particularly with regard to the rampant spread of misinformation and conspiracy theories via social media. These can also lead to changes in real-world behavior, including violent outbreaks like the January 6, 2021, attack on the US Capitol or, more recently, threats aimed at FEMA workers in the wake of Hurricane Helene. Doten-Snitker thinks their model could help identify the emergence of certain telltale patterns, notably the combination of the spread of misinformation or conspiracy theories on social media along with practical guidelines for responding.

“People have talked about the ways that certain conspiracy theories end up making sense to people,” said Doten-Snitker. “It’s because they’re constructing new ways of thinking about their world. This is why people start with one conspiracy theory belief that is then correlated with belief in others. It’s because you’ve already started rebuilding your image of what’s happening in the world around you and that serves as a basis for how you should act.”

On the plus side, “It’s actually hard for something that feels compelling to certain people to spread throughout the whole population,” she said. “We should still be concerned about ideas that spread that could be socially harmful. We just need to figure out where it might be most likely to happen and focus our efforts in those places rather than assuming it is a global threat.”

There was a noticeably sharp decline in both the frequency and intensity of witch trial persecutions from 1679 onward, raising the question of how such cultural shifts eventually run their course. That aspect is not directly addressed by their model, according to Doten-Snitker, but it does provide a framework for the kinds of things that might signal a similar major shift, such as people starting to push back against extreme responses or practices. At the tail end of the witch trial craze, for instance, there was increased pressure to prioritize clear and consistent judicial practices, excluding extreme measures such as extracting confessions via torture and disallowing dreams as evidence of witchcraft.

“That then supplants older ideas about what is appropriate and how you should behave in the world, and you could have a de-escalation of some of the more extremist tendencies,” said Doten-Snitker. “It’s not enough to simply say those ideas or practices are wrong. You have to actually replace them with something. And that is something that is in our model. You have to get people to reinterpret what’s happening around them and what they should do in response. If you do that, then you are undermining a worldview rather than just criticizing it.”

Theory and Society, 2024. DOI: 10.1007/s11186-024-09576-1 (About DOIs).


Jennifer is a senior reporter at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.



US vaccinations fall again as more parents refuse lifesaving shots for kids

Measles, whooping cough, polio, tetanus—devastating and sometimes deadly diseases await comebacks in the US as more and more parents decline routine childhood vaccines that have proved safe and effective.

The vaccination rates among kindergartners have fallen once again, dipping into the 92 percent range in the 2023–2024 school year, down from about 93 percent the previous school year and 95 percent in 2019–2020. That’s according to an analysis of the latest vaccination data published today by the Centers for Disease Control and Prevention.

The analysis also found that vaccination exemptions rose to an all-time high of 3.3 percent, up from 3 percent in the previous school year. The rise is almost entirely driven by non-medical exemptions—in other words, religious or philosophical exemptions. Only 0.2 percent of kindergartners had medical exemptions.

The new stats mean that more parents are choosing to decline lifesaving vaccines and, for the fourth consecutive year, the US has remained below the 95 percent vaccination target that would keep vaccine-preventable diseases from spreading within communities. In fact, the country continues to slip further away from that target.

Based on data from 49 states plus the District of Columbia (Montana did not report data), 80 percent of jurisdictions saw declines in coverage for all four key vaccines assessed: MMR, against measles, mumps, and rubella; DTaP, against diphtheria, tetanus, and pertussis (whooping cough); VAR, against chickenpox; and polio.

Vulnerable kids

Coverage for MMR fell to 92.7 percent in 2023–2024, down from 93.1 percent in the previous school year. That means that about 280,000 (7.3 percent) kindergartners in the US are at risk of measles, mumps, and rubella infections. Likewise, DTaP coverage fell to 92.3 percent, down from 92.7 percent. Polio vaccination fell to 92.6 percent from 93.1 percent, and VAR was down to 92.4 percent from 92.9 percent.
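A quick back-of-the-envelope check shows what those percentages imply. The arithmetic below simply inverts the article’s own MMR numbers; the cohort size is derived, not a CDC figure.

```python
# Invert the MMR numbers above to get the implied cohort size.
at_risk = 280_000            # kindergartners without MMR coverage
uncovered_share = 1 - 0.927  # the 7.3 percent lacking coverage
cohort = at_risk / uncovered_share
print(f"implied kindergarten cohort: {cohort:,.0f}")          # ~3.8 million

# At that cohort size, each 0.1-point slip in coverage exposes roughly:
print(f"children per 0.1-point drop: {cohort * 0.001:,.0f}")  # ~3,800
```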



It’s increasingly unlikely that humans will fly around the Moon next year

Don’t book your tickets for the launch of NASA’s Artemis II mission next year just yet.

We have had reason to doubt the official September 2025 launch date for the mission, the first crewed flight into deep space in more than five decades, for a while now. This is principally because NASA is continuing to mull the implications of damage to the Orion spacecraft’s heat shield from the Artemis I mission nearly two years ago.

However, it turns out that there are now other problems with holding to this date as well.

No schedule margin

A new report from the US Government Accountability Office found that NASA’s Exploration Ground Systems program—essentially, the office at Kennedy Space Center in Florida responsible for building ground infrastructure to support the Space Launch System rocket and Orion—is in danger of missing its schedule for Artemis II.

During this flight, a crew of four astronauts, commanded by NASA’s Reid Wiseman, will launch inside Orion on a 10-day mission out to the Moon and back. The spacecraft will follow a free-return trajectory, which is important because, if there is a significant problem with the Orion spacecraft’s propulsion system, the trajectory of the vehicle will still carry it back to Earth. At their closest approach, the crew will come within about 6,500 miles (10,400 km) of the surface of the far side of the Moon.

The new report, published Thursday, finds that the Exploration Ground Systems program had several months of schedule margin in its work toward a September 2025 launch date at the beginning of the year. But now, the program has allocated all of that margin to technical issues experienced during work on the rocket’s mobile launcher and pad testing.

“Earlier in 2024, the program was reserving that time for technical issues that may arise during testing of the integrated SLS and Orion vehicle or if weather interferes with planned activities, among other things,” the report states. “Officials said it is likely that issues will arise because this is the first time testing many of these systems. Given the lack of margin, if further issues arise during testing or integration, there will likely be delays to the September 2025 Artemis II launch date.”



There’s another massive meat recall over Listeria—and it’s a doozy

Another nationwide meat recall is underway over Listeria contamination—and it’s far more formidable than the last.

As of October 15, meat supplier BrucePac, of Durant, Oklahoma, is recalling 11.8 million pounds of ready-to-eat meat and poultry products after routine federal safety testing found Listeria monocytogenes, a potentially deadly bacterium, in samples of the company’s poultry. The finding triggered an immediate recall, which was first issued on October 9. But officials are still working to understand the extent of the contamination—and struggling to identify the hundreds of potentially contaminated products.

“Because we sell to other companies who resell, repackage, or use our products as ingredients in other foods, we do not have a list of retail products that contain our recalled items,” BrucePac said in a statement updated October 15.

Depending on the packaging, the products may have establishment numbers 51205 or P-51205 inside or under the USDA mark of inspection. But, for now, consumers’ best chance of determining whether they’ve purchased any of the affected products is to look through a 342-page list of products identified by the US Department of Agriculture so far.

The unorganized document lists fresh and frozen foods sold at common retailers, including 7-Eleven, Aldi, Amazon Fresh, Giant Eagle, Kroger, Target, Trader Joe’s, Walmart, and Wegmans. Affected products carry well-known brand names, such as Atkins, Boston Market, Dole, Fresh Express, Jenny Craig, Michelina’s, Taylor Farms, and stores’ own brands, such as Target’s Good & Gather. The recalled products were made between May 31, 2024, and October 8, 2024.

In the latest update, the USDA noted that some of the recalled products were also distributed to schools, but the agency hasn’t identified the schools that received the products. Restaurants and other institutions also received the products.



Amazon joins Google in investing in small modular nuclear power


Small nukes is good nukes?

What’s with the sudden interest in nuclear power among tech titans?

Fuel pellets flow down the reactor (left) as gas transfers heat to a boiler (right). Credit: X-energy

On Tuesday, Google announced that it had made a power purchase agreement for electricity generated by a small modular nuclear reactor design that hasn’t even received regulatory approval yet. Today, it’s Amazon’s turn. The company’s Amazon Web Services (AWS) group has announced three different investments, including one targeting a different startup that has its own design for small, modular nuclear reactors—one that has not yet received regulatory approval.

Unlike Google’s deal, which is a commitment to purchase power should the reactors ever be completed, Amazon will lay out some money upfront as part of the agreements. We’ll take a look at the deals and technology that Amazon is backing before analyzing why companies are taking a risk on unproven technologies.

Money for utilities and a startup

Two of Amazon’s deals are with utilities that serve areas where it already has a significant data center footprint. One of these is Energy Northwest, an energy supplier that sends power to utilities in the Pacific Northwest. Amazon is putting up the money for Energy Northwest to study the feasibility of adding small modular reactors to its Columbia Generating Station, which currently houses a single, large reactor. In return, Amazon will get the right to purchase power from an initial installation of four small modular reactors. The site could potentially support additional reactors, which Energy Northwest would be able to use to meet demands from other users.

The deal with Virginia’s Dominion Energy is similar in that it would focus on adding small modular reactors to Dominion’s existing North Anna Nuclear Generating Station. But the exact nature of the deal is a bit harder to understand. Dominion says the companies will “jointly explore innovative ways to advance SMR development and financing while also mitigating potential cost and development risks.”

Should either or both of these projects go forward, the reactor designs used will come from a company called X-energy, which is involved in the third deal Amazon is announcing. In this case, it’s a straightforward investment in the company, although the exact dollar amount is unclear (the company says Amazon is “anchoring” a $500 million round of investments). The money will help finalize the company’s reactor design and push it through the regulatory approval process.

Small modular nuclear reactors

X-energy is one of several startups attempting to develop small modular nuclear reactors. The reactors all share a few features that are expected to help them avoid the massive time and cost overruns associated with the construction of large nuclear power stations. Their limited size allows them to be made at a central facility and then shipped to the power station for installation. This limits the scale of the infrastructure that needs to be built on-site and allows the assembly facility to benefit from economies of scale.

This also allows a great deal of flexibility at the installation site, as you can scale the facility to power needs simply by adjusting the number of installed reactors. If demand rises in the future, you can simply install a few more.

The small modular reactors are also typically designed to be inherently safe. Should the site lose power or control over the hardware, the reactor will default to a state where it can’t generate enough heat to melt down or damage its containment. There are various approaches to achieving this.

X-energy’s technology is based on small, self-contained fuel pellets called TRISO particles (short for TRi-structural ISOtropic). These contain both the uranium fuel and a graphite moderator and are surrounded by a ceramic shell. They’re structured so that there isn’t sufficient uranium present to generate temperatures that can damage the ceramic, ensuring that the nuclear fuel will always remain contained.

The design is meant to run at high temperatures and extract heat from the reactor using helium, which is then used to boil water and generate electricity. Each reactor can produce 80 megawatts of electricity, and the reactors are designed to work efficiently as a set of four, creating a 320 MW power plant. As yet, however, there are no working examples of this reactor, and the design hasn’t been approved by the Nuclear Regulatory Commission.
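The modular scaling described above is simple arithmetic, sketched below; the 960 MW campus demand is a hypothetical figure for illustration, not one from the announcements.

```python
import math

REACTOR_MW = 80   # per-reactor output, per the design described above
PACK_SIZE = 4     # the reactors are designed to operate as a set of four
pack_mw = REACTOR_MW * PACK_SIZE  # 320 MW per plant

demand_mw = 960   # hypothetical data center campus demand
packs = math.ceil(demand_mw / pack_mw)
print(f"{packs} four-packs = {packs * PACK_SIZE} reactors, "
      f"{packs * pack_mw} MW")  # 3 four-packs = 12 reactors, 960 MW
```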

Why now?

Why is there such sudden interest in small modular reactors among the tech community? It comes down to growing needs and a lack of good alternatives, even given the highly risky nature of the startups that hope to build the reactors.

It’s no secret that data centers require enormous amounts of energy, and the sudden popularity of AI threatens to raise that demand considerably. Renewables, as the cheapest source of power on the market, would be one way of satisfying that growth, but they’re not ideal. For one thing, the intermittent nature of the power they supply, while possible to manage at the grid level, is a bad match for the around-the-clock demands of data centers.

The US has also benefited from over a decade of efficiency gains keeping demand flat despite population and economic growth. This has meant that all the renewables we’ve installed have displaced fossil fuel generation, helping keep carbon emissions in check. Should newly installed renewables instead end up servicing rising demand, it will make it considerably more difficult for many states to reach their climate goals.

Finally, renewable installations have often been built in areas without dedicated high-capacity grid connections, resulting in a large and growing backlog of projects (2.6 TW of generation and storage as of 2023) that are stalled as they wait for the grid to catch up. Expanding the pace of renewable installation can’t meet rising server farm demand if the power can’t be brought to where the servers are.

These new projects avoid that problem because they’re targeting sites that already have large reactors and grid connections to use the electricity generated there.

In some ways, it would be preferable to build more of these large reactors based on proven technologies. But not in two very important ways: time and money. The last reactor completed in the US was at the Vogtle site in Georgia, which started construction in 2009 but only went online this year. Costs also increased from $14 billion to over $35 billion during construction. It’s clear that any similar projects would start generating far too late to meet the near-immediate needs of server farms and would be nearly impossible to justify economically.

This leaves small modular nuclear reactors as the least-bad option in a set of bad options. Despite many startups having entered the space over a decade ago, there is still just a single reactor design approved in the US, that of NuScale. But the first planned installation saw the price of the power it would sell rise to the point where it was no longer economically viable due to the plunge in the cost of renewable power; it was canceled last year as the utilities that would have bought the power pulled out.

The probability that a different company will manage to get a reactor design approved, move to construction, and manage to get something built before the end of the decade is extremely low. The chance that it will be able to sell power at a competitive price is also very low, though that may change if demand rises sufficiently. So the fact that Amazon is making some extremely risky investments indicates just how worried it is about its future power needs. Of course, when your gross profit is over $250 billion a year, you can afford to take some risks.


John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.



Drugmakers can keep making off-brand weight-loss drugs as FDA backpedals

The judge in the case, District Judge Mark Pittman, granted the FDA’s request, canceled an October 15 hearing, and ordered the parties to submit a joint status report on November 21.

Drugmakers respond

The move was celebrated by the Outsourcing Facilities Association (OFA), which filed the lawsuit.

“We believe that this is a fair resolution in light of the agency’s rash decision to take the drug off of the list at a time when the agency has acknowledged ‘supply disruptions,’ which immediately created a major access issue for patients everywhere,” OFA Chairperson Lee Rosebush said in a statement. “Most important, should the FDA repeat its removal decision when a shortage still genuinely exists, we will return to court.”

The move is also likely to please patients who have come to rely on cheaper, more readily available compounded versions of the drugs. For some, compounded products may have been their only access to tirzepatide. Still, those drugs are not without risk. The FDA has repeatedly emphasized that compounded drugs are not FDA-approved and do not go through the same safety, efficacy, and quality reviews. And the agency has warned of dosing errors and other safety concerns with compounded versions.

The one party that is certainly unhappy with the FDA’s move is Eli Lilly, which had reportedly sent cease-and-desist letters to compounders. In an emailed statement to Ars, a spokesperson for Lilly said that there was sufficient supply of the company’s drug and continued use of compounded versions is unwarranted. “Nothing changes the fact that, as FDA has recognized, Mounjaro and Zepbound are available and the shortage remains ‘resolved,'” the spokesperson said.

Lilly also noted the FDA’s safety concerns about the compounded versions, adding that its own examination of some compounded products found impurities, bacteria, strange coloring, incorrect potency, puzzling chemical structures, and, in one case, a product that was just sugar alcohol.

“All doses of Lilly’s FDA-approved medicines are available and it is important that patients not be exposed to the risks in taking untested, unapproved knockoffs,” the spokesperson said.

In terms of the supply “disruptions” the FDA mentioned and that some patients and pharmacies have reportedly experienced, Lilly said that the supply chain was complex and there are many reasons why a given pharmacy may not have a specific dose at hand, such as limited refrigerated storage space.



Sustainable building effort reaches new heights with wooden skyscrapers


Wood offers architects an alternative to carbon-intensive steel and concrete.

At the University of Toronto, just across the street from the football stadium, workers are putting up a 14-story building with space for classrooms and faculty offices. What’s unusual is how they’re building it — by bolting together giant beams, columns, and panels made of manufactured slabs of wood.

As each wood element is delivered by flatbed, a tall crane lifts it into place and holds it in position while workers attach it with metal connectors. In its half-finished state, the building resembles flat-pack furniture in the process of being assembled.

The tower uses a new technology called mass timber. In this kind of construction, massive, manufactured wood elements that can extend more than half the length of a football field replace steel beams and concrete. Though still relatively uncommon, it is growing in popularity and beginning to pop up in skylines around the world.

Mass timber can lend warmth and beauty to an interior. Pictured is a unit in the eight-story Carbon12 condominium in Portland, Oregon. Credit: KAISER + PATH

Today, the tallest mass timber building is the 25-story Ascent skyscraper in Milwaukee, completed in 2022. As of that year, there were 84 mass timber buildings eight stories or higher either built or under construction worldwide, with another 55 proposed. Seventy percent of the existing and future buildings were in Europe, about 20 percent in North America, and the rest in Australia and Asia, according to a report from the Council on Tall Buildings and Urban Habitat. When you include smaller buildings, at least 1,700 mass timber buildings had been constructed in the United States alone as of 2023.

Mass timber is an appealing alternative to energy-intensive concrete and steel, which together account for almost 15 percent of global carbon dioxide emissions. Though experts are still debating mass timber’s role in fighting climate change, many are betting it’s better for the environment than current approaches to construction. It relies on wood, after all, a renewable resource.

Mass timber also offers a different aesthetic that can make a building feel special. “People get sick and tired of steel and concrete,” says Ted Kesik, a building scientist at the University of Toronto’s Mass Timber Institute, which promotes mass timber research and development. With its warm, soothing appearance and natural variations, timber can be more visually pleasing. “People actually enjoy looking at wood.”

Same wood, stronger structure

Using wood for big buildings isn’t new, of course. Industrialization in the 18th and 19th centuries led to a demand for large factories and warehouses, which were often “brick and beam” construction—a frame of heavy wooden beams supporting exterior brick walls.

As buildings became ever taller, though, builders turned to concrete and steel for support. Wood construction became mostly limited to houses and other small buildings made from the standard-sized “dimensional” lumber you see stacked at Home Depot.

But about 30 years ago, builders in Germany and Austria began experimenting with techniques for making massive wood elements out of this readily available lumber. They used nails, dowels and glue to combine smaller pieces of wood into big, strong and solid masses that don’t require cutting down large old-growth trees.

Engineers, including Julius Natterer, a German working in Switzerland, pioneered new methods for building with the materials. And architects including Austria’s Hermann Kaufmann began gaining attention for mass timber projects, including the Ölzbündt apartments in Austria, completed in 1997, and Brock Commons, an 18-story student residence at the University of British Columbia, completed in 2017.

In principle, mass timber is like plywood but on a much larger scale: The smaller pieces are layered and glued together under pressure in large specialized presses. Today, beams up to 50 meters long, usually made of what’s called glue-laminated timber, or glulam, can replace steel elements. Panels up to 50 centimeters thick, typically cross-laminated timber, or CLT, replace concrete for walls and floors.

These wood composites can be surprisingly strong—stronger than steel by weight. But a mass timber element must be bulkier to achieve that same strength. As a building gets higher, the wooden supports must get thicker; at some point, they simply take up too much space. So for taller mass timber buildings, including the Ascent skyscraper, architects often turn to a combination of wood, steel and concrete.

Historically, one of the most obvious concerns with using mass timber for tall buildings was fire safety. Until recently, many building codes limited wood construction to low-rise buildings.

Though they don’t have to be completely fireproof, buildings need to resist collapse long enough to give firefighters a chance to bring the flames under control, and for occupants to get out. Materials used in conventional skyscrapers, for instance, are required to maintain their integrity in a fire for three hours or more.

To demonstrate mass timber’s fire resistance, engineers put the wood elements in gas-fired chambers and monitor their integrity. Other tests set fire to mock-ups of mass timber buildings and record the results.

These tests have gradually convinced regulators and customers that mass timber can resist burning long enough to be fire-safe. That’s partly because a layer of char tends to form early on the outside of the timber, insulating the interior from much of the fire’s heat.

Mass timber got a major stamp of approval in 2021, when the International Code Council changed the International Building Code, which serves as a model for jurisdictions around the world, to allow mass timber construction up to 18 stories tall. With this change, more and more localities are expected to update their codes to routinely allow tall mass timber buildings, rather than requiring them to get special approvals.

There are other challenges, though. “Moisture is the real problem, not fire,” says Steffen Lehmann, an architect and scholar of urban sustainability at the University of Nevada, Las Vegas.

All buildings must control moisture, but it’s absolutely crucial for mass timber. Wet wood is vulnerable to deterioration from fungus and insects like termites. Builders are careful to prevent the wood from getting wet during transportation and construction, and they deploy a comprehensive moisture management plan, including designing heat and ventilation systems to keep moisture from accumulating. For extra protection from insects, wood can be treated with chemical pesticides or surrounded by mesh or other physical barriers where it meets the ground.

Another problem is acoustics, since wood transmits sound so well. Designers use sound insulation materials, leave space between walls and install raised floors, among other methods.

Potential upsides of mass timber

Combating global warming means reducing greenhouse gas emissions from the building sector, which is responsible for 39 percent of emissions globally. Diana Ürge-Vorsatz, an environmental scientist at the Central European University in Vienna, says mass timber and other bio-based materials could be an important part of that effort.

In a 2020 paper in the Annual Review of Environment and Resources, she and colleagues cite an estimate from the lumber industry that the 18-story Brock Commons, in British Columbia, avoided the equivalent of 2,432 metric tons of CO2 emissions compared with a similar building of concrete and steel. Of those savings, 679 tons came from the fact that less greenhouse gas emissions are generated in the manufacture of wood versus concrete and steel. Another 1,753 metric tons of CO2 equivalent were locked away in the building’s wood.
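The “locked away” figure is carbon stoichiometry, which makes it easy to sanity-check. In the sketch below, the 50 percent carbon fraction for dry wood is a textbook approximation, not a number from the paper.

```python
CO2_PER_CARBON = 44 / 12  # molar mass ratio of CO2 to carbon
CARBON_FRACTION = 0.50    # dry wood is roughly half carbon by mass

stored_co2_t = 1_753      # metric tons of CO2e locked in the building's wood
wood_t = stored_co2_t / CO2_PER_CARBON / CARBON_FRACTION
print(f"implied dry wood mass: {wood_t:,.0f} metric tons")  # ~956 t
```

That works out to roughly a thousand metric tons of structural wood for the 18-story building, a plausible order of magnitude for a mass timber tower.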

“If you use bio-based material, we have a double win,” Ürge-Vorsatz says.

But a lot of the current enthusiasm over mass timber’s climate benefits is based on some big assumptions. The accounting often assumes, for instance, that any wood used in a mass timber building will be replaced by the growth of new trees, and that those new trees will take the same amount of CO2 out of the atmosphere across time. But if old-growth trees are replaced with new tree plantations, the new trees may never reach the same size as the original trees, some environmental groups argue. There are also concerns that increasing demand for wood could lead to more deforestation and less land for food production.

Studies also tend to assume that once the wood is in a building, the carbon is locked up for good. But not all the wood from a felled tree ends up in the finished product. Branches, roots and lumber mill waste may decompose or get burned. And when the building is torn down, if the wood ends up in a landfill, the carbon can find its way out in the form of methane and other emissions.

“A lot of architects are scratching their heads,” says Stephanie Carlisle, an architect and environmental researcher at the nonprofit Carbon Leadership Forum, wondering whether mass timber always has a net benefit. “Is that real?” She believes climate benefits do exist. But she says understanding the extent of those benefits will require more research.

In the meantime, mass timber is at the forefront of a whole different model of construction called integrated design. In traditional construction, an architect designs a building first and then multiple firms are hired to handle different parts of the construction, from laying the foundation, to building the frame, to installing the ventilation system, and so on.

In integrated design, says Kesik, the design phase is much more detailed and involves the various firms from the beginning. The way different components will fit and work together is figured out in advance. Exact sizes and shapes of elements are predetermined, and holes can even be pre-drilled for attachment points. That means many of the components can be manufactured off-site, often with advanced computer-controlled machinery.

A lot of architects like this because it gives them more control over the building elements. And because so much of the work is done in advance, the buildings tend to go up faster on-site — up to 40 percent faster than other buildings, Lehmann says.

Mass timber buildings tend to be manufactured more like automobiles, Kesik says, with all the separate pieces shipped to a final location for assembly. “When the mass timber building shows up on-site, it’s really just like an oversized piece of Ikea furniture,” he says. “Everything sort of goes together.”

This story originally appeared in Knowable Magazine.


Knowable Magazine explores the real-world significance of scholarly work through a journalistic lens.



What do planet formation and badminton have in common?

It might not come as a surprise to learn that Lin is a badminton player. “The experience of playing badminton is really the thing that kick-started the idea and led me to ask the right questions,” he said.

Previous explanations attribute the dust alignment to the magnetic influence of the central star, the physics of which can be complicated and not always intuitive. The beauty of the proposed birdie mechanism is its simplicity. “It’s a very good first step,” said Bing Ren, an astronomer at France’s Côte d’Azur Observatory who wasn’t involved in the study.

Still, the birdie-alignment hypothesis is just that—a hypothesis. To confirm whether it holds water, scientists will need to throw their full observational arsenal at protoplanetary disks, such as viewing them at different wavelengths, to sniff out the finer details of particle-gas interactions.

Tracing invisible gas

Real-life protoplanetary disks are likely more complicated than a uniform squadron of space potatoes suspended in thin air. Ren suspects that the grains come in various shapes, sizes, and speeds. Nevertheless, he says Lin’s study is a good foundation for computer models of interstellar clouds, onto which scientists can tack layers of complexity.

The new research points a way forward for probing protoplanetary disks, particularly gas behavior. Given that the grains trace the gas direction, studying dust organization using existing tools like polarized light can allow scientists to map a disk’s aerodynamic flow. Essentially, these grains are tiny flags that signal where the wind blows.

As granular as the details are, the dust alignment is a small but key step in a grand journey of particle-to-planet progression. The nitty-gritty of a particle’s conduct will determine its fate for millions of years—perhaps the primordial seed will hoover up hydrogen and helium to become a gas giant or amass dust to transform into a terrestrial world like Earth. It all starts with the grain flailing or keeping steady amid a sea of other specks.

Monthly Notices of the Royal Astronomical Society, 2024. DOI: 10.1093/mnras/stae2248 (About DOIs).

Shi En Kim is a DC-based freelance journalist who writes about health, the environment, technology, and the physical sciences. She and three other journalists founded Sequencer Magazine in early 2024. Occasionally, she creates art to accompany her writings or does it simply for fun. Follow her on Twitter at @goes_by_kim, or see more of her work on her personal website.



Routine dental X-rays are not backed by evidence—experts want it to stop


The actual recommendations might surprise you—along with the state of modern dentistry.

An expert looking at a dental X-ray and saying “look at that unnecessary X-ray,” probably. Credit: Getty | MilanEXPO

Has your dentist ever told you that it’s recommended to get routine dental X-rays every year? My (former) dentist’s office did just that this year—in writing, even. And they claimed that the recommendation came from the American Dental Association.

It’s a common refrain from dentists, but it’s false. The American Dental Association does not recommend annual routine X-rays. And this is not new; it’s been that way for well over a decade.

The association’s guidelines from 2012 recommended that adults who don’t have an increased risk of dental caries (myself included) need only bitewing X-rays of the back teeth every two to three years. Even people with a higher risk of caries can go as long as 18 months between bitewings. The guidelines also note that X-rays should not be preemptively used to look for problems: “Radiographic screening for the purpose of detecting disease before clinical examination should not be performed,” the guidelines read. In other words, dentists are supposed to examine your teeth before they take any X-rays.

But, of course, the 2012 guidelines are outdated—the latest ones go further. In updated guidance published in April, the ADA doesn’t recommend any specific time window for X-rays at all. Rather, it emphasizes that patient exposure to X-rays should be minimized, and any X-rays should be clinically justified.

There’s a good chance you’re surprised. Dentistry’s overuse of X-rays is a problem dentists do not appear eager to discuss—and would likely prefer to skirt. My former dentist declined to comment for this article, for example, and other dentists have been dodging the topic for years. Nevertheless, the problem is well-established. A New York Times article from 2016, titled “You Probably Don’t Need Dental X-Rays Every Year,” quoted a dental expert noting the exact problem:

“Many patients of all ages receive bitewing X-rays far more frequently than necessary or recommended. And adults in good dental health can go a decade between full-mouth X-rays.”

Data is lacking

The problem has bubbled up again in a series of commentary pieces published in JAMA Internal Medicine today. The pieces were all sparked by a viewpoint that Ars reported on in May, in which three dental and health experts highlighted that many routine aspects of dentistry, including biannual cleanings, are not evidence-based and that the industry is rife with overdiagnosis and overtreatment. That viewpoint, titled “Too Much Dentistry,” also appeared in JAMA Internal Medicine.

The new pieces take a more specific aim at dental radiography. But, as in the May viewpoint, experts also blasted dentistry more generally for being out of step with modern medicine in its lack of data to support its practices—practices that continue amid financial incentives to overtreat and little oversight to stop it, they note.

In a piece titled “Too Much Dental Radiography,” Sheila Feit, a retired medical expert based in New York, pointed out that using X-rays for dental screenings is not backed by evidence. “Data are lacking about outcomes,” she wrote. If anything, the weak data we have makes the practice look ineffective. For instance, a 2021 systematic review of 77 studies that included data on a total of 15,518 tooth sites or surfaces found that using X-rays to detect early tooth decay led to a high degree of false-negative results. In other words, it led to missed cases.

Feit called for gold-standard randomized clinical trials to evaluate the risks and benefits of X-ray screenings for patients, particularly adults at low risk of caries. “Financial aspects of dental radiography also deserve further study,” Feit added. Overall, Feit called the May viewpoint “a timely call for evidence to support or refute common clinical dental practices.”

Dentistry without oversight

In a response published simultaneously in JAMA Internal Medicine, oral medicine expert Yehuda Zadik championed Feit’s point, calling it “an essential discussion about the necessity and risks of routine dental radiography, emphasizing once again the need for evidence-based dental care.”

Zadik, a professor of dental medicine at The Hebrew University of Jerusalem, noted that the overuse of radiography in dentistry is a global problem, one aided by dentistry’s unique delivery:

“Dentistry is among the few remaining health care professions where clinical examination, diagnostic testing including radiographs, diagnosis, treatment planning, and treatment are all performed in place, often by the same care practitioner,” Zadik wrote. “This model of care delivery prevents external oversight of the entire process.”

While routine X-rays continue at short intervals, Zadik notes that current data “favor the reduction of patient exposure to diagnostic radiation in dentistry,” while advancements in dentistry dictate that X-rays should be used at “longer intervals and based on clinical suspicion.”

Though the digital dental X-rays often used today provide smaller doses of radiation than the film X-rays used in the past, radiation’s harms are cumulative. Zadik emphasizes that with the primary tenet of medicine being “First, do no harm,” any unnecessary X-ray is an unnecessary harm. Further, other technology can sometimes be used instead of radiography, including electronic apex locators for root canal procedures.

“Just as it is now unimaginable that, in the past, shoe fittings for children were conducted using X-rays, in the future it will be equally astonishing to learn that the fit of dental crowns was assessed using radiographic imaging,” Zadik wrote.

X-rays do more harm than good in children

Feit’s commentary also prompted a reply from the three authors of the original May viewpoint: Paulo Nadanovsky, Ana Paula Pires dos Santos, and David Nunan. The three followed up on Feit’s point that data is weak on whether X-rays are useful for detecting early decay, specifically white spot lesions. The experts raise the damning point that even if dental X-rays were shown to be good at doing that, there’s still no evidence that that’s good for patients.

“[T]here is no evidence that detecting white spot lesions, with or without radiographs, benefits patients,” the researchers wrote. “Most of these lesions do not progress into dentine cavities,” and there’s no evidence that early treatments make a difference in the long run.

To bolster the point, the three note that data from children suggest that X-ray screening does more harm than good. In a randomized clinical trial published in 2021, 216 preschool children were split into two groups: one received only a visual-tactile dental exam, while the other received both a visual-tactile exam and X-rays. The study found that adding X-rays caused more harm than benefit because the X-rays led to false positives and overdiagnosis of cavitated caries needing restorative treatment. The authors of the trial concluded that “visual inspection should be conducted alone in regular clinical practice.”

Like Zadik, the three researchers note that screenings for decay and cavities are not the only questionable use of X-rays in dental practice. Other common dental and orthodontic treatments involving radiography—practices often used in children and teens—might also be unnecessary harms. They raise the argument against the preventive removal of wisdom teeth, which is also not backed by evidence.

Like Feit, the three researchers reiterate the call for well-designed trials to back up or refute common dental practices.


Beth is Ars Technica’s Senior Health Reporter. Beth has a Ph.D. in microbiology from the University of North Carolina at Chapel Hill and attended the Science Communication program at the University of California, Santa Cruz. She specializes in covering infectious diseases, public health, and microbes.



People think they already know everything they need to make decisions

The obvious difference was the decisions they made. In the group that had read the article biased in favor of merging the schools, nearly 90 percent favored the merger. In the group that had read the article that was biased by including only information in favor of keeping the schools separate, less than a quarter favored the merger.

The other half of the experimental population wasn’t given the survey immediately. Instead, they were given the article that they hadn’t read—the one that favored the opposite position of the article that they were initially given. You can view this group as doing the same reading as the control group, just doing so successively rather than in a single go. In any case, this group’s responses looked a lot like the control’s, with people roughly evenly split between merger and separation. And they became less confident in their decision.

It’s not too late to change your mind

There is one bit of good news about this. When initially forming hypotheses about the behavior they expected to see, Gehlbach, Robinson, and Fletcher suggested that people would remain committed to their initial opinions even after being exposed to a more complete picture. However, there was no evidence of this sort of stubbornness in these experiments. Instead, once people were given all the potential pros and cons of the options, they acted as if they had that information the whole time.

But that shouldn’t obscure the fact that there’s a strong cognitive bias at play here. “Because people assume they have adequate information, they enter judgment and decision-making processes with less humility and more confidence than they might if they were worrying whether they knew the whole story or not,” Gehlbach, Robinson, and Fletcher wrote.

This is especially problematic in the current media environment. Many outlets have been created with the clear intent of exposing their viewers to only a partial view of the facts—or, in a number of cases, the apparent intent of spreading misinformation. The new work clearly indicates that these efforts can have a powerful effect on beliefs, even if accurate information is available from various sources.

PLOS ONE, 2024. DOI: 10.1371/journal.pone.0310216 (About DOIs).



Can walls of oysters protect shores against hurricanes? Darpa wants to know.


Colonized artificial reef structures could absorb the power of storms.

Credit: Kemter/Getty Images

On October 10, 2018, Tyndall Air Force Base on the Gulf of Mexico—a pillar of American air superiority—found itself under aerial attack. Hurricane Michael, first spotted as a Category 2 storm off the Florida coast, unexpectedly hulked up to a Category 5. Sustained winds of 155 miles per hour whipped into the base, flinging power poles, flipping F-22s, and totaling more than 200 buildings. The sole saving grace: Despite sitting on a peninsula, Tyndall avoided flood damage. Michael’s 9- to 14-foot storm surge swamped other parts of Florida. Tyndall’s main defense was luck.

That $5 billion disaster at Tyndall was just one of a mounting number of extreme-weather events that convinced the US Department of Defense that it needed new ideas to protect the 1,700 coastal bases it’s responsible for globally. As hurricanes Helene and Milton have just shown, beachfront residents face compounding threats from climate change, and the Pentagon is no exception. Rising oceans are chewing away the shore. Stronger storms are more capable of flooding land.

In response, Tyndall will later this month test a new way to protect shorelines from intensified waves and storm surges: a prototype artificial reef, designed by a team led by Rutgers University scientists. The 50-meter-wide array, made up of three chevron-shaped structures each weighing about 46,000 pounds, can take 70 percent of the oomph out of waves, according to tests. But this isn’t your grandaddy’s seawall. It’s specifically designed to be colonized by oysters, some of nature’s most effective wave-killers.

If researchers can optimize these creatures to work in tandem with new artificial structures placed at sea, they believe the resulting barriers can take 90 percent of the energy out of waves. David Bushek, who directs the Haskin Shellfish Research Laboratory at Rutgers, swears he’s not hoping for a megastorm to come and show what his team’s unit is made of. But he’s not not hoping for one. “Models are always imperfect. They’re always a replica of something,” he says. “They’re not the real thing.”
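Because a wave’s energy scales with the square of its height, those absorption numbers translate directly into height reductions. The sketch below uses that standard relation; the 3-meter incident wave is an illustrative number.

```python
# Wave energy scales as height squared (E ~ H^2), so absorbing a
# fraction of the energy cuts wave height by the square root factor.
def transmitted_height(h_incident_m, energy_absorbed):
    return h_incident_m * (1 - energy_absorbed) ** 0.5

h = 3.0  # meters, illustrative incident wave height
print(f"70% absorbed: {transmitted_height(h, 0.70):.2f} m")  # ~1.64 m
print(f"90% absorbed: {transmitted_height(h, 0.90):.2f} m")  # ~0.95 m
```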

Playing Reefense

The project is one of three being developed under a $67.6 million program launched by the US government’s Defense Advanced Research Projects Agency, or Darpa. Cheekily called Reefense, the initiative is the Pentagon’s effort to test if “hybrid” reefs, combining manmade structures with oysters or corals, can perform as well as a good ol’ seawall. Darpa chose three research teams, all led by US universities, in 2022. After two years of intensive research and development, their prototypes are starting to go into the water, with Rutgers’ first up.

Today, the Pentagon protects its coastal assets much as civilians do: by hardening them. Common approaches involve armoring the shore with retaining walls or arranging heavy objects, like rocks or concrete blocks, in long rows. But hardscape structures come with tradeoffs. They deflect rather than absorb wave energy, so protecting one’s own shoreline means exposing someone else’s. They’re also static: As sea levels rise and storms get stronger, it’s getting easier for water to surmount these structures. This wears them down faster and demands constant, expensive repairs.

In recent decades, a new idea has emerged: using nature as infrastructure. Restoring coastal habitats like marshes and mangroves, it turns out, helps hold off waves and storms. “Instead of armoring, you’re using nature’s natural capacity to absorb wave energy,” says Donna Marie Bilkovic, a professor at the Virginia Institute of Marine Science. Darpa is particularly interested in two creatures whose numbers have been decimated by humans but which are terrific wave-breakers when allowed to thrive: oysters and corals.

Oysters are effective wave-killers because of how they grow. The bivalves pile onto each other in large, sturdy mounds. The resulting structure, unlike a smooth seawall, is replete with nooks, crannies, and convolutions. When a wave strikes, its energy gets diffused into these gaps, and further spent on the jagged, complex surfaces of the oysters. Also unlike a seawall, an oyster wall can grow. Oysters have been shown to be capable of building vertically at a rate that matches sea-level rise—which suggests they’ll retain some protective value against higher tides and stronger storms.

Today hundreds of human-tended oyster reefs, particularly on America’s Atlantic coast, use these principles to protect the shore. They take diverse approaches; some look much like natural reefs, while others have an artificial component. Some cultivate oysters for food, with coastal protection a nice co-benefit; others are built specifically to preserve shorelines. What’s missing amid all this experimentation, says Bilkovic, is systematic performance data—the kind that could validate which approaches are most effective and cost-effective. “Right now the innovation is outpacing the science,” she says. “We need to have some type of systematic monitoring of projects, so we can better understand where the techniques work the best. There just isn’t funding, frankly.”

Hybrid deployments

Rather than wait for the data needed to engineer the perfect reef, Darpa wants to get there through a rapid burst of R&D. Reefense has given awardees five years to deploy hybrid reefs that take up to 90 percent of the energy out of waves, without costing significantly more than traditional solutions. The manmade component should block waves immediately. But it should be quickly enhanced by organisms that build, in months or years, a living structure that would otherwise take nature decades.

The Rutgers team has built its prototype out of 788 interlocked concrete modules, each 2 feet wide and between 1 and 2 feet tall. They have a scalloped appearance, with shelves jutting in all directions. Internally, all these shelves are connected by holes.

A Darpa-funded team will install sea barriers, made of hundreds of concrete modules, near a Florida military base. The scalloped shape should not only dissipate wave energy but invite oysters to build their own structures.

What this means is that when a wave strikes this structure, it smashes into the internal geometry, swirls around, and exits with less energy. This effect alone weakens the wave by 70 percent, according to the US Army Corps of Engineers, which tested a scale model in a wave simulator in Mississippi. But the effect should only improve as oysters colonize the structure. Bushek and his team have tried to design the shelves with the right hardness, texture, and shading to entice them.

But the reef’s value would be diminished if, say, disease were to wipe the mollusks out. This is why Darpa has tasked Rutgers with also engineering oysters resistant to dermo, a protozoan that’s dogged Atlantic oysters for decades. Darpa prohibited the team from using genetic-modification techniques. But thanks to recent advances in genomics, the Rutgers team can rapidly identify individual oysters with disease-resistant traits. It exposes these oysters to dermo in a lab and crossbreeds the survivors, producing hardier mollusks. Traditionally it takes about three years to breed a generation of oysters for better disease resistance; Bushek says his team has done it in one.
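The selection loop described above (challenge a cohort with the pathogen, keep the survivors, breed from them, repeat) is simple enough to caricature in a few lines of code. This is a toy model, not Rutgers’ actual pipeline: resistance is collapsed to a single heritable score, and every parameter is invented for illustration.

```python
import random

def dermo_challenge(pop):
    # An oyster survives exposure with probability equal to its resistance score.
    return [r for r in pop if random.random() < r]

def breed(survivors, brood=4):
    # Offspring inherit the mid-parent resistance plus some random variation,
    # clamped to [0, 1] so scores remain valid survival probabilities.
    kids = []
    for _ in range(len(survivors) * brood):
        a, b = random.sample(survivors, 2)
        kids.append(min(1.0, max(0.0, (a + b) / 2 + random.gauss(0, 0.05))))
    return kids

pop = [random.uniform(0.2, 0.6) for _ in range(200)]  # mostly susceptible stock
for gen in range(1, 4):
    pop = breed(dermo_challenge(pop))
    print(f"generation {gen}: mean resistance {sum(pop) / len(pop):.2f}")
```

Each pass culls the susceptible and breeds from the resistant, so the mean score ratchets upward. The genomic screening Bushek’s team uses compresses the cycle by flagging likely survivors early, rather than waiting years for adult oysters to prove themselves.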

The tropics are a different story

Oysters may suit the DoD’s needs in temperate waters, but for bases in tropical climates, it’s coral that builds the best seawalls. Hawaii, for instance, enjoys the protection of “fringing” coral reefs that extend offshore for hundreds of yards in a gentle slope along the seabed. The colossal, complex, and porous character of this surface exhausts wave energy over long distances, says Ben Jones, an oceanographer for the Applied Research Laboratory at the University of Hawaii—and head of the university’s Reefense project. He says it’s not unusual to see ocean swells of 6 to 8 feet well offshore while the water at the seashore laps gently.

A Marine base in Hawaii will test out a new approach to coastal protection inspired by local coral reefs: A forward barrier will take the first blows of the waves, and a scattering of pyramids will further weaken waves before they get to shore.

Inspired by this effect, Jones and a team of researchers are designing an array that they’ll deploy near a US Marine Corps base in Oahu whose shoreline is rapidly receding. While the final design isn’t set yet, the broad strokes are: It will feature two 50-meter-wide barriers laid in rows, backed by 20 pyramid-like obstacles. All of these are hollow, thin-walled structures with sloping profiles and lots of big holes. Waves that crash into them will lose energy by crawling up the sides, while two design aspects of the structure—the width of the holes and the thinness of the walls—will generate turbulence in the water, causing it to shed more energy as heat.

The manmade structures in Hawaii will be studded with concrete domes meant to encourage coral colonization. Though at grave risk from global warming, coral reefs are thought to provide coastal-protection benefits worth billions of dollars.

In the team’s full vision, the units are bolstered by about a thousand small coral colonies. Jones’ group plans to cover the structures with concrete modules that are about 20 inches in diameter. These have grooves and crevices that offer perfect shelters for coral larvae. The team will initially implant them with lab-bred coral. But they’re also experimenting with enticements, like light and sound, that help attract coral larvae from the wild—the better to build a wall that nature, not the Pentagon, will tend.

A third Reefense team, led by scientists at the University of Miami, takes its inspiration from a different sort of coral. Its design has a three-tiered structure. The foundation is made of long, hexagonal logs punctured with large holes; atop it is a dense layer with smaller holes—“imagine a sponge made of concrete,” says Andrew Baker, director of the university’s Coral Reef Futures Lab and the Reefense team lead.

The team thinks these artificial components will soak up plenty of wave energy—but it’s a crest of elkhorn coral at the top that will finish the job. Native to Florida, the Bahamas, and the Caribbean, elkhorn like to build dense reefs in shallow-water areas with high-intensity waves. They don’t mind getting whacked by water because it helps them harvest food; this whacking keeps wave energy from getting to shore.

Disease has ravaged Florida’s elkhorn populations in recent decades, and now ocean heat waves are dealing further damage. But the coral’s critical condition has also motivated policymakers to pursue options to save this iconic state species—including Baker’s approach: developing an elkhorn more rugged against disease, higher temperatures, and nastier waves. Under Reefense, Baker says, his lab has developed elkhorn with 1.5° to 2° Celsius more heat tolerance than their ancestors. His team also claims to have boosted the heat thresholds of symbiotic algae—an existentially important occupant of any healthy reef—and crossbred local elkhorn with those from Honduras, where reefs have mysteriously withstood scorching waters.

An unexpected permitting issue, though, will force the Miami team to exit Reefense in 2025, without building the test unit it hoped to deploy near a Florida naval base. The federal permitting authority wanted a pot of money set aside to uninstall the structure if needed; Darpa felt it couldn’t do that in a timely way, according to Baker. (Darpa told WIRED every Reefense project has unique permitting challenges, so the Miami team’s fate doesn’t necessarily speak to anything broader. Representatives for the other two Reefense projects said Baker’s issue hasn’t come up for them.)

Though his team’s work with Reefense is coming to a premature end, Baker says, he’s confident their innovations will get deployed elsewhere. He’s been working with Key Biscayne, an island village near Miami whose shorelines have been chewed up by storms. Roland Samimy, the village’s chief resilience and sustainability officer, says they spend millions of dollars every few years importing sand for their rapidly receding beaches. He’s eager to see if a hybrid structure, like the University of Miami design, could offer protection at far lower cost. “People are realizing their manmade structures aren’t as resilient as nature is,” he says.

Not just Darpa

By no means is Darpa the only one experimenting in these areas. Around the world, there are efforts tackling various pieces of the puzzle, like breeding coral for greater heat resistance, combining coral and oysters with artificial reefs, or designing low-carbon concrete that makes building these structures less environmentally damaging. Bilkovic, of the Virginia Institute of Marine Science, says Reefense will be a success if it demonstrates better ways of doing things than the prevailing methods—and has the data to back this up. “I’m looking forward to seeing what their findings are,” she says. “They’re systematically assessing the effectiveness of the project. Those lessons learned can be translated to other areas, and if the techniques are effective and work well, they can easily be translated to other regions.”

As for Darpa, though the Reefense prototypes are just starting to go into the water, the work is far from over. All of these first-generation units will be scrutinized—both by the research teams and by independent government auditors—to see whether their real-world performance matches the models. Reefense is scheduled to conclude with a final report to the DoD in 2027. It won’t have a “winner” per se; since the Pentagon has bases around the world, all three projects are likely to produce lessons that are relevant elsewhere.

Although their client has the largest military budget in the world, the three Reefense teams have been asked to keep an eye on the economics. Darpa has asked that project costs “not greatly exceed” those of conventional solutions, and it has tasked government monitors with checking the teams’ math. Catherine Campbell, Reefense’s program manager at Darpa, says affordability doesn’t just make it more likely that the Pentagon will employ the technology; it makes it more plausible that civilians will, too.

“This isn’t something bespoke for the military… we need to be in line with those kinds of cost metrics [in the civilian sector],” Campbell said in an email. “And that gives it potential for commercialization.”

This story originally appeared on wired.com.


Wired.com is your essential daily guide to what’s next, delivering the most original and complete take you’ll find anywhere on innovation’s impact on technology, science, business and culture.

Can walls of oysters protect shores against hurricanes? Darpa wants to know. Read More »

starship-is-about-to-launch-on-its-fifth-flight,-and-this-time-there’s-a-catch

Starship is about to launch on its fifth flight, and this time there’s a catch

“We landed with half a centimeter accuracy in the ocean, so we think we have a reasonable chance to come back to the tower,” Gerstenmaier said.

Launch playbook

The Starship upper stage, meanwhile, will light six Raptor engines to accelerate to nearly orbital velocity, giving the rocket enough oomph to coast halfway around the world before falling back into the atmosphere over the Indian Ocean.

This is a similar trajectory to the one Starship flew in June, when it survived a fiery reentry for a controlled splashdown. It was the first time SpaceX completed an end-to-end Starship test flight. Onboard cameras showed fragments of the heat shield falling off Starship when it reentered the atmosphere, but the vehicle maintained control and reignited its Raptor engines, flipped from a horizontal to a vertical orientation, and settled into the Indian Ocean northwest of Australia.

After analyzing the results from the June mission, SpaceX engineers decided to rework the heat shield for the next Starship vehicle. The company said its technicians spent more than 12,000 hours replacing the entire thermal protection system with new-generation tiles, a backup ablative layer, and additional protections between the ship’s flap structures.

From start to finish, Sunday’s test flight should last approximately 1 hour and 5 minutes.

This diagram illustrates the path the Super Heavy booster will take to return to the launch pad in Texas, while the Starship upper stage continues the climb to space.

Credit: SpaceX

This diagram illustrates the path the Super Heavy booster will take to return to the launch pad in Texas, while the Starship upper stage continues the climb to space. Credit: SpaceX

Here’s an overview of the key events during Sunday’s flight:

• T+00:00:02: Liftoff

• T+00:01:02: Maximum aerodynamic pressure

• T+00:02:33: Super Heavy MECO (most engines cut off)

• T+00:02:41: Stage separation and ignition of Starship engines

• T+00:02:48: Super Heavy boost-back burn start

• T+00:03:41: Super Heavy boost-back burn shutdown

• T+00:03:43: Hot staging ring jettison

• T+00:06:08: Super Heavy is subsonic

• T+00:06:33: Super Heavy landing burn start

• T+00:06:56: Super Heavy landing burn shutdown and catch attempt

• T+00:08:27: Starship engine cutoff

• T+00:48:03: Starship reentry

• T+01:02:34: Starship is transonic

• T+01:03:43: Starship is subsonic

• T+01:05:15: Starship landing flip

• T+01:05:20: Starship landing burn

• T+01:05:34: Starship splashdown in Indian Ocean

SpaceX officials hope to see Starship’s heat shield stay intact as it dips into the atmosphere, when temperatures will reach 2,600° Fahrenheit (1,430° Celsius), hot enough to melt aluminum, the metal used to build many launch vehicles. SpaceX chose stainless steel for Starship because it is strong at cryogenic temperatures—the rocket consumes super-cold fuel and oxidizer—and has a much higher melting point than aluminum.
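Those numbers are easy to sanity-check. The quick conversion below also compares the peak against aluminum’s roughly 660° Celsius melting point (a standard figure, not one given in the article) to show why an unprotected aluminum airframe wouldn’t survive:

```python
# Convert the reported peak reentry temperature and compare it against
# aluminum's approximate melting point (~660 degrees C).
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

peak_c = fahrenheit_to_celsius(2600)
print(f"Peak reentry temperature: {peak_c:.0f} C")     # ~1427 C, matching the ~1,430 C figure
print(f"Hotter than molten aluminum: {peak_c > 660}")  # True
```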

Starship is about to launch on its fifth flight, and this time there’s a catch Read More »