Science

Blue cheese shows off new colors, but the taste largely remains the same

Am I blue? —

Future varieties could be yellow-green, reddish-brown-pink, or light blue.

Scientists at the University of Nottingham have discovered how to create different colors of blue cheese.

University of Nottingham

Gourmands are well aware of the many varieties of blue cheese, recognizable by the blue-green veins that ripple through them. Different kinds of blue cheese have distinctive flavor profiles: they can be mild or strong, sweet or salty, for example. Soon we might be able to buy blue cheeses that belie the name and sport veins of different colors: perhaps yellow-green, reddish-brown-pink, or lighter/darker shades of blue, according to a recent paper published in the journal npj Science of Food.

“We’ve been interested in cheese fungi for over 10 years, and traditionally when you develop mould-ripened cheeses, you get blue cheeses such as Stilton, Roquefort, and Gorgonzola, which use fixed strains of fungi that are blue-green in color,” said Paul Dyer of the University of Nottingham, a co-author of this latest research. “We wanted to see if we could develop new strains with new flavors and appearances.”

Blue cheese has been around for a very long time. Legend has it that a young boy left his bread and ewe’s milk cheese in a nearby cave to pursue a lovely young lady he’d spotted in the distance. Months later, he came back to the cave and found that the cheese had molded into Roquefort. It’s a fanciful tale, but scholars think the basic idea is sound: people used to store cheeses in caves because their temperature and moisture levels were especially hospitable to harmless molds. That notion was bolstered by a 2021 analysis of paleofeces, which found evidence that Iron Age salt miners in Hallstatt, Austria, were already eating blue cheese and quaffing beer between 800 and 400 BCE.

Color derivatives.

The manufacturing process for blue cheese is largely the same as for any cheese, with a few crucial additional steps. It requires cultivation of Penicillium roqueforti, a mold that thrives on exposure to oxygen. The P. roqueforti is added to the cheese, sometimes before curds form and sometimes mixed in with curds after they form. The cheese is then aged in a temperature-controlled environment. Lactic acid bacteria trigger the initial fermentation but eventually die off, and the P. roqueforti take over as secondary fermenters. Piercing the curds forms air tunnels in the cheese, and the mold grows along those surfaces to produce blue cheese’s signature veining.

Once scientists published the complete genome for P. roqueforti, it opened up opportunities for studying this blue cheese fungus, per Dyer et al. Different strains “can have different colony cultures and textures, with commercial strains being sold partly on the basis of color development,” they wrote. This coloration comes from pigments in the coatings of the spores that form as the colony grows. Dyer and his co-authors set out to determine the genetic basis of this pigment formation in the hopes of producing altered strains with different spore coat colors.

The team identified a specific biochemical pathway that begins with a white color and progresses through yellow-green, red-brown-pink, dark brown, and light blue before arriving at that iconic dark blue-green. They used targeted gene deletion to block pigment biosynthesis genes at various points in this pathway. This altered the spore color, providing a proof of principle, without adversely affecting the production of flavor volatiles or the levels of secondary metabolites called mycotoxins. (The latter are present in blue cheese at concentrations low enough not to pose a health risk to humans, and the team wanted to ensure those concentrations remained low.)

(left) Spectrum of color strains produced in Penicillium roqueforti. (right) Cross sections of cheeses made with the original (dark blue-green) or new color (red-brown, bright green, white albino) strains of the fungus.

University of Nottingham

However, food industry regulations prohibit gene-deletion fungal strains for commercial cheese production. So Dyer et al. used UV mutagenesis—essentially “inducing sexual reproduction in the fungus,” per Dyer—to produce non-GMO mutant strains of the fungi to create “blue” cheeses of different colors, without increasing mycotoxin levels or impacting the volatile compounds responsible for flavor.

“The interesting part was that once we went on to make some cheese, we then did some taste trials with volunteers from across the wider university, and we found that when people were trying the lighter colored strains they thought they tasted more mild,” said Dyer. “Whereas they thought the darker strain had a more intense flavor. Similarly, with the more reddish-brown and a light green one, people thought they had a fruity, tangy element to them—whereas, according to the lab instruments, they were very similar in flavor. This shows that people do perceive taste not only from what they taste but also by what they see.”

Dyer’s team is hoping to work with local cheese makers in Nottingham and Scotland, setting up a spinoff company in hopes of commercializing the mutant strains. And there could be other modifications on the horizon. “Producers could almost dial up their list of desirable characteristics—more or less color, faster or slower growth rate, acidity differences,” Donald Glover of the University of Queensland in Australia, who was not involved in the research, told New Scientist.

npj Science of Food, 2024. DOI: 10.1038/s41538-023-00244-9  (About DOIs).

Seeding steel frames brings destroyed coral reefs back to life

Image of a large school of fish above a reef.

Coral reefs, some of the most stunningly beautiful marine ecosystems on Earth, are dying. Ninety percent of them will likely be gone by 2050 due to rising ocean temperatures and pollution. “But it’s not that when they are gone, they are gone forever. We can rebuild them,” said Dr. Timothy Lamont, a marine biologist working at Lancaster University.

Lamont’s team evaluated coral reef restoration efforts done through the MARS Coral Reef Restoration Program on the coast of Indonesia and found that planting corals on a network of sand-coated steel frames brought a completely dead reef back to life in just four years. It seems like we can fix something for once.

Growing up in rubble

The restored reef examined by Lamont’s team was damaged by blast fishing done 30–40 years ago. “People were using dynamite to blow up the reef. It kills all the fish, the fish float to the surface, and you can scoop them all up. Obviously, this is very damaging to the habitat and leaves behind loose rubble fields with lots of coral skeletons,” said Lamont.

Because this loose rubble is in constant motion, tumbling and rolling around, coral larvae don’t have enough time to grow before they get squashed. So the first step to bringing damaged reefs back to life was stabilizing the rubble. The people running the MARS program did this using Reef Stars, hexagonal steel structures coated with sand. “These structures are connected into networks and pinned to the seabed to reduce the movement of the rubble,” Lamont said.

Before the reef stars were placed on the seabed, though, the MARS team manually tied little corals around them. This was meant to speed up recovery compared to letting coral larvae settle on the steel structures naturally. Based on some key measures, it worked. But there are questions about whether those measures capture everything we need to know.

Artificial coral reefs

The metric Lamont’s team used to measure the success of the MARS program restoration was the carbonate budget, which describes the overall growth of the whole reef structure. According to Lamont, a healthy coral reef has a positive carbonate budget and produces roughly 20 kilograms of limestone per square meter per year. This is exactly what his team measured at restored sites on the Indonesian reef. But while the recovered reef had the same carbonate budget as a healthy one, the organisms contributing to that budget were different.

An untouched natural reef is a diverse mixture including massive, encrusting, and plating coral species like Isopora or Porites, which contribute roughly a third of the carbonate budget. Restored reefs were almost completely dominated by smaller, branching corals like Stylophora, Acropora, and Pocillopora, which are all fast-growing species initially tied onto reef stars. The question was whether the MARS program achieved its astounding four-year reef recovery time by sacrificing biodiversity and specifically choosing corals that grow faster.

Some states are now trying to ban lab-grown meat

A franken-burger and a side of fries —

Spurious “war on ranching” cited as reason for legislation.

Cell-cultivated chicken is made in the pictured tanks at the Eat Just office on July 27, 2023, in Alameda, Calif.

Justin Sullivan/Getty Images

Months in jail and thousands of dollars in fines and legal fees—those are the consequences Alabamians and Arizonans could soon face for selling cell-cultured meat products that could cut into the profits of ranchers, farmers, and meatpackers in each state.

State legislators from Florida to Arizona are seeking to ban meat grown from animal cells in labs, citing a “war on our ranching” and a need to protect the agriculture industry from efforts to reduce the consumption of animal protein, efforts aimed at cutting the sector’s high volume of climate-warming methane emissions.

Agriculture accounts for about 11 percent of the country’s greenhouse gas emissions, according to federal data, with livestock such as cattle making up a quarter of those emissions, predominantly from their burps, which release methane—a potent greenhouse gas that’s roughly 80 times more effective at warming the atmosphere than carbon dioxide over 20 years. Globally, agriculture accounts for about 37 percent of methane emissions.

For years, climate activists have been calling for more scrutiny and regulation of emissions from the agricultural sector and for nations to reduce their consumption of meat and dairy products due to their climate impacts. Last year, over 150 countries pledged to voluntarily cut emissions from food and agriculture at the United Nations’ annual climate summit.

But the industry has avoided increased regulation and pushed back against efforts to decrease the consumption of meat, with help from local and state governments across the US.

Bills in Alabama, Arizona, Florida, and Tennessee are just the latest legislation passed in statehouses across the US that have targeted cell-cultured meat, which is produced by taking a sample of an animal’s muscle cells and growing them into edible products in a lab. Sixteen states—Alabama, Arkansas, Georgia, Kansas, Kentucky, Louisiana, Maine, Mississippi, Missouri, Montana, North Dakota, Oklahoma, South Carolina, South Dakota, Texas, and Wyoming—have passed laws addressing the use of the word “meat” in such products’ packaging, according to the National Agricultural Law Center at the University of Arkansas, with some prohibiting cell-cultured, plant-based, or insect-based food products from being labeled as meat.

“Cell-cultured meat products are so new that there’s not really a framework for how state and federal labeling will work together,” said Rusty Rumley, a senior staff attorney with the National Agricultural Law Center, resulting in no standardized requirements for how to label the products, though legislation has been proposed that could change that.

At the federal level, Rep. Mark Alford (R-Mo.) introduced the Fair and Accurate Ingredient Representation on Labels Act of 2024, which would authorize the United States Department of Agriculture to regulate imitation meat products and restrict their sale if they are not properly labeled, and US Sens. Jon Tester (D-Mont.) and Mike Rounds (R-S.D.) introduced a bill to ban schools from serving cell-cultured meat.

But while plant-based meat substitutes are widespread, cell-cultivated meats are not widely available, with none currently being sold in stores. Just last summer, federal agencies gave their first-ever approvals to two companies making cell-cultivated poultry products, which are appearing on restaurant menus. The meat substitutes have garnered the support of some significant investors, including billionaire Bill Gates, who has been the subject of attacks from supporters of some of the state legislation proposed.

“Let me start off by explaining why I drafted this bill,” said Rep. David Marshall, an Arizona Republican who proposed legislation to ban cell-cultured meat from being sold or produced in the state, during a hearing on the bill. “It’s because of organizations like the FDA and the World Economic Forum, also Bill Gates and others, who have openly declared war on our ranching.”

In Alabama, fear of “franken-meat” competition spurs legislation

In Alabama, an effort to ban lab-grown meat is winding its way through the State House in Montgomery.

There, state senators have already passed a bill that would make it a misdemeanor, punishable by up to three months in jail and a $500 fine, to sell, manufacture, or distribute what the proposed legislation labels “cultivated food products.” An earlier version of the bill called lab-grown protein “meat,” but it was quickly revised by lawmakers. The bill passed out of committee and through the Senate without opposition from any of its members.

Now, the bill is headed toward a vote in the Alabama House of Representatives, where the body’s health committee recently held a public hearing on the issue. Rep. Danny Crawford, who is carrying the bill in the body, told fellow lawmakers during that hearing that he’s concerned about two issues: health risks and competition for Alabama farmers.

“Lab-grown meat or whatever you want to call it—we’re not sure of all of the long-term problems with that,” he said. “And it does compete with our farming industry.”

Crawford said that legislators had heard from NASA, which expressed concern about the bill’s impact on programs to develop alternative proteins for astronauts. An amendment to the bill will address that problem, Crawford said, allowing an exemption for research purposes.

Daily Telescope: Gigantic new stars stir up a nebula

It’s full of red —

Astronomers know of no other region so packed with large stars as this nebula.

Behold, the star-forming region of NGC 604.

NASA, ESA, CSA, STScI

Welcome to the Daily Telescope. There is a little too much darkness in this world and not enough light, a little too much pseudoscience and not enough science. We’ll let other publications offer you a daily horoscope. At Ars Technica, we’re going to take a different route, finding inspiration from very real images of a universe that is filled with stars and wonder.

Good morning. It’s March 12, and today’s photo comes from the James Webb Space Telescope.

Astronomers have long been fascinated by a nebula, NGC 604, in the relatively nearby Triangulum Galaxy. That’s because this nebula contains about 200 of the hottest and largest types of stars, most of which are in the early stages of their lives. Some of these stars are 100 times or more massive than the Sun. Astronomers know of no other region in the Universe so densely packed with large stars as this nebula.

In this image, captured by the Near-Infrared Camera on the Webb telescope, there are brilliant reds and oranges. Here’s the explanation from astronomers for these colors:

The most noticeable features are tendrils and clumps of emission that appear bright red, extending out from areas that look like clearings, or large bubbles in the nebula. Stellar winds from the brightest and hottest young stars have carved out these cavities, while ultraviolet radiation ionizes the surrounding gas. This ionized hydrogen appears as a white and blue ghostly glow. The bright orange streaks in the Webb near-infrared image signify the presence of carbon-based molecules known as polycyclic aromatic hydrocarbons.

The nebula is only about 3.5 million years old.

Source: NASA, ESA, CSA, STScI

Do you want to submit a photo for the Daily Telescope? Reach out and say hello.

Study: Conflicting values for Hubble Constant not due to measurement error

A long-standing tension —

Something else is influencing the expansion rate of the Universe.

This image of NGC 5468, a galaxy located about 130 million light-years from Earth, combines data from the Hubble and James Webb space telescopes.

NASA/ESA/CSA/STScI/A. Riess (JHU)

Astronomers have made new measurements of the Hubble Constant, a measure of how quickly the Universe is expanding, by combining data from the Hubble Space Telescope and the James Webb Space Telescope. Their results confirmed the accuracy of Hubble’s earlier measurement of the Constant’s value, according to their recent paper published in The Astrophysical Journal Letters, with implications for a long-standing discrepancy in values obtained by different observational methods known as the “Hubble tension.”

There was a time when scientists believed the Universe was static, but that changed with Albert Einstein’s general theory of relativity. Alexander Friedmann published a set of equations in 1922 showing that the Universe might actually be expanding, with Georges Lemaitre later making an independent derivation to arrive at that same conclusion. Edwin Hubble confirmed this expansion with observational data in 1929. Prior to this, Einstein had been trying to modify general relativity by adding a cosmological constant in order to get a static universe from his theory; after Hubble’s discovery, legend has it, he referred to that effort as his biggest blunder.

As previously reported, the Hubble Constant is a measure of the Universe’s expansion expressed in units of kilometers per second per megaparsec. So, each second, every megaparsec of the Universe expands by a certain number of kilometers. Another way to think of this is in terms of a relatively stationary object a megaparsec away: Each second, it gets a number of kilometers more distant.

How many kilometers? That’s the problem here. There are basically three methods scientists use to measure the Hubble Constant: looking at nearby objects to see how fast they are moving, gravitational waves produced by colliding black holes or neutron stars, and measuring tiny deviations in the afterglow of the Big Bang known as the Cosmic Microwave Background (CMB). However, the various methods have come up with different values. For instance, tracking distant supernovae produced a value of 73 km/s/Mpc, while measurements of the CMB using the Planck satellite produced a value of 67 km/s/Mpc.
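
To make those units concrete, here is a small back-of-the-envelope sketch (not taken from the paper; the 100-megaparsec example distance is an arbitrary choice) of what the two competing values imply for the recession velocity of a distant galaxy:

```python
# Rough illustration of the Hubble tension using Hubble's law, v = H0 * D.
# The two H0 values come from the article; the 100 Mpc distance is an
# arbitrary example, not a figure from the new paper.

H0_SUPERNOVAE = 73.0  # km/s per megaparsec (distant supernovae)
H0_CMB = 67.0         # km/s per megaparsec (Planck CMB measurements)

distance_mpc = 100.0  # a hypothetical galaxy 100 megaparsecs away

v_supernovae = H0_SUPERNOVAE * distance_mpc  # 7,300 km/s
v_cmb = H0_CMB * distance_mpc                # 6,700 km/s

discrepancy = (v_supernovae - v_cmb) / v_cmb
print(f"Supernova calibration: {v_supernovae:,.0f} km/s")
print(f"CMB calibration:       {v_cmb:,.0f} km/s")
print(f"Fractional difference: {discrepancy:.1%}")  # roughly 9 percent
```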

Just last year, researchers made a third independent measure of the Universe’s expansion by tracking the behavior of a gravitationally lensed supernova, where the distortion in space-time caused by a massive object acts as a lens to magnify an object in the background. The best fits of those models all ended up slightly below the value of the Hubble Constant derived from the CMB, with the difference being within the statistical error. Values closer to those derived from measurements of other supernovae were a considerably worse fit for the data. The method is new, with considerable uncertainties, but it did provide an independent means of getting at the Hubble Constant.

Comparison of Hubble and Webb views of a Cepheid variable star.

NASA/ESA/CSA/STScI/A. Riess (JHU)

“We’ve measured it using information in the cosmic microwave background and gotten one value,” Ars Science Editor John Timmer wrote. “And we’ve measured it using the apparent distance to objects in the present-day Universe and gotten a value that differs by about 10 percent. As far as anyone can tell, there’s nothing wrong with either measurement, and there’s no obvious way to get them to agree.” One hypothesis is that the early Universe briefly experienced some kind of “kick” from repulsive gravity (akin to the notion of dark energy) that then mysteriously turned off and vanished. But it remains a speculative idea, albeit a potentially exciting one for physicists.

This latest measurement builds on last year’s confirmation based on Webb data that Hubble’s measurements of the expansion rate were accurate, at least for the first few “rungs” of the “cosmic distance ladder.” But there was still the possibility of as-yet-undetected errors that might increase the deeper (and hence further back in time) one looked into the Universe, particularly for brightness measurements of more distant stars.

So a new team made additional observations of Cepheid variable stars—a total of 1,000 in five host galaxies as far out as 130 million light-years—and correlated them with the Hubble data. The Webb telescope is able to see past the interstellar dust that has made Hubble’s own images of those stars more blurry and overlapping, so astronomers could more easily distinguish between individual stars.

The results further confirmed the accuracy of the Hubble data. “We’ve now spanned the whole range of what Hubble observed, and we can rule out a measurement error as the cause of the Hubble Tension with very high confidence,” said co-author and team leader Adam Riess, a physicist at Johns Hopkins University. “Combining Webb and Hubble gives us the best of both worlds. We find that the Hubble measurements remain reliable as we climb farther along the cosmic distance ladder. With measurement errors negated, what remains is the real and exciting possibility that we have misunderstood the Universe.”

The Astrophysical Journal Letters, 2024. DOI: 10.3847/2041-8213/ad1ddd  (About DOIs).

After coming back from the dead, the world’s largest aircraft just flew a real payload

Roc-n-roll —

Falling just short of hypersonic velocity.

The world’s largest aircraft takes off with the Talon A vehicle on Saturday.

Stratolaunch/Matt Hartman

Built and flown by Stratolaunch, the massive Roc aircraft took off from Mojave Air and Space Port in California on Saturday. The airplane flew out over the Pacific Ocean, where it deployed the Talon-A vehicle, which looks something like a mini space shuttle.

This marked the first time this gargantuan airplane released an honest-to-goodness payload: the first Talon-A vehicle, TA-1, which is intended to fly at hypersonic speed. During the flight, TA-1 didn’t quite reach hypersonic velocity, which begins at Mach 5, or five times the speed of sound.

“While I can’t share the specific altitude and speed TA-1 reached due to proprietary agreements with our customers, we are pleased to share that in addition to meeting all primary and customer objectives of the flight, we reached high supersonic speeds approaching Mach 5 and collected a great amount of data at an incredible value to our customers,” said Zachary Krevor, chief executive of Stratolaunch, in a statement.

In essence, the TA-1 vehicle is a pathfinder for subsequent versions of the vehicle that will be both reusable and capable of reaching hypersonic speeds. The flight of the company’s next vehicle, TA-2, could come later this year, Krevor said.

A long road

It has been a long, strange road for Stratolaunch to reach this moment. The company was founded in 2011 to build a super-sized carrier aircraft from which rockets would be launched mid-air. It was bankrolled by Microsoft cofounder and airplane enthusiast Paul Allen, who put at least hundreds of millions of dollars into the private project.

As the design of the vehicle evolved, its wingspan grew to 117 meters, nearly double that of a Boeing 747. It far exceeded the wingspan of the Spruce Goose, built by Howard Hughes in the 1940s, which measured 97.5 meters. The Roc aircraft was so large that it seemed impractical to fly on a regular basis.

At the same time, the company was struggling to identify a rocket that could be deployed from the aircraft. At various times, Stratolaunch worked with SpaceX and Orbital ATK to develop a launch vehicle. But both of those partnerships fell through, and eventually, the company said it would develop its own line of rockets.

Allen would never see his large plane fly, dying of septic shock in October 2018 due to his non-Hodgkin lymphoma. Roc did finally take flight for the first time in April 2019, but it seemed like a Pyrrhic victory. Following the death of Allen, for whom Stratolaunch was a passion project, the company’s financial future was in doubt. Later in 2019, Allen’s family put the company’s assets up for sale and said it would cease to exist.

However, Stratolaunch did not die. Rather, the aircraft was acquired by the private equity firm Cerberus, and in 2020, the revitalized Stratolaunch changed course. Instead of orbital rockets, it would now launch hypersonic vehicles to test the technology—a priority for the US military. China, Russia, and the United States are all racing to develop hypersonic missiles, as well as new countermeasure technology, since high-speed missiles threaten to penetrate most existing defenses.

Featuring a new engine

This weekend’s flight also marked an important moment for another US aerospace company, Ursa Major Technologies. The TA-1 vehicle was powered by the Hadley rocket engine designed and built by Ursa Major, which specializes in the development of rocket propulsion engines.

Hadley is a 5,000-pound-thrust, oxygen-rich staged-combustion rocket engine that burns liquid oxygen and kerosene and is designed for small vehicles. Its known customers include Stratolaunch and a vertical launch company, Phantom Space, which is developing a small orbital rocket.

Founded in 2015, Ursa Major seeks to provide off-the-shelf propulsion solutions to launch customers. While Ursa Major started small, the company is already well into the development of its much larger Ripley engine. With 50,000 pounds of thrust, Ripley is aimed at the medium-launch market. The company completed a hot-fire test campaign of Ripley last year. For Ursa Major, it must feel pretty good to finally see an engine in flight.

Shields up: New ideas might make active shielding viable

Aurich Lawson | Getty Images | NASA

On October 19, 1989, at 12:29 UT, a monstrous X13-class solar flare triggered a geomagnetic storm so strong that auroras lit up the skies in Japan, America, Australia, and even Germany the following day. Had you been flying around the Moon at that time, you would have absorbed well over 6 sieverts of radiation—a dose that would most likely kill you within a month or so.

This is why the Orion spacecraft that is supposed to take humans on a Moon fly-by mission this year has a heavily shielded storm shelter for the crew. But shelters like that aren’t sufficient for a flight to Mars—Orion’s shield is designed for a 30-day mission.

To obtain protection comparable to what we enjoy on Earth would require hundreds of tons of material, and that’s simply not possible in orbit. The primary alternative—using active shields that deflect charged particles just like the Earth’s magnetic field does—was first proposed in the 1960s. Today, we’re finally close to making it work.

Deep-space radiation

Space radiation comes in two different flavors. Solar events like flares or coronal mass ejections can cause very high fluxes of charged particles (mostly protons). They’re nasty when you have no shelter but are relatively easy to shield against since solar protons are mostly low energy. The majority of the flux in solar particle events falls between 30 mega-electronvolts (MeV) and 100 MeV and could be stopped by Orion-like shelters.

Then there are galactic cosmic rays (GCRs): particles coming from outside the Solar System, set in motion by faraway supernovas or neutron stars. These are relatively rare but are coming at you all the time from all directions. They also have high energies, starting at 200 MeV and going up to several GeV, which makes them extremely penetrating. Thick masses don’t provide much shielding against them, and when high-energy cosmic ray particles hit thin shields, they produce many lower-energy particles—you’d be better off with no shield at all.

The particles with energies between 70 MeV and 500 MeV are responsible for 95 percent of the radiation dose that astronauts get in space. On short flights, solar storms are the main concern because they can be quite violent and do lots of damage very quickly. The longer you fly, though, the more GCRs become an issue, because their dose accumulates over time and they can go through pretty much everything we try to put in their way.

What keeps us safe at home

The reason nearly none of this radiation can reach us is that Earth has a natural, multi-stage shielding system. It begins with its magnetic field, which deflects most of the incoming particles toward the poles. A charged particle in a magnetic field follows a curve—the stronger the field, the tighter the curve. Earth’s magnetic field is very weak and barely bends incoming particles, but it is huge, extending thousands of kilometers into space.
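
To get a feel for how gentle that bending is, you can plug rough numbers into the standard gyroradius formula, r = p/(qB). The particle energy and field strength below are illustrative assumptions, not figures from the article:

```python
import math

# Gyroradius r = p / (qB) for a charged particle moving across a magnetic
# field. Assumed inputs: a cosmic-ray-like proton with 1 GeV of kinetic
# energy and a field of ~30 microtesla (roughly Earth's surface field).

E_KINETIC_MEV = 1000.0     # proton kinetic energy, MeV (assumption)
PROTON_REST_MEV = 938.272  # proton rest-mass energy, MeV
B_FIELD_T = 30e-6          # magnetic field strength, tesla (assumption)

# Relativistic momentum from E_total^2 = (pc)^2 + (mc^2)^2
e_total = E_KINETIC_MEV + PROTON_REST_MEV
p_mev_per_c = math.sqrt(e_total**2 - PROTON_REST_MEV**2)

p_si = p_mev_per_c * 5.344286e-22  # 1 MeV/c expressed in kg*m/s
q = 1.602176634e-19                # proton charge, coulombs

r = p_si / (q * B_FIELD_T)
print(f"Gyroradius: {r / 1000:.0f} km")  # on the order of a couple hundred km
```

A curve a couple hundred kilometers across barely counts as a bend on any single encounter, but because the field extends for thousands of kilometers, the cumulative deflection is enough to steer most incoming particles toward the poles.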

Anything that makes it through the magnetic field runs into the atmosphere, which, when it comes to shielding, is the equivalent of an aluminum wall that’s 3 meters thick. Finally, there is the planet itself, which essentially cuts the radiation in half since you always have 6.5 billion trillion tons of rock shielding you from the bottom.

To put that in perspective, the Apollo crew module had on average 5 grams of mass per square centimeter standing between the crew and radiation. A typical ISS module has twice that, about 10 g/cm2. The Orion shelter has 35–45 g/cm2, depending on where you sit exactly, and it weighs 36 tons. On Earth, the atmosphere alone gives you 810 g/cm2—roughly 20 times more than our best shielded spaceships.
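
Those areal-density comparisons are easy to sanity-check by multiplying a wall’s thickness by its material density. A minimal sketch (the aluminum density of roughly 2.7 g/cm³ is an assumed textbook value; the spacecraft figures are the ones quoted above):

```python
# Areal (mass) density = thickness x density, in grams per square centimeter.
# Aluminum density is an assumed standard value; the 3-meter wall equivalence
# and the spacecraft shielding numbers are taken from the article.

AL_DENSITY_G_CM3 = 2.7
atmosphere_equiv = 300 * AL_DENSITY_G_CM3  # 3 m of aluminum -> 810 g/cm^2

shields_g_cm2 = {
    "Apollo crew module": 5,
    "Typical ISS module": 10,
    "Orion storm shelter": 40,  # midpoint of the 35-45 g/cm^2 range
    "Earth's atmosphere": atmosphere_equiv,
}

for name, areal in shields_g_cm2.items():
    print(f"{name:>20}: {areal:5.0f} g/cm^2 "
          f"(atmosphere is ~{atmosphere_equiv / areal:.0f}x this)")
```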

The two options are to add more mass—which gets expensive quickly—or to shorten the length of the mission, which isn’t always possible. So solving radiation with passive mass won’t cut it for longer missions, even using the best shielding materials like polyethylene or water. This is why making a miniaturized, portable version of the Earth’s magnetic field was on the table from the first days of space exploration. Unfortunately, we discovered it was far easier said than done.

These scientists built their own Stone Age tools to figure out how they were used

hands-on experiments —

Telltale fractures and microscopic wear marks should be applicable to real artifacts.

Testing replica Stone Age tools with a bit of wood-scraping.

A. Iwase et al., 2024/Tokyo Metropolitan University

When Japanese scientists wanted to learn more about how ground stone tools dating back to the Early Upper Paleolithic might have been used, they decided to build their own replicas of adzes, axes, and chisels and use those tools to perform tasks that might have been typical for that era. The resulting fractures and wear enabled them to develop new criteria for identifying the likely functions of ancient tools, according to a recent paper published in the Journal of Archaeological Science. If these kinds of traces were indeed found on genuine Stone Age tools, it would be evidence that humans had been working with wood and honing those techniques significantly earlier than previously believed.

The development of tools and techniques for woodworking started out simple, with the manufacture of cruder implements like the spears and throwing sticks common in the early Stone Age. Later artifacts dating to the Mesolithic and Neolithic periods were more sophisticated, as people learned how to use polished stone tools to make canoes, bows, and wells, and to build houses. Researchers typically date the emergence of those polished stone tools to about 10,000 years ago. However, archaeologists have found plenty of stone artifacts with ground edges dating as far back as 60,000 to 30,000 years ago, and it’s unclear how those older tools might have been used.

So Akira Iwase of Tokyo Metropolitan University and co-authors made their own replicas of adzes and axes out of three raw materials common to the region between 38,000 and 30,000 years ago: semi-nephrite rocks, hornfels rocks, and tuff rocks. They used a stone hammer and anvil to create various long oval shapes and polished the edges with either a coarse-grained sandstone or a medium-grained tuff. There were three types of replica tools: adze-types, with the working edge oriented perpendicular to the long axis of a bent handle; axe-types, with a working edge parallel to the bent handle’s long axis; and chisel-types, in which a stone tool was placed at the end of a straight handle.

Testing various replicas of Stone Age tools for different uses: A, tree-felling; B, wood-adzing; C, wood-scraping; D, fresh bone-adzing; E, dry hide-scraping; F, disarticulation of a joint.

A. Iwase et al., 2024/Tokyo Metropolitan University

Then it was time to test the replica tools via ten different usage experiments. For instance, the authors used axe-type tools to fell Japanese cedar and maple trees in north central Honshu, as well as a forest near Tokyo Metropolitan University. Axe-type and adze-type tools were used to make a dugout canoe and wooden spears, while adze-type tools and chisel-type tools were used to scrape off the bark of fig and pine. They scraped flesh and grease from fresh and dry hides of deer and boar using adze-type and chisel-type tools. Finally, they used adze-type tools to disarticulate the femur and tibia joints of deer hindlimbs.

The team also conducted several experiments in which the tools were not used at all, in order to identify accidental fractures unrelated to any tool-use function. For instance, flakes and blades can break in half during flint knapping; transporting tools in, say, small leather bags can cause microscopic flaking; and trampling on tools left on the ground can also modify the edges. All these scenarios were tested. All the tools from both the use and non-use experiments were then examined for macroscopic and microscopic traces of fracture or wear.

Traces left by tree-felling experiments on replica Stone Age tools. Characteristic macroscopic (top) and microscopic (bottom) traces might be used to determine how stone edges were used.

Tokyo Metropolitan University

The results: they were able to identify nine different types of macroscopic fractures, several of which were only seen when making percussive motions, particularly in the case of felling trees. There were also telltale microscopic traces resulting from friction between the wood and stone edge. Cutting away at antlers and bones caused a lot of damage to the edges of adze-like tools, creating long and/or wide bending fractures. The tools used for limb disarticulation caused fairly large bending fractures and smaller flaking scars, while only nine out of 21 of the scraping tools showed macroscopic signs of wear, despite hundreds of repeated strokes.

The authors concluded that examining macroscopic fracture patterns alone is insufficient to determine whether a given stone tool had been used percussively. Nor is any resulting micropolish from abrasion an unambiguous indicator on its own, since scraping motions produce a similar micropolish. Combining the two, however, did yield more reliable conclusions about which tools had been used percussively to fell trees, as opposed to other uses such as the disarticulation of bones.

Journal of Archaeological Science, 2024. DOI: 10.1016/j.jas.2023.105891  (About DOIs).

A hunk of junk from the International Space Station hurtles back to Earth

In March 2021, the International Space Station’s robotic arm released a cargo pallet with nine expended batteries.

NASA

A bundle of depleted batteries from the International Space Station careened around Earth for almost three years before falling out of orbit and plunging back into the atmosphere Friday. Most of the trash likely burned up during reentry, but it’s possible some fragments may have reached Earth’s surface intact.

Larger pieces of space junk regularly fall to Earth on unguided trajectories, but they’re usually derelict satellites or spent rocket stages. This time, the object was a pallet of batteries from the space station with a mass of more than 2.6 metric tons (5,800 pounds). NASA intentionally sent the space junk on a path toward an unguided reentry.

Naturally self-cleaning

Sandra Jones, a NASA spokesperson, said the agency “conducted a thorough debris analysis assessment on the pallet and has determined it will harmlessly reenter the Earth’s atmosphere.” This was, by far, the most massive object ever tossed overboard from the International Space Station.

The batteries reentered the atmosphere at 2:29 pm EST (1929 UTC), according to US Space Command. At that time, the pallet would have been flying between Mexico and Cuba. “We do not expect any portion to have survived reentry,” Jones told Ars.

The European Space Agency (ESA) also monitored the trajectory of the battery pallet. In a statement this week, the ESA said the risk of a person being hit by a piece of the pallet was “very low” but said “some parts may reach the ground.” Jonathan McDowell, an astrophysicist who closely tracks spaceflight activity, estimated about 500 kilograms (1,100 pounds) of debris would hit the Earth’s surface.

“The general rule of thumb is that 20 to 40 percent of the mass of a large object will reach the ground, though it depends on the design of the object,” the Aerospace Corporation says.

A dead ESA satellite reentered the atmosphere in a similarly uncontrolled manner on February 21. At 2.3 metric tons, this satellite was similar in mass to the discarded battery pallet. ESA, which has positioned itself as a global leader in space sustainability, set up a website that provided daily tracking updates on the satellite’s deteriorating orbit.

This map shows the track of the unguided cargo pallet around the Earth over the course of six hours Friday. It reentered the atmosphere near Cuba on a southwest-to-northeast heading.

As NASA and ESA officials have said, the risk of injury or death from a spacecraft reentry is quite low. Falling space debris has never killed anyone. According to ESA, the risk of a person getting hit by a piece of space junk is about 65,000 times lower than the risk of being struck by lightning.

This circumstance is unique in the type and origin of the space debris, which is why NASA purposely cast it away on an uncontrolled trajectory back to Earth.

The space station’s robotic arm released the battery cargo pallet on March 11, 2021. Since then, the batteries have been adrift in orbit, circling the planet about every 90 minutes. Over a span of months and years, low-Earth orbit is self-cleaning thanks to the influence of aerodynamic drag. The resistance of rarefied air molecules in low-Earth orbit gradually slowed the pallet’s velocity until, finally, gravity pulled it back into the atmosphere Friday.

The cargo pallet, which launched inside a Japanese HTV cargo ship in 2020, carried six new lithium-ion batteries to the International Space Station. The station’s two-armed Dextre robot, assisted by astronauts on spacewalks, swapped out aging nickel-hydrogen batteries for the upgraded units. Nine of the old batteries were installed on the HTV cargo pallet before its release from the station’s robotic arm.

Study finds that we could lose science if publishers go bankrupt

Need backups —

A scan of archives shows that lots of scientific papers aren’t backed up.

A set of library shelves with lots of volumes stacked on them.

Back when scientific publications came in paper form, libraries played a key role in ensuring that knowledge didn’t disappear. Copies went out to so many libraries that any failure—a publisher going bankrupt, a library getting closed—wouldn’t put us at risk of losing information. But, as with anything else, scientific content has gone digital, which has changed what’s involved with preservation.

Organizations have devised systems that should provide options for preserving digital material. But, according to a recently published survey, lots of digital documents aren’t consistently showing up in the archives that are meant to preserve them. And that puts us at risk of losing academic research—including science paid for with taxpayer money.

Tracking down references

The work was done by Martin Eve, a developer at Crossref. That’s the organization that runs the DOI system, which provides a permanent pointer to digital documents, including almost every scientific publication. If updates are done properly, a DOI will always resolve to a document, even if that document gets shifted to a new URL.

But the DOI system also has a way of handling documents that disappear from their expected location, as might happen if a publisher went bankrupt. There is a set of what are called “dark archives” that the public doesn’t have access to but that should contain copies of anything that’s had a DOI assigned. If anything goes wrong with a DOI, that should trigger the dark archives to open access, and the DOI should be updated to point to the copy in the dark archive.

For that to work, however, copies of everything published have to be in the archives. So Eve decided to check whether that’s the case.

Using the Crossref database, Eve got a list of over 7 million DOIs and then checked whether the documents could be found in archives. He included well-known ones, like the Internet Archive at archive.org, as well as some dedicated to academic works, like LOCKSS (Lots of Copies Keeps Stuff Safe) and CLOCKSS (Controlled Lots of Copies Keeps Stuff Safe).
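
Eve’s full pipeline isn’t reproduced here, but the basic idea (take a DOI, resolve its metadata, then ask an archive whether it holds a copy) can be sketched with public APIs. The snippet below, which uses the Crossref REST API and the Internet Archive’s Wayback availability endpoint, is only an illustration of that idea; it is not the study’s methodology, and coverage checks against dedicated archives like LOCKSS and CLOCKSS work differently:

```python
import requests  # third-party HTTP library (pip install requests)

def crossref_metadata(doi: str) -> dict:
    """Fetch a work's metadata record from the public Crossref REST API."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]

def in_wayback(url: str) -> bool:
    """Ask the Internet Archive whether it holds any snapshot of a URL.

    This is only a crude proxy for 'is this paper preserved?', since a
    snapshot of a landing page is not the same as an archived full text.
    """
    resp = requests.get(
        "https://archive.org/wayback/available", params={"url": url}, timeout=30
    )
    resp.raise_for_status()
    return bool(resp.json().get("archived_snapshots"))

if __name__ == "__main__":
    doi = "10.31274/jlsc.16288"  # Eve's own paper, used here as a test case
    meta = crossref_metadata(doi)
    landing_url = meta.get("URL", f"https://doi.org/{doi}")
    print(meta.get("title", ["(no title)"])[0])
    print("Snapshot in the Wayback Machine:", in_wayback(landing_url))
```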

Not well-preserved

The results were… not great.

When Eve broke down the results by publisher, less than 1 percent of the 204 publishers had put the majority of their content into multiple archives. (The cutoff was 75 percent of their content in three or more archives.) Fewer than 10 percent had put more than half their content in at least two archives. And a full third seemed to be doing no organized archiving at all.

At the individual publication level, under 60 percent were present in at least one archive, and over a quarter didn’t appear to be in any of the archives at all. (Another 14 percent were published too recently to have been archived or had incomplete records.)

The good news is that large academic publishers appear to be reasonably good about getting things into archives; most of the unarchived issues stem from smaller publishers.

Eve acknowledges that the study has limits, primarily in that there may be additional archives he hasn’t checked. There are some prominent dark archives that he didn’t have access to, as well as things like Sci-hub, which violates copyright in order to make material from for-profit publishers available to the public. Finally, individual publishers may have their own archiving system in place that could keep publications from disappearing.

Should we be worried?

The risk here is that, ultimately, we may lose access to some academic research. As Eve phrases it, knowledge gets expanded because we’re able to build upon a foundation of facts that we can trace back through a chain of references. If we start losing those links, then the foundation gets shakier. Archiving comes with its own set of challenges: It costs money, it has to be organized, consistent means of accessing the archived material need to be established, and so on.

But, to an extent, we’re failing at the first step. “An important point to make,” Eve writes, “is that there is no consensus over who should be responsible for archiving scholarship in the digital age.”

A somewhat related issue is ensuring that people can find the archived material—the issue that DOIs were designed to solve. In many cases, the authors of a manuscript place copies in repositories like arXiv or bioRxiv, or the NIH’s PubMed Central (this sort of archiving is increasingly being made a requirement by funding bodies). The problem here is that the archived copies may not include the DOI that’s meant to ensure they can be located. That doesn’t mean they can’t be identified through other means, but it definitely makes finding the right document much more difficult.

Put differently, if you can’t find a paper or can’t be certain you’re looking at the right version of it, it can be just as bad as not having a copy of the paper at all.

None of this is to say that we’ve already lost important research documents. But Eve’s paper serves a valuable function by highlighting that the risk is real. We’re well into the era where print copies of journals are irrelevant to most academics, and digital-only academic journals have proliferated. It’s long past time for us to have clear standards in place to ensure that digital versions of research have the endurance that print works have enjoyed.

Journal of Librarianship and Scholarly Communication, 2024. DOI: 10.31274/jlsc.16288  (About DOIs).

Thousands of US kids are overdosing on melatonin gummies, ER study finds

treats or treatment? —

In the majority of cases, the excessive amounts led to only minimal side effects.

In this photo illustration, melatonin gummies are displayed on April 26, 2023, in Miami, Florida.

Federal regulators have long decried drug-containing products that appeal to kids—like nicotine-containing e-cigarette products with fruity and dessert-themed flavors or edible cannabis products sold to look exactly like name-brand candies.

But a less-expected candy-like product is sending thousands of kids to emergency departments in the US in recent years: melatonin, particularly in gummy form. According to a new report from researchers at the Centers for Disease Control and Prevention, use of the over-the-counter sleep-aid supplement has skyrocketed in recent years—and so have calls to poison control centers and visits to emergency departments.

Melatonin, a neurohormone that regulates the sleep-wake cycle, has become very popular for self-managing conditions like sleep disorders and jet lag—even in children. Use of melatonin in adults rose from 0.4 percent in 1999–2000 to 2.1 percent in 2017–2018. But the more people have these tempting, often candy-like supplements in their homes, the greater the risk that children will get ahold of them unsupervised. Indeed, the rise in use led to a 530 percent increase in poison control center calls and a 420 percent increase in emergency department visits for accidental melatonin ingestion in infants and kids between 2009 and 2020.

And the problem is ongoing. In the new study, researchers estimate that between 2019 and 2022, nearly 11,000 kids went to the emergency department after accidentally gulping down melatonin supplements. Nearly all the cases involved a solid form of melatonin, with about 47 percent identified specifically as gummies and 49 percent listed as an unspecified solid form likely to include gummies. These melatonin-related emergency visits made up 7 percent of all emergency visits by infants and kids who ingested medications unsupervised.

The candy-like appeal of melatonin products seems evident in the ages of kids rushed to emergency departments. Most emergency department visits for unsupervised medicine exposures are in infants and toddlers ages 1 to 2, but for melatonin-related visits, half were ages 3 to 5. The researchers noted that among the emergency visits with documentation, about three-quarters of the melatonin products involved came out of bottles, suggesting that the young kids managed to open the bottles themselves or that the bottles weren’t properly closed. Manufacturers are not required to use child-resistant packaging on melatonin supplements.

Luckily, most of the cases had mild or no effects. Still, about 6.5 percent of the cases—a little over 700 children—were hospitalized after their melatonin binge. A 2022 study led by researchers in Michigan found that among poison control center calls for children consuming melatonin, the reported symptoms involved the gastrointestinal, cardiovascular, or central nervous systems. For children who are given supervised doses of melatonin to improve sleep, known side effects include drowsiness, increased bedwetting or urination in the evening, headache, dizziness, and agitation.

According to the National Center for Complementary and Integrative Health—part of the National Institutes of Health—supervised use of melatonin in children appears to be safe for short-term use. But there’s simply not much data on use in children, and the long-term effects of regular use or acute exposures are unknown. The NCCIH cautions: “Because melatonin is a hormone, it’s possible that melatonin supplements could affect hormonal development, including puberty, menstrual cycles, and overproduction of the hormone prolactin, but we don’t know for sure.”

For now, the authors of the new study say the data “highlights the continued need to educate parents and other caregivers about the importance of keeping all medications and supplements (including gummies) out of children’s reach and sight.”

Rocket Report: Starbase will expand into state park; another Japanese rocket

43 for 477 —

“Those launches are exciting the young minds that are watching them.”

This satellite view of SpaceX’s Starbase facility shows a fully-stacked Starship rocket on the launch pad, just inland from the Gulf of Mexico.

Welcome to Edition 6.34 of the Rocket Report! It’s Starship season again. Yes, SpaceX appears to be about a week away from launching the third full-scale Starship test flight from the company’s Starbase site in South Texas, pending final regulatory approval from the Federal Aviation Administration. Ars will be there. SpaceX plans to build a second Starship launch pad at Starbase, and the company’s footprint there is also about to get a little bigger, with the expected acquisition of 43 acres of Texas state park land.

As always, we welcome reader submissions, and if you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets, as well as a quick look ahead at the next three launches on the calendar.

Astra’s founders take the company private. Astra’s three-year run as a public company is over. Chris Kemp and Adam London, Astra’s co-founders, are taking the company private after a string of rocket failures and funding shortfalls, Ars reports. Kemp and London bought the company for 50 cents a share. Astra’s board approved the transaction, the company announced Thursday, as the only alternative to bankruptcy. Kemp and London founded Astra in 2016. After emerging from stealth mode in 2020, Astra launched its light-class launcher, called Rocket 3, seven times, but five of those flights were failures. Astra went public via a special purpose acquisition company (or SPAC) in 2021, reaching a valuation of more than $2 billion. Today, its market cap sits at approximately $13 million.

What’s next for Astra? … Where Astra goes from here is anyone’s guess. The company abandoned its unreliable Rocket 3 vehicle in 2022 to focus on the larger Rocket 4. But Rocket 4 is likely months or years from the launch pad, and it faces stiff competition not just from established small launch players such as Rocket Lab and Firefly but also from new entrants, including ABL Space and Stoke Space. Additionally, all of these small launch companies have been undercut on price by SpaceX’s Transporter missions, which launch dozens of satellites at a time on the Falcon 9 booster. And Astra’s spacecraft engine business—acquired previously from Apollo Fusion—may or may not be profitable now, but there are questions about its long-term viability as well.

Virgin Galactic is retiring its only operational spaceship. Over the last year, Virgin Galactic has proven it has the technical acumen to pull off monthly flights of its VSS Unity rocket plane, each carrying six people on a suborbital climb to the edge of space. But VSS Unity has never been profitable. It costs too much and takes too much time to reconfigure between flights. Virgin Galactic plans to fly the suborbital spaceship one more time before taking a hiatus from flight operations, Ars reports. This, along with layoffs announced last year, will allow the company to preserve cash while focusing on the development of a new generation of rocket planes, called Delta-class ships, designed to fly more often and with more people. Michael Colglazier, Virgin Galactic’s president and CEO, says the first of the Delta ships is on track to begin ground and flight testing next year, with commercial service targeted for 2026 based out of Spaceport America in New Mexico.

Bigger and faster… The Delta ships will each carry six customers in the spacecraft’s pressurized passenger cabin, compared to a maximum of four passengers on each VSS Unity flight. Virgin Galactic’s goal is to fly each Delta ship eight times per month, and the company will do this by eliminating many of the inspections required between each VSS Unity flight. The company is building a Delta ship structural test article to put through extensive checks on the ground, validating component life and cycle limits for major components of the vehicle. This will give engineers enough confidence to forego many inspections, according to Mike Moses, president of Virgin Galactic’s spaceline operations. Virgin Galactic has nearly $1 billion in cash or cash equivalents on its balance sheet, so it’s not in any immediate financial trouble. But the company reported just $7 million in revenue last year, with a net loss of $502 million. So, there’s an obvious motivation to make a change.

The easiest way to keep up with Eric Berger’s space reporting is to sign up for his newsletter; we’ll collect his stories in your inbox.

A new Japanese rocket will launch this weekend. A privately held Japanese company named Space One is set to shoot for orbit with the first flight of its Kairos rocket Friday night (US time), News on Japan reports. Space One will attempt to become the first Japanese private company to launch a rocket into orbit. Japan’s existing launch vehicles, like the H-IIA, the H3, and the Epsilon, were developed with funding from the Japanese space agency. But there is some involvement from the Japanese government on this flight. The Kairos rocket will launch with a small “quick response” spacecraft for the Cabinet Intelligence and Research Office, which is responsible for Japan’s fleet of spy satellites. Kairos, which is the Ancient Greek word for “timeliness,” is made up of three solid-fueled stages and a liquid-fueled upper stage. It can place a payload of up to 550 pounds (250 kilograms) into low-Earth orbit.

Winning hearts and minds… The Kairos rocket will take off from Space One’s Space Port Kii, located on a south-facing peninsula on the main Japanese island of Honshu. This new launch site is hundreds of miles away from Japan’s existing spaceports. Local businesses see the arrival of the space industry in this remote part of Japan as a marketing opportunity. A local confectionery store, not wanting to miss the opportunity to attract visitors, is selling manju shaped like rockets. There are two paid viewing areas to watch the launch, and a total of 5,000 seats sold out in just two days, according to News on Japan. (submitted by tsunam)

UK spaceport project to get 10 million pounds from government. The UK government has pledged 10 million pounds in funding to SaxaVord Spaceport in Scotland, European Spaceflight reports. This funding is sorely needed for SaxaVord, which slowed construction last year after its developer ran into financial trouble. In the last couple of months, SaxaVord raised enough money to resume payments to the contractors building the launch site. The UK government’s pledge of 10 million pounds for SaxaVord apparently is not quite a done deal. The UK’s science minister posted on X that the funding was “subject to due diligence.” SaxaVord will eventually have three launch pads, one of which has been dedicated to German launch startup Rocket Factory Augsburg. This company’s rocket, RFA ONE, is expected to be the first orbital launch from SaxaVord later this year.

The UK spaceport scene… The UK government, local entities, and private industry are making a pretty serious effort to bring orbital launches to the British Isles. Spaceport Cornwall became the first UK facility to host an orbital attempt last year with the failed launch of Virgin Orbit’s LauncherOne rocket, which was released from a carrier jet that took off from Cornwall. There are several vertical launch spaceports under construction or in the concept development phase. SaxaVord appears to be among those closest to reality, along with Sutherland spaceport, also in Scotland, to be used by the UK launch startup Orbex Space. (submitted by Ken the Bin)
