chemistry

scientists-made-a-stretchable-lithium-battery-you-can-bend,-cut,-or-stab

Scientists made a stretchable lithium battery you can bend, cut, or stab

The Li-ion batteries that power everything from smartphones to electric cars are usually packed in rigid, sealed enclosures that prevent stresses from damaging their components and keep air from coming into contact with their flammable and toxic electrolytes. It’s hard to use batteries like this in soft robots or wearables, so a team of scientists at the University of California, Berkeley built a flexible, non-toxic, jelly-like battery that could survive bending, twisting, and even cutting with a razor.

While flexible batteries using hydrogel electrolytes have been achieved before, they came with significant drawbacks. “All such batteries could [only] operate [for] a short time, sometimes a few hours, sometimes a few days,” says Liwei Lin, a mechanical engineering professor at UC Berkeley and senior author of the study. The battery built by his team endured 500 complete charge cycles—about as many as the batteries in most smartphones are designed for.

Power in water

“Current-day batteries require a rigid package because the electrolyte they use is explosive, and one of the things we wanted to make was a battery that would be safe to operate without this rigid package,” Lin told Ars. Unfortunately, flexible packaging made of polymers or other stretchable materials can be easily penetrated by air or water, which will react with standard electrolytes, generating lots of heat, potentially resulting in fires and explosions. This is why, in 2017, scientists started to experiment with quasi-solid-state hydrogel electrolytes.

These hydrogels were made of a polymer net that gave them their shape, crosslinkers like borax or hydrogen bonds that held this net together, a liquid phase made of water, and salt or other electrolyte additives providing ions that moved through the watery gel as the battery charged or discharged.

But hydrogels like that had their own fair share of issues. The first was a fairly narrow electrochemical stability window—a safe zone of voltage the battery can be exposed to. “This really limits how much voltage your battery can output,” says Peisheng He, a researcher at UC Berkeley Sensor and Actuator Center and lead author of the study. “Nowadays, batteries usually operate at 3.3 volts, so their stability window must be higher than that, probably four volts, something like that.” Water, which was the basis of these hydrogel electrolytes, typically broke down into hydrogen and oxygen when exposed to around 1.2 volts. That problem was solved by using highly concentrated salt water loaded with highly fluorinated lithium salts, which made it less likely to break down. But this led the researchers straight into safety issues, as fluorinated lithium salts are highly toxic to humans.

Scientists made a stretchable lithium battery you can bend, cut, or stab Read More »

a-guide-to-the-“platonic-ideal”-of-a-negroni-and-other-handy-tips

A guide to the “platonic ideal” of a Negroni and other handy tips


Perfumer by day, mixologist by night, Kevin Peterson specializes in crafting scent-paired cocktails.

Kevin Peterson is a “nose” for his own perfume company, Sfumato Fragrances, by day. By night, Sfumato’s retail store in Detroit transforms into Peterson’s craft cocktail bar, Castalia, where he is chief mixologist and designs drinks that pair with carefully selected aromas. He’s also the author of Cocktail Theory: A Sensory Approach to Transcendent Drinks, which grew out of his many (many!) mixology experiments and popular YouTube series, Objective Proof: The Science of Cocktails.

It’s fair to say that Peterson has had an unusual career trajectory. He worked as a line cook and an auto mechanic, and he worked on the production line of a butter factory, among other gigs, before attending culinary school in hopes of becoming a chef. However, he soon realized it wasn’t really what he wanted out of life and went to college, earning an undergraduate degree in physics from Carleton College and a PhD in mechanical engineering from the University of Michigan.

After 10 years as an engineer, he switched focus again and became more serious about his side hobby, perfumery. “Not being in kitchens anymore, I thought—this is a way to keep that little flavor part of my brain engaged,” Peterson told Ars. “I was doing problem sets all day. It was my escape to the sensory realm. ‘OK, my brain is melting—I need a completely different thing to do. Let me go smell smells, escape to my little scent desk.'” He and his wife, Jane Larson, founded Sfumato, which led to opening Castalia, and Peterson finally found his true calling.

Peterson spent years conducting mixology experiments to gather empirical data about the interplay between scent and flavor, the correct ratios of ingredients, temperature, and dilution for all the classic cocktails—seeking a “Platonic ideal” for each, if you will. He supplemented this with customer feedback data from the drinks served at Castalia. All that culminated in Cocktail Theory, which delves into the chemistry of scent and taste, introducing readers to flavor profiles, textures, visual presentation, and other factors that contribute to one’s enjoyment (or lack thereof) of a cocktail. And yes, there are practical tips for building your own home bar, as well as recipes for many of Castalia’s signature drinks.

In essence, Peterson’s work adds scientific rigor to what is frequently called the “Mr. Potato Head” theory of cocktails, a phrase coined by the folks at Death & Company, who operate several craft cocktail bars in key cities. “Let’s say you’ve got some classic cocktail, a daiquiri, that has this many parts of rum, this many parts of lime, this many parts of sugar,” said Peterson, who admits to having a Mr. Potato Head doll sitting on Castalia’s back bar in honor of the sobriquet. “You can think about each ingredient in a more general way: instead of rum, this is the spirit; instead of lime, this is the citrus; sugars are sweetener. Now you can start to replace those things with other things in the same categories.”

We caught up with Peterson to learn more.

Ars Technica: How did you start thinking about the interplay between perfumery and cocktail design and the role that aroma plays in each?

Kevin Peterson: The first step was from food over to perfumery, where I think about building a flavor for a soup, for a sauce, for a curry, in a certain way. “Oh, there’s a gap here that needs to be filled in by some herbs, some spice.” It’s almost an intuitive kind of thing. When I was making scents, I had those same ideas: “OK, the shape of this isn’t quite right. I need this to roughen it up or to smooth out this edge.”

Then I did the same thing for cocktails and realized that those two worlds didn’t really talk to each other. You’ve got two groups of people that study all the sensory elements and how to create the most intriguing sensory impression, but they use different language; they use different toolkits. They’re going for almost the same thing, but there was very little overlap between the two. So I made that my niche: What can perfumery teach bartenders? What can the cocktail world teach perfumery?

Ars Technica: In perfumery you talk about a top, a middle, and a base note. There must be an equivalent in cocktail theory?

Kevin Peterson: In perfumery, that is mostly talking about the time element: top notes perceived first, then middle notes, then base notes as you wear it over the course of a few hours. In the cocktail realm, there is that time element as well. You get some impression when you bring the glass to your nose, something when you sip, something in the aftertaste. But there can also be a spatial element. Some things you feel right at the tip of your tongue, some things you feel in different parts of your face and head, whether that’s a literal impression or you just kind of feel it somewhere where there’s not a literal nerve ending. It’s about filling up that space, or not filling it up, depending on what impression you’re going for—building out the full sensory space.

Ars Technica: You also talk about motifs and supportive effects or ornamental flourishes: themes that you can build on in cocktails.

Kevin Peterson: Something I see in the cocktail world occasionally is that people just put a bunch of ingredients together and figure, “This tastes fine.” But what were you going for here? There are 17 things in here, and it just kind of tastes like you were finger painting: “Hey, I made brown.” Brown is nice. But the motifs that I think about—maybe there’s just one particular element that I want to highlight. Say I’ve got this really great jasmine essence. Everything else in the blend is just there to highlight the jasmine.

If you’re dealing with a really nice mezcal or bourbon or some unique herb or spice, that’s going to be the centerpiece. You’re not trying to get overpowered by some smoky scotch, by some other more intense ingredient. The motif could just be a harmonious combination of elements. I think the perfect old-fashioned is where everything is present and nothing’s dominating. It’s not like the bitters or the whiskey totally took over. There’s the bitters, there’s a little bit of sugar, there’s the spirit. Everything’s playing nicely.

Another motif, I call it a jazz note. A Sazerac is almost the same as an old-fashioned, but it’s got a little bit of absinthe in it. You get all the harmony of the old-fashioned, but then you’re like, “Wait, what’s this weird thing pulling me off to the side? Oh, this absinthe note is kind of separate from everything else that’s going on in the drink.” It’s almost like that tension in a musical composition: “Well, these notes sound nice, but then there’s one that’s just weird.” But that’s what makes it interesting, that weird note. For me, formalizing some of those motifs helps me make it clearer. Even if I don’t tell that to the guest during the composition stage, I know this is the effect I’m going for. It helps me build more intentionally when I’ve got a motif in mind.

Ars Technica: I tend to think about cocktails more in terms of chemistry, but there are many elements to taste and perception and flavor. You talk about ingredient matching, molecular matching, and impression matching, i.e., how certain elements will overlap in the brain. What role do each of those play?

Kevin Peterson: A lot of those ideas relate to how we pair scents with cocktails. At my perfume company, we make eight fragrances as our main line. Each scent then gets a paired drink on the cocktail menu. For example, this scent has coriander, cardamom, and nutmeg. What does it mean that the drink is paired with that? Does it need to literally have coriander, cardamom, and nutmeg in it? Does it need to have every ingredient? If the scent has 15 things, do I need to hit every note?

chart with sad, neutral, and happy faces showing the optimal temperature and dilution for a daiquiri

Peterson made over 100 daiquiris to find the “Platonic ideal” of the classic cocktail. Credit: Kevin Peterson

The literal matching is the most obvious. “This has cardamom, that has cardamom.” I can see how that pairs. The molecular matching is essentially just one more step removed: Rosemary has alpha-pinene in it, and juniper berries have alpha-pinene in them. So if the scent has rosemary and the cocktail has gin, they’re both sharing that same molecule, so it’s still exciting that same scent receptor. What I’m thinking about is kind of resonant effects. You’re approaching the same receptor or the same neural structure in two different ways, and you’re creating a bigger peak with that.

The most hand-wavy one to me is the impression matching. Rosemary smells cold, and Fernet-Branca tastes cold even when it’s room temperature. If the scent has rosemary, is Fernet now a good match for that? Some of the neuroscience stuff that I’ve read has indicated that these more abstract ideas are represented by the same sort of neural-firing patterns. Initially, I was hesitant; cold and cold, it doesn’t feel as fulfilling to me. But then I did some more reading and realized there’s some science behind it and have been more intrigued by that lately.

Ars Technica: You do come up with some surprising flavor combinations, like a drink that combined blueberry and horseradish, which frankly sounds horrifying. 

Kevin Peterson: It was a hit on the menu. I would often give people a little taste of the blueberry and then a little taste of the horseradish tincture, and they’d say, “Yeah, I don’t like this.” And then I’d serve them the cocktail, and they’d be like, “Oh my gosh, it actually worked. I can’t believe it.”  Part of the beauty is you take a bunch of things that are at least not good and maybe downright terrible on their own, and then you stir them all together and somehow it’s lovely. That’s basically alchemy right there.

Ars Technica: Harmony between scent and the cocktail is one thing, but you also talk about constructive interference to get a surprising, unexpected, and yet still pleasurable result.

Kevin Peterson: The opposite is destructive interference, where there’s just too much going on. When I’m coming up with a drink, sometimes that’ll happen, where I’m adding more, but the flavor impression is going down. It’s sort of a weird non-linearity of flavor, where sometimes two plus two equals four, sometimes it equals three, sometimes it equals 17. I now have intuition about that, having been in this world for a lot of years, but I still get surprised sometimes when I put a couple things together.

Often with my end-of-the-shift drink, I’ll think, “Oh, we got this new bottle in. I’m going to try that in a Negroni variation.” Then I lose track and finish mopping, and then I sip, and I’m like, “What? Oh my gosh, I did not see this coming at all.” That little spark, or whatever combo creates that, will then often be the first step on some new cocktail development journey.

man's torso in a long-sleeved button down white shirt, with a small glass filled with juniper berries in front of him

Pairing scents with cocktails involves experimenting with many different ingredients. Credit: EE Berger

Ars Technica: Smoked cocktails are a huge trend right now. What’s the best way to get a consistently good smoky element?

Kevin Peterson: Smoke is tricky to make repeatable. How many parts per million of smoke are you getting in the cocktail? You could standardize the amount of time that it’s in the box [filled with smoke]. Or you could always burn, say, exactly three grams of hickory or whatever. One thing that I found, because I was writing the book while still running the bar: People have a lot of expectations around how the drink is going to be served. Big ice cubes are not ideal for serving drinks, but people want a big ice cube in their old-fashioned. So we’re still using big ice cubes. There might be a Platonic ideal in terms of temperature, dilution, etc., but maybe it’s not the ideal in terms of visuals or tactile feel, and that is a part of the experience.

With the smoker, you open the doors, smoke billows out, your drink emerges from the smoke, and people say, “Wow, this is great.” So whether you get 100 PPM one time and 220 PPM the next, maybe that gets outweighed by the awesomeness of the presentation. If I’m trying to be very dialed in about it, I’ll either use a commercial smoky spirit—Laphroaig scotch, a smoky mezcal—where I decide that a quarter ounce is the amount of smokiness that I want in the drink. I can just pour the smoke instead of having to burn and time it.

Or I might even make my own smoke: light something on fire and then hold it under a bottle, tip it back up, put some vodka or something in there, shake it up. Now I’ve got smoke particles in my vodka. Maybe I can say, “OK, it’s always going to be one milliliter,” but then you miss out on the presentation—the showmanship, the human interaction, the garnish. I rarely garnish my own drinks, but I rarely send a drink out to a guest ungarnished, even if it’s just a simple orange peel.

Ars Technica: There’s always going to be an element of subjectivity, particularly when it comes to our sensory perceptions. Sometimes you run into a person who just can’t appreciate a certain note.

Kevin Peterson: That was something I grappled with. On the one hand, we’re all kind of living in our own flavor world. Some people are more sensitive to bitter. Different scent receptors are present in different people. It’s tempting to just say, “Well, everything’s so unique. Maybe we just can’t say anything about it at all.” But that’s not helpful either. Somehow, we keep having delicious food and drink and scents that come our way.

A sample page from Cocktail Theory discussing temperature and dilution

A sample page from Cocktail Theory discussing temperature and dilution. Credit: EE Berger

I’ve been taking a lot of survey data in my bar more recently, and definitely the individuality of preference has shown through in the surveys. But another thing that has shown through is that there are some universal trends. There are certain categories. There’s the spirit-forward, bittersweet drinkers, there’s the bubbly citrus folks, there’s the texture folks who like vodka soda. What is the taste? What is the aroma? It’s very minimal, but it’s a very intense texture. Having some awareness of that is critical when you’re making drinks.

One of the things I was going for in my book was to find, for example, the platonically ideal gin and tonic. What are the ratios? What is the temperature? How much dilution to how much spirit is the perfect amount? But if you don’t like gin and tonics, it doesn’t matter if it’s a platonically ideal gin and tonic. So that’s my next project. It’s not just getting the drink right. How do you match that to the right person? What questions do I have to ask you, or do I have to give you taste tests? How do I draw that information out of the customer to determine the perfect drink for them?

We offer a tasting menu, so our full menu is eight drinks, and you get a mini version of each drink. I started giving people surveys when they would do the tasting menu, asking, “Which drink do you think you like the most? Which drink do you think you like the least?” I would have them rate it. Less than half of people predicted their most liked and least liked, meaning if you were just going to order one drink off the menu, your odds are less than a coin flip that you would get the right drink.

Ars Technica: How does all this tie into your “cocktails as storytelling” philosophy? 

Kevin Peterson: So much of flavor impression is non-verbal. Scent is very hard to describe. You can maybe describe taste, but we only have five-ish words, things like bitter, sour, salty, sweet. There’s not a whole lot to say about that: “Oh, it was perfectly balanced.” So at my bar, when we design menus, we’ll put the drinks together, but then we’ll always give the menu a theme. The last menu that we did was the scientist menu, where every drink was made in honor of some scientist who didn’t get the credit they were due in the time they were alive.

Having that narrative element, I think, helps people remember the drink better. It helps them in the moment to latch onto something that they can more firmly think about. There’s a conceptual element. If I’m just doing chores around the house, I drink a beer, it doesn’t need to have a conceptual element. If I’m going out and spending money and it’s my night and I want this to be a more elevated experience, having that conceptual tie-in is an important part of that.

two martini glasses side by side with a cloudy liquid in them a bright red cherry at the bottom of the glass

My personal favorite drink, Corpse Reviver No. 2, has just a hint of absinthe. Credit: Sean Carroll

Ars Technica: Do you have any simple tips for people who are interested in taking their cocktail game to the next level?

Kevin Peterson:  Old-fashioneds are the most fragile cocktail. You have to get all the ratios exactly right. Everything has to be perfect for an old-fashioned to work. Anecdotally, I’ve gotten a lot of old-fashioneds that were terrible out on the town. In contrast, the Negroni is the most robust drink. You can miss the ratios. It’s got a very wide temperature and dilution window where it’s still totally fine. I kind of thought of them in the same way prior to doing the test. Then I found that this band of acceptability is much bigger for the Negroni. So now I think of old-fashioneds as something that either I make myself or I order when I either trust the bartender or I’m testing someone who wants to come work for me.

My other general piece of advice: It can be a very daunting world to try to get into. You may say, “Oh, there’s all these classics that I’m going to have to memorize, and I’ve got to buy all these weird bottles.” My advice is to pick a drink you like and take baby steps away from that drink. Say you like Negronis. That’s three bottles: vermouth, Campari, and gin. Start with that. When you finish that bottle of gin, buy a different type of gin. When you finish the Campari, try a different bittersweet liqueur. See if that’s going to work. You don’t have to drop hundreds of dollars, thousands of dollars, to build out a back bar. You can do it with baby steps.

Photo of Jennifer Ouellette

Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.

A guide to the “platonic ideal” of a Negroni and other handy tips Read More »

researchers-engineer-bacteria-to-produce-plastics

Researchers engineer bacteria to produce plastics

Image of a series of chemical reactions, with enzymes driving each step forward.

One of the enzymes used in this system takes an amino acid (left) and links it to Coenzyme A. The second takes these items and links them into a polymer. Credit: Chae et al.

Normally, PHA synthase forms links between molecules that run through an oxygen atom. But it’s also possible to form a related chemical link that instead runs through a nitrogen atom, like those found on amino acids. There were no known enzymes, however, that catalyze these reactions. So, the researchers decided to test whether any existing enzymes could be induced to do something they don’t normally do.

The researchers started with an enzyme from Clostridium that links chemicals to Coenzyme A and has a reputation for not being picky about the chemicals it interacts with. This worked reasonably well at linking amino acids to Coenzyme A. For linking the amino acids together, they used an enzyme from Pseudomonas that had four different mutations that expanded the range of molecules it would use as reaction materials. Used in a test tube, the system worked: Amino acids were linked together in a polymer.

The question was whether it would work in cells. Unfortunately, one of the two enzymes turned out to be mildly toxic to E. coli, slowing its growth. So, the researchers evolved a strain of E. coli that could tolerate the protein. With both proteins in place, the cells produced small amounts of an amino acid polymer. If they added an excess of an amino acid to the media the cells were growing in, the polymer would be biased toward incorporating that amino acid.

Boosting polymer production

However, the yield of the polymer by weight of bacteria was fairly low. “It was reasoned that these [amino acids] might be more efficiently incorporated into the polymer if generated within the cells from a suitable carbon source,” the researchers write. So, the researchers put in extra copies of the genes needed to produce one specific amino acid (lysine). That worked, producing more polymer, with a higher percentage of the polymer being lysine.

Researchers engineer bacteria to produce plastics Read More »

small-charges-in-water-spray-can-trigger-the-formation-of-key-biochemicals

Small charges in water spray can trigger the formation of key biochemicals

Once his team nailed how droplets become electrically charged and how the micro-lightning phenomenon works, they recreated the Miller-Urey experiment. Only without the spark plugs.

Ingredients of life

After micro-lightnings started jumping between droplets in a mixture of gases similar to that used by Miller and Urey, the team examined their chemical composition with a mass spectrometer. They confirmed glycine, uracil, urea, cyanoethylene, and lots of other chemical compounds were made. “Micro-lightnings made all organic molecules observed previously in the Miller-Urey experiment without any external voltage applied,” Zare claims.

But does it really bring us any closer to explaining the beginnings of life? After all, Miller and Urey already demonstrated those molecules could be produced by electrical discharges in a primordial Earth’s atmosphere—does it matter all that much where those discharges came from?  Zare argues that it does.

“Lightning is intermittent, so it would be hard for these molecules to concentrate. But if you look at waves crashing into rocks, you can think the spray would easily go into the crevices in these rocks,” Zare suggests. He suggests that the water in these crevices would evaporate, new spray would enter and evaporate again and again. The cyclic drying would allow the chemical precursors to build into more complex molecules. “When you go through such a dry cycle, it causes polymerization, which is how you make DNA,” Zare argues. Since sources of spray were likely common on the early Earth, Zare thinks this process could produce far more organic chemicals than potential alternatives like lightning strikes, hydrothermal vents, or impacting comets.

But even if micro-lightning really produced the basic building blocks of life on Earth, we’re still not sure how those combined into living organisms. “We did not make life. We just demonstrated a possible mechanism that gives us some chemical compounds you find in life,” Zare says. “It’s very important to have a lot of humility with this stuff.”

Science Advances, 2025.  DOI: 10.1126/sciadv.adt8979

Small charges in water spray can trigger the formation of key biochemicals Read More »

brewing-tea-removes-lead-from-water

Brewing tea removes lead from water

Testing the teas

Scanning electron microscope image of black tea leaves, magnified by 500 times. Black tea, which is wilted and fully oxidized, exhibits a wrinkled surface, potentially increasing the available surface area for adsorption. Credit: Vinayak P. Dravid Group/Northwestern University

To test their hypothesis, the authors purchased Lipton and Infusions commercial tea bags, as well as a variety of loose-leaf teas and herbal alternatives: black tea, green tea, white peony tea, oolong tea, rooibos tea, and chamomile tea. The tea bags were of different types (cotton, cellulose, and nylon). They brewed the tea the same way daily tea drinkers do, steeping the tea for various time intervals (mere seconds to 24 hours) in water spiked with elevated known levels of lead, chromium, copper, zinc, and cadmium. Tea leaves were removed after steeping by pouring the tea through a cellulose filter into a separate tube. The team then measured how much of the toxic metals remained in the water and how much the leaves had adsorbed.

It turns out that the type of tea bag matters. The team found that cellulose tea bags work the best at adsorbing toxic metals from the water while cotton and nylon tea bags barely adsorbed any contaminants at all—and nylon bags also release contaminating microplastics to boot. Tea type and the grind level also played a part in adsorbing toxic metals, with finely ground black tea leaves performing the best on that score. This is because when those leaves are processed, they get wrinkled, which opens the pores, thereby adding more surface area. Grinding the tea further increases that surface area, with even more capacity for binding toxic metals.

But the most significant factor was steeping time: the longer the steeping time, the more toxic metals were adsorbed. Based on their experiments, the authors estimate that brewing tea—using a tea bag that steeps for three to five minutes in a mug—can remove about 15 percent of lead from drinking water, even water with concentrations as high as 10 parts per million.

Brewing tea removes lead from water Read More »

researchers-figure-out-how-to-get-fresh-lithium-into-batteries

Researchers figure out how to get fresh lithium into batteries

In their testing, the researchers used a couple of unusual electrode materials: a chromium oxide (Cr8O21) and an organic polymer (a sulfurized polyacrylonitrile). Both of these have significant weight advantages over the typical materials used in today’s batteries, although the resulting batteries typically lasted less than 500 cycles before dropping to 80 percent of their original capacity.

But the striking experiment came when they used LiSO2CF3 to rejuvenate a battery that had been manufactured as normal but had lost capacity due to heavy use. Treating a lithium-iron phosphate battery that had lost 15 percent of its original capacity restored almost all of what was lost, allowing it to hold over 99 percent of its original charge. They also ran a battery for repeated cycles with rejuvenation every few thousand cycles. At just short of 12,000 cycles, it still could be restored to 96 percent of its original capacity.

Before you get too excited, there are a couple of things worth noting about lithium-iron phosphate cells. The first is that, relative to their charge capacity, they’re a bit heavy, so they tend to be used in large, stationary batteries like the ones in grid-scale storage. They’re also long-lived on their own; with careful management, they can take over 8,000 cycles before they drop to 80 percent of their initial capacity. It’s not clear whether similar rejuvenation is possible in the battery chemistries typically used for the sorts of devices that most of us own.

The final caution is that the battery needs to be modified so that fresh electrolytes can be pumped in and the gases released by the breakdown of the LiSO2CF3 removed. It’s safest if this sort of access is built into the battery from the start, rather than provided by modifying it much later, as was done here. And the piping needed would put a small dent in the battery’s capacity per volume.

All that said, the treatment demonstrated here would bring even a well-managed battery back closer to its original capacity, and it would largely restore the capacity of one that hadn’t been carefully managed. That would let us get far more out of the initial expense of battery manufacturing, so it might make sense for batteries destined for a large storage facility, where lots of them could potentially be treated at the same time.

Nature, 2025. DOI: 10.1038/s41586-024-08465-y  (About DOIs).

Researchers figure out how to get fresh lithium into batteries Read More »

turning-the-moon-into-a-fuel-depot-will-take-a-lot-of-power

Turning the Moon into a fuel depot will take a lot of power


Getting oxygen from regolith takes 24 kWh per kilogram, and we’d need tonnes.

If humanity is ever to spread out into the Solar System, we’re going to need to find a way to put fuel into rockets somewhere other than the cozy confines of a launchpad on Earth. One option for that is in low-Earth orbit, which has the advantage of being located very close to said launch pads. But it has the considerable disadvantage of requiring a lot of energy to escape Earth’s gravity—it takes a lot of fuel to put substantially less fuel into orbit.

One alternative is to produce fuel on the Moon. We know there is hydrogen and oxygen present, and the Moon’s gravity is far easier to overcome, meaning more of what we produce there can be used to send things deeper into the Solar System. But there is a tradeoff: any fuel production infrastructure will likely need to be built on Earth and sent to the Moon.

How much infrastructure is that going to involve? A study released today in PNAS evaluates the energy costs of producing oxygen on the Moon and finds that they’re substantial: about 24 kWh per kilogram. This doesn’t sound bad until you start considering how many kilograms we’re going to eventually need.

Free the oxygen!

The math that makes refueling from the Moon appealing is pretty simple. “As a rule of thumb,” write the authors of the new study on the topic, “rockets launched from Earth destined for [Earth-Moon Lagrange Point 1] must burn ~25 kg of propellant to transport one kg of payload, whereas rockets launched from the Moon to [Earth-Moon Lagrange Point 1] would burn only ~four kg of propellant to transport one kg of payload.” Departing from the Earth-Moon Lagrange Point for locations deeper into the Solar System also requires less energy than leaving low-Earth orbit, meaning the fuel we get there is ultimately more useful, at least from an exploration perspective.
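
To put that rule of thumb in concrete terms, here is a minimal sketch of the arithmetic in Python. The 25:1 and 4:1 ratios are the ones quoted above; the one-tonne payload is just an illustrative number.

```python
# Back-of-the-envelope propellant comparison using the rule-of-thumb ratios
# quoted in the study: ~25 kg of propellant per kg of payload from Earth to
# Earth-Moon Lagrange Point 1, versus ~4 kg per kg from the lunar surface.
EARTH_TO_EML1_RATIO = 25.0
MOON_TO_EML1_RATIO = 4.0

def propellant_for_payload(payload_kg: float, ratio: float) -> float:
    """Propellant mass burned to deliver the given payload, per the rule of thumb."""
    return payload_kg * ratio

payload_kg = 1_000.0  # an illustrative one-tonne payload
print(f"From Earth: ~{propellant_for_payload(payload_kg, EARTH_TO_EML1_RATIO):,.0f} kg of propellant")
print(f"From the Moon: ~{propellant_for_payload(payload_kg, MOON_TO_EML1_RATIO):,.0f} kg of propellant")
```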

But, of course, you need to make the fuel there in the first place. The obvious choice for that is water, which can be split to produce hydrogen and oxygen. We know there is water on the Moon, but we don’t yet know how much, and whether it’s concentrated into large deposits. Given that uncertainty, people have also looked at other materials that we know are present in abundance on the Moon’s surface.

And there’s probably nothing more abundant on that surface than regolith, the dust left over from constant tiny impacts that have, over time, eroded lunar rocks. The regolith is composed of a variety of minerals, many of which contain oxygen, typically the heavier component of rocket fuel. And a variety of people have figured out the chemistry involved in separating oxygen from these minerals on the scale needed for rocket fuel production.

But knowing the chemistry is different from knowing what sort of infrastructure is needed to get that chemistry done at a meaningful scale. To get a sense of this, the researchers decided to focus on isolating oxygen from a mineral called ilmenite, or FeTiO3. It’s not the easiest way to get oxygen—iron oxides win out there—but it’s well understood. Someone actually patented oxygen production from ilmenite back in the 1970s, and two hardware prototypes have been developed, one of which may be sent to the Moon on a future NASA mission.

The researchers propose a system that would harvest regolith, partly purify the ilmenite, then combine it with hydrogen at high temperatures, which would strip the oxygen out as water, leaving behind purified iron and titanium (both of which may be useful to have). The resulting water would then be split to feed the hydrogen back into the system, while the oxygen can be sent off for use in rockets.

(This wouldn’t solve the issue of what that oxygen will ultimately oxidize to power a rocket. But oxygen is typically the heavier component of rocket fuel combinations—about 80 percent of the mass—and so the bigger challenge to get to a fuel depot.)

Obviously, this process will require a lot of infrastructure, like harvesters, separators, high-temperature reaction chambers, and more. But the researchers focus on a single element: how much power will it suck down?

More power!

To get their numbers, the researchers made a few simplifying assumptions. These include assuming that it’s possible to purify ilmenite from raw regolith and that it will be present in particles small enough that about half the material present will participate in chemical reactions. They ignored both the potential to get even more oxygen from the iron and titanium oxides present, as well as the potential for contamination from problematic materials like hydrogen sulfide or hydrochloric acid.

The team found that almost all of the energy is consumed at three steps in the process: the high-temperature hydrogen reaction that produces water (55 percent), splitting the water afterwards (38 percent), and converting the resulting oxygen to its liquid form (five percent). The typical total usage, depending on factors like the concentration of ilmenite in the regolith, worked out to be about 24 kWh for each kilogram of liquid oxygen.
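
Taken at face value, those percentages break the roughly 24 kWh/kg figure down as follows (a quick sketch using only the numbers above; the percentages are rounded, so they don’t sum to exactly 100):

```python
# Rough per-step energy cost per kilogram of liquid oxygen, using the
# ~24 kWh/kg total and the percentage split reported above (rounded values).
TOTAL_KWH_PER_KG = 24.0
steps = {
    "high-temperature hydrogen reaction (makes water)": 0.55,
    "splitting the resulting water": 0.38,
    "liquefying the oxygen": 0.05,
}
for step, fraction in steps.items():
    print(f"{step}: ~{TOTAL_KWH_PER_KG * fraction:.1f} kWh per kg of O2")
```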

Obviously, the numbers are sensitive to how efficiently you can do things like heat the reaction mix. (It might be possible to do this heating with concentrated solar, avoiding the use of electricity for this entirely, but the authors didn’t analyze that.) But it was also sensitive to less obvious efficiencies. For example, a better separation of the ilmenite from the rest of the regolith means you’re using less energy to heat contaminants. So, while the energetic cost of that separation is small, it pays off to do it effectively.

Based on orbital observations, the researchers map out the areas where ilmenite is present at high enough concentrations for this approach to make sense. These include some of the maria on the near side of the Moon, so they’re easy to get to.

A map of the lunar surface with locations highlighted in color.

A map of the lunar surface, with areas with high ilmenite concentrations shown in blue. Credit: Leger et al.

On its own, 24 kWh doesn’t seem like a lot of energy. The problem is that we will need a lot of kilograms. The researchers estimate that getting an empty SpaceX Starship from the lunar surface to the Earth-Moon Lagrange Point takes 80 tonnes of liquid oxygen. And a fully fueled Starship can hold over 500 tonnes of liquid oxygen.

We can compare that to something like the solar array on the International Space Station, which has a capacity of about 100 kW. That means it could power the production of about four kilograms of oxygen an hour. At that rate, it’ll take a bit over 10 days to produce a tonne, and a bit more than two years to get enough oxygen to get an empty Starship to the Lagrange Point—assuming 24-7 production. A facility on the near side would only be able to produce for half that time, given the lunar day.
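
Spelling out that comparison (a sketch based only on the figures in this article: the roughly 100 kW array, the 24 kWh per kilogram energy cost, the 80-tonne requirement, and a 50 percent duty cycle to stand in for the lunar night):

```python
# Production timeline for an ISS-class solar array, using the article's numbers.
ARRAY_POWER_KW = 100.0      # roughly the ISS solar array capacity
ENERGY_KWH_PER_KG = 24.0    # energy cost per kilogram of liquid oxygen
TARGET_KG = 80_000.0        # ~80 tonnes to send an empty Starship to EML1

kg_per_hour = ARRAY_POWER_KW / ENERGY_KWH_PER_KG   # ~4.2 kg of oxygen per hour
hours_needed = TARGET_KG / kg_per_hour             # at constant production

print(f"~{kg_per_hour:.1f} kg of oxygen per hour")
print(f"~{1_000.0 / kg_per_hour / 24:.0f} days per tonne (24-7 production)")
print(f"~{hours_needed / 24 / 365:.1f} years for 80 tonnes (24-7 production)")
print(f"~{2 * hours_needed / 24 / 365:.1f} years if the site is in sunlight only half the time")
```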

Obviously, we can build larger arrays than that, but it boosts the amount of material that needs to be sent to the Moon from Earth. It may potentially make more sense to use nuclear power. While that would likely involve more infrastructure than solar arrays, it would allow the facilities to run around the clock, thus getting more production from everything else we’ve shipped from Earth.

This paper isn’t meant to be the final word on the possibilities for lunar-based refueling; it’s simply an early attempt to put hard numbers on what ultimately might be the best way to explore our Solar System. Still, it provides some perspective on just how much effort we’ll need to make before that sort of exploration becomes possible.

PNAS, 2025. DOI: 10.1073/pnas.2306146122 (About DOIs).

Photo of John Timmer

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

Turning the Moon into a fuel depot will take a lot of power Read More »

ai-used-to-design-a-multi-step-enzyme-that-can-digest-some-plastics

AI used to design a multi-step enzyme that can digest some plastics

And it worked. Repeating the same process with an added PLACER screening step boosted the number of enzymes with catalytic activity more than threefold.

Unfortunately, all of these enzymes stalled after a single reaction. It turns out they were much better at cleaving the ester, but they left one part of it chemically bonded to the enzyme. In other words, the enzymes acted like part of the reaction, not a catalyst. So the researchers started using PLACER to screen for structures that could adopt a key intermediate state of the reaction. This produced a much higher rate of reactive enzymes (18 percent of them cleaved the ester bond), and two—named “super” and “win”—could actually cycle through multiple rounds of reactions. The team had finally made an enzyme.

By adding additional rounds alternating between structure suggestions using RFDiffusion and screening using PLACER, the team saw the frequency of functional enzymes increase and eventually designed one that had an activity similar to some produced by actual living things. They also showed they could use the same process to design an esterase capable of digesting the bonds in PET, a common plastic.

If that sounds like a lot of work, it clearly was—designing enzymes, especially ones where we know of similar enzymes in living things, will remain a serious challenge. But at least much of it can be done on computers rather than requiring someone to order up the DNA that encodes the enzyme, get bacteria to make it, and screen for activity. And despite the process involving references to known enzymes, the designed ones didn’t share much sequence with them. That suggests there should be added flexibility if we want to design one that will react with esters that living things have never come across.

I’m curious about what might happen if we design an enzyme that is essential for survival, put it in bacteria, and then allow it to evolve for a while. I suspect life could find ways of improving on even our best designs.

Science, 2024. DOI: 10.1126/science.adu2454  (About DOIs).

AI used to design a multi-step enzyme that can digest some plastics Read More »

gecko-feet-inspire-anti-slip-shoe-soles

Gecko feet inspire anti-slip shoe soles

Just add zirconia nanoparticles…

diagram of wet ice's quasi slippery layer and design of anti-slip shoe soles inspired by gecko and toad foot pads

Credit: V. Richhariya et al., 2025

It’s the “hydrophilic capillary-enhanced adhesion” of gecko feet that most interested the authors of this latest paper. Per the World Health Organization, 684,000 people die and another 38 million are injured every year in slips and falls, with correspondingly high health care costs. Most antislip products (crampons, chains, studs, cleats), tread designs, or materials (fiberglass, carbon fiber, rubber) are generally only effective for specific purposes or short periods of time. And they often don’t perform as well on wet ice, which has a nanoscale quasi-liquid layer (QLL) that makes it even more slippery.

So Vipin Richhariya of the University of Minho in Portugal and co-authors turned to gecko toe pads (as well as those of toads) for a better solution. To get similar properties in their silicone rubber polymers, they added zirconia nanoparticles, which attract water molecules. The polymers were rolled into a thin film and hardened, and then a laser etched groove patterns onto the surface—essentially creating micro cavities that exposed the zirconia nanoparticles, thus enhancing the material’s hydrophilic effects.

Infrared spectroscopy and simulated friction tests revealed that the composites containing 3 percent and 5 percent zirconia nanoparticles were the most slip-resistant. “This optimized composite has the potential to change the dynamics of slip-and-fall accidents, providing a nature-inspired solution to prevent one of the most common causes of accidents worldwide,” the authors concluded. The material could also be used for electronic skin, artificial skin, or wound healing.

ACS Applied Materials & Interfaces, 2025. DOI: 10.1021/acsami.4c14496 (About DOIs).

Gecko feet inspire anti-slip shoe soles Read More »

a-solid-electrolyte-gives-lithium-sulfur-batteries-ludicrous-endurance

A solid electrolyte gives lithium-sulfur batteries ludicrous endurance


Sulfur can store a lot more lithium but is problematically reactive in batteries.

If you weren’t aware, sulfur is pretty abundant. Credit: P_Wei

Lithium may be the key component in most modern batteries, but it doesn’t make up the bulk of the material used in them. Instead, much of the material is in the electrodes, where the lithium gets stored when the battery isn’t charging or discharging. So one way to make lighter and more compact lithium-ion batteries is to find electrode materials that can store more lithium. That’s one of the reasons that recent generations of batteries are starting to incorporate silicon into the electrode materials.

There are materials that can store even more lithium than silicon; a notable example is sulfur. But sulfur has a tendency to react with itself, producing ions that can float off into the electrolyte. Plus, like any electrode material, it tends to expand in proportion to the amount of lithium that gets stored, which can create physical strains on the battery’s structure. So while it has been easy to make lithium-sulfur batteries, their performance has tended to degrade rapidly.

But this week, researchers described a lithium-sulfur battery that still has over 80 percent of its original capacity after 25,000 charge/discharge cycles. All it took was a solid electrolyte that was more reactive than the sulfur itself.

When lithium meets sulfur…

Sulfur is an attractive battery material. It’s abundant and cheap, and sulfur atoms are relatively lightweight compared to many of the other materials used in battery electrodes. Sodium-sulfur batteries, which rely on two very cheap raw materials, have already been developed, although they only work at temperatures high enough to melt both of these components. Lithium-sulfur batteries, by contrast, could operate more or less the same way that current lithium-ion batteries do.

With a few major exceptions, that is. One is that the elemental sulfur used as an electrode is a very poor conductor of electricity, so it has to be dispersed within a mesh of conductive material. (You can contrast that with graphite, which both stores lithium and conducts electricity relatively well, thanks to being composed of countless sheets of graphene.) Lithium is stored there as Li2S, which occupies substantially more space than the elemental sulfur it’s replacing.

Both of these issues, however, can be solved with careful engineering of the battery’s structure. A more severe problem comes from the properties of the lithium-sulfur reactions that occur at the electrode. Elemental sulfur exists as an eight-atom ring, and the reactions with lithium are slow enough that semi-stable intermediates with smaller chains of sulfur end up forming. Unfortunately, these tend to be soluble in most electrolytes, allowing them to travel to the opposite electrode and participate in chemical reactions there.

This process essentially discharges the battery without allowing the electrons to be put to use. And it gradually leaves the electrode’s sulfur unavailable for participating in future charge/discharge cycles. The net result is that early generations of the technology would discharge themselves while sitting unused and would only survive a few hundred cycles before performance decayed dramatically.

But there has been progress on all these fronts, and some lithium-sulfur batteries with performance similar to lithium-ion have been demonstrated. Late last year, a company announced that it had lined up the money needed to build the first large-scale lithium-sulfur battery factory. Still, work on improvements has continued, and the new work seems to suggest ways to boost performance well beyond lithium-ion.

The need for speed

The paper describing the new developments, done by a collaboration between Chinese and German researchers, focuses on one aspect of the challenges posed by lithium-sulfur batteries: the relatively slow chemical reaction between lithium ions and elemental sulfur. It presents that aspect as a roadblock to fast charging, something that will be an issue for automotive applications. But at the same time, finding a way to limit the formation of inactive intermediate products during this reaction goes to the root of the relatively short usable life span of lithium-sulfur batteries.

As it turns out, the researchers found two.

One of the problems with the lithium-sulfur reaction intermediates is that they dissolve in most electrolytes. But that’s not a problem if the electrolyte isn’t a liquid. Solid electrolytes are materials that have a porous structure at the atomic level, with the environment inside the pores being favorable for ions. This allows ions to diffuse through the solid. If there’s a way to trap ions on one side of the electrolyte, such as a chemical reaction that traps or de-ionizes them, then it can enable one-way travel.

Critically, pores that favor the transit of lithium ions, which are quite compact, aren’t likely to allow the transit of the large ionized chains of sulfur. So a solid electrolyte should help cut down on the problems faced by lithium-sulfur batteries. But it won’t necessarily help with fast charging.

The researchers began by testing a glass formed from a mixture of boron, sulfur, and lithium (B2S3 and Li2S). But this glass had terrible conductivity, so they started experimenting with related glasses and settled on a combination that substituted in some phosphorus and iodine.

The iodine turned out to be a critical component. While the exchange of electrons with sulfur is relatively slow, iodine undergoes electron exchange (technically termed a redox reaction) extremely quickly. So it can act as an intermediate in the transfer of electrons to sulfur, speeding up the reactions that occur at the electrode. In addition, iodine has relatively low melting and boiling points, and the researchers suggest there’s some evidence that it moves around within the electrolyte, allowing it to act as an electron shuttle.

Successes and caveats

The result is a far superior electrolyte—and one that enables fast charging. It’s typical that fast charging cuts into the total capacity that can be stored in a battery. But when charged at an extraordinarily fast rate (50C, meaning a full charge in just over a minute), a battery based on this system still had half the capacity of a battery charged 25 times more slowly (2C, or a half-hour to full charge).

But the striking thing was how durable the resulting battery was. Even at an intermediate charging rate (5C), it still had over 80 percent of its initial capacity after over 25,000 charge/discharge cycles. By contrast, lithium-ion batteries tend to hit that level of decay after about 1,000 cycles. If that sort of performance is possible in a mass-produced battery, it’s only a slight exaggeration to say it can radically alter our relationships with many battery-powered devices.
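
For readers not fluent in C-rate notation, here is a quick sketch of the conversions behind those figures. The 50C, 5C, and 2C rates and the 25,000-cycle count come from the paper as described above; the rest is just arithmetic.

```python
# A charge rate of "nC" means a full charge (or discharge) in 1/n hours.
def full_charge_minutes(c_rate: float) -> float:
    return 60.0 / c_rate

for rate in (50, 5, 2):
    print(f"{rate}C: full charge in ~{full_charge_minutes(rate):.1f} minutes")

# Cycle life expressed as everyday use, assuming one full cycle per day.
# (The "65 years of daily cycling" mentioned below corresponds to a bit
# under 25,000 cycles; the paper reports over 25,000.)
cycles = 25_000
print(f"{cycles:,} cycles is roughly {cycles / 365:.0f} years of daily cycling")
```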

What’s not at all clear, however, is whether this takes full advantage of one of the original promises of lithium-sulfur batteries: more charge in a given weight and volume. The researchers specify the battery being used for testing; one electrode is an indium/lithium metal foil, and the other is a mix of carbon, sulfur, and the glass electrolyte. A layer of the electrolyte sits between them. But when giving numbers for the storage capacity per weight, only the weight of the sulfur is mentioned.

Still, even if weight issues would preclude this from being stuffed into a car or cell phone, there are plenty of storage applications that would benefit from something that doesn’t wear out even with 65 years of daily cycling.

Nature, 2025. DOI: 10.1038/s41586-024-08298-9  (About DOIs).

Photo of John Timmer

John is Ars Technica’s science editor. He has a Bachelor of Arts in Biochemistry from Columbia University, and a Ph.D. in Molecular and Cell Biology from the University of California, Berkeley. When physically separated from his keyboard, he tends to seek out a bicycle, or a scenic location for communing with his hiking boots.

A solid electrolyte gives lithium-sulfur batteries ludicrous endurance Read More »

graphene-enhanced-ceramic-tiles-make-striking-art

Graphene-enhanced ceramic tiles make striking art

In recent years, materials scientists experimenting with ceramics have started adding an oxidized form of graphene to the mix to produce ceramics that are tougher, more durable, and more resistant to fracture, among other desirable properties. Researchers at the National University of Singapore (NUS) have developed a new method that uses ultrasound to more evenly distribute graphene oxide (GO) in ceramics, according to a new paper published in the journal ACS Omega. And as a bonus, they collaborated with an artist who used the resulting ceramic tiles to create a unique art exhibit at the NUS Museum—a striking merger of science and art.

As reported previously, graphene is the thinnest material yet known, composed of a single layer of carbon atoms arranged in a hexagonal lattice. That structure gives it many unusual properties that hold great promise for real-world applications: batteries, supercapacitors, antennas, water filters, transistors, solar cells, and touchscreens, just to name a few.

In 2021, scientists found that this wonder material might also provide a solution to the fading of colors of many artistic masterpieces. For instance, several of Georgia O’Keeffe’s oil paintings housed in the Georgia O’Keeffe Museum in Santa Fe, New Mexico, have developed tiny pin-sized blisters, almost like acne, for decades. Conservators have found similar deterioration in oil-based masterpieces across all time periods, including works by Rembrandt.

Van Gogh’s Sunflower series has been fading over the last century due to constant exposure to light. A 2011 study found that chromium in the chrome yellow Van Gogh favored reacted strongly with other compounds like barium and sulfur when exposed to sunlight. A 2016 study pointed the finger at the sulfates, which absorb in the UV spectrum, leading to degradation.

Even contemporary art materials are prone to irreversible color changes from exposure to light and oxidizing agents, among other hazards. That’s why there has been recent work on the use of nanomaterials for conservation of artworks. Graphene has a number of properties that make it attractive for art-conservation purposes. The one-atom-thick material is transparent, adheres easily to various substrates, and serves as an excellent barrier against oxygen, gases (corrosive or otherwise), and moisture. It’s also hydrophobic and is an excellent absorber of UV light.

Graphene-enhanced ceramic tiles make striking art Read More »

simple-voltage-pulse-can-restore-capacity-to-li-si-batteries

Simple voltage pulse can restore capacity to Li-Si batteries

The new work, then, is based on a hypothetical: What if we just threw silicon particles in, let them fragment, and then fixed them afterward?

As mentioned, the reason fragmentation is a problem is that it leads to small chunks of silicon that have essentially dropped off the grid—they’re no longer in contact with the system that shuttles charges into and out of the electrode. In many cases, these particles are also partly filled with lithium, which takes it out of circulation, cutting the battery’s capacity even if there’s sufficient electrode material around.

The researchers involved here, all based at Stanford University, decided there was a way to nudge these fragments back into contact with the electrical system and demonstrated it could restore a lot of capacity to a badly degraded battery.

Bringing things together

The idea behind the new work was that it could be possible to attract the fragments of silicon to an electrode, or at least some other material connected to the charge-handling network. On their own, the fragments in the anode shouldn’t have a net charge; when the lithium gives up an electron there, it should go back into solution. But the lithium is unlikely to be evenly distributed across the fragment, making them a polar material—net neutral, but with regions of higher and lower electron densities. And polar materials will move in an uneven electric field.
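
The physics behind that last claim is the standard force on an electric dipole (a textbook relation, not something taken from this paper): a fragment with a net dipole moment p sitting in an electric field E feels a force

\[ \mathbf{F} = (\mathbf{p} \cdot \nabla)\,\mathbf{E} \]

which is zero in a perfectly uniform field but nonzero wherever the field varies in space. So even a net-neutral silicon fragment, with its lithium unevenly distributed, can be pulled or pushed by the field gradients near an electrode.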

And, because of the uneven, chaotic structure of an electrode down at the nanoscale, any voltage applied to it will create an uneven electric field. Depending on its local structure, that may attract or repel some of the particles. But because these are mostly within the electrode’s structure, most of the fragments of silicon are likely to bump into some other part of the electrode in short order. And that could potentially re-establish a connection to the electrode’s current-handling system.

To demonstrate that what should happen in theory actually does happen in an electrode, the researchers started by taking a used electrode and brushing some of its surface off into a solution. They then passed a voltage through the solution and confirmed the small bits of material from the battery started moving toward one of the electrodes that they used to apply a voltage to the solution. So, things worked as expected.

Simple voltage pulse can restore capacity to Li-Si batteries Read More »