Phoebus 2A, the most powerful space nuclear reactor ever made, was fired up at the Nevada Test Site on June 26, 1968. The test lasted 750 seconds and confirmed the engine could carry the first humans to Mars. But Phoebus 2A did not take anyone to Mars. It was too large, it cost too much, and it didn’t mesh with Nixon’s idea that we had no business going anywhere farther than low-Earth orbit.
But it wasn’t NASA that first called for rockets with nuclear engines. It was the military that wanted to use them for intercontinental ballistic missiles. And now, the military wants them again.
Nuclear-powered ICBMs
The work on nuclear thermal rockets (NTRs) started with the Rover program initiated by the US Air Force in the mid-1950s. The concept was simple on paper. Take tanks of liquid hydrogen and use turbopumps to feed this hydrogen through a nuclear reactor core to heat it up to very high temperatures and expel it through the nozzle to generate thrust. Instead of causing the gas to heat and expand by burning it in a combustion chamber, the gas was heated by coming into contact with a nuclear reactor.
The key advantage was fuel efficiency. “Specific impulse,” a measurement that’s something like the gas mileage of a rocket, could be calculated from the square root of the exhaust gas temperature divided by the molecular weight of the propellant. This meant the most efficient propellant for rockets was hydrogen because it had the lowest molecular weight.
In chemical rockets, hydrogen had to be mixed with an oxidizer, which increased the total molecular weight of the propellant but was necessary for combustion to happen. Nuclear rockets didn’t need combustion and could work with pure hydrogen, which made them at least twice as efficient. The Air Force wanted to efficiently deliver nuclear warheads to targets around the world.
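That square-root relationship makes the hydrogen advantage easy to quantify. Here is a rough back-of-the-envelope sketch (the figures are relative, and the 3,500 K combustion temperature and rounded molecular weights are illustrative assumptions, not values from the historical program):

```python
from math import sqrt

def relative_isp(temp_k: float, molar_mass: float) -> float:
    """Specific impulse scales with sqrt(exhaust temperature / molecular weight).
    Returns a unitless relative figure for comparing propellants."""
    return sqrt(temp_k / molar_mass)

# Pure hydrogen (H2, ~2 g/mol) heated to 3,000 K in an NTR core
ntr = relative_isp(3000, 2.0)

# Water-vapor exhaust (H2O, ~18 g/mol) from hydrogen/oxygen combustion at ~3,500 K
chemical = relative_isp(3500, 18.0)

print(ntr / chemical)  # comes out well above 2, consistent with "at least twice as efficient"
```

Even granting the chemical rocket a hotter combustion chamber, the ninefold difference in molecular weight dominates under the square root.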
The problem was that running stationary reactors on Earth was one thing; making them fly was quite another.
Space reactor challenge
The core of a standard fission reactor is made up of fuel rods containing uranium-235 oxide distributed in a metal or ceramic matrix. Fission happens when a slow-moving neutron is absorbed by a uranium-235 nucleus, splitting it into two lighter nuclei and releasing huge amounts of energy along with excess, very fast neutrons. These excess neutrons normally don’t trigger further fissions, as they move too fast to be absorbed by other uranium nuclei.
Starting a chain reaction that keeps the reactor going depends on slowing them down with a moderator, like water, that “moderates” their speed. This reaction is kept at moderate levels using control rods made of neutron-absorbing materials, usually boron or cadmium, that limit the number of neutrons that can trigger fission. Reactors are dialed up or down by moving the control rods in and out of the core.
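The dynamic the control rods manage can be sketched as a toy calculation (this illustrates only the exponential generation-to-generation behavior; it is not a reactor model, and the multiplication factors are arbitrary examples):

```python
def neutron_population(k: float, generations: int, n0: float = 1.0) -> float:
    """Toy model: each neutron generation multiplies the population by k,
    the effective multiplication factor. k > 1 grows the chain reaction;
    k < 1 (e.g., with control rods inserted) damps it out."""
    return n0 * k ** generations

print(neutron_population(1.01, 100))  # slightly supercritical: population grows
print(neutron_population(0.99, 100))  # slightly subcritical: population decays
```

Moving the rods in or out nudges k across 1.0, which is why small rod movements are enough to dial the reactor up or down.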
Translating any of this to a flying reactor is a challenge. The first problem is the fuel. The hotter you make the exhaust gas, the more you increase specific impulse, so NTRs needed the core to operate at temperatures reaching 3,000 K—nearly 1,800 K higher than ground-based reactors. Manufacturing fuel rods that could survive such temperatures proved extremely difficult.
Then there was the hydrogen itself, which is extremely corrosive at these temperatures, especially when interacting with those few materials that are stable at 3,000 K. Finally, standard control rods had to go, too, because on the ground, they were gravitationally dropped into the core, and that wouldn’t work in flight.
Los Alamos Scientific Laboratory proposed a few promising NTR designs that addressed all these issues in 1955 and 1956, but the program really picked up pace after it was transferred to NASA and the Atomic Energy Commission (AEC) in 1958. There, the idea was rebranded as NERVA, Nuclear Engine for Rocket Vehicle Applications. NASA and the AEC, blessed with a nearly unlimited budget, got busy building space reactors—lots of them.
On April 11, a small company called Graphyte began pumping out beige bricks, somewhat the consistency of particle board, from its new plant in Pine Bluff, Arkansas. The bricks don’t look like much, but they come with a lofty goal: to help stop climate change.
Graphyte, a startup backed by billionaire Bill Gates’ Breakthrough Energy Ventures, will bury its bricks deep underground, trapping carbon there. The company bills it as the largest carbon dioxide removal project in the world.
Scientists have long warned of the dire threat posed by global warming. It’s gotten so bad, though, that the long-sought mitigation, cutting carbon dioxide emissions from every sector of the economy, might not be enough of a fix. To stave off the worst—including large swaths of the Earth exposed to severe heat waves, water scarcity, and crop failures—some experts say there is a deep need to remove previously emitted carbon, too. And that can be done anywhere on Earth—even in places not known for climate-friendly policies, like Arkansas.
Graphyte aims to store carbon that would otherwise be released from plant material as it burns or decomposes at a competitive sub-$100 per metric ton, and it wants to open new operations as soon as possible, single-handedly removing tens of thousands of tons of carbon annually, said Barclay Rogers, the company’s founder and CEO. Even so, that’s nowhere near the amount of carbon that will have to be removed to register as a blip in global carbon emissions. “I’m worried about our scale of deployment,” he said. “I think we need to get serious fast.”
Hundreds of carbon removal startups have popped up over the past few years, but the fledgling industry has made little progress so far. That leads to the inevitable question: Could Graphyte and companies like it actually play a major role in combating climate change? And will a popular business model among these companies, inviting other companies to voluntarily buy “carbon credits” for those buried bricks, actually work?
Whether carbon emissions are cut to begin with, or pulled out of the atmosphere after they’ve already been let loose, climate scientists stress that there is no time to waste. The clock began ticking years ago, with the arrival of unprecedented fires and floods, superstorms, and intense droughts around the world. But carbon removal, as it’s currently envisioned, also poses additional sociological, economic, and ethical questions. Skeptics, for instance, say it could discourage more pressing efforts on cutting carbon emissions, leaving some experts wondering whether it will even work at all.
Still, the Intergovernmental Panel on Climate Change, the world’s forefront group of climate experts, is counting on carbon removal technology to dramatically scale up. If the industry is to make a difference, experimentation and research and development should be done quickly, within the next few years, said Gregory Nemet, a professor of public affairs who studies low-carbon innovation at the University of Wisconsin-Madison. “Then after that is the time to really start going big and scaling up so that it becomes climate-relevant,” he added. “Scale-up is a big challenge.”
For nearly 20 years, scientists have known an asteroid named Apophis will pass unusually close to Earth on Friday, April 13, 2029. But most officials at the world’s space agencies stopped paying much attention when updated measurements ruled out the chance Apophis will impact Earth anytime soon.
Now, Apophis is again on the agenda, but this time as a science opportunity, not as a threat. The problem is there’s not much time to design, build, and launch a spacecraft to get into position near Apophis in less than five years. The good news is there are designs, and in some cases, existing spacecraft, that governments can repurpose for missions to Apophis, a rocky asteroid about the size of three football fields.
Scientists discovered Apophis in 2004, and the first measurements of its orbit indicated there was a small chance it could strike Earth in 2029 or in 2036. Using more detailed radar observations of Apophis, scientists in 2021 ruled out any danger to Earth for at least the next 100 years.
“The three most important things about Apophis are: It will miss the Earth. It will miss the Earth. It will miss the Earth,” said Richard Binzel, a professor of planetary science at MIT. Binzel has co-chaired several conferences since 2020 aimed at drumming up support for space missions to take advantage of the Apophis opportunity in 2029.
“An asteroid this large comes this close only once per 1,000 years, or less frequently,” Binzel told Ars. “This is an experiment that nature is doing for us, bringing a large asteroid this close, such that Earth’s gravitational forces and tidal forces are going to tug and possibly shake this asteroid. The asteroid’s response is insightful to its interior.”
It’s important, Binzel argues, to get a glimpse of Apophis before and after its closest approach in 2029, when it will pass less than 20,000 miles (32,000 kilometers) from Earth’s surface, closer than the orbits of geostationary satellites.
“This is a natural experiment that will reveal how hazardous asteroids are put together, and there is no other way to get this information without vastly complicated spacecraft experiments,” Binzel said. “So this is a once-per-many-thousands-of-years experiment that nature is doing for us. We have to figure out how to watch.”
This week, the European Space Agency announced preliminary approval for a mission named RAMSES, which would launch in April 2028, a year ahead of the Apophis flyby, to rendezvous with the asteroid in early 2029. If ESA member states grant full approval for development next year, the RAMSES spacecraft will accompany Apophis throughout its flyby of Earth, collecting imagery and other scientific measurements before, during, and after closest approach.
The challenge of building and launching RAMSES in less than four years will serve as good practice for a potential future real-world scenario. If astronomers find an asteroid that’s really on a collision course with Earth, it might be necessary to respond quickly. Given enough time, space agencies could mount a reconnaissance mission, and if necessary, a mission to deflect or redirect the asteroid, likely using a technique similar to the one demonstrated by NASA’s DART mission in 2022.
“RAMSES will demonstrate that humankind can deploy a reconnaissance mission to rendezvous with an incoming asteroid in just a few years,” said Richard Moissl, head of ESA’s planetary defense office. “This type of mission is a cornerstone of humankind’s response to a hazardous asteroid. A reconnaissance mission would be launched first to analyze the incoming asteroid’s orbit and structure. The results would be used to determine how best to redirect the asteroid or to rule out non-impacts before an expensive deflector mission is developed.”
Shaking off the cobwebs
In order to make a 2028 launch feasible for RAMSES, ESA will reuse the design of a roughly half-ton spacecraft named Hera, which is scheduled for launch in October on a mission to survey the binary asteroid system targeted by the DART impact experiment in 2022. Copying the design of Hera will reduce the time needed to get RAMSES to the launch pad, ESA officials said.
“Hera demonstrated how ESA and European industry can meet strict deadlines and RAMSES will follow its example,” said Paolo Martino, who leads ESA’s development of RAMSES, which stands for the Rapid Apophis Mission for Space Safety.
ESA’s space safety board recently authorized preparatory work on the RAMSES mission using funds already in the agency’s budget. OHB, the German spacecraft manufacturer that is building Hera, will also lead the industrial team working on RAMSES. The cost of RAMSES will be “significantly lower” than the 300-million-euro ($380 million) cost of the Hera mission, Martino wrote in an email to Ars.
“There is still so much we have yet to learn about asteroids but, until now, we have had to travel deep into the Solar System to study them and perform experiments ourselves to interact with their surface,” said Patrick Michel, a planetary scientist at the French National Center for Scientific Research, and principal investigator on the Hera mission.
“For the first time ever, nature is bringing one to us and conducting the experiment itself,” Michel said in a press release. “All we need to do is watch as Apophis is stretched and squeezed by strong tidal forces that may trigger landslides and other disturbances and reveal new material from beneath the surface.”
Assuming it gets the final go-ahead next year, RAMSES will join NASA’s OSIRIS-APEX mission in exploring Apophis. NASA is steering the spacecraft, already in space after its use on the OSIRIS-REx asteroid sample return mission, toward a rendezvous with Apophis in 2029, but it won’t arrive at its new target until a few weeks after its close flyby of Earth. The intricacies of orbital mechanics prevent a rendezvous with Apophis any earlier.
Observations from OSIRIS-APEX, a larger spacecraft than RAMSES with a sophisticated suite of instruments, “will deliver a detailed look of what Apophis is like after the Earth encounter,” Binzel said. “But until we establish the state of Apophis before the Earth encounter, we have only one side of the picture.”
Scientists are also urging NASA to consider launching a pair of mothballed science probes on a trajectory to fly by Apophis sometime before its April 2029 encounter with Earth. These two spacecraft were built for NASA’s Janus mission, which the agency canceled last year after the mission fell victim to launch delays with NASA’s larger Psyche asteroid explorer. The Janus probes were supposed to launch on the same rocket as Psyche, but problems with the Psyche mission forced a launch delay of more than a year.
Despite the delay, Psyche could still reach its destination in the asteroid belt, but the new launch trajectory meant Janus would be unable to visit the two binary asteroids scientists originally wanted to explore with the probes. After spending nearly $50 million on the mission, NASA put the twin Janus spacecraft, each about the size of a suitcase, into long-term storage.
At the most recent workshop on Apophis missions in April, scientists heard presentations on more than 20 concepts for spacecraft and instrument measurements at Apophis.
They included an idea from Blue Origin, Jeff Bezos’s space company, to use its Blue Ring space tug as a host platform for multiple instruments and landers that could descend to the surface of Apophis, assuming research institutions have enough time and money to develop their payloads. A startup named Exploration Laboratories has proposed partnering with NASA’s Jet Propulsion Laboratory on a small spacecraft mission to Apophis.
“At the conclusion of the workshop, it was my job to try to bring forward some consensus, because if we don’t have some consensus on our top priority, we may end up with nothing,” Binzel said. “The consensus recommendation for ESA was to move forward with RAMSES.”
Workshop participants also gently nudged NASA to use the Janus probes for a mission to Apophis. “Apophis is a mission in search of a spacecraft, and Janus is a spacecraft in search of a mission,” Binzel said. “As a matter of efficiency and basic logic, Janus to Apophis is the highest priority.”
A matter of money
But NASA’s science budget, and especially funding for its planetary science division, is under stress. Earlier this week, NASA canceled an already-built lunar rover named VIPER after spending $450 million on the mission. The mission had exceeded its original development cost by more than 30 percent, prompting an automatic cancellation review.
The funding level for NASA’s science mission directorate this year is nearly $500 million less than last year’s budget, and $900 million below the White House’s budget request for fiscal year 2024. Because of the tight budget, NASA officials have said, for now, they are not starting development of any new planetary science missions as they focus on finishing projects already in the pipeline, like the Europa Clipper mission, the Dragonfly quadcopter to visit Saturn’s moon Titan, and the Near-Earth Object (NEO) Surveyor telescope to search for potentially hazardous asteroids.
NASA has asked the Janus team to look at the feasibility of launching on the same rocket as NEO Surveyor in 2027, according to Dan Scheeres, the Janus principal investigator at the University of Colorado. With such a launch in 2027, Janus could capture the first up-close images of Apophis before RAMSES and OSIRIS-APEX get there.
“This is something that we’re currently presenting in some discussions with NASA, just to make sure that they understand what the possibilities are there,” Scheeres said in a meeting last week of the Small Bodies Advisory Group, which represents the asteroid science community.
“These spacecraft are capable of performing future scientific flyby missions to near-Earth asteroids,” Scheeres said. “Each spacecraft has a high-quality Malin visible imager and a thermal infrared imager. Each spacecraft has the ability to track and image an asteroid system through a close, fast flyby.”
“The scientific return from an Apophis flyby by Janus could be one of the best opportunities out there,” said Daniella DellaGiustina, lead scientist on the OSIRIS-APEX mission from the University of Arizona.
Binzel, who has led the charge for Apophis missions, said there is also some symbolic value to having a spacecraft escort the asteroid by Earth. Apophis will be visible in the skies over Europe and Africa when it is closest to our planet.
“When 2 billion people are watching this, they are going to ask, ‘What are our space agencies doing?’ And if the answer is, ‘Oh, we’ll be there. We’re getting there,’ which is OSIRIS-APEX, I don’t think that’s a very satisfying answer,” Binzel said.
“As the international space community, we want to demonstrate on April 13, 2029, that we are there and we are watching, and we are watching because we want to gain the most knowledge and the most understanding about these objects that is possible, because someday it could matter,” Binzel said. “Someday, our detailed knowledge of hazardous asteroids would be among the most important knowledge bases for the future of humanity.”
Let me get three negative points about the Gazelle Eclipse out of the way first. First, it’s a 62-pound e-bike, so it’s tough to get moving without its battery. Second, its rack is a thick, non-standard size, so you might need new bags for it. Third—and this is the big one—with its $6,000 suggested retail price, it’s expensive, and you will probably feel nervous about locking it anywhere you don’t completely trust.
Apart from those issues, though, this e-bike is great fun. When I rode the Eclipse (the C380+ HMB version of it), I felt like Batman on a day off, or maybe Bruce Wayne doing reconnaissance as a bike enthusiast. The matte gray color, the black hardware, and the understated but impressively advanced tech certainly helped. But I felt prepared to handle anything that was thrown at me without having to think about it much. Brutally steep hills, poorly maintained gravel paths, curbs, stop lights, or friends trying to outrun me on their light road bikes—the Eclipse was ready.
It assists up to 28 miles per hour (i.e., Class 3) and provides up to 85 Nm of torque, and the front suspension absorbs shocks without shaking your grip confidence. It has integrated lights, the display can show you navigation while your phone is tucked away, and the automatic assist changing option balances your mechanical and battery levels, leaving you to just pedal and look.
What kind of bike is this? A fun one.
The Eclipse comes in two main variants: the 11-speed, chain-and-derailleur T11+ HMB and the C380+ HMB, with its stepless Enviolo hub and Gates Carbon belt. Both come in three sizes (45, 50, and 55 cm), in one of two colors (Anthracite Grey or Thyme Green for the T11+, Anthracite Grey or Metallic Orange for the C380+), and in either a low-step or high-step version, the latter with a sloping top bar. Most e-bikes come in two sizes if you’re lucky, typically “Medium” and “Large,” and their suggested height spans are far too generous. The T11+ starts at $5,500 and the C380+ starts at $6,000.
The Eclipse’s posture is an “active” one, seemingly halfway between the upright Dutch style and a traditional road or flat-bar bike. It’s perfect for this kind of ride. The front shocks have a maximum of 75 mm of travel, which won’t impress your buddies riding real trails but will make gravel, dirt, wooden bridges, and woodland trails a possibility. Everything about the Eclipse tells you to stop worrying about whether you have the right kind of bike for a ride and just start pedaling.
“But I’m really into exercise riding, and I need lots of metrics and data, during and after the ride,” I hear some of you straw people saying. That’s why the Eclipse has the Bosch Kiox 300, a center display that is, for an e-bike, remarkably readable, navigable, and informative. You can see your max and average speed, distance, which assist levels you spent time in, power output, cadence, and more. You can push navigation directions from Komoot or standard maps apps from your phone to the display, using Bosch’s Flow app. And, of course, you can connect to Strava.
Halfway between maximum efficiency and careless joyriding, the Eclipse offers a feature that I can only hope makes it down to cheaper e-bikes over time: automatic assist changing. Bikes that have both gears and motor assist levels can sometimes leave you guessing as to which one you should change when approaching a hill or starting from a dead stop. Set the Eclipse to automatic assist and you only have to worry about the right-hand grip shifter. There are no gear numbers; there is a little guy on a bike, and as you raise or lower the gearing, the road he’s approaching gets steeper or flatter.
If there’s anything the Deadpool franchise is known for, it’s R-rated cheeky irreverence. The forthcoming Deadpool and Wolverine clearly has that in spades, but the final trailer strikes an uncharacteristically somber note, reminding us just what Wade Wilson/Deadpool stands to lose if Wolverine can’t rise to the challenge. Bonus: There’s a surprise cameo from Hugh Jackman’s co-star in Logan.
As previously reported, Ryan Reynolds found the perfect fit with 2016’s Deadpool, starring as Wade Wilson, a former Canadian special forces operative (dishonorably discharged) who develops regenerative healing powers that heal his cancer but leave him permanently disfigured with scars all over his body. Wade decides to become a masked vigilante, turning down an invitation to join the X-Men and abandon his bad-boy ways. The first Deadpool was a big hit, racking up $782 million at the global box office, critical praise, and a couple of Golden Globe nominations for good measure. Deadpool 2 was released in 2018 and was just as successful.
Deadpool and Wolverine reunites Reynolds with many familiar faces from the first two films. Morena Baccarin is back as Wade’s girlfriend Vanessa, along with Leslie Uggams as Blind Al; Karan Soni as Wade’s personal chauffeur, taxi driver Dopinder; Brianna Hildebrand as Negasonic Teenage Warhead; Stefan Kapičić as the voice of Colossus; Shioli Kutsuna as Negasonic’s mutant girlfriend, Yukio; Randal Reeder as Buck; and Lewis Tan as X-Force member Shatterstar.
Along with Sabretooth, the mutants Toad and Dogpool should be on hand to make some trouble. New to the franchise are Matthew MacFadyen as a Time Variance Authority agent named Paradox and Emma Corrin as the lead villain. There have been rumors that Owen Wilson’s Mobius and the animated Miss Minutes from Loki may also appear in the film.
Marvel released a two-minute teaser for the new movie during the Super Bowl in February, featuring the trademark cheeky irreverence that made audiences embrace Reynolds’ R-rated superhero in the first place, plus a glimpse of Hugh Jackman’s Wolverine—or rather, his distinctive shadow. And yes, Marvel is retaining that R rating—a big step given that all the prior MCU films have been resoundingly PG-13. Marvel dropped a full trailer in April that was chock-full of off-color witticisms, meta-references, slo-mo action, and a generous sprinkling of F-bombs. And last month, another one-minute trailer dropped with a surprise appearance: Sabretooth, played by the same actor, Tyler Mane, who portrayed the character in 2000’s X-Men.
Someone needs a pep talk
This final trailer takes an entirely different tone, playing like a love letter to the Wolverine of the X-Men franchise. It’s basically Wade/Deadpool having a bona fide heart-to-heart with the Wolverine in this alternate timeline. “I know I turn everything into a joke, but I care,” he says in a voiceover accompanying footage from prior X-Men films, rendered in nostalgic gray tones. “I waited a long time for this team-up. In my world, you’re well-regarded. You were an X-Man. Fuck that, you were THE X-Man. The Wolverine. He was a hero in my world.”
Wolverine’s response: “Yeah well, he ain’t shit in mine.” We learn that this version of Wolverine resisted all attempts to persuade him to officially don the suit and join the X-Men and now he believes it’s just too late to make a difference. As a last-ditch effort, Wade shows him a picture of the nine people he loves who make up his entire world and tells him he has no idea how to save them—but Wolverine does.
And who shows up at the crucial moment but Dafne Keen’s Laura Kinney, the cloned mutant formerly known as X-23, who inherited the Wolverine mantle in the comics. When Wolverine insists they’ve got the wrong guy, she replies, “You were always the wrong guy… until you weren’t.” We’re betting Wolverine’s gonna step up.
Deadpool and Wolverine hits theaters on July 26, 2024.
Continuing to evolve the fact-checking service that launched as Twitter’s Birdwatch, X has announced that Community Notes can now be requested to clarify problematic posts spreading on Elon Musk’s platform.
X’s Community Notes account confirmed late Thursday that, due to “popular demand,” X had launched a pilot test on the web-based version of the platform. The test is active now and the same functionality will be “coming soon” to Android and iOS, the Community Notes account said.
Through the current web-based pilot, if you’re an eligible user, you can click on the “•••” menu on any X post on the web and request fact-checking from one of Community Notes’ top contributors, X explained. If X receives five or more requests within 24 hours of the post going live, a Community Note will be added.
Only X users with verified phone numbers will be eligible to request Community Notes, X said, and to start, users will be limited to five requests a day.
“The limit may increase if requests successfully result in helpful notes, or may decrease if requests are on posts that people don’t agree need a note,” X’s website said. “This helps prevent spam and keep note writers focused on posts that could use helpful notes.”
Once X receives five or more requests for a Community Note within a single day, top contributors with diverse views will be alerted to respond. On X, top contributors are constantly changing, as their notes are voted as either helpful or not. If at least 4 percent of their notes are rated “helpful,” X explained on its site, and the impact of their notes meets X standards, they can be eligible to receive alerts.
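The request flow X describes — five or more requests within 24 hours of a post going live triggering alerts to top contributors — can be sketched roughly like this. All names and structure here are illustrative, based only on the behavior described above, not on X’s actual implementation:

```python
from datetime import datetime, timedelta

REQUEST_THRESHOLD = 5          # five or more requests trigger contributor alerts
WINDOW = timedelta(hours=24)   # counted from when the post went live
DAILY_REQUEST_LIMIT = 5        # per-user cap at the start of the pilot

def should_alert_contributors(post_time: datetime,
                              request_times: list[datetime]) -> bool:
    """True if enough note requests arrived within 24 hours of the post."""
    in_window = [t for t in request_times
                 if post_time <= t <= post_time + WINDOW]
    return len(in_window) >= REQUEST_THRESHOLD
```

Requests that trickle in after the 24-hour window closes don’t count toward the threshold, which keeps contributor attention on posts that are actively spreading.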
“A contributor’s Top Writer status can always change as their notes are rated by others,” X’s website said.
Ultimately, X considers notes helpful if they “contain accurate, high-quality information” and “help inform people’s understanding of the subject matter in posts,” X said on another part of its site. To gauge the former, X said that the platform partners with “professional reviewers” from the Associated Press and Reuters. X also continually monitors whether notes marked helpful by top writers match what general X users marked as helpful.
“We don’t expect all notes to be perceived as helpful by all people all the time,” X’s website said. “Instead, the goal is to ensure that on average notes that earn the status of Helpful are likely to be seen as helpful by a wide range of people from different points of view, and not only be seen as helpful by people from one viewpoint.”
X will also be allowing half of the top contributors to request notes during the pilot phase, which X said will help the platform evaluate “whether it is beneficial for Community Notes contributors to have both the ability to write notes and request notes.”
According to X, the criteria for requesting a note have intentionally been designed to be simple during the pilot stage, but X expects “these criteria to evolve, with the goal that requests are frequently found valuable to contributors, and not noisy.”
It’s hard to tell from the outside looking in how helpful Community Notes are to X users. The most recent Community Notes survey data that X points to is from 2022 when the platform was still called Twitter and the fact-checking service was still called Birdwatch.
That data showed that “on average,” users were “20–40 percent less likely to agree with the substance of a potentially misleading Tweet than someone who sees the Tweet alone.” And based on Twitter’s “internal data” at that time, the platform also estimated that “people on Twitter who see notes are, on average, 15–35 percent less likely to Like or Retweet a Tweet than someone who sees the Tweet alone.”
Although US coal consumption has fallen dramatically since 2005, the country still consumes millions of tons a year, and exports tons more—much of it transported by train. Now, new research shows that these trains can affect the health of people living near where they pass.
The study found that residents living near railroad tracks likely have higher premature mortality rates due to air pollutants released during the passage of uncovered coal trains. The analysis of the San Francisco Bay Area cities of Oakland, Richmond, and Berkeley shows that increases in air pollutants such as small particulate matter (PM 2.5) are also associated with increases in asthma-related episodes and hospital admissions.
“This has never been studied in the world. There’s been a couple studies trying to measure just the air pollution, usually in rural areas, but this was the first to both measure air pollution and trains in an urban setting,” said Bart Ostro, author of the study and an epidemiologist at the University of California, Davis.
Persistent coal pollution
Trains carry nearly 70 percent of coal shipments in the United States, leaving a trail of pollution in their wake. And coal exports will have a similar impact during transit. Ostro explained that when uncovered coal trains travel, the coal particles disperse around the railroad tracks. Levels of PM 2.5 “[spread] almost a mile away,” he added.
As a result, the mere passage of coal trains could affect the health of surrounding communities. Ostro was particularly concerned about how these pollutants could harm vulnerable populations living near the coal export terminal in Richmond. Previous census data had already shown that those in Richmond who live around the rail line have mortality rates 10 to 50 percent higher than the county average. Communities in Oakland could be at risk, too, since discussions are underway to build a new coal export terminal in the region.
But before researchers could study the health effects of these air pollutants, they first had to understand how much was spread by passing trains. This was a challenge in itself because coal trains aren’t scheduled like regular passenger trains.
To ensure that researchers could measure all trains and pollutants, Ostro and his team developed a monitoring system with three main components: a weather station to provide meteorological parameters, an air quality sensor to track air pollution levels, and an AI-trained camera to recognize coal trains. The trained cameras were critical to the entire project, identifying different types of trains: full coal trains, empty coal trains, freight trains, and passenger trains.
With the system in place, Ostro’s team measured pollution levels and was able to attribute them directly to coal trains. Their results, published last year, showed that coal trains and terminal operations added a significant amount of PM 2.5 pollution to urban areas, more than other freight or passenger trains. Passing coal trains added an average of eight μg/m³ to ambient pollution, two to three micrograms more than freight trains contribute. Even empty coal cars increase pollution levels due to traces of coal dust.
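The attribution arithmetic is simple in principle: subtract the baseline ambient reading from readings taken while a train passes, then average the increments by train type. Here is a minimal sketch in Python with entirely illustrative numbers; none of these figures come from the study itself.

```python
# Hypothetical sketch of attributing PM 2.5 increments to train type.
# All readings and the baseline are invented for illustration.

def mean_increment(readings, baseline):
    """Average PM 2.5 added above baseline, in μg/m³."""
    return sum(r - baseline for r in readings) / len(readings)

baseline = 10.0  # ambient PM 2.5 with no train present (illustrative)

# Readings taken during train passages, grouped by camera-identified type
passages = {
    "full_coal":  [18.5, 17.2, 19.1],
    "freight":    [15.1, 15.8, 15.6],
    "empty_coal": [12.9, 13.4, 13.1],  # residual dust from empty cars
}

for kind, readings in passages.items():
    print(f"{kind}: +{mean_increment(readings, baseline):.1f} μg/m³")
```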
Particulate problems
This year, in a follow-up study, researchers combined these findings with US Census data and health studies to understand how this increase might affect local communities. They estimated that more than 260,000 people would be exposed to some increase in annual PM 2.5, and that such exposure was associated with significant mortality and morbidity.
Health effects were quantified for three different scenarios based on different wind conditions. In the worst-case scenario, where there’s an increase of about two μg/m³ near the railway line, modeling suggests that premature mortality would increase by 1.3 percent. Hospital admissions for conditions such as chronic lung disease, pneumonia, and cardiovascular disease would also increase by 4.7 percent, 6.2 percent, and 2.2 percent, respectively. Although these are relatively small numbers in a small population, Ostro points out that they could be extrapolated to larger populations in other countries.
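Epidemiologists often express this kind of concentration-response relationship as a relative risk of the form RR = exp(β·Δc), where β is a concentration-response coefficient and Δc the pollution increase. As a rough illustration only, the coefficient below is back-solved from the article’s worst-case figures (a roughly two μg/m³ increase mapping to a 1.3 percent rise in premature mortality); it is not the coefficient the study actually used.

```python
import math

# Illustrative concentration-response arithmetic, not the study's model.
def excess_risk_percent(beta, delta_c):
    """Percent increase in an outcome for a PM 2.5 increase of delta_c."""
    return (math.exp(beta * delta_c) - 1) * 100

# Back-solved so that a 2 μg/m³ increase gives ~1.3 percent excess
# mortality, matching the article's worst-case scenario.
beta_mortality = math.log(1.013) / 2.0

print(f"{excess_risk_percent(beta_mortality, 2.0):.1f}%")  # prints "1.3%"
```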
“The way I see it, this is a microcosm of what could be happening globally,” he added. While coal use—and the transportation of that coal—is declining in the US and the European Union, the same isn’t happening everywhere. In countries like China and India, for example, coal use is increasing, and populations living near the railroads that transport that coal could be at risk.
“These findings have major implications beyond San Francisco and the US,” said Michel Abramson from Monash University in Australia, who wasn’t involved in the study. The researcher thinks Ostro’s assessment “fills an important gap” by looking at the health effects of transporting coal in uncovered rail cars but doesn’t think there are any solutions to mitigate the problem other than stopping the use of coal.
“Covering the coal cars might not solve the problem, because it could increase the risk of fires,” he added. “Ultimately the world needs to phase out the mining, transport, and combustion of coal, not only to reduce the risks of climate change, but also to improve the health of the population.”
Bárbara Pinho is a science journalist specializing in climate, health, and agriculture, based in Porto, Portugal. Learn more about her work at barbarapinho.com or follow her on X (formerly Twitter) @BarbPinho
After every one of its house-brand phones, and even its new wall charger, has been meticulously photographed, sized, and rated for battery capacity, what should Google do to keep the anticipation up for the Pixel 9 series’ August 13 debut?
Lean into it, it seems, and Google is doing so with an eye toward further promoting its Gemini-based AI aims. In a video post on X (formerly Twitter), Google describes a “phone built for the Gemini era,” one that can, through the power of Gemini, “even let your old phone down easy” with a breakup letter. The camera pans out, and the shape of the Pixel 9 Pro appears and turns around to show off the now-standard Pixel camera bar across the upper back.
There’s also a disclaimer to this tongue-in-cheek request for a send-off to a phone that is “just the same old thing”: “Screen simulated. Limitations apply. Check responses for accuracy.”
Over at the Google Store, you can see a static image of the Pixel 9 Pro and sign up for alerts about its availability. The image confirms that the photos taken by Taiwanese regulatory authority NCC were legitimate, right down to the coloring on the back of the Pixel 9 Pro and the camera and flash placement.
Those NCC photos confirmed that Google intends to launch four different phone-ish devices at its August 13 “Made by Google” event. The Pixel 9 and Pixel 9 Pro are both roughly 6.1-inch devices, but the Pro will likely offer more robust Gemini AI integration due to increased RAM and other spec bumps. The Pixel 9 Pro XL should have similarly AI-ready specs, just in a larger size. And the Pixel 9 Pro Fold is an iteration on Google’s first Pixel Fold model, with seemingly taller dimensions and a daringly smaller battery.
Full suspension mountain bikes are complicated beasts, with sections of the frame that pivot and a shock absorber to moderate that pivot. These parts help limit the bumps that reach your body and keep your rear tire in contact with the trail across all sorts of terrain and obstacles. The complexity and additional parts, however, boost the costs of full suspension bikes considerably, a situation that only gets worse when you electrify things.
But there’s one easy way to lower the price considerably: lose the full suspension. The electric “hardtails” from major manufacturers typically cost considerably less than a full suspension bike with similar components. And because the engineering demands are considerably lower than in a full suspension bike, it’s easier for some of the smaller e-bike companies to put together a solid offering.
So over the course of the spring and into the summer, I’ve been testing two hardtail mountain bikes that were recently introduced by e-bike specialists. First up is the Aventon Ramblas.
The hardware
Aventon is one of the larger dedicated e-bike makers and offers a wide range of bikes at competitive prices. Most of them fall into a sort of generic “commuter” category, though; the Ramblas is the first offering from the company made for a specific audience (though it’s also categorized as a commuter option on the company’s website). It’s also the first bike the company is offering above the $2,000 price point. At $2,899, it’s actually more expensive than one of the electric hardtail models being cleared out by Trek, a company that does not have a reputation for affordability.
What do you get for that price? Solid low/mid-range components from SRAM, including its NX Eagle drivetrain. There’s a dropper seatpost, a front suspension fork from RockShox, and Maxxis tires. The fork is coil-based, so it doesn’t offer much in the way of adjustment—what you start the ride with is pretty much what you’ll spend the entire ride experiencing, unlike many alternatives that let you firm up the ride for pavement. (It has a rebound adjustment at the bottom of the fork, but the effects are subtle.) Aventon’s website doesn’t say who makes the rims, and the rims themselves carry no manufacturer markings.
Overall, it’s about what you’d expect from an entry-level offering. I don’t have any concerns about the durability of the components, and their performance was mostly fine. The one thing that did concern me was the plastic cover over the battery, which didn’t fit against the frame snugly and was only held in place by relatively weak contacts at each end. It’s enough to handle some water splashed off the front wheel, but I wouldn’t trust it to protect the battery while fording anything significant.
The saddle and pedals are matters of personal taste, and many people will argue they’re irrelevant because any serious cyclist will want to replace them anyway. But that’s far less likely to be true on the budget end of the scale, so I did most of my riding on what came with the bike. The pedals, while lacking the threatening-looking screws of serious mountain bike offerings, worked out fine when paired with a sticky set of mountain bike shoes, though I felt I had a bit more confidence going over bumps on a ride where I swapped in my clipless pedals.
The saddle, however, was a problem, in part because the frame was a bit too small for my relatively long legs. The saddle has a relatively slick surface that, when combined with my road biking shorts, meant I tended to slide toward the back of the seat over time. A better-fitting frame might have solved this issue (the large version was supposedly rated up to my height, but I clearly should have gone for the XL).
Speaking of the frame, Aventon has detailed measurements of the geometry available if those make sense to you. But my experience was that the bike was fairly compact in the seat-to-handlebar dimension, leaving me feeling that I was leaning over the handlebars a bit more than I do in other bikes. It wasn’t uncomfortable; it just felt different.
“Language is a huge field, and we are novices in this. We know a lot about how different areas of the brain are involved in linguistic tasks, but the details are not very clear,” says Mohsen Jamali, a computational neuroscience researcher at Harvard Medical School who led a recent study into the mechanism of human language comprehension.
“What was unique in our work was that we were looking at single neurons. There is a lot of studies like that on animals—studies in electrophysiology, but they are very limited in humans. We had a unique opportunity to access neurons in humans,” Jamali adds.
Probing the brain
Jamali’s experiment involved playing recorded sets of words to patients who, for clinical reasons, had implants that monitored the activity of neurons located in their left prefrontal cortex—the area that’s largely responsible for processing language. “We had data from two types of electrodes: the old-fashioned tungsten microarrays that can pick the activity of a few neurons; and the Neuropixel probes which are the latest development in electrophysiology,” Jamali says. The Neuropixels were first inserted into human patients in 2022 and can record the activity of over a hundred neurons.
“So we were in the operation room and asked the patient to participate. We had a mixture of sentences and words, including gibberish sounds that weren’t actual words but sounded like words. We also had a short story about Elvis,” Jamali explains. He said the goal was to figure out if there was some structure to the neuronal response to language. Gibberish words were used as a control to see if the neurons responded to them in a different way.
“The electrodes we used in the study registered voltage—it was a continuous signal at 30 kHz sampling rate—and the critical part was to dissociate how many neurons we had in each recording channel. We used statistical analysis to separate individual neurons in the signal,” Jamali says. Then, his team synchronized the neuronal activity signals with the recordings played to the patients down to a millisecond and started analyzing the data they gathered.
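As a rough illustration of the first step of that dissociation, the sketch below detects threshold crossings in a simulated 30 kHz voltage trace; real spike-sorting pipelines then cluster the waveform shapes around each crossing to assign spikes to individual neurons. Everything here (the threshold, the noise level, the injected spikes) is invented for the example and is not the statistical analysis the team used.

```python
import numpy as np

FS = 30_000  # sampling rate in Hz, as described in the article

def detect_spikes(voltage, threshold):
    """Indices where the trace first drops below a negative threshold."""
    v = np.asarray(voltage)
    # A crossing is a sample above threshold followed by one at or below it
    crossings = np.where((v[:-1] > threshold) & (v[1:] <= threshold))[0]
    return crossings + 1

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.5, FS)        # 1 second of background noise
trace[[5_000, 12_000, 20_000]] = -8.0   # three injected "spikes"

spike_indices = detect_spikes(trace, -5.0)
print(spike_indices / FS)  # spike times in seconds
```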
Putting words in drawers
“First, we translated words in our sets to vectors,” Jamali says. Specifically, his team used Word2Vec, a technique used in computer science to find relationships between words in a large corpus of text. Word2Vec can tell whether certain words have something in common—whether they are synonyms, for example. “Each word was represented by a vector in a 300-dimensional space. Then we just looked at the distance between those vectors and if the distance was close, we concluded the words belonged in the same category,” Jamali explains.
Then the team used these vectors to identify words that clustered together, which suggested they had something in common (something they later confirmed by examining which words were in a cluster together). They then determined whether specific neurons responded differently to different clusters of words. It turned out they did.
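The clustering idea can be sketched in a few lines: represent each word as a vector and group words whose vectors point in nearly the same direction. The toy 3-dimensional vectors below stand in for Word2Vec’s 300-dimensional embeddings and are invented so that the animal words and the weather words pair up; the similarity threshold is likewise arbitrary.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, ~0 for unrelated ones."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Invented stand-ins for Word2Vec embeddings (3-D instead of 300-D)
words = {
    "cat":  np.array([0.9, 0.1, 0.0]),
    "dog":  np.array([0.8, 0.2, 0.1]),
    "rain": np.array([0.1, 0.9, 0.2]),
    "snow": np.array([0.0, 0.8, 0.3]),
}

# Greedy clustering: join an existing cluster if the word's vector is
# close to that cluster's first member, otherwise start a new cluster.
clusters = []
for word, vec in words.items():
    for cluster in clusters:
        if cosine(vec, words[cluster[0]]) > 0.9:
            cluster.append(word)
            break
    else:
        clusters.append([word])

print(clusters)  # prints [['cat', 'dog'], ['rain', 'snow']]
```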
“We ended up with nine clusters. We looked at which words were in those clusters and labeled them,” Jamali says. It turned out that each cluster corresponded to a neat semantic domain. Specialized neurons responded to words referring to animals, while other groups responded to words referring to feelings, activities, names, weather, and so on. “Most of the neurons we registered had one preferred domain. Some had more, like two or three,” Jamali explained.
The mechanics of comprehension
The team also tested if the neurons were triggered by the mere sound of a word or by its meaning. “Apart from the gibberish words, another control we used in the study was homophones,” Jamali says. The idea was to test if the neurons responded differently to the word “sun” and the word “son,” for example.
It turned out that the response changed based on context. When the sentence made it clear the word referred to a star, the sound activated the neurons that preferred weather phenomena. When it was clear that the same sound referred to a person, it activated the neurons that preferred words for relatives. “We also presented the same words at random without any context and found that it didn’t elicit as strong a response as when the context was available,” Jamali says.
But language processing in our brains must involve more than just different groups of neurons handling different semantic categories.
“There are many unanswered questions in linguistic processing. One of them is how much a structure matters, the syntax. Is it represented by a distributed network, or can we find a subset of neurons that encode structure rather than meaning?” Jamali asked. Another thing his team wants to study is what the neural processing looks like during speech production, in addition to comprehension. “How are those two processes related in terms of brain areas and the way the information is processed,” Jamali adds.
The last thing—and according to Jamali the most challenging thing—is using the Neuropixel probes to see how information is processed across different layers of the brain. “The Neuropixel probe travels through the depths of the cortex, and we can look at the neurons along the electrode and say like, ‘OK, the information from this layer, which is responsible for semantics, goes to this layer, which is responsible for something else.’ We want to learn how much information is processed by each layer. This should be challenging, but it would be interesting to see how different areas of the brain are involved at the same time when presented with linguistic stimuli,” Jamali concludes.
A British judge is referring self-proclaimed bitcoin inventor Craig Wright to the Crown Prosecution Service (CPS) to consider criminal charges of perjury and forgery. The judge said that CPS can decide whether Wright should be arrested and granted two injunctions that prohibit Wright from re-litigating his claim to be bitcoin inventor Satoshi Nakamoto.
“I have no doubt that I should refer the relevant papers in this case to the CPS for consideration of whether a prosecution should be commenced against Dr. Wright for his wholescale perjury and forgery of documents and/or whether a warrant for his arrest should be issued and/or whether his extradition should be sought from wherever he now is. All those matters are to be decided by the CPS,” Justice James Mellor of England’s High Court of Justice wrote in a ruling issued today.
If Wright actually believes he is Nakamoto, “he is deluding himself,” Mellor wrote.
Mellor previously found that Wright “lied repeatedly and extensively” and forged documents “on a grand scale” in a case related to Wright’s claim that he is Nakamoto. The case began when Wright was sued by the nonprofit Crypto Open Patent Alliance (COPA), which said its goal was to disprove Wright’s bitcoin-inventing claim and stop him from claiming intellectual property rights to the system.
Wright’s location unknown
Wright’s location is unknown, today’s ruling said. “The evidence shows that Dr. Wright has left his previous residence in Wimbledon, appears to have left the UK, has been said to be traveling and was last established to be in the time zone of UTC +7,” Mellor wrote.
COPA asked Mellor “to dispense with personal service of the final Order on Dr. Wright” because his whereabouts are a mystery. COPA told the court that “Dr. Wright may either be deliberately evading service or at least is peripatetic and is very difficult to locate.” Mellor wrote that COPA’s view “seems to me to be fully justified and warrants the order which COPA seeks as to service of my final Order on Dr. Wright at his solicitors.”
After the events of the trial, Mellor’s decision to refer Wright for a perjury prosecution was apparently an easy one. “As COPA submitted, if what happened in this case does not warrant referral to the CPS, it is difficult to envisage a case which would… In advancing his false claim to be Satoshi through multiple legal actions, Dr. Wright committed ‘a most serious abuse’ of the process of the courts of the UK, Norway and the USA,” Mellor wrote.
Anti-lawsuit injunction
Mellor also approved COPA’s request for injunctions that prohibit Wright from bringing certain kinds of lawsuits based on his bitcoin-inventing claim. As the Associated Press reported, the approved injunctions are intended to prevent Wright “from threatening to sue or filing lawsuits aimed at developers.”
The COPA requests approved by Mellor were for “an anti-suit injunction preventing Dr. Wright or the other Claimants in the related claims from pursuing further proceedings in this or other jurisdictions to re-litigate his claim to be Satoshi,” and “a related order preventing him from threatening such proceedings.”
Mellor declined to issue additional orders preventing Wright from asserting legal rights as Nakamoto, preventing re-publication of Wright’s fraudulent claims, and requiring him to delete previously published statements. The judge said there was some overlap between the injunction requests that were approved and those that were not. Moreover, Wright would have difficulty convincing anyone that he invented bitcoin without violating the two approved injunctions.
Although there is a slight risk that “certain people may start to change their minds or begin to believe that Dr. Wright is Satoshi… I am inclined to the view that the effect would be small. Right-thinking people are likely to regard those assertions as hot air or empty rhetoric, even faintly ridiculous,” Mellor wrote.
Similarly, an order to delete statements “would be disproportionate” and is unnecessary because “anyone with an interest in Bitcoin will have been aware of the COPA Trial and know of the outcome,” Mellor wrote. However, the judge decided that COPA can make the requests again if it turns out to be necessary.
“I accept that my assessment may turn out to be off the mark. Furthermore, the evidence shows that whilst Dr. Wright has modified his public statements following the outcome of the COPA Trial, that may well turn out to be temporary. Dr. Wright is perfectly capable, once the dust has settled, of ramping up his public pronouncements again,” Mellor wrote.
Because of that possibility, Mellor said COPA has “permission to apply, for a period of 2 years, for any further injunctive relief they consider they can establish to be required to protect the interests of the corporate entities they represent as well as the individuals in the Bitcoin community who have suffered due to Dr. Wright’s false claim to be Satoshi.”
According to Bloomberg, Google’s offer to the Cloud Infrastructure Services Providers in Europe (CISPE) required that the group maintain its EU antitrust complaint. It came “just days” before CISPE settled with Microsoft, and it was apparently not compelling enough to stop CISPE from inking a deal with the software giant that TechCrunch noted forced CISPE to accept several compromises.
Bloomberg uncovered Google’s attempted counteroffer after reviewing confidential documents and speaking to “people familiar with the matter.” Apparently, Google sought to sway CISPE with a package worth nearly $500 million for more than five years of software licenses and about $15 million in cash.
But CISPE did not take the bait, announcing last week that an agreement was reached with Microsoft, seemingly frustrating Google.
CISPE initially raised its complaint in 2022, alleging that Microsoft was “irreparably damaging the European cloud ecosystem and depriving European customers of choice in their cloud deployments” by spiking costs to run Microsoft’s software on rival cloud services. In February, CISPE said that “any remedies and resolution must apply across the sector and to be accessible to all cloud customers in Europe.” They also promised that “any agreements will be made public.”
But the settlement reached last week excluded major rivals, including Amazon, which is a CISPE member, and Google, which is not. And despite CISPE’s promise, the terms of the deal were not published, apart from a CISPE blog roughly outlining central features that it claimed resolved the group’s concerns over Microsoft’s allegedly anticompetitive behaviors.
What is clear is that CISPE agreed to drop its complaint by taking the deal, but no one knows exactly how much Microsoft paid in a “lump sum” to cover CISPE’s legal fees for three years, TechCrunch noted. However, “two people with direct knowledge of the matter” told Reuters that Microsoft offered about $22 million.
Google has been trying to catch up with Microsoft and Amazon in the cloud market and has recently begun gaining ground. Last year, Google’s cloud operation broke even for the first time, and the company earned a surprising $900 million in profits in the first quarter of 2024, which bested analysts’ projections by more than $200 million, Bloomberg reported. For Google, the global cloud market has become a key growth area, Bloomberg noted, as potential growth opportunities in search advertising slow. Seemingly increasing regulatory pressure on Microsoft while taking a chunk of its business in the EU was supposed to be one of Google’s next big moves.
A CISPE spokesperson, Ben Maynard, told Ars that its “members were presented with alternative options to accepting the Microsoft deal,” while not disclosing the terms of the other options. “However, the members voted by a significant majority to accept the Microsoft offer, which, in their view, presented the best opportunity for the European cloud sector,” Maynard told Ars.
Neither Microsoft nor Google has commented directly on the reported counteroffer. A Google spokesperson told Bloomberg that Google “has long supported the principles of fair software licensing and that the firm was having discussions about joining CISPE, to fight anticompetitive licensing practices.” A person familiar with the matter told Ars that Google did not necessarily make the counteroffer contingent on dropping the EU complaint, but had long been exploring joining CISPE and would only do so if CISPE upheld its mission to defend fair licensing deals. Microsoft reiterated a past statement from its president, Brad Smith, confirming that Microsoft was “pleased” to resolve CISPE’s antitrust complaint.
For CISPE, the resolution may not have been perfect, but it “will enable European cloud providers to offer Microsoft applications and services on their local cloud infrastructures, meeting the demand for sovereign cloud solutions.” In 2022, CISPE Secretary-General Francisco Mingorance told Ars that although CISPE had been clear that it intended to force Microsoft to make changes allowing all cloud rivals to compete, “a key reason behind filing the complaint was to support” two smaller cloud service providers, Aruba and OVH.