Science


Air quality problems spur $200 million in funds to cut pollution at ports


Diesel equipment will be replaced with hydrogen- or electric-powered gear.

Raquel Garcia has been fighting for years to clean up the air in her neighborhood southwest of downtown Detroit.

Living a little over a mile from the Ambassador Bridge, which thousands of freight trucks cross every day en route to the Port of Detroit, Garcia said she and her neighbors are frequently cleaning soot off their homes.

“You can literally write your name in it,” she said. “My house is completely covered.”

Her neighborhood is part of Wayne County, which is home to heavy industry, including steel plants and major car manufacturers, and suffers from some of the worst air quality in Michigan. In its 2024 State of the Air report, the American Lung Association named Wayne County one of the “worst places to live” in terms of annual exposure to fine particulate matter pollution, or PM2.5.

But Detroit, and several other Midwest cities with major shipping ports, could soon see their air quality improve as port authorities receive hundreds of millions of dollars to replace diesel equipment with cleaner technologies like solar power and electric vehicles.

Last week, the Biden administration announced $3 billion in new grants from the US Environmental Protection Agency’s Clean Ports program, which aims to slash carbon emissions and reduce air pollution at US shipping ports. More than $200 million of that funding will go to four Midwestern states that host ports along the Great Lakes: Michigan, Illinois, Ohio, and Indiana.

The money, which comes from the Inflation Reduction Act, will not only be used to replace diesel-powered equipment and vehicles, but also to install clean energy systems and charging stations, take inventory of annual port emissions, and set plans for reducing them. It will also fund a feasibility study for establishing a green hydrogen fuel hub along the Great Lakes.

The EPA estimates that, nationwide, those changes will reduce carbon pollution by more than 3 million metric tons over the first 10 years, roughly the equivalent of taking 600,000 gasoline-powered cars off the road. The agency also projects that emissions of nitrogen oxides and PM2.5—both of which can cause serious, long-term health complications—will fall by about 10,000 metric tons and about 180 metric tons, respectively, over that same period.

“Our nation’s ports are critical to creating opportunity here in America, offering good-paying jobs, moving goods, and powering our economy,” EPA Administrator Michael Regan said in the agency’s press release announcing the funds. “Delivering cleaner technologies and resources to US ports will slash harmful air and climate pollution while protecting people who work in and live near port communities.”

Garcia, who runs the community advocacy nonprofit Southwest Detroit Environmental Vision, said she’s “really excited” to see the Port of Detroit getting those funds, even though it’s just a small part of what’s needed to clean up the city’s air pollution.

“We care about the air,” she said. “There’s a lot of kids in the neighborhood where I live.”

Jumpstarting the transition to cleaner technology

Nationwide, port authorities in 27 states and territories tapped the Clean Ports funding, which they’ll use to buy more than 1,500 units of cargo-handling equipment, such as forklifts and cranes, 1,000 heavy-duty trucks, 10 locomotives, and 20 seafaring vessels, all of which will be powered by electricity or green hydrogen, which doesn’t emit CO2 when burned.

In the Midwest, the Illinois Environmental Protection Agency and the Cleveland-Cuyahoga County Port Authority in Ohio were awarded about $95 million each from the program, the Detroit-Wayne County Port Authority in Michigan was awarded $25 million, and the Ports of Indiana will receive $500,000.

Mark Schrupp, executive director of the Detroit-Wayne County Port Authority, said the funding for his agency will be used to help port operators at three terminals purchase new electric forklifts, cranes, and boat motors, among other zero-emission equipment. The money will also pay for a new solar array that will reduce energy consumption for port facilities, as well as 11 new electric vehicle charging stations.

“This money is helping those [port] businesses make the investment in this clean technology, which otherwise is sometimes five or six times the cost of diesel-powered equipment,” he said, noting that the costs of clean technologies are expected to fall significantly in the coming years as manufacturers scale up production. “It also exposes them to the potential savings over time—full maintenance costs and other things that come from having the dirtier technology in place.”

Schrupp said that the new equipment will slash the Detroit-Wayne County Port Authority’s overall carbon emissions by more than 8,600 metric tons every year, roughly a 30 percent reduction.

Carly Beck, senior manager of planning, environment and information systems for the Cleveland-Cuyahoga County Port Authority, said its new equipment will reduce the Port of Cleveland’s annual carbon emissions by roughly 1,000 metric tons, or about 40 percent of the emissions tied to the port’s operations. The funding will also pay for two electric tug boats and the installation of solar panels and battery storage on the port’s largest warehouse, she added.

In 2022, Beck said, the Port of Cleveland took an emissions inventory, which found that cargo-handling equipment, building energy use, and idling ships were the port’s biggest sources of carbon emissions. Docked ships would run diesel generators for power as they unloaded, she said, but with the new infrastructure, the cargo-handling equipment and idling ships can draw power from a 2-megawatt solar power system with battery storage.

“We’re essentially creating a microgrid at the port,” she said.

Improving the air for disadvantaged communities

The Clean Ports funding will also be a boon for people like Garcia, who live near a US shipping port.

Shipping ports are notorious for their diesel pollution, which research has shown disproportionately affects poor communities of color. And most, if not all, of the census tracts surrounding the Midwest ports are deemed “disadvantaged communities” by the federal government. The EPA uses a number of factors, including income level and exposure to environmental harms, to determine whether a community is “disadvantaged.”

About 10,000 trucks pass through the Port of Detroit every day, Schrupp said, which helps to explain why residents of Southwest Detroit and the neighboring cities of Ecorse and River Rouge, which sit adjacent to Detroit ports, breathe the state’s dirtiest air.

“We have about 50,000 residents within a few miles of the port, so those communities will definitely benefit,” he said. “This is a very industrialized area.”

Burning diesel or any other fossil fuel produces nitrogen oxides and PM2.5, and research has shown that prolonged exposure to high levels of those pollutants can lead to serious health complications, including lung disease and premature death. The Detroit-Wayne County Port Authority estimates that the new port equipment will cut nearly 9 metric tons of PM2.5 emissions and about 120 metric tons of nitrogen oxide emissions each year.

Garcia said she’s also excited that some of the Detroit grants will be used to establish workforce training programs, which will show people how to use the new technologies and showcase career opportunities at the ports. Her area is gentrifying quickly, Garcia said, so it’s heartening to see the city and port authority taking steps to provide local employment opportunities.

Beck said that the Port of Cleveland is also surrounded by a lot of heavy industry and that the census tracts directly adjacent to the port are all deemed “disadvantaged” by federal standards.

“We’re trying to be good neighbors and play our part,” she said, “to make it a more pleasant environment.”

Kristoffer Tigue is a staff writer for Inside Climate News, covering climate issues in the Midwest. He previously wrote the twice-weekly newsletter Today’s Climate and helped lead ICN’s national coverage on environmental justice. His work has been published in Reuters, Scientific American, Mother Jones, HuffPost, and many more. Tigue holds a master’s degree in journalism from the Missouri School of Journalism.

This story originally appeared on Inside Climate News.




Russia: Fine, I guess we should have a Grasshopper rocket project, too

Like a lot of competitors in the global launch industry, Russia for a long time dismissed the prospects of a reusable first stage for a rocket.

As late as 2016, an official with the Russian agency that develops strategy for the country’s main space corporation, Roscosmos, concluded, “The economic feasibility of reusable launch systems is not obvious.” Russian officials were not alone in dismissing the prospects of landing SpaceX’s Falcon 9 rocket. Throughout the 2010s, competitors including the space agencies of Europe and Japan, as well as US-based United Launch Alliance, all decided to develop expendable rockets.

However, by 2017, when SpaceX re-flew a Falcon 9 rocket for the first time, the writing was on the wall. “This is a very important step, we sincerely congratulate our colleague on this achievement,” then-Roscosmos CEO Igor Komarov said at the time. He even spoke of developing reusable components, such as rocket engines capable of multiple firings.

A Russian Grasshopper

That was more than seven years ago, however, and not much has happened in Russia since then to foster the development of a reusable rocket vehicle. Yes, Roscosmos unveiled plans for the “Amur” rocket in 2020, which was intended to have a reusable first stage and methane-fueled engines and to land like the Falcon 9. But its debut has slipped year after year—originally intended to fly in 2026, the rocket is now expected to make its first launch no earlier than 2030.

Now, however, there is some interesting news from Moscow about plans to develop a prototype vehicle to test the ability to land the Amur rocket’s first stage vertically.

According to the state-run news agency, TASS, construction of this test vehicle will enable the space corporation to solve key challenges. “Next year preparation of an experimental stage of the (Amur) rocket, which everyone is calling ‘Grasshopper,’ will begin,” said Igor Pshenichnikov, the Roscosmos deputy director of the department of future programs. The Russian news article was translated for Ars by Rob Mitchell.



How a stubborn computer scientist accidentally launched the deep learning boom


“You’ve taken this idea way too far,” a mentor told Prof. Fei-Fei Li.

Credit: Aurich Lawson | Getty Images

During my first semester as a computer science graduate student at Princeton, I took COS 402: Artificial Intelligence. Toward the end of the semester, there was a lecture about neural networks. This was in the fall of 2008, and I got the distinct impression—both from that lecture and the textbook—that neural networks had become a backwater.

Neural networks had delivered some impressive results in the late 1980s and early 1990s. But then progress stalled. By 2008, many researchers had moved on to mathematically elegant approaches such as support vector machines.

I didn’t know it at the time, but a team at Princeton—in the same computer science building where I was attending lectures—was working on a project that would upend the conventional wisdom and demonstrate the power of neural networks. That team, led by Prof. Fei-Fei Li, wasn’t working on a better version of neural networks. They were hardly thinking about neural networks at all.

Rather, they were creating a new image dataset that would be far larger than any that had come before: 14 million images, each labeled with one of nearly 22,000 categories.

Li tells the story of ImageNet in her recent memoir, The Worlds I See. As she worked on the project, she faced plenty of skepticism from friends and colleagues.

“I think you’ve taken this idea way too far,” a mentor told her a few months into the project in 2007. “The trick is to grow with your field. Not to leap so far ahead of it.”

It wasn’t just that building such a large dataset was a massive logistical challenge. People doubted that the machine learning algorithms of the day would benefit from such a vast collection of images.

“Pre-ImageNet, people did not believe in data,” Li said in a September interview at the Computer History Museum. “Everyone was working on completely different paradigms in AI with a tiny bit of data.”

Ignoring negative feedback, Li pursued the project for more than two years. It strained her research budget and the patience of her graduate students. When she took a new job at Stanford in 2009, she took several of those students—and the ImageNet project—with her to California.

ImageNet received little attention for the first couple of years after its release in 2009. But in 2012, a team from the University of Toronto trained a neural network on the ImageNet dataset, achieving unprecedented performance in image recognition. That groundbreaking AI model, dubbed AlexNet after lead author Alex Krizhevsky, kicked off the deep learning boom that has continued to the present day.

AlexNet would not have succeeded without the ImageNet dataset. AlexNet also would not have been possible without a platform called CUDA, which allowed Nvidia’s graphics processing units (GPUs) to be used in non-graphics applications. Many people were skeptical when Nvidia announced CUDA in 2006.

So the AI boom of the last 12 years was made possible by three visionaries who pursued unorthodox ideas in the face of widespread criticism. One was Geoffrey Hinton, a University of Toronto computer scientist who spent decades promoting neural networks despite near-universal skepticism. The second was Jensen Huang, the CEO of Nvidia, who recognized early that GPUs could be useful for more than just graphics.

The third was Fei-Fei Li. She created an image dataset that seemed ludicrously large to most of her colleagues. But it turned out to be essential for demonstrating the potential of neural networks trained on GPUs.

Geoffrey Hinton

A neural network is a network of thousands, millions, or even billions of neurons. Each neuron is a mathematical function that produces an output based on a weighted average of its inputs.

Suppose you want to create a network that can identify handwritten decimal digits, like a scrawled number two. Such a network would take in an intensity value for each pixel in an image and output a probability distribution over the ten possible digits—0, 1, 2, and so forth.

To train such a network, you first initialize it with random weights. You then run it on a sequence of example images. For each image, you train the network by strengthening the connections that push the network toward the right answer (in this case, a high-probability value for the “2” output) and weakening connections that push toward a wrong answer (a low probability for “2” and high probabilities for other digits). If trained on enough example images, the model should start to predict a high probability for “2” when shown a two—and not otherwise.
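
To make this concrete, here is a minimal Python sketch of the forward pass such a network computes: pixel intensities in, ten digit probabilities out. It is an illustration only; the image size and random weights are assumptions, not anyone's real model.

```python
# A minimal sketch of the forward pass described above: pixel intensities go
# in, a probability for each of the ten digits comes out.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()              # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical sizes: a 28x28 grayscale image flattened to 784 inputs,
# one layer of weights mapping those inputs to 10 digit scores.
n_pixels, n_digits = 28 * 28, 10
W = rng.normal(0, 0.01, size=(n_digits, n_pixels))  # random initial weights
b = np.zeros(n_digits)

image = rng.random(n_pixels)        # stand-in for a real handwritten "2"
probs = softmax(W @ image + b)      # each output: a weighted sum of inputs
print(probs.round(3), probs.sum())  # ten probabilities summing to 1.0
```

With random weights the ten probabilities are near-uniform; training is what pushes probability toward the correct digit.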

In the late 1950s, scientists started to experiment with basic networks that had a single layer of neurons. However, their initial enthusiasm cooled as they realized that such simple networks lacked the expressive power required for complex computations.

Deeper networks—those with multiple layers—had the potential to be more versatile. But in the 1960s, no one knew how to train them efficiently. This was because changing a parameter somewhere in the middle of a multi-layer network could have complex and unpredictable effects on the output.

So by the time Hinton began his career in the 1970s, neural networks had fallen out of favor. Hinton wanted to study them, but he struggled to find an academic home in which to do so. Between 1976 and 1986, Hinton spent time at four different research institutions: Sussex University, the University of California San Diego (UCSD), a branch of the UK Medical Research Council, and finally Carnegie Mellon, where he became a professor in 1982.

Geoffrey Hinton speaking in Toronto in June. Credit: Photo by Mert Alper Dervis/Anadolu via Getty Images

In a landmark 1986 paper, Hinton teamed up with two of his former colleagues at UCSD, David Rumelhart and Ronald Williams, to describe a technique called backpropagation for efficiently training deep neural networks.

Their idea was to start with the final layer of the network and work backward. For each connection in the final layer, the algorithm computes a gradient—a mathematical estimate of whether increasing the strength of that connection would push the network toward the right answer. Based on these gradients, the algorithm adjusts each parameter in the model’s final layer.

The algorithm then propagates these gradients backward to the second-to-last layer. A key innovation here is a formula—based on the chain rule from high school calculus—for computing the gradients in one layer based on gradients in the following layer. Using these new gradients, the algorithm updates each parameter in the second-to-last layer of the model. The gradients then get propagated backward to the third-to-last layer, and the whole process repeats once again.

The algorithm only makes small changes to the model in each round of training. But as the process is repeated over thousands, millions, billions, or even trillions of training examples, the model gradually becomes more accurate.
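
Here is a compact Python sketch of that training loop for a tiny two-layer network on synthetic data. The layer sizes, learning rate, and fake "images" are arbitrary assumptions, but the backward pass follows the layer-by-layer, chain-rule pattern described above.

```python
# A sketch of backpropagation: compute gradients for the final layer first,
# then push them backward one layer at a time via the chain rule.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 784))                 # 100 fake "images"
y = rng.integers(0, 10, size=100)          # fake digit labels
W1, b1 = rng.normal(0, 0.1, (784, 64)), np.zeros(64)
W2, b2 = rng.normal(0, 0.1, (64, 10)), np.zeros(10)
lr = 0.1

for step in range(200):
    # Forward pass: input -> hidden layer -> digit scores -> probabilities.
    h = np.maximum(0, X @ W1 + b1)         # ReLU hidden layer
    z = h @ W2 + b2
    p = np.exp(z - z.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)

    # Backward pass: start at the final layer and work backward.
    dz = p.copy()
    dz[np.arange(len(y)), y] -= 1          # gradient of the cross-entropy
    dz /= len(y)                           # loss w.r.t. the digit scores
    dW2, db2 = h.T @ dz, dz.sum(axis=0)    # gradients for the final layer
    dh = dz @ W2.T                         # chain rule: push gradients back
    dh[h <= 0] = 0                         # ...through the ReLU
    dW1, db1 = X.T @ dh, dh.sum(axis=0)    # gradients for the earlier layer

    # Small adjustments each round; repeated over many examples, they add up.
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```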

Hinton and his colleagues weren’t the first to discover the basic idea of backpropagation. But their paper popularized the method. As people realized it was now possible to train deeper networks, it triggered a new wave of enthusiasm for neural networks.

Hinton moved to the University of Toronto in 1987 and began attracting young researchers who wanted to study neural networks. One of the first was the French computer scientist Yann LeCun, who did a year-long postdoc with Hinton before moving to Bell Labs in 1988.

Hinton’s backpropagation algorithm allowed LeCun to train models deep enough to perform well on real-world tasks like handwriting recognition. By the mid-1990s, LeCun’s technology was working so well that banks started to use it for processing checks.

“At one point, LeCun’s creation read more than 10 percent of all checks deposited in the United States,” wrote Cade Metz in his 2021 book Genius Makers.

But when LeCun and other researchers tried to apply neural networks to larger and more complex images, it didn’t go well. Neural networks once again fell out of fashion, and some researchers who had focused on neural networks moved on to other projects.

Hinton never stopped believing that neural networks could outperform other machine learning methods. But it would be many years before he’d have access to enough data and computing power to prove his case.

Jensen Huang

Jensen Huang speaking in Denmark in October. Credit: Photo by MADS CLAUS RASMUSSEN/Ritzau Scanpix/AFP via Getty Images

The brain of every personal computer is a central processing unit (CPU). These chips are designed to perform calculations in order, one step at a time. This works fine for conventional software like Windows and Office. But some video games require so many calculations that they strain the capabilities of CPUs. This is especially true of games like Quake, Call of Duty, and Grand Theft Auto, which render three-dimensional worlds many times per second.

So gamers rely on GPUs to accelerate performance. Inside a GPU are many execution units—essentially tiny CPUs—packaged together on a single chip. During gameplay, different execution units draw different areas of the screen. This parallelism enables better image quality and higher frame rates than would be possible with a CPU alone.

Nvidia invented the GPU in 1999 and has dominated the market ever since. By the mid-2000s, Nvidia CEO Jensen Huang suspected that the massive computing power inside a GPU would be useful for applications beyond gaming. He hoped scientists could use it for compute-intensive tasks like weather simulation or oil exploration.

So in 2006, Nvidia announced the CUDA platform. CUDA allows programmers to write “kernels,” short programs designed to run on a single execution unit. Kernels allow a big computing task to be split up into bite-sized chunks that can be processed in parallel. This allows certain kinds of calculations to be completed far faster than with a CPU alone.
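
Real CUDA kernels are written in a C-like dialect and dispatched to hundreds of GPU execution units. As a loose, CPU-only analogy of the "bite-sized chunks" idea, the Python sketch below splits one big computation across a pool of workers; the worker count and the workload are arbitrary stand-ins.

```python
# A CPU-only analogy (not actual CUDA): define one small function (the
# "kernel") and apply it to many chunks of a big task in parallel, rather
# than one step at a time.
from concurrent.futures import ProcessPoolExecutor
import math

def kernel(chunk):
    # The per-chunk work: a stand-in for any compute-heavy calculation.
    return sum(math.sqrt(x) for x in chunk)

if __name__ == "__main__":
    total_work = 10_000_000
    n = 8                                     # number of "execution units"
    chunks = [range(i, total_work, n) for i in range(n)]
    with ProcessPoolExecutor(max_workers=n) as pool:
        partials = list(pool.map(kernel, chunks))  # chunks run in parallel
    print(sum(partials))                      # combine the partial results
```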

But there was little interest in CUDA when it was first introduced, wrote Steven Witt in The New Yorker last year:

When CUDA was released, in late 2006, Wall Street reacted with dismay. Huang was bringing supercomputing to the masses, but the masses had shown no indication that they wanted such a thing.

“They were spending a fortune on this new chip architecture,” Ben Gilbert, the co-host of “Acquired,” a popular Silicon Valley podcast, said. “They were spending many billions targeting an obscure corner of academic and scientific computing, which was not a large market at the time—certainly less than the billions they were pouring in.”

Huang argued that the simple existence of CUDA would enlarge the supercomputing sector. This view was not widely held, and by the end of 2008, Nvidia’s stock price had declined by seventy percent…

Downloads of CUDA hit a peak in 2009, then declined for three years. Board members worried that Nvidia’s depressed stock price would make it a target for corporate raiders.

Huang wasn’t specifically thinking about AI or neural networks when he created the CUDA platform. But it turned out that Hinton’s backpropagation algorithm could easily be split up into bite-sized chunks. So training neural networks turned out to be a killer app for CUDA.

According to Witt, Hinton was quick to recognize the potential of CUDA:

In 2009, Hinton’s research group used Nvidia’s CUDA platform to train a neural network to recognize human speech. He was surprised by the quality of the results, which he presented at a conference later that year. He then reached out to Nvidia. “I sent an e-mail saying, ‘Look, I just told a thousand machine-learning researchers they should go and buy Nvidia cards. Can you send me a free one?’ ” Hinton told me. “They said no.”

Despite the snub, Hinton and his graduate students, Alex Krizhevsky and Ilya Sutskever, obtained a pair of Nvidia GTX 580 GPUs for the AlexNet project. Each GPU had 512 execution units, allowing Krizhevsky and Sutskever to train a neural network hundreds of times faster than would be possible with a CPU. This speed allowed them to train a larger model—and to train it on many more training images. And they would need all that extra computing power to tackle the massive ImageNet dataset.

Fei-Fei Li

Fei-Fei Li at the SXSW conference in 2018. Credit: Photo by Hubert Vestil/Getty Images for SXSW

Fei-Fei Li wasn’t thinking about either neural networks or GPUs as she began a new job as a computer science professor at Princeton in January of 2007. While earning her PhD at Caltech, she had built a dataset called Caltech 101 that had 9,000 images across 101 categories.

That experience had taught her that computer vision algorithms tended to perform better with larger and more diverse training datasets. Not only had Li found her own algorithms performed better when trained on Caltech 101, but other researchers also started training their models using Li’s dataset and comparing their performance to one another. This turned Caltech 101 into a benchmark for the field of computer vision.

So when she got to Princeton, Li decided to go much bigger. She became obsessed with an estimate by vision scientist Irving Biederman that the average person recognizes roughly 30,000 different kinds of objects. Li started to wonder if it would be possible to build a truly comprehensive image dataset—one that included every kind of object people commonly encounter in the physical world.

A Princeton colleague told Li about WordNet, a massive database that attempted to catalog and organize 140,000 words. Li called her new dataset ImageNet, and she used WordNet as a starting point for choosing categories. She eliminated verbs and adjectives, as well as intangible nouns like “truth.” That left a list of 22,000 countable objects ranging from “ambulance” to “zucchini.”

She planned to take the same approach she’d taken with the Caltech 101 dataset: use Google’s image search to find candidate images, then have a human being verify them. For the Caltech 101 dataset, Li had done this herself over the course of a few months. This time she would need more help. She planned to hire dozens of Princeton undergraduates to help her choose and label images.

But even after heavily optimizing the labeling process—for example, pre-downloading candidate images so they’re instantly available for students to review—Li and her graduate student Jia Deng calculated that it would take more than 18 years to select and label millions of images.
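
Li's memoir doesn't spell out the assumptions behind that estimate, but the arithmetic takes roughly the following shape. Every rate below is a hypothetical stand-in, chosen only to show how the numbers compound:

```python
# Back-of-the-envelope only: every number here is a hypothetical assumption;
# the article reports only the "more than 18 years" conclusion.
candidate_images = 260_000_000  # assumed raw pool, far larger than the 14M kept
seconds_per_image = 3           # assumed time to accept or reject one candidate
labelers = 24                   # assumed "dozens" of part-time undergraduates
hours_per_week = 10
weeks_per_year = 50

person_seconds = candidate_images * seconds_per_image
crew_seconds_per_year = labelers * hours_per_week * 3600 * weeks_per_year
print(f"{person_seconds / crew_seconds_per_year:.1f} years")  # ~18 years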

The project was saved when Li learned about Amazon Mechanical Turk, a crowdsourcing platform Amazon had launched a couple of years earlier. Not only was AMT’s international workforce more affordable than Princeton undergraduates, but the platform was also far more flexible and scalable. Li’s team could hire as many people as they needed, on demand, and pay them only as long as they had work available.

AMT cut the time needed to complete ImageNet down from 18 to two years. Li writes that her lab spent two years “on the knife-edge of our finances” as the team struggled to complete the ImageNet project. But they had enough funds to pay three people to look at each of the 14 million images in the final data set.

ImageNet was ready for publication in 2009, and Li submitted it to the Conference on Computer Vision and Pattern Recognition, which was held in Miami that year. Their paper was accepted, but it didn’t get the kind of recognition Li hoped for.

“ImageNet was relegated to a poster session,” Li writes. “This meant that we wouldn’t be presenting our work in a lecture hall to an audience at a predetermined time but would instead be given space on the conference floor to prop up a large-format print summarizing the project in hopes that passersby might stop and ask questions… After so many years of effort, this just felt anticlimactic.”

To generate public interest, Li turned ImageNet into a competition. Realizing that the full dataset might be too unwieldy to distribute to dozens of contestants, she created a much smaller (but still massive) dataset with 1,000 categories and 1.4 million images.

The first year’s competition in 2010 generated a healthy amount of interest, with 11 teams participating. The winning entry was based on support vector machines. Unfortunately, Li writes, it was “only a slight improvement over cutting-edge work found elsewhere in our field.”

The second year of the ImageNet competition attracted fewer entries than the first. The winning entry in 2011 was another support vector machine, and it just barely improved on the performance of the 2010 winner. Li started to wonder if the critics had been right. Maybe “ImageNet was too much for most algorithms to handle.”

“For two years running, well-worn algorithms had exhibited only incremental gains in capabilities, while true progress seemed all but absent,” Li writes. “If ImageNet was a bet, it was time to start wondering if we’d lost.”

But when Li reluctantly staged the competition a third time in 2012, the results were totally different. Geoff Hinton’s team was the first to submit a model based on a deep neural network. And its top-5 accuracy was 85 percent—10 percentage points better than the 2011 winner.

Li’s initial reaction was incredulity: “Most of us saw the neural network as a dusty artifact encased in glass and protected by velvet ropes.”

“This is proof”

Yann LeCun testifies before the US Senate in September. Credit: Photo by Kevin Dietsch/Getty Images

The ImageNet winners were scheduled to be announced at the European Conference on Computer Vision in Florence, Italy. Li, who had a baby at home in California, was planning to skip the event. But when she saw how well AlexNet had done on her dataset, she realized this moment would be too important to miss: “I settled reluctantly on a twenty-hour slog of sleep deprivation and cramped elbow room.”

On an October day in Florence, Alex Krizhevsky presented his results to a standing-room-only crowd of computer vision researchers. Fei-Fei Li was in the audience. So was Yann LeCun.

Cade Metz reports that after the presentation, LeCun stood up and called AlexNet “an unequivocal turning point in the history of computer vision. This is proof.”

The success of AlexNet vindicated Hinton’s faith in neural networks, but it was arguably an even bigger vindication for LeCun.

AlexNet was a convolutional neural network, a type of neural network that LeCun had developed 20 years earlier to recognize handwritten digits on checks. (For more details on how CNNs work, see the in-depth explainer I wrote for Ars in 2018.) Indeed, there were few architectural differences between AlexNet and LeCun’s image recognition networks from the 1990s.

AlexNet was simply far larger. In a 1998 paper, LeCun described a document-recognition network with seven layers and 60,000 trainable parameters. AlexNet had eight layers, but these layers had 60 million trainable parameters.
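
Parameter counts like these are tallied layer by layer: a convolutional layer has kernel width × kernel height × input channels × output channels weights, plus one bias per filter, while a dense layer has inputs × outputs weights plus biases. The sketch below applies that rule to AlexNet's published layer shapes, ignoring the network's two-GPU channel grouping (which trims the convolutional counts somewhat), so the total is a rough, slightly high figure:

```python
# Tallying trainable parameters (weights + biases) layer by layer.
def conv_params(k, c_in, c_out):
    return k * k * c_in * c_out + c_out   # kernel weights + one bias per filter

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

layers = [
    conv_params(11, 3, 96),           # conv1
    conv_params(5, 96, 256),          # conv2
    conv_params(3, 256, 384),         # conv3
    conv_params(3, 384, 384),         # conv4
    conv_params(3, 384, 256),         # conv5
    dense_params(256 * 6 * 6, 4096),  # fc6 (6x6 feature maps, flattened)
    dense_params(4096, 4096),         # fc7
    dense_params(4096, 1000),         # fc8: one output per ImageNet category
]
print(f"{sum(layers):,} parameters")  # ~62 million, the order of "60 million"
```

Note how the two big dense layers account for the bulk of the total; convolutions reuse the same small kernels across the whole image, which is what keeps their counts manageable.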

LeCun could not have trained a model that large in the early 1990s because there were no computer chips with as much processing power as a 2012-era GPU. Even if LeCun had managed to build a big enough supercomputer, he would not have had enough images to train it properly. Collecting those images would have been hugely expensive in the years before Google and Amazon Mechanical Turk.

And this is why Fei-Fei Li’s work on ImageNet was so consequential. She didn’t invent convolutional networks or figure out how to make them run efficiently on GPUs. But she provided the training data that large neural networks needed to reach their full potential.

The technology world immediately recognized the importance of AlexNet. Hinton and his students formed a shell company with the goal of being “acquihired” by a big tech company. Within months, Google purchased the company for $44 million. Hinton worked at Google for the next decade while retaining his academic post in Toronto. Ilya Sutskever spent a few years at Google before becoming a cofounder of OpenAI.

AlexNet also made Nvidia GPUs the industry standard for training neural networks. In 2012, the market valued Nvidia at less than $10 billion. Today, Nvidia is one of the most valuable companies in the world, with a market capitalization north of $3 trillion. That high valuation is driven mainly by overwhelming demand for GPUs like the H100 that are optimized for training neural networks.

Sometimes the conventional wisdom is wrong

“That moment was pretty symbolic to the world of AI because three fundamental elements of modern AI converged for the first time,” Li said in a September interview at the Computer History Museum. “The first element was neural networks. The second element was big data, using ImageNet. And the third element was GPU computing.”

Today, leading AI labs believe the key to progress in AI is to train huge models on vast data sets. Big technology companies are in such a hurry to build the data centers required to train larger models that they’ve started to lease out entire nuclear power plants to provide the necessary power.

You can view this as a straightforward application of the lessons of AlexNet. But I wonder if we ought to draw the opposite lesson from AlexNet: that it’s a mistake to become too wedded to conventional wisdom.

“Scaling laws” have had a remarkable run in the 12 years since AlexNet, and perhaps we’ll see another generation or two of impressive results as the leading labs scale up their foundation models even more.

But we should be careful not to let the lessons of AlexNet harden into dogma. I think there’s at least a chance that scaling laws will run out of steam in the next few years. And if that happens, we’ll need a new generation of stubborn nonconformists to notice that the old approach isn’t working and try something different.

Tim Lee was on staff at Ars from 2017 to 2021. Last year, he launched a newsletter, Understanding AI, that explores how AI works and how it’s changing our world. You can subscribe here.


Timothy is a senior reporter covering tech policy and the future of transportation. He lives in Washington DC.



Rocket Report: Australia says yes to the launch; Russia delivers for Iran


The world’s first wooden satellite arrived at the International Space Station this week.

A Falcon 9 booster fires its engines on SpaceX’s “tripod” test stand in McGregor, Texas. Credit: SpaceX

Welcome to Edition 7.19 of the Rocket Report! Okay, we get it. We received more submissions from our readers on Australia’s approval of a launch permit for Gilmour Space than we’ve received on any other news story in recent memory. Thank you for your submissions as global rocket activity continues apace. We’ll cover Gilmour in more detail as they get closer to launch. There will be no Rocket Report next week as Eric and I join the rest of the Ars team for our 2024 Technicon in New York.

As always, we welcome reader submissions. If you don’t want to miss an issue, please subscribe using the box below (the form will not appear on AMP-enabled versions of the site). Each report will include information on small-, medium-, and heavy-lift rockets as well as a quick look ahead at the next three launches on the calendar.

Gilmour Space has a permit to fly. Gilmour Space Technologies has been granted a permit to launch its 82-foot-tall (25-meter) orbital rocket from a spaceport in Queensland, Australia. The space company, founded in 2012, had initially planned to lift off in March but was unable to do so without approval from the Australian Space Agency, the Australian Broadcasting Corporation reports. The government approved Gilmour’s launch permit Monday, although the company is still weeks away from flying its three-stage Eris rocket.

A first for Australia … Australia hosted a handful of satellite launches with US and British rockets from 1967 through 1971, but Gilmour’s Eris rocket would become the first all-Australian launch vehicle to reach orbit. The Eris rocket is capable of delivering about 670 pounds (305 kilograms) of payload mass into a Sun-synchronous orbit. Eris will be powered by hybrid rocket engines burning a solid fuel mixed with a liquid oxidizer, making it unique among orbital-class rockets. Gilmour completed a wet dress rehearsal, or practice countdown, with the Eris rocket on the launch pad in Queensland in September. The launch permit becomes active after 30 days, or the first week of December. “We do think we’ve got a good chance of launching at the end of the 30-day period, and we’re going to give it a red hot go,” said Adam Gilmour, the company’s co-founder and CEO. (submitted by Marzipan, mryall, ZygP, Ken the Bin, Spencer Willis, MarkW98, and EllPeaTea)

North Korea tests new missile. North Korea apparently completed a successful test of its most powerful intercontinental ballistic missile on October 31, lofting it nearly 4,800 miles (7,700 kilometers) into space before the projectile fell back to Earth, Ars reports. This solid-fueled, multi-stage missile, named the Hwasong-19, is a new tool in North Korea’s increasingly sophisticated arsenal of weapons. It has enough range—perhaps as much as 9,320 miles (15,000 kilometers), according to Japan’s government—to strike targets anywhere in the United States. It also happens to be one of the largest ICBMs in the world, rivaling the missiles fielded by the world’s more established nuclear powers.

Quid pro quo? … The Hwasong-19 missile test comes as North Korea deploys some 10,000 troops inside Russia to support the country’s war against Ukraine. The budding partnership between Russia and North Korea has evolved for several years. Russian President Vladimir Putin has met with North Korean leader Kim Jong Un on multiple occasions, most recently in Pyongyang in June. This has fueled speculation about what Russia is offering North Korea in exchange for the troops deployed on Russian soil. US and South Korean officials have some thoughts. They said North Korea is likely to ask for technology transfers in diverse areas related to tactical nuclear weapons, ICBMs, and reconnaissance satellites.


Virgin Galactic is on the hunt for cash. Virgin Galactic is proposing to raise $300 million in additional capital to accelerate production of suborbital spaceplanes and a mothership aircraft the company says can fuel its long-term growth, Space News reports. The company, founded by billionaire Richard Branson, suspended operations of its VSS Unity suborbital spaceplane earlier this year. VSS Unity hit a monthly flight cadence carrying small groups of space tourists and researchers to the edge of space, but it just wasn’t profitable. Now, Virgin Galactic is developing larger Delta-class spaceplanes it says will be easier and cheaper to turn around between flights.

All-in with Delta … Michael Colglazier, Virgin Galactic’s CEO, announced the company’s appetite for fundraising in a quarterly earnings call with investment analysts Wednesday. He said manufacturing of components for Virgin Galactic’s first two Delta-class ships, which the company says it can fund with existing cash, is proceeding on schedule at a factory in Arizona. Virgin Galactic previously said it would use revenue from paying passengers on its first two Delta-class ships to pay for development of future vehicles. Instead, Virgin Galactic now says it wants to raise money to speed up work on the third and fourth Delta-class vehicles, along with a second airplane mothership to carry the spaceplanes aloft before they release and fire into space. (submitted by Ken the Bin and EllPeaTea)

ESA breaks its silence on Themis. The European Space Agency has provided a rare update on the progress of its Themis reusable booster demonstrator project, European Spaceflight reports. ESA is developing the Themis test vehicle for atmospheric flights to fine-tune technologies for a future European reusable rocket capable of vertical takeoffs and vertical landings. Themis started out as a project led by CNES, the French space agency, in 2018. ESA member states signed up to help fund the project in 2019, and the agency awarded ArianeGroup a contract to move forward with Themis in 2020. At the time, the first low-altitude hop test was expected to take place in 2022.

Some slow progress … Now, the first low-altitude hop is scheduled for 2025 from Esrange Space Centre in Sweden, a three-year delay. This week, ESA said engineers have completed testing of the Themis vehicle’s main systems, and assembly of the demonstrator is underway in France. A single methane-fueled Prometheus engine, also developed by ArianeGroup, has been installed on the rocket. Teams are currently adding avionics, computers, electrical systems, and cable harnesses. Themis’ stainless steel propellant tanks have been manufactured, tested, and cleaned and are now ready to be installed on the Themis demonstrator. Then, the rocket will travel by road from France to the test site in Sweden for its initial low-altitude hops. After those flights are complete, officials plan to add two more Prometheus engines to the rocket and ship it to French Guiana for high-altitude test flights. (submitted by Ken the Bin and EllPeaTea)

SpaceX will give the ISS a boost. A Cargo Dragon spacecraft docked to the International Space Station on Tuesday morning, less than a day after lifting off from Florida. As space missions go, this one is fairly routine, ferrying about 6,000 pounds (2,700 kilograms) of cargo and science experiments to the space station. One thing that’s different about this mission is that it delivered to the station a tiny 2 lb (900 g) satellite named LignoSat, the first spacecraft made of wood, for later release outside the research complex. There is one more characteristic of this flight that may prove significant for NASA and the future of the space station, Ars reports. As early as Friday, NASA and SpaceX have scheduled a “reboost and attitude control demonstration,” during which the Dragon spacecraft will use some of the thrusters at the base of the capsule. This is the first time the Dragon spacecraft will be used to move the space station.

Dragon’s breath … Dragon will fire a subset of its 16 Draco thrusters, each with about 90 pounds of thrust, for approximately 12.5 minutes to make a slight adjustment to the orbital trajectory of the roughly 450-ton space station. SpaceX and NASA engineers will analyze the results from the demonstration to determine if Dragon could be used for future space station reboost opportunities. The data will also inform the design of the US Deorbit Vehicle, which SpaceX is developing to perform the maneuvers required to bring the space station back to Earth for a controlled, destructive reentry in the early 2030s. For NASA, demonstrating Dragon’s ability to move the space station will be another step toward breaking free of reliance on Russia, which is currently responsible for providing propulsion to maneuver the orbiting outpost. Northrop Grumman’s Cygnus supply ship also previously demonstrated a reboost capability. (submitted by Ken the Bin and N35t0r)
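
For a sense of scale, here is a back-of-the-envelope calculation using the figures in this item. It assumes, purely as a guess (NASA and SpaceX haven't said how many of the 16 thrusters will fire), that two Dracos burn for the full 12.5 minutes:

```python
# Rough delta-v estimate for the reboost demo: thrust / mass * burn time,
# treating the station's mass and the thrust as constant.
LBF_TO_N = 4.448

n_thrusters = 2                       # assumption: "a subset" isn't specified
thrust = n_thrusters * 90 * LBF_TO_N  # newtons (90 lbf per Draco, per above)
mass = 450_000                        # kg, the "roughly 450-ton" station
burn = 12.5 * 60                      # seconds

delta_v = (thrust / mass) * burn
print(f"{delta_v:.2f} m/s")           # ~1.3 m/s, a slight trajectory tweak
```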

Russia launches Soyuz in service of Iran. Russia launched a Soyuz rocket Monday carrying two satellites designed to monitor the space weather around Earth and 53 small satellites, including two Iranian ones, Reuters reports. The primary payloads aboard the Soyuz-2.1b rocket were two Ionosfera-M satellites to probe the ionosphere, an outer layer of the atmosphere near the edge of space. Solar activity can alter conditions in the ionosphere, impacting communications and navigation. The two Iranian satellites on this mission were named Kowsar and Hodhod. They will collect high-resolution reconnaissance imagery and support communications for Iran.

A distant third … This was only the 13th orbital launch by Russia this year, trailing far behind the United States and China. We know of two more Soyuz flights planned for later this month, but no more, barring a surprise military launch (which is possible). The projected launch rate puts Russia on pace for its quietest year of launch activity since 1961, the year Yuri Gagarin became the first person to fly in space. A major reason for this decline in launches is the decisions of Western governments and companies to move their payloads off of Russian rockets after the invasion of Ukraine. For example, OneWeb stopped launching on Soyuz in 2022, and the European Space Agency suspended its partnership with Russia to launch Soyuz rockets from French Guiana. (submitted by Ken the Bin)

H3 deploys Japanese national security satellite. Japan launched a defense satellite Monday aimed at speedier military operations and communication on an H3 rocket and successfully placed it into orbit, the Associated Press reports. The Kirameki 3 satellite will use high-speed X-band communication to support Japan’s defense ministry with information and data sharing, and command and control services. The satellite will serve Japanese land, air, and naval forces from its perch in geostationary orbit alongside two other Kirameki communications satellites.

Gaining trust … The H3 is Japan’s new flagship rocket, developed by Mitsubishi Heavy Industries (MHI) and funded by the Japan Aerospace Exploration Agency (JAXA). The launch of Kirameki 3 marked the third consecutive successful launch of the H3 rocket, following a debut flight in March 2023 that failed to reach orbit. This was the first time Japan’s defense ministry put one of its satellites on the H3 rocket. The first two Kirameki satellites launched on a European Ariane 5 and a Japanese H-IIA rocket, which the H3 will replace. (submitted by Ken the Bin, tsunam, and EllPeaTea)

Rocket Lab enters the race for military contracts. Rocket Lab is aiming to chip away at SpaceX’s dominance in military space launch, confirming its bid to compete for Pentagon contracts with its new medium-lift rocket, Neutron, Space News reports. Last month, the Space Force released a request for proposals from launch companies seeking to join the military’s roster of launch providers in the National Security Space Launch (NSSL) program. The Space Force will accept bids for launch providers to “on-ramp” to the NSSL Phase 3 Lane 1 contract, which doles out task orders to launch companies for individual missions. In order to win a task order, a launch provider must be on the Phase 3 Lane 1 contract. Currently, SpaceX, United Launch Alliance, and Blue Origin are the only rocket companies eligible. SpaceX won all of the first round of Lane 1 task orders last month.

Joining the club … The Space Force is accepting additional risk for Lane 1 missions, which largely comprise repeat launches deploying a constellation of missile-tracking and data-relay satellites for the Space Development Agency. A separate class of heavy-lift missions, known as Lane 2, will require rockets to undergo a thorough certification by the Space Force to ensure their reliability. In order for a launch company to join the Lane 1 roster, the Space Force requires bidders to be ready for a first launch by December 2025. Peter Beck, Rocket Lab’s founder and CEO, said he thinks the Neutron rocket will be ready for its first launch by then. Other new medium-lift rockets, such as Firefly Aerospace’s MLV and Relativity’s Terran-R, almost certainly won’t be ready to launch by the end of next year, leaving Rocket Lab as the only company that will potentially join incumbents SpaceX, ULA, and Blue Origin. (submitted by Ken the Bin)

Next Starship flight is just around the corner. Less than a month has passed since the historic fifth flight of SpaceX’s Starship, during which the company caught the booster with mechanical arms back at the launch pad in Texas. Now, another test flight could come as soon as November 18, Ars reports. The improbable but successful recovery of the Starship first stage with “chopsticks” last month, and the on-target splashdown of the Starship upper stage halfway around the world, allowed SpaceX to avoid an anomaly investigation by the Federal Aviation Administration. Thus, the company was able to press ahead on a sixth test flight if it flew a similar profile. And that’s what SpaceX plans to do, albeit with some notable additions to the flight plan.

Around the edges … Perhaps the most significant change to the profile for Flight 6 will be an attempt to reignite a Raptor engine on Starship while it is in space. SpaceX tried to do this on a test flight in March but aborted the burn because the ship’s rolling motion exceeded limits. A successful demonstration of a Raptor engine relight could pave the way for SpaceX to launch Starship into a higher stable orbit around Earth on future test flights. This is required for SpaceX to begin using Starship to launch Starlink Internet satellites and perform in-orbit refueling experiments with two ships docked together. (submitted by EllPeaTea)

China’s version of Starship. China has updated the design of its next-generation heavy-lift rocket, the Long March 9, and it looks almost exactly like a clone of SpaceX’s Starship rocket, Ars reports. The Long March 9 started out as a conventional-looking expendable rocket, then morphed into a launcher with a reusable first stage. Now, the rocket will have a reusable booster and upper stage. The booster will have 30 methane-fueled engines, similar to the number of engines on SpaceX’s Super Heavy booster. The upper stage looks remarkably like Starship, with flaps in similar locations. China intends to fly this vehicle for the first time in 2033, nearly a decade from now.

A vehicle for the Moon … The reusable Long March 9 is intended to unlock robust lunar operations for China, similar to the way Starship, and to some extent Blue Origin’s Blue Moon lander, promises to support sustained astronaut stays on the Moon’s surface. China says it plans to land its astronauts on the Moon by 2030, initially using a more conventional architecture with an expendable rocket named the Long March 10, and a lander reminiscent of NASA’s Apollo lunar lander. These will allow Chinese astronauts to remain on the Moon for a matter of days. With Long March 9, China could deliver massive loads of cargo and life support resources to sustain astronauts for much longer stays.

Ta-ta to the tripod. The large three-legged vertical test stand at SpaceX’s engine test site in McGregor, Texas, is being decommissioned, NASA Spaceflight reports. Cranes have started removing propellant tanks from the test stand, nicknamed the tripod, towering above the Central Texas prairie. McGregor is home to SpaceX’s propulsion test team and has 16 test cells to support firings of Merlin, Raptor, and Draco engines multiple times per day for the Falcon 9 rocket, Starship, and Dragon spacecraft.

Some history … The tripod might have been one of SpaceX’s most important assets in the company’s early years. It was built by Beal Aerospace for liquid-fueled rocket engine tests in the late 1990s. Beal Aerospace folded, and SpaceX took over the site in 2003. After some modifications, SpaceX installed the first qualification version of its Falcon 9 rocket on the tripod for a series of nine-engine test-firings leading up to the rocket’s inaugural flight in 2010. SpaceX test-fired numerous new Falcon 9 boosters on the tripod before shipping them to launch sites in Florida or California. Most recently, the tripod was used for testing of Raptor engines destined to fly on Starship and the Super Heavy booster.

Next three launches

Nov. 9: Long March 2C | Unknown Payload | Jiuquan Satellite Launch Center, China | 03:40 UTC

Nov. 9: Falcon 9 | Starlink 9-10 | Vandenberg Space Force Base, California | 06:14 UTC

Nov. 10: Falcon 9 | Starlink 6-69 | Cape Canaveral Space Force Station, Florida | 21:28 UTC


Stephen Clark is a space reporter at Ars Technica, covering private space companies and the world’s space agencies. Stephen writes about the nexus of technology, science, policy, and business on and off the planet.



Research monkeys still having a ball days after busting out of lab, police say

If you need any inspiration for cutting loose and relaxing this weekend, look no further than a free-wheeling troop of monkeys that broke out of their South Carolina research facility Wednesday and, as of noon Friday, were still “playfully exploring” with their newfound freedom.

In an update Friday, the police department of Yemassee, SC, said that the 43 young female rhesus macaques are still staying around the perimeter of the Alpha Genesis Primate Research Facility. “The primates are exhibiting calm and playful behavior, which is a positive indication,” the department noted.

The fun-loving furballs got free after a caretaker “failed to secure doors” at the facility.

Alpha Genesis staff have been keeping an eye on the escapees, trying to entice them back in with food. But, instead of taking the bait, the primates have been playing on the perimeter fence while still keeping in touch with the monkeys inside by cooing to them.

“They’re just being goofy monkeys jumping back and forth playing with each other,” Alpha Genesis CEO Greg Westergaard told CBS News Thursday. “It’s kind of like a playground situation here.”

Yemassee police note that the monkeys are very young and small—only about 6 or 7 pounds each. They have not been used for any testing yet, don’t carry any disease, and pose no health risk to the public. Still, area residents have been advised to keep their doors and windows locked in case the wee primates try to pay a visit.

This isn’t the first time—or even the second time—Alpha Genesis has had trouble keeping its monkeys under control. In 2018, the US Department of Agriculture fined the company $12,600 for violations between 2014 and 2016 that included four monkey breakouts. In those incidents, a total of 30 monkeys escaped. One was never found.



Space policy is about to get pretty wild, y’all


Saddle up, space cowboys. It may get bumpy for a while.

President Donald Trump steps on the stage at Kennedy Space Center after the successful launch of the Demo-2 crew mission in May 2020. Credit: NASA/Bill Ingalls

The global space community awoke to a new reality on Wednesday morning.

The founder of this century’s most innovative space company, Elon Musk, successfully used his fortune, time, and energy to help elect Donald Trump president of the United States. Already, Musk was the dominant Western player in space. SpaceX launches national security satellites and NASA astronauts and operates a megaconstellation. He controls the machines that provide essential space services to NASA and the US military. And now, thanks to his gamble on backing Trump, Musk has strong-armed himself into Trump’s inner circle.

Although he may not have a cabinet-appointed position, Musk will have a broad portfolio in the new administration for as long as his relations with Trump remain positive. This gives Musk extraordinary power over a number of areas, including spaceflight. Already this week, he has been soliciting ideas and input from colleagues. The New York Times reported that Musk has advised Trump to hire key employees from SpaceX into his administration, including at the Department of Defense. This reflects the huge conflict of interest that Musk will face when it comes to space policy. His actions could significantly benefit SpaceX, of which he is the majority owner and has the final say in major decisions.

It will be a hugely weird dynamic. Musk is unquestionably in a position for self-dealing. Normally, such conflicts of interest would be frowned on within a government, but Trump has already shown a brazen disregard for norms, and there’s no reason to believe that will change during his second go at the presidency. One way around this could be to give Musk a “special adviser” tag, which means he would not have to comply with federal conflict-of-interest laws.

So it’s entirely possible that the sitting chief executive of SpaceX could be the nation’s most important adviser on space policy, conflicts be damned. Musk possesses flaws as a leader, but it is difficult to argue against results. His intuitions for the industry, such as pushing hard for reusable launch and broadband Internet from space, have largely been correct. In a vacuum, it is not necessarily bad to have someone like Musk providing a vision for US spaceflight in the 21st century. But while space may be a vacuum, there is plenty of oxygen in Washington, DC.

Being a space journalist got a lot more interesting this week—and a lot more difficult. As I waded through this reality on Wednesday, I began to reach out to sources about what is likely to happen. It’s way too early to have much certainty, but we can begin to draw some broad outlines for what may happen to space policy during a second Trump presidency. Buckle up—it could be a wild ride.

Bringing efficiency to NASA?

Let’s start with NASA and firmly establish what we mean. The US space agency does some pretty great things, but it’s also a bloated bureaucracy. That’s by design. Members of Congress write budgets and inevitably seek to steer more federal dollars to NASA activities in the areas they represent. Two decades ago, an engineer named Mike Griffin—someone Musk sought to hire as SpaceX’s first chief engineer in 2002—became NASA administrator under President George W. Bush.

Griffin recognized NASA’s bloat. For starters, it had too many field centers. NASA simply doesn’t need 10 major outposts across the country, as they end up fighting one another for projects and funding. However, Griffin knew he would face a titanic political struggle to close field centers, on par with federal efforts to close duplicative military bases during the “Base Realignment and Closure” process after the Cold War. So Griffin instead sought to make the best of the situation with his “Ten Healthy Centers” initiative. Work together, he told his teams across the country.

Essentially, then, for the last two decades, NASA programs have sought to leverage expertise across the agency. Consider the development of the Orion spacecraft, which began nearly 20 years ago. The following comment comes from a 2016 oral history interview with Julie Kramer-White, a long-time NASA engineer who was chief engineer of Orion at the time.

“I’ll tell you the truth, ten healthy centers is a pain in the butt,” she said. “The engineering team is a big engineering team, and they are spread across 9 of the 10 Centers… Our guys don’t think anything about a phone call that’s got people from six different centers. You’re trying to balance the time zone differences, and of course that’s got its own challenge with Europe as well but even within the United States with the different centers managing the time zone issue. I would say as a net technically, it’s a good thing. From a management perspective, boy, it’s a hassle.”

Space does not get done fast or efficiently by committee. But that’s how NASA operates—committees within committees, reviewed by committees.

Musk has repeatedly said he wants to bring efficiency to the US government and vowed to identify $2 trillion in savings. Well, NASA would certainly be more efficient with fewer centers—each of which has its own management layers, human resources setups, and other extensive overhead. But will the Trump administration really have the stomach to close centers? Certainly the congressional leadership from a state like Ohio would fight tooth and nail for Glenn Research Center. This offers an example of how bringing sweeping change to the US government in general, and NASA in particular, will run into the power of the purse held by Congress.

One tool NASA has used in recent years to increase efficiency is buying commercial services rather than leading the development of systems such as the Orion spacecraft. The most prominent example is cargo and crew transportation to the International Space Station, but NASA has extended this approach to all manner of areas, from space communications to lunar landers to privately operated space stations. Congress has not always been happy with this transition because it has lessened lawmakers’ influence over steering funding directly to centers. NASA has nonetheless continued to push for this change because it has lowered agency costs, allowing it to do more.

Yet here again we run into conflicts of interest with Musk. The primary reason NASA’s transition toward buying services has been a success is SpaceX. Private companies not named SpaceX have struggled to compete as NASA awards more fixed-price contracts for space services. Given Congress’ love for directing space funds to local centers, it’s unlikely to let Musk overhaul the agency in ways that send huge amounts of new business to SpaceX.

Where art thou, Artemis?

The biggest question is what to do with the Artemis program to return humans to the Moon. Ars wrote extensively about some of the challenges with this program a little more than a month ago, and Michael Bloomberg, founder of Bloomberg News, wrote a scathing assessment of Artemis recently under the headline “NASA’s $100 billion Moon mission is going nowhere.”

It is unlikely that outright cancellation of Artemis is on the table; after all, the first Trump administration created Artemis six years ago. However, Musk is clearly focused on sending humans to Mars, and the Moon-first approach of Artemis was championed by former Vice President Mike Pence, who is long gone. Trump loves grand gestures, and Musk has told Trump it will be possible to send humans to Mars before the end of his term. (That would be 2028, and it’s almost impossible to see this happening for a lot of reasons.) The Artemis architecture was developed around a “Moon-then-Mars” philosophy: NASA will send humans to the Moon now, with Mars missions pushed into a nebulous future. Whatever Artemis becomes, it is likely to at least put Mars on equal footing with the Moon.

Notably, Musk despises NASA’s Space Launch System rocket, a central element of Artemis. He sees the rocket as the epitome of government bloat. And it’s not hard to understand why. The Space Launch System is completely expendable and costs about 10 to 100 times as much to launch as his own massive Starship rocket.

The key function the SLS rocket and the Orion spacecraft currently provide in Artemis is transporting astronauts from Earth to lunar orbit and back. There are ways to address this. Trump could refocus Artemis on using Starship to get humans to Mars. Alternatively, he could direct NASA to kludge together some combination of Orion, Dragon, and Falcon rockets to get astronauts to the Moon. He might also direct NASA to use the SLS for now but cancel further upgrades to it and a lunar space station called Gateway.

“The real question is how far a NASA landing team and beachhead team are willing to go in destabilizing the program of record,” one policy source told Ars. “I can’t see Trump and Vance being less willing to shake up NASA than they are other public policy zones.”

What does seem clear is that, for the first time in 15 years, canceling the Space Launch System rocket or dramatically reducing its influence is on the table. This will be an acid test for Musk and Trump’s rhetoric on government efficiency, since the base of support for Artemis is in the deep-red South: states like Alabama, Mississippi, Louisiana, and Florida.

Will they really cut jobs there in the name of efficiency?

Regulatory reform

Reducing government regulations is one area in which the pathway for Musk and Trump is clear. The first Trump administration pushed to reduce regulations on US businesses almost from day one. In spaceflight, this produced Space Policy Directive-2 in 2018. Some progress was made, but it was far from total.

For spaceflight, Musk’s goal is to get faster approval for Starship test flights and licensing for the (literally) hundreds of launches SpaceX is already conducting annually. This will be broadly supported by the second Trump administration. During Trump’s first term, some of the initiatives in Space Policy Directive-2 were slowed or blocked by the Federal Aviation Administration and NASA, but the White House push will be even harder this time.

A looser regulatory environment should theoretically lead to more, and faster, progress in commercial space capabilities.

It’s worth noting here that if you spend any time talking to space startup executives, they all have horror stories about interacting with the FAA or other agencies. Pretty much everyone agrees that regulators could be more efficient but also that they need more resources to process rules in a timely manner. The FAA and Federal Communications Commission have important jobs when it comes to keeping people on the ground safe and keeping orbits sustainable in terms of traffic and space junk.

The second Trump administration will have some important allies on this issue in Congress. Ted Cruz, the US Senator from Texas, will likely chair the Senate Committee on Commerce, Science, and Transportation, which oversees legislation for space activities. He is one of the senators who has shown the most interest in commercial space, and he will support pro-business legislation—that is, laws that allow companies freer rein and regulatory agencies fewer teeth. How far this gets will depend on whether Republicans keep the House or Democrats take control.

Other areas of change

Over the course of the last seven decades, space has largely been a non-partisan topic.

But Musk’s deepening involvement in US space policy could pose a serious threat to that bipartisan consensus, as he’s now viewed extremely negatively by many Democrats. It seems probable that many people in Congress will oppose any significant shift of NASA’s focus from the Moon to Mars, particularly because it aligns with Musk’s long-stated goal of making humans a multiplanetary species.

There are likely to be battles in space science, as well. Traditionally, Republican presidents have cut funding for Earth science missions, and Democrats have increased funding to better study and understand climate change. Generally, given the administration’s likely focus on human spaceflight, space science will probably take a back seat and may lose funding.

Another looming issue is Mars Sample Return, which NASA is reconsidering due to budget and schedule issues. Presently, the agency intends to announce a new plan for retrieving rock and soil samples from Mars and returning them to Earth in December.

But if Musk and Trump are bent on sending humans to Mars as soon as possible, there is little sense in the space agency spending billions of dollars on a robotic sample return mission. Astronauts can just bring them back inside Starship.

Finally, at present, NASA has rich partnerships with space agencies around the world. In fact, it was the first Trump administration that created the Artemis Accords a little more than four years ago to develop an international coalition to return to the Moon. Since then, the United States and China have both been signing up partners in their competition to establish a presence at the South Pole of the Moon.

One huge uncertainty is how some of NASA’s long-established partners, especially in Europe, will react to the US space agency’s exploration plans, given the tension bound to arise with the Trump administration over Ukraine and other issues. European officials are already wary of SpaceX’s prowess in global spaceflight and likely will not want to be on board with any space activities that further Musk’s ambitions.

These are just some of the high-level questions facing NASA and US spaceflight. There are many others. For example, how will Trump’s proposed tariffs on key components impact the national security and civil space supply chain? And then there’s the Department of Defense, where the military already has multibillion-dollar contracts with SpaceX and where similar conflicts and ethical concerns are bound to arise.

No one can hear you scream in space, but there will be plenty of screaming about space in the coming months.

Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.

Space policy is about to get pretty wild, y’all Read More »

dna-shows-pompeii’s-dead-aren’t-who-we-thought-they-were

DNA shows Pompeii’s dead aren’t who we thought they were

People have long been fascinated by the haunting plaster casts of the bodies of people who died in Pompeii when Mount Vesuvius erupted in 79 CE. Archaeologists have presented certain popular narratives about who these people might have been and how they might have been related. But ancient DNA analysis has revealed that those preferred narratives were not entirely accurate and may reflect certain cultural biases, according to a new paper published in the journal Current Biology. The results also corroborate prior research suggesting that the people of ancient Pompeii were the descendants of immigrants from the Eastern Mediterranean.

As previously reported, the eruption of Mount Vesuvius released thermal energy roughly equivalent to 100,000 times the atomic bombs dropped on Hiroshima and Nagasaki at the end of World War II, spewing molten rock, pumice, and hot ash over the cities of Pompeii and Herculaneum in particular. The vast majority of people in Pompeii and Herculaneum—the cities hardest hit—perished from asphyxiation, choking on the thick clouds of noxious gas and ash. But at least some of the Vesuvian victims probably died instantaneously from the intense heat of the fast-moving pyroclastic flows, with temperatures high enough to boil brains and explode skulls.

In the first phase, immediately after the eruption, a long column of ash and pumice blanketed the surrounding towns, most notably Pompeii and Herculaneum. By late night or early morning, pyroclastic flows (fast-moving hot ash, lava fragments, and gases) swept through and obliterated what remained, leaving the bodies of the victims frozen in seeming suspended action.

In the 19th century, an archaeologist named Giuseppe Fiorelli figured out how to make casts of those frozen bodies by pouring liquid plaster into the voids where the soft tissue had been. Some 1,000 bodies have been discovered in the ruins, and 104 plaster casts have been preserved. Restoration work on 86 of those casts began about 10 years ago, during which researchers took CT scans and X-rays to see if there were complete skeletons inside. Those images revealed a great deal of manipulation of the casts, depending on the aesthetics of the era in which they were made: some features of the bodies’ shapes were altered, metal rods were added to stabilize the casts, and bones were frequently removed before casting.

DNA shows Pompeii’s dead aren’t who we thought they were Read More »

after-decades,-fda-finally-moves-to-pull-ineffective-decongestant-off-shelves

After decades, FDA finally moves to pull ineffective decongestant off shelves

In a long-sought move, the Food and Drug Administration on Thursday formally began the process of abandoning oral doses of a common over-the-counter decongestant, which the agency concluded last year is not effective at relieving stuffy noses.

Specifically, the FDA issued a proposed order to remove oral phenylephrine from the list of drugs that drugmakers can include in over-the-counter products, also known as the OTC monograph. Once it is removed, drugmakers will no longer be able to include phenylephrine in products for the temporary relief of nasal congestion.

“It is the FDA’s role to ensure that drugs are safe and effective,” Patrizia Cavazzoni, director of the FDA’s Center for Drug Evaluation and Research, said in a statement. “Based on our review of available data and consistent with the advice of the advisory committee, we are taking this next step in the process to propose removing oral phenylephrine because it is not effective as a nasal decongestant.”

For now, the order is just a proposal. The FDA will open up a public comment period, and if no comments can sway the FDA’s previous conclusion that the drug is useless, the agency will make the order final. Drugmakers will get a grace period to reformulate their products.

Reviewed reviews

The slow-moving abandonment of phenylephrine is years in the making. The decongestant was originally approved by the FDA back in 1976, but it came to prominence after 2006. That was the year when the “Combat Methamphetamine Epidemic Act of 2005” came into effect, and pseudoephedrine—the main component of Sudafed—moved behind the pharmacy counter to keep it from being used to make methamphetamine. With pseudoephedrine out of easy reach at drugstores, phenylephrine became the leading over-the-counter decongestant. And researchers had questions.

In 2007, an FDA panel reevaluated the drug, which allegedly works by shrinking blood vessels in the nasal passage, opening up the airway. While the panel upheld the drug’s approval, it concluded that more studies were needed for a full assessment. After that, three large, carefully designed studies were conducted—two by Merck for the treatment of seasonal allergies and one by Johnson & Johnson for the treatment of the common cold. All three found no significant difference between phenylephrine and a placebo.

After decades, FDA finally moves to pull ineffective decongestant off shelves Read More »

what-makes-baseball’s-“magic-mud”-so-special?

What makes baseball’s “magic mud” so special?

“Magic mud” composition and microstructure: (top right) a clean baseball surface; (bottom right) a mudded baseball. Credit: S. Pradeep et al., 2024

Pradeep et al. found that magic mud’s particles are primarily silt and clay, with a bit of sand and organic material. The stickiness comes from the clay, silt, and organic matter, while the sand makes it gritty. So the mud “has the properties of skin cream,” they wrote. “This allows it to be held in the hand like a solid but also spread easily to penetrate pores and make a very thin coating on the baseball.”

When the mud dries on the baseball, however, the residue left behind is not like skin cream. That’s due to the angular sand particles bonded to the baseball by the clay, which can increase surface friction by as much as a factor of two. Meanwhile, the finer particles double the adhesion. “The relative proportions of cohesive particulates, frictional sand, and water conspire to make a material that flows like skin cream but grips like sandpaper,” they wrote.

Despite its relatively mundane components, the magic mud nonetheless shows remarkable mechanical behaviors that the authors think would make it useful in other practical applications. For instance, it might replace synthetic materials as an effective lubricant, provided the gritty sand particles are removed. Or it could be used as a friction agent to improve traction on slippery surfaces, provided one could define the optimal fraction of sand content that wouldn’t diminish its spreadability. Or it might be used as a binding agent in locally sourced geomaterials for construction.

“As for the future of Rubbing Mud in Major League Baseball, unraveling the mystery of its behavior does not and should not necessarily lead to a synthetic replacement,” the authors concluded. “We rather believe the opposite; Rubbing Mud is a nature-based material that is replenished by the tides, and only small quantities are needed for great effect. In a world that is turning toward green solutions, this seemingly antiquated baseball tradition provides a glimpse of a future of Earth-inspired materials science.”

DOI: PNAS, 2024. 10.1073/pnas.241351412 (About DOIs).

What makes baseball’s “magic mud” so special? Read More »

nearly-three-years-since-launch,-webb-is-a-hit-among-astronomers

Nearly three years since launch, Webb is a hit among astronomers

From its halo-like orbit nearly a million miles from Earth, the James Webb Space Telescope is seeing farther than human eyes have ever seen.

In May, astronomers announced that Webb detected the most distant galaxy found so far, a fuzzy blob of red light that we see as it existed just 290 million years after the Big Bang. Light from this galaxy, which has several hundred million times the mass of the Sun, traveled more than 13 billion years before its photons fell onto Webb’s gold-coated mirror.

A few months later, in July, scientists released an image Webb captured of a planet circling a star slightly cooler than the Sun nearly 12 light-years from Earth. The alien world is several times the mass of Jupiter and the closest exoplanet to ever be directly imaged. One of Webb’s science instruments has a coronagraph to blot out bright starlight, allowing the telescope to resolve the faint signature of a nearby planet and use spectroscopy to measure its chemical composition.

These are just a taste of the discoveries made by the $10 billion Webb telescope since it began science observations in 2022. Judging by astronomers’ interest in using Webb, there are many more to come.

Breaking records

The Space Telescope Science Institute, which operates Webb on behalf of NASA and its international partners, said last week that it received 2,377 unique proposals from science teams seeking observing time on the observatory. The institute released a call for proposals earlier this year for the so-called “Cycle 4” series of observations with Webb.

This volume of proposals represents around 78,000 hours of observing time with Webb, nine times more than the telescope’s available capacity for scientific observations in this cycle. The previous observing cycle had a similar “oversubscription rate” but had less overall observing time available to the science community.

Nearly three years since launch, Webb is a hit among astronomers Read More »

for-fame-or-a-death-wish?-kids’-tiktok-challenge-injuries-stump-psychiatrists

For fame or a death wish? Kids’ TikTok challenge injuries stump psychiatrists

Case dilemma

The researchers give the example of a 10-year-old patient who was found unconscious in her bedroom. The psychiatry team was called in to consult on a suicide attempt by hanging. But when the girl was evaluated, she was tearful, denied past or recent suicide attempts, and said she was only participating in the blackout challenge. Still, she reported depressed moods, feelings of hopelessness, thoughts of suicide since age 9, being bullied, and having no friends. Family members reported unstable housing, busy or absent parental figures, and a family history of suicide attempts.

If the girl’s injuries were unintentional, stemming from the poor choice to participate in the life-threatening TikTok challenge, clinicians would discharge the patient home with a recommendation for outpatient mental health care to address underlying psychiatric conditions and stressors. But if the injuries were self-inflicted with an intent to die, the clinicians would recommend inpatient psychiatric treatment for safety, which would allow for further risk assessment, monitoring, and treatment for the suspected suicide attempt.

It’s critical to make the right call here. Children and teens who attempt suicide are at risk of more attempts, both immediately and in the future. But to make matters even more complex, injuries from social media challenges have the potential to spur depression and post-traumatic stress disorder. Those, in turn, could increase the risk of suicide attempts.

To keep kids and teens safe, Ataga and Arnold call for more awareness of the dangers of TikTok challenges, as well as empathetic psychiatric assessments using kid-appropriate measures. They also call for more research. While there are a handful of case studies on TikTok challenge injuries and deaths among kids and teens, there’s a lack of large-scale data. More research is needed to “demonstrate the role of such challenges as precipitating factors in unintentional and intentional injuries, suicidal behaviors, and deaths among children in the US,” the psychiatrists write.

If you or someone you know is in crisis, call or text 988 for the Suicide and Crisis Lifeline or contact the Crisis Text Line by texting TALK to 741741.

For fame or a death wish? Kids’ TikTok challenge injuries stump psychiatrists Read More »

the-next-starship-launch-may-occur-in-less-than-two-weeks

The next Starship launch may occur in less than two weeks

The company will also use Starship’s next flight to assess new tiles and other elements of the vehicle’s heat shield.

“Several thermal protection experiments and operational changes will test the limits of Starship’s capabilities and generate flight data to inform plans for ship catch and reuse,” the company’s statement said. “The flight test will assess new secondary thermal protection materials and will have entire sections of heat shield tiles removed on either side of the ship in locations being studied for catch-enabling hardware on future vehicles. The ship also will intentionally fly at a higher angle of attack in the final phase of descent, purposefully stressing the limits of flap control to gain data on future landing profiles.”

Final flight of the first Starship

The five previous flights of Starship, dating back to April 2023, have all launched near dawn from South Texas. For the upcoming mission, the company will look for a late-afternoon launch window, which will allow the vehicle to reenter in daylight over the Indian Ocean.

SpaceX’s update also confirms that this will be the last flight of the initial version of the Starship vehicle, with the next generation including redesigned forward flaps, larger propellant tanks, and newer tiles and secondary thermal protection layers.

Reaching a near-monthly cadence of Starship flights during only the second year of the vehicle’s operation is impressive, but it’s also essential if SpaceX wants to unlock the full potential of a rocket that needs multiple refueling launches to support Starship missions to the Moon or Mars.

Wednesday’s announcement comes the day after the US presidential election in which American voters gave Donald Trump a second term, and it is notable that SpaceX founder Elon Musk assisted in that victory with an all-out effort.

Musk’s interventions in politics were highly controversial and alienated a significant segment of the US population and political class. Nevertheless, Musk’s gambit paid off, as the election of Trump will now likely accelerate Starship’s development and increase its centrality to the nation’s space exploration endeavors.

However, the timing of this launch announcement is likely coincidental. SpaceX did not need new regulatory approval to move ahead with this sixth attempt, so the schedule depends almost entirely on the readiness of the company’s hardware, software, and ground systems.

The next Starship launch may occur in less than two weeks Read More »