Author name: Paul Patrick

Nvidia GeForce RTX 5090 costs as much as a whole gaming PC—but it sure is fast


Even setting aside Frame Generation, this is a fast, power-hungry $2,000 GPU.

Credit: Andrew Cunningham

Nvidia’s GeForce RTX 5090 starts at $1,999 before you factor in upsells from the company’s partners or price increases driven by scalpers and/or genuine demand. It costs more than my entire gaming PC.

The new GPU is so expensive that you could build an entire well-specced gaming PC with Nvidia’s next-fastest GPU in it—the $999 RTX 5080, which we don’t have in hand yet—for the same money, or maybe even a little less with judicious component selection. It’s not the most expensive GPU that Nvidia has ever launched—2018’s $2,499 Titan RTX has it beat, and 2022’s RTX 3090 Ti also cost $2,000—but it’s safe to say it’s not really a GPU intended for the masses.

At least as far as gaming is concerned, the 5090 is the very definition of a halo product; it’s for people who demand the best and newest thing regardless of what it costs (the calculus is probably different for deep-pocketed people and companies who want to use them as some kind of generative AI accelerator). And on this front, at least, the 5090 is successful. It’s the newest and fastest GPU you can buy, and the competition is not particularly close. It’s also a showcase for DLSS Multi-Frame Generation, a new feature unique to the 50-series cards that Nvidia is leaning on heavily to make its new GPUs look better than they already are.

Founders Edition cards: Design and cooling

                 | RTX 5090   | RTX 4090    | RTX 5080   | RTX 4080 Super
CUDA cores       | 21,760     | 16,384      | 10,752     | 10,240
Boost clock      | 2,410 MHz  | 2,520 MHz   | 2,617 MHz  | 2,550 MHz
Memory bus width | 512-bit    | 384-bit     | 256-bit    | 256-bit
Memory bandwidth | 1,792 GB/s | 1,008 GB/s  | 960 GB/s   | 736 GB/s
Memory size      | 32GB GDDR7 | 24GB GDDR6X | 16GB GDDR7 | 16GB GDDR6X
TGP              | 575 W      | 450 W       | 360 W      | 320 W

We won’t spend too long talking about the specific designs of Nvidia’s Founders Edition cards since many buyers will experience the Blackwell GPUs with cards from Nvidia’s partners instead (the cards we’ve seen so far mostly look like the expected fare: gargantuan triple-slot triple-fan coolers, with varying degrees of RGB). But it’s worth noting that Nvidia has addressed a couple of my functional gripes with the 4090/4080-series design.

The first was the sheer dimensions of each card—not an issue unique to Nvidia, but one that frequently caused problems for me as someone who tends toward ITX-based PCs and smaller builds. The 5090 and 5080 FE designs are the same length and height as the 4090 and 4080 FE designs, but they only take up two slots instead of three, which will make them an easier fit for many cases.

Nvidia has also tweaked the cards’ 12VHPWR connector, recessing it into the card and mounting it at a slight angle instead of having it sticking straight out of the top edge. The height of the 4090/4080 FE design made some cases hard to close up once you factored in the additional height of a 12VHPWR cable or Nvidia’s many-tentacled 8-pin-to-12VHPWR adapter. The angled connector still extends a bit beyond the top of the card, but it’s easier to tuck the cable away so you can put the side back on your case.

Finally, Nvidia has changed its cooler—whereas most OEM GPUs mount all their fans on the top of the GPU, Nvidia has historically placed one fan on each side of the card. In a standard ATX case with the GPU mounted parallel to the bottom of the case, this wasn’t a huge deal—there’s plenty of room for that air to circulate inside the case and to be expelled by whatever case fans you have installed.

But in “sandwich-style” ITX cases, where a riser cable wraps around so the GPU can be mounted parallel to the motherboard, the fan on the bottom side of the GPU was poorly placed. In many sandwich-style cases, the GPU fan will dump heat against the back of the motherboard, making it harder to keep the GPU cool and creating heat problems elsewhere besides. The new GPUs mount both fans on the top of the cards.

Nvidia’s Founders Edition cards have had heat issues in the past—most notably the 30-series GPUs—and that was my first question going in. A smaller cooler plus a dramatically higher peak power draw seems like a recipe for overheating.

Temperatures for the various cards we re-tested for this review. The 5090 FE is the toastiest of all of them, but it still has a safe operating temperature.

At least for the 5090, the smaller cooler does mean higher temperatures—around 10 to 12 degrees Celsius higher when running the same benchmarks as the RTX 4090 Founders Edition. And while temperatures of around 77 degrees aren’t hugely concerning, this is sort of a best-case scenario, with an adequately cooled testbed case with the side panel totally removed and ambient temperatures at around 21° or 22° Celsius. You’ll just want to make sure you have a good amount of airflow in your case if you buy one of these.

Testbed notes

A new high-end Nvidia GPU is a good reason to tweak our test bed and suite of games, and we’ve done both here. Mainly, we added a 1050 W Thermaltake Toughpower GF A3 power supply—Nvidia recommends at least 1000 W for the 5090, and this one has a native 12VHPWR connector for convenience. We’ve also swapped the Ryzen 7 7800X3D for a slightly faster Ryzen 7 9800X3D to reduce the odds that the CPU will bottleneck performance as we try to hit high frame rates.

As for the suite of games, we’ve removed a couple of older titles and added some with built-in benchmarks that will tax these GPUs a bit more, especially at 4K with all the settings turned up. Those games include the RT Overdrive preset in the perennially punishing Cyberpunk 2077 and Black Myth: Wukong in Cinematic mode, both games where even the RTX 4090 struggles to hit 60 fps without an assist from DLSS. We’ve also added Horizon Zero Dawn Remastered, a recent release that doesn’t include ray-tracing effects but does support most DLSS 3 and FSR 3 features (including FSR Frame Generation).

We’ve tried to strike a balance between games with ray-tracing effects and games without them, though most AAA games these days include ray tracing, and modern GPUs should be able to handle it well (best of luck to AMD with its upcoming RDNA 4 cards).

For the 5090, we’ve run all tests in 4K—if you don’t care about running games in 4K, even if you want super-high frame rates at 1440p or on some kind of ultrawide monitor, the 5090 is probably overkill. When we run upscaling tests, we use the newest DLSS version available for Nvidia cards, the newest FSR version available for AMD cards, and the newest XeSS version available for Intel cards (not relevant here, just stating for the record), and we use the “Quality” setting (at 4K, that equates to an actual rendering resolution of 1440p).
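As a sanity check on that math, here’s the scaling in code (the 2/3-per-axis linear scale factor for DLSS Quality mode is a commonly documented value, not something we measured ourselves):

```python
# DLSS "Quality" mode renders at roughly 2/3 of the output resolution
# per axis (a commonly documented scale factor, assumed here).
def render_resolution(output_w, output_h, scale=2/3):
    return round(output_w * scale), round(output_h * scale)

# 4K output with Quality upscaling -> 1440p internal render
print(render_resolution(3840, 2160))  # (2560, 1440)
```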

Rendering performance: A lot faster, a lot more power-hungry

Before we talk about Frame Generation or “fake frames,” let’s compare apples to apples and just examine the 5090’s rendering performance.

The card mainly benefits from four things compared to the 4090: the updated Blackwell GPU architecture, a nearly 33 percent increase in the number of CUDA cores, an upgrade from GDDR6X to GDDR7, and a move from a 384-bit memory bus to a 512-bit bus. It also jumps from 24GB of RAM to 32GB, but games generally aren’t butting up against a 24GB limit yet, so the capacity increase by itself shouldn’t really change performance if all you’re focused on is gaming.
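Those spec bumps are easy to put in ratio form; a quick sketch using the numbers from the table above (our arithmetic, not Nvidia's):

```python
# Generation-over-generation spec ratios, (RTX 5090, RTX 4090),
# taken from the spec table earlier in the review.
specs = {
    "cuda_cores":     (21760, 16384),
    "bus_width_bits": (512, 384),
    "bandwidth_gb_s": (1792, 1008),
    "memory_gb":      (32, 24),
}

for name, (new, old) in specs.items():
    print(f"{name}: +{(new / old - 1) * 100:.0f}%")
```

The CUDA core count, bus width, and memory capacity all rise by about a third, while memory bandwidth jumps by nearly 80 percent thanks to the GDDR7 switch on top of the wider bus.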

And for people who prioritize performance over all else, the 5090 is a big deal—it’s the first consumer graphics card from any company that is faster than a 4090, as Nvidia never spruced up the 4090 last year when it did its mid-generation Super refreshes of the 4080, 4070 Ti, and 4070.

Comparing natively rendered games at 4K, the 5090 is between 17 percent and 40 percent faster than the 4090, with most of the games we tested landing somewhere in the low to high 30 percent range. That’s an undeniably big bump, one that’s roughly commensurate with the increase in the number of CUDA cores. Tests run with DLSS enabled (both upscaling-only and with Frame Generation running in 2x mode) improve by roughly the same amount.

You could find things to be disappointed about if you went looking for them. That 30-something-percent performance increase comes with a 35 percent increase in power use in our testing under load with punishing 4K games—the 4090 tops out around 420 W, whereas the 5090 went all the way up to 573 W, with the 5090 coming closer to its 575 W TGP than the 4090 does to its theoretical 450 W maximum. The 50-series cards use the same TSMC 4N manufacturing process as the 40-series cards, and increasing the number of transistors without changing the process results in a chip that uses more power (though it should be said that capping frame rates, running at lower resolutions, or running less-demanding games can rein in that power use a bit).

Power draw under load goes up by an amount roughly commensurate with performance. The 4090 was already power-hungry; the 5090 is dramatically more so. Credit: Andrew Cunningham
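A rough performance-per-watt calculation using the review's own figures (roughly 33 percent faster, 420 W versus 573 W under load) shows why the same-process-node point matters; this is back-of-the-envelope arithmetic, not a measured efficiency benchmark:

```python
# Rough efficiency comparison from the review's measured figures.
speedup = 1.33            # midpoint of the observed ~30-something-percent 4K uplift
power_ratio = 573 / 420   # measured load power, 5090 vs. 4090

efficiency_change = speedup / power_ratio
print(f"perf/W change: {(efficiency_change - 1) * 100:+.1f}%")
```

The result is essentially flat (slightly negative, in fact): the 5090 buys its extra speed almost entirely with extra power, which is what you'd expect from a bigger chip on the same process.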

The 5090’s 30-something percent increase over the 4090 might also seem underwhelming if you recall that the 4090 was around 55 percent faster than the previous-generation 3090 Ti while consuming about the same amount of power. To be even faster than a 4090 is no small feat—AMD’s fastest GPU is more in line with Nvidia’s 4080 Super—but if you’re comparing the two cards using the exact same tests, the relative leap is less seismic.

That brings us to Nvidia’s answer for that problem: DLSS 4 and its Multi-Frame Generation feature.

DLSS 4 and Multi-Frame Generation

As a refresher, Nvidia’s DLSS Frame Generation feature, as introduced in the GeForce 40-series, takes DLSS upscaling one step further. The upscaling feature inserted interpolated pixels into a rendered image to make it look like a sharper, higher-resolution image without having to do all the work of rendering all those pixels. DLSS FG would interpolate an entire frame between rendered frames, boosting your FPS without dramatically boosting the amount of work your GPU was doing. If you used DLSS upscaling and FG at the same time, Nvidia could claim that seven out of eight pixels on your screen were generated by AI.

DLSS Multi-Frame Generation (hereafter MFG, for simplicity’s sake) does the same thing, but it can generate one to three interpolated frames for every rendered frame. The marketing numbers have gone up, too; now, 15 out of every 16 pixels on your screen can be generated by AI.
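The marketing arithmetic behind those pixel counts is easy to reconstruct—assuming (our inference, not Nvidia's stated methodology) DLSS Performance-mode upscaling, which renders one in four output pixels:

```python
# Fraction of on-screen pixels that are natively rendered, given an
# upscaling pixel ratio and the total frames per rendered frame.
def rendered_pixel_fraction(upscale_pixel_ratio, total_frames_per_rendered):
    return 1 / (upscale_pixel_ratio * total_frames_per_rendered)

# Original FG ("2x"): 1 rendered + 1 generated frame
print(rendered_pixel_fraction(4, 2))  # 0.125 -> "7 out of 8 pixels are AI"

# MFG "4x": 1 rendered + 3 generated frames
print(rendered_pixel_fraction(4, 4))  # 0.0625 -> "15 out of 16 pixels are AI"
```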

Nvidia might point to this and say that the 5090 is over twice as fast as the 4090, but that’s not really comparing apples to apples. Expect this issue to persist over the lifetime of the 50-series. Credit: Andrew Cunningham

Nvidia provided reviewers with a preview build of Cyberpunk 2077 with DLSS MFG enabled, which gives us an example of how those settings will be exposed to users. For 40-series cards that only support the regular DLSS FG, you won’t notice a difference in games that support MFG—Frame Generation is still just one toggle you can turn on or off. For 50-series cards that support MFG, you’ll be able to choose from among a few options, just as you currently can with other DLSS quality settings.

The “2x” mode is the old version of DLSS FG and is supported by both the 50-series cards and 40-series GPUs; it promises one generated frame for every rendered frame (two frames total, hence “2x”). The “3x” and “4x” modes are new to the 50-series and promise two and three generated frames (respectively) for every rendered frame. Like the original DLSS FG, MFG can be used in concert with normal DLSS upscaling, or it can be used independently.

One problem with the original DLSS FG was latency—user input was only being sampled at the natively rendered frame rate, meaning you could be looking at 60 frames per second on your display but only having your input polled 30 times per second. Another is image quality; as good as the DLSS algorithms can be at guessing and recreating what a natively rendered pixel would look like, you’ll inevitably see errors, particularly in fine details.

Both these problems contribute to the third problem with DLSS FG: Without a decent underlying frame rate, the lag you feel and the weird visual artifacts you notice will both be more pronounced. So DLSS FG can be useful for turning 120 fps into 240 fps, or even 60 fps into 120 fps. But it’s not as helpful if you’re trying to get from 20 or 30 fps up to a smooth 60 fps.
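The latency problem is simple arithmetic; a sketch, using hypothetical frame rates rather than anything we measured:

```python
# Frame generation raises the displayed frame rate, but input is still
# sampled only on natively rendered frames.
def intervals_ms(native_fps, total_frames_per_rendered):
    displayed_fps = native_fps * total_frames_per_rendered
    return 1000 / displayed_fps, 1000 / native_fps  # (frame interval, input-poll interval)

# 30 fps native with MFG "4x": the screen updates every ~8 ms,
# but your input is only polled every ~33 ms.
frame_ms, input_ms = intervals_ms(30, 4)
print(f"displayed: {frame_ms:.1f} ms/frame, input polled every {input_ms:.1f} ms")
```

Start from 120 fps instead and the poll interval is a much less noticeable ~8 ms, which is why a healthy base frame rate matters so much.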

We’ll be taking a closer look at the DLSS upgrades in the next couple of weeks (including MFG and the new transformer model, which will supposedly increase upscaling quality and supports all RTX GPUs). But in our limited testing so far, the issues with DLSS MFG are basically the same as with the first version of Frame Generation, just slightly more pronounced. In the built-in Cyberpunk 2077 benchmark, the most visible issues are with some bits of barbed-wire fencing, which get smoother-looking and less detailed as you crank up the number of AI-generated frames. But the motion does look fluid and smooth, and the frame rate counts are admittedly impressive.

But as we noted in last year’s 4090 review, the xx90 cards portray FG and MFG in the best light possible since the card is already capable of natively rendering such high frame rates. It’s on lower-end cards where the shortcomings of the technology become more pronounced. Nvidia might say that the upcoming RTX 5070 is “as fast as a 4090 for $549,” and it might be right in terms of the number of frames the card can put up on your screen every second. But responsiveness and visual fidelity on the 4090 will be better every time—AI is a good augmentation for rendered frames, but it’s iffy as a replacement for rendered frames.

A 4090, amped way up

Nvidia’s GeForce RTX 5090. Credit: Andrew Cunningham

The GeForce RTX 5090 is an impressive card—it’s the only consumer graphics card released in over two years that can outperform the RTX 4090. The main caveats are its sky-high power consumption and sky-high price; by itself, it costs as much (and consumes as much power) as an entire mainstream gaming PC. The card is aimed at people who care about speed far more than they care about price, but it’s still worth putting it into context.

The main controversy, as with the 40-series, is how Nvidia talks about its Frame Generation-inflated performance numbers. Frame Generation and Multi-Frame Generation are tools in a toolbox—there will be games where they make things look great and run fast with minimal noticeable impact on visual quality or responsiveness, games where those impacts are more noticeable, and games that never add support for the features at all. (As well-supported as DLSS generally is in new releases, it is incumbent upon game developers to add it—and update it when Nvidia puts out a new version.)

But using those Multi-Frame Generation-inflated FPS numbers to make topline comparisons to last-generation graphics cards just feels disingenuous. No, an RTX 5070 will not be as fast as an RTX 4090 for just $549, because not all games support DLSS MFG, and not all games that do support it will run it well. Frame Generation still needs a good base frame rate to start with, and the slower your card is, the more issues you might notice.

Fuzzy marketing aside, Nvidia is still the undisputed leader in the GPU market, and the RTX 5090 extends that leadership for what will likely be another entire GPU generation, since both AMD and Intel are focusing their efforts on higher-volume, lower-cost cards right now. DLSS is still generally better than AMD’s FSR, and Nvidia does a good job of getting developers of new AAA game releases to support it. And if you’re buying this GPU to do some kind of rendering work or generative AI acceleration, Nvidia’s performance and software tools are still superior. The misleading performance claims are frustrating, but Nvidia still gains a lot of real advantages from being as dominant and entrenched as it is.

The good

  • Usually 30-something percent faster than an RTX 4090
  • Redesigned Founders Edition card is less unwieldy than the bricks that were the 4090/4080 design
  • Adequate cooling, despite the smaller card and higher power use
  • DLSS Multi-Frame Generation is an intriguing option if you’re trying to hit 240 or 360 fps on your high-refresh-rate gaming monitor

The bad

  • Much higher power consumption than the 4090, which already consumed more power than any other GPU on the market
  • Frame Generation is good at making a game that’s already running fast run faster; it’s not as good at bringing a slow game up to 60 fps
  • Nvidia’s misleading marketing around Multi-Frame Generation is frustrating—and will likely be more frustrating for lower-end cards since they aren’t getting the same bumps to core count and memory interface that the 5090 gets

The ugly

  • You can buy a whole lot of PC for $2,000, and we wouldn’t bet on this GPU being easy to find at MSRP


Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.

Researchers optimize simulations of molecules on quantum computers

The net result is a much faster operation involving far fewer gates. That’s important because errors in quantum hardware increase as a function of both time and the number of operations.

The researchers then used this approach to explore a chemical, Mn4O5Ca, that plays a key role in photosynthesis. Using this approach, they showed it’s possible to calculate what’s called the “spin ladder,” or the list of the lowest-energy states the electrons can occupy. The energy differences between these states correspond to the wavelengths of light they can absorb or emit, so this also defines the spectrum of the molecule.

Faster, but not quite fast enough

We’re not quite ready to run this system on today’s quantum computers, as the error rates are still a bit too high. But because the operations needed to run this sort of algorithm can be done so efficiently, the error rates don’t have to come down very much before the system becomes viable. The primary determinant of whether a run will hit an error is how long you let the simulated time evolution run, plus the number of measurements of the system you take over that time.
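As a toy illustration (our simplification, not the paper's actual error model), treating each gate or time step as an independent chance to fail shows why cutting the circuit depth matters so much for near-term hardware:

```python
# Toy model: per-shot success decays exponentially with circuit depth,
# and total hardware cost scales with snapshots x depth.
def per_shot_success(p_error_per_step, depth):
    return (1 - p_error_per_step) ** depth

def total_operations(depth, num_snapshots):
    return depth * num_snapshots

# At the same physical error rate, halving the depth dramatically
# improves the odds that any single run finishes without an error.
print(per_shot_success(1e-3, 10_000))  # deep, unoptimized circuit
print(per_shot_success(1e-3, 5_000))   # optimized, shallower circuit
```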

“The algorithm is especially promising for near-term devices having favorable resource requirements quantified by the number of snapshots (sample complexity) and maximum evolution time (coherence) required for accurate spectral computation,” the researchers wrote.

But the work also makes a couple of larger points. The first is that quantum computers are fundamentally unlike other forms of computation we’ve developed. They’re capable of running things that look like traditional algorithms, where operations are performed and a result is determined. But they’re also quantum systems that are growing in complexity with each new generation of hardware, which makes them great at simulating other quantum systems. And there are a number of hard problems involving quantum systems we’d like to solve.

In some ways, we may only be starting to scratch the surface of quantum computers’ potential. Up until quite recently, there were a lot of hypotheticals; it now appears we’re on the cusp of using one for some potentially useful computations. And that means more people will start thinking about clever ways we can solve problems with them—including cases like this, where the hardware would be used in ways its designers might not have even considered.

Nature Physics, 2025. DOI: 10.1038/s41567-024-02738-z  (About DOIs).

OpenAI launches Operator, an AI agent that can operate your computer

While it’s working, Operator shows a miniature browser window of its actions.

However, the technology behind Operator is still relatively new and far from perfect. The model reportedly performs best at repetitive web tasks like creating shopping lists or playlists. It struggles more with unfamiliar interfaces like tables and calendars, and does poorly with complex text editing (with a 40 percent success rate), according to OpenAI’s internal testing data.

OpenAI reported the system achieved an 87 percent success rate on the WebVoyager benchmark, which tests live sites like Amazon and Google Maps. On WebArena, which uses offline test sites for training autonomous agents, Operator’s success rate dropped to 58.1 percent. For computer operating system tasks, CUA set an apparent record of 38.1 percent success on the OSWorld benchmark, surpassing previous models but still falling short of human performance at 72.4 percent.

With this imperfect research preview, OpenAI hopes to gather user feedback and refine the system’s capabilities. The company acknowledges CUA won’t perform reliably in all scenarios but plans to improve its reliability across a wider range of tasks through user testing.

Safety and privacy concerns

For any AI model that can see how you operate your computer and even control some aspects of it, privacy and safety are very important. OpenAI says it built multiple safety controls into Operator, requiring user confirmation before completing sensitive actions like sending emails or making purchases. Operator also has limits on what it can browse, set by OpenAI. It cannot access certain website categories, including gambling and adult content.

Traditionally, Transformer-based AI models like the one underlying Operator have been relatively easy to fool with jailbreaks and prompt injections.

To catch attempts at subverting Operator, which might hypothetically be embedded in websites that the AI model browses, OpenAI says it has implemented real-time moderation and detection systems. OpenAI reports the system recognized all but one case of prompt injection attempts during an early internal red-teaming session.

Court rules FBI’s warrantless searches violated Fourth Amendment

“Certainly, the Court can imagine situations where obtaining a warrant might frustrate the purpose of querying, particularly where exigency requires immediate querying,” DeArcy Hall wrote. “This is why the Court does not hold that querying Section 702-acquired information always requires a warrant.”

Ruling renews calls for 702 reforms

While digital rights groups like the EFF and the American Civil Liberties Union (ACLU) cheered the ruling as providing much-needed clarity, they also suggested that the ruling should prompt lawmakers to go back to the drawing board and reform Section 702.

Section 702 is set to expire on April 15, 2026. Over the years, Congress has repeatedly voted to renew 702 protections, but the EFF is hoping that DeArcy Hall’s ruling will perhaps spark a sea change.

“In light of this ruling, we ask Congress to uphold its responsibility to protect civil rights and civil liberties by refusing to renew Section 702 absent a number of necessary reforms, including an official warrant requirement for querying US persons data and increased transparency,” the EFF wrote in a blog.

A warrant requirement could help truly end backdoor searches, the EFF suggested, and ensure “that the intelligence community does not continue to trample on the constitutionally protected rights to private communications.”

The ACLU warned that reforms are especially critical now, considering that unconstitutional backdoor searches have been “used by the government to conduct warrantless surveillance of Americans, including protesters, members of Congress, and journalists.”

Patrick Toomey, the deputy director of the ACLU’s National Security Project, dubbed 702 “one of the most abused provisions of FISA.”

“As the court recognized, the FBI’s rampant digital searches of Americans are an immense invasion of privacy and trigger the bedrock protections of the Fourth Amendment,” Toomey said. “Section 702 is long overdue for reform by Congress, and this opinion shows why.”

The EU’s AI Act

Have you ever been in a group project where one person decided to take a shortcut, and suddenly, everyone ended up under stricter rules? That’s essentially what the EU is saying to tech companies with the AI Act: “Because some of you couldn’t resist being creepy, we now have to regulate everything.” This legislation isn’t just a slap on the wrist—it’s a line in the sand for the future of ethical AI.

Here’s what went wrong, what the EU is doing about it, and how businesses can adapt without losing their edge.

When AI Went Too Far: The Stories We’d Like to Forget

Target and the Teen Pregnancy Reveal

One of the most infamous examples of AI gone wrong happened back in 2012, when Target used predictive analytics to market to pregnant customers. By analyzing shopping habits—think unscented lotion and prenatal vitamins—they managed to identify a teenage girl as pregnant before she told her family. Imagine her father’s reaction when baby coupons started arriving in the mail. It wasn’t just invasive; it was a wake-up call about how much data we hand over without realizing it.

Clearview AI and the Privacy Problem

On the law enforcement front, tools like Clearview AI created a massive facial recognition database by scraping billions of images from the internet. Police departments used it to identify suspects, but it didn’t take long for privacy advocates to cry foul. People discovered their faces were part of this database without consent, and lawsuits followed. This wasn’t just a misstep—it was a full-blown controversy about surveillance overreach.

The EU’s AI Act: Laying Down the Law

The EU has had enough of these oversteps. Enter the AI Act: the first major legislation of its kind, categorizing AI systems into four risk levels:

  1. Minimal Risk: Chatbots that recommend books—low stakes, little oversight.
  2. Limited Risk: Systems like AI-powered spam filters, requiring transparency but little more.
  3. High Risk: This is where things get serious—AI used in hiring, law enforcement, or medical devices. These systems must meet stringent requirements for transparency, human oversight, and fairness.
  4. Unacceptable Risk: Think dystopian sci-fi—social scoring systems or manipulative algorithms that exploit vulnerabilities. These are outright banned.

For companies operating high-risk AI, the EU demands a new level of accountability. That means documenting how systems work, ensuring explainability, and submitting to audits. If you don’t comply, the fines are enormous—up to €35 million or 7% of global annual revenue, whichever is higher.

Why This Matters (and Why It’s Complicated)

The Act is about more than just fines. It’s the EU saying, “We want AI, but we want it to be trustworthy.” At its heart, this is a “don’t be evil” moment, but achieving that balance is tricky.

On one hand, the rules make sense. Who wouldn’t want guardrails around AI systems making decisions about hiring or healthcare? But on the other hand, compliance is costly, especially for smaller companies. Without careful implementation, these regulations could unintentionally stifle innovation, leaving only the big players standing.

Innovating Without Breaking the Rules

For companies, the EU’s AI Act is both a challenge and an opportunity. Yes, it’s more work, but leaning into these regulations now could position your business as a leader in ethical AI. Here’s how:

  • Audit Your AI Systems: Start with a clear inventory. Which of your systems fall into the EU’s risk categories? If you don’t know, it’s time for a third-party assessment.
  • Build Transparency Into Your Processes: Treat documentation and explainability as non-negotiables. Think of it as labeling every ingredient in your product—customers and regulators will thank you.
  • Engage Early With Regulators: The rules aren’t static, and you have a voice. Collaborate with policymakers to shape guidelines that balance innovation and ethics.
  • Invest in Ethics by Design: Make ethical considerations part of your development process from day one. Partner with ethicists and diverse stakeholders to identify potential issues early.
  • Stay Dynamic: AI evolves fast, and so do regulations. Build flexibility into your systems so you can adapt without overhauling everything.

The Bottom Line

The EU’s AI Act isn’t about stifling progress; it’s about creating a framework for responsible innovation. It’s a reaction to the bad actors who’ve made AI feel invasive rather than empowering. By stepping up now—auditing systems, prioritizing transparency, and engaging with regulators—companies can turn this challenge into a competitive advantage.

The message from the EU is clear: if you want a seat at the table, you need to bring something trustworthy. This isn’t about “nice-to-have” compliance; it’s about building a future where AI works for people, not at their expense.

And if we do it right this time? Maybe we really can have nice things.

Satellite firm bucks miniaturization trend, aims to build big for big rockets

Although the price of this satellite bus is proprietary, various estimates place the cost at between $100 million and $150 million. One reason for the expense is that Lockheed Martin buys most of the satellite’s elements, such as its reaction wheels, from suppliers.

“Lockheed is amazing at doing those missions with really complex requirements,” Kunjur said. “But they just have not changed the way they build these larger, more complex spacecraft in the last 15 or 20 years.”

Vertical integration is the way?

K2 aims to disrupt this ecosystem. For example, the reaction wheels that Honeywell Aerospace sells to Lockheed cost approximately $500,000 to $1 million apiece. K2 is now on its fourth iteration of an internally built reaction wheel and has driven the cost down to $35,000. Kunjur said about 80 percent of K2’s satellite production is vertically integrated.

The company is now building its first “Mega Class” satellite bus, intended to have similar capabilities to Lockheed’s LM2100: 20 kW of power, 1,000 kg of payload capacity, and propulsion to move between orbits. But it’s also stackable: Ten will fit within a Falcon 9 payload fairing and about 50 within Starship’s fairing. The biggest difference is cost. K2 aims to sell its satellite bus for $15 million.

The US government is definitely interested in this capability. About a month ago, K2 announced that it had signed a contract with the US Space Force to launch its first Mega Class satellite in early 2026. The $60 million contract for the “Gravitas” mission will demonstrate the ability of K2’s satellite bus to host several experiments and successfully maneuver from low-Earth orbit to medium-Earth orbit (several thousand kilometers above the surface of Earth).

Although the Mega Class satellite is attractive to government and commercial customers—its lower cost could allow for larger constellations in medium-Earth and geostationary orbits—Kunjur said he and his brother, Neel Kunjur, founded K2 to enable more frequent science missions to other planets in the Solar System.

“We looked at the decadal studies and saw all the mission concept studies that were done,” Kunjur said. “There were maybe 50 studies over a 10-year period. And we realized that if NASA funding remains level, we’ll be able to do one or maybe two of these. So we decided to go after one of the big problems.”

So, if we’re moving into an era of launch abundance, K2 might just solve the problem of affordable science satellites to launch on all these rockets—if it all works, of course.



Trump issues flurry of orders on TikTok, DOGE, social media, AI, and energy


A roundup of executive orders issued by Trump after his second inauguration.

US President Donald Trump after being sworn in at his inauguration on January 20, 2025 in Washington, DC. Credit: Getty Images

President Donald Trump’s flurry of day-one actions included a reprieve for TikTok, the creation of a Department of Government Efficiency (DOGE), an order on social media “censorship,” a declaration of an energy emergency, and reversal of a Biden order on artificial intelligence.

The TikTok executive order attempts to delay enforcement of a US law that requires TikTok to be banned unless its Chinese owner ByteDance sells the platform. “I am instructing the Attorney General not to take any action to enforce the Act for a period of 75 days from today to allow my Administration an opportunity to determine the appropriate course forward in an orderly way that protects national security while avoiding an abrupt shutdown of a communications platform used by millions of Americans,” Trump’s order said.

TikTok shut down in the US for part of the weekend but re-emerged after Trump said on Sunday that he would issue an order to “extend the period of time before the law’s prohibitions take effect, so that we can make a deal to protect our national security.” Trump also suggested that the US should own half of TikTok.

Energy and Commerce Committee Ranking Member Frank Pallone, Jr. (D-N.J.) criticized Trump’s TikTok action. “I have serious concerns with President Trump’s executive order because he is circumventing national security legislation passed by an overwhelming bipartisan majority in Congress… ByteDance has had 270 days to sell TikTok to an American company, and it’s disgraceful they spent all that time playing political games rather than working to find a buyer,” Pallone said.

Trump’s order doesn’t necessarily remove liability for any company that helps TikTok stay available in the US, The Washington Post reported:

Legal experts and some lawmakers said that with the ban already in force, companies that host or distribute the app will be in violation and could be held liable, no matter what Trump says. Sen. Tom Cotton (R-Arkansas), chair of the Senate Intelligence Committee, warned Sunday after Trump detailed his TikTok plans that companies could still “face hundreds of billions of dollars of ruinous liability under the law,” even if Trump’s Justice Department does not enforce it.

Trump also issued an order revoking numerous Biden administration orders. One is an October 2023 order titled Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. That Biden order, as we wrote at the time, “includes testing mandates for advanced AI models to ensure they can’t be used for creating weapons, suggestions for watermarking AI-generated media, and provisions addressing privacy and job displacement.”

In other White House actions we wrote about yesterday and today, Trump ordered the US to withdraw from the World Health Organization and reversed steps taken to promote electric vehicles.

DOGE

Trump’s executive order establishing a Department of Government Efficiency has been expected since November, when he announced the plan and said that DOGE would be led by Elon Musk and former Republican presidential candidate Vivek Ramaswamy. Instead of creating a brand-new department, the order gives a new name to the existing US Digital Service.

“The United States Digital Service is hereby publicly renamed as the United States DOGE Service (USDS) and shall be established in the Executive Office of the President,” Trump’s order said.

The US Digital Service was launched in 2014 by the Obama administration as a “small team of America’s best digital experts” to “work in collaboration with other government agencies to make websites more consumer friendly, to identify and fix problems, and to help upgrade the government’s technology infrastructure.”

Trump said in November that DOGE “will pave the way for my Administration to dismantle Government Bureaucracy, slash excess regulations, cut wasteful expenditures, and restructure Federal Agencies.” Yesterday’s executive order said the department will focus on “modernizing Federal technology and software to maximize governmental efficiency and productivity.”

Federal agencies will have to collaborate with DOGE. “Among other things, the USDS Administrator shall work with Agency Heads to promote inter-operability between agency networks and systems, ensure data integrity, and facilitate responsible data collection and synchronization,” the order said. “Agency Heads shall take all necessary steps, in coordination with the USDS Administrator and to the maximum extent consistent with law, to ensure USDS has full and prompt access to all unclassified agency records, software systems, and IT systems. USDS shall adhere to rigorous data protection standards.”

Speech on social media

Trump tackled social media in an order titled Restoring Freedom of Speech and Ending Federal Censorship. The order targets the Biden administration’s practice of contacting social media platforms about content that government officials believe should have been moderated or blocked.

In 2023, the Supreme Court blocked an injunction that would have prevented the Biden administration from pressuring social media firms to take down content. Justices expressed skepticism during oral arguments about whether federal government officials should face limits on their communications with social media networks like Facebook and ruled in favor of the Biden administration in June 2024.

Despite the Biden court win, Trump’s order described the Biden administration’s approach as a threat to the First Amendment.

“Over the last 4 years, the previous administration trampled free speech rights by censoring Americans’ speech on online platforms, often by exerting substantial coercive pressure on third parties, such as social media companies, to moderate, deplatform, or otherwise suppress speech that the Federal Government did not approve,” Trump’s order said. “Under the guise of combatting ‘misinformation,’ ‘disinformation,’ and ‘malinformation,’ the Federal Government infringed on the constitutionally protected speech rights of American citizens across the United States in a manner that advanced the Government’s preferred narrative about significant matters of public debate. Government censorship of speech is intolerable in a free society.”

The order goes on to say that federal government employees and officials are prohibited from “engag[ing] in or facilitat[ing] any conduct that would unconstitutionally abridge the free speech of any American citizen.” Trump further directed his administration to “identify and take appropriate action to correct past misconduct by the Federal Government related to censorship of protected speech.”

Fossil fuels good, wind bad

On the energy front, the most striking executive order is one declaring that the US is facing an energy emergency. This comes despite the fact that the US has been producing, in the words of its own agency, “more crude oil than any country, ever.” It’s also producing record volumes of natural gas. Prices for both have been low in part due to this large supply. Yet the executive order states that “identification, leasing, development, production, transportation, refining, and generation capacity of the United States are all far too inadequate to meet our Nation’s needs.”

The order describes ways to streamline permitting for all of these under emergency provisions overseen by the US Army Corps of Engineers. On the face of it, this would seem to also be good for wind and solar power, which are produced domestically and suffer from permitting barriers and a backlog of requests for connections to the grid. But toward the end of the text, “energy” is defined in a way that excludes wind and solar. “The term ‘energy’ or ‘energy resources’ means crude oil, natural gas, lease condensates, natural gas liquids, refined petroleum products, uranium, coal, biofuels, geothermal heat, the kinetic movement of flowing water, and critical minerals,” the order says.

If the animosity toward the fastest-growing sources of renewable energy weren’t clear there, a separate executive order makes it explicit, as Trump is putting a temporary end to all offshore wind lease sales. “This withdrawal temporarily prevents consideration of any area in the [Offshore Continental Shelf] for any new or renewed wind energy leasing for the purposes of generation of electricity or any other such use derived from the use of wind,” it reads. “This withdrawal does not apply to leasing related to any other purposes such as, but not limited to, oil, gas, minerals, and environmental conservation.”

The ostensible reason for this is “alleged legal deficiencies” in the environmental reviews that were conducted prior to the leasing process. There will also be an attempt to claw back existing leases. The secretary of the interior and attorney general are instructed to “conduct a comprehensive review of the ecological, economic, and environmental necessity of terminating or amending any existing wind energy leases.”

As an added bonus, the same accusations of legal deficiencies are leveled against a single land-based project, the proposed Lava Ridge wind farm in Idaho. So all government activities related to that project are on hold until it’s reviewed.

“Burdensome” regulations targeted

When it comes to fossil fuel development on the continental shelf, a Trump order alleges that “burdensome and ideologically motivated regulations” are impeding development. The order takes several steps to speed up permitting of fossil fuel projects. It also kills a grab bag of climate-related programs.

One of the most prominent efforts is to do away with the emissions waivers, allowed under the Clean Air Act, which enable states like California to set stricter rules than the federal government. The Supreme Court recently declined even to consider an attempt to challenge these waivers. Yet as part of an attack on electric vehicles, the administration is adopting a policy of “terminating, where appropriate, state emissions waivers that function to limit sales of gasoline-powered automobiles.”

Also targeted for termination is the American Climate Corps, a job training program focused on people entering the workforce. The Biden administration’s effort to determine and consider the social cost of carbon emissions during federal rulemaking will also be ended.

Several federal rules and executive orders will be targets, notably those on implementing the energy provisions of the Inflation Reduction Act, which have subsidized renewable energy and funded programs like carbon capture and hydrogen production. Many of these are already formal rules published in the Federal Register, which means that new rulemaking processes will be required to eliminate them, something that typically takes over a year and can be subject to court challenge.

In a separate part of the order, titled “Terminating the Green New Deal,” Trump suspends funding provided under two laws that were not part of the Green New Deal: the Inflation Reduction Act and the Infrastructure Investment and Jobs Act. Given that those funds have already been allocated by Congress, it’s not clear how long Trump can delay this spending.

Finally, Trump decided he would attack the foundation of US efforts to limit greenhouse gas emissions: the EPA’s finding that greenhouse gases are a threat to the public as defined by the Clean Air Act. The endangerment finding is solidly based on well-established science, so much so that attempts to challenge it during the first Trump administration were reportedly abandoned as being unrealistic. Now, the incoming EPA administrator is given just 30 days to “submit joint recommendations to the Director of [Office of Management and Budget] on the legality and continuing applicability of the Administrator’s findings.”

Photo of Jon Brodkin

Jon is a Senior IT Reporter for Ars Technica. He covers the telecom industry, Federal Communications Commission rulemakings, broadband consumer affairs, court cases, and government regulation of the tech industry.



Edge of Mars’ great dichotomy eroded back by hundreds of kilometers

A shoreline transformed?

The huge area covered by these mounds gives a sense of just how significant this erosion was. “The dichotomy boundary has receded several hundred kilometres,” the researchers note. “Nearly all intervening material—approximately 57,000 cubic kilometers over an area of 284,000 square kilometers west of Ares Vallis alone—has been removed, leaving only remnant mounds.”

Based on the distribution of the different clays, the team argues that their water-driven formation took place before the erosion of the material. This would indicate that water-rock interactions were going on over a very wide region early in the history of Mars, which likely required an extensive hydrological cycle on the red planet. As the researchers note, a nearby ocean would have improved the chances of exposing this region to water, but the exposure could also have been due to processes like melting at the base of an ice cap.

Complicating matters further, many of the mounds top out below one proposed shoreline of the northern ocean and above a second. It’s possible that a receding ocean could have contributed to their erosion. But, at the same time, some of the features of a proposed shoreline now appear to have been caused by the general erosion of the original plateau, and may not be associated with an ocean at all.

Overall, the new results provide mixed evidence for the presence of a Martian ocean. They clearly show an active water cycle and erosion on a massive scale, which are both consistent with having a lot of water around. At the same time, however, the water exposure the mesas and buttes have experienced needn’t have come through their being submerged by said ocean and, given their elevation, might best be explained through some other process.

Nature Geoscience, 2025. DOI: 10.1038/s41561-024-01634-8 (About DOIs).



Robotic hand helps pianists overcome “ceiling effect”

Fast and complex multi-finger movements generated by the hand exoskeleton. Credit: Shinichi Furuya

When it comes to fine-tuned motor skills like playing the piano, practice, they say, makes perfect. But expert musicians often experience a “ceiling effect,” in which their skill level plateaus after extensive training. Passive training using a robotic exoskeleton hand could help pianists overcome that ceiling effect, according to a paper published in the journal Science Robotics.

“I’m a pianist, but I [injured] my hand because of overpracticing,” coauthor Shinichi Furuya of Sony Computer Science Laboratories told New Scientist. “I was suffering from this dilemma, between overpracticing and the prevention of the injury, so then I thought, I have to think about some way to improve my skills without practicing.” Recalling that his former teachers used to place their hands over his to show him how to play more advanced pieces, he wondered if he could achieve the same effect with a robotic hand.

So Furuya et al. used a custom-made exoskeleton robot hand capable of moving individual fingers on the right hand independently, flexing and extending the joints as needed. Per the authors, prior studies with robotic exoskeletons focused on simpler movements, such as assisting in the movement of limbs, stabilizing body posture, or helping grasp objects. That sets the custom robotic hand used in these latest experiments apart from those used for haptics in virtual environments.

A helping robot hand

A total of 118 pianists participated in three different experiments. In the first, 30 pianists performed a designated “chord trill” motor task with the piano at home every day for two weeks: first simultaneously striking D and F keys with the right index and ring fingers, then striking the E and G keys with the right middle and little fingers. “We used this task because it has been widely recognized as technically challenging to play quickly and accurately,” the authors explained. It appears in such classical pieces as Chopin’s Etude Op. 25, No. 6, Maurice Ravel’s “Ondine,” and the first movement of Beethoven’s Piano Sonata No. 3.



TikTok is mostly restored after Trump pledges an order and half US ownership

At a rally Sunday, he did not clarify if this meant a US-based business or the government itself. “So they’ll have a partner, the United States, and they’ll have a lot of bidders … And there’s no risk, we’re not putting up any money. All we’re doing is giving them the approval without which they don’t have anything,” Trump said Sunday.

Legal limbo

Trump’s order, and TikTok’s return to service, both seem at odds with the law—and leadership in the Republican party. Speaker Mike Johnson said on NBC’s Meet the Press Sunday that Congress would “enforce the law.” Sens. Tom Cotton (R-Ark.) and Pete Ricketts (R-Neb.) issued a joint statement Sunday, commending Apple, Microsoft, and Google for “following the law,” and noting that other companies “face ruinous bankruptcy” for violating it.

“Now that the law has taken effect, there’s no legal basis for any kind of ‘extension’ of its effective date,” the statement read. The law states that “A path to executing a qualified divestiture” has to be determined before a one-time extension of 90 days can be granted.

TikTok’s best chance at avoiding a shutdown vanished in last week’s unanimous Supreme Court decision upholding the divest-or-sell law. The court ruled that the law, which is aimed at protecting national security interests from TikTok’s Chinese owners having access to the habits and data of 170 million American users, is “content-neutral” and that the US “had good reason to single out TikTok for special treatment.”

Reports at Forbes, Bloomberg, and elsewhere have suggested that ByteDance and its Chinese owners could be seeking to use TikTok as a bargaining chip, with maneuvers including a sale to Trump ally Elon Musk as a means of counteracting Trump’s proposed tariffs on Chinese imports.

One largely unforeseen side effect of Congress’ TikTok-centered actions is that Marvel Snap, a mobile collectible card and deck-building game, disappeared in similar fashion over the weekend. The game, developed by a California-based team, is published by ByteDance’s Nuverse mobile game division. With no web version available, Snap remained unavailable on app stores Monday morning. A message to players with the game installed noted that “This outage is a surprise to us and wasn’t planned,” though it pledged to restore the game.



Has Trump changed the retirement plans for the country’s largest coal plants?


A growth in electricity demand is leading to talk of delayed closures.

A house is seen near the Gavin Power Plant in Cheshire, Ohio. Credit: Stephanie Keith/Getty Images

This article originally appeared on Inside Climate News, a nonprofit, non-partisan news organization that covers climate, energy, and the environment. Sign up for their newsletter here.

There is renewed talk of a coal power comeback in the United States, inspired by Donald Trump’s return to the presidency and forecasts of soaring electricity demand.

The evidence so far only shows that some plants are getting small extensions on their retirement dates. This means a slowdown in coal’s rate of decline, which is bad for the environment, but it does little to change the long-term trajectory for the domestic coal industry.

In October, I wrote about how five of the country’s 10 largest coal-fired power plants had retirement dates. Today, I’m revisiting the list, providing some updates and then taking a few steps back to look at US coal plants as a whole. Consider this the “before” picture that can be judged against the “after” in four years.

Some coal plant owners have already pushed back retirement timetables. The largest example, this one from just before the election, is the Gibson plant in Indiana, the second-largest coal plant in the country. It’s set to close in 2038 instead of 2035, following an announcement in October from the owner, Duke Energy.

But the changes do not constitute a coal comeback in this country. For that to happen, power companies would need to be building new plants to replace the many that are closing, and there is almost no development of new coal plants.

That said, there have been some changes since October.

As recently as a few months ago, Southern Co. was saying it intended to close Plant Bowen in Georgia by 2035 at the latest. Bowen is the largest coal plant in the country, with a summer capacity of 3,200 megawatts.

Southern has since said it may extend the plant’s life in response to forecasts of rising electricity demand. Chris Womack, Southern’s CEO, confirmed this possibility when speaking at a utility industry conference in November, saying that the plant may need to operate for longer than previously planned because of demand from data centers.

Southern has not yet made regulatory filings that spell out its plans, but this will likely occur in the next few weeks, according to a company spokesman.

In October, I reported that the Gavin plant in Ohio was likely to get a 2031 date to retire or switch to a different fuel once the plant’s pending sale was completed. The person who shared that information with me was involved with the plans and spoke on condition of anonymity because the sale was not final.

Since then, the prospective buyer of the plant has said in federal regulatory filings that it has no timetable for closing the plant or switching to a different fuel. The plant is changing hands as part of a larger deal between investment firms, with Lightstone Holdco selling to Energy Capital Partners, or ECP. Another company, coal exporter Javelin Global Commodities, is buying a minority share of the Gavin plant.

I went back to the person who told me about the 2031 retirement date. They said forecasts of rising electricity demand, as well as the election of Trump, have created enough uncertainty about power prices and regulations that it makes sense to not specify a date.

Both the 2031 timeline and its abandonment make some sense once you understand that the Biden administration finalized power plant regulations last spring that gave coal plant operators an incentive to announce a retirement date: Plants closing before 2032 faced no new requirements. That incentive is likely to go away as Trump plans to roll back power plant pollution regulations.

Gavin’s sale is still pending. Several parties have filed objections to the transaction with the Federal Energy Regulatory Commission, arguing that the sellers have not been clear enough about their plans.

An ECP spokesman said the company has no comment beyond its filings.

Other than the changes to plans for Bowen and Gavin, the outlook has not shifted for the rest of the plants among the 10 largest. The Gibson and Rockport plants in Indiana still have retirement dates, as do Cumberland in Tennessee and Monroe in Michigan, according to the plants’ owners.

The Amos plant in West Virginia, Miller in Alabama, Scherer in Georgia, and Parish in Texas didn’t have retirement dates a few months ago, and they still don’t.

But the largest coal plants are only part of the story. Several dozen smaller plants are getting extensions of retirement plans, as Emma Foehringer Merchant reported last week for Floodlight News.

One example is the 1,157-megawatt Baldwin plant in Illinois, which was scheduled to close this year. Now the owner, Vistra Corp., has pushed back the retirement to 2027.

A few extra years of a coal plant is more of a stopgap than a long-term solution. When it comes to building new power plants to meet demand, developers are talking about natural gas, solar, nuclear, and other resources, but I have yet to see a substantial discussion of building a new coal plant.

In Alaska, Gov. Mike Dunleavy has said the state may build two coal plants to provide power in remote mining areas, as reported by Taylor Kuykendall of S&P Global Commodity Insights. Flatlands Energy, a Canadian company, has also talked about building a 400-megawatt coal plant in Alaska, as Nathaniel Herz reported for Alaska Beacon. These appear to be early-stage plans.

The lack of development activity underscores how coal power is fading in this country, and has been for a while.

Coal was used to generate 16 percent of US electricity in 2023, down by more than half from 2014. In that time, coal went from the country’s leading fuel for electricity to trailing natural gas, renewables, and nuclear. (These and all the figures that follow are from the US Energy Information Administration.)

The United States had about 176,000 megawatts of coal plant capacity as of October, down from about 300,000 megawatts in 2014.

The coal plants that do remain are being used less. In 2023, the average capacity factor for a coal plant was 42 percent. Capacity factor is a measure of how much electricity a plant generated relative to the maximum possible if it were running at full output all the time. In 2014, the average capacity factor was 61 percent.
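The capacity-factor arithmetic described above can be sketched directly (the 1,000 MW plant here is hypothetical; the 42 percent figure is the article's 2023 fleet average):

```python
def capacity_factor(generation_mwh: float, capacity_mw: float, hours: int = 8760) -> float:
    """Electricity actually generated as a fraction of the maximum possible
    if the plant had run at full capacity for every hour of the period."""
    return generation_mwh / (capacity_mw * hours)

# A hypothetical 1,000 MW plant generating 3,679,200 MWh over a year
# matches the 2023 fleet-average capacity factor of 42 percent.
print(capacity_factor(3_679_200, 1_000))  # → 0.42
```

By the same arithmetic, the 2014 fleet average of 61 percent corresponds to that plant generating about 5.34 million MWh per year.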

Power companies are burning less coal because of the availability of less expensive alternatives, such as natural gas, wind, and solar, among others. The think tank Energy Innovation issued a report in 2023 finding that 99 percent of US coal-fired power plants cost more to operate than the cost of replacement with a combination of wind, solar, and batteries.

The Trump administration will arrive in Washington with promises to help fossil fuels. It could extend the lives of some coal plants by weakening environmental regulations, which may reduce the plants’ operational costs. It also could repeal or revise subsidies that help to reduce the costs of renewables and batteries, making those resources more expensive.

I don’t want to minimize the damage that could be caused by those policies. But even in extreme scenarios, it’s difficult to imagine investors wanting to spend billions of dollars to develop a new coal plant, much less a fleet of them.

Photo of Inside Climate News



GM faces ban on selling driver data that can be used to raise insurance rates

The FTC said its complaint alleged that “GM used a misleading enrollment process to get consumers to sign up for its OnStar connected vehicle service and the OnStar Smart Driver feature.” Lina Khan, who is in her final week as FTC chair, said that “GM monitored and sold people’s precise geolocation data and driver behavior information, sometimes as often as every three seconds.”

Settlement not quite finalized

The proposed settlement was approved in a closed meeting by the FTC’s three Democrats, with the two Republicans recorded as absent. The pending agreement will be subject to public comment for 30 days after publication in the Federal Register, and a final FTC decision will be made under the Trump administration.

In addition to location data, the GM/FTC settlement covers “radio listening data regarding specific content, channel, or station; hard braking, hard acceleration, hard cornering, crossing of a designated high-speed threshold, seat belt usage, or late-night driving; and trip time and duration for such events.” GM and OnStar agreed to delete data collected before the settlement and ask third parties to delete data previously shared with them.

GM also “must allow consumers to disable the collection of Location Data from their Vehicles to the extent the Vehicle is equipped with the necessary technology.”

GM issued a press release on the settlement. “Last year, we discontinued Smart Driver across all GM vehicles, unenrolled all customers, and ended our third-party telematics relationships with LexisNexis and Verisk,” GM said. “In September, we consolidated many of our US privacy statements into a single, simpler statement as part of our broader work to keep raising the bar on privacy… As part of the agreement, GM will obtain affirmative customer consent to collect, use, or disclose certain types of connected vehicle data (with exceptions for certain purposes).”

Affirmative consent is not required for purposes such as providing driver data to emergency responders, responding to customer-initiated communications, complying with government requests and legal requirements, and investigating product quality or safety problems. While the ban on sharing driving data lasts only five years, the overall settlement would be in place for 20 years.
