

Framework’s first desktop is a strange—but unique—mini ITX gaming PC

In Framework’s first-party case, the PC starts at $1,099, which gets you a Ryzen AI Max 385 (that’s an 8-core CPU and 32 GPU cores) and 32GB of RAM. A fully loaded 128GB with a Ryzen AI Max+ 395 configuration (16 CPU cores, 40 GPU cores) will run you $1,999. There’s also an in-between build with the Ryzen AI Max+ 395 chip and 64GB of RAM for $1,599. If you just want the mini ITX board to put in a case of your choosing, that starts at $799.

None of these are impulse buys, exactly, but they’re priced a bit better than a gaming-focused mini PC like the Asus ROG NUC, which starts at nearly $1,300 as of this writing and comes with half as much RAM. It’s also priced well compared to what you can get out of a DIY mini ITX PC based on integrated graphics—the Ryzen 7 8700G, an AM5 ITX motherboard, and 32GB of DDR5 can all be had for around $500 collectively before you add a case, power supply, or SSD, but for considerably slower performance.

The volume of the Framework Desktop’s first-party case is just 4.5 liters—for reference, the SSUPD Meshroom S is 14.9 liters, a fairly middle-of-the-road volume for an ITX case that can fit a full-size GPU. An Xbox Series X is about 6.9 liters, and the Xbox Series S is 4.4 liters. Apple’s Mac Studio is about 3.7 liters. The Framework Desktop isn’t breaking records, but it’s definitely tiny.

Despite the non-upgradeability of the main components, Framework has tried to stick to existing standards where it can by using a flex ATX power supply, ATX headers on the motherboard, regular 120 mm fans that can be changed out, and of course the mini ITX form factor itself. Credit: Framework

So the pitch for the system is easy: You get a reasonably powerful 1440p-capable gaming and workstation PC inside a case the size of a small game console. “If the Series S could run Windows, I’d buy it in a second” is a thought that has occurred to me, so I can see the appeal, even though it costs at least three times as much.

But it does feel like a strange fit for Framework, given that it’s so much less upgradeable than most PCs. The CPU and GPU are one piece of silicon, and they’re soldered to the motherboard. The RAM is also soldered down and not upgradeable once you’ve bought it, setting it apart from nearly every other board Framework sells.



Nvidia GeForce RTX 5070 Ti review: An RTX 4080 for $749, at least in theory


may the odds be ever in your favor

It’s hard to review a product if you don’t know what it will actually cost!

The Asus Prime GeForce RTX 5070 Ti. Credit: Andrew Cunningham

Nvidia’s RTX 50-series makes its first foray below the $1,000 mark starting this week, with the $749 RTX 5070 Ti—at least in theory.

The third-fastest card in the Blackwell GPU lineup, the 5070 Ti is still far from “reasonably priced” by historical standards (the 3070 Ti was $599 at launch). But it’s also $50 cheaper and a fair bit faster than the outgoing 4070 Ti Super and the older 4070 Ti. These are steps in the right direction, if small ones.

We’ll talk more about its performance shortly, but at a high level, the 5070 Ti’s performance falls in the same general range as the 4080 Super and the original RTX 4080, a card that launched for $1,199 just over two years ago. And it’s probably your floor for consistently playable native 4K gaming, for those of you who don’t want to rely on DLSS upscaling to hit that resolution (it’s also probably all the GPU that most people will need for high-FPS 1440p, if that’s more your speed).

But it’s a card I’m ambivalent about! It’s close to 90 percent as fast as a 5080 for 75 percent of the price, at least if you go by Nvidia’s minimum list prices, which for the 5090 and 5080 have been mostly fictional so far. If you can find it at that price—and that’s a big “if,” since every $749 model is already out of stock across the board at Newegg—and you’re desperate to upgrade or are building a brand-new 4K gaming PC, you could do worse. But I wouldn’t spend more than $749 on it, and it might be worth waiting to see what AMD’s first 90-series Radeon cards look like in a couple weeks before you jump in.

Meet the GeForce RTX 5070 Ti

|  | RTX 5080 | RTX 4080 Super | RTX 5070 Ti | RTX 4070 Ti Super | RTX 4070 Ti | RTX 5070 |
| --- | --- | --- | --- | --- | --- | --- |
| CUDA Cores | 10,752 | 10,240 | 8,960 | 8,448 | 7,680 | 6,144 |
| Boost Clock | 2,617 MHz | 2,550 MHz | 2,452 MHz | 2,610 MHz | 2,610 MHz | 2,512 MHz |
| Memory Bus Width | 256-bit | 256-bit | 256-bit | 256-bit | 192-bit | 192-bit |
| Memory Bandwidth | 960 GB/s | 736 GB/s | 896 GB/s | 672 GB/s | 504 GB/s | 672 GB/s |
| Memory size | 16GB GDDR7 | 16GB GDDR6X | 16GB GDDR7 | 16GB GDDR6X | 12GB GDDR6X | 12GB GDDR7 |
| TGP | 360 W | 320 W | 300 W | 285 W | 285 W | 250 W |

Nvidia isn’t making a Founders Edition version of the 5070 Ti, so this time around our review unit is an Asus Prime GeForce RTX 5070 Ti provided by Asus and Nvidia. These third-party cards will deviate a little from the stock specs listed above, but factory overclocks tend to be exceedingly mild, done mostly so the GPU manufacturer can slap a big “overclocked” badge somewhere on the box. We tested this Asus card with its BIOS switch set to “performance” mode, which elevates the boost clock by an entire 30 MHz; you don’t need to be a math whiz to guess that a roughly 1.2 percent overclock is not going to change performance much.

Compared to the 4070 Ti Super, the 5070 Ti brings two things to the table: a roughly 6 percent increase in CUDA cores and a 33 percent increase in memory bandwidth, courtesy of the switch from GDDR6X to GDDR7. The original 4070 Ti had even fewer CUDA cores, but most importantly for its 4K performance included just 12GB of memory on a 192-bit bus.
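Both percentages come straight out of the spec table; here's a quick back-of-the-envelope check using the table's figures:

```python
# Percentage deltas quoted above, computed from the spec table
cores_5070_ti, cores_4070_ti_super = 8_960, 8_448
bandwidth_5070_ti, bandwidth_4070_ti_super = 896, 672  # GB/s

core_gain = cores_5070_ti / cores_4070_ti_super - 1
bandwidth_gain = bandwidth_5070_ti / bandwidth_4070_ti_super - 1

print(f"CUDA cores: +{core_gain:.1%}")             # +6.1%
print(f"Memory bandwidth: +{bandwidth_gain:.1%}")  # +33.3%
```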

The 5070 Ti is based on the same GB203 GPU silicon as the 5080 series, but with 1,792 CUDA cores disabled. There are still a lot of similarities between the two, including the 16GB bank of GDDR7 and the 256-bit memory bus; it’s nothing like the yawning gap between the RTX 5090 and the RTX 5080, and the two cards’ similar specs meant they weren’t too far away from each other in our testing. The 5070 Ti’s 300 W power requirement is also a bit lower than the 5080’s 360 W, though it’s pretty close to the 4080 and 4080 Super’s 320 W; in practice, the 5070 Ti draws about as much power as the 4080 cards do under load.

Asus’ design for its Prime RTX 5070 Ti is an inoffensive 2.5-slot, triple-fan card that should fit without a problem in most builds. Credit: Andrew Cunningham

As a Blackwell GPU, the 5070 Ti also supports Nvidia’s most-hyped addition to the 50-series: support for DLSS 4 and Multi-Frame Generation (MFG). We’ve already covered this in our 5090 and 5080 reviews, but the short version is that MFG works exactly like Frame Generation did in the 40-series, except that it can now insert up to three AI-generated frames in between natively rendered frames instead of just one.

Especially if you’re already running at a reasonably high frame rate, this can make things look a lot smoother on a high-refresh-rate monitor without introducing distracting lag or weird rendering errors. The feature is mainly controversial because Nvidia is comparing 50-series performance numbers with DLSS MFG enabled to older 40-series cards without DLSS MFG to make the 50-series cards seem a whole lot faster than they actually are.

We’ll publish some frame-generation numbers in our review, both using DLSS and (for AMD cards) FSR. But per usual, we’ll continue to focus on natively rendered performance—more relevant for all the games out there that don’t support frame generation or don’t benefit much from it, and more relevant because your base performance dictates how good your generated frames will look and feel anyway.

Testbed notes

We tested the 5070 Ti in the same updated testbed and with the same updated suite of games that we started using in our RTX 5090 review. The heart of the build is an AMD Ryzen 7 9800X3D, ensuring that our numbers are limited as little as possible by the CPU speed.

Per usual, we prioritize testing GPUs at resolutions that we think most people will use them for. For the 5070 Ti, that means both 4K and 1440p—this card is arguably still overkill for 1440p, but if you’re trying to hit 144 or 240 Hz (or even more) on a monitor, there’s a good case to be made for it. We also use a mix of ray-traced and non-ray-traced games. For the games we test with upscaling enabled, we use DLSS on Nvidia cards and the newest supported version of FSR (usually 2.x or 3.x) for AMD cards.

Though we’ve tested and re-tested multiple cards with recent drivers in our updated testbed, we don’t have a 4070 Ti Super, 4070 Ti, or 3070 Ti available to test with. We’ve provided some numbers for those GPUs from past reviews; these are from a PC running older drivers and a Ryzen 7 7800X3D instead of a 9800X3D, and we’ve put asterisks next to them in our charts. They should still paint a reasonably accurate picture of the older GPUs’ relative performance, but take them with that small grain of salt.

Performance and power

Despite having fewer CUDA cores than either version of the 4080, the 5070 Ti keeps pace with both 4080 cards almost perfectly, thanks to some combination of architectural improvements and increased memory bandwidth. In most of our tests, it landed in the narrow strip right in between the 4080 and the 4080 Super, and its power consumption under load was also almost identical.

Benchmarks with DLSS/FSR and/or frame generation enabled.

In every way that matters, the 5070 Ti is essentially an RTX 4080 that also supports DLSS Multi-Frame Generation. You can see why we’d be mildly enthusiastic about it at $749 but less and less impressed the closer the price creeps to $1,000.

Being close to a 4080 also means that the performance gap between the 5070 Ti and the 5080 is usually pretty small. In most of the games we tested, the 5070 Ti hovers right around 90 percent of the 5080’s performance.

The 5070 Ti is also around 60 percent as fast as an RTX 5090. The performance is a lot lower, but the price-to-performance ratio is a lot higher, possibly reflecting the fact that the 5070 Ti actually has other GPUs it has to compete with (in non-ray-traced games, the Radeon RX 7900 XTX generally keeps pace with the 5070 Ti, though at this late date it is mostly out of stock unless you’re willing to pay way more than you ought to for one).
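Using Nvidia's list prices and this review's rough relative-performance figures (with the 5090 as the 1.0 baseline), the value argument is easy to sketch; these are approximations, not benchmark results:

```python
# Rough performance-per-dollar at Nvidia list prices; the relative
# performance figures are this review's approximations (5090 = 1.0)
cards = {
    "RTX 5090":    (1999, 1.00),
    "RTX 5080":    (999, 0.67),   # 5070 Ti is ~90% as fast as this card
    "RTX 5070 Ti": (749, 0.60),   # ~60% of a 5090
}

for name, (price, perf) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf per $1,000")
```

At list price, the 5070 Ti comes out comfortably ahead of both of its more expensive siblings on this metric; the whole argument collapses, of course, if street prices drift far above $749.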

Compared to the old 4070 Ti, the 5070 Ti can be between 20 and 50 percent faster at 4K, depending on how limited the game is by the 4070 Ti’s narrower memory bus and 12GB bank of RAM. The performance improvement over the 4070 Ti Super is more muted, ranging from as little as 8 percent to as much as 20 percent in our 4K tests. This is better than the RTX 5080 did relative to the RTX 4080 Super, but as a generational leap, it’s still pretty modest—it’s clear why Nvidia wants everyone to look at the Multi-Frame Generation numbers when making comparisons.

Waiting to put theory into practice

Asus’ RTX 5070 Ti, replete with 12-pin power plug. Credit: Andrew Cunningham

Being able to get RTX 4080-level performance for several hundred dollars less just a couple of years after the 4080 launched is kind of exciting, though that excitement is tempered by the still high-ish $749 price tag (again, assuming it’s actually available at or anywhere near that price). That certainly makes it feel more like a next-generation GPU than the RTX 5080 did—and whatever else you can say about it, the 5070 Ti certainly feels like a better buy than the 5080.

The 5070 Ti is a fast and 4K-capable graphics card, fast enough that you should be able to get some good results from all of Blackwell’s new frame-generation trickery if that’s something you want to play with. Its price-to-performance ratio does not thrill me, but if you do the math, it’s still a much better value than the 4070 Ti series was—particularly the original 4070 Ti, whose 12GB allotment of RAM limited its usefulness and future-proofing at 4K.

Two reasons to hold off on buying a 5070 Ti, if you’re thinking about it: We’re waiting to see how AMD’s 9070 series GPUs shake out, and Nvidia’s 50-series launch so far has been kind of a mess, with low availability and price gouging both on retail sites and in the secondhand market. Pay much more than $749 for a 5070 Ti, and its delicate value proposition fades quickly. We should know more about the AMD cards in a couple of weeks. The supply situation, at least so far, seems like a problem that Nvidia can’t (or won’t) figure out how to solve.

The good

  • For a starting price of $749, you get the approximate performance and power consumption of an RTX 4080, a GPU that cost $1,199 two years ago and $999 one year ago.
  • Good 4K performance and great 1440p performance for those with high-refresh monitors.
  • 16GB of RAM should be reasonably future-proof.
  • Multi-Frame Generation is an interesting performance-boosting tool to have in your toolbox, even if it isn’t a cure-all for low framerates.
  • Nvidia-specific benefits like DLSS support and CUDA.

The bad

  • Not all that much faster than a 4070 Ti Super.
  • $749 looks cheap compared to a $2,000 GPU, but it’s still enough money to buy a high-end game console or an entire 1080p gaming PC.

The ugly

  • Pricing and availability for other 50-series GPUs to date have both been kind of a mess.
  • Will you actually be able to get it for $749? Because it doesn’t make a ton of sense if it costs more than $749.
  • Seriously, it’s been months since I reviewed a GPU that was actually widely available at its advertised price.
  • And it’s not just the RTX 5090 or 5080, it’s low-end stuff like the Intel Arc B580 and B570, too.
  • Is it high demand? Low supply? Scalpers and resellers hanging off the GPU market like the parasites they are? No one can say!
  • It makes these reviews very hard to do.
  • It also makes PC gaming, as a hobby, really difficult to get into if you aren’t into it already!
  • It just makes me mad is all.
  • If you’re reading this months from now and the GPUs actually are in stock at the list price, I hope this was helpful.


Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.



Microsoft shows progress toward real-time AI-generated game worlds

For a while now, many AI researchers have been working to integrate a so-called “world model” into their systems. Ideally, these models could infer a simulated understanding of how in-game objects and characters should behave based on video footage alone, then create fully interactive video that instantly simulates new playable worlds based on that understanding.

Microsoft Research’s new World and Human Action Model (WHAM), revealed today in a paper published in the journal Nature, shows how quickly those models have advanced in a short time. But it also shows how much further we have to go before the dream of AI crafting complete, playable gameplay footage from just some basic prompts and sample video footage becomes a reality.

More consistent, more persistent

Much like Google’s Genie model before it, WHAM starts by training on “ground truth” gameplay video and input data provided by actual players. In this case, that data comes from Bleeding Edge, a four-on-four online brawler released in 2020 by Microsoft subsidiary Ninja Theory. By collecting actual player footage since launch (as allowed under the game’s user agreement), Microsoft gathered the equivalent of seven player-years’ worth of gameplay video paired with real player inputs.

Early in that training process, Microsoft Research’s Katja Hofmann said the model would get easily confused, generating inconsistent clips that would “deteriorate [into] these blocks of color.” After 1 million training updates, though, the WHAM model started showing basic understanding of complex gameplay interactions, such as a power cell item exploding after three hits from the player or the movements of a specific character’s flight abilities. The results continued to improve as the researchers threw more computing resources and larger models at the problem, according to the Nature paper.

To see just how well the WHAM model generated new gameplay sequences, Microsoft tested the model by giving it up to one second’s worth of real gameplay footage and asking it to generate what subsequent frames would look like based on new simulated inputs. To test the model’s consistency, Microsoft used actual human input strings to generate up to two minutes of new AI-generated footage, which was then compared to actual gameplay results using the Fréchet Video Distance metric.



Valve releases full Team Fortress 2 game code to encourage new, free versions

Valve’s updates to its classic games evoke Hemingway’s two kinds of going bankrupt: gradually, then suddenly. Nothing is heard, little is seen, and then, one day, Half-Life 2: Deathmatch, Day of Defeat, and other Source-engine-based games get a bevy of modern upgrades. Now, the entirety of Team Fortress 2 (TF2) client and server game code, a boon for modders and fixers, is also being released.

That source code allows for more ambitious projects than have been possible thus far, Valve wrote in a blog post. “Unlike the Steam Workshop or local content mods, this SDK gives mod makers the ability to change, extend, or rewrite TF2, making anything from small tweaks to complete conversions possible.” The SDK license restricts any resulting projects to “a non-commercial basis,” but they can be published on Steam’s store as their own entities.

Since it had the tools out, Valve also poked around the games based on that now more open Source engine and spiffed them up as well. Most games got 64-bit binary support, scalable HUD graphics, borderless window options, and the like. Many of these upgrades come from the big 25-year anniversary update made to Half-Life 2, which included “overbright lighting,” gamepad configurations, Steam networking support, and the like.



Nvidia’s 50-series cards drop support for PhysX, impacting older games

Nvidia’s PhysX offerings to developers didn’t always generate warm feelings. As part of its broader GamesWorks package, PhysX was cited as one of the reasons The Witcher 3 ran at notably sub-optimal levels at launch. Protagonist Geralt’s hair, rendered in PhysX-powered HairWorks, was a burden on some chipsets.

PhysX started appearing in general game engines, like Unity 5, and was eventually open-sourced, first in limited computer and mobile form, then more broadly. As an application wrapped up in Nvidia’s 32-bit CUDA API and platform, the PhysX engine had a built-in shelf life. Now the expiration date is known, and it is conditional on buying into Nvidia’s 50-series video cards—whenever they approach reasonable human prices.

A dune buggy in Borderlands 3 dodges rockets shot by a hovering attack craft just over a sand dune.

See that smoke? It’s from Sweden, originally. Credit: Gearbox/Take 2

The real dynamic particles were the friends we made…

Nvidia noted in mid-January that 32-bit applications cannot be developed or debugged on the latest versions of its CUDA toolkit. They will still run on cards from before the 50 series. Technically, you could also keep an older card installed in your system for compatibility, which is real dedication to early-2010s-era particle physics.

Technically, a 64-bit game could still support PhysX on Nvidia’s newest GPUs, but the heyday of PhysX, as a stand-alone technology switched on in game settings, tended to coincide with the 32-bit computing era.

If you load up a 32-bit game now with PhysX enabled (or forced in a config file) and a 50-series Nvidia GPU installed, there’s a good chance the physics work will be passed to the CPU instead of the GPU, likely bottlenecking the game and steeply lowering frame rates. Of course, turning off PhysX entirely raised frame rates above even native GPU support levels.

Demanding Borderlands 2 keep using PhysX made it so it “runs terrible,” noted one Redditor, even if the dust clouds and flapping cloth strips looked interesting. Other games with PhysX baked in, as listed by ResetEra completists, include Metro 2033, Assassin’s Creed IV: Black Flag, and the 2013 Star Trek game.

Commenters on Reddit and ResetEra note that many of the games listed had performance issues with PhysX long before Nvidia forced them to either turn off or be loaded onto a CPU. For some games, however, PhysX enabled destructible environments, “dynamic bank notes” and “posters” (in the Arkham games), fluid simulations, and base gameplay physics.

Anyone who works in, or cares about, game preservation has always had their work cut out for them. But it’s a particularly tough challenge to see certain aspects of a game’s operation lost to the forward march of the CUDA platform, something that’s harder to explain than a scratched CD or Windows compatibility.



Streamer completes hitless run of seven FromSoft Soulslikes without leveling up

What now?

In a follow-up stream on Monday, Nico called his latest gaming achievement “by far the most difficult run I have ever completed. We did the same run leveled, but it is not even close to as difficult as the level 1 run. The level 1 run, the difficulty level is just insane.”

Aside from being an incredible individual achievement, Nico’s level 1 God Run helps put FromSoft’s reputation for difficulty into perspective. While these games can punish failure very harshly—and require lots of arcane knowledge to play well—Nico shows that they’re also designed to be fair to players with the steely nerves to attack and dodge with perfect timing.

After almost 2 years of attempts!

IT’S FINALLY DONE!

7 Games, 0 Hits, Character Level 1!

WE DID IT!

Here is the last fight! (and sorry that it is cinder again) 😅https://t.co/0g24MY4wRy

— Nico (@dinossindgeil) February 16, 2025

With his ultimate FromSoft achievement now complete, Nico said he’s “going to take a vacation now. And by vacation, I mean I’ll continue doing hitless runs, I will continue being live every day… but we’re going to do some smaller ones for now.” In the longer term, Nico hinted that he is “going to work on one really big project again,” but wasn’t willing to provide details just yet.

If you want to follow in Nico’s hitless FromSoft footsteps, he puts out instructional videos laying out the specific paths and strategies needed to get through specific games (Pro tip: It involves a lot of well-timed rolling and memorizing attack and stagger patterns). Nico also took the time to recently rank how hard he finds each game in the God Run, putting Elden Ring in the easiest tier and Dark Souls II in the hardest.



How Diablo hackers uncovered a speedrun scandal


Investigators decompiled the game to search through 2.2 billion random dungeon seeds.

The word Debunk radiating flames against a demonic background Credit: Aurich Lawson

For years, Maciej “Groobo” Maselewski stood as the undisputed champion of Diablo speedrunning. His 3-minute, 12-second Sorcerer run looked all but unbeatable thanks to a combination of powerful (and allowable) glitch exploits along with what seemed like some unbelievable luck in the game’s randomly generated dungeon.

But when a team of other speedrunners started trying and failing to replicate that luck using outside software and analysis tools, the story behind Groobo’s run began to fall apart. As the inconsistencies in the run started to mount, that team would conduct an automated search through billions of legitimate Diablo dungeons to prove beyond a shadow of a doubt that Groobo’s game couldn’t have taken place in any of them.

“We just had a lot of curiosity and resentment that drove us to dig even deeper,” team member Staphen told Ars Technica of their investigation. “Betrayal might be another way to describe it,” team member AJenbo added. “To find out that this had been done illegitimately… and the person had both gotten and taken a lot of praise for their achievement.”

If we have unearned luck

If you have any familiarity with Diablo or speedrunning, watching Groobo’s run feels like watching someone win the lottery. First, there’s the dungeon itself, which features a sequence of stairways that appear just steps from each other, forming a quick and enemy-free path down to the dungeon’s deeper levels. Then there’s Groobo’s lucky find of Naj’s Puzzler on level 9, a unique item that enables the teleporting necessary for many of the run’s late-game maneuvers.

Groobo’s 3:12 Diablo speedrun, as submitted to Speed Demos Archive in 2009

“It seemed very unusual that we would have so many levels with the upstairs and the downstairs right next to each other,” Allan “DwangoAC” Cecil told Ars Technica. “We wanted to find some way of replicating this.”

When Cecil and a team of tool-assisted speedrun (TAS) authors started that search process in earnest last February, they said they used Groobo’s run as a baseline to try to improve from. While Groobo ostensibly had to rely on his own human luck in prepping his run, the TAS runners could use techniques and tools from outside the game to replicate Groobo’s run (or something very similar) every time.

To find an RNG seed that could do just that, the TAS team created a custom-built map generation tool by reverse-engineering a disassembled Diablo executable. That tool can take any of the game’s billions of possible random seeds and quickly determine the map layout, item distribution, and quest placement available in the generated save file. A scanner built on top of that tool can then quickly look through those generated dungeons for ones that might be optimal for speedrunning.
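In outline, that scanner is a brute-force filter over seeds. The sketch below shows the pattern using a toy, deterministic stand-in for the reverse-engineered generator; `generate_level` and the stairs-adjacency heuristic are hypothetical illustrations, not Diablo's actual map logic:

```python
import random

def generate_level(seed: int, level: int) -> dict:
    # Toy stand-in for the reverse-engineered generator: fully
    # deterministic given (seed, level), just like the real tool
    rng = random.Random((seed << 8) | level)
    return {"up": (rng.randrange(40), rng.randrange(40)),
            "down": (rng.randrange(40), rng.randrange(40))}

def stairs_adjacent(layout: dict, max_dist: int = 5) -> bool:
    # Heuristic for a "speedrun-friendly" floor: the down stairs
    # sit within a few tiles of where the player arrives
    (ux, uy), (dx, dy) = layout["up"], layout["down"]
    return abs(ux - dx) + abs(uy - dy) <= max_dist

def scan(seeds, levels):
    # Keep only seeds where every requested floor passes the filter
    for seed in seeds:
        if all(stairs_adjacent(generate_level(seed, lv)) for lv in levels):
            yield seed

hits = list(scan(range(10_000), levels=[1, 2]))
print(len(hits), "candidate seeds out of 10,000")
```

Because each dungeon is a pure function of its seed, the search is embarrassingly parallel, which is what made the team's distributed sweep of billions of seeds practical.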

“We were working on finding the best seed for our TAS, and we were trying to identify the seed from Groobo’s run, both to validate that our scanner works and to potentially straight-up use it for the run,” Staphen said of the effort. “We naturally had a lot of trouble finding [that seed] because it doesn’t exist.”

A thorough search

In their effort to find Groobo’s storied run (or at least one that resembled it), the TAS team conducted a distributed search across the game’s roughly 2.2 billion valid RNG seeds. Each of these seeds represents a different specific second on the system clock when a Diablo save file is created, ranging from January 1, 1970, to December 31, 2038 (the only valid dates accepted by the game).
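That seed count follows directly from the date range: one seed per second of system-clock time between the two cutoff dates. A quick sanity check, which also shows why a save stamped in 2056 or 2074 can only come from a modified file:

```python
from datetime import datetime, timezone

# One seed per second between the game's accepted date limits
start = datetime(1970, 1, 1, tzinfo=timezone.utc)
end = datetime(2038, 12, 31, tzinfo=timezone.utc)

valid_seeds = int((end - start).total_seconds())
print(f"{valid_seeds:,} one-second seeds")  # ~2.18 billion

# A timestamp from 2056 falls outside this window entirely, so such
# a seed could only be produced by editing the save file's date
impossible = datetime(2056, 1, 1, tzinfo=timezone.utc)
print((impossible - start).total_seconds() > valid_seeds)  # True
```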

After comparing each of those billions of RNG dungeons to a re-creation of the dungeon seen in Groobo’s run, the team couldn’t find a single example containing the crucial level 9 Naj’s Puzzler drop. After that, the team started searching through “impossible” seeds, which could only be created by using save modification tools to force a creation date after the year 2038.

The team eventually found dungeons matching Naj’s Puzzler drop in Groobo’s video, using seeds associated with the years 2056 and 2074.

After an exhaustive search, the TAS team couldn’t find a dungeon with Naj’s Puzzler dropped in the place Groobo’s run said it should be. Credit: Analysis of Groobo’s Diablo WR Speedrun

The early presumption that Groobo’s run was legitimate ended up costing the team weeks of work. “It was baffling when we couldn’t find [the early Naj’s Puzzler] in any of the searches we did,” Cecil said. “We were always worried that the scanner might have bugs in it,” Staphen added.

The TAS team’s thorough search also showed troubling inconsistencies in the other dungeon levels shown in Groobo’s run. “Normally you would only need to identify a single level to replicate a run since all the other levels are generated from the same seed,” AJenbo told Ars. But the levels seen in Groobo’s run came from multiple different seeds, which would require splicing together footage from multiple playthroughs of different dungeons. That’s a big no-no even in a so-called “segmented” run, which is still supposed to contain segments from a single unmodified save file.

“At that point we also wanted to figure out how manipulated the run was,” AJenbo said. “Was it a legit run except for [dungeon level] 9? Was it three good runs combined? In the end we only found two levels that had come from the same run so at least 13 (probably 15) runs were spliced into one video, which is a lot for a game with just 16 levels.”

The evidence piles up

After Groobo’s dungeon generation problems came to light, other inconsistencies in his run started to become apparent. Some of these are relatively easy to spot with the naked eye once you know what you’re looking for.

For instance, the “1996–2001” copyright date seen on the title screen in Groobo’s video is inconsistent with the v1.00 shown on the initial menu screen, suggesting Groobo’s run was spliced together from runs on multiple different versions of the game. Items acquired early in the run also disappear from the inventory later on with no apparent explanation.

This copyright date doesn’t line up with the “V1.00” seen later on the menu screen in Groobo’s run. Credit: Analysis of Groobo’s Diablo WR Speedrun

Even months after the investigation first started, new inconsistencies are still coming to light. Groobo’s final fight against Diablo, for instance, required just 19 fireballs to take him out. While that’s technically possible with perfect luck for the level 12 Sorcerer seen in the footage, the TAS team found that the specific damage dealt and boss behavior only matched when they attempted the same attacks using a level 26 Sorcerer.

After the TAS team compiled their many findings into a lengthy document, Groobo defended his submission in a discussion with Cecil (screenshots of which were viewed by Ars Technica). “My run is a segmented/spliced run,” Groobo said. “It always has been and it was never passed off as anything else, nor was it part of any competition or leaderboards. The Speed Demos Archive [SDA] page states that outright.” Indeed, an archived version of Groobo’s record-setting Speed Demos Archive submission does say directly that it’s made up of “27 segments appended to one file.”

But simply splitting a run into segments doesn’t explain away all of the problems the TAS team found. Getting Naj’s Puzzler on dungeon level 9, for instance, still requires outside modification of a save file, which is specifically prohibited by longstanding Speed Demos Archive rules that “manually editing/adding/removing game files is generally not allowed.” Groobo’s apparent splicing of multiple game versions and differently seeded save files also seems to go against SDA rules, which say that “there obviously needs to be continuity between segments in terms of inventory, experience points or whatever is applicable for the individual game.”

After being presented with the TAS team’s evidence, SDA wrote that “it has been determined that Groobo’s run very likely does not stem from only legitimate techniques, and as such, has itself been banished barring new developments.” But Groobo’s record is still listed as the “Fastest completion of an RPG videogame” by Guinness World Records, which has not offered a substantive response to the team’s findings (Guinness has not responded to a request for comment from Ars Technica).

A recent Diablo speedrun on a confirmed legitimate dungeon seed.

This might seem like a pretty petty issue to spend weeks of time and attention debunking. But at a recent presentation attended by Ars, Cecil said he was motivated to pursue it because “it did harm. Groobo’s alleged cheating in 2009 completely stopped interest in speedrunning this category [of Diablo]. No one tried, no one could.”

Because of Groobo’s previously unknown modifications to make an impossible-to-beat run, “this big running community just stopped trying to run this game in that category,” Cecil said. “For more than a decade, this had a chilling impact on that community.” With Groobo’s run out of the way, though, new runners are setting new records on confirmed legitimate RNG seeds, and with the aid of TAS tools.

In the end, Cecil said he hopes the evidence regarding Groobo’s run will make people look more carefully at other record submissions. “Groobo had created a number of well-respected … speedruns,” he said. “[People thought] there wasn’t any good reason to doubt him. In other words, there was bias in familiarity. This was a familiar character. Why would they cheat?”

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from the University of Maryland. He once wrote a whole book about Minesweeper.

How Diablo hackers uncovered a speedrun scandal

What we know about AMD and Nvidia’s imminent midrange GPU launches

The GeForce RTX 5090 and 5080 are both very fast graphics cards—if you can look past the possibility that we may have yet another power-connector-related overheating problem on our hands. But the vast majority of people (including you, discerning and tech-savvy Ars Technica reader) won’t be spending $1,000 or $2,000 (or $2,750 or whatever) on a new graphics card this generation.

No, statistically, you (like most people) will probably end up buying one of the more affordable midrange Nvidia or AMD cards, GPUs that are all slated to begin shipping later this month or early in March.

There has been a spate of announcements on that front this week. Nvidia announced yesterday that the GeForce RTX 5070 Ti, which the company previously introduced at CES, would be available starting on February 20 for $749 and up. The new GPU, like the RTX 5080, looks like a relatively modest upgrade from last year’s RTX 4070 Ti Super. But it ought to at least flirt with affordability for people who are looking to get natively rendered 4K without automatically needing to enable DLSS upscaling to get playable frame rates.

|  | RTX 5070 Ti | RTX 4070 Ti Super | RTX 5070 | RTX 4070 Super |
| --- | --- | --- | --- | --- |
| CUDA Cores | 8,960 | 8,448 | 6,144 | 7,168 |
| Boost Clock | 2,452 MHz | 2,610 MHz | 2,512 MHz | 2,475 MHz |
| Memory Bus Width | 256-bit | 256-bit | 192-bit | 192-bit |
| Memory Bandwidth | 896 GB/s | 672 GB/s | 672 GB/s | 504 GB/s |
| Memory size | 16GB GDDR7 | 16GB GDDR6X | 12GB GDDR7 | 12GB GDDR6X |
| TGP | 300 W | 285 W | 250 W | 220 W |
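Those bandwidth figures fall straight out of the bus width and the memory’s per-pin data rate; here’s a quick sanity check (the 28 Gbps and 21 Gbps per-pin rates are inferred from the listed totals, not taken from Nvidia’s spec sheets):

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth (GB/s) = bus width in bytes x per-pin data rate (Gbps)."""
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 28))  # RTX 5070 Ti, GDDR7: 896.0
print(bandwidth_gb_s(192, 28))  # RTX 5070, GDDR7: 672.0
print(bandwidth_gb_s(192, 21))  # RTX 4070 Super, GDDR6X: 504.0
```

The math makes the table’s pattern obvious: the 5070 Ti’s edge over the 5070 comes from the wider bus, while both 50-series cards gain over their predecessors purely from GDDR7’s faster per-pin rate.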

That said, if the launches of the 5090 and 5080 are anything to go by, it may not be easy to find and buy the RTX 5070 Ti for anything close to the listed retail price; early retail listings are not promising on this front. You’ll also be relying exclusively on Nvidia’s partners to deliver unadorned, relatively minimalist MSRP versions of the cards since Nvidia isn’t making a Founders Edition version.

As for the $549 RTX 5070, Nvidia’s website says it’s launching on March 5. But it’s less exciting than the other 50-series cards because it has fewer CUDA cores than the outgoing RTX 4070 Super, leaving it even more reliant on AI-generated frames to improve performance compared to the last generation.

Handful of users claim new Nvidia GPUs are melting power cables again

The 12VHPWR and 12V-2×6 connectors are both designed to solve a real problem: delivering hundreds of watts of power to high-end GPUs over a single cable rather than trying to fit multiple 8-pin power connectors onto these GPUs. In theory, swapping two to four 8-pin connectors for a single 12V-2×6 or 12VHPWR connector cuts down on the amount of board space OEMs must reserve for these connectors in their designs and the number of cables that users have to snake through the inside of their gaming PCs.
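The space savings come down to simple power arithmetic. A rough sketch, using commonly cited per-pin ratings (the 9.5 A per-pin figure and the 150 W 8-pin cap are general spec numbers, not figures from this article):

```python
# 12V-2x6 / 12VHPWR: six 12 V supply pins, each rated around 9.5 A
theoretical_watts = 12 * 6 * 9.5   # 684.0 W of electrical headroom
spec_cap_watts = 600               # the connector is capped at 600 W by spec

# A classic PCIe 8-pin connector is capped at 150 W, so delivering 600 W
# the old way would take four separate connectors and cables:
connectors_replaced = spec_cap_watts / 150  # 4.0
print(theoretical_watts, connectors_replaced)
```

That thin margin between the 600 W spec cap and the 684 W of theoretical headroom is also why an imperfectly seated connector has so little room for error.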

But while Nvidia, Intel, AMD, Qualcomm, Arm, and other companies are all PCI-SIG members and all had a hand in the design of the new standards, Nvidia is the only GPU company to use the 12VHPWR and 12V-2×6 connectors in most of its GPUs. AMD and Intel have continued to use the 8-pin power connector, and even some of Nvidia’s partners have stuck with 8-pin connectors for lower-end, lower-power cards like the RTX 4060 and 4070 series.

Both of the reported 5090 incidents involved third-party cables, one from custom PC part manufacturer MODDIY and one included with an FSP power supply, rather than the first-party 8-pin adapter that Nvidia supplies with GeForce GPUs. It’s much too early to say whether these cables (or Nvidia, or the design of the connector, or the affected users) caused the problem or whether this was just a coincidence.

We’ve contacted Nvidia to see whether it’s aware of and investigating the reports and will update this piece if we receive a response.

Dragonsweeper is my favorite game of 2025 (so far)

While writing a wide-ranging history of Windows Minesweeper for Boss Fight Books in 2023, I ended up playing many variations of Microsoft’s beloved original game. Those include versions with hexagonal tiles, versions with weird board shapes, and versions that extend Minesweeper into four dimensions or more, to name just a few.

Almost all these variants messed a little too much with the careful balance of simplicity, readability, reasoning, and luck that made the original Minesweeper so addictive. None of them became games I return to day after day.

But then I stumbled onto Dragonsweeper, a free browser-based game that indie developer Daniel Benmergui released unceremoniously on itch.io last month. In the weeks since I discovered it, the game has become my latest puzzle obsession, filling in a worrying proportion of my spare moments with its addictive, simple RPG-tinged take on the Minesweeper formula.

Exploresweeper

Like Minesweeper before it, Dragonsweeper is a game about deducing hidden information based on the limited information you can already see on the grid. But the numbers you reveal in Dragonsweeper don’t simply tell you the number of threats on adjacent squares. Instead, the “numbers are sum of monster power,” as the game’s cryptic “Monsternomicon” explains. So a revealed square with a “14” could suggest two 7-power devils nearby or two 5-power slimes and a 4-power ogre, or even seven 2-power bats in a particularly weird randomized arrangement.
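Working out what a revealed number can mean is a small integer-partition problem over the monster power table. Here’s a toy sketch using only the powers mentioned above (the real game’s bestiary is larger; this subset is an assumption for illustration):

```python
from itertools import combinations_with_replacement

# Powers mentioned above: bat=2, ogre=4, slime=5, devil=7 (illustrative subset)
powers = {"bat": 2, "ogre": 4, "slime": 5, "devil": 7}

def explanations(clue: int, max_monsters: int = 7):
    """All multisets of monsters whose powers sum to the revealed number."""
    hits = []
    for n in range(1, max_monsters + 1):
        for combo in combinations_with_replacement(powers, n):
            if sum(powers[m] for m in combo) == clue:
                hits.append(combo)
    return hits

for combo in explanations(14):
    print(combo)  # includes ('devil', 'devil') and ('bat',) * 7, among others
```

Even with this tiny bestiary, a “14” has nine distinct explanations, which is exactly why the game rewards ruling candidates out with surrounding clues rather than guessing.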

Destroying those monsters means eating into your avatar Jorge’s health total, which is prominently displayed in the bottom-left corner. Jorge’s health can safely go down to zero hearts without dying—which feels a bit counter-intuitive at first—and can be restored by using discovered health potions or by leveling up with gold accumulated from downed monsters and items. If you can level up enough without dying, you’ll have the health necessary to defeat the titular dragon sitting in the middle of the board and win the game.

Punch-Out’s Mike Tyson has been defeated in under two minutes for the first time

Bismuth explains the unreasonable luck needed for a record-setting Tyson fight at around the 56:30 mark in this 2024 video.

Summoning Salt says Tyson here gave him a “perfect pattern” during his first phase of endless uppercuts, something that happens only 1 in 1,600 bouts. And later in the fight, the game’s random-number generator cooperated by adding only an extra 16 frames of delay (~0.8 in-game seconds) compared to a “perfect” run. Combined, Summoning Salt estimates that Tyson will only punch this quickly once every 7,000 to 10,000 attempts.
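Those estimates compose multiplicatively. Working backward from the numbers above (the per-phase split is implied by Summoning Salt’s estimates, not independently measured):

```python
p_pattern = 1 / 1600              # perfect uppercut pattern in phase one
combined = (1 / 10000, 1 / 7000)  # overall odds of a fight this fast

# Implied odds of also getting near-perfect RNG delay, given the pattern:
p_delay_low = combined[0] / p_pattern   # ~0.16
p_delay_high = combined[1] / p_pattern  # ~0.23
print(round(p_delay_low, 2), round(p_delay_high, 2))
```

In other words, even on the roughly one-in-1,600 attempt where Tyson serves up the perfect pattern, the late-fight RNG only cooperates about a fifth of the time.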

“It’s over,” Summoning Salt said live on Twitch when the record-setting match was finished, in a surprisingly even tone that came over what sounds very much like a dropped controller. “I thought I’d be a lot more excited about this. Holy shit, dude! It’s fucking over… Dude, am I dreaming right now? … I’m sorry I’m so quiet. I’m kind of in shock right now that that just happened.”

Where do we go from here?

With a near-perfect combination of skill and luck, Summoning Salt’s new record surpasses his own previous world record of precisely 2:00.00 on the in-game clock. That mark, set just eight months ago, was three frames off of displaying 1:59 on the in-game timer for the first time.

Summoning Salt was also the first runner to break the 2:01 barrier on Tyson in 2020, a feat he has since replicated just 15 times over tens of thousands of attempts. “There’s essentially no difference between all of those [2:00.xx] fights and this one, except I got better luck from Tyson on this fight,” he writes. “Finally, after nearly half a decade, the 1:59 has happened.”

Summoning Salt discusses the difficulty of beating 2:13 on Tyson in 2020, months before setting a then-record time of 2:00 himself.

Ironically, just before posting his first 2:00.xx fight in 2020, Summoning Salt posted a video discussing in part just how difficult it was for speedrunners to beat Matt Turk’s 2007 record of 2:13 on Tyson. “For years it was just this impossibly fast time that the top players just couldn’t get close to,” Summoning Salt said at the time. “Of course, other top players fought Tyson years later, but their best efforts came up short… they couldn’t touch it. It stood alone.”

Summoning Salt is now just over a second off of the tool-assisted speedrun record of 1:58.61, which uses emulated gameplay to fight a theoretical “perfect” bout every time. But after spending years on what he writes “is the greatest gaming achievement I have ever accomplished,” Summoning Salt seems ready to hang up his virtual boxing gloves for good.

“I have no plans to ever improve this time,” he writes. “It will be beaten by somebody one day, likely by matching this fight and then getting better luck in phase 3. I have no interest in competing for that but am extremely proud to have gotten the first sub 2 ever on Mike Tyson.”

The Sims re-release shows what’s wrong with big publishers and single-player games


Opinion: EA might be done with single-player games—but we’re not.

The Sims Steam re-release has all of the charm of the original, if you can get it working. Credit: Samuel Axon

It’s the year 2000 all over again, because I’ve just spent the past week playing The Sims, a game that could have had a resurgent zeitgeist moment if only EA, the infamous game publisher, had put enough effort in.

A few days ago, EA re-released two of its most legendary games: The Sims and The Sims 2. Dubbed “The Legacy Collection,” these could not even be called remasters. EA just put the original games on Steam with some minor patches to make them a little more likely to work on some modern machines.

The emphasis of that sentence should be on the word “some.” Forums and Reddit threads were flooded with players saying the game either wouldn’t launch at all, crashed shortly after launch, or had debilitating graphical issues. (Patches have been happening, but there’s work to be done yet.)

Further, the releases lack basic features that are standard for virtually all Steam releases now, like achievements or Steam Cloud support.

It took me a bit of time to get it working myself, but I got there, and my time with the game has reminded me of two things. First, The Sims is a unique experience that is worthy of its lofty legacy. Second, The Sims deserved better than this lackluster re-release.

EA didn’t meet its own standard

Look, it’s fine to re-release a game without remastering it. I’m actually glad to see the game’s original assets as they always were—it’s deeply nostalgic, and there’s always a tinge of sadness when a remaster overwrites the work of the original artists. That’s not a concern here.

But if you’re going to re-release a game on Steam in 2025, there are minimum expectations—especially from a company with the resources of EA, and even more so for a game that is this important and beloved.

The game needs to reliably run on modern machines, and it needs to support basic platform features like cloud saves or achievements. It’s not much to ask, and it’s not what we got.

The Steam forums for the game are filled with people saying it’s lazy that EA didn’t include Steam Cloud support because implementing that is ostensibly as simple as picking a folder and checking a box.

I spoke with two different professional game developers this week who have previously published games on Steam, and I brought up the issue of Steam Cloud and achievement support. As they tell it, it turns out it’s not nearly as simple as those players in the forums believe—but it still should have been within EA’s capabilities, even with a crunched schedule.

Yes, it’s sometimes possible to get it working at a basic level within a couple of hours, provided you’re already using the Steamworks API. But even in that circumstance, the way a game’s saves work might require additional work to protect against lost data or frequent problems with conflicts.

Given that the game doesn’t support achievements or really anything else you’d expect, it’s possible EA didn’t use the Steamworks API at all. (Integrating the API would have been hours of additional work.)

A pop-up in The Sims says the sim has accidentally been transferred $500 because of a computer bug

Sadly, this is not the sort of computer bug players are encountering. Credit: Samuel Axon

I’m not giving EA a pass, though. Four years ago, EA put out the Command & Conquer Remastered Collection, a 4K upscale remaster of the original C&C games. The release featured a unified binary for the classic games, sprites and textures that were upscaled to higher resolutions, quality of life improvements, and yes, many of the Steam bells and whistles that include achievements. I’m not saying that the remaster was flawless, but it exhibited significantly more care and effort than The Sims re-release.

I love Command & Conquer. I played a lot of it when I was younger. But even a longtime C&C fan like myself can easily acknowledge that its importance in gaming history (as well as its popularity and revenue potential) pales in comparison to The Sims.

If EA could do all that for C&C, it’s all the more perplexing that it didn’t bother with a 25th-anniversary re-release of The Sims.

Single-player games, meet publicly traded companies

While we don’t have much insight into all the inner workings of EA, there are hints as to why this sort of thing is happening. For one thing, anyone who has worked for a giant corporation like this knows it’s all too easy for the objective to be passed down from above at the last minute, leaving no time or resources to see it through adequately.

But it might run deeper than that. To put it simply, publicly traded publishers like EA can’t seem to satisfy investors with single-purchase, single-player games. The emphasis on single-player releases has been waning for a long time, and it is markedly weaker just five years after the release of the C&C remaster.

Take the recent comments from EA CEO Andrew Wilson’s post-earnings call, for example. Wilson noted that the big-budget, single-player RPG Dragon Age: The Veilguard failed to meet sales expectations—even though it was apparently one of EA’s most successful single-player Steam releases ever.

“In order to break out beyond the core audience, games need to directly connect to the evolving demands of players who increasingly seek shared-world features and deeper engagement alongside high-quality narratives in this beloved category,” he explained, suggesting that games need to be multiplayer games-as-a-service to be successful in this market.

Ironically, though, the single-player RPG Kingdom Come Deliverance 2 launched around the same time he made those comments, and that game’s developer said it made its money back in a single day of sales. It’s currently one of the top-trending games on Twitch, too.

It’s possible that Baldur’s Gate 3 director Swen Vincke hit the nail on the head when he suggested at the Game Developers Conference last year that a particular approach to pursuing quarterly profits runs counter to the practice of making good games.

“I’ve been fighting publishers my entire life, and I keep on seeing the same, same, same mistakes over and over and over,” he said. “It’s always the quarterly profits. The only thing that matters are the numbers.”

Later on X, he clarified who he was pointing a finger at: “This message was for those who try to double their revenue year after year. You don’t have to do that. Build more slowly and make your aim improving the state of the art, not squeezing out the last drop.”

In light of Wilson’s comments, it’s a fair guess that EA might not have put in much effort on The Sims re-releases simply because of a belief that single-player games that aren’t “shared world experiences” just aren’t worth the resources anymore, given the company’s need to satisfy shareholders with perpetual revenue growth.

Despite all this, The Sims is worth a look

It’s telling that in a market with too many options, I still put the effort in to get the game working, and I spent multiple evenings this week immersed in the lives of my sims.

Even after 25 years, this game is unique. It has the emergent wackiness of something like RimWorld or Dwarf Fortress, but it has a fast-acting, addictive hook and is easy to learn. There have been other games besides The Sims that are highly productive engines for original player stories, but few have achieved these heights while remaining accessible to virtually everyone.

Like so many of the best games, it’s hard to stop playing once you start. There’s always one more task you want to complete—or you’re about to walk away when something hilariously unexpected happens.

The problems I had getting The Sims to run aren’t that much worse than what I surely experienced on my PC back in 2002—it’s just that the standards are a lot higher now.

I’ve gotten $20 of value out of the purchase, despite my gripes. But it’s not just about my experience. More broadly, The Sims deserved better. It could have had a moment back in the cultural zeitgeist, with tens of thousands of Twitch viewers.

Missed opportunities

The moment seems perfect: The world is stressful, so people want nostalgia. Cozy games are ascendant. Sandbox designs are making a comeback. The Sims slots smoothly into all of that.

But go to those Twitch streams, and you’ll see a lot of complaining about how the game didn’t really get everything it deserved and a sentiment that whatever moment EA was hoping for was undermined by this lack of commitment.

Instead, the cozy game du jour on Twitch is the Animal Crossing-like Hello Kitty Island Adventure, a former Apple Arcade exclusive that made its way to Steam recently. To be clear, I’m not knocking Hello Kitty Island Adventure; it’s a great game for fans of the modern cozy genre, and I’m delighted to see an indie studio seeing so much success.

A screenshot of the Twitch page for Hello Kitty

The cozy game of the week is Hello Kitty Island Adventure, not The Sims. Credit: Samuel Axon

The takeaway is that we can’t look to big publishers like EA to follow through on delivering quality single-player experiences anymore. It’s the indies that’ll carry that forward.

It’s just a bummer for fans that The Sims couldn’t have the revival moment it should have gotten.

Samuel Axon is a senior editor at Ars Technica. He covers Apple, software development, gaming, AI, entertainment, and mixed reality. He has been writing about gaming and technology for nearly two decades at Engadget, PC World, Mashable, Vice, Polygon, Wired, and others. He previously ran a marketing and PR agency in the gaming industry, led editorial for the TV network CBS, and worked on social media marketing strategy for Samsung Mobile at the creative agency SPCSHP. He also is an independent software and game developer for iOS, Windows, and other platforms, and he is a graduate of DePaul University, where he studied interactive media and software development.
