gaming

Stalker 2 has been enjoyable jank, but it’s also getting rapidly fixed

When the impossibly punctuated S.T.A.L.K.E.R. 2: Heart of Chornobyl was released on November 20, after many delays (including those caused by the Russian invasion of the developer’s native Ukraine), it seemed like it could have used even more delays.

Stalker 2 launched with big performance problems and game-breaking bugs, along with balance issues and difficulty spikes. Some things that seem “wrong” in the game are just going to stay that way. The first-person survival/shooter series has always had a certain wobbly, wild feel to it. This expresses itself in both the game world, where a major villain can off themselves by walking through a window, and in the tech stack, where broken save games, DIY optimization, and other unmet needs have created thriving mod scenes.

Developer GSC Game World has been steadfastly patching the game since its release, and the latest patch should nudge the needle a bit from “busted” toward “charmingly wonky.” Amid the “Over 1,800 fixes and adjustments” in Patch 1.1, the big changes are to “A-Life.” In porting Stalker 2 to Unreal Engine 5, the developer struggled to get this global AI management system working properly, but it’s now showing its weird self again.

A-Life, as detailed by Destructoid, is the idea that “the characters in the game live their own lives and exist all the time, not only when they are in the player’s field of view.” In a certain radius around the player, events happen “online,” in real time, such that you could stumble upon them. Farther out, things are still happening, and non-player characters (NPCs) are trekking about, but on an “offline,” almost turn-based, less resource-intensive schedule. Modders have had quite a bit of fun tweaking A-Life in prior versions of Stalker 2.
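
The split between full and coarse simulation is easy to picture as a radius check inside the world’s update loop. Here is a minimal, hypothetical sketch of that general idea; the radius, timings, and class names are invented for illustration and have nothing to do with GSC’s actual engine.

```python
import math
from dataclasses import dataclass

ONLINE_RADIUS = 150.0   # hypothetical cutoff, in meters, for full "online" simulation
OFFLINE_STEP = 10.0     # "offline" NPCs only get re-evaluated every 10 simulated seconds

@dataclass
class NPC:
    x: float
    y: float
    goal: tuple                 # (x, y) destination the NPC is trekking toward
    speed: float = 2.0          # meters per second
    offline_clock: float = 0.0  # time banked since the last coarse update

    def step(self, dt):
        """Move toward the goal by dt seconds' worth of travel."""
        dx, dy = self.goal[0] - self.x, self.goal[1] - self.y
        dist = math.hypot(dx, dy)
        if dist > 1e-6:
            travel = min(self.speed * dt, dist)
            self.x += dx / dist * travel
            self.y += dy / dist * travel

def tick_world(player_pos, npcs, dt):
    """One world update: full simulation near the player, cheap bookkeeping elsewhere."""
    px, py = player_pos
    for npc in npcs:
        if math.hypot(npc.x - px, npc.y - py) <= ONLINE_RADIUS:
            npc.step(dt)                     # "online": updated every frame, visible events
        else:
            npc.offline_clock += dt          # "offline": batched, almost turn-based
            if npc.offline_clock >= OFFLINE_STEP:
                npc.step(npc.offline_clock)  # catch up in one inexpensive jump
                npc.offline_clock = 0.0
```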

With the latest patch, the weirdly engaging feel that the world goes on without you returns. There will be more NPCs visible, NPCs out of range will pursue their “goals,” and a more diverse range of human factions, mutants, and weirdos will exist. Perhaps most intriguingly, an issue where “Human NPCs didn’t satisfy their communication needs and talks” is fixed. If only that could be patched for most of us player characters here in the real world.

Stalker 2 has been enjoyable jank, but it’s also getting rapidly fixed Read More »

The Backbone One would be an ideal game controller—if the iPhone had more games


It works well, but there still aren’t enough modern, console-style games.

The Backbone One attachable game controller for the iPhone.

In theory, it ought to be as good a time as ever to be a gamer on the iPhone.

Classic console emulators have rolled out to the platform for the first time, and they work great. There are strong libraries of non-skeezy mobile games on Apple Arcade and Netflix Games, streaming via Xbox and PlayStation services is continuing apace, and there are even a few AAA console games now running natively on the platform, like Assassin’s Creed and Resident Evil titles.

Some of those games need a traditional, dual-stick game controller to work well, though, and Apple bafflingly offers no first-party solution for this.

Yes, you can sync popular Bluetooth controllers from Sony, Microsoft, Nintendo, and 8bitdo with your iPhone, but that’s not really the ideal answer—your iPhone isn’t a big TV sitting across the room or a computer monitor propped up on your desk.

A few companies have jumped in to solve this with attachable controllers that give the iPhone a Nintendo Switch or Steam Deck-like form factor (albeit a lot smaller). There’s a wide range of quality, though, and some of the ones you’ll see advertised aren’t very well made.

There’s some debate about which of these is best, but one that just about anyone will at least put up for consideration is the Backbone One. That’s the one I picked for my new iPhone 16 Pro Max, which I have loaded with emulators and tons of games.

Since many folks are about to get iPhone 16s for the holidays and might be in the market for something similar, I figured it’d be a good time to write some quick impressions, including pros and cons. Is this thing worth a $99 price tag? What about its subscription-based app?

Switching from the Razer Kishi

Here’s some background, real quick: I previously owned an iPhone 13 Pro, and I played a lot of Diablo Immortal. I wanted to try the controller experience with that game, so I bought a first-generation Razer Kishi—which I liked for the most part. It had excellent thumbsticks that felt similar to those you’d find on an Xbox controller, if a little bit softer.

That said, its design relied on a back that loosened up and flexed to fit different kinds of phones, which made it annoying to take on or off because it immediately crumbled into a folding mess. The big issue that made me go with something else, though, was that the controller used a Lightning port, and my new iPhone traded that for USB-C. That’s a good change, overall, but it did mean I had to replace some things.

The Kishi I had is now discontinued, and it’s been replaced with the Kishi V2, which looks… less appealing to me. That’s because it ditches those Xbox-like sticks for ones more similar to what you see with a Nintendo Switch. There’s less range of motion and less stability.

The Razer Kishi V2 (top) and Razer Kishi V1 (bottom). I had the V1. Credit: Ars Technica

The Backbone One has similar drawbacks, but I was drawn to the Backbone as an alternative partly because I had enough complaints about the Kishi that I wanted to roll the dice on something new. I also wanted a change because there’s a version with PlayStation button symbols—and I planned to primarily play PS1 games in an emulator as well as stream PS5 games to the device instead of a PlayStation Portal.

Solid hardware

One of my big complaints about the first-generation Kishi (the folding and flimsy back) isn’t an issue with the Backbone One. It’s right there in the name: This accessory has a sturdy plastic backbone that keeps things nice and stable.

The PlayStation version I got has face buttons and a directional pad that seem like good miniature counterparts to the buttons on Sony’s DualSense controller. The triggers and sticks offer much shallower and less precise control than the DualSense, though—they closely resemble the triggers and sticks on the Nintendo Switch Joy-Cons.

This version of the Backbone One adopts some styling from Sony’s DualSense PS5 controller. Credit: Samuel Axon

I feel that’s a big downside. It’s fine for some games, but if you’re playing any game built around quickly and accurately aiming in a 3D environment, you’ll feel the downgrade compared to a real controller.

The product feels quite sturdy to hold and use, and it doesn’t seem likely to break anytime soon. The only thing that bugs me on that front is the USB-C connection to the phone: it takes enough force to attach or remove the controller that I’m worried about wear and tear on the ports on either my phone or the controller. Time will tell.

There’s an app, but…

The Backbone One is not just a hardware product, even though I think it’d be a perfectly good product without any kind of software or service component.

There is a Backbone app that closely resembles the PlayStation 5’s home screen interface (this is not just for the PlayStation version of the controller, to be clear). It offers a horizontally scrolling list of games from multiple sources like streaming services, mobile game subscription services, or just what’s installed on your device. It also includes voice chat, multiplayer lobbies, streaming to Twitch, and content like video highlights from games.

The Backbone One app collects games from different sources into one browsing interface. Credit: Samuel Axon

Unfortunately, all this requires a $40 annual subscription after a one-month trial. The good news is that you don’t have to pay for the Backbone One’s subscription service to use it as a controller with your games and emulators.

I don’t think anyone anywhere was asking for a subscription-based app for their mobile game controller. The fact that one is offered says two things. First, it shows just how niche this kind of product still is (and, by extension, how niche traditional, console-style gaming on the iPhone remains) that the company behind it felt a subscription was necessary to make a sufficient amount of money.

Second, it shows how much work Apple still needs to do to bake these features into the OS to make iOS/iPadOS a platform that is competitive with offerings from Sony, Microsoft, or even Nintendo in terms of appeal for core rather than casual gamers. That involves more than just porting a few AAA titles.

The state of iPhone gaming

The Backbone One is a nice piece of hardware, but many games you might be excited to play with it are better played elsewhere or with something else.

Hit games with controller support like Genshin Impact, Call of Duty Mobile, and Infinity Nikki all have excellent touch-based control schemes, making using a gamepad simply a matter of preference rather than a requirement.

While Apple is working with publishers like Capcom and Ubisoft to bring some hardcore console titles to the platform, that all still seems like just dipping toes in the water at this point, because they’re such a tiny slice of what’s on offer for PlayStation, Xbox, PC, or even Nintendo Switch players.

In theory, AAA game developers should be excited at the prospect of having iPhone players as a market—the install base of the iPhone absolutely dwarfs all home and handheld consoles combined. But they’re facing two barriers. The first is a chicken-and-egg problem: Only the most recent iPhones (iPhone 15 Pro and the iPhone 16 series) have supported those console AAA titles, and it will take a few years before most iPhone owners catch up.

Emulators like RetroArch (seen here running on an iPhone 16 Pro Max) are the main use case of the Backbone One. Credit: Samuel Axon

The second is that modern AAA games are immensely expensive to produce, and they (thankfully) don’t typically have robust enough in-game monetization paths to be distributed for free. That means that to profit and not cannibalize console and PC sales, publishers need to sell games for much higher up-front costs than mobile players are accustomed to.

So if mobile-first hardcore games are best played with touchscreens, and gamepad-first console games haven’t hit their stride on the platform yet, what’s the point of spending $100 on a Backbone One?

The answer is emulators, for both classic and homebrew games. For that, I’ve been pleased with the Backbone One. But if your goal is to play modern games, the time still hasn’t quite come.

Samuel Axon is a senior editor at Ars Technica. He covers Apple, software development, gaming, AI, entertainment, and mixed reality. He has been writing about gaming and technology for nearly two decades at Engadget, PC World, Mashable, Vice, Polygon, Wired, and others. He previously ran a marketing and PR agency in the gaming industry, led editorial for the TV network CBS, and worked on social media marketing strategy for Samsung Mobile at the creative agency SPCSHP. He also is an independent software and game developer for iOS, Windows, and other platforms, and he is a graduate of DePaul University, where he studied interactive media and software development.

The Backbone One would be an ideal game controller—if the iPhone had more games Read More »

The $700 price tag isn’t hurting PS5 Pro’s early sales

When Sony revealed the PlayStation 5 Pro a few months ago, some wondered just how many people would be willing to spend $700 for a marginal upgrade to the already quite powerful graphical performance of the PS5. Now, initial sales reports suggest there’s still a substantial portion of the console market that’s willing to shell out serious money for top-of-the-line console graphics.

Circana analyst Matt Piscatella shared on Bluesky this morning that the PS5 Pro accounted for a full 19 percent of US PS5 sales in its launch month of November. That sales ratio puts initial upgrade interest in the PS5 Pro roughly in line with lifetime interest in the PS4 Pro, which recent reports suggest was responsible for about 20 percent of all PS4 sales following its launch in 2016.

That US sales ratio also lines up with international sales reports for the PS5 Pro launch. In the UK, GfK ChartTrack reports that the PS5 Pro was responsible for 26 percent of all console sales for November. And in Japan, Famitsu sales data suggests the PS5 Pro was responsible for a full 63 percent of the PS5’s November sales after selling an impressive 78,000 units in its launch week alone.

Shut up and take my money

In the US, raw unit sales for the PS5 Pro were down slightly (12 percent) compared to those for the PS4 Pro’s launch month in November 2016, Piscatella writes. But the PS5 Pro still managed to bring in 50 percent more total US revenue in its launch month, thanks to its $700 price tag, compared with the PS4 Pro’s much more reasonable $400 launch price (about $533 in 2024 dollars).
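
Those two figures are consistent with each other, as a quick back-of-the-envelope check shows; the calculation below assumes, purely for illustration, that every unit sold at its launch MSRP.

```python
baseline_units = 100.0                       # arbitrary stand-in for PS4 Pro launch-month units
ps5_pro_units = baseline_units * (1 - 0.12)  # PS5 Pro unit sales were down about 12 percent

ps4_pro_revenue = baseline_units * 400       # PS4 Pro launched at $400
ps5_pro_revenue = ps5_pro_units * 700        # PS5 Pro launched at $700

print(ps5_pro_revenue / ps4_pro_revenue)     # ~1.54, i.e., roughly 50 percent more revenue
```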

The $700 price tag isn’t hurting PS5 Pro’s early sales Read More »

Nvidia’s new app is causing large frame rate dips in many games

When Nvidia replaced the longstanding GeForce Experience App with a new, unified Nvidia App last month, most GPU owners probably noted the refresh and rebranding with nothing more than bemusement (though the new lack of an account login requirement was a nice improvement). Now, testing shows that running the new app with default settings can lead to some significant frame rate dips on many high-end games, even when the app’s advanced AI features aren’t being actively used.

Tom’s Hardware noted the performance dip after reading reports of related problems around the web. The site’s testing with and without the Nvidia App installed confirms that, across five games running on an RTX 4060, the app reduced average frame rates by around 3 to 6 percent, depending on the resolution and graphical quality level.

The site’s measured frame rate drop peaked at 12 percent for Assassin’s Creed Mirage running at 1080p Ultra settings; other tested games (including Baldur’s Gate 3, Black Myth: Wukong, Flight Simulator 2024, and Stalker 2) showed a smaller drop at most settings.
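
For context, those percentages describe the relative drop in average frame rate between runs with and without the app installed. The numbers in the snippet below are hypothetical placeholders for illustration, not Tom’s Hardware’s measurements.

```python
def fps_drop_percent(fps_without_app: float, fps_with_app: float) -> float:
    """Relative frame rate loss attributable to the background app."""
    return (fps_without_app - fps_with_app) / fps_without_app * 100

# Hypothetical example: 75 fps average without the Nvidia App vs. 66 fps with it installed
print(f"{fps_drop_percent(75.0, 66.0):.0f}% drop")  # -> 12% drop
```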

Unfiltered

This is a significant performance impact for an app that simply runs quietly in the background for most users. The impact is roughly comparable to that of going from a top-of-the-line RTX 4070 Ti Super to an older RTX 4070 Ti or 4070 Super, based on our earlier testing of those cards.

Nvidia’s new app is causing large frame rate dips in many games Read More »

Nvidia partners leak next-gen RTX 50-series GPUs, including a 32GB 5090

Rumors have suggested that Nvidia will be taking the wraps off of some next-generation RTX 50-series graphics cards at CES in January. And as we get closer to that date, Nvidia’s partners and some of the PC makers have begun to inadvertently leak details of the cards.

According to recent leaks from both Zotac and Acer, it looks like Nvidia is planning to announce four new GPUs next month, all at the high end of its lineup: The RTX 5090, RTX 5080, RTX 5070 Ti, and RTX 5070 were all briefly listed on Zotac’s website, as spotted by VideoCardz. There’s also an RTX 5090D variant for the Chinese market, which will presumably have its specs tweaked to conform with current US export restrictions on high-performance GPUs.

Though the website leak didn’t confirm many specs, it did list the RTX 5090 as including 32GB of GDDR7, an upgrade from the 4090’s 24GB of GDDR6X. An Acer spec sheet for new Predator Orion desktops also lists 32GB of GDDR7 for the 5090, as well as 16GB of GDDR7 for the RTX 5080. This is the same amount of RAM included with the RTX 4080 and 4080 Super.

The 5090 will be a big deal when it launches because no graphics card released since October 2022 has come close to beating the 4090’s performance. Nvidia’s early 2024 Super refresh for some 40-series cards didn’t include a 4090 Super, and AMD’s flagship RX 7900 XTX card is more comfortable competing with the likes of the 4080 and 4080 Super. The 5090 isn’t a card that most people are going to buy, but for the performance-obsessed, it’s the first high-end performance upgrade the GPU market has seen in more than two years.

Nvidia partners leak next-gen RTX 50-series GPUs, including a 32GB 5090 Read More »

PS Placeable: The adorable mod that turns a PlayStation Portable into a console

When Sony launched the PlayStation Portable almost exactly 20 years ago, the value proposition was right there in the name: a PlayStation, but portable. But now modders have flipped that, introducing a PSP that can be played on a TV, console-style, and they’ve dubbed it the PS Placeable.

It’s a non-destructive mod to PSP-2000 and PSP-3000 systems that allows you to play PSP games on the TV off the original UMD physical media format, with a wireless controller like the PlayStation 4’s DualShock 4—all wrapped in a miniature, PlayStation 2-like enclosure.

Let’s be frank: One of the main reasons this thing gets special attention here is that its look is both clever and, well, kind of adorable. The miniaturization of the retro styling of the PlayStation 2 is a nice touch.

Of course, there have long been other ways to play some PSP games on the big screen—but there has always been one downside or another.

For example, you could connect the original PSP to a TV with convoluted cables, but you would then have to use that tethered handheld as your controller.

Much later, the PlayStation TV set-top box made by Sony itself was essentially a console-style take on the PlayStation Vita. Like the Vita, it could play a number of classic PSP games, and it supported wireless controllers. But it was incompatible with most of the PSP library, and it only worked with games downloaded through Sony’s digital store.

PS Placeable: The adorable mod that turns a PlayStation Portable into a console Read More »

Intel Arc B580 review: A $249 RTX 4060 killer, one-and-a-half years later


Intel has solved the biggest problems with its Arc GPUs, but not the timing.

Intel’s Arc B580 design doesn’t include LEDs or other frills, but it’s a clean-looking design. Credit: Andrew Cunningham

Intel doesn’t have a ton to show for its dedicated GPU efforts yet.

After much anticipation, many delays, and an anticipatory apology tour for its software quality, Intel launched its first Arc GPUs at the end of 2022. There were things to like about the A770 and A750, but buggy drivers, poor performance in older games, and relatively high power use made them difficult to recommend. They were more notable as curiosities than as consumer graphics cards.

The result, after more than two years on the market, is that Arc GPUs remain a statistical nonentity in the GPU market, according to analysts and the Steam Hardware Survey. But it was always going to take time—and probably a couple of hardware generations—for Intel to make meaningful headway against entrenched competitors.

Intel’s reference design is pretty by the book, with two fans, a single 8-pin power connector, and a long heatsink and fan shroud that extends several inches beyond the end of the PCB. Credit: Andrew Cunningham

The new Arc B580 card, the first dedicated GPU based on the new “Battlemage” architecture, launches into the exact same “sub-$300 value-for-money” graphics card segment that the A770 and A750 are already stuck in. But it’s a major improvement over those cards in just about every way, and Intel has gone a long way toward fixing drivers and other issues that plagued the first Arc cards at launch. If nothing else, the B580 suggests that Intel has some staying power and that the B700-series GPUs could be genuinely exciting if Intel can get one out relatively soon.

Specs and testbed notes

Specs for the Arc B580 and B570. Credit: Intel

The Arc B580 and Arc B570 lead the charge for the Battlemage generation. Both are based on the same GPU silicon, but the B580 has a few more execution resources, slightly higher clock speeds, a 192-bit memory bus instead of 160-bit, and 12GB of memory instead of 10GB.

Intel positions both cards as entry-level 1440p options because they have a bit more RAM than the 8GB baseline of the GeForce RTX 4060 and Radeon RX 7600. These 8GB cards are still generally fine at 1080p, but more memory does make the Arc cards feel a little more future-proof, especially since they’re fast enough to actually hit 60 fps in a lot of games at 1440p.

Our testbed remains largely the same as it has been for a while, though we’ve swapped the ASRock X670E board for an Asus model. The Ryzen 7 7800X3D remains the heart of the system, with more than enough performance to avoid bottlenecking midrange and high-end GPUs.

We haven’t done extensive re-testing of most older GPUs—the GeForce and Radeon numbers here are the same ones we used in the RX 7600 XT review earlier this year. We wouldn’t expect new drivers to change the scores in our games much since they’re mostly a bit older—we still use a mix of DirectX 11 and DirectX 12 games, including a few with and without ray-tracing effects enabled. We have re-tested the older Arc cards with recent drivers since Intel does still occasionally make changes that can have a noticeable impact on older games.

As with the Arc A-series cards, Intel emphatically recommends that resizable BAR be enabled for your motherboard to get optimal performance. This is sometimes called Smart Access Memory or SAM, depending on your board; most AMD AM4 and 8th-gen Intel Core systems should support it after a BIOS update, and newer PCs should mostly have it on by default. Our test system had it enabled for the B580 and for all the other GPUs we tested.

Performance and power

As a competitor to the RTX 4060, the Arc B580 is actually pretty appealing, whether you’re talking about 1080p or 1440p, in games with ray-tracing on or off. Even older DirectX 11 titles in our suite, like Grand Theft Auto V and Assassin’s Creed Odyssey, don’t seem to take the same performance hit as they did on older Arc cards.

Intel is essentially making a slightly stronger version of the argument that AMD has been trying to make with the RX 7600. AMD’s cards always come with the caveat of significantly worse performance in games with heavy ray-tracing effects, but the performance hit for Intel cards in ray-traced games looks a lot more like Nvidia’s than AMD’s. Playable ray-traced 1080p is well within reach for the Intel card, and in both Cyberpunk 2077 and Returnal, its performance came closer to the 8GB 4060 Ti’s.

The 12GB of RAM is also enough to put more space between the B580 and the 8GB versions of the 4060 and 7600. Forza Horizon 5 performs significantly better at 1440p on cards with more memory, like the B580 and the 16GB RX 7600 XT, and it’s a safe bet that the 8GB limit will become more of a factor for high-end games at higher resolutions as the years go on.

We experienced just one performance anomaly in our testing. Forza Horizon 5 actually runs a bit worse with XeSS enabled, with a smooth average frame rate but frequent stutters that make it less playable overall (though it’s worth noting that Forza Horizon 5 never benefits much from upscaling algorithms on any GPUs we’ve tested, for whatever reason). Intel also alerted us to a possible issue with Cyberpunk 2077 when enabling ray-tracing but recommended a workaround that involved pressing F1 to reset the game’s settings; the benchmark ran fine on our testbed.

GPU power consumption numbers under load. Credit: Andrew Cunningham

Power consumption is another place where the Battlemage GPU plays a lot of catch-up with Nvidia. With the caveat that software-measured power usage numbers like ours are less accurate than numbers captured with hardware tools, it looks like the B580, when fully loaded, consumes somewhere between 120 and 130 W in Hitman and Borderlands. This is a tad higher than the 4060, but it’s lower than either Radeon RX 7600 card.

It’s not the top of the class, but looking at the A750’s power consumption shows how far Intel has come—the B580 beats the A750’s performance every single time while consuming about 60 W less power.

A strong contender, a late arrival

The Intel Arc B580. Credit: Andrew Cunningham

Intel is explicitly targeting Nvidia’s GeForce RTX 4060 with the Arc B580, a role it fills well for a low starting price. But the B580 is perhaps more damaging to AMD, which positions both of its 7600-series cards (and the remaining 6600-series stuff that’s hanging around) in the same cheaper-than-Nvidia-with-caveats niche.

In fact, I’d probably recommend the B580 to a budget GPU buyer over any of the Radeon RX 7600 cards at this point. For the same street price as the RX 7600, Intel is providing better performance in most games and much better performance in ray-traced games. The 16GB 7600 XT has more RAM, but it’s $90 to $100 more expensive, and a 12GB card is still reasonably future-proof and decent at 1440p.

All of that said, Intel is putting out a great competitor to the RTX 4060 and RX 7600 a year and a half after those cards both launched—and within just a few months of a possible RTX 5060. Intel is selling mid-2023’s midrange GPU performance in late 2024. There are actually good arguments for building a budget gaming PC right this minute, before potential Trump-administration tariffs can affect prices or supply chains, but assuming the tech industry can maintain its normal patterns, it would be smartest to wait and see what Nvidia does next.

Nvidia also has some important structural benefits. DLSS upscaling support is nearly ubiquitous in high-end games, Nvidia’s drivers are more battle-tested, and it’s extremely unlikely that Nvidia will decide to pull out of the GPU market and stop driver development any time soon (Intel has published a roadmap encompassing multiple GPU generations, which is reassuring, but the company’s recent financial distress has seen it shed several money-losing hobby projects).

If there’s a saving grace for Intel and the B580, it’s that Nvidia has signaled, both through its statements and its behavior, that it’s mostly uninterested in aggressively lowering GPU prices, either over time (Nvidia GPUs tend not to stray far from MSRP, barring supply issues) or between generations. An RTX 5060 is highly unlikely to be cheaper than a 4060 and could easily be more expensive. Depending on how good a hypothetical RTX 5060 is, Intel still has a lot of room to offer good performance for the price in a $200-to-$250-ish GPU market that doesn’t get a ton of attention.

The other issue for Intel is that for a second straight GPU generation, the company is launching late with a part that is forced by its performance to play in a budget-oriented, low-margin area of the GPU market. I’m not expecting a 4090- or 5090-killer out of Intel any time soon, but based on the B580, I’m at least a little optimistic that Intel can offer a B700-series card that can credibly compete with the likes of Nvidia’s 4070-series or AMD’s 7800 XT and 7900 GRE. Performance-wise, that’s the current sweet spot of the GPU market, but you’ll spend more than you would on a PS5 to buy most of those cards. If Intel can shake up that part of the business, it could help put Arc on the map.

The good

  • Solid midrange 1080p and 1440p performance at a good starting price
  • More RAM than the competition
  • Much-improved power efficiency compared to Arc A-series GPUs
  • Unlike with the A-series, we noticed no outliers where performance was disproportionately bad
  • Simple, clean-looking reference design from Intel

The bad

  • Competing with cards that launched a year and a half ago
  • New Nvidia and AMD competitors are likely within a few months
  • Intel still can’t compete at the high end of the GPU market, or even the medium-high end

The ugly

  • So far, Arc cards have not been successful enough to guarantee their long-term existence

Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.

Intel Arc B580 review: A $249 RTX 4060 killer, one-and-a-half years later Read More »

Reminder: Donate to win swag in our annual Charity Drive sweepstakes

If you’ve been too busy punching virtual Nazis to take part in this year’s Ars Technica Charity Drive sweepstakes, don’t worry. You still have time to donate to a good cause and get a chance to win your share of over $4,000 worth of swag (no purchase necessary to win).

In the first three days of the drive, over 100 readers have contributed almost $9,500 to either the Electronic Frontier Foundation or Child’s Play as part of the charity drive (Child’s Play is now leading in the donation totals by about $1,000). That’s a long way off from 2020’s record haul of over $58,000, but there’s still plenty of time until the Charity Drive wraps up on Thursday, January 2, 2025.

That doesn’t mean you should put your donation off, though. Do yourself and the charities involved a favor and give now while you’re thinking about it.

See below for instructions on how to enter, and check out the Charity Drive kickoff post for a complete list of available prizes.

How it works

Donating is easy. Simply donate to Child’s Play using a credit card or PayPal or donate to the EFF using PayPal, credit card, or cryptocurrency. You can also support Child’s Play directly by using this Ars Technica campaign page or picking an item from the Amazon wish list of a specific hospital on its donation page. Donate as much or as little as you feel comfortable with—every little bit helps.

Reminder: Donate to win swag in our annual Charity Drive sweepstakes Read More »

The Talos Principle: Reawakened adds new engine, looks, and content to a classic

Are humans just squishy machines? Can an artificially intelligent robot create a true moral compass for itself? Is there a best time to play The Talos Principle again?

At least one of those questions now has a partial answer. The Talos Principle: Reawakened, due in “Early 2025,” will bundle the original critically acclaimed 2014 game, its Road to Gehenna DLC, and a new chapter, “In the Beginning,” into an effectively definitive edition. Developer commentary and a level editor will also be packed in. But most of all, the whole game has been rebuilt from the ground up in Unreal Engine 5, bringing “vastly improved visuals” and quality-of-life boosts to the game, according to publisher Devolver Digital.

Trailer for The Talos Principle: Reawakened.

Playing Reawakened, according to its Steam page, requires a minimum of 8 GB of RAM, 75 GB of storage space, and something more than an Intel integrated GPU. It also recommends 16 GB of RAM, something close to a GeForce RTX 3070, and a 6–8-core CPU.

It starts off with puzzle pieces and gets a bit more complicated as you go on. Credit: Devolver Digital

The Talos Principle, from the developers of the Serious Sam series, takes its name from the bronze automaton that protected Crete in Greek mythology. The gameplay has you solve a huge assortment of puzzles as a robot avatar and answer the serious philosophical questions the game ponders. You don’t shoot things or become a stealth archer, but you deal with drones, turrets, and other obstacles that require some navigation, tool use, and deeper thinking. As you progress, you learn more about what happened to the world, why you’re being challenged with these puzzles, and what choices an artificial intelligence can really make. It’s certainly not bad timing for this game to arrive once more.

If you can’t wait until the remaster, the original game and its also well-regarded sequel, The Talos Principle II, are on deep sale at the moment, both on Steam (I and II) and GOG (I and II).

The Talos Principle: Reawakened adds new engine, looks, and content to a classic Read More »

Itch.io platform briefly goes down due to “AI-driven” anti-phishing report

The itch.io domain was back up and running by 7 am Eastern, according to media reports, “after the registrant finally responded to our notice and took appropriate action to resolve the issue.” Users could access the site throughout if they typed the itch.io IP address into their web browser directly.

Too strong a shield?

BrandShield’s website describes it as a service that “detects and hunts online trademark infringement, counterfeit sales, and brand abuse across multiple platforms.” The company claims to have multiple Fortune 500 and FTSE100 companies on its client list.

In its own series of social media posts, BrandShield said its “AI-driven platform” had identified “an abuse of Funko… from an itch.io subdomain.” The takedown request it filed was focused on that subdomain, not the entirety of itch.io, BrandShield said.

“The temporary takedown of the website was a decision made by the service providers, not BrandShield or Funko.”

The whole affair highlights how the delicate web of domain registrars and DNS servers can remain a key failure point for web-based businesses. Back in May, we saw how the desyncing of a single DNS root server could cause problems across the entire Internet. And in 2012, the hacking collective Anonymous highlighted the potential for a coordinated attack to take down the entire DNS system.

Itch.io platform briefly goes down due to “AI-driven” anti-phishing report Read More »

Indiana Jones and the Great Circle is pitch-perfect archaeological adventuring


Review: Amazing open-world environs round out a tight, fun-filled adventure story.

No need to put Harrison Ford through the de-aging filter here! Credit: Bethesda / MachineGames

Historically, games based on popular film or TV franchises have generally been seen as cheap cash-ins, slapping familiar characters and settings on a shovelware clone of a popular genre and counting on the license to sell enough copies to devoted fans. Indiana Jones and the Great Circle clearly has grander ambitions than that, putting a AAA budget behind a unique open-world exploration game built around stealth, melee combat, and puzzle solving.

Building such a game on top of such well-loved source material comes with plenty of challenges. The developers at MachineGames need to pay homage to the source material without resorting to the kind of slavish devotion that amounts to a mere retread of a familiar story. At the same time, any new Indy adventure carries with it the weight not just of the character’s many film and TV appearances but also well-remembered games like Indiana Jones and the Fate of Atlantis. Then there are game franchises like Tomb Raider and Uncharted, which have already put their own significant stamps on the Indiana Jones formula of action-packed, devil-may-care treasure-hunting.

No, this is not a scene from a new Uncharted game. Credit: Bethesda / MachineGames

Surprisingly, Indiana Jones and the Great Circle bears all this pressure pretty well. While the stealth-exploration gameplay and simplistic puzzles can feel a bit trite at points, the game’s excellent presentation, top-notch world-building, and fun-filled, campy storyline drive one of Indy’s most memorable adventures since the original movie trilogy.

A fun-filled adventure

The year is 1937, and Indiana Jones has already Raided a Lost Ark but has yet to investigate the Last Crusade. After a short introductory flashback that retells an interactive version of Raiders of the Lost Ark‘s famous golden idol extraction, Professor Jones gets unexpectedly drawn away from preparations for midterms when a giant of a man breaks into Marshall College’s antiquities wing and steals a lone mummified cat.

Investigating that theft takes Jones on a globetrotting tour of locations along “The Great Circle,” a ring of archaeologically significant sites around the world that house ancient artifacts rumored to hold great and mysterious power. Those rumors have attracted the attention of the Nazis (who else would you expect?), dragging Indy into a race to secure the artifacts before they threaten to alter the course of an impending world war.

You see a whip, I see a grappling hook. Credit: Bethesda / MachineGames

The game’s overarching narrative—told mainly through lengthy cut scenes that serve as the most captivating reward for in-game achievements—does a pitch-perfect job of replicating the campy, madcap, fun-filled, adventurous tone Indy is known for. The writing is full of all the pithy one-liners and cheesy puns you could hope for, as well as countless overt and subtle references to Indy movie moments that will be familiar to even casual fans.

Indy here is his usual mix of archaeological superhero and bumbling everyman. One moment, he’s using his whip and some hard-to-believe upper body strength to jump around some quickly crumbling ruins. The next, he’s avoiding death during a madcap fight scene through a combination of sheer dumb luck and overconfident opposition. The next, he’s solving ancient riddles with reams of historical techno-babble and showing a downright supernatural ability to decipher long-dead languages in an instant when the plot demands it.

You have to admit it, this circle is pretty great! Credit: Bethesda / MachineGames

It all works in large part thanks to Troy Baker’s excellent vocal performance as Jones, which he somehow pulls off as a compelling cross between Harrison Ford and Jeff Goldblum. The music does some heavy lifting in setting the tone, too; it’s full of downright cinematic stirring horns and tension-packed strings that fade in and out perfectly in sync with the on-screen action. The game even shows some great restraint in its sparing use of the famous Indiana Jones theme, which I ended up humming to myself as I played more often than I actually heard it referenced in the game’s score.

Indy quips well off of Gina, a roving reporter searching for her missing sister who serves as the obligatory love interest/globetrotting exploration partner. But the game’s best scenes all involve Emmerich Voss, the Nazi archaeologist antagonist who makes an absolute meal out of his scenery chewing. From his obsession with cranial shapes to his preening diatribes about the inferiority of American culture, Voss makes the perfect foil for Indy’s no-nonsense, homespun apple pie forthrightness.

Voss steals literally every scene he’s in. Credit: Bethesda / MachineGames

By the time the plot descends into an inevitable mess of pseudo-religious magical mysticism, it’s clear that this is a story that doesn’t take itself too seriously. You may cringe a bit at how over the top it all gets, but you’ll probably be having too much fun to care.

Take a look around

In between the cut scenes—which together could form the basis for a strong Indiana Jones-themed episodic streaming miniseries—there’s an actual interactive game to play here as well. That game primarily plays out across three decently sized maps—one urban, one desert, and one water-logged marsh—that you can explore relatively freely, broken up by shorter, more linear interludes in between.

Following the main story quests in each of these locales generally has you zigzagging across the map through a series of glorified fetch quests. Go to location A to collect some mystical doodad, then return it to unlock some fun exposition and a reason to go to location B. Repeat as necessary.

I say “location A” there, but it’s usually more accurate to say the game points you toward “circle A” on the map. Once you get there, you often have to do a bit of unguided exploring to find the hidden trinket or secret entry point you need.

Am I going in the right direction? Credit: Bethesda / MachineGames

At their best, these exploration bits made me feel more like an archaeological detective than the usual in-game tourist blindly following a waypoint from location to location. At its worst, I spent 15 minutes searching through one of these map circles before finding my in-game partner Gina standing right next to the target I was probably intended to find immediately. So it goes.

Traipsing across the map in this way slowly reveals the sizable scale of the game’s environments, which often extend beyond what’s first apparent on the map to multi-floor buildings and gigantic subterranean caverns. Unlocking and/or figuring out all of the best paths through these labyrinthine locales—which can involve climbing across rooftops or crawling through enemy barracks—is often half the fun.

As you crisscross the map, you also invariably stumble on a seemingly endless array of optional sidequests, mysteries, and “fieldwork,” which you keep track of in a dynamically updated journal. While there’s an attempt at a plot justification for each of these optional fetch quests, the ones I tried ended up being much less compelling than the main plot, which seems to have taken most of the writers’ attention.

Indiana Jones, famous Vatican tourist. Credit: Bethesda / MachineGames

As you explore, a tiny icon in the corner of the screen will also alert you to photo opportunities, which can unlock important bits of lore or context for puzzles. I thoroughly enjoyed these quick excuses to appreciate the game’s well-designed architecture and environments, even as it made Indy feel a bit more like a random tourist than a badass archaeologist hero.

Quick, hide!

Unfortunately, your ability to freely explore The Great Circle‘s environments is often hampered by large groups of roaming Nazi and/or fascist soldiers. Sometimes, you can put on a disguise to walk among them unseen, but even then, certain enemies can pick you out of the crowd, something that was not clear to me until I had already been plucked out of obscurity more than a few times.

When undisguised, you’ll spend a lot of time kneeling and sneaking silently just outside the soldiers’ vision cones or patiently waiting for them to move so you can sneak through a newly safe path. Remaining unseen also lets you silently take out enemies from behind, which includes pushing unsuspecting enemy sentries off of ledges in a hilarious move that never, ever gets old.

They’ll never find me up here. Credit: Bethesda / MachineGames

When your sneaking skills fail you amid a large group of enemies, the best and easiest thing to do is immediately run and hide. For the most part, the enemies are incredibly inept in their inevitable pursuit; dodge around a couple of corners and hide in a dark alley and they’ll usually quickly lose track of you. While I appreciated that being spotted wasn’t an instant death sentence, the ease with which I could outsmart these soldiers made the sneaking a lot less tense.

If you get spotted by a group of just one or two enemy soldiers, though, it’s time for some first-person melee combat, which draws heavy inspiration from the developers’ previous work on the early ’00s Chronicles of Riddick games. These fights usually play out like the world’s most overdesigned game of Punch-Out!!—you stand there waiting for a heavily telegraphed punch to come in, at which point you throw up a quick block or dodge and then counter with a series of rapid, crunchy punches of your own. Repeat until the enemy goes down.

You can spice things up a bit here by disarming and/or unbalancing your foes with your whip or by grabbing a wide variety of nearby objects to use as improvised melee weapons. After a while, though, all the fistfights start to feel pretty rote and unmemorable. The first time you hit a Nazi upside the head with a plunger is hilarious. The fifth time is a bit tiresome.

It’s always a good time to punch a Nazi. Credit: Bethesda / MachineGames

While you can also pull out a trusty revolver to simply shoot your foes, the racket the shots make usually leads to so much unwelcome enemy attention that it’s rarely worth the trouble. Aside from a handful of obligatory sections where the game practically forces you into a shooting gallery situation, I found little need to engage in the serviceable but unexciting gun combat.

And while The Great Circle is far from a horror game, there are a few combat moments of genuine terror with foes more formidable than the average grunt. I don’t want to give away too much, but those with fear of underwater creatures, the dark, or confined spaces will find some parts of the game incredibly tense.

Not so puzzling

My favorite gameplay moments in The Great Circle were the extended sections where I didn’t have to worry about stealth or combat and could just focus on exploring massive underground ruins. These feature some of the game’s most interesting traversal challenges, where looking around and figuring out just how to make it to the next objective is engaging on its own terms. There’s little of the Uncharted-style gameplay of practically highlighting every handhold and jump with a flashing red sign.

When giant mechanical gears need placing, you know who to call! Credit: Bethesda / MachineGames

These exploratory bits are broken up by some obligatory puzzles, usually involving Indiana Jones’ trademark of unbelievably intricate ancient stone machinery. Arrange the giant stone gears so the door opens, put the right relic in the right spot, shine a light on some emblems with a few mirrors, and so on. You know the drill if you’ve played any number of similar action-adventure games, and you probably won’t be all that engaged if you know how to perform some basic logic and exploration (though snapping pictures with the in-game camera offers hints for those who get unexpectedly stuck).

But even during the least engaging puzzles or humdrum fights in The Great Circle, I was compelled forward by the promise of some intricate ruin or pithy cut scene quip to come. Like the best Indiana Jones movies, there’s a propulsive force to the game’s most exciting scenes that helps you push past any brief feelings of tedium in between. Here’s hoping we see a lot more of this version of Indiana Jones in the future.

A note on performance

Indiana Jones and the Great Circle has received some recent negative attention for having relatively beefy system requirements, including calling for GPUs that have some form of real-time ray-tracing acceleration. We tested the game on a system with an Nvidia RTX 2080 Ti and an Intel i7-8700K CPU with 32 GB of RAM, which puts it roughly between the “minimum” and “recommended” specs suggested by the publisher.

Trace those rays. Credit: Bethesda / MachineGames

Despite this, we were able to run the game at 1440p resolution and “High” graphical settings at a steady 60 fps throughout. The game did occasionally suffer some heavy frame stuttering when loading new scenes, and far-off background elements had a tendency to noticeably “pop in” when running, but otherwise, we had few complaints about the graphical performance.

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from University of Maryland. He once wrote a whole book about Minesweeper.

Indiana Jones and the Great Circle is pitch-perfect archaeological adventuring Read More »

Google’s Genie 2 “world model” reveal leaves more questions than answers


Making a command out of your wish?

Long-term persistence, real-time interactions remain huge hurdles for AI worlds.

A sample of some of the best-looking Genie 2 worlds Google wants to show off. Credit: Google Deepmind

In March, Google showed off its first Genie AI model. After training on thousands of hours of 2D run-and-jump video games, the model could generate halfway-passable, interactive impressions of those games based on generic images or text descriptions.

Nine months later, this week’s reveal of the Genie 2 model expands that idea into the realm of fully 3D worlds, complete with controllable third- or first-person avatars. Google’s announcement talks up Genie 2’s role as a “foundational world model” that can create a fully interactive internal representation of a virtual environment. That could allow AI agents to train themselves in synthetic but realistic environments, Google says, forming an important stepping stone on the way to artificial general intelligence.

But while Genie 2 shows just how much progress Google’s DeepMind team has achieved in the last nine months, the limited public information about the model thus far leaves a lot of questions about how close we are to these foundational world models being useful for anything beyond some short but sweet demos.

How long is your memory?

Much like the original 2D Genie model, Genie 2 starts from a single image or text description and then generates subsequent frames of video based on both the previous frames and fresh input from the user (such as a movement direction or “jump”). Google says it trained on a “large-scale video dataset” to achieve this, but it doesn’t say just how much training data was necessary compared to the 30,000 hours of footage used to train the first Genie.
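
In outline, that kind of action-conditioned, frame-by-frame generation is an autoregressive loop. The toy sketch below illustrates the general shape of the approach described in Google’s post; the class, its fixed context window, and the placeholder “sampling” step are assumptions for illustration, not Google’s actual architecture.

```python
from collections import deque

class ToyWorldModel:
    """Illustrative stand-in for an action-conditioned video model like Genie 2.

    It tracks only a fixed window of recent frames; a real model would run a
    large neural network in _sample() instead of returning a placeholder.
    """

    def __init__(self, context_frames=16):
        self.context = deque(maxlen=context_frames)  # recent frames the model conditions on
        self.step = 0

    def prime(self, first_frame):
        """Seed the world from a single starting image (or one generated from a text prompt)."""
        self.context.append(first_frame)

    def next_frame(self, action):
        """Predict the next frame from recent frames plus the player's latest input."""
        self.step += 1
        frame = self._sample(list(self.context), action)
        self.context.append(frame)
        return frame

    def _sample(self, recent_frames, action):
        # Placeholder "generation": a real model would synthesize pixels here.
        return {"step": self.step, "conditioned_on": len(recent_frames), "action": action}

def play(model, get_player_action, render, steps=60):
    """The interactive loop: read input, generate the next frame, display it, repeat."""
    for _ in range(steps):
        frame = model.next_frame(get_player_action())
        render(frame)

# Tiny usage example with stand-in input and display functions
model = ToyWorldModel()
model.prime({"step": 0, "prompt": "a third-person avatar in a desert canyon"})
play(model, get_player_action=lambda: "move_forward", render=print, steps=3)
```

Even this toy version makes one limitation obvious: anything that falls out of the fixed context window is simply gone and has to be invented anew when it comes back into view, which is essentially the consistency problem discussed below.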

Short GIF demos on the Google DeepMind promotional page show Genie 2 being used to animate avatars ranging from wooden puppets to intricate robots to a boat on the water. Simple interactions shown in those GIFs demonstrate those avatars busting balloons, climbing ladders, and shooting exploding barrels without any explicit game engine describing those interactions.

Those Genie 2-generated pyramids will still be there in 30 seconds. But in five minutes? Credit: Google Deepmind

Perhaps the biggest advance claimed by Google here is Genie 2’s “long horizon memory.” This feature allows the model to remember parts of the world as they come out of view and then render them accurately as they come back into the frame based on avatar movement. This kind of persistence has proven to be a persistent problem for video generation models like Sora, which OpenAI said in February “do[es] not always yield correct changes in object state” and can develop “incoherencies… in long duration samples.”

The “long horizon” part of “long horizon memory” is perhaps a little overzealous here, though, as Genie 2 only “maintains a consistent world for up to a minute,” with “the majority of examples shown lasting [10 to 20 seconds].” Those are definitely impressive time horizons in the world of AI video consistency, but it’s pretty far from what you’d expect from any other real-time game engine. Imagine entering a town in a Skyrim-style RPG, then coming back five minutes later to find that the game engine had forgotten what that town looks like and generated a completely different town from scratch instead.

What are we prototyping, exactly?

Perhaps for this reason, Google suggests Genie 2 as it stands is less useful for creating a complete game experience and more to “rapidly prototype diverse interactive experiences” or to turn “concept art and drawings… into fully interactive environments.”

The ability to transform static “concept art” into lightly interactive “concept videos” could definitely be useful for visual artists brainstorming ideas for new game worlds. However, these kinds of AI-generated samples might be less useful for prototyping actual game designs that go beyond the visual.

On Bluesky, British game designer Sam Barlow (Silent Hill: Shattered Memories, Her Story) points out how game designers often use a process called whiteboxing to lay out the structure of a game world as simple white boxes well before the artistic vision is set. The idea, he says, is to “prove out and create a gameplay-first version of the game that we can lock so that art can come in and add expensive visuals to the structure. We build in lo-fi because it allows us to focus on these issues and iterate on them cheaply before we are too far gone to correct.”

Generating elaborate visual worlds using a model like Genie 2 before designing that underlying structure feels a bit like putting the cart before the horse. The process almost seems designed to generate generic, “asset flip”-style worlds with AI-generated visuals papered over generic interactions and architecture.

As podcaster Ryan Zhao put it on Bluesky, “The design process has gone wrong when what you need to prototype is ‘what if there was a space.'”

Gotta go fast

When Google revealed the first version of Genie earlier this year, it also published a detailed research paper outlining the specific steps taken behind the scenes to train the model and how that model generated interactive videos. No such research paper has been published detailing Genie 2’s process, leaving us guessing at some important details.

One of the most important of these details is model speed. The first Genie model generated its world at roughly one frame per second, a rate that was orders of magnitude slower than would be tolerably playable in real time. For Genie 2, Google only says that “the samples in this blog post are generated by an undistilled base model, to show what is possible. We can play a distilled version in real-time with a reduction in quality of the outputs.”

Reading between the lines, it sounds like the full version of Genie 2 operates at something well below the real-time interactions implied by those flashy GIFs. It’s unclear how much “reduction in quality” is necessary to get a diluted version of the model to real-time controls, but given the lack of examples presented by Google, we have to assume that reduction is significant.

Oasis’ AI-generated Minecraft clone shows great potential, but still has a lot of rough edges, so to speak. Credit: Oasis

Real-time, interactive AI video generation isn’t exactly a pipe dream. Earlier this year, AI model maker Decart and hardware maker Etched published the Oasis model, showing off a human-controllable, AI-generated video clone of Minecraft that runs at a full 20 frames per second. However, that 500 million parameter model was trained on millions of hours of footage of a single, relatively simple game, and focused exclusively on the limited set of actions and environmental designs inherent to that game.

When Oasis launched, its creators fully admitted the model “struggles with domain generalization,” showing how “realistic” starting scenes had to be reduced to simplistic Minecraft blocks to achieve good results. And even with those limitations, it’s not hard to find footage of Oasis degenerating into horrifying nightmare fuel after just a few minutes of play.

What started as a realistic-looking soldier in this Genie 2 demo degenerates into this blobby mess just seconds later. Credit: Google Deepmind

We can already see similar signs of degeneration in the extremely short GIFs shared by the Genie team, such as an avatar’s dream-like fuzz during high-speed movement or NPCs that quickly fade into undifferentiated blobs at a short distance. That’s not a great sign for a model whose “long memory horizon” is supposed to be a key feature.

A learning crèche for other AI agents?

From this image, Genie 2 could generate a useful training environment for an AI agent and a simple “pick a door” task. Credit: Google Deepmind

Genie 2 seems to be using individual game frames as the basis for the animations in its model. But it also seems able to infer some basic information about the objects in those frames and craft interactions with those objects in the way a game engine might.

Google’s blog post shows how a SIMA agent inserted into a Genie 2 scene can follow simple instructions like “enter the red door” or “enter the blue door,” controlling the avatar via simple keyboard and mouse inputs. That could potentially make Genie 2 environments a great test bed for AI agents in various synthetic worlds.
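
Mechanically, that setup reduces to a standard observation/action loop, with the generated world standing in for a game engine. The sketch below is a generic illustration of that loop; the class names, the action format, and the success check are hypothetical and are not drawn from Google’s SIMA or Genie APIs.

```python
import random

class ScriptedAgent:
    """Stand-in for a SIMA-style agent that maps (instruction, frame) to keyboard/mouse input."""

    def act(self, instruction, frame):
        # A real agent would run a vision-language policy; here we just pick random keys.
        return {"keys": [random.choice(["w", "a", "s", "d"])], "mouse_delta": (0, 0)}

class ToyGeneratedWorld:
    """Stand-in for a generated environment that returns a new frame for each action."""

    def __init__(self, start_frame):
        self.frame = start_frame

    def step(self, action):
        self.frame = {"last_action": action, "doors_visible": ["red", "blue"]}
        return self.frame

def instruction_satisfied(frame, instruction):
    # Placeholder: a real setup would need some success signal from the environment or a judge.
    return False

def run_episode(world, agent, instruction, max_steps=300):
    """Let the agent pursue a natural-language goal inside the generated world."""
    frame = world.frame
    for _ in range(max_steps):
        action = agent.act(instruction, frame)
        frame = world.step(action)
        if instruction_satisfied(frame, instruction):
            return True
    return False

# Usage: ask the agent to "enter the red door" in a freshly generated scene
world = ToyGeneratedWorld(start_frame={"doors_visible": ["red", "blue"]})
print(run_episode(world, ScriptedAgent(), "enter the red door", max_steps=10))
```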

Google claims rather grandiosely that Genie 2 puts it on “the path to solving a structural problem of training embodied agents safely while achieving the breadth and generality required to progress towards [artificial general intelligence].” Whether or not that ends up being true, recent research shows that agent learning gained from foundational models can be effectively applied to real-world robotics.

Using this kind of AI model to create worlds for other AI models to learn in might be the ultimate use case for this kind of technology. But when it comes to the dream of an AI model that can create generic 3D worlds that a human player could explore in real time, we might not be as close as it seems.

Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from University of Maryland. He once wrote a whole book about Minesweeper.

Google’s Genie 2 “world model” reveal leaves more questions than answers Read More »