

PS Placeable: The adorable mod that turns a PlayStation Portable into a console

When Sony launched the PlayStation Portable almost exactly 20 years ago, the value proposition was right there in the name: a PlayStation, but portable. But now modders have flipped that, introducing a PSP that can be played on a TV, console-style, and they’ve dubbed it the PS Placeable.

It’s a non-destructive mod to PSP-2000 and PSP-3000 systems that allows you to play PSP games on the TV off the original UMD physical media format, with a wireless controller like the PlayStation 4’s DualShock 4—all wrapped in a miniature, PlayStation 2-like enclosure.

Let’s be frank: One of the main reasons this thing gets special attention here is that its look is both clever and, well, kind of adorable. The miniaturization of the retro styling of the PlayStation 2 is a nice touch.

Of course, there have long been other ways to play some PSP games on the big screen—but there has always been one downside or another.

For example, you could connect the original PSP to a TV with convoluted cables, but you would then have to use that tethered handheld as your controller.

Much later, the PlayStation TV set-top box, made by Sony itself, was essentially a console-style take on the PlayStation Vita. Like the Vita, it could play a selection of classic PSP games, and it supported wireless controllers. But it only worked with games downloaded through Sony’s digital store, which left most of the PSP library unplayable.



Intel Arc B580 review: A $249 RTX 4060 killer, one-and-a-half years later


Intel has solved the biggest problems with its Arc GPUs, but not the timing.

Intel’s Arc B580 design doesn’t include LEDs or other frills, but it’s a clean-looking design. Credit: Andrew Cunningham

Intel doesn’t have a ton to show for its dedicated GPU efforts yet.

After much anticipation, many delays, and an anticipatory apology tour for its software quality, Intel launched its first Arc GPUs at the end of 2022. There were things to like about the A770 and A750, but buggy drivers, poor performance in older games, and relatively high power use made them difficult to recommend. They were more notable as curiosities than as consumer graphics cards.

The result, after more than two years on the market, is that Arc GPUs remain a statistical nonentity in the GPU market, according to analysts and the Steam Hardware Survey. But it was always going to take time—and probably a couple of hardware generations—for Intel to make meaningful headway against entrenched competitors.

Intel’s reference design is pretty by the book, with two fans, a single 8-pin power connector, and a long heatsink and fan shroud that extends several inches beyond the end of the PCB. Andrew Cunningham

The new Arc B580 card, the first dedicated GPU based on the new “Battlemage” architecture, launches into the exact same “sub-$300 value-for-money” graphics card segment that the A770 and A750 are already stuck in. But it’s a major improvement over those cards in just about every way, and Intel has gone a long way toward fixing drivers and other issues that plagued the first Arc cards at launch. If nothing else, the B580 suggests that Intel has some staying power and that the B700-series GPUs could be genuinely exciting if Intel can get one out relatively soon.

Specs and testbed notes

Specs for the Arc B580 and B570. Credit: Intel

The Arc B580 and Arc B570 lead the charge for the Battlemage generation. Both are based on the same GPU silicon, but the B580 has a few more execution resources, slightly higher clock speeds, a 192-bit memory bus instead of 160-bit, and 12GB of memory instead of 10GB.

Intel positions both cards as entry-level 1440p options because they have a bit more RAM than the 8GB baseline of the GeForce RTX 4060 and Radeon RX 7600. These 8GB cards are still generally fine at 1080p, but more memory does make the Arc cards feel a little more future-proof, especially since they’re fast enough to actually hit 60 fps in a lot of games at 1440p.

Our testbed remains largely the same as it has been for a while, though we’ve swapped the ASRock X670E board for an Asus model. The Ryzen 7 7800X3D remains the heart of the system, with more than enough performance to avoid bottlenecking midrange and high-end GPUs.

We haven’t done extensive re-testing of most older GPUs—the GeForce and Radeon numbers here are the same ones we used in the RX 7600 XT review earlier this year. We wouldn’t expect new drivers to change the scores in our games much since they’re mostly a bit older—we still use a mix of DirectX 11 and DirectX 12 games, including a few with and without ray-tracing effects enabled. We have re-tested the older Arc cards with recent drivers since Intel does still occasionally make changes that can have a noticeable impact on older games.

As with the Arc A-series cards, Intel emphatically recommends that resizable BAR be enabled for your motherboard to get optimal performance. This is sometimes called Smart Access Memory or SAM, depending on your board; most AMD AM4 and 8th-gen Intel Core systems should support it after a BIOS update, and newer PCs should mostly have it on by default. Our test system had it enabled for the B580 and for all the other GPUs we tested.
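If you want to verify the setting actually took effect, here is one rough way to check from Linux by parsing `lspci` output. The PCI slot address, the `rebar_bar0_size` helper, and the sample output format are illustrative assumptions on our part (recent pciutils versions report a "Physical Resizable BAR" capability), not anything from Intel's guidance:

```python
# Rough sketch: check Resizable BAR status on Linux by parsing `lspci -vv`
# output. The slot address and output format are illustrative; recent
# pciutils reports a "Physical Resizable BAR" capability with BAR sizes.
import re
import subprocess

def rebar_bar0_size(lspci_output: str):
    """Return BAR 0's current size string (e.g. '16GB'), or None if absent."""
    m = re.search(r"BAR 0: current size: (\w+)", lspci_output)
    return m.group(1) if m else None

def check_gpu(slot="03:00.0"):
    """Query one PCI device (the slot here is a placeholder) and parse it."""
    out = subprocess.run(["lspci", "-vv", "-s", slot],
                        capture_output=True, text=True, check=True).stdout
    return rebar_bar0_size(out)
```

On Windows, utilities like GPU-Z expose the same resizable BAR status in their UI.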

Performance and power

As a competitor to the RTX 4060, the Arc B580 is actually pretty appealing, whether you’re talking about 1080p or 1440p, in games with ray-tracing on or off. Even older DirectX 11 titles in our suite, like Grand Theft Auto V and Assassin’s Creed Odyssey, don’t seem to take the same performance hit as they did on older Arc cards.

Intel is essentially making a slightly stronger version of the argument that AMD has been trying to make with the RX 7600. AMD’s cards always come with the caveat of significantly worse performance in games with heavy ray-tracing effects, but the performance hit for Intel cards in ray-traced games looks a lot more like Nvidia’s than AMD’s. Playable ray-traced 1080p is well within reach for the Intel card, and in both Cyberpunk 2077 and Returnal, its performance came closer to the 8GB 4060 Ti’s.

The 12GB of RAM is also enough to put more space between the B580 and the 8GB versions of the 4060 and 7600. Forza Horizon 5 performs significantly better at 1440p on cards with more memory, like the B580 and the 16GB RX 7600 XT, and it’s a safe bet that the 8GB limit will become more of a factor for high-end games at higher resolutions as the years go on.

We experienced just one performance anomaly in our testing. Forza Horizon 5 actually runs a bit worse with XeSS enabled, with a smooth average frame rate but frequent stutters that make it less playable overall (though it’s worth noting that Forza Horizon 5 never benefits much from upscaling algorithms on any GPUs we’ve tested, for whatever reason). Intel also alerted us to a possible issue with Cyberpunk 2077 when enabling ray-tracing but recommended a workaround that involved pressing F1 to reset the game’s settings; the benchmark ran fine on our testbed.

GPU power consumption numbers under load. Credit: Andrew Cunningham

Power consumption is another place where the Battlemage GPU plays a lot of catch-up with Nvidia. With the caveat that software-measured power usage numbers like ours are less accurate than numbers captured with hardware tools, the B580, when fully loaded, consumes somewhere between 120 and 130 W in Hitman and Borderlands. This is a tad higher than the 4060, but it’s lower than either Radeon RX 7600.

It’s not the top of the class, but looking at the A750’s power consumption shows how far Intel has come—the B580 beats the A750’s performance every single time while consuming about 60 W less power.
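For context on what "software-measured" means here: numbers like these typically come from polling a vendor telemetry interface at a coarse interval during a benchmark run and averaging the samples, which smooths over fast transients that hardware probes catch. A minimal sketch, with `read_power_w` as a stand-in for whatever tool actually supplies readings:

```python
# Minimal sketch of software-side power logging: poll a telemetry source at
# a fixed interval during a benchmark run, then report average and peak.
# `read_power_w` is a stand-in for a real vendor tool, not an actual API.
import time

def log_power(read_power_w, duration_s=10.0, interval_s=0.5, sleep=time.sleep):
    """Collect power samples (watts) for duration_s; return (avg, peak)."""
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        samples.append(read_power_w())
        sleep(interval_s)
        elapsed += interval_s
    return sum(samples) / len(samples), max(samples)

# Synthetic run: load ramps from 110 W to 130 W (sleep stubbed out).
readings = iter([110.0, 120.0, 125.0, 130.0])
avg, peak = log_power(lambda: next(readings), duration_s=2.0,
                      interval_s=0.5, sleep=lambda s: None)
```

Hardware measurement instead samples current and voltage on the power connector and slot rails at much higher rates, which is why it catches the brief spikes an approach like this averages away.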

A strong contender, a late arrival

The Intel Arc B580. Credit: Andrew Cunningham

Intel is explicitly targeting Nvidia’s GeForce RTX 4060 with the Arc B580, a role it fills well for a low starting price. But the B580 is perhaps more damaging to AMD, which positions both of its 7600-series cards (and the remaining 6600-series stuff that’s hanging around) in the same cheaper-than-Nvidia-with-caveats niche.

In fact, I’d probably recommend the B580 to a budget GPU buyer over any of the Radeon RX 7600 cards at this point. For the same street price as the RX 7600, Intel is providing better performance in most games and much better performance in ray-traced games. The 16GB 7600 XT has more RAM, but it’s $90 to $100 more expensive, and a 12GB card is still reasonably future-proof and decent at 1440p.

All of that said, Intel is putting out a great competitor to the RTX 4060 and RX 7600 a year and a half after those cards both launched—and within just a few months of a possible RTX 5060. Intel is selling mid-2023’s midrange GPU performance in late 2024. There are actually good arguments for building a budget gaming PC right this minute, before potential Trump-administration tariffs can affect prices or supply chains, but assuming the tech industry can maintain its normal patterns, it would be smartest to wait and see what Nvidia does next.

Nvidia also has some important structural benefits. DLSS upscaling support is nearly ubiquitous in high-end games, Nvidia’s drivers are more battle-tested, and it’s extremely unlikely that Nvidia will decide to pull out of the GPU market and stop driver development any time soon (Intel has published a roadmap encompassing multiple GPU generations, which is reassuring, but the company’s recent financial distress has seen it shed several money-losing hobby projects).

If there’s a saving grace for Intel and the B580, it’s that Nvidia has signaled, both through its statements and its behavior, that it’s mostly uninterested in aggressively lowering GPU prices, either over time (Nvidia GPUs tend not to stray far from MSRP, barring supply issues) or between generations. An RTX 5060 is highly unlikely to be cheaper than a 4060 and could easily be more expensive. Depending on how good a hypothetical RTX 5060 is, Intel still has a lot of room to offer good performance for the price in a $200-to-$250-ish GPU market that doesn’t get a ton of attention.

The other issue for Intel is that for a second straight GPU generation, the company is launching late with a part that is forced by its performance to play in a budget-oriented, low-margin area of the GPU market. I’m not expecting a 4090- or 5090-killer out of Intel any time soon, but based on the B580, I’m at least a little optimistic that Intel can offer a B700-series card that can credibly compete with the likes of Nvidia’s 4070-series or AMD’s 7800 XT and 7900 GRE. Performance-wise, that’s the current sweet spot of the GPU market, but you’ll spend more than you would on a PS5 to buy most of those cards. If Intel can shake up that part of the business, it could help put Arc on the map.

The good

  • Solid midrange 1080p and 1440p performance at a good starting price
  • More RAM than the competition
  • Much-improved power efficiency compared to Arc A-series GPUs
  • Unlike the A-series, we noticed no outliers where performance was disproportionately bad
  • Simple, clean-looking reference design from Intel

The bad

  • Competing with cards that launched a year and a half ago
  • New Nvidia and AMD competitors are likely within a few months
  • Intel still can’t compete at the high end of the GPU market, or even the medium-high end

The ugly

  • So far, Arc cards have not been successful enough to guarantee their long-term existence


Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.



Reminder: Donate to win swag in our annual Charity Drive sweepstakes

If you’ve been too busy punching virtual Nazis to take part in this year’s Ars Technica Charity Drive sweepstakes, don’t worry. You still have time to donate to a good cause and get a chance to win your share of over $4,000 worth of swag (no purchase necessary to win).

In the first three days of the drive, over 100 readers have contributed almost $9,500 to either the Electronic Frontier Foundation or Child’s Play as part of the charity drive (Child’s Play is now leading in the donation totals by about $1,000). That’s a long way off from 2020’s record haul of over $58,000, but there’s still plenty of time until the Charity Drive wraps up on Thursday, January 2, 2025.

That doesn’t mean you should put your donation off, though. Do yourself and the charities involved a favor and give now while you’re thinking about it.

See below for instructions on how to enter, and check out the Charity Drive kickoff post for a complete list of available prizes.

How it works

Donating is easy. Simply donate to Child’s Play using a credit card or PayPal or donate to the EFF using PayPal, credit card, or cryptocurrency. You can also support Child’s Play directly by using this Ars Technica campaign page or picking an item from the Amazon wish list of a specific hospital on its donation page. Donate as much or as little as you feel comfortable with—every little bit helps.



The Talos Principle: Reawakened adds new engine, looks, and content to a classic

Are humans just squishy machines? Can an artificially intelligent robot create a true moral compass for itself? Is there a best time to play The Talos Principle again?

The answer to at least one of these questions is now somewhat answered. The Talos Principle: Reawakened, due in “Early 2025,” will bundle the original critically acclaimed 2014 game, its Road to Gehenna DLC, and a new chapter, “In the Beginning,” into an effectively definitive edition. Developer commentary and a level editor will also be packed in. But most of all, the whole game has been rebuilt from the ground up in Unreal Engine 5, bringing “vastly improved visuals” and quality-of-life boosts to the game, according to publisher Devolver Digital.

Trailer for The Talos Principle: Reawakened.

Playing Reawakened, according to its Steam page, requires a minimum of 8 GB of RAM, 75 GB of storage space, and something more than an Intel integrated GPU. It also recommends 16 GB of RAM, something close to a GeForce RTX 3070, and a 6–8-core CPU.

It starts off with puzzle pieces and gets a bit more complicated as you go on. Credit: Devolver Digital

The Talos Principle, from the developers of the Serious Sam series, takes its name from the bronze automaton that protected Crete in Greek mythology. The gameplay has you solve a huge assortment of puzzles as a robot avatar and answer the serious philosophical questions that it ponders. You don’t shoot things or become a stealth archer, but you deal with drones, turrets, and other obstacles that require some navigation, tool use, and deeper thinking. As you progress, you learn more about what happened to the world, why you’re being challenged with these puzzles, and what choices an artificial intelligence can really make. It’s certainly not bad timing for this game to arrive once more.

If you can’t wait until the remaster, the original game and its also well-regarded sequel, The Talos Principle II, are on deep sale at the moment, both on Steam (I and II) and GOG (I and II).



Itch.io platform briefly goes down due to “AI-driven” anti-phishing report

The itch.io domain was back up and running by 7 am Eastern, according to media reports, “after the registrant finally responded to our notice and took appropriate action to resolve the issue.” Users could access the site throughout if they typed the itch.io IP address into their web browser directly.

Too strong a shield?

BrandShield’s website describes it as a service that “detects and hunts online trademark infringement, counterfeit sales, and brand abuse across multiple platforms.” The company claims to have multiple Fortune 500 and FTSE100 companies on its client list.

In its own series of social media posts, BrandShield said its “AI-driven platform” had identified “an abuse of Funko… from an itch.io subdomain.” The takedown request it filed was focused on that subdomain, not the entirety of itch.io, BrandShield said.

“The temporary takedown of the website was a decision made by the service providers, not BrandShield or Funko.”

The whole affair highlights how the delicate web of domain registrars and DNS servers can remain a key failure point for web-based businesses. Back in May, we saw how the desyncing of a single DNS root server could cause problems across the entire Internet. And in 2012, the hacking collective Anonymous highlighted the potential for a coordinated attack to take down the entire DNS system.



Indiana Jones and the Great Circle is pitch-perfect archaeological adventuring


Review: Amazing open-world environs round out a tight, fun-filled adventure story.

No need to put Harrison Ford through the de-aging filter here! Credit: Bethesda / MachineGames

Historically, games based on popular film or TV franchises have generally been seen as cheap cash-ins, slapping familiar characters and settings on a shovelware clone of a popular genre and counting on the license to sell enough copies to devoted fans. Indiana Jones and the Great Circle clearly has grander ambitions than that, putting a AAA budget behind a unique open-world exploration game built around stealth, melee combat, and puzzle solving.

Building such a game on top of such well-loved source material comes with plenty of challenges. The developers at MachineGames need to pay homage to the source material without resorting to the kind of slavish devotion that amounts to a mere retread of a familiar story. At the same time, any new Indy adventure carries with it the weight not just of the character’s many film and TV appearances but also well-remembered games like Indiana Jones and the Fate of Atlantis. Then there are game franchises like Tomb Raider and Uncharted, which have already put their own significant stamps on the Indiana Jones formula of action-packed, devil-may-care treasure-hunting.

No, this is not a scene from a new Uncharted game. Credit: Bethesda / MachineGames

Surprisingly, Indiana Jones and the Great Circle bears all this pressure pretty well. While the stealth-exploration gameplay and simplistic puzzles can feel a bit trite at points, the game’s excellent presentation, top-notch world-building, and fun-filled, campy storyline drive one of Indy’s most memorable adventures since the original movie trilogy.

A fun-filled adventure

The year is 1937, and Indiana Jones has already Raided a Lost Ark but has yet to investigate the Last Crusade. After a short introductory flashback that retells an interactive version of Raiders of the Lost Ark‘s famous golden idol extraction, Professor Jones gets unexpectedly drawn away from preparations for midterms when a giant of a man breaks into Marshall College’s antiquities wing and steals a lone mummified cat.

Investigating that theft takes Jones on a globetrotting tour of locations along “The Great Circle,” a ring of archaeologically significant sites around the world that house ancient artifacts rumored to hold great and mysterious power. Those rumors have attracted the attention of the Nazis (who else would you expect?), dragging Indy into a race to secure the artifacts before they threaten to alter the course of an impending world war.

You see a whip, I see a grappling hook. Credit: Bethesda / MachineGames

The game’s overarching narrative—told mainly through lengthy cut scenes that serve as the most captivating reward for in-game achievements—does a pitch-perfect job of replicating the campy, madcap, fun-filled, adventurous tone Indy is known for. The writing is full of all the pithy one-liners and cheesy puns you could hope for, as well as countless overt and subtle references to Indy movie moments that will be familiar to even casual fans.

Indy here is his usual mix of archaeological superhero and bumbling everyman. One moment, he’s using his whip and some hard-to-believe upper body strength to jump around some quickly crumbling ruins. The next, he’s avoiding death during a madcap fight scene through a combination of sheer dumb luck and overconfident opposition. The next, he’s solving ancient riddles with reams of historical techno-babble and showing a downright supernatural ability to decipher long-dead languages in an instant when the plot demands it.

You have to admit it, this circle is pretty great! Credit: Bethesda / MachineGames

It all works in large part thanks to Troy Baker’s excellent vocal performance as Jones, which he somehow pulls off as a compelling cross between Harrison Ford and Jeff Goldblum. The music does some heavy lifting in setting the tone, too; it’s full of downright cinematic stirring horns and tension-packed strings that fade in and out perfectly in sync with the on-screen action. The game even shows some great restraint in its sparing use of the famous Indiana Jones theme, which I ended up humming to myself as I played more often than I actually heard it referenced in the game’s score.

Indy quips well off of Gina, a roving reporter searching for her missing sister who serves as the obligatory love interest/globetrotting exploration partner. But the game’s best scenes all involve Emmerich Voss, the Nazi archaeologist antagonist who makes an absolute meal out of his scenery chewing. From his obsession with cranial shapes to his preening diatribes about the inferiority of American culture, Voss makes the perfect foil for Indy’s no-nonsense, homespun apple pie forthrightness.

Voss steals literally every scene he’s in. Credit: Bethesda / MachineGames

By the time the plot descends into an inevitable mess of pseudo-religious magical mysticism, it’s clear that this is a story that doesn’t take itself too seriously. You may cringe a bit at how over the top it all gets, but you’ll probably be having too much fun to care.

Take a look around

In between the cut scenes—which together could form the basis for a strong Indiana Jones-themed episodic streaming miniseries—there’s an actual interactive game to play here as well. That game primarily plays out across three decently sized maps—one urban, one desert, and one water-logged marsh—that you can explore relatively freely, broken up by shorter, more linear interludes in between.

Following the main story quests in each of these locales generally has you zigzagging across the map through a series of glorified fetch quests. Go to location A to collect some mystical doodad, then return it to unlock some fun exposition and a reason to go to location B. Repeat as necessary.

I say “location A” there, but it’s usually more accurate to say the game points you toward “circle A” on the map. Once you get there, you often have to do a bit of unguided exploring to find the hidden trinket or secret entry point you need.

Am I going in the right direction? Credit: Bethesda / MachineGames

At their best, these exploration bits made me feel more like an archaeological detective than the usual in-game tourist blindly following a waypoint from location to location. At their worst, they had me spending 15 minutes searching through one of these map circles before finding my in-game partner, Gina, standing right next to the target I was probably meant to spot immediately. So it goes.

Traipsing across the map in this way slowly reveals the sizable scale of the game’s environments, which often extend beyond what’s first apparent on the map to multi-floor buildings and gigantic subterranean caverns. Unlocking and/or figuring out all of the best paths through these labyrinthine locales—which can involve climbing across rooftops or crawling through enemy barracks—is often half the fun.

As you crisscross the map, you also invariably stumble on a seemingly endless array of optional sidequests, mysteries, and “fieldwork,” which you keep track of in a dynamically updated journal. While there’s an attempt at a plot justification for each of these optional fetch quests, the ones I tried ended up being much less compelling than the main plot, which seems to have taken most of the writers’ attention.

Indiana Jones, famous Vatican tourist. Credit: Bethesda / MachineGames

As you explore, a tiny icon in the corner of the screen will also alert you to photo opportunities, which can unlock important bits of lore or context for puzzles. I thoroughly enjoyed these quick excuses to appreciate the game’s well-designed architecture and environments, even as it made Indy feel a bit more like a random tourist than a badass archaeologist hero.

Quick, hide!

Unfortunately, your ability to freely explore The Great Circle‘s environments is often hampered by large groups of roaming Nazi and/or fascist soldiers. Sometimes, you can put on a disguise to walk among them unseen, but even then, certain enemies can pick you out of the crowd, something that was not clear to me until I had already been plucked out of obscurity more than a few times.

When undisguised, you’ll spend a lot of time kneeling and sneaking silently just outside the soldiers’ vision cones or patiently waiting for them to move so you can sneak through a newly safe path. Remaining unseen also lets you silently take out enemies from behind, which includes pushing unsuspecting enemy sentries off of ledges in a hilarious move that never, ever gets old.

They’ll never find me up here. Credit: Bethesda / MachineGames

When your sneaking skills fail you amid a large group of enemies, the best and easiest thing to do is immediately run and hide. For the most part, the enemies are incredibly inept in their inevitable pursuit; dodge around a couple of corners and hide in a dark alley and they’ll usually quickly lose track of you. While I appreciated that being spotted wasn’t an instant death sentence, the ease with which I could outsmart these soldiers made the sneaking a lot less tense.

If you get spotted by a group of just one or two enemy soldiers, though, it’s time for some first-person melee combat, which draws heavy inspiration from the developers’ previous work on the early ’00s Chronicles of Riddick games. These fights usually play out like the world’s most overdesigned game of Punch-Out!!—you stand there waiting for a heavily telegraphed punch to come in, at which point you throw up a quick block or dodge and then counter with a series of rapid, crunchy punches of your own. Repeat until the enemy goes down.

You can spice things up a bit here by disarming and/or unbalancing your foes with your whip or by grabbing a wide variety of nearby objects to use as improvised melee weapons. After a while, though, all the fistfights start to feel pretty rote and unmemorable. The first time you hit a Nazi upside the head with a plunger is hilarious. The fifth time is a bit tiresome.

It’s always a good time to punch a Nazi. Credit: Bethesda / MachineGames

While you can also pull out a trusty revolver to simply shoot your foes, the racket the shots make usually leads to so much unwelcome enemy attention that it’s rarely worth the trouble. Aside from a handful of obligatory sections where the game practically forces you into a shooting gallery situation, I found little need to engage in the serviceable but unexciting gun combat.

And while The Great Circle is far from a horror game, there are a few combat moments of genuine terror with foes more formidable than the average grunt. I don’t want to give away too much, but those with fear of underwater creatures, the dark, or confined spaces will find some parts of the game incredibly tense.

Not so puzzling

My favorite gameplay moments in The Great Circle were the extended sections where I didn’t have to worry about stealth or combat and could just focus on exploring massive underground ruins. These feature some of the game’s most interesting traversal challenges, where looking around and figuring out just how to make it to the next objective is engaging on its own terms. There’s little of the Uncharted-style gameplay of practically highlighting every handhold and jump with a flashing red sign.

When giant mechanical gears need placing, you know who to call! Credit: Bethesda / MachineGames

These exploratory bits are broken up by some obligatory puzzles, usually involving Indiana Jones’ trademark of unbelievably intricate ancient stone machinery. Arrange the giant stone gears so the door opens, put the right relic in the right spot, shine a light on some emblems with a few mirrors, and so on. You know the drill if you’ve played any number of similar action-adventure games, and you probably won’t be all that engaged if you know how to perform some basic logic and exploration (though snapping pictures with the in-game camera offers hints for those who get unexpectedly stuck).

But even during the least engaging puzzles or humdrum fights in The Great Circle, I was compelled forward by the promise of some intricate ruin or pithy cut scene quip to come. Like the best Indiana Jones movies, there’s a propulsive force to the game’s most exciting scenes that helps you push past any brief feelings of tedium in between. Here’s hoping we see a lot more of this version of Indiana Jones in the future.

A note on performance

Indiana Jones and the Great Circle has received some recent negative attention for having relatively beefy system requirements, including calling for GPUs that have some form of real-time ray-tracing acceleration. We tested the game on a system with an Nvidia RTX 2080 Ti and an Intel i7-8700K CPU with 32 GB of RAM, which puts it roughly between the “minimum” and “recommended” specs suggested by the publisher.

Trace those rays. Credit: Bethesda / MachineGames

Despite this, we were able to run the game at 1440p resolution and “High” graphical settings at a steady 60 fps throughout. The game did occasionally suffer some heavy frame stuttering when loading new scenes, and far-off background elements had a tendency to noticeably “pop in” when running, but otherwise, we had few complaints about the graphical performance.
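The stuttering described above is exactly the kind of problem a steady average frame rate hides, which is why frame-time percentiles ("1% lows") are the standard way to quantify it. A hypothetical sketch of that calculation (the numbers here are made up for illustration, not benchmark data):

```python
# Sketch of the common frame-pacing summary: average fps plus "1% low" fps,
# both computed from a list of per-frame render times in milliseconds.

def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from frame times in milliseconds."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # "1% low": mean fps over the slowest 1% of frames (at least one frame).
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = 1000.0 * n / sum(worst[:n])
    return avg_fps, one_pct_low

# A steady ~60 fps stream with one 100 ms hitch: the average still looks
# fine, but the 1% low exposes the stutter.
times = [16.7] * 99 + [100.0]
avg, low = fps_stats(times)
```

A run with perfectly even pacing would have its 1% low sit right next to its average; the wider the gap, the more noticeable the stutter.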


Kyle Orland has been the Senior Gaming Editor at Ars Technica since 2012, writing primarily about the business, tech, and culture behind video games. He has journalism and computer science degrees from University of Maryland. He once wrote a whole book about Minesweeper.



Google’s Genie 2 “world model” reveal leaves more questions than answers



Long-term persistence, real-time interactions remain huge hurdles for AI worlds.

A sample of some of the best-looking Genie 2 worlds Google wants to show off. Credit: Google DeepMind

In March, Google showed off its first Genie AI model. After training on thousands of hours of 2D run-and-jump video games, the model could generate halfway-passable, interactive impressions of those games based on generic images or text descriptions.

Nine months later, this week’s reveal of the Genie 2 model expands that idea into the realm of fully 3D worlds, complete with controllable third- or first-person avatars. Google’s announcement talks up Genie 2’s role as a “foundational world model” that can create a fully interactive internal representation of a virtual environment. That could allow AI agents to train themselves in synthetic but realistic environments, Google says, forming an important stepping stone on the way to artificial general intelligence.

But while Genie 2 shows just how much progress Google’s DeepMind team has achieved in the last nine months, the limited public information about the model thus far leaves a lot of questions about how close we are to these foundational model worlds being useful for anything but some short but sweet demos.

How long is your memory?

Much like the original 2D Genie model, Genie 2 starts from a single image or text description and then generates subsequent frames of video based on both the previous frames and fresh input from the user (such as a movement direction or “jump”). Google says it trained on a “large-scale video dataset” to achieve this, but it doesn’t say just how much training data was necessary compared to the 30,000 hours of footage used to train the first Genie.
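
Google hasn’t published Genie 2’s architecture, but the loop described here (each new frame conditioned on the frames generated so far plus the user’s latest input) can be sketched in a few lines of Python. The toy model and frame names below are purely illustrative, not anything from Google’s actual system:

```python
# Illustrative sketch of an autoregressive world-model rollout, not
# Genie 2 itself: each new frame depends on all prior frames plus the
# player's fresh input (e.g., a movement direction or "jump").

def generate_session(model, prompt_frame, inputs):
    """Autoregressively roll out frames from a single starting frame."""
    frames = [prompt_frame]
    for action in inputs:
        # The model conditions on the full history and the new input.
        next_frame = model(frames, action)
        frames.append(next_frame)
    return frames

# A toy stand-in "model" that just tags frames, to show the data flow.
toy_model = lambda history, action: f"frame_{len(history)}_{action}"

rollout = generate_session(toy_model, "frame_0_start", ["up", "jump"])
print(rollout)  # ['frame_0_start', 'frame_1_up', 'frame_2_jump']
```

A real model would be a large neural network operating on pixels rather than strings, but this generate-append-repeat control flow is also what makes long-horizon consistency hard: any detail the model fails to carry forward through that growing history is simply gone.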

Short GIF demos on the Google DeepMind promotional page show Genie 2 being used to animate avatars ranging from wooden puppets to intricate robots to a boat on the water. Simple interactions shown in those GIFs demonstrate those avatars busting balloons, climbing ladders, and shooting exploding barrels without any explicit game engine describing those interactions.

Those Genie 2-generated pyramids will still be there in 30 seconds. But in five minutes? Credit: Google DeepMind

Perhaps the biggest advance claimed by Google here is Genie 2’s “long horizon memory.” This feature allows the model to remember parts of the world as they come out of view and then render them accurately as they come back into the frame based on avatar movement. This kind of persistence has proven to be a persistent problem for video generation models like Sora, which OpenAI said in February “do[es] not always yield correct changes in object state” and can develop “incoherencies… in long duration samples.”

The “long horizon” part of “long horizon memory” is perhaps a little overzealous here, though, as Genie 2 only “maintains a consistent world for up to a minute,” with “the majority of examples shown lasting [10 to 20 seconds].” Those are definitely impressive time horizons in the world of AI video consistency, but it’s pretty far from what you’d expect from any other real-time game engine. Imagine entering a town in a Skyrim-style RPG, then coming back five minutes later to find that the game engine had forgotten what that town looks like and generated a completely different town from scratch instead.

What are we prototyping, exactly?

Perhaps for this reason, Google suggests Genie 2 as it stands is less useful for creating a complete game experience and more to “rapidly prototype diverse interactive experiences” or to turn “concept art and drawings… into fully interactive environments.”

The ability to transform static “concept art” into lightly interactive “concept videos” could definitely be useful for visual artists brainstorming ideas for new game worlds. However, these kinds of AI-generated samples might be less useful for prototyping actual game designs that go beyond the visual.

On Bluesky, British game designer Sam Barlow (Silent Hill: Shattered Memories, Her Story) points out how game designers often use a process called whiteboxing to lay out the structure of a game world as simple white boxes well before the artistic vision is set. The idea, he says, is to “prove out and create a gameplay-first version of the game that we can lock so that art can come in and add expensive visuals to the structure. We build in lo-fi because it allows us to focus on these issues and iterate on them cheaply before we are too far gone to correct.”

Generating elaborate visual worlds using a model like Genie 2 before designing that underlying structure feels a bit like putting the cart before the horse. The process almost seems designed to generate generic, “asset flip”-style worlds with AI-generated visuals papered over generic interactions and architecture.

As podcaster Ryan Zhao put it on Bluesky, “The design process has gone wrong when what you need to prototype is ‘what if there was a space.'”

Gotta go fast

When Google revealed the first version of Genie earlier this year, it also published a detailed research paper outlining the specific steps taken behind the scenes to train the model and how that model generated interactive videos. No such research paper has been published detailing Genie 2’s process, leaving us guessing at some important details.

One of the most important of these details is model speed. The first Genie model generated its world at roughly one frame per second, a rate that was orders of magnitude slower than would be tolerably playable in real time. For Genie 2, Google only says that “the samples in this blog post are generated by an undistilled base model, to show what is possible. We can play a distilled version in real-time with a reduction in quality of the outputs.”

Reading between the lines, it sounds like the full version of Genie 2 operates at something well below the real-time interactions implied by those flashy GIFs. It’s unclear how much “reduction in quality” is necessary to get a diluted version of the model to real-time controls, but given the lack of examples presented by Google, we have to assume that reduction is significant.

Oasis’ AI-generated Minecraft clone shows great potential, but still has a lot of rough edges, so to speak. Credit: Oasis

Real-time, interactive AI video generation isn’t exactly a pipe dream. Earlier this year, AI model maker Decart and hardware maker Etched published the Oasis model, showing off a human-controllable, AI-generated video clone of Minecraft that runs at a full 20 frames per second. However, that 500-million-parameter model was trained on millions of hours of footage of a single, relatively simple game, and focused exclusively on the limited set of actions and environmental designs inherent to that game.

When Oasis launched, its creators fully admitted the model “struggles with domain generalization,” showing how “realistic” starting scenes had to be reduced to simplistic Minecraft blocks to achieve good results. And even with those limitations, it’s not hard to find footage of Oasis degenerating into horrifying nightmare fuel after just a few minutes of play.

What started as a realistic-looking soldier in this Genie 2 demo degenerates into this blobby mess just seconds later. Credit: Google DeepMind

We can already see similar signs of degeneration in the extremely short GIFs shared by the Genie team, such as an avatar’s dream-like fuzz during high-speed movement or NPCs that quickly fade into undifferentiated blobs at a short distance. That’s not a great sign for a model whose “long memory horizon” is supposed to be a key feature.

A learning crèche for other AI agents?

From this image, Genie 2 could generate a useful training environment for an AI agent and a simple “pick a door” task. Credit: Google DeepMind

Genie 2 seems to be using individual game frames as the basis for the animations in its model. But it also seems able to infer some basic information about the objects in those frames and craft interactions with those objects in the way a game engine might.

Google’s blog post shows how a SIMA agent inserted into a Genie 2 scene can follow simple instructions like “enter the red door” or “enter the blue door,” controlling the avatar via simple keyboard and mouse inputs. That could potentially make Genie 2 environments a great test bed for AI agents in various synthetic worlds.

Google claims rather grandiosely that Genie 2 puts it on “the path to solving a structural problem of training embodied agents safely while achieving the breadth and generality required to progress towards [artificial general intelligence].” Whether or not that ends up being true, recent research shows that agent learning gained from foundational models can be effectively applied to real-world robotics.

Using this kind of AI model to create worlds for other AI models to learn in might be the ultimate use case for this kind of technology. But when it comes to the dream of an AI model that can create generic 3D worlds that a human player could explore in real time, we might not be as close as it seems.


Google’s Genie 2 “world model” reveal leaves more questions than answers Read More »

the-return-of-steam-machines?-valve-rolls-out-new-“powered-by-steamos”-branding.

The return of Steam Machines? Valve rolls out new “Powered by SteamOS” branding.

Longtime Valve watchers likely remember Steam Machines, the company’s aborted, pre-Steam Deck attempt at crafting a line of third-party gaming PC hardware based around an early version of its Linux-based SteamOS. Now, there are strong signs that Valve is on the verge of launching a similar third-party hardware branding effort under the “Powered by SteamOS” label.

The newest sign of those plans comes via newly updated branding guidelines posted by Valve on Wednesday (as noticed by the trackers at SteamDB). That update includes the first appearance of a new “Powered by SteamOS” logo intended “for hardware running the SteamOS operating system, implemented in close collaboration with Valve.”

The document goes on to clarify that the new Powered by SteamOS logo “indicates that a hardware device will run the SteamOS and boot into SteamOS upon powering on the device.” That’s distinct from the licensed branding for merely “Steam Compatible” devices, which include “non-Valve input peripherals” that have been reviewed by Valve to work with Steam.

The new guidelines replace an older set of branding guidelines, last revised in late 2017, that included detailed instructions for how to use the old “Steam Machines” name and logo on third-party hardware. That branding has been functionally defunct for years, making Valve’s apparent need to suddenly update it more than a little suspect.

The return of Steam Machines? Valve rolls out new “Powered by SteamOS” branding. Read More »

intel’s-second-generation-arc-b580-gpu-beats-nvidia’s-rtx-4060-for-$249

Intel’s second-generation Arc B580 GPU beats Nvidia’s RTX 4060 for $249

Turnover at the top of the company isn’t stopping Intel from launching new products: Today the company is announcing the first of its next-generation B-series Intel Arc GPUs, the Arc B580 and Arc B570.

Both are decidedly midrange graphics cards that will compete with the likes of Nvidia’s GeForce RTX 4060 and AMD’s RX 7600 series, but Intel is pricing them competitively: $249 for a B580 with 12GB of RAM and $219 for a B570 with 10GB of RAM. The B580 launches on December 13, while the B570 won’t be available until January 16.

The two cards are Intel’s first dedicated GPUs based on its next-generation “Battlemage” architecture, a successor to the “Alchemist” architecture used in the A-series cards. Intel’s Core Ultra 200 laptop processors were its first products to ship with Battlemage, though they used an integrated version with fewer of Intel’s Xe cores and no dedicated memory. Both B-series GPUs use silicon manufactured on a 5 nm TSMC process, an upgrade from the 6 nm process used for the A-series; as of this writing, no integrated or dedicated Arc GPUs have been manufactured by one of Intel’s factories.

Both cards use a single 8-pin power connector, at least in Intel’s reference design; Intel is offering a first-party limited-edition version of the B580, while it looks like partners like Asus, ASRock, Gunnir, Maxsun, Onix, and Sparkle will be responsible for the B570.

Compared to the original Arc GPUs, both Battlemage cards should benefit from the work Intel has put into its graphics drivers over the last two years—a combination of performance improvements plus translation layers for older versions of DirectX have all improved Arc’s performance quite a bit in older games since late 2022. Hopefully buyers won’t need to wait months or years to get good performance out of the Battlemage cards.

The new cards also come with XeSS 2, the next-generation version of Intel’s upscaling technology (analogous to DLSS for Nvidia cards and FSR for AMD’s). Like DLSS 3 and FSR 3, one of XeSS 2’s main additions is a frame-generation feature that can interpolate additional frames to insert between the frames that are actually being rendered by the graphics card. These kinds of technologies tend to work best when the cards are already running at a reasonably high frame rate, but when they’re working well, they can lead to smoother-looking gameplay. A related technology, Xe Low Latency, aims to reduce the increase in latency that comes with frame-generation technologies, similar to Nvidia’s Reflex and AMD’s Anti-Lag.
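
Frame generation is conceptually simple, even if the production versions are not. As a deliberately naive sketch (real implementations like XeSS 2, DLSS 3, and FSR 3 also lean on motion vectors and optical flow, and their internals are proprietary), a synthesized in-between frame can be thought of as a blend of two rendered ones:

```python
# A deliberately simplified illustration of frame interpolation, not
# Intel's actual XeSS 2 algorithm: blend two rendered frames to
# synthesize an in-between frame at blend factor t.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Linearly blend two frames (here, flat lists of pixel values)."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# Two "rendered" frames; the interpolated frame lands halfway between.
mid_frame = interpolate_frame([0.0, 100.0], [10.0, 200.0])
print(mid_frame)  # [5.0, 150.0]
```

This toy version also hints at why the technique works best when the card is already rendering at a reasonably high frame rate: the farther apart the two real frames are, the worse any in-between guess represents what actually happened between them.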

Intel’s second-generation Arc B580 GPU beats Nvidia’s RTX 4060 for $249 Read More »

the-raspberry-pi-5-now-works-as-a-smaller,-faster-kind-of-steam-link

The Raspberry Pi 5 now works as a smaller, faster kind of Steam Link

The Steam Link was a little box ahead of its time. It streamed games from a PC to a TV, ran 1,500 of them natively, offered a strange (if somewhat lovable) little controller, and essentially required a great network, Ethernet cables, and a good deal of fiddling.

Valve quietly discontinued the Steam Link gear in November 2018, but it didn’t give up. These days, a Steam Link app can be found on most platforms, and Valve’s sustained effort to move Linux-based (i.e., non-Windows-controlled) gaming forward has paid real dividends. If you still want a dedicated device to stream Steam games, however? A Raspberry Pi 5 (with some help from Valve) can be a Substitute Steam Link.

As detailed in the Raspberry Pi blog, there were previously means of getting Steam Link working on Raspberry Pi devices, but the platform’s move away from proprietary Broadcom libraries—and from X to Wayland display systems—required “a different approach.” Sam Lantinga from Valve worked with the Raspberry Pi team on optimizing for the Raspberry Pi 5 hardware. As of Steam Link 1.3.13 for the little board, Raspberry Pi 5 units can support up to 1080p at 144 frames per second (FPS) with the H.264 codec, as well as 4K at 60 FPS or 1080p at 240 FPS, presuming your primary gaming computer and network can support that.

Jeff Geerling’s test of Steam Link on Raspberry Pi 5, showing some rather smooth Red Dead movement.

I have a documented preference for a Moonlight/Sunshine game streaming setup over Steam Link because I have better luck getting games streaming at their best on it. But it’s hard to beat Steam Link for ease of setup, given that it only requires Steam to be running on the host PC, plus a relatively simple configuration on the client screen. A Raspberry Pi 5 is an easy device to hide near your TV. And, of course, if you don’t end up using it, you only have 450 other things you can do with it.

The Raspberry Pi 5 now works as a smaller, faster kind of Steam Link Read More »

blizzard’s-pulling-of-warcraft-i-&-ii-tests-gog’s-new-preservation-program

Blizzard’s pulling of Warcraft I & II tests GOG’s new Preservation Program

GOG’s version goes a bit beyond the classic versions that were on sale on Battle.net. Beyond the broad promise that “this is the best version of this game you can buy on any PC platform,” GOG has made specific tweaks to the networking code for Warcraft I and fixed up the DirectX wrapper for Warcraft II to improve its scaling on modern monitor resolutions.

It’s quite a novel commitment, keeping non-revenue-generating games playable for buyers, even after a publisher no longer makes them available for sale. The Warcraft titles certainly won’t be the only games for which publisher enthusiasm lags behind GOG and its classic gamers.

As noted at the Preservation Program’s launch, for some titles, GOG does not have the rights to modify a game’s build, and only its original developers can do so. So if GOG can’t make it work in, say, DOSBox, extraordinary efforts may be required.

A screenshot from Blizzard's Warcraft II: Remastered release, showing brick keeps, archers, footsoldiers, dragons around a roost, and knights on horseback units.

Warcraft II: Remastered lets you switch back and forth between classic and remastered graphics and promises to offer better support for widescreen monitors and more units selected at once.

Credit: Blizzard


Beyond being tied to Blizzard’s Battle.net service in perpetuity, there are other reasons Warcraft fans might want to hold onto the originals. Blizzard’s 2020 release of Warcraft III Reforged was widely panned as uneven, unfinished, and in some ways unfair, as it, too, removed the original Warcraft III from stores. Reforged was still in rough shape a year later, leading Ars’ list of 2020’s most disappointing games. A 2.0 update promised a total reboot, but fans remain torn on the new art styles and are somewhat wary.

Then again, you can now select more units in the first two Warcraft games’ remasters, and you get “numerous visual updates for the UI.”

Blizzard’s pulling of Warcraft I & II tests GOG’s new Preservation Program Read More »

the-atari-7800+-is-a-no-frills-glimpse-into-a-forgotten-gaming-era

The Atari 7800+ is a no-frills glimpse into a forgotten gaming era


Awkward controls and a lack of features make a device for Atari completists only.

Shiny and chrome? In this economy? Credit: Kyle Orland

Like a lot of children of the ’80s, my early gaming nostalgia has a huge hole where the Atari 7800 might have lived. While practically everyone I knew had an NES during my childhood—and a few uncles and friends’ older siblings even had an Atari 2600 gathering dust in their dens—I was only vaguely aware of the 7800, Atari’s backward compatible, late ’80s attempt to maintain relevance in the quickly changing console market.

Absent that kind of nostalgia, the Atari 7800+ comes across as a real oddity. Fiddling with the system’s extremely cumbersome controllers and pixelated, arcade-port-heavy software library from a modern perspective is like peering into a fallen alternate universe, one where Nintendo wasn’t able to swoop in and revive a flailing Western home video game industry with the NES.

Even for those with fond memories of Atari 7800-filled childhoods, I’m not sure that this bare-bones package justifies its $130 price. There are many more full-featured ways to get your retro gaming fix, even for those still invested in the tail end of Atari’s dead-end branch of the gaming console’s evolutionary tree.

7800HD

Much like last year’s Atari 2600+, the 7800+ shell is a slightly slimmed-down version of Atari’s nostalgic hardware design. This time, Atari took design inspiration from the rainbow-adorned European version of the 7800 console (which released a year later), rather than the bulkier, less colorful US release.

A reverse angle showing how 7800 cartridges stick out with the art facing away from the front. Credit: Kyle Orland

The 7800+ plays any of the 58 officially licensed Atari 7800 cartridges released decades ago, as well as the dozens of homebrew cartridges released by coders in more recent years (some of which are now being sold for $30 each by the modern Atari corporation itself; more on those later). The data on those cartridges is run via the open source ProSystem emulator, which seems more than up to the job of re-creating the relatively ancient 7800 tech without any apparent slowdown, input lag, or graphical inconsistencies. The 15 to 30 seconds of loading time when you first plug in a new cartridge is more than a bit annoying, though.

The HDMI output from the 7800+ is the updated console’s main selling point, if anything is. The sharp, upscaled images work best on games with lots of horizontal and/or vertical lines and bright, single-colored sprites. But blowing up decades-old low-resolution graphics can also hurt the visual appeal of games designed to take advantage of the smoother edges and blended color gradients inherent to older cathode ray tube TVs.

Atari’s new console doesn’t offer the kind of scanline emulation or graphical filters that can help recreate that CRT glow in countless other emulation solutions (though a hardware switch does let you extend the standard 4:3 graphics to a sickeningly stretched-out 16:9). That means many of the sprites in games like Food Fight and Fatal Run end up looking like blocky riots of color when blown up to HD resolutions on the 7800+.

Beyond graphics, the 7800+ also doesn’t offer any modern emulation conveniences like save states, fast-forward and rewind, slow-mo, controller customization, or high-score tracking across sessions. Authenticity seems to have taken precedence over modern conveniences here.

Much like the original Atari 7800, the 7800+ is also backward-compatible with older Atari 2600 cartridges and controllers (re-created through the able Stella emulator). That’s a nice touch but also a little galling for anyone who already invested money in last year’s Atari 2600+, which the company is still selling for roughly the same price as the 7800+. Aside from the nostalgic styling of the box itself, we can’t see any reason why the less-capable 2600+ still needs to exist at all at this point.

A mess of a controller

In the US, the original Atari 7800 came with an oddly designed “ProLine” joystick featuring two buttons on either side of the base, designed to be hit with the thumb and index finger of your off hand. For the 7800+, Atari instead went with a controller modeled after the CX78 joypad released with the European version of the console.

This pad represents an odd inflection point in video game history, with a hard plastic thumbstick sticking out above a standard eight-way D-pad. Years before analog thumbsticks would become a console standard, this thumbstick feels incredibly fiddly for the console’s completely digital directional inputs. In a game like Asteroids Deluxe, for instance, I found turning to the right or left frequently led to thrusting forward with an accidental “up” push as well.

The CX78 pad was also the first packaged Atari controller with two face buttons, a la the familiar NES controller. Unfortunately, those buttons are spaced just far enough apart to make it extremely awkward to hit both at once using a single thumb, which is practically required in newer titles like Bentley Bear’s Crystal Quest. The whole thing seems designed for placing the controller in front of you and hitting the buttons with two separate fingers, which I found less than convenient.

The Atari 7800+ does feature two standard Atari console plugs in the front, making it compatible with pretty much all classic and revamped Atari controllers (and, oddly enough, Sega Genesis pads). Be wary, though; if a 7800 game requires two buttons, a lot of single-button Atari control options will prove insufficient.

The CX78+’s included wireless receivers (which plug into those controller ports) mean you don’t have to run any long cables from the system to your couch while playing the Atari 7800+. But a few important controls like pause and reset are stuck on the console itself—just as they were on the original Atari 7800—meaning you’ll probably want to have the system nearby anyway. It would have been nice to have additional buttons for these options on the controller itself, even if that would have diminished the authenticity of the controllers.

There are better versions of these games

The VIP package Atari sent me, along with a selection of cartridges. Credit: Kyle Orland

Since I’ve never owned an Atari 7800 cartridge, Atari sent me eight titles from its current line of retro cartridges to test alongside the updated hardware. This included a mix of original titles released in the ’80s and “homebrew elevation” cartridges that the company says are now “getting a well-deserved official Atari release.”

The titles I had to test were definitely a step up from the few dozen Atari 2600 games that I’ve accumulated and grown to tolerate over the years. A game like Asteroids Deluxe on the 7800 doesn’t quite match the vector graphics of the arcade original, but it comes a lot closer than the odd, colorful blobs of Asteroids on the 2600. The same goes for Frenzy on the 7800, which is a big step up from Berzerk on the 2600.

Still, I couldn’t help but feel that these arcade ports are better experienced these days on one of the many MAME-based or FPGA-based emulation boxes that can do justice to the original quarter munchers. And the more original titles I’ve sampled mostly ended up feeling like pale shadows of the NES games I knew and loved.

The new Bentley Bear’s Crystal Quest (which is included with the 7800+ package) comes across as an oversimplified knock-off of Adventure Island, for instance. And the rough vehicular combat of Fatal Run is much less engaging than the NES port of Atari’s own similar but superior Roadblasters arcade cabinet. The one exception to this rule that I found was Ninja Golf, a wacky, original mix of decent golfing and engaging run-and-punch combat.

Of course, I’m not really the target audience here. The ideal Atari 7800+ buyer is someone who still has nostalgic memories of the Atari 7800 games they played as a child and has held onto at least a few of them (and/or bought more modern homebrew cartridges) in the intervening decades.

If those retro gamers want an authentic but no-frills box that will upscale those cartridges for an HDTV, the Atari 7800+ will do the job and look cute on your mantel while it does. But any number of emulation solutions will probably do the job just as well and with more features to boot.


The Atari 7800+ is a no-frills glimpse into a forgotten gaming era Read More »