$500 aluminum version of the Analogue Pocket looks like the Game Boy’s final form

so metal —

Other Pocket iterations have stuck to colorful (and cheaper) plastic.

Analogue is launching another limited-edition version of its Pocket console, this time with an anodized aluminum body and buttons.

Analogue

Analogue has released multiple variations of the Analogue Pocket, its Game Boy-style handheld console that can play old cartridges and game ROMs using its FPGA chip. But until now, all of those designs have been riffs on the regular Pocket’s black (or white) plastic shell.

The company’s latest Pocket iteration might appeal more to people who prefer the solidity and durability of anodized aluminum to the cheap practicality of plastic. On July 15, the company will release a limited run of all-aluminum Analogue Pocket consoles in four different colors: white, gray, black, and a Game Boy Advance-esque indigo. The company says that “every single piece” of these consoles is “entirely CNC’d from aluminum,” including not just the frame but also all of the buttons.

The new material will cost you, though: Each aluminum Pocket sells for $500, over twice as much as the $220 price of a regular plastic Pocket.

The aluminum versions of the Pocket will run the exact same software as the standard plastic ones and will be compatible with all the same cartridges and accessories. Analogue’s site doesn’t compare the weight of the aluminum and plastic Pocket consoles, though intuitively we’d expect the metal one to be heavier. The aluminum consoles begin shipping on July 17.

An exploded version of the new Pocket; even the buttons are aluminum.

Analogue

When the Pocket first launched in late 2021, ongoing supply chain disruptions and high demand led to monthslong wait times for the initial models. Things have gotten slightly better since then—you can’t simply open Analogue’s store on any given day and just buy one, but the basic black and white plastic models restock with some regularity. Analogue has also released multiple special edition runs of the handheld, including one made of glow-in-the-dark plastic and a colorful series of models that recall Nintendo’s mid-’90s “Play It Loud!” hardware refresh for the original Game Boy.

As much as we liked the Pocket in our original review, the hardware has gotten much more capable thanks to a series of post-launch firmware updates. In the summer of 2022, Analogue added OpenFPGA support to the Pocket, allowing its FPGA chip to emulate consoles like the NES, SNES, and Sega Genesis, beyond the portable systems the Pocket was designed around. Updates toward the end of 2023 allowed those third-party emulation cores to use their own display filters, replicating the look of classic CRT TVs and other displays.

The updates have also fixed multiple bugs in the system. The latest update is version 2.2, released back in March, which primarily adds support for the Analogue Pocket Adapter Set that allows other kinds of vintage game cartridges to plug into the Pocket’s cartridge slot.

Arm tweaks AMD’s FSR to bring battery-saving GPU upscaling to phones and tablets

situation: there are 14 competing standards —

Arm “Accuracy Super Resolution” is optimized for power use and integrated GPUs.

An Arm sample image meant to show off its new “Accuracy Super Resolution” upscaling tech.

Arm

Some of the best Arm processors come from companies like Apple and Qualcomm, which license Arm’s processor instruction set but create their own custom or semi-custom CPU designs. But Arm continues to plug away on its own CPU and GPU architectures and related technologies, and the company has announced that it’s getting into the crowded field of graphics upscaling technology.

Arm’s Accuracy Super Resolution (ASR) is a temporal upscaler that is based on AMD’s open source FidelityFX Super Resolution 2, which Arm says allows developers to “benefit from the familiar API and configuration options.” (This AMD presentation from GDC 2023 gets into some of the differences between different kinds of upscalers.)

AMD’s FSR and Nvidia’s DLSS are mostly pitched to PC gamers as a way to boost performance and fidelity—increasing frame rates beyond 60 fps or rendering “4K” images on graphics cards that are too slow to do those things natively, for example. But since Arm devices are still (mostly, for now) phones and tablets, Arm is leaning into the potential power savings that come with lower GPU use. A less-busy GPU also runs cooler, reducing the likelihood of thermal throttling; Arm mentions reduced throttling as a benefit of ASR, though it doesn’t say how much of ASR’s performance advantage over FSR is attributable to reduced throttling.

“Using [ASR] rendered high-quality results at a stable, low temperature,” writes Arm Director for Ecosystem Strategy Peter Hodges. “Rendering at a native resolution inevitably led to undesirable thermal throttling, which in games can ruin the user experience and shorten engagement.”

Why not just use FSR2 without modification? Arm claims that the ASR upscaling tech has been tuned to reduce GPU usage and to run well on devices without a ton of memory bandwidth—think the integrated GPUs in low-power mobile chips rather than desktop-class graphics cards. ASR’s GPU use is as little as one-third of FSR2’s at the same target resolutions and scaling factors. Arm also claims that ASR delivers roughly 20 to 40 percent better frame rates than FSR2 on Arm devices, depending on the settings you’re using.
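
Arm hasn’t published ASR’s final tuning options, but since it’s derived from FSR2, a reasonable (and purely assumed) starting point is FSR2’s published per-axis quality-mode scale factors. Here’s a quick Python sketch of how much shading work those factors save at an example phone resolution:

```python
# Back-of-the-envelope math: how much rendering work a temporal upscaler
# saves at FSR2's published per-axis scale factors. Treating these as
# ASR's options is our assumption (ASR is FSR2-derived), not Arm's spec.

QUALITY_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to the output."""
    return round(out_w / scale), round(out_h / scale)

out_w, out_h = 2400, 1080  # a common phone display resolution, as an example
for mode, scale in QUALITY_MODES.items():
    w, h = internal_resolution(out_w, out_h, scale)
    saved = 1 - (w * h) / (out_w * out_h)
    print(f"{mode:>17}: renders {w}x{h}, ~{saved:.0%} fewer pixels shaded")
```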

  • Arm also says that reduced GPU usage when using ASR can lead to lower heat and improved battery life.

    Arm

  • Arm says that ASR runs faster and uses less power than FSR on the same mobile hardware.

    Arm

Arm says it used “a commercial mobile device that features an Arm Immortalis-G720 GPU” for its performance testing and that it worked with MediaTek to corroborate its power consumption numbers “using a Dimensity 9300 handset.”

When the ASR spec is released, it will be up to OS makers and game developers to implement it. Apple will likely stick with its own MetalFX upscaling technology—also derived from AMD’s FSR, for what that’s worth. Microsoft is pushing “Automatic Super Resolution” on Arm devices while also attempting to develop a vendor-agnostic upscaling API in “DirectSR.” Qualcomm announced Snapdragon Game Super Resolution a little over a year ago.

Arm’s upscaler has the benefit of being hardware-agnostic and also open source (Arm says it “want[s] to share [ASR] with the developer community under an MIT open-source license”), so other upscalers can benefit from its improvements. Qualcomm’s upscaler is a simpler spatial upscaler a la AMD’s first-generation FSR algorithm, so Arm’s upscaler could also end up producing superior image quality on the same GPUs.

We’re undeniably getting into the territory of that one xkcd comic about the proliferation of standards, but it’s at least interesting to see different companies using graphics upscaling technology to solve problems other than “make games look nicer.”

Listing image by Arm

Microsoft asks many Game Pass subscribers to pay more for less

Raking it in —

Launch day access to first-party titles now restricted to $19.99/month “Ultimate” tier.

Artist's conception of Microsoft executives after today's Game Pass pricing announcements.

Getty Images

For years now, Microsoft’s Xbox Game Pass has set itself apart by offering subscribers launch-day access to new first-party titles in addition to a large legacy library of older games. That important “day one” perk is now set to go away for all but the highest tier of Game Pass’ console subscribers, even as Microsoft asks for more money for Game Pass across the board.

Let’s start with the price increases for existing Game Pass tiers, which are relatively straightforward:

  • “Game Pass Ultimate” is going from $16.99 to $19.99 per month.
  • “Game Pass for PC” is going from $9.99 to $11.99 per month.
  • “Game Pass Core” (previously known as Xbox Live Gold) is going from $59.99 to $74.99 for annual subscriptions (and remains at $9.99 for monthly subscriptions).
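
For those keeping score, here’s a quick sanity check on those increases, including the jump (discussed below) from the old $10.99 console tier to $19.99 Ultimate:

```python
# Percentage increases implied by the prices above. The last entry is the
# effective jump for console subscribers who want to keep day-one releases.
old_new = {
    "Game Pass Ultimate (monthly)": (16.99, 19.99),
    "Game Pass for PC (monthly)": (9.99, 11.99),
    "Game Pass Core (annual)": (59.99, 74.99),
    "Console tier -> Ultimate": (10.99, 19.99),
}
for tier, (old, new) in old_new.items():
    print(f"{tier:>29}: ${old:.2f} -> ${new:.2f} (+{(new - old) / old:.1%})")
```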

Things get a bit more complicated for the $10.99/month “Xbox Game Pass for Console” tier. Microsoft announced that it will no longer accept new subscriptions for that tier after today, though current subscribers will be able to keep it (for now) if they auto-renew their subscriptions.

In its place, Microsoft will “in the coming months” roll out a new $14.99 “Xbox Game Pass Standard” tier. That new option will combine the usual access to “hundreds of high-quality games on console” with the “online console multiplayer” features that previously required a separate Xbox Game Pass Core subscription (“Core” will still be available separately and include access to a smaller “25+ game” library).

Quick and dirty chart by me to display the new Xbox Game Pass structure (subject to correction).

I hope this helps. pic.twitter.com/Qj6CX7i4kG

— Klobrille (@klobrille) July 10, 2024

But while the current Xbox Game Pass Console option promises access to Xbox Game Studios games “the same day they launch,” those “Day One releases” are conspicuously absent as a perk for the replacement Xbox Game Pass Standard subscription.

“Some games available with Xbox Game Pass Ultimate on day one will not be immediately available with Xbox Game Pass Standard and may be added to the library at a future date,” Microsoft writes in an FAQ explaining the changes.

Players who want guaranteed access to all those “Day One” releases will now have to subscribe to the $19.99/month Game Pass Ultimate. That’s an increase of roughly 82 percent over the $10.99/month that console players currently pay for similar “Day One” access on the disappearing Game Pass Console tier.

To be fair, that extra subscription money does come with some added benefits. Upgrading from Game Pass Console/Standard to Game Pass Ultimate lets you use Microsoft’s cloud gaming service, access downloadable PC games and the EA Play library, and get additional “free perks every month.” But it’s the launch day access to Microsoft’s system-selling first-party titles that really sets the Ultimate tier apart now, and which will likely necessitate a costly upgrade for many Xbox Game Pass subscribers.

More problems, more money

When Game Pass first launched in 2017, it was focused on legacy games, not day one launch titles.

While Xbox Game Pass launched in 2017, launch-day access to all of Microsoft’s new first-party games wasn’t promised to subscribers until the beginning of 2018. Since then, loyal Game Pass subscribers have been able to play dozens of new first-party titles at launch, from major franchises like Halo, Forza, and Gears of War to smaller standouts like Hi-Fi Rush, Sea of Thieves, and Ori and the Will of the Wisps.

Sure, access to hundreds of older games was nice. But the promise of brand-new major first-party titles was instrumental in driving Xbox Game Pass to 34 million subscribers as of February. And Sony has been unwilling to match that “day one” perk for its similar PlayStation Plus service, which includes only a handful of older PlayStation Studios titles.

In a 2022 interview with GamesIndustry.biz, PlayStation CEO Jim Ryan said putting new first-party games on its subscription service would break a “virtuous cycle” in which new full game purchases (at a price of up to $70) help fund the next round of game development. “The level of investment that we need to make in our studios would not be possible, and we think the knock-on effect on the quality of the games that we make would not be something that gamers want.”

And Microsoft may come to a similar conclusion. Including first-party titles with cheaper, console-focused Game Pass subscriptions probably seemed like a good idea when Microsoft was still trying to attract subscribers to the service. But Game Pass subscriber growth is starting to slow as the market of potential customers has become saturated. Microsoft now needs to extract more value from those subscribers to justify Game Pass cannibalizing direct sales of its own first-party games.

Microsoft paid a lot of money to add the value of Call of Duty: Black Ops 6 to a Game Pass subscription.

Activision

And let’s not forget Activision, which Microsoft recently spent a whopping $69 billion to acquire after lengthy legal and regulatory battles. Recouping that cost, while also offering Game Pass subscribers launch day access to massive sellers like Call of Duty, likely forced Microsoft to maximize Game Pass’ revenue-generating opportunities.

“Let’s put it this way: If 7 million Xbox Game Pass subscribers were planning to buy ‘Call of Duty’ for $70 but now have no reason to (as it’s part of their subscription), that leaves almost half a billion dollars of revenue on the table,” MIDia analyst Rhys Elliott told The Daily Upside by way of illustrating the significant numbers involved.

For players who enjoy a wide variety of games and would likely purchase all or most of Microsoft’s first-party titles at launch anyway, Xbox Game Pass Ultimate is still probably a good deal at its increased price. But players who subscribed to a relatively cheap console Game Pass option years ago may want to reevaluate whether maintaining that launch day access is now worth $240 a year.

Why 1994’s Lair of Squid was the weirdest pack-in game of all time

digital archaeology —

The HP 200LX included a mysterious maze game called Lair of Squid. We tracked down the author.

Artist’s impression of a squid jumping forth from an HP 200LX.

Aurich Lawson / HP

In 1994, Hewlett-Packard released a miracle machine: the HP 200LX pocket-size PC. In the depths of the device, among the MS-DOS productivity apps built into its fixed memory, there lurked a first-person maze game called Lair of Squid. Intrigued by the game, we tracked down its author, Andy Gryc, and probed into the title’s mysterious undersea origins.

“If you ask my family, they’ll confirm that I’ve been obsessed with squid for a long time,” Gryc told Ars Technica. “It’s admittedly very goofy—and that’s my fault—although I was inspired by Doom, which had come out relatively recently.”

In Lair of Squid, you’re trapped in an underwater labyrinth, seeking a way out while avoiding squid roaming the corridors. A collision with any cephalopod results in death. To progress through each stage and ascend to the surface, you locate the exit and provide a hidden, scrambled code word. The password is initially displayed as asterisks, with letters revealed as you encounter them within the maze.
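
For flavor, here’s a minimal Python sketch of that reveal-as-you-explore display. The code word and logic are our own illustration; Gryc’s actual maze.exe internals aren’t documented here:

```python
# The exit code word starts fully masked; each letter found in the maze
# unmasks every occurrence of that letter. Hypothetical example, not
# Gryc's actual implementation.

def masked_word(word: str, found: set[str]) -> str:
    return "".join(ch if ch in found else "*" for ch in word)

password = "KRAKEN"             # made-up code word
found: set[str] = set()
for letter in ["K", "A", "N"]:  # letters encountered while exploring
    found.add(letter)
    print(masked_word(password, found))
# Prints: K**K**  then  K*AK**  then  K*AK*N
```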

A photo of Lair of Squid running on the author’s HP 200LX, shortly after the moment of discovery.

Benj Edwards

Buckle up for a tale of rogue coding, cephalopod obsession, and the most unexpected Easter egg in palmtop history. This is no fish story—it’s the saga of Lair of Squid.

A computer in the palm of your hand

Introduced in 1994, the HP 200LX palmtop PC put desktop functionality in a pocket-size package. With a small QWERTY keyboard, MS-DOS compatibility, and a suite of productivity apps, the clamshell 200LX offered a vision of one potential future of mobile computing. It featured a 7.91 MHz 80186 CPU, a monochrome 640×200 CGA display, and 1–4 megabytes of RAM.

The cover of the HP 200LX User's Guide (1994).

Hewlett Packard

I’ve collected vintage computers since 1993, and people frequently offer to send me old devices they’d rather not throw away. Recently, a former HP engineer sent me his small but nice collection of ’90s HP handheld palmtop computers, including a 95LX (1991), 100LX (1993), and 200LX.

HP designed its portable LX series to run many MS-DOS programs that feature text mode or CGA graphics, and each includes built-in versions of the Lotus 1-2-3 spreadsheet, a word processor, terminal program, calculator, and more.

I owned a 95LX as a kid (a hand-me-down from my dad’s friend), which came with a simplistic overhead maze game called TigerFox. So imagine my surprise in 2024, when trawling through the productivity and personal organization apps on that 200LX, to find a richly detailed first-person maze game based around cephalopods, of all things.

(I was less surprised to find an excellent built-in Minesweeper clone, Hearts and Bones, which is definitely a more natural fit for the power and form of the 200LX itself.)

Lair of Squid isn’t a true Doom clone since it’s not a first-person shooter (in some ways, it’s more like a first-person Pac-Man without pellets), but its mere existence—on a black-and-white device best suited for storing phone numbers and text notes—deserves note as one of the weirdest and most interesting pack-in games to ever exist.

Just after discovering Lair of Squid on my device earlier this year, I tweeted about it, and I extracted the file for the game (called “maze.exe”) from the internal ROM drive and sent it to DOS gaming historian Anatoly Shashkin, who put the game on The Internet Archive so anyone can play it in their browser.

After that, I realized that I wanted to figure out who wrote this quirky game, and thanks to a post on RGB Classic Games, I found a name: Andy Gryc. With some luck in cold-emailing, I found him.

The new Riven remake is even better than Myst

A bridge to a mysterious island —

The same gorgeous vistas return in the Riven remake.

Samuel Axon

A remake of Riven: The Sequel to Myst launched this week, made by the original game’s developers. It strikes a fascinating balance between re-creation and reinvention, and based on a couple of hours of playing it, it’s a resounding success.

Myst was the classic most people remembered fondly from the early CD-ROM era, but for me, its sequel, Riven, was the highlight. The entries that followed declined in quality; the sophomore effort was the apex.

It was certainly more ambitious than Myst. Instead of a handful of tightly packed theme park worlds, it offered a singular, cohesive one that felt lived in and steeped in history in a way that Myst couldn’t quite match.

A worthy presentation

That was thanks to outstanding art direction but also to its iconic musical score.

For the most part, the remake nails both of those things. While the original game resembled the first Myst in that you clicked through static images to explore the game’s world, the new one follows the 2020 Myst remake (and 2000’s oft-forgotten realMyst) in giving the player full movement, akin to contemporary first-person puzzle games like Portal, The Witness, or The Talos Principle. Since it’s easy to re-create a lot of the original camera angles this way, an option to control the game the way you did originally might have been a nice touch, but I can see why that wasn’t a priority.

The environments are just as atmospheric and detailed as they used to be.

Samuel Axon

It just so happens that today’s graphics hardware does an outstanding job of replicating previously static visuals in full 3D. (There’s even VR support, though I haven’t tried it yet.) And the music is just as good as it used to be.

There are only two downsides on the presentation front. First, I’ve heard that folks running on older machines may struggle to achieve satisfactory fidelity and performance. I played it on both an M1 Max MacBook Pro and a Windows 11 desktop with an AMD Ryzen 9 5900X and an Nvidia GeForce RTX 3080. The MacBook Pro ran the game at maxed-out settings at the laptop’s native resolution at around 30 frames per second. The desktop did the same at 4K at 120 fps. But those are both high-end, recent-ish machines, so your mileage may vary.

Second, the full-motion video performances in the original game have been replaced with full 3D, video game-looking characters. It’s a necessary concession, but I feel some of the character was lost. They did a pretty good job matching the motions of the original videos, though.

  • The original’s FMV performances have been replaced by respectable but still video game-ish 3D models.

    Samuel Axon

  • The fictional animals fare a bit better visually.

    Samuel Axon

Patent document showcases the cloud-only streaming Xbox console that never was

keystone revealed —

Microsoft couldn’t get the price of its streaming Xbox low enough to release it.

  • The streaming-only Xbox would have looked like a smaller, squarer relative of the Series S.

    Microsoft

  • The console had cutouts on the bottom and back, presumably for air cooling.

    Microsoft

  • Front-mounted Xbox button and USB port, much like the Series S.

    Microsoft

  • Rear-mounted Ethernet, HDMI, and power. The console would likely have worked over Wi-Fi, too, but wired Ethernet does help with latency and consistency when streaming games.

    Microsoft

  • Controller sync button on the side.

    Microsoft

  • There was a logo and a Series S-ish circle on the top of the Keystone Xbox, but there are no cutouts depicted, so this may have been a stylistic choice rather than a place for the console to vent hot air.

    Microsoft

Microsoft’s mid-generation plans for the Xbox Series S and X consoles looked a whole lot different a couple of years ago than they do now. A leaked slide deck from the FTC v. Microsoft case last year outlined detailed plans for a spruced-up Series S, an overhauled Series X, and even a redesigned controller. Another part of that roadmap included a streaming-only version of the Xbox, codenamed Keystone, that was designed to connect to Microsoft’s Xbox Cloud Gaming servers rather than rendering games locally.

Microsoft has talked openly about this version of the Xbox before. Microsoft Gaming CEO Phil Spencer told The Verge that the Keystone console was designed and fully functional, but that it wasn’t launched because Microsoft had a hard time getting the price down low enough that it made sense next to the $299 Series S (which already occasionally goes on sale in the $200 to $250 range).

We’ve already seen glimpses of Keystone—once on Spencer’s shelf, and again in the FTC v. Microsoft documents. Both of those depictions were partial, or seen from a distance. But a new design patent document (PDF) unearthed by Windows Central shows even more detailed renderings of what the cloud Xbox would have looked like.

Series S meets Apple TV

Keystone’s styling was strongly reminiscent of the disc-drive-less Series S, with the same boxy white design and front-mounted Xbox button and USB port. There’s also a similar circular cutout on top, though it may not be an air vent as it is in the Series S—all of the holes depicted in the patent are on the back and bottom, and a streaming box certainly wouldn’t have needed the same cooling capacity as the AMD-designed CPU and GPU in the Series S.

The console also would have been square-shaped and considerably smaller than a Series S—not quite as small as a dedicated video-streaming box like an Apple TV or Roku Ultra, but not too far off either (the patent document doesn’t list dimensions, but we’ve done a rough size comparison using the HDMI and Ethernet ports on the Keystone box and an Apple TV 4K). The console’s controller sync button would have been mounted on its side, rather than in front, as it is on the Series S.

The cloud Xbox compared to a current-generation Apple TV 4K, with sizes roughly normalized based on the sizes of the HDMI and Ethernet ports. The Xbox console would have been a bit larger, but not dramatically so.

Apple/Microsoft/Andrew Cunningham
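
The normalization trick is simple enough to sketch: a standard HDMI Type A port is roughly 14 mm wide, so measuring the port in each image yields a per-image scale factor. The pixel values below are placeholders, not measurements from the patent filing:

```python
# Use a connector with a known physical width (an HDMI Type A receptacle
# is about 14 mm across) as a ruler to estimate a device's real dimensions
# from a photo or render. Pixel measurements here are made-up placeholders.

HDMI_WIDTH_MM = 14.0  # approximate width of a standard HDMI Type A port

def estimate_mm(feature_px: float, hdmi_px: float) -> float:
    """Convert a pixel measurement to millimeters via the HDMI port."""
    return feature_px * (HDMI_WIDTH_MM / hdmi_px)

hdmi_px = 52.0           # hypothetical: HDMI port width measured in the render
device_width_px = 640.0  # hypothetical: device width in the same render
print(f"Estimated device width: {estimate_mm(device_width_px, hdmi_px):.0f} mm")
```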

In the alternate reality of the FTC v. Microsoft slide deck, all of these new consoles and the new controller would have been announced or launched by now. But as Microsoft Gaming CEO Phil Spencer said shortly after those documents leaked, the company’s plans have changed substantially in the interim. A disc-less version of the Series X is coming, but it looks exactly like the current version of the console without a disc drive; Microsoft is also pursuing a strategy where it takes more of its internally developed games multi-platform, rather than restricting them to the Xbox and to Windows PCs. These moves are at least partially a response to Microsoft’s sliding console business, whose revenue has declined by double digits year over year for the last couple of years.

Neither Spencer nor Microsoft has ever said never about the Keystone console, leaving the door open to an eventual release if and when the price of manufacturing the console comes down. In the meantime, the streaming-only Xbox lives on as an app for newer Samsung smart TVs.

Listing image by Microsoft

Apple rejects PC emulators on the iOS App Store

I need my portable Number Munchers fix —

New iOS emulation rules only apply to “retro game consoles,” not retro computers.

Don’t get your hopes up—this iOS version of Doom was ported from open source code, not run via a classic PC emulator.

Earlier this year, Apple started officially allowing “retro game emulators” on the iOS App Store without the need for cumbersome jailbreaking or sideloading. But if you want to emulate retro PC games on your iOS device, you are apparently still out of luck.

In a recent blog update, iDOS developer Chaoji Li said that the latest version of the DOSBox-based MS-DOS emulator was finally rejected from the iOS App Store this month after a lengthy, two-month review process:

They have decided that iDOS is not a retro game console, so the new rule is not applicable. They suggested I make changes and resubmit for review, but when I asked what changes I should make to be compliant, they had no idea, nor when I asked what a retro game console is. It’s still the same old unreasonable answer along the line of “we know it when we see it.”

The developer of iOS Virtual Machine app UTM told a similar tale of App Store rejection on social media. The reported two-month review process for the UTM app ended with “the App Store review board determin[ing] that ‘PC is not a console’ regardless of the fact that there are retro Windows/DOS games fo[r] the PC that UTM SE can be useful in running,” the developer said.

The April revision of Rule 4.7 in Apple’s App Review Guidelines is very specifically worded so that “retro game console emulator apps can offer to download games [emphasis added].” Emulating a more generalized PC operating system falls outside the letter of this regulation, even for users interested in emulating retro PC games using these apps.

Since that narrow exception doesn’t apply to classic PC emulators, they end up falling afoul of Apple’s Rule 2.5.2, which states that iOS Apps may not “download, install, or execute code which introduces or changes features or functionality of the app, including other apps.” That rule also applies to third-party iOS App Stores that were recently allowed under new European Union rules, meaning even so-called “alternative app marketplaces” don’t offer a useful alternative in this case.

What’s the difference?

While the specific language of Apple’s App Review Guidelines is clear enough, the reasoning behind the distinction here is a bit more mystifying. Why does Apple treat the idea of a DOSBox-style emulator running an ancient copy of Microsoft Excel differently than the idea of Delta running a copy of NES Tetris on the same device? Is loading the Windows 95 Version of KidPix Studio Deluxe on your iPhone really all that different from playing an emulated copy of Mario Paint on that same iPhone?

Now that I can emulate Mario Paint on iOS, why would I buy Photoshop?

A virtual machine or emulator running a modern PC operating system under iOS could theoretically offer some generalized competition for the apps Apple offers in its official App Store. But surely there’s a limit to how much that applies when we’re talking about emulating older computing environments and defunct operating systems. Just as Apple’s iOS game emulation rules only apply to “retro” game consoles, a rule for PC emulation could easily be limited to “retro” operating systems (say, those that are no longer officially supported by their original developers, as a rule of thumb).

Alas, iOS users and App makers are currently stuck abiding by this distinction without a difference when it comes to PC game emulation on iOS. Those looking for a workaround could potentially use an iOS Remote Desktop App to access games running on a physical desktop PC they actually own. The Internet Archive’s collection of thousands of MS-DOS games will also run in an iOS web browser, though you may have to struggle a bit to get controls and sound working correctly.

The math on unplayed Steam “shame” is way off—and no cause for guilt

Steam Backlog Simulator 2024 —

It’s fun to speculate, but sales and library quirks make it impossible to know.

Blast away all the guilt you want in PowerWash Simulator, but there’s no need to feel dirty in the real world about your backlog.

Getty Images

Gaming news site PCGamesN has a web tool, SteamIDFinder, that can do a neat trick. If you buy PC games on Steam and have your user profile set to make your gaming details public, you can enter your numeric user ID into it and see a bunch of stats. One set of stats is dedicated to the total value of the games listed as unplayed; you can share this page as an image linking to your “Pile of Shame,” which includes the total “Value” of your Steam collection and unplayed games.

Example findings from SteamIDFinder, from someone who likely has hundreds of games from Humble Bundles and other deals in their library.

SteamIDFinder

Using data from what it claims are the roughly 10 percent of 73 million Steam accounts in its database set to Public, PCGamesN extrapolates $1.9 billion in unplayed games, multiplies it by 10, and casually suggests that there are $19 billion in unplayed games hanging around. That is “more than the gross national product of Nicaragua, Niger, Chad, or Mauritius,” the site notes.

That is a very loose “$19 billion”

“Multiply by 10” is already a pretty soft science, but the numbers are worth digging into further. For starters, SteamIDFinder is using the current sale price of every game in your unplayed library, as confirmed by looking at a half-dozen “Pile of Shame” profiles. An informal poll of Ars Technica co-workers and friends with notable Steam libraries suggests that games purchased at full price make up a tiny fraction of the games in our backlogs. Games acquired through package deals, like the Humble Bundle, or during one of Steam’s annual or one-time sales, are a big part of most people’s Steam catalogs, I’d reckon.

  • Step 1 to seeing your unplayed collection: Click the three-vertical-bar icon next to your Steam library to filter, choose “Games,” then “Group by Collection” …

    Andrew Cunningham

  • … And pick “Unplayed” as a Play State filter.

    Andrew Cunningham

Then there’s what counts as “Unplayed.” Clicking on the filtering tool next to my Steam library and choosing “Unplayed” suggests that I have 54 titles out of 173 total that I have never cracked open. My own manual count of my library is closer to 45. Steam and I disagree on whether I’ve launched and played Baldur’s Gate II: Enhanced Edition (I definitely did and was definitely overwhelmed), Mountain, and SteamWorld Dig. And Steam is definitely not counting games that you buy through Steam, mod in some way, and then launch directly through a Windows executable. I’m certain I’ve played some TIE Fighter: Total Conversion, just not through Valve’s channels. One Ars editor played Half-Life 2 multiple times from 2004–2007, but Steam says they’ve never played it, because it didn’t start counting gameplay hours until March 2009.
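
If you’d rather run your own numbers, Steam’s public Web API reports lifetime playtime per game (in minutes, and only since Steam began counting, so it shares the blind spots above). Here’s a rough sketch, assuming a free API key from steamcommunity.com/dev and a public profile; the key and ID below are placeholders:

```python
# Count games with zero recorded playtime in a Steam library.
import requests

API_KEY = "YOUR_STEAM_WEB_API_KEY"  # placeholder
STEAM_ID = "7656119XXXXXXXXXX"      # placeholder 64-bit Steam ID

resp = requests.get(
    "https://api.steampowered.com/IPlayerService/GetOwnedGames/v1/",
    params={"key": API_KEY, "steamid": STEAM_ID,
            "include_appinfo": 1, "include_played_free_games": 1},
    timeout=10,
)
resp.raise_for_status()
games = resp.json()["response"].get("games", [])
if not games:
    raise SystemExit("No games returned; is the profile public?")

# playtime_forever is lifetime minutes as recorded by Steam
unplayed = [g for g in games if g["playtime_forever"] == 0]
print(f"{len(unplayed)} of {len(games)} games "
      f"({len(unplayed) / len(games):.0%}) show zero recorded playtime")
```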

Even if they’re not dedicated tools, Steam libraries sometimes end up with little bits of game that you didn’t ask for and might never play, like Half-Life Deathmatch: Source. I have quite a few Star Wars games that I never intend to launch, because they were part of a bundle that got me Jedi Knight and Jedi Outcast for cheaper than either game cost on its own.

What “shame” really looks like

Curious as to what people’s backlogs look like, I asked friends and co-workers to run their own numbers after checking them for errors and oddities. Here’s the Ars list:

  • Kevin Purdy: 173 games, 45 unplayed (26 percent)
  • Lee Hutchinson: 361 games, 109 unplayed (30 percent)
  • Benj Edwards: 404 games, 148 unplayed (36.6 percent)
  • Andrew Cunningham: 172 games, 79 unplayed (46 percent)

Friends who did a check ended up at 25 percent, 40 percent, and 52 percent. So nobody I could easily poll had fewer than 25 percent of their games unplayed, and those with higher numbers tended to have bought into bundles, sales, add-ons, and other entry generators. And nobody thought their dollar value total made any sense at all, given the full-price math.

Back in 2014, Kyle Orland went deep on Steam statistics. Among games released since Steam started tracking hour counts in March 2009, 26 percent had never been played at that point, while another 19 percent had only been played for an hour or less. That’s roughly 45 percent of games having been played for an essentially token amount of time.

There is a much larger point to argue here, too: You do not have to feel “shame” about giving too much money to people making games, especially smaller games, if you do not want to. This applies to even broader understandings of “Unplayed,” like checking out an intro level or two. Sometimes playing a game for a little bit and deciding it’s not something you want to put dozens more hours into is worth it, whether or not you go for the refund.

If you’ve looked up your own stats and feel surprised, you can keep your unplayed games as a dedicated collection in Steam, and it might inspire you to check out the most intriguing left-behinds. Or, like me, filter that list further by the games that are Steam Deck Verified and bring them on your next trip.

You can usually make additional money more easily than additional life. Nobody is going to inherit your Steam library (probably), so it’s not really worth anything anyway. Play what interests you when you have the time, and if your unplayed count helps you stave off your worst sale impulse buys or rediscover lost gems, so be it. There are neat tricks, but there is no real math—and no real shame.

Decades later, John Romero looks back at the birth of the first-person shooter

Daikatana didn’t come up —

Id Software co-founder talks to Ars about everything from Catacomb 3-D to “boomer shooters.”

John Romero remembers the moment he realized what the future of gaming would look like.

In late 1991, Romero and his colleagues at id Software had just released Catacomb 3-D, a crude-looking, EGA-colored first-person shooter that was nonetheless revolutionary compared to other first-person games of the time. “When we started making our 3D games, the only 3D games out there were nothing like ours,” Romero told Ars in a recent interview. “They were lockstep, going through a maze, do a 90-degree turn, that kind of thing.”

Despite Catacomb 3-D‘s technological advances in first-person perspective, though, Romero remembers the team at id followed its release by going to work on the next entry in the long-running Commander Keen series of 2D platform games. But as that process moved forward, Romero told Ars that something didn’t feel right.

Catacomb 3-D is less widely remembered than its successor, Wolfenstein 3D.

“Within two weeks, [I was up] at one in the morning and I’m just like, ‘Guys we need to not make this game [Keen],'” he said. “‘This is not the future. The future is getting better at what we just did with Catacomb.’ … And everyone was immediately like, ‘Yeah, you know, you’re right. That is the new thing, and we haven’t seen it, and we can do it, so why aren’t we doing it?'”

The team started working on Wolfenstein 3D that very night, Romero said. And the rest is history.

Going for speed

What set Catacomb 3-D and its successors apart from other first-person gaming experiments of the time, Romero said, “was our speed—the speed of the game was critical to us having that massive differentiation. Everyone else was trying to do a world that was proper 3D—six degrees of freedom or representation that was really detailed. And for us, the way that we were going to go was a simple rendering at a high speed with good gameplay. Those were our pillars, and we stuck with them, and that’s what really differentiated them from everyone else.”

That focus on speed extended to id’s development process, which Romero said was unrecognizable compared to even low-budget indie games of today. The team didn’t bother writing out design documents laying out crucial ideas beforehand, for instance, because Romero said “the design doc was next to us; it was the creative director… The games weren’t that big back then, so it was easy for us to say, ‘this is what we’re making’ and ‘things are going to be like this.’ And then we all just work on our own thing.”

John Carmack (left) and John Romero (second from right) pose with their id Software colleagues in the early '90s.

The early id designers didn’t even use basic development tools like version control systems, Romero said. Instead, development was highly compartmentalized between different developers; “the files that I’m going to work on, he doesn’t touch, and I don’t touch his files,” Romero remembered of programming games alongside John Carmack. “I only put the files on my transfer floppy disk that he needs, and it’s OK for him to copy everything off of there and overwrite what he has because it’s only my files, and vice versa. If for some reason the hard drive crashed, we could rebuild the source from anyone’s copies of what they’ve got.”

Apple Intelligence and other features won’t launch in the EU this year

DMA —

iPhone Mirroring and SharePlay screen sharing will also skip the EU for now.

Features like Image Playground won’t arrive in Europe at the same time as other regions.

Apple

Three major features in iOS 18 and macOS Sequoia will not be available to European users this fall, Apple says. They include iPhone screen mirroring on the Mac, SharePlay screen sharing, and the entire Apple Intelligence suite of generative AI features.

In a statement sent to the Financial Times, The Verge, and others, Apple says this decision is related to the European Union’s Digital Markets Act (DMA). Here’s the full statement, which was attributed to Apple spokesperson Fred Sainz:

Two weeks ago, Apple unveiled hundreds of new features that we are excited to bring to our users around the world. We are highly motivated to make these technologies accessible to all users. However, due to the regulatory uncertainties brought about by the Digital Markets Act (DMA), we do not believe that we will be able to roll out three of these features — iPhone Mirroring, SharePlay Screen Sharing enhancements, and Apple Intelligence — to our EU users this year.

Specifically, we are concerned that the interoperability requirements of the DMA could force us to compromise the integrity of our products in ways that risk user privacy and data security. We are committed to collaborating with the European Commission in an attempt to find a solution that would enable us to deliver these features to our EU customers without compromising their safety.

It is unclear from Apple’s statement precisely which aspects of the DMA may have led to this decision. It could be that Apple is concerned that it would be required to give competitors like Microsoft or Google access to user data collected for Apple Intelligence features and beyond, but we’re not sure.

This is not the first recent and major divergence between functionality and features for Apple devices in the EU versus other regions. Because of EU regulations, Apple opened up iOS to third-party app stores in Europe, but not in other regions. However, critics argued its compliance with that requirement was lukewarm at best, as it came with a set of restrictions and changes to how app developers could monetize their apps on the platform should they use those other storefronts.

While Apple says in the statement it’s open to finding a solution, no timeline is given. All we know is that the features won’t be available on devices in the EU this year. They’re expected to launch in other regions in the fall.

Why Interplay’s original Fallout 3 was canceled 20+ years ago

The path untaken —

OG Fallout producer says “Project Van Buren” ran out of time and money.

What could have been.

PC gamers of a certain vintage will remember tales of Project Van Buren, a title that early ’00s Interplay intended as the sequel to 1998’s hit Fallout 2. Now, original Fallout producer Timothy Cain is sharing some behind-the-scenes details about how he contributed to the project’s cancellation during a particularly difficult time for publisher Interplay.

Cain famously left Interplay during Fallout 2‘s development in the late ’90s to help form short-lived RPG house Troika Games. After his departure, though, he was still in touch with some people from his former employer, including an unnamed Interplay vice president looking for some outside opinions on the troubled Van Buren project.

“Would you mind coming over and playing one of my game prototypes?” Cain recalls this vice president asking him sometime in mid-2003. “We’re making a Fallout game and I’m going to have to cancel it. I don’t think they can get it done… but if you could come over and look at it and give me an estimate, there’s a chance I wouldn’t cancel it.”

Cain discusses his memories of testing “Project Van Buren.”

Cain recalls walking “across the street” from Troika to the Interplay offices, motivated to help because, as he remembers it, “if you don’t do it, bad things will happen to other people.” There, he got to see the latest build of Project Van Buren, running on the 3D Jefferson Engine that was intended to replace the sprite-based isometric view of the first two Fallout games. Cain said the version he played was similar or identical to a tech demo obtained by fan site No Mutants Allowed in 2007 and featured in a recent YouTube documentary about the failed project.

After playing for about two hours and talking to the team behind the project, Cain said the VP asked him directly how long the demo needed to become a shippable game. The answer Cain reportedly gave—18 months of standard development for “a really good game” or 12 months of “death march” crunch time for an unbalanced, buggy mess—was too long for the financially strapped publisher to devote to funding the project.

“He could not afford a development period of more than six months,” Cain said. “To me, that time frame was out of the question… He thought it couldn’t be done in six months; I just confirmed that.”

Show me the money

Looking back today, Cain said it’s hard to pinpoint a single “villain” responsible for Van Buren’s failure. Even reusing the engine from the first Fallout game—as the Fallout 2 team did for that title’s quick 12-month development process—wouldn’t have necessarily helped, Cain said. “Would that engine have been acceptable five years later [after Fallout 2]?” he asked rhetorically. “Had anyone really looked at it? I started the engine in 1994… it’s creaky.”

Real “Van Buren”-heads will enjoy this in-depth look at the game’s development, including details of Interplay’s troubled financial situation in the early ’00s.

In the end, Van Buren’s cancellation (and that of a planned Interplay Fallout MMO years later) simply “comes down to money,” Cain said. “I do not believe that [with] the money they had left, the game in the state it was in, and the people who were working on it could have completed it within six months,” he said. “And [if they did], I don’t think it would have been a game you would have liked playing.”

Luckily, the shuttering of Interplay in all but name in the years after Van Buren’s cancellation wouldn’t spell the end of the Fallout series. Bethesda acquired the license in 2007, leading to a completely reimagined Fallout 3 that has become the cornerstone of a fan-favorite franchise many years later. But for those still wondering what Interplay’s original “Fallout 3” could have been, a group of fans is trying to rebuild the Project Van Buren demo from the ground up for modern audiences.

From Infocom to 80 Days: An oral history of text games and interactive fiction

Zork running on an Amiga at the Computerspielemuseum in Berlin, Germany.

You are standing at the end of a road before a small brick building.

That simple sentence first appeared on a PDP-10 mainframe in the 1970s, and the words marked the beginning of what we now know as interactive fiction.

From the bare-bones text adventures of the 1980s to the heartfelt hypertext works of Twine creators, interactive fiction is an art form that continues to inspire a loyal audience. The community for interactive fiction, or IF, attracts readers and players alongside developers and creators. It champions an open source ethos and a punk-like individuality.

But whatever its production value or artistic merit, at heart, interactive fiction is simply words on a screen. In this time of AAA video games, prestige television, and contemporary novels and poetry, how does interactive fiction continue to endure?

To understand the history of IF, the best place to turn for insight is the authors themselves. Not just the authors of notable text games—although many of the people I interviewed for this article do have that claim to fame—but the authors of the communities and the tools that have kept the torch burning. Here’s what they had to say about IF and its legacy.

Examine roots: Adventure and Infocom

The interactive fiction story began in the 1970s. The first widely played game in the genre was Colossal Cave Adventure, also known simply as Adventure. The text game was made by Will Crowther in 1976, based on his experiences spelunking in Kentucky’s aptly named Mammoth Cave. Descriptions of the different spaces would appear on the terminal, then players would type in two-word commands—a verb followed by a noun—to solve puzzles and navigate the sprawling in-game caverns.
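
That verb-noun grammar is simple enough to sketch in a few lines of Python; the vocabulary here is a toy example, not Crowther’s actual word list:

```python
# A toy two-word parser in the Adventure style: split the input into a
# verb and a noun and dispatch. (The real Adventure also accepted some
# one-word commands like "xyzzy"; this sketch deliberately doesn't.)

ACTIONS = {
    ("go", "north"): "You walk north along the road.",
    ("enter", "building"): "You are inside a small brick building.",
    ("take", "lamp"): "Taken.",
}

def parse(command: str) -> str:
    words = command.lower().split()
    if len(words) != 2:
        return "I need a verb and a noun."
    return ACTIONS.get((words[0], words[1]), "I don't understand that.")

print(parse("enter building"))  # -> You are inside a small brick building.
print(parse("xyzzy"))           # one word: rejected by this two-word grammar
```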

During the 1970s, getting the chance to interact with a computer was a rare and special thing for most people.

“My father’s office had an open house in about 1978,” IF author and tool creator Andrew Plotkin recalled. “We all went in and looked at the computers—computers were very exciting in 1978—and he fired up Adventure on one of the terminals. And I, being eight years old, realized this was the best thing in the universe and immediately wanted to do that forever.”

“It is hard to overstate how potent the effect of this game was,” said Graham Nelson, creator of the Inform language and author of the landmark IF Curses, of his introduction to the field. “Partly that was because the behemoth-like machine controlling the story was itself beyond ordinary human experience.”

Perhaps that extraordinary factor is what sparked the curiosity of people like Plotkin and Nelson to play Adventure and the other text games that followed. The roots of interactive fiction are entangled with the roots of the computing industry. “I think it’s always been a focus on the written word as an engine for what we consider a game,” said software developer and tech entrepreneur Liza Daly. “Originally, that was born out of necessity of primitive computers of the ’70s and ’80s, but people discovered that there was a lot to mine there.”

Home computers were just beginning to gain traction as Stanford University student Don Woods released his own version of Adventure in 1977, based on Crowther’s original Fortran work. Without wider access to comparatively pint-sized machines like the Apple II and the VIC-20, Scott Adams might not have found an audience for his own text adventure games, released under his company Adventure International (its name another homage to Crowther). As computers spread to more people around the world, interactive fiction was able to reach more and more readers.
