Tech


Asahi Linux’s bespoke GPU driver is running Windows games on Apple Silicon Macs

A few years ago, the idea of running PC games on a Mac, in Linux, or on Arm processors would have been laughable. But the developers behind Asahi Linux—the independent project that is getting Linux working on Apple Silicon Macs—have managed to do all three of these things at once.

The feat brings together a perfect storm of open source projects, according to Asahi Linux GPU lead Alyssa Rosenzweig: the FEX project to translate x86 CPU code to Arm, the Wine project to get Windows binaries running on Linux, DXVK and the Proton project to translate DirectX API calls into Vulkan API calls, and of course the Asahi project’s Vulkan-conformant driver for Apple’s graphics hardware.

Games technically run inside a virtual machine because of differences in how Apple Silicon and x86 systems address memory: Apple’s systems use 16 KB memory pages, while x86 systems expect 4 KB pages. That mismatch regularly causes problems for Asahi and some other Arm Linux distros, and the VM is what bridges the gap.
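
The page-size mismatch is easy to see firsthand. A minimal Python sketch (nothing project-specific is assumed; it just asks the kernel for its page size, which should report 16 KB on Asahi and 4 KB on a typical x86 Linux box):

```python
import mmap

# Ask the kernel for its memory page size. On Asahi Linux on Apple Silicon
# this prints 16384 bytes (16 KB); on a typical x86 Linux system it prints
# 4096 bytes (4 KB). x86 Windows games generally assume 4 KB pages, which is
# part of why this whole stack runs inside a VM that provides them.
page_size = mmap.PAGESIZE
print(f"Page size: {page_size} bytes ({page_size // 1024} KB)")
```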

You’d never guess that this was the Windows version of Fallout 4 running on a Mac that was running Linux. Credit: Alyssa Rosenzweig

Rosenzweig’s post shows off screenshots of Control, Fallout 4, The Witcher 3, Ghostrunner, Cyberpunk 2077, Portal 2, and Hollow Knight, though as she notes, most of these games won’t run at anywhere near 60 frames per second yet.



Trek CarBack bike radar lets you know when cars are approaching

“Car back!”

If you’ve ever been on a group bike ride, you’ve no doubt heard these two words shouted by a nearby rider. It’s also the name of Trek’s new bike radar.

For safety-conscious cyclists, bike radars have been a game-changer. Usually mounted on the seat post, the radar units alert cyclists to cars approaching from behind. While they will work on any bike on any road, bike radar is most useful in suburban and rural settings. After all, if you’re doing some urban bike commuting, you’ll just assume cars are behind you because that’s how it is. But on more open roads with higher speed limits or free-flowing traffic, bike radars are fantastic.

While a handful of companies make them, the Garmin Varia is the best-known and most popular option. The Varia is so popular that it is nearing the proprietary eponym status of Kleenex and Taser among cyclists. Trek hopes to change that with its new CarBack bike radar.

Like other bike radars, the CarBack can be used with either a cycling computer or your smartphone. Mounted either on a seat post or the back of a Bontrager saddle, the CarBack can detect vehicles approaching from as far away as 150 meters, beeping at you once one is in its range.

The CarBack plays just as nicely with Garmin bike computers as the Varia does. When a car comes within range, your bike computer chirps, the edges of the screen turn orange, and a dot showing the car’s relative position travels up the right side of the screen—exactly the same as when riding with a Varia.

Speaking of the Varia, there are three significant differences between it and the CarBack. The first is the effective range, 140 meters for the Varia versus the CarBack’s 150 meters. While riding, I didn’t have the feeling that I was getting alerts sooner. But testing on a busy street demonstrated that the CarBack does have at least a few more meters of range than the Varia.
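
To put those ranges in perspective, here is a rough back-of-envelope estimate of how much warning time each radar buys. The 60 km/h closing speed is an illustrative assumption, not a figure from Trek or Garmin:

```python
# Rough warning-time estimate for each radar's detection range at an assumed
# closing speed (car speed minus bike speed). The 60 km/h figure is
# illustrative only, not a manufacturer spec.
closing_speed_kmh = 60
closing_speed_ms = closing_speed_kmh * 1000 / 3600  # about 16.7 m/s

for name, range_m in [("Garmin Varia", 140), ("Trek CarBack", 150)]:
    warning_s = range_m / closing_speed_ms
    print(f"{name}: {range_m} m of range -> about {warning_s:.1f} s of warning")
```

At that closing speed, the CarBack’s extra 10 meters works out to a bit over half a second of additional warning.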



Intel’s Core Ultra 200S CPUs are its biggest desktop refresh in three years


CPUs bring Core Ultra features to desktops, with similar performance caveats.

Intel’s 14th-generation desktop processors were a mild update on top of a mild update: a barely faster revision of the 13th-gen Core CPUs, which were themselves a modest tweak to 2021’s 12th-gen Core processors. The new Core Ultra CPUs (and their underlying architectural changes) were exclusive to laptops.

Today, that changes: The Core Ultra 200S processors (codenamed Arrow Lake) will bring to desktops many of the changes Intel has made to its Core Ultra 100- and 200-series laptop CPUs (Meteor Lake and Lunar Lake, respectively). Changes include a new chiplet-based design, new manufacturing technologies, updated CPU and GPU architectures, and a neural processing unit (NPU) for accelerating some AI and machine learning workloads.

All of the new processors launch on October 24th.

As with the Lunar Lake-based laptop chips, Intel has said that power efficiency is a big focus for Arrow Lake—a welcome change after seeing how much power the 13th- and 14th-generation CPUs could consume when they were allowed. But also as with the laptop processors, the Core Ultra desktop CPUs aren’t always a straightforward performance upgrade from their predecessors—they’re usually faster, but how much faster depends a lot on what you’re asking them to do, at least according to the Intel-provided performance figures.

Meet Arrow Lake

Pricing remains broadly similar to the 14th-generation CPUs when they launched (it’s generally down a few dollars, if anything). Credit: Intel

The big under-the-hood change to Arrow Lake is that it shifts to a chiplet-based design, where multiple silicon dies are bound together using Intel’s Foveros packaging technology. Foveros uses an Intel-manufactured “base tile” as an interconnect, allowing for communication between four TSMC-manufactured tiles: a compute tile for the CPU cores; a GPU tile for the graphics cores; an SoC tile that includes the NPU, video encoding and decoding blocks, and display outputs; and an I/O tile that mainly handles the DDR5 memory controller (Core Ultra 200S no longer supports DDR4, following AMD’s lead).

Like the Lunar Lake laptop chips, Arrow Lake will be an Intel-designed processor in which most of the silicon, aside from the base tile, won’t actually be made in Intel’s factories. The compute tile is manufactured on a 3 nm TSMC process, the GPU tile on a 5 nm TSMC process, and the SoC and I/O tiles on a 6 nm process.

Compared to 14th-generation processors, Intel says that Core Ultra 200S chips should provide a 10 percent increase in multi-core performance while using 30 percent less power. Though the integrated GPU will remain nothing to write home about, it should also be about twice as fast as the UHD 770 integrated GPU included in 12th-, 13th-, and 14th-generation Core chips.
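
Taken at face value, those two claims imply a sizable performance-per-watt improvement. A quick sanity check using only the Intel-provided percentages:

```python
# Implied multi-core performance-per-watt change, using only Intel's claimed
# figures: +10 percent performance at -30 percent power vs. 14th-gen parts.
perf_ratio = 1.10   # relative multi-core performance
power_ratio = 0.70  # relative power consumption

perf_per_watt = perf_ratio / power_ratio
print(f"Implied perf-per-watt: {perf_per_watt:.2f}x, "
      f"or about {(perf_per_watt - 1) * 100:.0f} percent better")
```

That works out to roughly a 1.57x efficiency improvement on paper, assuming both claims hold up in independent testing.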

Lower power usage when gaming is one of Intel’s biggest claims about Arrow Lake, which may help to offset the fact that performance doesn’t change much. Credit: Intel

Intel is announcing three distinct processors today, five if you count the GPU-less variants: the $589 Core Ultra 9 285K, the $394 and $379 Core Ultra 7 265K and KF, and the $309 and $294 Core Ultra 5 245K and KF. These are all unlocked, overclockable processors, and they differ primarily by clock speed and core count. The GPUs and NPUs are the same across the lineup for the chips that include GPUs.

The 245K has six P-cores and eight E-cores and tops out at 5.2 GHz, the 265K has eight P-cores and 12 E-cores and tops out at 5.5 GHz, and the 285K has eight P-cores and 16 E-cores and maxes out at 5.7 GHz. Those core counts match what Intel was offering in analogous 14th-generation CPUs, and pricing is also roughly in line with Intel’s initial list prices for the 14th-generation CPUs.

The fine print on the five Core Ultra 200S CPUs launching this month. Credit: Intel

The P-cores use Intel’s Lion Cove architecture, the same one Intel uses in Lunar Lake. Intel has removed Hyper-threading from these cores entirely, substantially lowering the overall thread count compared to 13th- and 14th-gen Core processors; Intel says the silicon space Hyper-threading needed is better spent elsewhere now that gobs of low-power E-cores are available to split up heavily threaded workloads. The company says Lion Cove delivers a 9 percent increase in instructions per clock over the previous-generation Raptor Cove P-cores, but because the Core Ultra CPUs run at slightly lower peak clock speeds, single-core performance should more or less break even.
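
To make the thread-count change concrete, here is a small comparison. The Arrow Lake core counts come from Intel’s announcement; the 13th/14th-gen figures are Intel’s previously published specs for the analogous parts, included only for reference:

```python
# Without Hyper-threading, every core contributes exactly one thread on
# Arrow Lake; 13th- and 14th-gen P-cores contributed two threads each.
def thread_count(p_cores, e_cores, hyperthreading):
    return p_cores * (2 if hyperthreading else 1) + e_cores

comparisons = [
    ("Core Ultra 9 285K", 8, 16, False), ("Core i9-14900K", 8, 16, True),
    ("Core Ultra 7 265K", 8, 12, False), ("Core i7-14700K", 8, 12, True),
    ("Core Ultra 5 245K", 6, 8, False),  ("Core i5-14600K", 6, 8, True),
]
for name, p, e, ht in comparisons:
    print(f"{name}: {p}P + {e}E = {thread_count(p, e, ht)} threads")
```

The 285K, for example, presents 24 threads to the operating system where the i9-14900K presented 32, despite identical core counts.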

Intel says P-core instructions-per-clock increase by about 9 percent, which in Arrow Lake is mostly wiped out by slightly lower peak clock speeds. Credit: Intel

Performance gains for the Skymont E-cores are a bit more pronounced; Intel says they’re 32 percent faster on average than the old Gracemont E-cores in integer workloads, 72 percent faster on average in single-core floating-point workloads, and 55 percent faster on average in multi-threaded floating-point workloads. Though you lose the performance benefits of Hyper-threading on the P-cores, these IPC improvements on the E-cores are where Intel is getting its claimed 10 percent generational performance boost over the 14th-generation CPUs.

In games—which generally benefit most from single-core performance improvements—Intel’s figures show that performance is basically a wash between the Core Ultra 200S chips and the 14th-generation Core processors. Sometimes Arrow Lake is a little faster, sometimes it’s a little slower, but on average, it’s about the same. But it achieves those frame rates using 73 W less power on average, and while running cooler by an average of 13 degrees Celsius.

No Copilot+ compatibility

Though these are Intel’s first desktop processors with NPUs included—AMD beat Intel to the punch here with the Ryzen 8000G series, though these are technically laptop silicon repackaged for desktop use—the NPU won’t be good enough to meet Microsoft’s requirements for Windows 11’s Copilot+ features.

Microsoft wants an NPU that can process at least 40 trillion operations per second (TOPS); Arrow Lake’s NPU offers 13 TOPS, barely more than the 11 TOPS that the Core Ultra 100-series laptop CPUs offered. That may be because it’s the same basic NPU—Intel just refers to this architecture as “NPU 3,” while the Core Ultra 200V laptop chips use NPU 4.

Though I don’t really see any of the current Copilot+ features as must-haves—right now, they’re mostly focused on image generation and webcam effects, and they’ll eventually power Windows’ controversial Recall feature—it is a little disappointing that we still don’t have a desktop processor that will support a superset of all Windows 11 features.

New chips do require a long lead time, and it’s possible (probable, even) that Arrow Lake’s design was finalized well before Microsoft defined its Copilot+ performance requirements back in May. But given that Intel worked Lunar Lake’s more advanced P-core and E-core architectures into the Arrow Lake desktop chips, it’s too bad that the newer NPU couldn’t also come along for the ride. Maybe next year.

GPU is better, still not for gaming

An overview of the Xe GPU. It’s a lot like the one in last year’s Meteor Lake laptop chips. Credit: Intel

Arrow Lake includes a better GPU than the old 12th-through-14th-generation Core CPUs, though like the NPU, it’s more similar to last year’s Meteor Lake Core Ultra 100-series GPU than the new GPU from the Lunar Lake laptop chips.

All three of the processors with GPUs use four Xe cores based on a version of the older “Alchemist” GPU architecture from the A-series Intel Arc GPUs. Like those Arc GPUs, the integrated Xe GPU here supports hardware-accelerated ray tracing, high-quality XeSS upscaling, and hardware-accelerated encoding and decoding for the AV1 video codec.

But despite these improvements and Intel’s claimed 2x performance increase compared to the UHD 770 integrated GPU, this still isn’t meant for anything other than basic low-end gaming. In high-end desktops, it will mainly be useful for driving additional displays on top of whatever your dedicated GPU can handle. (It’s not clear whether the KF-series processors without GPUs will feature hardware-accelerated video encoding and decoding; they typically don’t, but only because those features are normally a part of the GPU. In Arrow Lake, that hardware is in the SoC tile instead.)

A new chipset and socket

Intel’s desktop platforms will continue to use a chipset that’s totally separate from the CPU package, unlike its laptop chips, which combine the CPU, GPU, and chipset functions in a single package. The new 800-series chipsets can support “up to” 24 PCIe 4.0 lanes, which can be used for M.2 SSDs and various ports depending on how your motherboard maker decides to use them.

The chipsets will support up to two integrated Thunderbolt 4 ports—the first time these have been integrated directly into a desktop chipset rather than requiring a separate controller—plus up to 10 USB 3.2 ports with transfer speeds of up to 20 Gbps, along with Wi-Fi 6E, Bluetooth 5.3, and 1 Gbps Ethernet support. Some motherboard makers are already advertising Z890 motherboards, which will be required if you want to do any CPU overclocking—lower-end 800-series chipsets haven’t been announced yet, but expect more affordable versions to come with fewer PCIe lanes and port options.

While 12th-, 13th-, and 14th-generation Core CPUs all used the same LGA1700 processor socket and could all work in any 600- or 700-series motherboard as long as you’d installed a BIOS update, this year’s new 800-series chipsets come with an all-new LGA1851 socket. According to announcements from CPU cooler manufacturers like Noctua and Arctic, most coolers that work with LGA1700-series CPUs and motherboards should also be compatible with LGA1851, though you should check your manufacturer’s website to make sure you don’t need some kind of adapter or bracket for installation.




Amazon, Apple make a deal to offer Apple TV+ in a Prime bundle

The Apple TV platform, tvOS, and the original Apple TV app were initially intended to solve this problem by offering an a la carte, consumer-friendly way to manage the options in a burgeoning streaming-TV industry.

However, Apple’s attempt to make the TV app a universal hub of content has been continually stymied by the fact that industry giant Netflix has declined to participate.

Users of the TV app and the Apple TV set-top box still must launch a separate Netflix app to see their watch history on that service or to check whether movies or shows they want to watch are available. Content from most other services—including Amazon Prime Video—is searchable within the app and rolls into a unified watch history.

Fighting to succeed in a messy business

Further, streaming services have become increasingly expensive, and streamers have begun trying to find new revenue from sources like bundles and advertising. The reasons for these trends are complex, but one of the key problems is that scripted television content is immensely expensive to produce—especially as the prestige TV era has driven up viewer expectations in terms of quality and production values.

As an early leader in the industry, Netflix established unrealistic expectations for everyone involved—consumers, production houses, investors, and so on—by simply throwing immense amounts of money into content without immediately seeing a return.

When larger economic factors put an end to that practice, streamers had to adjust—including Apple, which among other things is tweaking its film strategy for the new landscape.

Apple still offers several of those central hub features—for example, you can subscribe to services like Paramount+ and launch their shows from the Apple TV app, just like Amazon is doing with its app and Apple TV+ here. But the realities of the mess the industry finds itself in have clearly led Apple to keep an open mind about how it can attract and retain viewers.



Former Apple hardware chief Dan Riccio is retiring

Dan Riccio, one of Apple’s most prominent executives for more than two decades, will retire from the company this month, according to a report in Bloomberg that cites people with knowledge of the move.

Reportedly, Riccio has said he has been planning his retirement for the past five years, and his last day will be Friday, October 11.

Riccio began working at Apple in 1998, and by 2012, he had become the chief of hardware engineering. In that role, he oversaw several major hardware developments for Apple, including AirPods, the evolution of the modern iPhone, the iPad Pro, and more.

He held the title of senior vice president of hardware engineering during that time, then moved into a new role within the company in January 2021. At first, the public knew only that he was working on a “new project,” but before long it became clear that the project in question was the Vision Pro, Apple’s augmented-reality headset, which launched this February.

The group that produced the Vision Pro is called the Vision Products Group within the company; that’s the 2,000-engineer-strong group Riccio has overseen since 2021. He was also involved in developing Project Titan, Apple’s smart car initiative that was eventually abandoned.



Man learns he’s being dumped via “dystopian” AI summary of texts

The evolution of bad news via texting

Spreen’s message marks the first time we’ve seen an AI-mediated relationship breakup, but it likely won’t be the last. As the Apple Intelligence feature rolls out widely and other tech companies embrace AI message summarization, many people will probably be receiving bad news through AI summaries soon. For example, since March, Google’s Android Auto AI has been able to deliver summaries to users while driving.

If that sounds horrible, consider our ever-evolving social tolerance for tech progress. Back in the 2000s when SMS texting was still novel, some etiquette experts considered breaking up a relationship through text messages to be inexcusably rude, and it was unusual enough to generate a Reuters news story. The sentiment apparently extended to Americans in general: According to The Washington Post, a 2007 survey commissioned by Samsung showed that only about 11 percent of Americans thought it was OK to break up that way.

What texting looked like back in the day.

By 2009, as texting became more commonplace, the stance on texting break-ups began to soften. That year, ABC News quoted Kristina Grish, author of “The Joy of Text: Mating, Dating, and Techno-Relating,” as saying, “When Britney Spears dumped Kevin Federline I thought doing it by text message was an abomination, that it was insensitive and without reason.” Grish was referring to a 2006 incident with the pop singer that made headline news. “But it has now come to the point where our cell phones and BlackBerries are an extension of ourselves and our personality. It’s not unusual that people are breaking up this way so much.”

Today, with text messaging basically being the default way most adults communicate remotely, breaking up through text is commonplace enough that Cosmopolitan endorsed the practice in a 2023 article. “I can tell you with complete confidence as an experienced professional in the field of romantic failure that of these options, I would take the breakup text any day,” wrote Kayle Kibbe.

Who knows, perhaps in the future, people will be able to ask their personal AI assistants to contact their girlfriend or boyfriend directly to deliver a personalized break-up for them with a sensitive message that attempts to ease the blow. But what’s next—break-ups on the moon?

This article was updated at 3:33 PM on October 10, 2024 to clarify that the ex-girlfriend’s full real name has not been revealed by the screenshot image.



Maze of adapters, software patches gets a dedicated GPU working on a Raspberry Pi

Actually getting the GPU working required patching the Linux kernel to include the open-source AMDGPU driver, which includes Arm support and provides decent support for the RX 460. (Geerling says the card and its Polaris architecture were chosen because they are new enough to be practically useful and to be supported by the AMDGPU driver, old enough that driver support is pretty mature, and because the card is cheap and uses PCIe 3.0.) Nvidia’s GPUs generally aren’t an option for projects like this because the open source drivers lag far behind the ones available for Radeon GPUs.

Once various kernel patches were applied and the kernel was recompiled, installing AMD’s graphics firmware got both graphics output and 3D acceleration working more or less normally.
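
For anyone attempting something similar, a quick way to confirm that a patched kernel actually picked up the card is to check for the amdgpu module and see which driver binds each DRM device. A minimal sketch, assuming only a standard Linux /proc and /sys layout (the paths are generic, not specific to Geerling’s setup):

```python
from pathlib import Path

# Is the amdgpu kernel module loaded? /proc/modules lists one module per line,
# with the module name as the first field.
modules = Path("/proc/modules").read_text().splitlines()
print("amdgpu module loaded:", any(line.split()[0] == "amdgpu" for line in modules))

# Which DRM devices does the kernel expose, and which driver binds each one?
# On a Pi 5 with a working RX 460 you would expect to see the integrated GPU
# (v3d/vc4) alongside a card bound to the amdgpu driver.
for card in sorted(Path("/sys/class/drm").glob("card[0-9]*")):
    if "-" in card.name:  # skip connector entries like card1-HDMI-A-1
        continue
    uevent = card / "device" / "uevent"
    if uevent.exists():
        driver = next((line.split("=", 1)[1] for line in uevent.read_text().splitlines()
                       if line.startswith("DRIVER=")), "unknown")
        print(f"{card.name}: driver={driver}")
```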

Despite their age and relative graphical simplicity, Doom 3 and Tux Racer are a tall order for the Pi 5’s integrated GPU, even at 1080p. The RX 460 was able to run both at 4K, albeit with some settings reduced; Geerling also said that the card rendered the Pi operating system’s UI smoothly at 4K (the Pi’s integrated GPU does support 4K output, but things get framey quickly in our experience, especially when using multiple monitors).

Though a qualified success, anything this hacky is likely to have at least some software problems; Geerling noted that graphics acceleration in the Chromium browser and GPU-accelerated video encoding and decoding support weren’t working properly.

Most Pi owners aren’t going to want to run out and recreate this setup themselves, but it is interesting to see progress when it comes to using dedicated GPUs with Arm CPUs. So far, Arm chips across all major software ecosystems—including Windows, macOS, and Android—have mostly been restricted to using their own integrated GPUs. But if Arm processors are really going to compete with Intel’s and AMD’s in every PC market segment, we’ll eventually need to see better support for external graphics chips.



Thunderbird Android client is K-9 Mail reborn, and it’s in solid beta

Thunderbird’s Android app, which is actually the K-9 Mail project reborn, is almost out. You can check it out a bit early in a beta that will feel pretty robust to most users.

Thunderbird, maintained by the Mozilla Foundation subsidiary MZLA, acquired the source code and naming rights to K-9 Mail, as announced in June 2022. The group also brought K-9 maintainer Christian Ketterer (or “cketti”) onto the project. Their initial goals, before a full rebrand into Thunderbird, involved importing Thunderbird’s automatic account setup, message filters, and mobile/desktop Thunderbird syncing.

At the tail end of 2023, however, Ketterer wrote on K-9’s blog that the punchlist of items before official Thunderbird-dom was taking longer than expected. But when it’s fully released, Thunderbird for Android will have those features. As such, beta testers are asked to check out a specific list of things to see if they work, including automatic setup, folder management, and K-9-to-Thunderbird transfer. The beta will not be “addressing longstanding issues,” Thunderbird’s blog post notes.

Launching Thunderbird for Android from K-9 Mail’s base makes a good deal of sense. Thunderbird’s desktop client has had a strange, disjointed life so far and is only just starting to regain a cohesive vision for what it wants to provide. For a long time now, K-9 Mail has been the Android email of choice for people who don’t want Gmail or Outlook, will not tolerate the default “Email” app on non-Google-blessed Android systems, and just want to see their messages.



Apple brings years-old features to iCloud web interface

In a rare event, Apple has rolled out substantial updates to the web-based iCloud interface meant to allow users to access Apple services like Mail and Photos when they’re away from a Mac, iPad, or iPhone.

The flagship addition is dark mode; it “will automatically match your device settings with a Light Mode or Dark Mode color scheme,” Apple explains as part of the update.

There is also now a way to customize the background for the iCloud web interface—specifically, you can choose between several colors.

A few apps received features that have been available on iOS and macOS for a while. For example, the Notes web app now supports pinned notes, and iCloud Drive supports shared views.

If you think all that seems like it’s pretty basic and late to the game, you’re not wrong.

The iCloud web interface has long seemed like an afterthought for Apple, and it has always been far behind Apple’s native software platforms in terms of features. How far behind? Well, consider this: dark mode was added to iOS way back in iOS 13, which shipped in 2019.

Apple’s narrative to investors has long said that its services like iCloud are key to making up for slowed hardware sales in the mature smartphone market. To that end, the company has made this web interface available and has brought some of its services like Music and TV+ to other platforms like Windows and Android.

However, there seem to be limits to that. As noted, iCloud for web has historically been a subpar experience, and other key services like Messages have not been made available on other platforms at all, possibly to avoid losing the social lock-in advantage of Messages for iOS. (Messages is notably absent in the web app.)

Still, it’s nice to see any movement at all here. While iCloud.com gets infrequent and small updates, it remains actively supported at a basic level.



Samsung quits updating Galaxy Z Fold 2 that came out in 2020 for $2,000

In February 2022, Samsung started promising up to four generations of Android OS and One UI upgrades to “select” Galaxy devices, as well as “up to five years of security updates.” And in January, it announced moving to seven years of security and OS updates, matching a move from Google. However, the Fold 2 wasn’t included in Samsung’s list of “select” Galaxy devices.

Thus, one could have estimated that the Fold 2 might stop receiving OS and security updates by 2024, four years after its debut. But that’s still hard to square with paying four figures for a phone that becomes a security risk after four years despite otherwise functioning properly. Apple, by comparison, now promises at least five years of security updates, a promise it only started making in 2023 with the iPhone 15 series. However, the current-generation iOS 18 is supported by iPhones released in 2020, like the second-generation iPhone SE, and even older models, like the iPhone XR that came out in 2018.

Arguably, short-lived expensive devices like the Fold 2 are part of the reason some activists are pushing for the FTC to require that smart devices state on their packaging how long they’ll receive updates.

However, unlike iPhones, Samsung phones aren’t all powered by a proprietary chip, so promising years of upgrades requires commitments from third-party vendors like Qualcomm. Given that Qualcomm has resisted longer chip life cycles in the past, seven years of updates is progress for Samsung users—just not for those who invested in the Z Fold 2.



Winamp really whips open source coders into frenzy with its source release

As people in the many, many busy GitHub issue threads are suggesting, coding has come a long way since the heyday of the Windows-98-era Winamp player, and Winamp seems to have rushed its code onto a platform it does not really understand.

Winamp flourished around the same time as illegal MP3 networks such as Napster, Limewire, and Kazaa, providing a more capable means of organizing and playing deeply compressed music with incorrect metadata. After a web shutdown in 2013 that seemed inevitable in hindsight, Winamp’s assets were purchased by a company named Radionomy in 2014, and a new version was due out in 2019, one that aimed to combine local music libraries with web streaming of podcasts and radio.

Winamp did get that big update in 2022, though the app was “still in many ways an ancient app,” Ars’ Andrew Cunningham wrote then. Support for music NFTs was added at the end of 2022.

In its press release for the code availability, the Brussels-based Llama Group SA, with roughly 100 employees, says that “Tens of millions of users still use Winamp for Windows every month.” It plans to release “two major official versions per year with new features,” as well as offering Winamp for Creators, intended for artists or labels to manage their music, licensing, distribution, and monetization on various platforms.



Report: First wave of M4 Macs, including smaller Mac mini, coming November 1

Reliable rumors have suggested that M4 Macs are right around the corner, and now Bloomberg’s Mark Gurman is forecasting a specific launch date: November 1, following a late-October announcement that mirrors last year’s Halloween-themed reveal for the first M3 Macs.

This date could be subject to change, and not all the products announced in October would necessarily launch on November 1—lower-end Macs are more likely to launch first, while higher-end models could ship a bit later in the month.

The list of what to expect is the same as it has been for a while: refreshed 14- and 16-inch MacBook Pros with M4, M4 Pro, and M4 Max chips, a new M4 version of the 24-inch iMac, and an M4 update to the Mac mini that leapfrogs the M3 entirely. These will all be the first Macs to get the M4, following its unexpected introduction in the iPad Pro earlier this year.

The refreshed Mac mini is the most interesting of the new models—it’s said to come with a fully revamped design for the first time since the aluminum unibody version was released in 2010. The new Mac mini is said to be closer in size to an Apple TV box, but it will retain an internal power supply that doesn’t require a bulky external brick. The Mac mini lineup should still be split between two slightly different machines: one entry-level model with a basic M4 chip, and a higher-end M4 Pro version that bridges the gap between the Mac mini and the Mac Studio.
