Features

What I do to clean up a “clean install” of Windows 11 23H2 and Edge

Aurich Lawson | Getty Images

I’ve written before about my nostalgia for the Windows XP- or Windows 7-era “clean install,” when you could substantially improve any given pre-made PC merely by taking an official direct-from-Microsoft Windows install disk and blowing away the factory install, ridding yourself of 60-day antivirus trials, WildTangent games, outdated drivers, and whatever other software your PC maker threw on it to help subsidize its cost.

You can still do that with Windows 11—in fact, it’s considerably easier than it was in those ’00s versions of Windows, with multiple official Microsoft-sanctioned ways to download and create an install disk, something you used to need to acquire on your own. But the resulting Windows installation is a lot less “clean” than it used to be, given the continual creep of new Microsoft apps and services into more and more parts of the core Windows experience.

I frequently write about Windows, Edge, and other Microsoft-adjacent technologies as part of my day job, and I sign into my daily-use PCs with a Microsoft account, so my usage patterns may be atypical for many Ars Technica readers. But for anyone who uses Windows, Edge, or both, I thought it might be useful to detail what I’m doing to clean up a clean install of Windows, minimizing (if not totally eliminating) the number of annoying notifications, Microsoft services, and unasked-for apps that we have to deal with.

That said, this is not a guide about creating a minimally stripped-down, telemetry-free version of Windows that removes anything other than what Microsoft allows you to remove. There are plenty of experimental hacks dedicated to that sort of thing—NTDev’s Tiny11 project is one—but removing built-in Windows components can cause unexpected compatibility and security problems, and Tiny11 has historically had issues with basic table-stakes stuff like “installing security updates.”

Avoiding Microsoft account sign-in

The most contentious part of Windows 11’s setup process relative to earlier Windows versions is that it mandates Microsoft account sign-in, with none of the readily apparent “limited account” fallbacks that existed in Windows 10. As of Windows 11 22H2, that’s true of both the Home and Pro editions.

There are two reasons I can think of not to sign in with a Microsoft account. The first is that you want nothing to do with a Microsoft account, thank you very much. Signing in makes you more of a target for Microsoft 365, OneDrive, or Game Pass subscription upsells since all you need to do is add them to an account that already exists, and Windows setup will offer subscriptions to each if you sign in first.

The second—which is my situation—is that you do use a Microsoft account because it offers some handy benefits like automated encryption of your local drive (having those encryption keys saved to my account has saved me a couple of times) or syncing of browser info and some preferences. But you don’t want to sign in at setup, either because you’re just testing something or because you prefer your user folder to be located at “C:\Users\Andrew” rather than in a folder named after your Microsoft account email address.

Regardless of your reasoning, if you don’t want to bother with sign-in at setup, you have two options (three for Windows 11 Pro users):

Use the command line

During Windows 11 Setup, after selecting a language and keyboard layout but before connecting to a network, hit Shift+F10 to open the command prompt. Type OOBE\BYPASSNRO, hit Enter, and wait for the PC to reboot.

When it comes back, click “I don’t have Internet” on the network setup screen, and you’ll have recovered the option to use “limited setup” (aka a local account) again, like older versions of Windows 10 and 11 offered.
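Condensed into a quick transcript (the command name isn’t case-sensitive, but the backslash matters), the whole detour looks like this:

```
:: From the Windows 11 Setup language/keyboard screens,
:: press Shift+F10 to open a command prompt, then run:
OOBE\BYPASSNRO

:: The PC reboots back into Setup automatically. On the network
:: screen, click "I don't have Internet" to get the local-account
:: ("limited setup") option back.
```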

For Windows 11 Pro

Windows 11 Pro users, take a journey with me.

Proceed through the Windows 11 setup as you normally would, including connecting to a network and allowing the system to check for updates. Eventually, you’ll be asked whether you’re setting your PC up for personal use or for “work or school.”

Select the work or school option, then sign-in options, at which point you’ll finally be asked whether you plan to join the PC to a domain. Tell it you are (even though you aren’t), and you’ll see the normal workflow for creating a “limited” local account.

This one won’t work if you don’t want to start your relationship with a new computer by lying to it, but it also doesn’t require going to the command line.

Doing DNS and DHCP for your LAN the old way—the way that works

All shall tremble before your fully functional forward and reverse lookups!

Aurich Lawson | Getty Images

Here’s a short summary of the next 7,000-ish words for folks who hate the thing recipe sites do where the authors babble about their personal lives for pages and pages before getting to the cooking: This article is about how to install bind and dhcpd and tie them together into a functional dynamic DNS setup for your LAN so that DHCP clients self-register with DNS, and you always have working forward and reverse DNS lookups. This article is intended to be part one of a two-part series, and in part two, we’ll combine our bind DNS instance with an ACME-enabled LAN certificate authority and set up LetsEncrypt-style auto-renewing certificates for LAN services.

If that sounds like a fun couple of weekend projects, you’re in the right place! If you want to fast-forward to where we start installing stuff, skip down a couple of subheds to the tutorial-y bits. Now, excuse me while I babble about my personal life.

My name is Lee, and I have a problem

(Hi, Lee.)

I am a tinkering homelab sysadmin forever chasing the enterprise dragon. My understanding of what “normal” means, in terms of the things I should be able to do in any minimally functioning networking environment, was formed in the days just before and just after 9/11, when I was a fledgling admin fresh out of college, working at an enormous company that made planes starting with the number “7.” I tutored at the knees of a whole bunch of different mentor sysadmins, who ranged on the graybeard scale from “fairly normal, just writes his own custom GURPS campaigns” to “lives in a Unabomber cabin in the woods and will only communicate via GPG.” If there was one consistent refrain throughout my formative years marinating in that enterprise IT soup, it was that forward and reverse DNS should always work. Why? Because just like a clean bathroom is generally a sign of a nice restaurant, having good, functional DNS (forward and reverse) is a sign that your IT team knows what it’s doing.

Just look at what the masses have to contend with outside of the datacenter, where madness reigns. Look at the state of the average user’s LAN—is there even a search domain configured? Do reverse queries on dynamic hosts work? Do forward queries on dynamic hosts even work? How can anyone live like this?!

I decided long ago that I didn’t have to, so I’ve maintained a linked bind and dhcpd setup on my LAN for more than ten years. Also, I have control issues, and I like my home LAN to function like the well-run enterprise LANs I used to spend my days administering. It’s kind of like how car people think: If you’re not driving a stick shift, you’re not really driving. I have the same kind of dumb hang-up, but for network services.

Honestly, though, running your LAN with bind and dhcpd isn’t even that much work—those two applications underpin a huge part of the modern Internet. The packaged versions that come with most modern Linux distros are ready to go out of the box. They certainly beat the pants off of the minimal DNS/DHCP services offered by most SOHO NAT routers. Once you have bind and dhcpd configured, they’re bulletproof. The only time I interact with my setup is if I need to add a new static DHCP mapping for a host I want to always grab the same IP address.
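To give a flavor of what that linkage looks like (a rough sketch only: the zone names, addresses, and key here are placeholders, not the values used in the tutorial itself), the glue between the two daemons is a shared TSIG key. In bind’s config, each zone is marked as updatable by that key; in dhcpd’s config, the same key is used to register every lease it hands out:

```
// named.conf fragment (bind side): accept signed dynamic updates.
// Generate real key material with: tsig-keygen ddns-key
key "ddns-key" {
    algorithm hmac-sha256;
    secret "PASTE-BASE64-KEY-MATERIAL-HERE";
};

zone "lan.example" {
    type master;
    file "/var/lib/bind/db.lan.example";
    allow-update { key "ddns-key"; };    // dhcpd adds A records here
};

zone "1.168.192.in-addr.arpa" {
    type master;
    file "/var/lib/bind/db.192.168.1";
    allow-update { key "ddns-key"; };    // ...and PTR records here
};
```

```
# dhcpd.conf fragment (dhcpd side): push an update for every lease.
ddns-update-style standard;
ddns-domainname "lan.example.";

key ddns-key {
    algorithm hmac-sha256;
    secret "PASTE-BASE64-KEY-MATERIAL-HERE";   # must match bind's copy
}

zone lan.example. {
    primary 127.0.0.1;
    key ddns-key;
}

zone 1.168.192.in-addr.arpa. {
    primary 127.0.0.1;
    key ddns-key;
}

subnet 192.168.1.0 netmask 255.255.255.0 {
    range 192.168.1.100 192.168.1.200;
    option domain-name "lan.example";
    option domain-name-servers 192.168.1.1;
}
```

With that in place, a client that leases 192.168.1.150 as “mybox” gets a matching A record in lan.example and a PTR record in 1.168.192.in-addr.arpa, with no manual zone edits.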

So, hey, if the idea of having perfect forward and reverse DNS lookups on your LAN sounds exciting—and, come on, who doesn’t want that?!—then pull up your terminal and strap in because we’re going to make it happen.

(Note that I’m relying a bit on Past Lee and this old blog entry for some of the explanations in this piece, so if any of the three people who read my blog notice any similarities in some of the text, it’s because Past Lee wrote it first and I am absolutely stealing from him.)

But wait, there’s more!

This piece is intended to be part one of two. If the idea of having one’s own bind and dhcpd servers sounds a little silly (and it’s not—it’s awesome), it’s actually a prerequisite for an additional future project with serious practical implications: our own fully functioning local ACME-enabled certificate authority capable of answering DNS-01 challenges so we can issue our own certificates to LAN services and not have to deal with TLS warnings like plebes.

(“But Lee,” you say, “why not just use actual-for-real LetsEncrypt with a real domain on my LAN?” Because that’s considerably more complicated to implement if one does it the right way, and it means potentially dealing with split-horizon DNS and hairpinning if you also need to use that domain for any Internet-accessible stuff. Split-horizon DNS is handy and useful if you have requirements that demand it, but if you’re a home user, you probably don’t. We’ll keep this as simple as possible and use LAN-specific DNS zones rather than real public domain names.)

We’ll tackle all the certificate stuff in part two—because we have a ways to go before we can get there.

Why walking around in public with Vision Pro makes no sense

  • A close-up look at the Vision Pro from the front.

    Samuel Axon

  • The Apple Vision Pro with AirPods Pro, Magic Keyboard, Magic Trackpad, and an Xbox Series X|S controller.

    Samuel Axon

  • You can see the front-facing cameras that handle passthrough video just above the downward-facing cameras that read your hand gestures here.

    Samuel Axon

  • There are two buttons for Vision Pro, both on the top.

    Samuel Axon

  • This is the infamous battery pack. It’s about the size of an iPhone (but a little thicker) and has a USB-C port for external power sources.

    Samuel Axon

  • There are two displays inside the Vision Pro, one for each eye. Each offers just under 4K resolution.

    Samuel Axon

  • Apple offers several variations of the light seal to fit different face shapes.

    Samuel Axon

If you’ve spent any time in the tech-enthusiast corners of Instagram or TikTok over the past few weeks, you’ve seen the videos: so-called tech bros strolling through public spaces with confidence, donning Apple’s $3,500 Vision Pro headset while gesturing into the air.

Dive into the comments on those videos and you’ll see a consistent ratio: about 20 percent of the commenters herald this as the future, and the other 80 mock it with vehement derision. “I’ve never had as much desire to disconnect from reality as this guy does,” one reads.

Over the next few weeks, I’m going all-in on trying the Vision Pro in all sorts of situations to see which ones it suits. Last week, I talked about replacing a home theater system with it—at least when traveling away from home. Today, I’m going over my experience trying to find a use for it out on the streets of Chicago.

I’m setting out to answer a few questions here: Does it feel weird wearing it in public spaces? Will people judge you or react negatively when you wear it—and if so, will that become less common over time? Does it truly disconnect you from reality, and has Apple succeeded in solving virtual reality’s isolationist tendencies? Does it provide enough value to be worth wearing?

As it turns out, all these questions are closely related.

The potential of AR in the wild

I was excited about the Vision Pro in the lead-up to its launch. I was impressed by the demo I saw at WWDC 2023, even though I was aware that it was offered in an ideal setting: a private, well-lit room with lots of space to move around.

Part of my excitement was about things I didn’t see in that demo but that I’ve seen augmented reality developers explore in smartphone augmented reality (AR) and niche platforms like HoloLens and Xreal. Some smart folks have already produced a wide variety of neat tech demos showing what you can do with a good consumer AR headset, and many of the most exciting ideas work outside the home or office.

I’ve seen demonstrations of real-time directions provided with markers along the street while you walk around town, virtual assistant avatars guiding you through the airport, menus and Yelp reviews overlaid on the doors of every restaurant on a city strip, public art projects pieced together by multiple participants who each get to add an element to a virtual statue, and much more.

Of course, all those ideas—and most others for AR—make a lot more sense for unintrusive glasses than they do for something that is essentially a VR headset with passthrough. Nonetheless, I was hoping to get a glimpse at that eventuality with the Vision Pro.

Fake grass, real injuries? Dissecting the NFL’s artificial turf debate

iStock/Getty Images

Super Bowl LVIII will be played on a natural grass field in an indoor stadium in Las Vegas on February 11, 2024. How do you keep a grass field vibrant in such a hostile growing environment like the Nevada desert?

The answer: You don’t. By the end of the regular NFL season, paint was used to camouflage the reality that only a few scant patches of grass remained in Allegiant Stadium, home to the Las Vegas Raiders. Immediately after the Raiders’ last game on January 7, 2024, the field crew ripped up the remaining grass, installed California-grown sod over three days, and began the tedious process of keeping the grass alive long enough for the big game.

Herculean efforts to prepare a vibrant natural grass field for 2024’s Super Bowl LVIII are especially questionable when one realizes that Allegiant Stadium also has an artificial turf playing surface available (used by UNLV Football). Why don’t teams in hostile environments switch to more robust artificial turf, which is designed to overcome the many limitations of natural grass fields?

The answer lies in a debate over the safety of synthetic playing surfaces. While artificial turf manufacturers tout research suggesting their products result in fewer injuries, the NFL Players Association (NFLPA) claims artificial turf raises injury risk and is advocating for the NFL to abolish its use. Let’s explore some key arguments of this debate, which continues to grab headlines with each high-profile NFL injury.

Super Bowl gridirons

Pressure for NFL field managers is especially high following the embarrassingly poor field conditions of last year’s Super Bowl. Super Bowl LVII took place at State Farm Stadium in Glendale, Arizona—another natural grass field in the desert (with a retractable roof, closed at night to protect the grass). Despite two years of preparation and an $800,000 investment, the grass field was a disaster, as players struggled to find footing on its slippery surface.

Veteran NFL groundskeeper George Toma attributed the mess to woefully improper field preparation. Players also complained about the slipping issue the previous time the Super Bowl was hosted at the natural grass field in State Farm Stadium eight years prior for Super Bowl XLIX in 2015. That year, the poor traction was blamed on the green paint used on the grass.

For perspective, some of the best sports field managers in the nation oversee field preparations for the Super Bowl. However, maintaining natural grass in desert conditions is so unfavorable (especially when the grass is sometimes indoors) that even the best can mess it up.

None of these issues existed when the Super Bowl was last played on artificial turf. Super Bowl LVI in 2022 was held at SoFi Stadium in Inglewood, California, home to both the Los Angeles Rams and the Chargers. The artificial turf there stood up to double the usual workload, hosting home games for both teams through the regular season, then withstood a busy playoff schedule right up to the final playoff game, in which the Rams beat the San Francisco 49ers. Two weeks later, the Rams won Super Bowl LVI on the very same surface.

While turf avoids the durability issues seen with grass surfaces, players have widespread concerns about its safety—a recent poll by the NFLPA reported 92 percent of players favored grass.

Can a $3,500 headset replace your TV? We tried Vision Pro to find out

Apple Vision Pro Review

We kick off our multi-part Vision Pro review by testing it for entertainment.

  • The Apple Vision Pro with AirPods Pro, Magic Keyboard, Magic Trackpad, and an Xbox Series X|S controller.

    Samuel Axon

  • You can see the front-facing cameras that handle passthrough video just above the down-facing cameras that read your hand gestures here.

    Samuel Axon

  • There are two buttons for Vision Pro, both on the top.

    Samuel Axon

  • This is the infamous battery pack. It’s about the size of an iPhone (but a little thicker) and has a USB-C port for external power sources.

    Samuel Axon

  • There are two displays inside the Vision Pro, one for each eye. Each offers just under 4K resolution.

    Samuel Axon

  • Apple offers several variations of the light seal to fit different face shapes.

    Samuel Axon

  • A close-up look at the Vision Pro from the front.

    Samuel Axon

The Vision Pro is the strangest product Apple has introduced in the time I’ve been covering the company. By now, it’s well established that the headset is both impressively cutting-edge and ludicrously expensive.

You could certainly argue that its price means it’s only for Silicon Valley techno-optimists with too much money to burn or for developers looking to get in on the ground floor on the chance that this is the next gold rush for apps. But the platform will need more than those users to succeed.

Part of Apple’s pitch behind the price tag seems to be that the Vision Pro could replace several devices, just like the iPhone did back in the late 2000s. It could replace your laptop, your tablet, your 4K TV, your video game console, your phone or other communications device, your VR headset, and so on. If it truly replaced all of those things, the price wouldn’t seem quite so outrageous to some.

And those are just the use cases Apple has put a lot of effort into facilitating for the launch. Many of the most important uses of the company’s prior new product categories didn’t become totally clear until a couple of years and generations in. The iPhone wasn’t originally envisioned as a meditation aid, a flashlight, or any of the other things it’s now commonly used for until third-party developers invented apps that made it do those things. And Apple’s approach with the Apple Watch seemed to be to just throw it out there with a number of possible uses to see what stuck with users. (The answer seemed to be health and fitness, but the device’s distinct emphasis on that took a bit of time to come into focus.)

So while I could write a dense review meandering through all the possibilities based on my week with the Vision Pro, that doesn’t seem as helpful as drilling in on each specific possibility. This is the first in a series of articles that will do that, so consider it part one of a lengthy, multi-step review. By the end, we’ll have considered several possible applications of the device, and we might be able to make some recommendations or predictions about its potential.

So far, I believe there’s one use case that’s a slam dunk, closer to clarity during launch week than any of the others: entertainment. In certain situations, the Vision Pro is a better device for consuming TV shows and movies (among other things) away from a dedicated theater than anything we’ve seen before. So let’s start there.

My (perhaps too) exacting standards

I know I’m not the usual TV consumer. It’s important to note that before we get too deep.

I bought my first OLED television (a 55-inch LG B6) in 2016. I previously had a 50-inch plasma TV I liked, but it only supported 1080p and SDR (standard dynamic range), and Sony had announced the PlayStation 4 Pro, which would support 4K games (sort of) and HDR (high dynamic range). Game consoles had always driven TV purchases in the past, so I sprung for the best I could afford.

I always cared about picture quality before I bought an OLED, but that interest turned into something more obsessive at that point. I was stunned at the difference, and I began to find it hard to accept the imperfections of LCD monitors and TVs after that. Granted, I’d always disliked LCDs, going straight from CRT to plasma to avoid that grayish backlight glow. But the comparison was even harsher once I went to OLED.

My fellow Ars Technica writers and editors often talk about their robust, multi-monitor PC setups, their expensive in-home server racks, and other Ars-y stuff. I have some of that stuff, too, but I put most of my time and energy into my home theater. I’ve invested a lot into it, and that has the unfortunate side effect of making most other screens I use feel inadequate by comparison.

All that said, some have argued that the Vision Pro is a solution in search of a problem, but there is one pre-existing problem I have that it has the potential to solve.

I travel a lot, so I spend a total of at least two months out of every year in hotel or Airbnb rooms. Whenever I’m in one of those places, I’m always irritated at how its TV compares to the one I have at home. It’s too small for the space, it’s not 4K, it doesn’t support HDR, it’s mounted way too high to comfortably watch, or it’s a cheap LCD with washed-out black levels and terrible contrast. Often, it’s all of the above. And even when I’m home, my wife might want to watch her shows on the big TV tonight.

I end up not watching movies or shows I want to watch because I feel like I’d be doing those shows a disservice by ruining the picture with such terrible hardware. “Better to hold off until I’m home,” I tell myself.

The Vision Pro could be the answer I’ve been waiting for. Those two displays in front of my eyes are capable of displaying an image that stands up to that of a mid-range OLED TV in most situations, and I can use it absolutely anywhere.

What I learned from the Apple Store’s 30-minute Vision Pro demo

Seeing is believing?

Despite some awe-inspiring moments, the $3,500 headset is a big lift for retail.

These mounted displays near the entrance let visitors touch, but not use, a Vision Pro.

Kyle Orland

For decades now, potential Apple customers have been able to wander into any Apple Store and get some instant eyes-on and hands-on experience with most of the company’s products. The Apple Vision Pro is an exception to this simple process; the “mixed-reality curious” need to book ahead for a guided, half-hour Vision Pro experience led by an Apple Store employee.

As a long-time veteran of both trade show and retail virtual-reality demos, I was interested to see how Apple would sell the concept of “spatial computing” to members of the public, many of whom have minimal experience with existing VR systems. And as someone who’s been following news and hands-on reports of the Vision Pro’s unique features for months now, I was eager to get a brief glimpse into what all the fuss was about without plunking down at least $3,499 for a unit of my own.

After going through the guided Vision Pro demo at a nearby Apple Store this week, I came away with mixed feelings about how Apple is positioning its new computer interface to the public. While the short demo contained some definite “oh, wow” moments, the device didn’t come with a cohesive story pitching it as Apple’s next big general-use computing platform.

Setup snafus

After arriving a few minutes early for my morning appointment in a sparsely attended Apple Store, I was told to wait by a display of Vision Pro units set on a table near the front. These headsets were secured tightly to their stands, meaning I couldn’t try a unit on or even hold it in my hands while I waited. But I could fondle the Vision Pro’s various buttons and straps while getting a closer look at the hardware (and at a few promotional videos running on nearby iPads).

  • Two Vision Pro headsets let you see it from multiple angles at once.

    Kyle Orland

  • Nearby iPads let you scroll through videos and information about the Vision Pro.

    Kyle Orland

  • The outward-facing display is very subtle in person.

    Kyle Orland

  • Without an appointment you can feel the headstrap with your hands but not with your skull.

    Kyle Orland

  • To Apple’s credit, it did not even try to hide the external battery in these store displays.

    Kyle Orland

After a few minutes, an Apple Store employee, who we’ll call Craig, walked over and said with genuine enthusiasm that he was “super excited” to show off the Vision Pro. He guided me to another table, where I sat in a low-backed swivel chair across from another customer who looked a little zoned out as he ran through his own Vision Pro demo.

Craig told me that the Vision Pro was the first time Apple Store employees like him had gotten early hands-on access to a new Apple device well before the public, in order to facilitate the training needed to guide these in-store demos. He said that interest had been steady for the first few days of demos and that, after some initial problems, the store now mostly managed to stay on schedule.

Unfortunately, some of those demo kinks were still present. First, Craig had trouble tracking down the dedicated iPhone used to scan my face and determine the precise Vision Pro light seal fit for my head. After consulting with a fellow employee, they decided to have me download the Apple Store app and use a QR code to reach the face-scanning tool on my own iPhone. (I was a bit surprised this fit scanning hadn’t been offered as part of the process when I signed up for my appointment days earlier.)

It took three full attempts, scanning my face from four angles, before the app managed to spit out the code that Craig needed to send my fit information to the back room. Craig told me that the store had 38 different light seals and 900 corrective lens options sitting back there, ready to be swapped in to ensure maximum comfort for each specific demo.

  • Sorry, I think I ordered the edamame…

    Kyle Orland

  • Shhh… the Vision Pro is napping.

After a short wait, another employee brought my demo unit out on a round wooden platter that made me feel like I was at a Japanese restaurant. The platter was artistically arranged, from the Solo Knit Band and fuzzy front cover to the gently coiled cord leading to the battery pack sitting in the center. (I never even touched or really noticed the battery pack for the rest of the demo.)

At this point, Craig told me that he would be able to see everything I saw in the Vision Pro, which would stream directly to his iPad. Unfortunately, getting that wireless connection to work took a good five minutes of tapping and tinkering, including removing the Vision Pro’s external battery cord several times.

Once everything was set, Craig gave me a brief primer on the glances and thumb/forefinger taps I would use to select, move, and zoom in on things in the VisionOS interface. “You’re gonna pretend like you’re pulling on a piece of string and then releasing,” he said by way of analogy. “The faster you go, the faster it will scroll, so be mindful of that. Nice and gentle, nice and easy, and things will go smoothly for you.”

Fifteen minutes after my appointed start time, I was finally ready to don the Vision Pro.

A scripted experience

After putting the headset on, my first impression was how heavy and pinchy the Vision Pro was on the bridge of my nose. Thankfully, Craig quickly explained how to tighten the fit with a dial behind my right ear, which helped immediately and immensely. After that, it only took a minute or two to run through some quick calibration of the impressively snappy eye and hand tracking. (“Keep your head nice and still as you do this,” Craig warned me during the process.)

Imagine this but with an Apple Store in the background.

Kyle Orland

As we dove into the demo proper, it quickly became clear that Craig was reading from a prepared script on his iPhone. This was a bit disappointing, as the genuine enthusiasm he had shown in our earlier, informal chat gave way to a dry monotone when delivering obvious marketing lines. “With Apple Vision Pro, you can experience your entire photo library in a brand new way,” he droned. “Right here, we have some beautiful shots, right from iPhone.”

Craig soldiered through the script as I glanced at a few prepared photos and panoramas. “Here we have a beautiful panorama, but we’re going to experience it in a whole new way… as if you were in the exact spot in which it was taken,” Craig said. Then we switched to some spatial photos and videos of a happy family celebrating a birthday and blowing bubbles in the backyard. The actors in the video felt a little stilted, but the sense of three-dimensional “presence” in the high-fidelity video was impressive.

After that, Craig informed me that “with spatial computing, your apps can exist anywhere in your space.” He asked me to turn the digital crown to replace my view of the store around me with a virtual environment of mountains bathed in cool blue twilight. Craig’s script seemed tuned for newcomers who might be freaked out by not seeing the “real world” anymore. “Remember, you’re always in control,” Craig assured me. “You can change it at any time.”

From inside the environment, Craig’s disembodied voice guided me as I opened a few flat app windows, placing them around my space and resizing them as I liked. Rather than letting these sell themselves, though, Craig pointed out how webpages are “super beautiful [and] easy to navigate” on Vision Pro. “As you can also see… text is super sharp, super easy to read. The pictures on the website look stunning.” Craig also really wanted me to know that “over one million iPhone/iPad apps” will work like this on the Vision Pro on day one.

The 2024 Rolex 24 at Daytona put on very close racing for a record crowd

actually 23 hours and 58 minutes this time

The around-the-clock race marked the start of the North American racing calendar.

The current crop of GTP hybrid prototypes looks wonderful, thanks to rules that cap the amount of downforce the cars can generate in favor of more dramatic styling.

Porsche Motorsport

DAYTONA BEACH, Fla.—Near-summer temperatures greeted a record crowd at the Daytona International Speedway in Florida last weekend. At the end of each January, the track hosts the Rolex 24, an around-the-clock endurance race that’s now as high-profile as it has ever been during the event’s 62-year history.

Between the packed crowd and the 59-car grid, there’s proof that sports car racing is in good shape. Some of that might be attributable to Drive to Survive‘s rising tide lifting a bunch of non-F1 boats, but there’s more to the story than just a resurgent interest in motorsport. The dramatic-looking GTP prototypes have a lot to do with it—powerful hybrid racing cars from Acura, BMW, Cadillac, and Porsche are bringing in the fans and, in some cases, some pretty famous drivers with F1 or IndyCar wins on their resumes.

But IMSA and the Rolex 24 are about more than just the top class of cars; in addition to the GTP hybrids, the field also comprised the very competitive pro-am LMP2 prototype class and a pair of classes (one for professional teams, another for pro-ams) for production-based machines built to a global set of rules called GT3. (To be slightly confusing, in IMSA, those classes are known as GTD-Pro and GTD. More on sports car racing being needlessly confusing later.)

The crowd for the 2024 Rolex 24 was larger even than last year. This is the pre-race grid walk, which I chose to watch from afar.

Jonathan Gitlin

There was even a Hollywood megastar in attendance: the Jerry Bruckheimer-produced, Joseph Kosinski-directed racing movie starring Brad Pitt was at the track, filming scenes for the film’s opening.

GTP finds its groove

Last year’s Rolex 24 was the debut of the new GTP cars, and they didn’t have an entirely trouble-free race. These cars are some of the most complicated sports prototypes ever to turn a wheel, thanks to their hybrid systems, and during the 2023 race, two of the entrants required lengthy stops to replace their hybrid batteries. Those teething troubles are a thing of the past, and over the last 12 months, the cars have found an awful lot more speed, with most of the 10-car class breaking Daytona’s lap record during qualifying.

Most of that new speed comes from the teams’ familiarity with the cars after a season of racing, but some also comes from a year of software development. Only Porsche’s 963 has had any mechanical upgrades during the off-season. “You… will not notice anything on the outside shell of the car,” explained Urs Kuratle, Porsche Motorsport’s director of factory racing. “So the aerodynamics, all [those] things, they look the same… Sometimes it’s a material change, where a fitting used to be out of aluminum and due to reliability reasons we change to steel or things like this. There are minor details like this.”

  • This year, the Wayne Taylor Racing team had not one but two ARX-06s. I expected the cars to be front-runners, but a late Balance of Performance (BoP) change added another 40 kg.

    Jonathan Gitlin

  • The Cadillacs are fan favorites because of their loud, naturally aspirated V8s. I think the car looks better than the other GTP cars, too.

    Jonathan Gitlin

  • Porsche’s 963 is the only GTP car that has had any changes since last year, but they’re all under the bodywork.

    Jonathan Gitlin

  • Porsche is the only manufacturer to start selling customer GTP cars so far. The one on the left is the Proton Competition Mustang Sampling car; the one on the right belongs to JDC-Miller MotorSports.

    Jonathan Gitlin

GTP cars aren’t as fast or even as powerful as an F1 single-seater, but the driver workload from inside the cockpit may be even higher. At last year’s season-ending Petit Le Mans, former F1 champion Jenson Button—then making a guest appearance in the privateer-run JDC-Miller MotorSports Porsche 963—came away with a newfound respect for how many different systems could be tweaked from the steering wheel.

The 2024 Rolex 24 at Daytona put on very close racing for a record crowd Read More »

I abandoned OpenLiteSpeed and went back to good ol’ Nginx

Adventures in server babysitting —

One weather site’s sudden struggles, and musings on why change isn’t always good.

Ish is on fire, yo.

Since 2017, in what spare time I have (ha!), I help my colleague Eric Berger host his Houston-area weather forecasting site, Space City Weather. It’s an interesting hosting challenge—on a typical day, SCW does maybe 20,000–30,000 page views to 10,000–15,000 unique visitors, which is a relatively easy load to handle with minimal work. But when severe weather events happen—especially in the summer, when hurricanes lurk in the Gulf of Mexico—the site’s traffic can spike to more than a million page views in 12 hours. That level of traffic requires a bit more prep to handle.

Hey, it's Space City Weather!

Lee Hutchinson

For a very long time, I ran SCW on a backend stack made up of HAProxy for SSL termination, Varnish Cache for on-box caching, and Nginx for the actual web server application—all fronted by Cloudflare to absorb the majority of the load. (I wrote about this setup at length on Ars a few years ago for folks who want some more in-depth details.) This stack was fully battle-tested and ready to devour whatever traffic we threw at it, but it was also annoyingly complex, with multiple cache layers to contend with, and that complexity made troubleshooting issues more difficult than I would have liked.
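For readers unfamiliar with how such a chain fits together, here is a minimal sketch of the three layers. It is illustrative only; the ports, paths, and names are hypothetical rather than the actual Space City Weather configuration.

```
# Illustrative sketch; ports, paths, and names are hypothetical.

# haproxy.cfg -- HAProxy terminates SSL on 443 and forwards plain HTTP to Varnish
frontend https_in
    bind :443 ssl crt /etc/haproxy/certs/example.pem
    default_backend varnish_cache

backend varnish_cache
    server varnish 127.0.0.1:6081

# default.vcl -- Varnish answers from cache and sends misses to Nginx
vcl 4.1;
backend nginx {
    .host = "127.0.0.1";
    .port = "8080";
}

# nginx server block -- Nginx actually serves the WordPress site
server {
    listen 127.0.0.1:8080;
    root /var/www/wordpress;
    index index.php;
}
```

Each layer listens only where the layer in front of it can reach it, which is part of why the stack was robust and also part of why troubleshooting meant checking three places.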

So during some winter downtime two years ago, I took the opportunity to jettison some complexity and reduce the hosting stack down to a single monolithic web server application: OpenLiteSpeed.

Out with the old, in with the new

I didn’t know too much about OpenLiteSpeed (“OLS” to its friends) other than that it’s mentioned a bunch in discussions about WordPress hosting—and since SCW runs WordPress, I started to get interested. OLS seemed to get a lot of praise for its integrated caching, especially when WordPress was involved; it was purported to be quite quick compared to Nginx; and, frankly, after five-ish years of admining the same stack, I was interested in changing things up. OpenLiteSpeed it was!

The OLS admin console, showing vhosts. This is from my personal web server rather than the Space City Weather server, but it looks the same. If you want some deeper details on the OLS config I was using, check my blog. Yeah, I still have a blog. I’m old.

Lee Hutchinson

The first significant adjustment to deal with was that OLS is primarily configured through an actual GUI, with all the annoying potential issues that brings with it (another port to secure, another password to manage, another public point of entry into the backend, more PHP resources dedicated just to the admin interface). But the GUI was fast, and it mostly exposed the settings that needed exposing. Translating the existing Nginx WordPress configuration into OLS-speak was a good acclimation exercise, and I eventually settled on Cloudflare tunnels as an acceptable method for keeping the admin console hidden away and notionally secure.
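As a concrete illustration of that last point, a Cloudflare tunnel can publish the OLS admin console (which listens on port 7080 by default, behind a self-signed certificate) without ever opening that port to the Internet. The hostname and tunnel ID below are made up, and this is a sketch of the general approach rather than the site's actual setup.

```yaml
# ~/.cloudflared/config.yml -- illustrative; tunnel ID and hostname are hypothetical
tunnel: 6ff42ae2-765d-4adf-8112-31c55c1551ef
credentials-file: /root/.cloudflared/6ff42ae2-765d-4adf-8112-31c55c1551ef.json

ingress:
  # Route a private hostname (ideally gated by Cloudflare Access) to the console
  - hostname: ols-admin.example.com
    service: https://localhost:7080
    originRequest:
      noTLSVerify: true  # the OLS admin console uses a self-signed certificate
  # Anything else gets a 404
  - service: http_status:404
```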

Just a taste of the options that await within the LiteSpeed Cache WordPress plugin.

Lee Hutchinson

The other major adjustment was the OLS LiteSpeed Cache plugin for WordPress, which is the primary tool one uses to configure how WordPress itself interacts with OLS and its built-in cache. It’s a massive plugin with pages and pages of configurable options, many of which are concerned with driving utilization of the Quic.Cloud CDN service (which is operated by LiteSpeed Technologies, the company that created OpenLiteSpeed and its for-pay sibling, LiteSpeed).

Getting the most out of WordPress on OLS meant spending some time in the plugin, figuring out which of the options would help and which would hurt. (Perhaps unsurprisingly, there are plenty of ways in there to get oneself into stupid amounts of trouble by being too aggressive with caching.) Fortunately, Space City Weather provides a great testing ground for web servers, being a nicely active site with a very cache-friendly workload, and so I hammered out a starting configuration with which I was reasonably happy and, while speaking the ancient holy words of ritual, flipped the cutover switch. HAProxy, Varnish, and Nginx went silent, and OLS took up the load.

I abandoned OpenLiteSpeed and went back to good ol’ Nginx Read More »

What happens when you trigger a car’s automated emergency stopping?

screen grab from a Mercedes training video; illustration of sleeping driver

Mercedes-Benz

Most car crashes begin and end in a few seconds. That’s plenty of time to get in a tiny micro-nap while driving. The famous asleep-at-the-wheel film scene in National Lampoon’s Vacation, where Clark Griswold goes off to slumberland for 72 seconds while piloting the Wagon Queen Family Truckster (a paragon of automotive virtue but lacking any advanced driver safety systems), might be a comical look at this prospect. But if Clark were in the real world, he and his family would likely have been injured or killed—or they could have caused similar un-funny consequences for other motorists or pedestrians.

There’s plenty of real-world news on the topic right now. Early in 2023, the American Automobile Association’s Foundation for Traffic Safety published a study estimating that 16–21 percent of all fatal vehicle crashes reported to police involve drowsy driving.

With the road fatality numbers in the US hovering close to 38,000 over the past few years, that means between 6,080 and 7,980 road deaths are linked to drowsy drivers. Further research by the AAA’s Foundation finds that drivers likely under-report drowsiness in all car crashes. Nodding off while driving is as dangerous as—and potentially more dangerous than—driving drunk. And while drunk-driving figures have decreased between 1991 and 2021, the opposite is true for drowsy driving.
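The arithmetic in the paragraph above checks out:

```python
# Sanity check of the fatality figures: 16-21 percent of roughly 38,000
# annual US road deaths attributed to drowsy driving.
fatalities = 38_000
low_share, high_share = 0.16, 0.21

print(round(fatalities * low_share))   # 6080
print(round(fatalities * high_share))  # 7980
```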

Nissan

Automakers have not been unaware of the problem, either. As long ago as 2007, manufacturers like Volvo began offering drowsiness-detection systems that monitored the driver, though in a simpler way than what’s seen in the leading systems of today. They sensed the velocities of inputs to steering, throttle, and brakes. Some even used a camera aimed at the driver to discern if drivers were becoming inattentive, including drooping their head or simply averting their view from the straight-ahead.

These systems chime a warning and project a visual alert on the dashboard asking if the driver wants to take a break, often with the universal symbol for wakefulness—a coffee cup—appearing in the instrument cluster. Many new cars today still have this feature. And to be sure, it was then, is now, and forever will be a beneficial and effective method of alerting drivers to their drowsiness.

But a level of intervention beyond these audible and visual cues is changing the landscape, blunting the upward trend of drowsy driving. As Level 2 semi-autonomous capabilities arrive in medium- and even lower-priced automobiles, they also allow cars and SUVs to take control should the vehicle determine that the driver has become inattentive or incapacitated.

On some vehicles, like this Mercedes, you can select the sensitivity of the drowsy driver program (“Attention Assist” in this case) to have a lower or higher threshold for activation.

Jim Resnick

Because all the pieces of a vehicle-control puzzle are already on board, enabling a system to take over from an inattentive driver is a matter of programming—extensive programming, of course, but all the critical pieces of hardware are often already there:

  • Selective braking from adaptive cruise control and stability control
  • Self-steering functions of lane-keeping and lane-centering
  • A cellular telematics network

That lengthy programming exercise yields a system that can take control of a vehicle in a simplified way, but not before three forms of human stimuli are triggered to wake a drowsy driver: sight, sound, and a physical prompt.
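The escalation ladder described above can be sketched in a few lines. This is purely illustrative; the thresholds and stage names are made-up placeholders, not any automaker's actual logic.

```python
# Hypothetical escalation ladder: each pair is (seconds of inattention, stage).
ESCALATION = [
    (5.0, "visual alert"),      # dashboard message, coffee-cup icon
    (10.0, "audible chime"),    # warning tone
    (15.0, "physical prompt"),  # e.g., a brake jolt or seatbelt tug
    (20.0, "emergency stop"),   # the car slows itself and calls for help
]

def escalation_stage(seconds_inattentive):
    """Return the highest stage reached, or None if the driver seems attentive."""
    stage = None
    for threshold, name in ESCALATION:
        if seconds_inattentive >= threshold:
            stage = name
    return stage

print(escalation_stage(12.0))  # audible chime
print(escalation_stage(25.0))  # emergency stop
```

The real systems are vastly more complex, of course, but the shape is the same: warnings accumulate in severity until the car concludes that nobody is driving.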

This is all great in theory and in a digital vacuum, but I wanted to explore what occurs inside a car that has determined that the driver is no longer actually driving. The Infiniti QX60 and Mercedes EQE 350 have such emergency stop capabilities; I recently tested both.

What happens when you trigger a car’s automated emergency stopping? Read More »

The 5 most interesting PC monitors from CES 2024

Dell’s upcoming UltraSharp U4025QW.

Scharon Harding

Each year, the Consumer Electronics Show brings a ton of new computer monitor announcements, and it’s often difficult to figure out what’s worth paying attention to. When it comes to the most interesting models this year, there were two noteworthy themes.

First of all, my complaint in 2022 about there not being enough OLED monitors was largely addressed this year. CES revealed many plans for OLED monitors in 2024, with a good number of those screens set to be appropriately sized for desktops. That includes the introduction of 32-inch, non-curved QD-OLED options and other smaller screens for people who have been waiting for OLED monitors in more varied form factors.

Secondly, with more people blending their work and home lives these days, CES brought hints that the line between gaming monitors and premium monitors used for general or even professional purposes will be blurring more in the future. We’re not at the point where the best productivity monitor and ideal gaming monitor perfectly align in a single product. But this week’s announcements have me imagining ways that future monitors could better serve users with serious work and play interests.

For now, here are the most intriguing monitors from CES 2024.

Dell UltraSharps hit 120 Hz

  • Dell started adding 120 Hz models to its UltraSharp series.

    Scharon Harding

  • This monitor is VESA DisplayHDR 600-certified.

    Dell

  • Ports include Thunderbolt 4 with 140 W power delivery. There’s also a pop-out box of ports by the monitor’s chin.

    Dell

Dell UltraSharp monitors have long attracted workers and creatives and, with their USB-C connectivity, even Mac users. The last few CES shows have seen Dell attempting to improve its lineup, with the most notable innovation being the introduction of IPS Black. With CES 2024, though, Dell focused on faster refresh rates.

Dell’s UltraSharp 40 Curved Thunderbolt Hub Monitor (U4025QW), pictured above, is a 39.7-inch ultrawide with a 5120×2160 resolution and a 120 Hz refresh rate. With most monitors aimed at workers still running at 60 Hz, this is a big step up for people with systems capable of driving 11,059,200 pixels at 120 frames per second. Such speeds have been relegated to gaming monitors for a while, but with TVs moving to higher refresh rates (with encouragement from gaming consoles), more people are becoming accustomed to faster screens. And with other attributes, like a 2500R curve, we wouldn’t blame workers for doing some light gaming on the U4025QW, too.
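The pixel figure quoted above is straightforward to verify:

```python
# Checking the pixel arithmetic for the UltraSharp U4025QW.
width, height, refresh_hz = 5120, 2160, 120

pixels_per_frame = width * height                  # 11,059,200
pixels_per_second = pixels_per_frame * refresh_hz  # 1,327,104,000

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second:,} pixels per second")
```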

But Dell says the refresh rate boost is about increasing eye comfort. The UltraSharp U4025QW is one of two monitors with 5-star certification from TÜV Rheinland’s new Eye Comfort program, which Dell helped create, a Dell spokesperson told me last month at a press event.

According to TÜV, the certification program “is no longer limited to the old low-blue-light or flicker-free labels” and now “covers a broader range of safety indicators, such as ambient brightness, color temperature adjustment and regulation, and brightness.” New requirements include brightness and color temperature control for different ambient lighting. Dell’s ultrawide covers this with an integrated ambient light sensor.

The certification also requires a minimum 120 Hz refresh rate, which is probably where Dell got the number from. A Dell spokesperson confirmed to Ars that the use of IPS Black didn’t impact the monitor’s ability to get TÜV certifications and that it could have theoretically earned five stars with another panel type, like VA.

Dell announced bringing 120 Hz to the UltraSharp lineup in November when it debuted two 24-inch and two 27-inch UltraSharp monitors with 120 Hz refresh rates. At CES, Dell proved this upgrade wasn’t a fluke relegated to its smaller UltraSharps and went all in, bringing the refresh rate to a top-line ultrawide 5K Thunderbolt 4 monitor.

The U4025QW has an updated version of ComfortView Plus, which uses hardware to lower blue light levels. I’ve seen it function without making colors turn yellowish, as some other blue-light-fighting techniques do. After not significantly updating ComfortView Plus since its 2020 release, Dell now says it’s using a “more advanced LED backlight” to reduce blue light exposure from 50 percent to under 35 percent.

The effects are minimal, though. Dell-provided numbers claim the reduced blue light exposure could reduce eye fatigue by 8 percent after 50 minutes, but we should take that with a grain of salt. It’s nearly impossible to quantify how well blue light reduction techniques work from person to person.

The UltraSharp U4025QW releases on February 27, starting at $2,400.

The 5 most interesting PC monitors from CES 2024 Read More »

Vectrex reborn: How a chance encounter gave new life to a dead console

Vector Graphics —

40 years later, it’s time for the Vectrex to shine.

A Vectrex console and CRT display with a cart for a long-lost game.

Tim Stevens

The Vectrex may be the most innovative video game console you’ve never heard of. It had everything it needed to prompt a revolution, including controllers far more sophisticated than the competition and the ability to render polygons a decade before gaming’s 3D revolution.

It was years ahead of anything else on the market, yet it could not have launched at a worse time. The Vectrex hit stores at the tail end of 1982. Over the next six months, the then-booming video game market went bust. The Vectrex, a potential revolution in home gaming, was swept into bargain bins, forgotten by all but the most ardent of collectors.

Forty years later, it’s having something of a comeback. New developers are breathing fresh code into this aged machine, hardware hackers and tinkerers are ensuring that tired capacitors and CRTs stay functional, and a new game has seen retail release after sitting unplayed for four decades.

This, finally, could be the Vectrex’s time to shine.

Vectrex history

1982 was a banner year for video games. Titles like Zaxxon, Pole Position, Q*bert, and Dig Dug were fresh in the arcades. In the home gaming scene, seemingly unquenchable consumer desire fueled a period of innovation unlike anything the now $200 billion industry has seen since.

To give some context, Sony sold 11.8 million PlayStation 5s in 2021, the console’s first full year of availability. Back in 1982, 12 million Atari 2600 home consoles flew off store shelves, despite the nascent home gaming industry amounting to a paltry $4 billion.

This boom drove the creation of the Vectrex. The system was born at LA-based hardware design firm Smith Engineering. Envisioned as a portable system with a tiny, 1-inch cathode ray tube screen, the Vectrex concept ultimately grew into the 9-inch screen production version you see here.

Kenner Toys was initially slated to release the system, but when that deal fell through, General Consumer Electronics (GCE) stepped in and brought it to market in late 1982 after a successful debut at that summer’s Consumer Electronics Show. The Vectrex’s initial buzz was strong enough that Milton Bradley acquired GCE in 1983.

The Vectrex design was unique: a video game console wholly integrated into a portrait-oriented CRT. This was at a time when most households had just a single television set. Playing Atari back then meant fighting with your siblings and parents over who had control of the TV because missing an episode of The A-Team had real consequences. Not only was DVR technology still decades away, but Hollywood studios were still arguing in court that recording television programs on VCR cassettes was illegal.

But the real reason for the Vectrex’s integrated display was its reliance on a display technique not seen on a home system before (nor since). Vector graphics are a true rarity on the gaming scene. 1979’s Asteroids is probably the most famous example, while 1983’s Star Wars is far and away the most impressive.

Outside of a few exceptions, every video game you’ve ever played has been made up of a series of pixels. Whether it’s CRT, LCD, LED, or even OLED, you’re still talking about images made up of tiny dots of light. As the years have progressed, those pixels have gotten smaller and smaller. Likewise, the graphical power provided by advanced GPU systems like the GeForce RTX 4090 allows those pixels to assemble into ever-more realistic 3D worlds.

Ultimately, though, it’s all a bunch of pixels. On the Vectrex, there are no pixels. As its name implies, graphics here are all made up of vectors. That means straight beams of light drawn from A to B, electrons shot straight and narrow onto a cathode ray tube that glows in response. Connect three such lines, and you have a triangle, a simple polygon, the building block of all mainstream 3D gaming even today.
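The contrast with raster displays is easy to see in a toy model. This sketch is only an illustration of the two representations, not how a Vectrex actually stores its display list.

```python
# Raster: every screen position has a value, lit or not, even for blank areas.
raster = [[0] * 320 for _ in range(240)]  # 76,800 pixels for a small screen

# Vector: only the lines themselves are stored, as endpoint pairs.
# Three connected segments make a triangle, the basic polygon.
triangle = [
    ((0, 0), (100, 0)),    # A to B
    ((100, 0), (50, 80)),  # B to C
    ((50, 80), (0, 0)),    # C back to A
]

print(sum(len(row) for row in raster))  # 76800 stored values, mostly empty
print(len(triangle))                    # 3 segments, drawn as beams of light
```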

That lack of pixels means that, even 40 years on, watching a Vectrex game in action is an oddly captivating thing. There’s a fluidity in the rudimentary graphics, an innate sharpness that was not only lacking in other games of the period but that still looks novel today.

Overall fidelity, however, is admittedly low. Though color TVs were well and truly mainstream by 1983, the Vectrex is decidedly black and white, a problem “solved” by some crafty, budget-minded engineering. Most Vectrex titles came with a transparent overlay, a full-color sheet of plastic that clips in place over the display, injecting some hue into the unfortunately desaturated CRT.

Powering this was a relatively simple set of silicon with an 8-bit Motorola 6809 microprocessor at its heart, the same processor behind arcade classics like Robotron: 2084 and many later Williams pinball machines. It ran at a mighty 1 MHz with a whole 1KB of RAM at its disposal.

The chip was paired with an integrated control pad with an analog joystick, far more advanced than the four-way joysticks found on every other home console controller at the time.

All that specialized hardware led to a specialized price. The Vectrex launched in 1982 at $199—about $650 in 2023 dollars. Less than 18 months later, it was dead.

The collector

Sean Kelly is among the world’s preeminent video game collectors. “I’ve been collecting video games for a long time,” he told me. “I’ve had probably over 100,000 video games pass through my hands over the years.” At one point, he said, he had more than 50,000 in his garage.

If that sounds like an industrial operation rather than a mere obsession, you’re not wrong. Kelly is co-founder of the National Videogame Museum in Frisco, Texas, established in 2016 and home to many unicorns of video game collecting, like an original Nintendo World Championship cartridge.

Perhaps it was an affinity for another failed early ’80s console—the Intellivision—that initially fostered Kelly’s love of video games, but he’s had a huge hand in keeping the Vectrex alive. He began by releasing so-called multi-carts, Vectrex cartridges that contained multiple discrete games accessed first by toggling DIP switches and later via a software menu.

Considering many Vectrex titles saw limited releases or no release at all, multi-carts like this were the only way for those few die-hard fans of the system to ever have a chance of playing them.

One of those games was Mail Plane, where you plot optimal delivery routes, then load up the packages and navigate across the country.

Thanks to the Vectrex’s abrupt cancellation, Mail Plane never saw release. You’d be forgiven for thinking it had, though. At Sean’s website, VectrexMulti, you’ll find boxed copies of Mail Plane ready to order.

The game comes in the silvery packaging that was standard for Vectrex releases in its day and even comes with a light pen, a peripheral used for keying in those delivery routes.

Kelly sourced manufacturers for every aspect of the retail packaging. Different prototype versions of the game code were floating around, but Kelly says most were incomplete. “In addition to collecting the video games, I’ve also had a passion for hunting down the people that used to produce the games,” he said. This began a quest to find the most complete version of Mail Plane.

“We would find this former employee or that former employee had a couple of cartridges, and we would go through the cartridges and look at them,” Kelly said, and he ultimately sourced the one closest to final. “Nobody knows for sure if it’s 100 percent complete, but generally, we believe that that’s the most complete version.”

He gave other games the same treatment, including Tour de France, in which you frantically pedal across a polygonal route to Paris, grabbing water bottles along the way and carefully managing the stamina of your rider. It’s an odd title, one that Kelly laments hasn’t exactly been a sales success. “Tour de France is one of the ones that I will be buried with,” he said. “I lost money on Tour de France.”

Kelly declined to say which games have made money, but it’s clear in speaking to him that this is all about passion, not profits.

Along the way, releasing those games provided Kelly and his associates with some valuable experience ahead of a surprise: the discovery of a game that seemingly nobody, not even those who worked for GCE or Milton Bradley, had ever heard of.

Vectrex reborn: How a chance encounter gave new life to a dead console Read More »

Here’s how the EPA calculates how far an EV can go on a full charge

Aurich Lawson | Getty Images

How does the US Environmental Protection Agency decide how far an electric vehicle can go on a single charge? The simple explanation is that an EV is driven until the battery runs flat, providing the number that goes on the window sticker. In practice, it’s a lot more complicated than that, with varying test cycles, real-world simulations, and more variables than a book of Mad Libs, all in an effort to give you a number that you can count on to be consistent and comparable with other vehicles on the road.

The start of EPA mileage testing

The EPA started testing vehicle fuel economy in 1971, and that initial testing still plays a major role in how modern cars are measured.

The year before, President Richard Nixon signed the National Environmental Policy Act of 1969 (followed by the Clean Air Act of 1970) and established the EPA with a mandate that included lowering motor vehicle emissions. Part of the EPA’s plan to reduce emissions was to let buyers know just how much fuel a car would use so they could cross-shop cars effectively.

Testing started with a route called the Federal Test Procedure. The EPA adopted an 11-mile (18-km) route that was originally done on real roads in Los Angeles. The route had an average speed of 21 miles per hour (34 km/h) and a top speed of 56 mph (90 km/h). Tailpipe emissions were measured, fuel economy was calculated, and the “city” fuel economy rating was born.

By the time the 10-mile (16-km) Highway Fuel Economy Test was added in 1974, the tests were performed in a lab on a dynamometer. Running tests on the dyno made them more consistent and easier to repeat, though it wasn’t perfect.

Small changes and tweaks were made over the years, with the biggest change announced in 2005. That year, the EPA announced changes to the test to meet new highway speeds, account for heating and air conditioning use, and make the test more relevant to real-world driving. Drivers weren’t able to hit the published numbers, and the EPA wanted to fix that. The system was introduced for the 2008 model year and is largely the one we use today.

Modern range testing

Today, automakers have two test options for EVs. An automaker can opt for a “single cycle” test: the car drives the EPA city cycle over and over until the charge runs out, then, starting again from a full charge, does the same on the highway cycle. The process is repeated for reliability. The alternative is a multi-cycle test comprising four city cycles, two highway cycles, and two constant-speed cycles.
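As a rough sketch of how cycle results become a single label number: the EPA's combined figures conventionally weight city results at 55 percent and highway results at 45 percent, and two-cycle EV range results are typically scaled down (by roughly 0.7) to better match real-world driving. Neither number appears in this article, so treat the following as illustrative rather than the exact certification math.

```python
# Illustrative label-range calculation; the 55/45 weighting and ~0.7 adjustment
# are standard EPA conventions, but the test-result inputs are hypothetical.
def label_range(city_miles, highway_miles, adjustment=0.7):
    combined = 0.55 * city_miles + 0.45 * highway_miles
    return combined * adjustment

# Hypothetical results: 400 miles on the city cycle, 330 on the highway cycle.
print(round(label_range(400, 330)))  # 258
```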

Getty Images

The test cycles

The city cycle

The EPA’s Urban Dynamometer Driving Schedule is the official “city cycle” test loop. It is a complicated graph of time, vehicle speed, and allowable acceleration. The total test time is 1,369 seconds, the distance simulated is 7.45 miles (12 km), and the average speed is 19.59 mph (32.11 km/h). As with all of the tests, the exact speed required at each second of the test is laid out in a spreadsheet.

The highest speed reached on the test is 56.7 mph (91.25 km/h), and there are several periods where the vehicle sits stationary. Those stationary seconds made more sense when the test was designed to measure a gas vehicle’s idle emissions and consumption, but they still have some relevance today for climate control use and the energy needed to get the vehicle moving again.

The highway cycle

For higher speeds, vehicles complete the Highway Fuel Economy Driving Schedule (HFEDS). This test has a top speed of 59.9 mph (96.4 km/h) and an average of 48.3 mph (77.73 km/h), and it takes 765 seconds to complete.
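The quoted cycle figures are internally consistent, which is easy to confirm:

```python
# Cross-checking the UDDS and HFEDS numbers quoted above.
udds_miles, udds_seconds = 7.45, 1369
hfeds_avg_mph, hfeds_seconds = 48.3, 765

udds_avg_mph = udds_miles / (udds_seconds / 3600)     # about 19.59 mph, as stated
hfeds_miles = hfeds_avg_mph * (hfeds_seconds / 3600)  # about 10.26 miles

print(f"UDDS average speed: {udds_avg_mph:.2f} mph")
print(f"HFEDS distance: {hfeds_miles:.2f} miles")
```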

Only the UDDS and HFEDS tests are required to certify an EV. But a top speed of 59.9 mph is a much lower highway speed than most drivers will experience.

Driving more quickly or using climate control can greatly impact range. More tests were introduced to help give a more realistic range, and they’re part of the 5-cycle test covered below.

Here’s how the EPA calculates how far an EV can go on a full charge Read More »