Author name: Rejus Almole


Nvidia is ditching dedicated G-Sync modules to push back against FreeSync’s ubiquity

sync or swim —

But G-Sync will still require specific G-Sync-capable MediaTek scaler chips.

Nvidia

Back in 2013, Nvidia introduced a new technology called G-Sync to eliminate screen tearing and stuttering effects and reduce input lag when playing PC games. The company accomplished this by tying your display’s refresh rate to the actual frame rate of the game you were playing, and similar variable refresh-rate (VRR) technology has become a mainstay even in budget monitors and TVs today.

The issue for Nvidia is that G-Sync isn’t what has been driving most of that adoption. G-Sync has always required extra dedicated hardware inside displays, increasing costs for both users and monitor manufacturers. The VRR technology in most low-end to mid-range screens these days is usually some version of the royalty-free AMD FreeSync or the similar VESA Adaptive-Sync standard, both of which provide G-Sync’s most important features without requiring extra hardware. Nvidia more or less acknowledged in 2019 that the free-to-use, cheap-to-implement VRR technologies had won when it announced its “G-Sync Compatible” certification tier for FreeSync monitors. The list of G-Sync Compatible screens now vastly outnumbers the list of G-Sync and G-Sync Ultimate screens.

Today, Nvidia is announcing a change that’s meant to keep G-Sync alive as its own separate technology while eliminating the requirement for expensive additional hardware. Nvidia says it’s partnering with chipmaker MediaTek to build G-Sync capabilities directly into scaler chips that MediaTek is creating for upcoming monitors. G-Sync modules ordinarily replace these scaler chips, but they’re entirely separate boards with expensive FPGA chips and dedicated RAM.

These new MediaTek scalers will support all the same features that current dedicated G-Sync modules do. Nvidia says that three G-Sync monitors with MediaTek scaler chips inside will launch “later this year”: the Asus ROG Swift PG27AQNR, the Acer Predator XB273U F5, and the AOC AGON PRO AG276QSG2. These are all 27-inch 1440p displays with maximum refresh rates of 360 Hz.

As of this writing, none of these companies has announced pricing for these displays. The current Asus PG27AQN, which pairs a traditional G-Sync module with a 360 Hz refresh rate, goes for around $800, so we’d hope the new version is significantly cheaper, making good on Nvidia’s claim that the MediaTek chips will reduce costs. Even if they do, it remains to be seen whether monitor makers will pass those savings on to consumers.

For most people most of the time, there won’t be an appreciable difference between a “true” G-Sync monitor and one that uses FreeSync or Adaptive-Sync, but there are still a few fringe benefits. G-Sync monitors support a refresh rate between 1 and the maximum refresh rate of the monitor, whereas FreeSync and Adaptive-Sync stop working on most displays when the frame rate drops below 40 or 48 frames per second. All G-Sync monitors also support “variable overdrive” technology to help eliminate display ghosting, and the new MediaTek-powered displays will support the recent “G-Sync Pulsar” feature to reduce blur.
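That low-end behavior is easier to see with a toy model. A panel can only refresh within a fixed range, so below the floor, the driver (or, on G-Sync monitors, the module itself) repeats each frame enough times to stay in range; this is the idea behind FreeSync’s Low Framerate Compensation. The sketch below is purely illustrative, with made-up parameters, not Nvidia’s or AMD’s actual algorithm:

```python
def vrr_refresh(fps, vrr_min=48, vrr_max=360):
    """Illustrative only: pick a panel refresh rate for a given game frame rate.

    Below the VRR floor, frame multiplication (what FreeSync calls Low
    Framerate Compensation, and what G-Sync modules handle in hardware)
    shows each frame an integer number of times to stay inside the range.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)   # panel tracks the game directly
    repeats = -(-vrr_min // fps)   # ceiling division
    return fps * repeats          # e.g. 30 fps -> 60 Hz, each frame shown twice

print(vrr_refresh(30))   # 60
print(vrr_refresh(500))  # 360 (capped at the panel's maximum)
```

A display without this compensation simply drops out of VRR below its floor, which is why tearing or stutter can return on basic FreeSync screens in demanding scenes.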



Against all odds, an asteroid mining company appears to be making headway

Forging ahead —

“It’s not easy to ever raise for an asteroid mining company, right?”

The Odin spacecraft passed vibration testing.

Astro Forge

When I first spoke with space entrepreneurs Jose Acain and Matt Gialich a little more than two years ago, I wondered whether I would ever talk to them again.

That is not meant to be offensive; rather, it is a reflection of the fact that the business they entered into—mining asteroids for platinum and other precious metals—is a perilous one. To date, NASA and other space agencies have spent billions of dollars returning a few grams of rocky material from asteroids. Humanity has never visited a metal-rich asteroid, although that will finally change with NASA’s $1.4 billion Psyche mission in 2029. And so commercial asteroid mining seems like a stretch, and indeed, other similarly minded startups have come and gone.

But it turns out that I did hear from Acain and Gialich again about their asteroid mining venture, AstroForge. On Tuesday the co-founders announced that they have successfully raised $40 million in Series A funding and shared plans for their next two missions. AstroForge has now raised a total of $55 million to date.

“It was challenging,” Gialich said of the latest fundraising effort, in an interview. “It’s not easy to ever raise for an asteroid mining company, right? Let’s be honest. We talked two years ago and you told us this. And you were not wrong. So a big part of this funding round was just showing people that we can actually build a spacecraft.”

Making some mistakes

In April 2023, the company launched a shoebox-sized cubesat, Brokkr-1, on a SpaceX Transporter rideshare flight. Although the vehicle flew as intended for a while, AstroForge was unable to send the necessary commands to the spacecraft to initiate a demonstration of its space-based refining technology.

However, Gialich said AstroForge learned a lot from this mission and is working toward launching a second spacecraft named Odin. This will be a rideshare payload on the Intuitive Machines-2 mission, which is due to launch during the fourth quarter of this year. If successful, the Odin mission would be spectacular. About seven months after launching, Odin will attempt to fly by a near-Earth, metal-rich asteroid while capturing images and taking data—truly visiting terra incognita. Odin would also be the first private mission to fly by a body in the Solar System beyond the Moon.

It has not been an easy project to develop. In the name of expediency, AstroForge initially sought to develop this spacecraft by largely sourcing key components from outside suppliers—a practice known as horizontal integration. However, in March, the Odin spacecraft failed vibration testing. “Originally, our concept was to be different than SpaceX, and be horizontally integrated, not vertical,” Gialich said. “That was completely wrong. We have very much made changes there to be vertical.”

After the original vehicle failed vibration testing, which ensures it can survive the rigors of launch, AstroForge decided to bring forward a spacecraft being developed internally for the company’s third flight and use that for the Odin mission. To remain on track for a launch this year, the company had to complete vibration testing of the new, 100-kg Odin vehicle by August 1. AstroForge made that deadline but still must complete several other tests before shipping Odin to the launch pad.

Docking with an asteroid

On Tuesday, the company also announced plans for its third mission, Vestri (the company is naming its missions after Norse deities). This spacecraft will be about twice as large as Odin and is intended to return to the targeted metallic asteroid and dock with it. The docking mechanism is simple—since the asteroid is likely to be iron-rich, Vestri will use magnets to attach itself.

The plan is to use a mass spectrometer to sample and characterize the asteroid weekly until the spacecraft fails. AstroForge seeks to launch Vestri on another Intuitive Machines mission in 2025. Vestri’s goals are highly ambitious, as no private spacecraft has ever landed on a body beyond the Moon.

AstroForge is tracking several candidate asteroids as the target body for Odin and Vestri, Gialich said, each of which is about 400 meters across. He won’t make a final decision for several months. The company does not want to tip its hand due to the interest of potential competitors, including China-based Origin Space.

However, there is no shortage of potential targets. Scientists estimate that there are about 10 million near-Earth asteroids, which come within one astronomical unit (the distance between the Sun and Earth) of our planet. Perhaps 3 to 5 percent of these are rich in metals, so there are potentially hundreds of thousands of candidates for mining.



AMD signs $4.9 billion deal to challenge Nvidia’s AI infrastructure lead

chip wars —

Company hopes acquisition of ZT Systems will accelerate adoption of its data center chips.

Visitors walk past the AMD booth at the 2024 Mobile World Congress

AMD has agreed to buy artificial intelligence infrastructure group ZT Systems in a $4.9 billion cash and stock transaction, extending a run of AI investments by the chip company as it seeks to challenge market leader Nvidia.

The California-based group said the acquisition would help accelerate the adoption of its Instinct line of AI data center chips, which compete with Nvidia’s popular graphics processing units (GPUs).

ZT Systems, a private company founded three decades ago, builds custom computing infrastructure for the biggest AI “hyperscalers.” While the company does not disclose its customers, the hyperscalers include the likes of Microsoft, Meta, and Amazon.

The deal marks AMD’s biggest acquisition since it bought Xilinx for $35 billion in 2022.

“It brings a thousand world-class design engineers into our team, it allows us to develop silicon and systems in parallel and, most importantly, get the newest AI infrastructure up and running in data centers as fast as possible,” AMD’s chief executive Lisa Su told the Financial Times.

“It really helps us deploy our technology much faster because this is what our customers are telling us [they need],” Su added.

The transaction is expected to close in the first half of 2025, subject to regulatory approval, after which New Jersey-based ZT Systems will be folded into AMD’s data center business group. The $4.9 billion valuation includes up to $400 million contingent on “certain post-closing milestones.”

Citi and Latham & Watkins are advising AMD, while ZT Systems has retained Goldman Sachs and Paul, Weiss.

The move comes as AMD seeks to break Nvidia’s stranglehold on the AI data center chip market, which earlier this year saw Nvidia temporarily become the world’s most valuable company as big tech companies pour billions of dollars into its chips to train and deploy powerful new AI models.

Part of Nvidia’s success stems from its “systems” approach to the AI chip market, offering end-to-end computing infrastructure that includes pre-packaged server racks, networking equipment, and software tools to make it easier for developers to build AI applications on its chips.

AMD’s acquisition shows the chipmaker building out its own “systems” offering. The company rolled out its MI300 line of AI chips last year, and says it will launch its next-generation MI350 chip in 2025 to compete with Nvidia’s new Blackwell line of GPUs.

In May, Microsoft was one of the first AI hyperscalers to adopt the MI300, building it into its Azure cloud platform to run AI models such as OpenAI’s GPT-4. AMD’s quarterly revenue for the chips surpassed $1 billion for the first time in the three months to June 30.

But while AMD has feted the MI300 as its fastest-ever product ramp, its data center revenue still represented a fraction of the $22.6 billion that Nvidia’s data center business raked in for the quarter to the end of April.

In March, ZT Systems announced a partnership with Nvidia to build custom AI infrastructure using its Blackwell chips. “I think we certainly believe ZT as part of AMD will significantly accelerate the adoption of AMD AI solutions,” Su said, but “we have customer commitments and we are certainly going to honour those”.

Su added that she expected regulators’ review of the deal to focus on the US and Europe.

In addition to increasing its research and development spending, AMD says it has invested more than $1 billion over the past year to expand its AI hardware and software ecosystem.

In July the company announced it was acquiring Finnish AI start-up Silo AI for $665 million, the largest acquisition of a privately held AI startup in Europe in a decade.

© 2024 The Financial Times Ltd. All rights reserved. Please do not copy and paste FT articles and redistribute by email or post to the web.



Rocket Lab entered “hero mode” to finish Mars probes—now it’s up to Blue Origin

The two spacecraft for NASA’s ESCAPADE mission at Rocket Lab’s factory in Long Beach, California.

Two NASA spacecraft built by Rocket Lab are on the road from California to Florida this weekend to begin preparations for launch on Blue Origin’s first New Glenn rocket.

These two science probes must launch between late September and mid-October to take advantage of a planetary alignment between Earth and Mars that only happens once every 26 months. NASA tapped Blue Origin, Jeff Bezos’ space company, to launch the Escape and Plasma Acceleration and Dynamics Explorers (ESCAPADE) mission with a $20 million contract.

Last November, the space agency confirmed the $79 million ESCAPADE mission will launch on the inaugural flight of Blue Origin’s New Glenn rocket. With this piece of information, the opaque schedule for Blue Origin’s long-delayed first New Glenn mission suddenly became more clear.

The launch period opens on September 29. The two identical Mars-bound spacecraft for the ESCAPADE mission, nicknamed Blue and Gold, are now complete. Rocket Lab announced Friday that its manufacturing team packed the satellites and shipped them from their factory in Long Beach, California. Over the weekend, they arrived at a clean room facility just outside the gates of NASA’s Kennedy Space Center in Florida, where technicians will perform final checkups and load hydrazine fuel into both spacecraft, each a little more than a half-ton in mass.

Then, if Blue Origin is ready, ground teams will connect the ESCAPADE spacecraft with the New Glenn’s launch adapter, encapsulate the probes inside the payload fairing, and mount them on top of the rocket.

“There’s a whole bunch of checking and tests to make sure everything’s OK, and then we move into fueling, and then we integrate with the launch vehicle. So it’s a big milestone,” said Rob Lillis, the mission’s lead scientist from the University of California Berkeley’s Space Science Laboratory. “There have been some challenges along the way. This wasn’t easy to make happen on this schedule and for this cost. So we’re very happy to be where we are.”

Racing to the finish line

But there’s a lot for Blue Origin to accomplish in the next couple of months if the New Glenn rocket is going to be ready to send the ESCAPADE mission toward Mars in this year’s launch period. Blue Origin has not fully exercised a New Glenn rocket during a launch countdown, hasn’t pumped a full load of cryogenic propellants into the launch vehicle, and hasn’t test-fired a full complement of first stage or second stage engines.

These activities typically take place months before the first launch of a large new orbital-class rocket. For comparison, SpaceX test-fired its first fully assembled Falcon 9 rocket on the launch pad about three months before its first flight in 2010. United Launch Alliance completed a hot-fire test of its new Vulcan rocket on the launch pad last year, about seven months before its inaugural flight.

However, Blue Origin is making visible progress toward the first flight of New Glenn, after years of speculation and few outward signs of advancement. Earlier this year, the company raised a full-scale, 320-foot-tall (98-meter) New Glenn rocket on its launch pad at Cape Canaveral Space Force Station and loaded it with liquid nitrogen, a cryogenic substitute for the methane and liquid hydrogen fuel it will burn in flight.



Google denies reports that it’s discontinuing Fitbit products

Fitbit lives on … for now —

Claims that there will be no new Versas or Senses are incorrect, rep says.

The Fitbit Sense 2.

Google

Google is denying a recent report that it is no longer making Fitbit smartwatches. A company spokesperson told Ars Technica today that Google has no current plans to discontinue the Fitbit Sense or Fitbit Versa product lines.

On Sunday, TechRadar published an article titled “RIP Fitbit smartwatches—an end we could see coming a mile away.” The article noted last week’s announcement of the new Google Pixel Watch 3. Notably, the watch from Google, which announced its acquisition of Fitbit in 2019, gives users free access to the Daily Readiness Score, a feature that previously required a Fitbit Premium subscription (Pixel Watch 3 owners also get six free months of Fitbit Premium). The publication said that Fitbit has been “consigned to wearable history” and reported:

Google quietly confirmed that there would never be another Fitbit Sense or Versa model produced. From now on, Fitbit-branded devices will be relegated to Google’s best fitness trackers: the Fitbit Inspire, Luxe, and Charge ranges. The smartwatch form factor would be exclusively reserved for the Pixel Watch line.

The story followed a report from Engadget last week, in which the publication said that “moving forward everything from Fitbit would focus on the more minimalistic, long-lasting trackers the brand has become synonymous with,” citing a conversation with the senior director of product management for Pixel Wearables, Sandeep Waraich. “Pixel Watches are our next iteration of smartwatch for Fitbit,” he reportedly said.

When reached for comment, however, a Google spokesperson told me that the TechRadar story is “not correct” and shared the following statement:

We are very committed to Fitbit, and even more importantly to the customers that use and depend on those products and technology. It’s also worth noting that many of the health and fitness features we launched in Pixel Watch 3 were because of Fitbit’s innovation and ground-breaking fitness advancements. In addition, we just launched Fitbit Ace LTE, [a smartwatch for kids released on June 5], and you’ll continue to see new products and innovation from Fitbit.

While the company rep told me that they could not confirm a specific upcoming Sense or Versa model or any other specifics about Google’s product road map, they claimed that Google hasn’t discontinued the lines.

Fitbit fears

TechRadar’s concerns about Fitbit smartwatches dying also stem from the Sense 2 and Versa 4 lacking some features of their predecessors, including ways to control music or access music apps. The Pixel Watch, meanwhile, supports music apps like YouTube Music, Spotify, and Pandora. “Once Google completed its acquisition in January 2021 and debuted its first Pixel Watch in 2022, the Versa and the Sense watches were holdovers of a bygone era,” TechRadar wrote.

Google also has more than its fair share of dead products, prompting Fitbit fans to be wary about the future of the smartwatch brand.

However, Google’s spokesperson noted that “part of everything that we just launched from Pixel Watch is based on Fitbit technology, so it is not going anywhere.”

While Fitbit tech and perhaps its name may live on, it’s reasonable to question the brand’s longevity. Concerns about Google discontinuing Fitbit smartwatches have been fueled by Google taking Fitbit features and incorporating them into Google-branded watches. Google has also discontinued various beloved Fitbit features, including the Fitbit.com online dashboard, social features, and the ability to sync with computers. Google also previously announced that it’s closing all Fitbit accounts (forcing users onto Google accounts) next year, and it shut down the Fitbit SDK for app development.

Google’s Fitbit reputation has been further damaged by widely reported battery problems that some Charge 5 users have experienced. Google denied that the quick-dying battery issue stemmed from a firmware update but never publicly confirmed what it believes the problem is. This Google-fication of Fitbit has led long-term customers to publicly complain about Google allegedly reducing customer support and care for Fitbit users.

At this time, Google isn’t announcing the end of any Fitbit product lines. But it remains possible that if future devices arrive, they may lack the features of previous Fitbits or Pixel Watches. The Fitbit brand isn’t dead, but Fitbit, as people knew it before Google’s acquisition, is no more.

This article was updated with information from Engadget. 



$50 2GB Raspberry Pi 5 comes with a lower price and a tweaked, cheaper CPU

cheaper pi —

Despite changes, 2GB Pi 5 is “functionally identical” to other iterations.

The 8GB Raspberry Pi 5 with the official fan and heatsink installed.

Andrew Cunningham

We’re many months past the worst of the Raspberry Pi shortages, and the board is finally widely available at its suggested retail price at most sites without wait times or quantity limitations. One sign that the Pi Foundation is feeling more confident about the stock situation: the launch of a new 2GB configuration of the Raspberry Pi 5, available starting today for $50. That’s $10 less than the 4GB configuration and $30 less than the 8GB version of the board.

Raspberry Pi CEO Eben Upton writes that the 2GB version of the board includes a revised version of the Broadcom BCM2712C1 SoC that is slightly cheaper to manufacture. Upton says that the D0 stepping of the BCM2712C1 strips out some “dark silicon” built-in functionality that the Pi wasn’t using but was still taking up space on the silicon die and increasing the cost of the chip.

“From the perspective of a Raspberry Pi user, [the chip] is functionally identical to its predecessor: the same fast quad-core processor; the same multimedia capabilities; and the same PCI Express bus that has proven to be one of the most exciting features of the Raspberry Pi 5 platform,” Upton writes. “However, it is cheaper to make, and so is available to us at somewhat lower cost. And this, combined with the savings from halving the memory capacity, has allowed us to take $10 out of the cost of the finished product.”

At $50, the price tag is still north of the baseline $35 price that the Pi started at for many years. The Pi 4 had a 1GB model for $35 when it launched, and there was a $35 2GB model available for a while in 2020, but widespread shortages and supply chain issues led to a “temporary” price increase in late 2021 that is, as of this writing, still in place. At least the 2GB Pi 5 is only $5 more expensive than the 2GB version of the Pi 4, which is still in stock for $45 at many retailers.

Though you’ll want a fully fledged 8GB Raspberry Pi if you want to try using one as an everyday desktop PC, there are plenty of Pi use cases that will benefit from its additional speed and connectivity options without needing more RAM. Retro emulation boxes aren’t necessarily RAM-hungry but can benefit from the Pi 5’s extra CPU and GPU speed, and many types of lightweight server apps (Wireguard, Homebridge, Pi-hole, to name a few) can benefit from the faster Wi-Fi and Ethernet and improved support for more reliable NVMe storage.

All that said, for just $10 more, we’d still probably point most people to the more flexible and future-proof 4GB version. The Pi boards sitting around my house have all lived multiple lives at this point, picking up new tasks as my needs have changed, and new Pi boards have come out—if your Pi project today won’t benefit from more RAM, it’s possible that tomorrow’s Pi project will.

The 2GB Pi 5 is available for order from outlets like PiShop and CanaKit and should filter out to other Pi retailers soon.



New Windows 11 build removes ancient, arbitrary 32GB size limit for FAT32 disks

getting fat —

But the Windows NT-era disk formatting UI hasn’t been fixed yet.

If you’ve formatted a disk in Windows in the last 30 years, you may have come across this dialog box.

Andrew Cunningham

As we wait for this fall’s Windows 11 24H2 update to be released to the general public, work continues on other new features that could be part of other future Windows updates. A new Canary channel Windows Insider build released yesterday fixes a decades-old and arbitrary limitation that restricted new FAT32 partitions to 32GB in size, even though the filesystem itself has a maximum supported size of 2TB (and Windows can read and recognize 2TB FAT32 partitions without an issue).

For now, this limit is only being lifted for the command-line formatting tools in Windows. The disk formatting UI, which looks more or less the same now as it did when it was introduced in Windows NT 4.0 almost 30 years ago, still has the arbitrary 32GB capacity restriction.
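For illustration (the drive letter below is a placeholder), on a build with the change, the built-in command-line tool should now be able to create a FAT32 volume larger than 32GB directly, something like:

```
format X: /FS:FAT32 /Q
```

On older builds, running that against a partition bigger than 32GB fails with “The volume is too big for FAT32,” even though Windows has no trouble reading a larger FAT32 volume that something else formatted.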

The 32GB limit can allegedly be pinned on former Microsoft programmer Dave Plummer, who occasionally shares stories about his time working on Windows in the 1990s and early 2000s. Plummer says that he wrote the file format dialog, intending it as a “temporary” solution, and arbitrarily chose 32GB as a size limit for disks, likely because it seemed big enough at the time (Windows NT 4.0 required a whopping 110MB of disk space).

There aren’t a ton of reasons to actually use a FAT32 disk in 2024, and it’s been replaced by other filesystems for just about everything. As a filesystem for your main OS drive, it was replaced by NTFS decades ago; as a widely compatible filesystem for external drives that can be read from and written to by many operating systems, you’d probably want to use exFAT instead. FAT32 still has a 4GB limit on the size of individual files.

But if you’re formatting a disk to use with an old version of Windows, or with some older device that can only work with FAT32 disks, this tweak could make Windows a tiny bit more useful for you.

Listing image by Alpha Six



ISP to Supreme Court: We shouldn’t have to disconnect users accused of piracy

A pair of scissors cutting an Ethernet cable.

A large Internet service provider wants the Supreme Court to rule that ISPs shouldn’t have to disconnect broadband users who have been accused of piracy. Cable firm Cox Communications, which is trying to overturn a ruling in a copyright infringement lawsuit brought by Sony, petitioned the Supreme Court to take up the case yesterday.

Cox said in a press release that a recent appeals court ruling “would force ISPs to terminate Internet service to households or businesses based on unproven allegations of infringing activity, and put them in a position of having to police their networks—contrary to customer expectations… Terminating Internet service would not just impact the individual accused of unlawfully downloading content, it would kick an entire household off the Internet.”

The case began in 2018 when Sony and other music copyright holders sued Cox, claiming that it didn’t adequately fight piracy on its network and failed to terminate repeat infringers. A US District Court jury in the Eastern District of Virginia ruled in December 2019 that Cox must pay $1 billion in damages to the major record labels.

Digital rights groups such as the Electronic Frontier Foundation (EFF) objected to the ruling, saying it “would result in innocent and vulnerable users losing essential Internet access.” The case went to the US Court of Appeals for the 4th Circuit, which vacated the $1 billion damages award in February 2024 but upheld one of the major copyright infringement verdicts.

Specifically, the appeals court affirmed the jury’s finding that Cox was guilty of willful contributory infringement and reversed a verdict on vicarious infringement. The vicarious liability verdict was scrapped “because Cox did not profit from its subscribers’ acts of infringement.”

Cox wants ruling on contributory infringement

On the contributory infringement charge, appeals court judges indicated that their hands were tied in part by Cox’s failure to make a key argument to the District Court. Proving “contributory infringement by an Internet service provider based on its subscribers’ direct infringement” can be achieved by showing “willful blindness,” the court said.

“Cox did not argue to the district court, as it does now on appeal, that notices of past infringement failed to establish its knowledge that the same subscriber was substantially certain to infringe again… Because Cox did not press this argument in the district court, it is forfeited for appeal,” the appeals court said. In District Court, Cox argued that copyright infringement notices sent to the ISP were too vague.

The Supreme Court held in MGM v. Grokster, in 2005, that “One who distributes a device with the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement, going beyond mere distribution with knowledge of third-party action, is liable for the resulting acts of infringement by third parties using the device, regardless of the device’s lawful uses.”

In its Supreme Court petition yesterday, Cox said that circuit appeals courts “have split three ways over the scope of that ruling, developing differing standards for when it is appropriate to hold an online service provider secondarily liable for copyright infringement committed by users.”

Cox asked justices to decide whether the 4th Circuit “err[ed] in holding that a service provider can be held liable for ‘materially contributing’ to copyright infringement merely because it knew that people were using certain accounts to infringe and did not terminate access, without proof that the service provider affirmatively fostered infringement or otherwise intended to promote it.”

The case raises one other major question, Cox told SCOTUS:

Generally, a defendant cannot be held liable as a willful violator of the law—and subject to increased penalties—without proof that it knew or recklessly disregarded a high risk that its own conduct was illegal. In conflict with the Eighth Circuit, the Fourth Circuit upheld an instruction allowing the jury to find willfulness if Cox knew its subscribers’ conduct was illegal—without proof Cox knew its own conduct in not terminating them was illegal.

Justices should rule on whether the 4th Circuit “err[ed] in holding that mere knowledge of another’s direct infringement suffices to find willfulness,” Cox said.



Smart sous vide cooker to start charging $2/month for 10-year-old companion app

Anova Precision Cooker 3.0

Anova, a company that sells smart sous vide cookers, is getting backlash from customers after announcing that it will soon charge a subscription fee for the device’s companion app.

Sous vide cooking, per Ars Technica sister site Bon Appétit, “is the process of sealing food in an airtight container—usually a vacuum sealed bag—and then cooking that food in temperature-controlled water.” Sous vide translates from French to “under vacuum,” and this cooking method ensures that the water stays at the desired temperature for the ideal cook.

Anova was founded in 2013 and sells sous vide immersion circulators. Its current third-generation Precision Cooker 3.0 has an MSRP of $200. Anova also sells a $149 model and a $400 version that targets professionals. It debuted the free Anova Culinary App in 2014.

In a blog post on Thursday, Anova CEO and cofounder Stephen Svajian announced that starting on August 21, people who sign up to use the Anova Culinary App with the cooking devices will have to pay $2 per month, or $10 per year. The app does various things depending on the paired cooker, but it typically offers sous vide cooking guides, cooking notifications, and the ability to view, save, bookmark, and share recipes.

The subscription fee will only apply to people who make an account after August 21. Those who downloaded the app and made an account before August 21 won’t have to pay. But everyone will have to make an account; some people have been using the app without one until now.

“You helped us build Anova, and our intent is that you will be grandfathered in forever,” Svajian wrote.

According to Svajian, the subscription fees are necessary so Anova can “continue delivering the exceptional service and innovative recipes” and “maintain and enhance the app, ensuring it remains a valuable resource.”

As Digital Trends pointed out, the announcement follows an Anova statement saying it will no longer let users remotely control their kitchen gadgets via Bluetooth starting on September 28, 2025. This means that remote control via the app will only be possible for models with Wi-Fi connectivity. Owners of affected devices will no longer be able to access them via the Anova app, get notifications, or use status monitoring. Users will still be able to manually set the time, temperature, and timer on the device itself.

Customers are heated

Changing or removing features of a tech gadget people have already purchased is a risky move that can anger customers who have paid for a device they expected to work a certain way indefinitely.

As of this writing, there are 104 comments under Anova’s blog post, with many posters saying they will not purchase or recommend another Anova device because of the changes. Many echo a commenter named Nathan Johnson, who wrote, “You’ve just lost a LONGTIME and very faithful customer.”

Another commenter going by Tony Nguyen wrote, “Charging a subscription fee for feature that was free before is anti-consumer. I will never buy another Anova product again and will share with everyone I know how terrible and greedy this company is. You’ve lost me and all my family and friends as customer…”

Facing “financial crisis,” Russia on pace for lowest launch total in 6 decades

SMO fallout —

“This forces us to build a new economy in severe conditions.”

A Soyuz 2.1b rocket booster with a Frigate upper stage block, the Meteor-M 2-1 meteorological satellite, and 18 small satellites launched from the Vostochny Cosmodrome.

Yuri Smityuk/TASS

A Progress cargo supply spacecraft launched from the Baikonur Cosmodrome in Kazakhstan early on Thursday, local time. The mission was successful, and Russia has launched hundreds of these spacecraft before. So it wasn’t all that big of a deal, except for one small detail: This was just Russia’s ninth orbital launch of the year.

At this pace, the country’s space program is headed for the fewest Russian or Soviet space launches in a year since 1961, when Yuri Gagarin went to space at the dawn of the human spaceflight era.

There are myriad reasons for this, including a decision by Western space powers to distance themselves from the Russian space corporation, Roscosmos, after the invasion of Ukraine. This has had disastrous effects on the Russian space program, but only recently have we gotten any insight into how deep those impacts have cut.

In recent weeks, the first deputy director of Roscosmos, Andrei Yelchaninov, has given a series of interviews to Russian news outlets. (Most Russian media are state-owned or state-controlled, so none of this information can be independently verified, but it is interesting nonetheless.) One of the most revealing of these interviews was given to national news agency Interfax. It was translated for Ars by Rob Mitchell and provides perspective on Russia’s space crisis and how the country will seek to rebound.

A financial crisis

“We are in an ongoing process of emerging from financial crisis, and it’s complicated,” Yelchaninov told Interfax. “I would remind you that contract cancellations by unfriendly contacts cost Roscosmos 180 billion rubles ($2.1 billion US). This forces us to build a new economy in severe conditions.”

As a result, Russia’s space industry has been operating at a loss in recent years and may not break even until 2025. Russia’s invasion of Ukraine also came as United Launch Alliance finally ended its practice of purchasing RD-180 rocket engines, manufactured by NPO Energomash. This, in concert with decreased commercial demand for Russia’s Proton and Soyuz rockets, has forced the Russian government to subsidize these elements of Roscosmos.

These companies “are currently in a financial revitalization procedure and have received State subsidies several years ago in order to maintain viability, and are now seeking new sales markets and additional workload,” Yelchaninov said. Asked about possibly selling more Russian-made engines to the United States, Yelchaninov replied, “That issue is not on the agenda.”

Russia had to look to new sales markets after what Yelchaninov euphemistically refers to as the “special military operation,” which is Russia’s term of art for its war against Ukraine. “After the beginning of the SMO we were forced to shift from our traditional partners in Europe and the US, with whom we had many years of interaction, for new international directions including the countries in Africa, the Mideast, and Southeast Asia,” he said.

During the interview, Yelchaninov confirmed that Russia has committed to participating in the International Space Station program until “at least” 2028. NASA is pushing to extend the operational lifetime of the station to 2030, at which point the United States plans to de-orbit the aging laboratory using a modified Crew Dragon spacecraft.

Rather than working with the United States in space, Yelchaninov said that Russia’s space program would focus on cooperation, not competition, with China. “The key project of our bilateral cooperation is creating an International Lunar Station to which we are jointly striving to attract additional international partners,” he said.

Big plans, big delays?

Russia is also continuing development of its oft-delayed “Russian Orbital Station,” or ROS. Current plans call for the launch of a scientific and power module in 2027, with the station’s four-module core to be in orbit by 2030 and further expansions in the early 2030s. It should be noted, however, that these dates can charitably be described as aspirational.

Even more speculatively, Yelchaninov mentioned several future rocket projects, including the Amur-LNG vehicle and the Corona rocket.

In 2020, Russia aimed to debut the methane-powered Amur rocket, with a reusable first stage, by 2026. The vehicle was conceived to be cost-competitive with SpaceX’s Falcon 9 rocket. Yelchaninov said Roscosmos now intends to develop first-stage reuse in two phases: a Grasshopper-like program would test landing technologies before moving to experiments with a complete booster. But don’t expect to see Amur any time soon. Yelchaninov revealed that Russian and Kazakh officials are still in the design phase for a launch site at Baikonur, rather than actively building anything.

Yelchaninov also said Roscosmos would like to develop a single-stage-to-orbit rocket named Corona in the future. This appears to be an updated take on a Russian rocket design that is more than three decades old.

“We have already studied whether or not a new booster of this type will be in demand,” Yelchaninov said. “The answer is obvious—we are reducing the cost of access to space by more than an order of magnitude and discovering entirely new opportunities for super-operational delivery of cargo, and we are moving toward an ideology of space as a service.”

I would not hold my breath on seeing Corona fly.

NASA shuts down asteroid-hunting telescope, but a better one is on the way

Prolific —

The NEOWISE spacecraft is on a course to fall out of orbit in the next few months.

Artist’s illustration of NASA’s Wide-field Infrared Survey Explorer spacecraft.

Last week, NASA decommissioned a nearly 15-year-old spacecraft that discovered 400 near-Earth asteroids and comets, closing an important chapter in the agency’s planetary defense program.

From its position in low-Earth orbit, the spacecraft’s infrared telescope scanned the entire sky 23 times and captured millions of images, initially searching for infrared emissions from galaxies, stars, and asteroids before focusing solely on objects within the Solar System.

Wising up to NEOs

The Wide-field Infrared Survey Explorer, or WISE, spacecraft launched in December 2009 on a mission originally designed to last seven months. Once WISE completed checkouts and ended its primary all-sky astronomical survey, NASA put the spacecraft into hibernation in 2011 because its supply of frozen hydrogen coolant had run out, reducing the sensitivity of its infrared detectors. But astronomers saw that the telescope could still detect objects closer to Earth, and NASA reactivated the mission in 2013 for another decade of observations.

The reborn mission was known as NEOWISE (Near-Earth Object Wide-field Infrared Survey Explorer). Its purpose was to use the spacecraft’s infrared vision to detect faint asteroids and comets on trajectories that bring them close to Earth.

“We never thought it would last this long,” said Amy Mainzer, NEOWISE’s principal investigator from the University of Arizona and UCLA.

Ground controllers at NASA’s Jet Propulsion Laboratory in California sent the final command to the NEOWISE spacecraft on August 8. The spacecraft, currently at an altitude of about 217 miles (350 kilometers), is falling out of orbit as atmospheric drag slows it down. NASA expects the spacecraft will reenter the atmosphere and burn up before the end of this year, a few months earlier than expected, due to higher levels of solar activity, which causes expansion in the upper atmosphere. The satellite doesn’t have its own propulsion to boost itself into a higher orbit.

“The Sun’s just been incredibly quiet for many years now, but it’s picking back up, and it was the right time to let it go,” Mainzer told Ars.

Astronomers have used ground-based telescopes to discover most of the near-Earth objects detected so far. But there’s an advantage to using a space-based telescope, because Earth’s atmosphere absorbs most of the infrared energy coming from faint objects like asteroids.

With ground-based telescopes, astronomers are “predominantly seeing sunlight reflecting off the surfaces of the objects,” Mainzer said. NEOWISE measures thermal emissions from the asteroids, giving scientists information about their sizes. “We can actually get pretty good measurements of size with relatively few infrared measurements,” she said.
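To make that size ambiguity concrete: a widely used relation ties an asteroid’s diameter D (in kilometers) to its geometric albedo p_V and its absolute magnitude H, D = (1329/√p_V) × 10^(−H/5). The short Python sketch below (the H = 22 object and the albedo values are hypothetical examples, not NEOWISE data) shows how the same reflected-light brightness is consistent with very different sizes depending on how dark the surface is, which is exactly the degeneracy a thermal infrared measurement helps break.

```python
import math

def diameter_km(abs_magnitude_h: float, albedo: float) -> float:
    """Standard relation between absolute magnitude H, geometric
    albedo p_V, and diameter D: D = (1329 / sqrt(p_V)) * 10**(-H/5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-abs_magnitude_h / 5)

# A hypothetical asteroid with H = 22, observed only in reflected light:
# the same brightness fits very different sizes depending on albedo.
h = 22.0
bright_surface = diameter_km(h, albedo=0.25)  # reflective rock, ~0.1 km
dark_surface = diameter_km(h, albedo=0.03)    # very dark surface, ~0.3 km
print(f"albedo 0.25 -> {bright_surface:.2f} km")
print(f"albedo 0.03 -> {dark_surface:.2f} km")
```

A factor-of-three uncertainty in diameter is a factor of roughly 27 in mass, which is why a direct thermal size measurement matters for hazard assessment.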

The telescope on NEOWISE was relatively modest in size, with a 16-inch (40-centimeter) primary mirror, roughly one-sixteenth the diameter of the James Webb Space Telescope’s mirror. But its wide field of view allowed NEOWISE to scour the sky for infrared light sources, making it well-suited for studying large populations of objects. One of the mission’s most famous discoveries was a comet officially named C/2020 F3, more commonly known as Comet NEOWISE, which became visible to the naked eye in 2020. As the comet moved closer to Earth, large telescopes like Hubble were able to take a closer look.

“The NEOWISE mission has been an extraordinary success story as it helped us better understand our place in the universe by tracking asteroids and comets that could be hazardous for us on Earth,” said Nicola Fox, associate administrator of NASA’s science mission directorate.

What’s out there?

The original mission of WISE and the extended survey of NEOWISE combined to discover 366 near-Earth asteroids and 34 comets, according to the Center for Near-Earth Object Studies. Of these, 64 were classified as potentially hazardous asteroids, meaning they come within 4.65 million miles (7.48 million kilometers) of Earth and are at least 460 feet (140 meters) in diameter. These are the objects astronomers want to find and track in order to predict if they pose a risk of colliding with Earth.
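The “potentially hazardous” label is just the two thresholds above applied together; a minimal sketch in Python (the function name and example values are illustrative, not an official classifier):

```python
AU_KM = 149_597_870.7  # kilometers per astronomical unit

def is_potentially_hazardous(min_distance_km: float, diameter_m: float) -> bool:
    """Potentially hazardous asteroid: comes within 0.05 AU
    (~7.48 million km) of Earth and is at least 140 meters across."""
    return min_distance_km <= 0.05 * AU_KM and diameter_m >= 140.0

# The Chelyabinsk impactor (~20 m) was too small to qualify,
# even though it actually reached Earth's atmosphere.
print(is_potentially_hazardous(min_distance_km=0.0, diameter_m=20.0))         # False
print(is_potentially_hazardous(min_distance_km=5_000_000, diameter_m=300.0))  # True
```

The 140-meter cutoff is a survey-completeness target, not a safety guarantee: smaller objects can still do serious local damage, as Chelyabinsk showed.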

There are roughly 2,400 known potentially hazardous asteroids, but there are more lurking out there. Another advantage of using space-based telescopes to search for these asteroids is that they can observe 24 hours a day, while telescopes on the ground are limited to nighttime surveys. Some hazardous asteroids, such as the house-sized object that exploded in the atmosphere over Chelyabinsk, Russia, in 2013, approach Earth from the direction of the Sun. A space telescope has a better chance of finding these kinds of asteroids.

WISE, and then the extended mission of NEOWISE, helped scientists estimate there are approximately 25,000 near-Earth objects.

“The objects (NEOWISE) did discover tended to be overwhelmingly just dark, [and] these are the objects that are much more likely to be missed by the ground-based telescopes,” Mainzer said. “So that, in turn, gives us a much better idea of how many are really out there.”

Scientists solved mysterious origin of Stonehenge’s Altar Stone: Scotland

The Altar Stone at Stonehenge weighs roughly 6 tons and was probably transported by land—or possibly by sea.

English Heritage

The largest of the “bluestones” that comprise the inner circle at Stonehenge is known as the Altar Stone. Like its neighbors, scientists previously thought the stone had originated in western Wales and been transported some 125 miles to the famous monument that still stands on the Salisbury Plain in Wiltshire, England. But a new paper published in the journal Nature came to a different conclusion based on fresh analysis of its chemical composition: The Altar Stone actually hails from the very northeast corner of Scotland.

“Our analysis found specific mineral grains in the Altar Stone are mostly between 1,000 to 2,000 million years old, while other minerals are around 450 million years old,” said co-author Anthony Clarke, a graduate student at Curtin University in Australia, who grew up in Mynydd Preseli in Wales—origin of most of the bluestones—and first visited the monument when he was just a year old. “This provides a distinct chemical fingerprint suggesting the stone came from rocks in the Orcadian Basin, Scotland, at least 750 kilometers [450 miles] away from Stonehenge.”

As previously reported, Stonehenge consists of an outer circle of vertical sandstone slabs (sarsen stones), connected on top by horizontal lintel stones. There is also an inner ring of smaller bluestones and, within that ring, several free-standing trilithons (larger sarsens joined by one lintel). Radiocarbon dating indicates that the inner ring of bluestones was set in place between 2400 and 2200 BCE. But the standing arrangement of sarsen stones wasn’t erected until around 500 years after the bluestones.

No contemporary written records exist concerning the monument’s construction, and scholars have pondered its likely use and cultural significance for centuries. Stonehenge’s form (and maybe its purpose) changed several times over the centuries, and archaeologists are still trying to piece together the details of its story and the stories of the people who built it and gathered in its circles.

In 2019, archaeologist Mike Parker Pearson and several colleagues reported the results of their investigation into the quarry source for the bluestones. They found that the 42 bluestones came all the way from western Wales. Chemical analysis has even matched some of them to two particular quarries on the northern slopes of the Preseli Hills.

One quarry, an outcrop called Carn Goedog, seems to have supplied most of the bluish-gray, white-speckled dolerite at Stonehenge. And another outcrop in the valley below, Craig Rhos-y-felin, supplied most of the rhyolite. When another group of archaeologists studied the chemical isotope ratios in the cremated remains of people once buried beneath the bluestones, those researchers found that many of those people may have come from the same part of Wales between 3100 and 2400 BCE.

But the sarsen stones hail from much closer to home. Since the 1500s, most Stonehenge scholars have assumed the sarsen stones came from nearby Marlborough Downs, an area of round, grassy hills 25 to 30 km (16 to 19 miles) north of Stonehenge, which has the largest concentration of sarsen in the UK. A 2020 study by University of Brighton archaeologist David Nash and colleagues confirmed that.

The arrangement of stones at Stonehenge, color-coded to show where they came from.

English Heritage/Curtin University

Fifty of the sarsens shared very similar chemical fingerprints, which means they probably all came from the same place, most likely one site in the southeastern Marlborough Downs: West Woods, about 25 km (16 miles) north of Stonehenge and just 3 km (2 miles) south of where most earlier studies had looked for Neolithic sarsen quarries. The other two surviving sarsens came from two different places, which archaeologists haven’t pinpointed yet.
