data centers


Trump gets data center companies to pledge to pay for power generation

On Wednesday, the Trump administration announced that a large collection of tech companies had signed on to what it’s calling the Ratepayer Protection Pledge. By agreeing, the initial signatories—Amazon, Google, Meta, Microsoft, OpenAI, Oracle, and xAI—are saying they will pay for the new generation and transmission capacities needed for any additional data centers they build. But the agreement has no enforcement mechanism, and it will likely run into issues with hardware supplies. It also ignores basic economics.

Other than that, it seems like a great idea.

What’s being agreed to

The agreement is quite simple, laying out five points. The key ones are the first three: that the companies building data centers pledge to pay for new generating capacity, either building it themselves or paying for it as part of a new or expanded power plant. They’ll also pay for any transmission infrastructure needed to connect their data centers and the new supply to the grid and will cover these costs whether or not the power ultimately gets used by their facilities.

The companies also pledge to consider allowing the local grid to use on-site backup generators to handle emergency power shortages affecting the community. They will also hire and train locally when they build new data centers.

The agreement suggests that these promises will protect American consumers from price hikes due to the expansion of data centers and will somehow “lower electricity costs for consumers in the long term.” How that will happen is not specified.

Also missing from the agreement is any sort of enforcement mechanism. If a company decides to ignore the agreement, the worst it is guaranteed to suffer is bad publicity, something these companies already have experience handling. That said, Trump has been known to resort to blatantly illegal tactics to pressure companies to conform to his wishes, so ignoring the agreement carries risks.

That’s important because the companies will struggle to live up to the agreement. (Though Google, for its part, told Ars that it has typically followed the guidelines as a normal part of its process for building new data centers.)



Iowa county adopts strict zoning rules for data centers, but residents still worry


Though the rules are among the strictest in the US, locals say they aren’t enough.

A rendering of the QTS data center currently under construction in Cedar Rapids, Iowa. Credit: QTS

PALO, Iowa—There are two restaurants in Palo, not counting the chicken wings and pizza sold at the only gas station in town.

All three establishments, including the gas station, stand on the same half-mile stretch of First Street, an artery that divides the marshy floodplain of the Cedar River to the east from hundreds of acres of cornfields on the west.

During historic flooding in 2008, the Cedar River surged 10 feet above its previous record, cresting at 31 feet and wiping out homes and businesses well outside the floodplain.

Nearly 20 years later, those structures have been rebuilt, but Palo residents still worry about the river. Except these days, they worry that data centers will drink it dry.

In an effort to shield residents and natural resources from the negative impacts of hyperscale data center development in rural Linn County, officials have adopted what may be one of the most comprehensive local data center zoning ordinances in the nation.

The new ordinance requires data center developers to conduct a comprehensive water study as part of their zoning application and to enter into a water-use agreement with the county before construction. It also places limits on noise and light pollution, introduces mandatory setbacks of 1,000 feet from residentially zoned property, and requires developers to compensate the county for damage to roads or infrastructure during construction and to contribute to a community betterment fund.

“We are trying to put together the most protective, transparent ordinance possible,” Kirsten Running-Marquardt, chair of the Linn County Board of Supervisors, told the nearly 100 residents who gathered for the draft ordinance’s first public reading in early February.

But seated beneath a van-sized American flag hanging from the rafters of the drafty Palo Community Center gymnasium, residents asked for even stronger protections.

One by one, they approached the microphone at the front of the gym to voice concerns about water use, electricity rates, light pollution, the impacts of low-frequency noise on livestock, and the county’s ability to enforce the terms of the ordinance. Some, including Dorothy Landt of Palo, called for a complete moratorium on new data center development.

“Why has Linn County, Iowa, become a dumping ground for soon-to-be obsolete technology that spoils our landscape and robs us of our resources?” Landt asked. “While I admire the efforts of the Board of Supervisors to propose a data center ordinance, I would prefer to see all future data centers banned from Linn County.”

The county is already home to two major data center projects, operated by Google and QTS. Both are located in Cedar Rapids, Iowa’s second-largest city, and are therefore subject to its laws. The new ordinance would apply only to unincorporated areas of the county, which make up more than two-thirds of its geographic footprint.

In October 2025, Google informed the Linn County Board of Supervisors of early plans to construct a six-building campus in Palo, part of unincorporated Linn County, alongside the soon-to-reopen Duane Arnold Energy Center, Iowa’s sole nuclear power plant. Later that month, Google signed a 25-year power purchase agreement with the plant, committing to buy the bulk of the electricity it generates.

A view of the Duane Arnold Energy Center in Palo, Iowa. Credit: NextEra Energy

Google has not yet submitted a formal application to the county for the second campus, but its announcement last year, along with interest from another, unnamed hyperscale data center company, prompted Linn County officials to begin work on an ordinance setting the terms for any new development, said Charlie Nichols, director of planning and development for Linn County.

“I just don’t want to be misled by anything. … I want to know as much as possible before we go ahead with this,” Sue Biederman of Cedar Rapids told supervisors at the public meeting in February.

In drafting the ordinance, Nichols and his staff drew on the experiences of communities nationwide, meeting with local government officials in regions that have seen massive booms in data center development, including several counties in northern Virginia, the “data center capital of the world.”

As data center development balloons, many communities that initially zoned the operations as warehouses or standard commercial users are abandoning that practice, Nichols noted.

The extreme energy and water demands of data centers simply cannot be accounted for by existing zoning frameworks, he said. “These are generational uses with generational infrastructure impacts, and treating them as a normal warehouse or normal commercial user is just not working.”

Loudoun County, Virginia, for example, is home to 198 data centers, nearly all of which were built before the county required conditional or “special exception” use designations for data centers. At the urging of hyperscale-weary residents, the county is now in the second phase of a plan to establish data-center-specific zoning standards.

Similar reassessments are taking place across the country, Chris Jordan, program manager for AI and innovation at the National League of Cities, wrote in an email to Inside Climate News. “We’re seeing tighter zoning standards, more required impact studies, and in some cases temporary moratoria while communities assess infrastructure capacity,” Jordan wrote.

The Linn County, Iowa, ordinance goes a step further than tightening existing zoning rules: it creates a new, exclusive-use zoning district for data centers, granting county officials the power to set specific application requirements and development standards for projects.

Residents of Linn County, Iowa, gather at the Palo Community Center on Feb. 4 to comment on a draft of a new data center ordinance. Credit: Anika Jane Beamer/Inside Climate News

No other counties in the state have introduced similar zoning requirements, said Nichols. In fact, few jurisdictions nationwide have.

“Linn County’s approach is more comprehensive than many local zoning updates we’ve seen,” Jordan wrote. The creation of a data center-specific district, especially one that requires formal water-use agreements and economic development agreements, goes further than typical zoning amendments for data centers, Jordan said.

Despite the layers of protection baked into the new ordinance, Linn County still has limited ability to protect local water resources. Without a municipal water utility, permitting in rural Iowa communities falls to the state Department of Natural Resources (DNR), explained Nichols. Similarly, electric rates fall under the jurisdiction of the state utilities commission and cannot be regulated by the county.

Data centers may tap rivers or drill deep wells into shared aquifers, so long as that use complies with the terms of their water-use permit from the Iowa DNR. That leaves the Cedar River and public and private wells, which provide drinking water to much of Linn County, vulnerable.

Residents fear a new, large water user will dry up their wells, as occurred near a Meta data center in Mansfield, Georgia.

“We know that we can have multi-year droughts. The question is, are we depleting that river and the water table faster than it’s running?” Leland Freie, a Linn County resident, told supervisors at the first public meeting on the ordinance.

Without superseding state authority, the Linn County ordinance attempts to claw back a bit more local control, Nichols explained.

As part of their zoning application, data centers would submit a study “prepared by a qualified professional” assessing the capacity of proposed water sources, anticipating demands and cooling technologies, and developing contingency plans in case the water supply is interrupted.

Credit: Inside Climate News

Requiring a water study ensures, at a minimum, a baseline understanding of local water resources and dynamics near proposed data centers. That’s something the state of Iowa generally lacks, said Cara Matteson, a former geologist and the sustainability director for Linn County.

DNR staff told Matteson that water data gathered in Linn County by qualified researchers on behalf of a data center applicant would be incorporated in state-level permitting and enforcement decisions.

The department confirmed in an email to Inside Climate News that it would use the additional local water data.

If a data center’s application is approved, developers would then enter into an agreement with Linn County, outlining terms for water-use monitoring and reporting to both the county and the DNR. The agreement could also include contingency plans for droughts.

Still, the county has limited ability to act on the water monitoring data it's seeking: it is the DNR, not the county, that issues water-use permits and levies penalties for permit violations.

Linn County’s zoning rule underwent several modifications in response to questions raised by attendees at the first two public readings, Nichols said.

From its first reading to final adoption, the ordinance has expanded to include language setting light pollution standards, requiring a waste management plan, including the Iowa DNR in the water-use agreement to address potential well interference issues, and requiring an applicant-led public meeting before any zoning commission meetings.

“I am very confident that no ordinance for data centers in Iowa is asking for more information or asking for more requirements to be met than our ordinance right now,” said Nichols at the final reading.

The Cedar Rapids Metro Economic Alliance has said that it strongly supports current and future data center development in the area. The new ordinance is not an effective moratorium, Nichols said. He said he “strongly believes” that a data center can be built within the adopted framework.

Google spokespeople did not respond to requests for comment.

New rules may prompt data centers to develop elsewhere, acknowledged Brandy Meisheid, a supervisor whose district includes many of Linn County’s smaller communities. But the ordinance sets out to protect residents, not developers, Meisheid said. “If it’s too high a price for them to pay, they don’t have to come.”

Anika Jane Beamer covers the environment and climate change in Iowa, with a particular focus on water, soil, and CAFOs. A lifelong Midwesterner, she writes about changing ecosystems from one of the most transformed landscapes on the continent. She holds a master’s degree in science writing from the Massachusetts Institute of Technology as well as a bachelor’s degree in biology and Spanish from Grinnell College. She is a former Outrider Fellow at Inside Climate News and was named a Taylor-Blakeslee Graduate Fellow by the Council for the Advancement of Science Writing.

This story originally appeared on Inside Climate News.




Musk has no proof OpenAI stole xAI trade secrets, judge rules, tossing lawsuit


Hostility is not proof of theft

Even twisting an ex-employee’s text to favor xAI’s reading fails to sway judge.

Elon Musk appears to be grasping at straws in a lawsuit accusing OpenAI of poaching eight xAI employees in an allegedly unlawful bid to access xAI trade secrets connected to its data centers and chatbot, Grok.

In a Tuesday order granting OpenAI’s motion to dismiss, US District Judge Rita F. Lin said that xAI failed to provide evidence of any misconduct from OpenAI.

Instead, xAI seemed fixated on a range of alleged misconduct by its former employees. But in assessing xAI’s claims, Lin said that xAI failed to show proof that OpenAI induced any of these employees to steal trade secrets “or that these former xAI employees used any stolen trade secrets once employed by OpenAI.”

Two employees admitted to taking confidential information: both downloaded xAI’s source code, and one also improperly grabbed a supposedly sensitive recording of a Musk “All Hands” meeting. But the rest were either accused of retaining seemingly less consequential data, like work chats saved on their devices, or didn’t appear to hold any confidential information at all. Lin called out particularly weak arguments: xAI’s complaint acknowledged that one employee OpenAI poached never received access to the confidential information he allegedly sought after exiting xAI, and two other employees lumped into the complaint “simply left xAI for OpenAI,” Lin noted.

From the limited evidence, Lin concluded that “while xAI may state misappropriation claims against a couple of its former employees, it does not state a plausible misappropriation claim against OpenAI.”

Lin’s order will likely not be the end of the litigation, as she is allowing xAI to amend its complaint to address the current deficiencies.

Ars could not immediately reach xAI for comment, so it’s unclear what steps xAI may take next.

However, xAI seems unlikely to give up the fight, which OpenAI has alleged is part of a “harassment campaign” that Musk is waging through multiple lawsuits attacking his biggest competitor’s business practices.

Unsurprisingly, OpenAI celebrated the order on X, alleging that “this baseless lawsuit was never anything more than yet another front in Mr. Musk’s ongoing campaign of harassment.”

Other tech companies poaching talent for AI projects will likely be relieved while reading Lin’s order. Commercial litigator Sarah Tishler told Ars that the order “boils down to a fundamental concept in trade secret law: hiring from a competitor is not the same as stealing trade secrets from one.”

“Under the Defend Trade Secrets Act, xAI has to show that OpenAI actually received and used the alleged trade secrets, not just that it hired employees who may have taken them,” Tishler said. “Suspicious timing, aggressive recruiting, and even downloaded files are not enough on their own.”

Tishler suggested that the ruling will likely be welcomed by AI firms eager to secure the best talent without incurring legal risks from their hiring practices.

“In the AI industry, where talent moves fast and the competitive stakes are enormous, this ruling reaffirms that suspicion is not enough,” Tishler said. “You have to show the stolen information actually made it into the competitor’s hands and was put to use.”

OpenAI not liable for engineers swiping source code

Through the lawsuit, Musk has alleged that OpenAI is violating California’s unfair competition law. He claims that OpenAI is attempting “to destroy legitimate competition in the AI industry by neutralizing xAI’s innovations” and forcing xAI “to unfairly compete against its own trade secrets.”

But this claim hinges entirely on xAI proving that OpenAI poached its employees to steal its trade secrets. So, for xAI’s lawsuit to proceed, xAI will need to beef up the evidence base for its other claim: that OpenAI has violated the federal Defend Trade Secrets Act, Lin said. To succeed on that, xAI must prove that OpenAI unlawfully acquired, disclosed, or used a trade secret without xAI’s consent.

That will likely be challenging because xAI, at this point, has not offered “any nonconclusory allegations that OpenAI itself acquired, disclosed, or used xAI’s trade secrets,” Lin wrote.

All xAI has claimed is that OpenAI induced former employees to share secrets, and so far, nothing backs that claim, Lin said. Tishler noted that the court also rejected an xAI theory that “OpenAI should be responsible for what its new hires did before they arrived” for “the same reason: without evidence that OpenAI directed the theft or actually put the stolen information to use, you cannot hold the company liable.”

The strongest evidence that xAI had of employee misconduct, allegedly allowing OpenAI to misappropriate xAI trade secrets, revolves around the departure of one of xAI’s earliest engineers, Xuechen Li.

That evidence wasn’t enough, Lin said. xAI alleged that Li gave a presentation to OpenAI that supposedly included confidential information. Li also uploaded “the entire xAI source code base to a personal cloud account” that he had connected to ChatGPT, Lin noted. The upload came after a recruiter sent Li a message on Signal with a link to another, unrelated cloud storage location.

xAI hoped the Signal messages would shock the court, expecting it to read between the lines the way xAI did. As proof that OpenAI allegedly got access to xAI’s source code, xAI pointed to a Signal message that an OpenAI recruiter sent to Li “four hours after” Li downloaded the source code, saying “nw!” xAI has alleged this message is short-hand for “no way!”—suggesting the OpenAI recruiter was geeked to get access to xAI’s source code. But in a footnote, Lin said that “OpenAI insists that ‘nw’ means ‘no worries,’” and thus is unconnected to Li’s decision to upload the source code to a ChatGPT-linked cloud account.

Even interpreting the text using xAI’s reading, however, xAI did not show enough to prove the recruiter or OpenAI accessed or requested the files, Lin said.

It also didn’t help xAI’s case that a temporary injunction that xAI secured in a separate lawsuit targeting the engineer blocked Li from accepting a job at OpenAI.

That injunction led OpenAI to withdraw its job offer to Li. And that’s a problem for xAI, because since Li never worked at OpenAI, it’s clear that he never used xAI’s trade secrets while working for OpenAI.

Further weakening xAI’s arguments, if Li indeed shared confidential information during his presentation while interviewing for OpenAI, xAI has alleged no facts suggesting that OpenAI was aware Li was sharing xAI trade secrets, Lin wrote.

This “makes it very hard to argue OpenAI ever used anything he allegedly took,” Tishler told Ars.

Another former xAI engineer, Jimmy Fraiture, was accused of copying xAI trade secrets, but Fraiture has said he deleted the improperly downloaded information before starting his job at OpenAI. Importantly, Lin said, there is no evidence that Fraiture has used xAI trade secrets to benefit xAI’s rival since joining OpenAI.

“Other than the bare fact that Fraiture had been recruited” by the same OpenAI employee “who had also recruited Li, xAI does not allege any facts indicating that OpenAI had encouraged Fraiture to take xAI’s confidential information in the first place,” Lin wrote.

Since “none of the other former employees allegedly shared with or disclosed to OpenAI any xAI trade secrets,” xAI could not advance its claim that OpenAI misappropriated trade secrets based only on allegations tied to Li and Fraiture’s supposed misconduct, Lin said.

xAI may be able to amend its complaint to maintain these arguments, but the company has thus far presented scant, purely circumstantial evidence.

It’s possible that xAI will secure more evidence to support its misappropriation claims against OpenAI in its ongoing lawsuit against Li. Ars could not immediately reach Li’s lawyer to find out if today’s ruling may impact that case.

Ex-executive’s “hostility” is not proof of theft

Among the least convincing arguments that xAI raised was a claim that an unnamed finance executive left xAI to take a “lesser role” at OpenAI after learning everything he knew about data centers from xAI.

That executive slighted xAI when Musk’s company later attempted to inquire about “confidentiality concerns.”

“Suck my dick,” the former xAI executive allegedly said, refusing to explain how his OpenAI work might overlap with his xAI position. “Leave me the fuck alone.”

xAI tried to argue that the executive’s hostility was proof of misconduct. But Lin wrote that xAI only alleged that the executive “merely possessed xAI trade secrets about data centers” and did not allege that he ever used trade secrets to benefit OpenAI.

Had xAI found evidence that OpenAI’s data center strategy suddenly mirrored xAI’s after the executive joined xAI’s rival, that may have helped xAI’s case. But there are plenty of reasons a former employee might reject an ex-employer’s outreach following an exit, Lin suggested.

“His hostility when xAI reached out about its confidentiality concerns also does not support a plausible inference of use,” Lin wrote. “Hostility toward one’s former employer during departure does not, without more, indicate use of trade secrets in a subsequent job. Nor does the executive’s lack of experience with AI data centers before his time at xAI, without more, support a plausible inference that he used xAI’s trade secrets at OpenAI.”

xAI has until March 17 to amend its complaint to keep up this particular fight against OpenAI. But the company won’t be able to add any new claims or parties, Lin noted, “or otherwise change the allegations except to correct the identified deficiencies.”

Criminal probe likely leaves OpenAI on pins and needles

For Li, the engineer accused of sharing xAI trade secrets with OpenAI, the litigation could eliminate one front of discovery as he navigates two other legal fights over xAI’s trade secrets claims.

Tishler has been closely monitoring xAI’s trade secret legal battles. In October, she noted that Li is in a particularly prickly position, facing pressure in civil litigation from Musk to turn over data that could be used against him in the Federal Bureau of Investigation’s criminal investigation into Musk’s allegations. As Tishler explained:

“The practical reality is stark: Li faces a choice between protecting himself in the criminal action with his silence, and the civil consequences of doing so. Refuse to answer, and xAI could argue adverse inferences; answer, and the responses could feed the criminal case.”

Ultimately, the FBI is trying to prove that Li stole information that qualified as a trade secret and intended to use it for OpenAI’s benefit, while knowing that it would harm xAI. If the bureau succeeds, “xAI would suddenly have a government-backed record that its trade secrets were stolen,” Tishler wrote.

If xAI were so armed and able to keep the OpenAI lawsuit alive, the central question in the lawsuit that Lin dismissed today would shift, Tishler suggested, from “was there a theft?” to “what did OpenAI know, and when did it know it?”


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.



Data center builders thought farmers would willingly sell land, learn otherwise

Notably, one resident in Huddleston’s county who received an offer, 75-year-old Timothy Grosser, even declined a proposal to “name your price” when a tech company sought to buy his 250-acre farm, The Guardian reported.

“There is none,” Grosser said.

The farm is where he “lives, hunts, and raises cattle” and where his grandson hunts a turkey every Christmas for the family feast.

“The money’s not worth giving up your lifestyle,” Grosser said.

Another farmer in Wisconsin, Anthony Barta, reportedly fretted about what would happen to his neighbors if he took a deal he was offered—showing the deep bonds of people whose farms have bordered each other for years. In his community, another farmer was offered between $70 million and $80 million for 6,000 acres.

“Me and my family, we own the farm and run close to 1,000 animals,” Barta said. “What would that do if that’s next to it? Can they even be there? You know, that’s our livelihood—the farm. We’re just concerned what, if it would go through, what would happen to us and our neighbors and farms and our community? What would happen to that?”

Some tech companies are apparently not taking “no” for an answer. At least one farmer who spent 51 years milking cows in Pennsylvania prior to the AI boom described tech companies as “relentless.”

Eighty-six-year-old Mervin Raudabaugh, Jr., found a creative solution to end the pressure to sell two contiguous farms. He reportedly staved off developers by turning to “a farmland preservation program dedicating taxpayer dollars toward protecting agricultural resources.”

By working with the program, Raudabaugh will only receive about one-eighth of what the developers were offering. But he said it’s worth it to know his land would be preserved for farming purposes and out of reach of persistent tech companies.

“These people have hounded the living daylights out of me,” Raudabaugh said.

Data center deals come amid fragile farm economy

For people in rural communities, data center fights go beyond concerns about water and electricity consumption—although those are concerns, too. Communities are defending the character of the land, which they don’t want to see suddenly disrupted by extensive construction, data center noise pollution, or untold environmental impacts from massive operations.



Senators count the shady ways data centers pass energy costs on to Americans


Senators demand Big Tech pay upfront for data center spikes in electricity bills.

Senators launched a probe Tuesday demanding that tech companies explain exactly how they plan to prevent data center projects from increasing electricity bills in communities where prices are already skyrocketing.

In letters to seven AI firms, Senators Elizabeth Warren (D-Mass.), Chris Van Hollen (D-Md.), and Richard Blumenthal (D-Conn.) cited a study estimating that “electricity prices have increased by as much as 267 percent in the past five years” in “areas located near significant data center activity.”

Prices increase, senators noted, when utility companies build out extra infrastructure to meet data centers’ energy demands—which can amount to one customer suddenly consuming as much power as an entire city. They also increase when demand for local power outstrips supply. In some cases, residents are blindsided by higher bills, not even realizing a data center project was approved, because tech companies seem intent on dodging backlash and frequently do not allow the terms of deals to be publicly disclosed.

AI firms “ask public officials to sign non-disclosure agreements (NDAs) preventing them from sharing information with their constituents, operate through what appear to be shell companies to mask the real owner of the data center, and require that landowners sign NDAs as part of the land sale while telling them only that a ‘Fortune 100 company’ is planning an ‘industrial development’ seemingly in an attempt to hide the very existence of the data center,” senators wrote.

States like Virginia with the highest concentration of data centers could see average electricity prices increase by another 25 percent by 2030, senators noted. But price increases aren’t limited to the states allegedly striking shady deals with tech companies and greenlighting data center projects, they said. “Interconnected and interstate power grids can lead to a data center built in one state raising costs for residents of a neighboring state,” senators reported.

Under fire for supposedly only pretending to care about keeping neighbors’ costs low were Amazon, Google, Meta, Microsoft, Equinix, Digital Realty, and CoreWeave. Senators accused firms of paying “lip service,” claiming that they would do everything in their power to avoid increasing residential electricity costs, while actively lobbying to pass billions in costs on to their neighbors.

For example, Amazon publicly claimed it would “make sure” it would cover costs so they wouldn’t be passed on. But it’s also a member of an industry lobbying group, the Data Center Coalition, that “has opposed state regulatory decisions requiring data center companies to pay a higher percentage of costs upfront,” senators wrote. And Google made similar statements, despite having an executive who opposed a regulatory solution that would place data centers into their own “rate class”—making them responsible for grid improvement costs that could not be passed on to other customers—on the grounds that it was supposedly “discriminatory.”

“The current, socialized model of electricity ratepaying,” senators explained—where costs are shared across all users—”was not designed for an era where just one customer requires the same amount of electricity as some of the largest cities in America.”
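The arithmetic behind that complaint can be sketched with invented numbers. The pro-rata allocation below is a toy model for illustration only—every figure is hypothetical, and real utility rate-making is far more complex than a simple load-share split:

```python
# Hypothetical, simplified illustration of the senators' cost-allocation
# point. All numbers and the pro-rata rule are invented for this sketch.

def allocated_share(upgrade_cost: float, total_load_mw: float,
                    customer_load_mw: float) -> float:
    """Portion of a grid-upgrade cost assigned to a customer group when
    costs are socialized in proportion to each group's share of load."""
    return upgrade_cost * (customer_load_mw / total_load_mw)

# Suppose a $120M grid upgrade is driven almost entirely by one new
# data center whose demand exceeds that of all existing customers:
UPGRADE_COST = 120_000_000      # dollars (hypothetical)
RESIDENTIAL_LOAD_MW = 400       # all existing customers combined
DATA_CENTER_LOAD_MW = 600       # the single new customer

total_mw = RESIDENTIAL_LOAD_MW + DATA_CENTER_LOAD_MW

# Socialized model: existing customers absorb 40% of a cost they did
# not cause, because allocation follows load share, not causation.
residents_socialized = allocated_share(UPGRADE_COST, total_mw,
                                       RESIDENTIAL_LOAD_MW)

# Separate data center rate class: the upgrade is billed entirely to
# the rate class whose demand required it, so residents pay nothing.
residents_rate_class = 0.0

print(f"Residents' share, socialized model: ${residents_socialized:,.0f}")
print(f"Residents' share, separate rate class: ${residents_rate_class:,.0f}")
```

Under these made-up numbers, the socialized model shifts $48 million of the data-center-driven upgrade onto existing customers; a distinct rate class, the regulatory solution the senators reference, would shift none of it.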

Particularly problematic, senators emphasized, were reports that tech firms were getting discounts on energy costs as utility companies competed for their business, while prices went up for their neighbors.

Ars contacted all firms targeted by lawmakers. Four did not respond. Microsoft and Meta declined to comment. Digital Realty told Ars that it “looks forward to working with all elected officials to continue to invest in the digital infrastructure required to support America’s leadership in technology, which underpins modern life and creates high-paying jobs.”

Regulatory pressure likely to increase as bills go up

Senators are likely exploring whether to pass legislation that would help combat price increases that they say cause average Americans to struggle to keep the lights on. They’ve asked tech companies to respond to their biggest questions about data center projects by January 12, 2026.

Among their top questions, senators wanted to know about firms’ internal projections looking forward with data center projects. That includes sharing their projected energy use through 2030, as well as the “impact of your AI data centers on regional utility costs.” Companies are also expected to explain how “internal projections of data center energy consumption” justify any “opposition to the creation of a distinct data center rate class.”

Additionally, senators asked firms to outline steps they’ve taken to prevent passing on costs to neighbors and details of any impact studies companies have conducted.

Likely to raise the most eyebrows, however, would be answers to questions about “tax deductions or other financial incentives” tech firms have received from city and state governments. Those numbers would be interesting to compare with other information senators demanded that companies share, detailing how much they’ve spent on lobbying and advocacy for data centers. Senators appear keen to know how much tech companies are paying to avoid covering a proportionate amount of infrastructure costs.

“To protect consumers, data centers must pay a greater share of the costs upfront for future energy usage and updates to the electrical grid provided specifically to accommodate data centers’ energy needs,” senators wrote.

Requiring upfront payment is especially critical, senators noted, since some tech firms have abandoned data center projects, leaving local customers to bear the costs of infrastructure changes without utility companies ever generating any revenue. Communities must also consider that AI firms’ projected energy demand could severely dip if enterprise demand for AI falls short of expectations, AI capabilities “plateau” and trigger widespread indifference, AI companies shift strategies “away from scaling computer power,” or chip companies “find innovative ways to make AI more energy-efficient.”

“If data centers end up providing less business to the utility companies than anticipated, consumers could be left with massive electricity bills as utility companies recoup billions in new infrastructure costs, with nothing to show for it,” senators wrote.

Already, Utah, Oregon, and Ohio have passed laws “creating a separate class of utility customer for data centers which includes basic financial safeguards such as upfront payments and longer contract length,” senators noted, and Virginia is notably weighing a similar law.

At least one study, The New York Times noted, suggested that data centers may have recently helped reduce electricity costs by spreading the costs of upgrades over more customers, but those outcomes varied by state and could not account for future AI demand.

“It remains unclear whether broader, sustained load growth will increase long-run average costs and prices,” Lawrence Berkeley National Laboratory researchers concluded. “In some cases, spikes in load growth can result in significant, near-term retail price increase.”

Until companies prove they’re paying their fair share, senators expect electricity bills to keep climbing, particularly in vulnerable areas. That will likely only increase pressure for regulators to intervene, the director of the Electricity Law Initiative at the Harvard Law School Environmental and Energy Law Program, Ari Peskoe, suggested in September.

“The utility business model is all about spreading costs of system expansion to everyone, because we all benefit from a reliable, robust electricity system,” Peskoe said. “But when it’s a single consumer that is using so much energy—basically that of an entire city—and when that new city happens to be owned by the wealthiest corporations in the world, I think it’s time to look at the fundamental assumptions of utility regulation and make sure that these facilities are really paying for all of the infrastructure costs to connect them to the system and to power them.”


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.

Senators count the shady ways data centers pass energy costs on to Americans


After years of resisting it, SpaceX now plans to go public. Why?


“Much of the AI race comes down to amassing and deploying assets.”

Elon Musk gestures as he speaks during a press conference at SpaceX’s Starbase facility near Boca Chica Village in South Texas on February 10, 2022. Credit: JIM WATSON/AFP via Getty Images

SpaceX is planning to raise tens of billions of dollars through an initial public offering next year, multiple outlets have reported, and Ars can confirm. This represents a major change in thinking from the world’s leading space company and its founder, Elon Musk.

The Wall Street Journal and The Information first reported about a possible IPO last Friday, and Bloomberg followed that up on Tuesday evening with a report suggesting the company would target a $1.5 trillion valuation. This would allow SpaceX to raise in excess of $30 billion.

This is an enormous amount of funding. The largest IPO in history occurred in 2019, when the state-owned Saudi Arabian oil company began public trading as Aramco and raised $29 billion. In terms of revenue, Aramco is a top-five company in the world.

Now SpaceX is poised to potentially match or exceed this value. That SpaceX would be attractive to public investors is not a surprise—it’s the world’s dominant space company in launch, space-based communications, and much more. For investors seeking unlimited growth, space is the final frontier.

But why would Musk take SpaceX public now, at a time when the company’s revenues are surging thanks to the growth of the Starlink Internet constellation? The decision is surprising because Musk has long resisted going public with SpaceX. He has not enjoyed the public scrutiny of Tesla and has feared that shareholder demands for financial returns were inconsistent with his ultimate goal of settling Mars.

Data centers

Ars spoke with multiple people familiar with Musk and his thinking to understand why he would want to take SpaceX public.

A significant shift in recent years has been the rise of artificial intelligence, which Musk has been involved in since 2015, when he co-founded OpenAI. He later had a falling out with his cofounders and started his own company, xAI, in 2023. At Tesla, he has been pushing smart-driving technology forward and more recently focused on robotics. Musk sees a convergence of these technologies in the near future, which he believes will profoundly change civilization.

Raising large amounts of money in the next 18 months would allow Musk to have significant capital to deploy at SpaceX as he influences and partakes in this convergence of technology.

How can SpaceX play in this space? In the near term, the company plans to develop a modified version of the Starlink satellite to serve as a foundation for building data centers in space. Musk said as much on the social media network he owns, X, in late October: “SpaceX will be doing this.”

But using a next-generation Starlink satellite manufactured on Earth is just the beginning of his vision. “The level beyond that is constructing satellite factories on the Moon and using a mass driver (electromagnetic railgun) to accelerate AI satellites to lunar escape velocity without the need for rockets,” Musk said this weekend on X. “That scales to >100TW/year of AI and enables non-trivial progress towards becoming a Kardashev II civilization.”

By some projections, SpaceX is expected to have in the neighborhood of $22 to $24 billion in revenue next year. That is a lot of money—it’s on par with NASA’s annual budget, for example, and SpaceX can deploy its capital far, far more efficiently than the government can. So the company will be able to accomplish a lot. But with a large infusion of cash, SpaceX will be able to go much faster. And it will take a lot of cash to design and build the satellites and launch the rockets to deploy data centers in space.

Abhi Tripathi, a long-time SpaceX employee who is now director of mission operations at the UC Berkeley Space Sciences Laboratory, believes that once Musk realized Starlink satellites could be architected into a distributed network of data centers, the writing was on the wall.

“That is the moment an IPO suddenly came into play after being unlikely for so long,” Tripathi told Ars. “If you have followed Elon’s tactics, you know that once he commits to something, he leans fully into it. Much of the AI race comes down to amassing and deploying assets that work quicker than your competition. A large war chest resulting from an IPO will greatly help his cause and disadvantage all others.”

Foremost among Musk’s goals right now is to “win” the battle for artificial intelligence. He is already attacking the problem at xAI and Tesla, and he now seeks to throw SpaceX into the fray as well. Taking SpaceX public and using it to marshal an incredible amount of resources shows he is playing to win.

What about Mars?

Musk founded SpaceX in 2002 with the goal of one day settling Mars. He has never wavered from that goal, and indeed, the company has made considerable progress in more than two decades. SpaceX now launches more than 90 percent of the world’s mass to orbit, has nearly 90 percent of the satellites in orbit, and backstops a large portion of the US government’s civil and military activities in space. Moreover, with Starship, SpaceX is building the first vehicle that could realistically send humans and a lot of the stuff humans need to survive to Mars one day.

But if Musk’s rationale for keeping SpaceX private was to protect the Mars dream, is he abandoning this long-standing aim?

Not necessarily. It’s likely that Musk sees artificial intelligence as a key part of the Mars vision. Whether one believes the Optimus robot will become a viable product or not, Musk does. And he’s spoken about sending the robots to Mars to make the way smoother for the first human settlers.

Musk also believes that a larger and more financially robust SpaceX is necessary to undertake the settling of Mars. He understands that NASA will not pay for this, as the civil space agency is in the business of exploration and not settlement. For several years now, he has expressed that it will require about 1 million tons of supplies to be shipped to Mars to make a self-sustaining settlement. This is roughly 1,000 ships, and including refueling, at least 10,000 Starship launches. At $100 million per launch, that’s $1 trillion in launch costs alone.
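The arithmetic behind that $1 trillion figure can be checked with a quick sketch. All the inputs below come from the article itself; the only derived number is the implied payload per ship (1 million tons spread over 1,000 ships):

```python
# Back-of-the-envelope check of the Mars settlement launch costs cited
# above. Inputs are the article's figures; payload per ship is implied.
CARGO_TONS = 1_000_000          # supplies for a self-sustaining settlement
SHIPS = 1_000                   # cargo Starships
LAUNCHES_PER_SHIP = 10          # each cargo flight plus ~9 refueling flights
COST_PER_LAUNCH = 100_000_000   # $100 million per Starship launch

total_launches = SHIPS * LAUNCHES_PER_SHIP      # 10,000 launches
total_cost = total_launches * COST_PER_LAUNCH   # $1 trillion

print(f"Payload per ship: {CARGO_TONS // SHIPS} tons")
print(f"Total launches:   {total_launches:,}")
print(f"Launch cost:      ${total_cost / 1e12:.1f} trillion")
```

At that scale, launch alone would exceed the combined annual R&D budgets of the largest tech companies, which is the core of the argument for tapping public markets.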

Musk has frequently expressed a concern that there may be a limited window for settling Mars. Perhaps financial markets collapse. Perhaps there’s a worse pandemic. Perhaps a large asteroid hits the planet. Taking SpaceX public is a bet that he can marshal the resources now, during his lifetime, to make Mars City One a reality. He is 54 years old.

The plan is not without risks, of course. If AI is something of a bubble, ten years from now, SpaceX may be sitting on hundreds of billions of dollars worth of satellites in space for which there is limited use. Maybe shareholders would rather SpaceX make them multimillionaires than make humans multiplanetary.

But Musk has never shied away from risks. So doubling down on his most successful asset in this moment is precisely what one would expect him to do.


Eric Berger is the senior space editor at Ars Technica, covering everything from astronomy to private space to NASA policy, and author of two books: Liftoff, about the rise of SpaceX; and Reentry, on the development of the Falcon 9 rocket and Dragon. A certified meteorologist, Eric lives in Houston.



Elon Musk on data centers in orbit: “SpaceX will be doing this”

Interest is growing rapidly

“The amount of momentum from heavyweights in the tech industry is very much worth paying attention to,” said Caleb Henry, director of research at Quilty Space, in an interview. “If they start putting money behind it, we could see another transformation of what’s done in space.”

The essential function of a data center is to store, process, and transmit data. Historically, satellites have already done a lot of this, Henry said. Telecommunications satellites specialize in transmitting data. Imaging satellites store a lot of data and then dump it when they pass over ground stations. In recent years, onboard computers have gotten more sophisticated at processing data. Data centers in space could represent the next evolution of that.

Critics rightly note that it would require very large satellites with extensive solar panels to power data centers that rival ground-based infrastructure. However, SpaceX’s Starlink V3 satellites are unlike any previous space-based technology, Henry said.

A lot more capacity

SpaceX’s current Starlink V2 mini satellites have a maximum downlink capacity of approximately 100 Gbps. The V3 satellite is expected to increase this capacity by a factor of 10, to 1 Tbps. That figure is not unprecedented for a single satellite, but it certainly is at the scale SpaceX intends.

For example, Viasat contracted with Boeing for the better part of a decade, spending hundreds of millions of dollars, to build Viasat-3, a geostationary satellite with a capacity of 1 Tbps. This single satellite may launch next week on an Atlas V rocket.

SpaceX plans to launch dozens of Starlink V3 satellites—Henry estimates the number is about 60—on each Starship rocket launch. Those launches could occur as soon as the first half of 2026, as SpaceX has already tested a satellite dispenser on its Starship vehicle.
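Taken together, the figures above imply a striking per-launch number. A quick sketch, using Henry’s estimates (roughly 1 Tbps per V3 satellite, about 60 satellites per Starship) rather than any official SpaceX specifications:

```python
# Rough aggregate downlink capacity per Starship launch, using the
# estimates cited above. These are analyst figures, not SpaceX specs.
V3_CAPACITY_TBPS = 1.0        # ~1 Tbps per Starlink V3 satellite
V2_MINI_CAPACITY_TBPS = 0.1   # ~100 Gbps per current V2 mini
SATS_PER_LAUNCH = 60          # Henry's estimate per Starship flight

per_launch_tbps = V3_CAPACITY_TBPS * SATS_PER_LAUNCH
v2_equivalents = per_launch_tbps / V2_MINI_CAPACITY_TBPS

print(f"Capacity per Starship launch: {per_launch_tbps:.0f} Tbps")
print(f"Equivalent to {v2_equivalents:.0f} Starlink V2 mini satellites")
```

By this math, a single Starship flight would orbit roughly sixty times the capacity of the Viasat-3 satellite that took Boeing most of a decade to build.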

“Nothing else in the rest of the satellite industry comes close to that amount of capacity,” Henry said.

Exactly what “scaling up” Starlink V3 satellites might look like is not clear, but it doesn’t seem silly to expect it could happen. The very first operational Starlink satellites launched a little more than half a decade ago with a mass of about 300 kg and a capacity of 15 Gbps. Starlink V3 satellites will likely mass 1,500 kg.



Nvidia hits record $5 trillion mark as CEO dismisses AI bubble concerns

Partnerships and government contracts fuel optimism

At the GTC conference on Tuesday, Nvidia’s CEO went out of his way to repeatedly praise Donald Trump and his policies for accelerating domestic tech investment while warning that excluding China from Nvidia’s ecosystem could limit US access to half the world’s AI developers. The overall event stressed Nvidia’s role as an American company, with Huang even nodding to Trump’s signature slogan in his sign-off by thanking the audience for “making America great again.”

Trump’s cooperation is paramount for Nvidia because US export controls have effectively blocked Nvidia’s AI chips from China, costing the company billions of dollars in revenue. Bob O’Donnell of TECHnalysis Research told Reuters that “Nvidia clearly brought their story to DC to both educate and gain favor with the US government. They managed to hit most of the hottest and most influential topics in tech.”

Beyond the political messaging, Huang announced a series of partnerships and deals that apparently helped ease investor concerns about Nvidia’s future. The company announced collaborations with Uber Technologies, Palantir Technologies, and CrowdStrike Holdings, among others. Nvidia also revealed a $1 billion investment in Nokia to support the telecommunications company’s shift toward AI and 6G networking.

The agreement with Uber will power a fleet of 100,000 self-driving vehicles with Nvidia technology, with automaker Stellantis among the first to deliver the robotaxis. Palantir will pair Nvidia’s technology with its Ontology platform to use AI techniques for logistics insights, with Lowe’s as an early adopter. Eli Lilly plans to build what Nvidia described as the most powerful supercomputer owned and operated by a pharmaceutical company, relying on more than 1,000 Blackwell AI accelerator chips.

The $5 trillion valuation surpasses the total cryptocurrency market value and equals roughly half the size of the pan-European Stoxx 600 equities index, Reuters notes. At current prices, Huang’s stake in Nvidia would be worth about $179.2 billion, making him the world’s eighth-richest person.



White House unveils sweeping plan to “win” global AI race through deregulation

Trump’s plan was not welcomed by everyone. J.B. Branch, Big Tech accountability advocate for Public Citizen, in a statement provided to Ars, criticized Trump as giving “sweetheart deals” to tech companies that would cause “electricity bills to rise to subsidize discounted power for massive AI data centers.”

Infrastructure demands and energy requirements

Trump’s new AI plan tackles infrastructure head-on, stating that “AI is the first digital service in modern life that challenges America to build vastly greater energy generation than we have today.” To meet this demand, it proposes streamlining environmental permitting for data centers through new National Environmental Policy Act (NEPA) exemptions, making federal lands available for construction and modernizing the power grid—all while explicitly rejecting “radical climate dogma and bureaucratic red tape.”

The document embraces what it calls a “Build, Baby, Build!” approach—echoing a Trump campaign slogan—and promises to restore semiconductor manufacturing through the CHIPS Program Office, though stripped of “extraneous policy requirements.”

On the technology front, the plan directs Commerce to revise NIST’s AI Risk Management Framework to “eliminate references to misinformation, Diversity, Equity, and Inclusion, and climate change.” Federal procurement would favor AI developers whose systems are “objective and free from top-down ideological bias.” The document strongly backs open source AI models and calls for exporting American AI technology to allies while blocking administration-labeled adversaries like China.

Security proposals include high-security military data centers and warnings that advanced AI systems “may pose novel national security risks” in cyberattacks and weapons development.

Critics respond with “People’s AI Action Plan”

Before the White House unveiled its plan, more than 90 organizations launched a competing “People’s AI Action Plan” on Tuesday, characterizing the Trump administration’s approach as “a massive handout to the tech industry” that prioritizes corporate interests over public welfare. The coalition includes labor unions, environmental justice groups, and consumer protection nonprofits.



How a data center company uses stranded renewable energy

“Decisions around where data centers get built have shifted dramatically over the last six months, with access to power now playing the most significant role in location scouting,” Joshi said. “The grid can’t keep pace with AI demands, so the industry is taking control with onsite power generation.”

Soluna, like other data center developers looking to rely on renewable energy, buys the excess power from wind, hydro, and solar plants that they can’t sell to the grid. By the end of the year, Soluna will have three facilities totaling 123 megawatts of capacity in Kentucky and Texas and seven projects in the works with upwards of 800 total megawatts.

Belizaire and I talked about how in Texas, where I report from, there’s plenty of curtailed energy from wind and solar farms because of the region’s limited transmission capacity. In West Texas, far from major load centers like Dallas and Houston, other data center developers are also taking advantage of the unused wind energy by co-locating their giant warehouses full of advanced computers and high-powered cooling systems with the excess supply.

One data center developer using curtailed renewable power in Texas is IREN. The firm owns and operates facilities optimized for Bitcoin mining and AI. It developed a 7.5-gigawatt facility in Childress and broke ground on a 1.4-gigawatt data center in Sweetwater.

IREN purchases power through the state grid’s wholesale market during periods of oversupply, said Kent Draper, the company’s chief commercial officer, and reduces its consumption when prices are high. It’s able to do that by turning off its computers and minimizing power demand from its data centers.
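The behavior Draper describes amounts to a simple price-responsive load. A minimal sketch of that logic, with a made-up price threshold and prices (the real ERCOT wholesale market is far more involved):

```python
# Minimal sketch of price-responsive data center load as described
# above: run flat out when wholesale prices are low (oversupply),
# shed load when they spike. Threshold and prices are illustrative.
def target_load_mw(price_per_mwh: float, full_load_mw: float,
                   curtail_above: float = 50.0) -> float:
    """Run at full load below the price threshold, shut down above it."""
    return full_load_mw if price_per_mwh < curtail_above else 0.0

# Example day: cheap overnight wind, then an afternoon scarcity spike.
prices = [12, 8, 25, 300, 45]
for p in prices:
    print(f"${p}/MWh -> run at {target_load_mw(p, 1400):.0f} MW")
```

Bitcoin mining and some AI batch workloads tolerate this on/off pattern well, which is why these operators can monetize power that the grid would otherwise curtail.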

But curtailment is an issue all over the world, Belizaire said, from Oklahoma, North Dakota, South Dakota, California, and Arizona in the US, to Northern Ireland, Germany, Portugal, and Australia.

“Anywhere where you have large utility-scale renewable development that’s been built out, you’re going to find it,” Belizaire said.

In a March analysis, the US Energy Information Administration reported that solar and wind power curtailments are increasing in California. In 2024, the grid operator for most of California curtailed 3.4 million megawatt hours of utility-scale wind and solar output, a 29 percent increase from the amount of electricity curtailed in 2023.
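From the two EIA figures above, the 2023 baseline can be backed out directly:

```python
# Back out California's 2023 curtailment from the EIA figures cited
# above: 3.4 million MWh curtailed in 2024, a 29 percent increase
# over the prior year.
curtailed_2024_mwh = 3_400_000
yoy_increase = 0.29

curtailed_2023_mwh = curtailed_2024_mwh / (1 + yoy_increase)
print(f"Implied 2023 curtailment: ~{curtailed_2023_mwh / 1e6:.2f} million MWh")
```

That implies roughly 2.6 million MWh went unused in 2023, which helps explain why developers like Soluna see a growing pool of stranded renewable energy to soak up.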



Where hyperscale hardware goes to retire: Ars visits a very big ITAD site

Inside the laptop/desktop examination bay at SK TES’s Fredericksburg, Va. site.

Credit: SK tes


The details of each unit—CPU, memory, HDD size—are taken down and added to the asset tag, and the device is sent on to be physically examined. This step is important because “many a concealed drive finds its way into this line,” Kent Green, manager of this site, told me. Inside the machines coming from big firms, there are sometimes little USB, SD, SATA, or M.2 drives hiding out. Some were make-do solutions installed by IT and not documented, and others were put there by employees tired of waiting for more storage. “Some managers have been pretty surprised when they learn what we found,” Green said.

With everything wiped and with some sense of what they’re made of, each device gets a rating. It’s a three-character system, like “A-3-6,” based on function, cosmetic condition, and component value. Based on needs, trends, and other data, devices that are cleared for resale go to either wholesale, retail, component harvesting, or scrap.
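A grading record like the “A-3-6” code described above is easy to picture in miniature. The three axes (function, cosmetic condition, component value) come from the article; the thresholds and channel-routing rules below are hypothetical, since SK TES doesn’t publish its actual criteria:

```python
# Illustrative sketch of a three-axis device grade like "A-3-6".
# Axis names are from the article; thresholds and routing are made up.
from dataclasses import dataclass

@dataclass
class DeviceGrade:
    function: str        # e.g. "A" = fully functional
    cosmetic: int        # e.g. 1 (like new) through 9 (rough)
    component_value: int # resale value of harvestable parts

    @property
    def code(self) -> str:
        return f"{self.function}-{self.cosmetic}-{self.component_value}"

    def channel(self) -> str:
        # Hypothetical routing: working, presentable units resell at
        # retail; other working units go wholesale; broken units with
        # valuable parts get harvested; everything else is scrap.
        if self.function == "A" and self.cosmetic <= 3:
            return "retail"
        if self.function == "A":
            return "wholesale"
        if self.component_value >= 5:
            return "component harvesting"
        return "scrap"

grade = DeviceGrade(function="A", cosmetic=3, component_value=6)
print(grade.code, "->", grade.channel())  # A-3-6 -> retail
```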

Full-body laptop skins

Wiping down and prepping a laptop, potentially for a full-cover adhesive skin.

Credit: SK TES


If a device has retail value, it heads into a section of this giant facility where workers do further checks. Automated software plays sounds on the speakers, checks that every keyboard key is sending signals, and checks that laptop batteries are at 80 percent capacity or better. At the end of the line is my favorite discovery: full-body laptop skins.

Some laptops—certain Lenovo, Dell, and HP models—are so ubiquitous in corporate fleets that it’s worth buying an adhesive laminating sticker in their exact shape. They’re an uncanny match for the matte black, silver, and slightly less silver finishes of the laptops, covering up any blemishes and scratches. Watching one of the workers apply this made me jealous of their ability to essentially reset a laptop’s condition (so one could apply whole new layers of swag stickers, of course). Once rated, tested, and stickered, laptops go into a clever “cradle” box, get the UN 3481 “battery inside” sticker, and can be sold through retail.



Proposed underwater data center surprises regulators who hadn’t heard about it


Data centers powering the generative AI boom are gulping water and exhausting electricity at what some researchers view as an unsustainable pace. Two entrepreneurs who met in high school a few years ago want to overcome that crunch with a fresh experiment: sinking the cloud into the sea.

Sam Mendel and Eric Kim launched their company, NetworkOcean, out of startup accelerator Y Combinator on August 15 by announcing plans to dunk a small capsule filled with GPU servers into San Francisco Bay within a month. “There’s this vital opportunity to build more efficient computer infrastructure that we’re gonna rely on for decades to come,” Mendel says.

The founders contend that moving data centers off land would slow ocean temperature rise by drawing less power and letting seawater cool the capsule’s shell, supplementing its internal cooling system. NetworkOcean’s founders have said a location in the bay would deliver fast processing speeds for the region’s buzzing AI economy.  

But scientists who study the hundreds of square miles of brackish water say even the slightest heat or disturbance from NetworkOcean’s submersible could trigger toxic algae blooms and harm wildlife. And WIRED inquiries to several California and US agencies who oversee the bay found that NetworkOcean has been pursuing its initial test of an underwater data center without having sought, much less received, any permits from key regulators.

The outreach by WIRED prompted at least two agencies—the Bay Conservation and Development Commission and the San Francisco Regional Water Quality Control Board—to email NetworkOcean that testing without permits could run afoul of laws, according to public records and spokespeople for the agencies. Fines from the BCDC can run up to hundreds of thousands of dollars.

The nascent technology has already been in hot water in California. In 2016, the state’s coastal commission issued a previously unreported notice to Microsoft saying that the tech giant had violated the law the year before by plunging an unpermitted server vessel into San Luis Obispo Bay, about 250 miles south of San Francisco. The months-long test, part of what was known as Project Natick, had ended without apparent environmental harm by the time the agency learned of it, so officials decided not to fine Microsoft, according to the notice seen by WIRED.

The renewed scrutiny of underwater data centers has surfaced an increasingly common tension between innovative efforts to combat global climate change and long-standing environmental laws. Permitting takes months, if not years, and can cost millions of dollars, potentially impeding progress. Advocates of the laws argue that the process allows for time and input to better weigh trade-offs.

“Things are overregulated because people often don’t do the right thing,” says Thomas Mumley, recently retired assistant executive officer of the bay water board. “You give an inch, they take a mile. We have to be cautious.”

Over the last two weeks, including during an interview at the WIRED office, NetworkOcean’s founders have provided driblets of details about their evolving plans. Their current intention is to test their underwater vessel for about an hour, just below the surface of what Mendel would only describe as a privately owned and operated portion of the bay that he says is not subject to regulatory oversight. He insists that a permit is not required based on the location, design, and minimal impact. “We have been told by our potential testing site that our setup is environmentally benign,” Mendel says.

Mumley, the retired regulator, calls the assertion about not needing a permit “absurd.” Both Bella Castrodale, the BCDC’s lead enforcement attorney, and Keith Lichten, a water board division manager, say private sites and a quick dip in the bay aren’t exempt from permitting. Several other experts in bay rules tell WIRED that even if some quirk does preclude oversight, they believe NetworkOcean is sending a poor message to the public by not coordinating with regulators.

“Just because these centers would be out of sight does not mean they are not a major disturbance,” says Jon Rosenfield, science director at San Francisco Baykeeper, a nonprofit that investigates industrial polluters.

School project

Mendel and Kim say they tried to develop an underwater renewable energy device together during high school in Southern California before moving on to non-nautical pursuits. Mendel, 23, dropped out of college in 2022 and founded a platform for social media influencers.

About a year ago, he built a small web server using the DIY system Raspberry Pi to host another personal project, and temporarily floated the equipment in San Francisco Bay by attaching it to a buoy from a private boat in the Sausalito area. (Mendel declined to answer questions about permits.) After talking with Kim, also 23, about this experiment, the two decided to move in together and start NetworkOcean.

Their pitch is that underwater data centers are more affordable to develop and maintain, especially as electricity shortages limit sites on land. Surrounding a tank of hot servers with water naturally helps cool them, avoiding the massive resource drain of air-conditioning and improving on the similar benefits of floating data centers. Developers of offshore wind farms are eager to electrify NetworkOcean vessels, Mendel says.
