
“I’m getting dizzy”: Man films Waymo self-driving car driving around in circles

Waymo says the problem caused a delay of just over five minutes and that Johns was not charged for the trip. A spokesperson for Waymo, which is owned by Google parent Alphabet, told Ars today that the “looping event” occurred on December 9 and was later addressed during a regularly scheduled software update.

Waymo did not answer our question about whether the software update addressed routing only at the specific location where the problem occurred or fixed a more general routing problem that could have affected rides in other locations.

The problem affecting Johns’ ride occurred near the user’s pickup location, Waymo told us. The Waymo car took the rider to his destination after the roughly five-minute delay, the spokesperson said. “Our rider support agent did help initiate maneuvers that helped resolve the issue,” Waymo said.

Rider would like an explanation

CBS News states that Johns is “still not certain he was communicating with a real person or AI” when he spoke to the support rep in the car. However, the Waymo spokesperson told Ars that “all of our rider support staff are trained human operators.”

Waymo told Ars that the company tried to contact Johns after the incident and left him a voicemail, but Johns says he never received an explanation of what caused the circling problem.

We emailed Johns today and received a reply from a public relations firm working on his behalf. “To date, Mike has not received an explanation as to the reason for the circling issue,” his spokesperson said. His spokesperson confirmed that Johns did not miss his flight.

It wasn’t clear from the video whether Johns tried to use the “pull over” functionality available in Waymo cars. “If at any time you want to end your ride early, tap the Pull over button in your app or on the passenger screen, and the car will find a safe spot to stop,” a Waymo support site says.

Johns’ spokesperson told us that “Mike was not immediately aware of the ‘pull over’ button,” so “he did not have an opportunity to use it before engaging with the customer service representative over the car speaker.”

While Waymo says all its agents are human, Johns’ spokesperson told Ars that “Mike is still unsure if he was speaking with a human or an AI agent.”



The Hyundai Ioniq 5 will be the next Waymo robotaxi

Waymo’s robotaxis are going to get a lot more angular in the future. Today, the autonomous driving startup and Hyundai announced that they have formed a strategic partnership, and the first product will be the integration of Waymo’s autonomous vehicle software and hardware with the Hyundai Ioniq 5.

“Hyundai and Waymo share a vision to improve the safety, efficiency, and convenience of how people move,” said José Muñoz, president and global COO of Hyundai Motor Company.

“We are thrilled to partner with Hyundai as we further our mission to be the world’s most trusted driver,” said Waymo’s co-CEO Tekedra Mawakana. “Hyundai’s focus on sustainability and strong electric vehicle roadmap makes them a great partner for us as we bring our fully autonomous service to more riders in more places.”

Now, this doesn’t mean you’ll be able to buy a driverless Ioniq 5 from your local Hyundai dealer; Waymo will operate these Ioniq 5s as part of its ride-hailing Waymo One fleet, which currently operates in parts of Austin, Texas; Los Angeles; Phoenix; and San Francisco. Currently, Waymo operates a fleet of Jaguar I-Pace EVs and has also used Chrysler Pacifica minivans.



Human drivers keep rear-ending Waymos

Traffic safety —

We took a close look at the 23 most serious Waymo crashes.

A Waymo vehicle in San Francisco.


Photo by JasonDoiy via Getty Images

On a Friday evening last November, police chased a silver sedan across the San Francisco Bay Bridge. The fleeing vehicle entered San Francisco and went careening through the city’s crowded streets. At the intersection of 11th and Folsom streets, it sideswiped the fronts of two other vehicles, veered onto a sidewalk, and hit two pedestrians.

According to a local news story, both pedestrians were taken to the hospital with one suffering major injuries. The driver of the silver sedan was injured, as was a passenger in one of the other vehicles.

No one was injured in the third car, a driverless Waymo robotaxi. Still, Waymo was required to report the crash to government agencies. It was one of 20 crashes with injuries that Waymo has reported through June 2024. And it’s the only crash Waymo has classified as causing a serious injury.

Twenty injuries might sound like a lot, but Waymo’s driverless cars have traveled more than 22 million miles. So driverless Waymo taxis have been involved in fewer than one injury-causing crash for every million miles of driving—a much better rate than a typical human driver.

Last week Waymo released a new website to help the public put statistics like this in perspective. Waymo estimates that typical drivers in San Francisco and Phoenix—Waymo’s two biggest markets—would have caused 64 crashes over those 22 million miles. So Waymo vehicles get into injury-causing crashes less than one-third as often, per mile, as human-driven vehicles.

Waymo claims an even more dramatic improvement for crashes serious enough to trigger an airbag. Driverless Waymos have experienced just five crashes like that, and Waymo estimates that typical human drivers in Phoenix and San Francisco would have experienced 31 airbag crashes over 22 million miles. That implies driverless Waymos are one-sixth as likely as human drivers to experience this type of crash.
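The arithmetic behind these comparisons is simple enough to check. Here is a quick sketch using the figures quoted above (20 injury crashes and five airbag crashes over roughly 22 million miles, against Waymo's human benchmarks of 64 and 31); the mileage is approximate:

```python
# Rough check of the crash-rate comparisons quoted above.
# All figures come from the article; mileage is approximate.

miles = 22_000_000

waymo_injury, human_injury_benchmark = 20, 64
waymo_airbag, human_airbag_benchmark = 5, 31

# Injury crashes per million miles of driverless operation
injury_rate = waymo_injury / (miles / 1_000_000)
print(f"{injury_rate:.2f} injury crashes per million miles")  # ~0.91, under 1

# Waymo's rates as fractions of the human benchmarks
print(f"{waymo_injury / human_injury_benchmark:.0%} of the human injury rate")  # ~31%, under a third
print(f"{waymo_airbag / human_airbag_benchmark:.0%} of the human airbag rate")  # ~16%, about a sixth
```

The ratios line up with the article's "less than one-third" and "one-sixth" characterizations.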

The new data comes at a critical time for Waymo, which is rapidly scaling up its robotaxi service. A year ago, Waymo was providing 10,000 rides per week. Last month, Waymo announced it was providing 100,000 rides per week. We can expect more growth in the coming months.

So it really matters whether Waymo is making our roads safer or more dangerous. And all the evidence so far suggests that it’s making them safer.

It’s not just the small number of crashes Waymo vehicles experience—it’s also the nature of those crashes. Out of the 23 most serious Waymo crashes, 16 involved a human driver rear-ending a Waymo. Three others involved a human-driven car running a red light before hitting a Waymo. There were no serious crashes where a Waymo ran a red light, rear-ended another car, or engaged in other clear-cut misbehavior.

Digging into Waymo’s crashes

In total, Waymo has reported nearly 200 crashes through June 2024, which works out to about one crash every 100,000 miles. Waymo says 43 percent of crashes across San Francisco and Phoenix had a delta-V of less than 1 mph—in other words, they were very minor fender-benders.

But let’s focus on the 23 most severe crashes: those that either caused an injury, caused an airbag to deploy, or both. These are good crashes to focus on not only because they do the most damage but because human drivers are more likely to report these types of crashes, making it easier to compare Waymo’s software to human drivers.

Most of these—16 crashes in total—involved another car rear-ending a Waymo. Some were quite severe: three triggered airbag deployments, and one caused a “moderate” injury. One vehicle rammed the Waymo a second time as it fled the scene, prompting Waymo to sue the driver.

There were three crashes where a human-driven car ran a red light before crashing into a Waymo:

  • One was the crash I mentioned at the top of this article. A car fleeing the police ran a red light and slammed into a Waymo, another car, and two pedestrians, causing several injuries.
  • In San Francisco, a pair of robbery suspects fleeing police in a stolen car ran a red light “at a high rate of speed” and slammed into the driver’s side door of a Waymo, triggering an airbag. The suspects were uninjured and fled on foot. The Waymo was thankfully empty.
  • In Phoenix, a car ran a red light and then “made contact with the SUV in front of the Waymo AV, and both of the other vehicles spun.” The Waymo vehicle was hit in the process, and someone in one of the other vehicles suffered an injury Waymo described as minor.

There were two crashes where a Waymo got sideswiped by a vehicle in an adjacent lane:

  • In San Francisco, Waymo was stopped at a stop sign in the right lane when another car hit the Waymo while passing it on the left.
  • In Tempe, Arizona, an SUV “overtook the Waymo AV on the left” and then “initiated a right turn,” cutting the Waymo off and causing a crash. A passenger in the SUV said they suffered moderate injuries.

Finally, there were two crashes where another vehicle turned left across the path of a Waymo vehicle:

  • In San Francisco, a Waymo and a large truck were approaching an intersection from opposite directions when a bicycle behind the truck made a sudden left in front of the Waymo. Waymo says the truck blocked Waymo’s vehicle from seeing the bicycle until the last second. The Waymo slammed on its brakes but wasn’t able to stop in time. The San Francisco Fire Department told local media that the bicyclist suffered only minor injuries and was able to leave the scene on their own.
  • A Waymo in Phoenix was traveling in the right lane. A row of stopped cars was in the lane to its left. As Waymo approached an intersection, a car coming from the opposite direction made a left turn through a gap in the row of stopped cars. Again, Waymo says the row of stopped cars blocked it from seeing the turning car until it was too late. A passenger in the turning vehicle reported minor injuries.

It’s conceivable that Waymo was at fault in these last two cases—it’s impossible to say without more details. It’s also possible that Waymo’s erratic braking contributed to a few of those rear-end crashes. Still, it seems clear that a non-Waymo vehicle bore primary responsibility for most, and possibly all, of these crashes.

“About as good as you can do”

One should always be skeptical when a company publishes a self-congratulatory report about its own safety record. So I called Noah Goodall, a civil engineer with many years of experience studying roadway safety, to see what he made of Waymo’s analysis.

“They’ve been the best of the companies doing this,” Goodall told me. He noted that Waymo has a team of full-time safety researchers who publish their work in reputable journals.

Waymo knows precisely how often its own vehicles crash because its vehicles are bristling with sensors. The harder problem is calculating an appropriate baseline for human-caused crashes.

That’s partly because human drivers don’t always report their own crashes to the police, insurance companies, or anyone else. But it’s also because crash rates differ from one area to another. For example, there are far more crashes per mile in downtown San Francisco than in the suburbs of Phoenix.

Waymo tried to account for these factors as it calculated crash rates for human drivers in both Phoenix and San Francisco. To ensure an apples-to-apples comparison, Waymo’s analysis excludes freeway crashes from its human-driven benchmark, since Waymo’s commercial fleet doesn’t use freeways yet.

Waymo estimates that human drivers fail to report 32 percent of injury crashes; the company raised its benchmark for human crashes to account for that. But even without this under-reporting adjustment, Waymo’s injury crash rate would still be roughly 60 percent below that of human drivers. The true number is probably somewhere between the adjusted number (70 percent fewer crashes) and the unadjusted one (60 percent fewer crashes). It’s an impressive figure either way.

Waymo says it doesn’t apply an under-reporting adjustment to its human benchmark for airbag crashes, since humans almost always report crashes that are severe enough to trigger an airbag. So it’s easier to take Waymo’s figure here—an 84 percent decline in airbag crashes—at face value.
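As a sanity check on those two percentages: starting from the adjusted benchmark of 64 human injury crashes, both reduction figures fall out directly. A small sketch, with the caveat that the exact form of Waymo's under-reporting correction is an assumption here (the 1.32 scale factor reproduces the article's numbers, but Waymo's methodology may differ):

```python
# Hypothetical reconstruction of the under-reporting adjustment.
# Assumes the adjusted benchmark scales the reported figure by 1.32;
# Waymo's exact methodology may differ.

waymo_injury_crashes = 20
adjusted_benchmark = 64                           # human benchmark with under-reporting correction
unadjusted_benchmark = adjusted_benchmark / 1.32  # ~48.5 crashes humans actually report

reduction_adjusted = 1 - waymo_injury_crashes / adjusted_benchmark
reduction_unadjusted = 1 - waymo_injury_crashes / unadjusted_benchmark

print(f"adjusted:   {reduction_adjusted:.0%} fewer injury crashes")    # ~69%, i.e. roughly 70 percent
print(f"unadjusted: {reduction_unadjusted:.0%} fewer injury crashes")  # ~59%, i.e. roughly 60 percent
```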

Waymo’s benchmarks for human drivers are “about as good as you can do,” Goodall told me. “It’s very hard to get this kind of data.”

When I talked to other safety experts, they were equally positive about the quality of Waymo’s analysis. For example, last year, I asked Phil Koopman, a professor of computer engineering at Carnegie Mellon, about a previous Waymo study that used insurance data to show its cars were significantly safer than human drivers. Koopman told me Waymo’s findings were statistically credible, with some minor caveats.

Similarly, David Zuby, the chief research officer at the Insurance Institute for Highway Safety, had mostly positive things to say about a December study analyzing Waymo’s first 7.1 million miles of driverless operations.

I found a few errors in Waymo’s data

If you look closely, you’ll see that one of the numbers in this article differs slightly from Waymo’s safety website. Specifically, Waymo says that its vehicles get into crashes that cause injury 73 percent less often than human drivers, while the figure I use in this article is 70 percent.

This is because I spotted a couple of apparent classification mistakes in the raw data Waymo used to generate its statistics.

Each time Waymo reports a crash to the National Highway Traffic Safety Administration, it records the severity of injuries caused by the crash. This can be fatal, serious, moderate, minor, none, or unknown.

When Waymo shared an embargoed copy of its numbers with me early last week, it said that there had been 16 injury crashes. However, when I looked at the data Waymo had submitted to federal regulators, it showed 15 minor injuries, two moderate injuries, and one serious injury, for a total of 18.

When I asked Waymo about this discrepancy, the company said it found a programming error. Waymo had recently started using a moderate injury category and had not updated the code that generated its crash statistics to count these crashes. Waymo fixed the error quickly enough that the official version Waymo published on Thursday of last week showed 18 injury crashes.

However, as I continued looking at the data, I noticed another apparent mistake: Two crashes had been put in the “unknown” injury category, yet the narrative for each crash indicated an injury had occurred. One report said “the passenger in the Waymo AV reported an unspecified injury.” The other stated that “an individual involved was transported from the scene to a hospital for medical treatment.”

I notified Waymo about this apparent mistake on Friday, and the company said it was looking into it. As I write this, the website still claims a 73 percent reduction in injury crashes. But I think it’s clear that these two “unknown” crashes were actually injury crashes. So, all of the statistics in this article are based on the full list of 20 injury crashes.

I think this illustrates that I come by my generally positive outlook on Waymo honestly: I probably scrutinize Waymo’s data releases more carefully than any other journalist, and I’m not afraid to point out when the numbers don’t add up.

Based on my conversations with Waymo, I’m convinced these were honest mistakes rather than deliberate efforts to cover up crashes. I could only identify these mistakes because Waymo went out of its way to make its findings reproducible. It would make no sense to do that if the company simultaneously tried to fake its statistics.

Could there be other injury or airbag-triggering crashes that Waymo isn’t counting? It’s certainly possible, but I doubt there have been very many. You might have noticed that I linked to local media reporting for some of Waymo’s most significant crashes. If Waymo deliberately covered up a severe crash, there would be a big risk that a crash would get reported in the media and then Waymo would have to explain to federal regulators why it wasn’t reporting all legally required crashes.

So, despite the screwups, I find Waymo’s data to be fairly credible, and those data show that Waymo’s vehicles crash far less often than human drivers on public roads.

Tim Lee was on staff at Ars from 2017 to 2021. Last year, he launched a newsletter, Understanding AI, that explores how AI works and how it’s changing our world. You can subscribe here.



Self-driving Waymo cars keep SF residents awake all night by honking at each other

The ghost in the machine —

Haunted by glitching algorithms, self-driving cars disturb the peace in San Francisco.

A Waymo self-driving car in front of Google's San Francisco headquarters, San Francisco, California, June 7, 2024.


Silicon Valley’s latest disruption? Your sleep schedule. On Saturday, NBC Bay Area reported that San Francisco’s South of Market residents are being awakened throughout the night by Waymo self-driving cars honking at each other in a parking lot. No one is inside the cars, and they appear to be automatically reacting to each other’s presence.

Videos provided by residents to NBC show Waymo cars filing into the parking lot and attempting to back into spots, which seems to trigger honking from other Waymo vehicles. The automatic nature of these interactions—which seem to peak around 4 am every night—has left neighbors bewildered and sleep-deprived.

NBC Bay Area’s report: “Waymo cars keep SF neighborhood awake.”

According to NBC, the disturbances began several weeks ago when Waymo vehicles started using a parking lot off 2nd Street near Harrison Street. Residents in nearby high-rise buildings have observed the autonomous vehicles entering the lot to pause between rides, but the cars’ behavior has become a source of frustration for the neighborhood.

Christopher Cherry, who lives in an adjacent building, told NBC Bay Area that he initially welcomed Waymo’s presence, expecting it to enhance local security and tranquility. However, his optimism waned as the frequency of honking incidents increased. “We started out with a couple of honks here and there, and then as more and more cars started to arrive, the situation got worse,” he told NBC.

The lack of human operators in the vehicles has complicated efforts to address the issue directly since there is no one they can ask to stop honking. That lack of accountability forced residents to report their concerns to Waymo’s corporate headquarters, which had not responded to the incidents until NBC inquired as part of its report. A Waymo spokesperson told NBC, “We are aware that in some scenarios our vehicles may briefly honk while navigating our parking lots. We have identified the cause and are in the process of implementing a fix.”

The absurdity of the situation prompted tech author and journalist James Vincent to write on X, “current tech trends are resistant to satire precisely because they satirize themselves. a car park of empty cars, honking at one another, nudging back and forth to drop off nobody, is a perfect image of tech serving its own prerogatives rather than humanity’s.”



Waymo is suing people who allegedly smashed and slashed its robotaxis

Waymo car is vandalized in San Francisco

The people of San Francisco haven’t always been kind to Waymo’s growing fleet of driverless taxis. The autonomous vehicles, which provide tens of thousands of rides each week, have been torched, stomped on, and verbally berated in recent months. Now Waymo is striking back—in the courts.

This month, the Silicon Valley company filed a pair of lawsuits, neither of which has been previously reported, that demand hundreds of thousands of dollars in damages from two alleged vandals. Waymo attorneys said in court papers that the alleged vandalism, which ruined dozens of tires and a tail end, is a significant threat to the company’s reputation. Riding in a vehicle in which the steering wheel swivels on its own can be scary enough. Having to worry about attackers allegedly targeting the rides could undermine Waymo’s ride-hailing business before it even gets past its earliest stage.

Waymo, which falls under the umbrella of Google parent Alphabet, operates a ride-hailing service in San Francisco, Phoenix, and Los Angeles that is comparable to Uber and Lyft except with sensors and software controlling the driving. While its cars haven’t contributed to any known deadly crashes, US regulators continue to probe their sometimes erratic driving. Waymo spokesperson Sandy Karp says the company always prioritizes safety and that the lawsuits reflect that strategy. She declined further comment for this story.

In a filing last week in the California Superior Court of San Francisco County, Waymo sued a Tesla Model 3 driver whom it alleges intentionally rear-ended one of its autonomous Jaguar crossovers. According to the suit, the driver, Konstantine Nikka-Sher Piterman, claimed in a post on X that “Waymo just rekt me” before going on to ask Tesla CEO Elon Musk for a job. The other lawsuit from this month, filed in the same court, targets Ronaile Burton, who allegedly slashed the tires of at least 19 Waymo vehicles. San Francisco prosecutors have filed criminal charges against her to which she has pleaded not guilty. A hearing is scheduled for Tuesday.

Burton’s public defender, Adam Birka-White, says in a statement that Burton “is someone in need of help and not jail” and that prosecutors continue “to prioritize punishing poor people at the behest of corporations, in this case involving a tech company that is under federal investigation for creating dangerous conditions on our streets.”

An attorney for Burton in the civil case hasn’t been named in court records, and Burton is currently in jail and couldn’t be reached for comment. Piterman didn’t respond to a voicemail, a LinkedIn message, and emails seeking comment. He hasn’t responded in court to the accusations.

Based on available records from courts in San Francisco and Phoenix, it appears that Waymo hasn’t previously filed similar lawsuits.

In the Tesla case, Piterman “unlawfully, maliciously, and intentionally” sped his car past a stop sign and into a Waymo car in San Francisco on March 19, according to the company’s suit. When the Waymo tried to pull over, Piterman allegedly drove the Tesla into the Waymo car again. He then allegedly entered the Waymo and later threatened a Waymo representative who responded to the scene in person. San Francisco police cited Piterman, according to the lawsuit. The police didn’t respond to WIRED’s request for comment.



On self-driving, Waymo is playing chess while Tesla plays checkers

A Waymo autonomous taxi in San Francisco.


David Paul Morris/Bloomberg via Getty Images

Tesla fans—and CEO Elon Musk himself—are excited about the prospects for Tesla’s Full Self-Driving (FSD) software. Tesla released a major upgrade—version 12.3—of the software in March. Then, last month, Musk announced that Tesla would unveil a purpose-built robotaxi on August 8. Last week, Musk announced that a new version of FSD—12.4—will come out in the coming days and will have a “5X to 10X improvement in miles per intervention.”

But I think fans expecting Tesla to launch a driverless taxi service in the near future will be disappointed.

During a late March trip to San Francisco, I had a chance to try the latest self-driving technology from both Tesla and Google’s Waymo.

During a 45-minute test drive in a Tesla Model X, I had to intervene twice to correct mistakes by the FSD software. In contrast, I rode in driverless Waymo vehicles for more than two hours and didn’t notice a single mistake.

So while Tesla’s FSD version 12.3 seems like a significant improvement over previous versions of FSD, it still lags behind Waymo’s technology.

However, Waymo’s impressive performance comes with an asterisk. While no one was behind the wheel during my rides, Waymo has remote operators that sometimes provide guidance to its vehicles (Waymo declined to tell me whether—or how often—remote operators intervened during my rides). And while Tesla’s FSD works on all road types, Waymo’s taxis avoid freeways.

Many Tesla fans see these limitations as signs that Waymo is headed for a technological dead end. They see Tesla’s FSD, with its capacity to operate in all cities and on all road types, as a more general technology that will soon surpass Waymo.

But this fundamentally misunderstands the situation.

Safely operating driverless vehicles on public roads is hard. With no one in the driver’s seat, a single mistake can be deadly—especially at freeway speeds. So Waymo launched its driverless service in 2020 in the easiest environment it could find—residential streets in the Phoenix suburbs—and has been gradually ratcheting up the difficulty level as it gains confidence in its technology.

In contrast, Tesla hasn’t started driverless testing because its software isn’t ready. For now, geographic restrictions and remote assistance aren’t needed because there’s always a human being behind the wheel. But I predict that when Tesla begins its driverless transition, it will realize that safety requires a Waymo-style incremental rollout.

So Tesla hasn’t found a different, better way to bring driverless technology to market. Waymo is just so far ahead that it’s dealing with challenges Tesla hasn’t even started thinking about. Waymo is playing chess while Tesla is still playing checkers.

The current excitement around Tesla’s FSD reminds me of the hype that surrounded Waymo in 2018. Early that year, Waymo announced deals to purchase 20,000 I-Pace crossovers from Jaguar and 62,000 Pacifica minivans from Fiat Chrysler.

But the service Waymo launched in December 2018 was a disappointment. There were still safety drivers behind the wheel on most rides, and access was limited to a handpicked group of passengers.

It wasn’t until October 2020 that Waymo finally launched a fully driverless taxi service in the Phoenix area that was open to the general public. And even after that, Waymo expanded slowly.

Waymo began offering commercial service in San Francisco in 2023 and is now expanding to Los Angeles and Austin. Today, the company has only a few hundred vehicles in its commercial fleet—far fewer than the 82,000 vehicles it was planning to purchase six years ago.

What went wrong? In an August 2018 article, journalist Amir Efrati reported on the limitations of Waymo’s technology. Efrati wrote that “Waymo vans have trouble with many unprotected left turns and with merging into heavy traffic in the Phoenix area.” In addition, “the cars have trouble separating people, or cyclists, who are in groups, especially people near shopping centers or in parking lots.”



Waymo and Uber Eats start human-less food deliveries in Phoenix

Someday the robots will be mad that we aren’t tipping them —

You’ll need to run outside when your robot delivery arrives.

A Waymo Jaguar I-Pace.


Waymo

Your next food delivery driver may be a robot.

Waymo and Uber have been working together on regular Ubers for a while, but the two companies are now teaming up for food delivery. Automated Uber Eats is rolling out to Waymo’s Phoenix service area. Waymo says this will start with “select merchants in Chandler, Tempe and Mesa, including local favorites like Princess Pita, Filiberto’s, and Bosa Donuts.”

Phoenix Uber Eats customers can fire up the app and order some food, and they might see the message “autonomous vehicles may deliver your order.” Waymo says you’ll be able to opt out of robot delivery at checkout if you want.

The pop-up screen if a Waymo is delivering your order.

Waymo

Of course, the big difference between human and robot food delivery is that the human driver will take your food door to door, while for the Waymo option, you’ll need to run outside and flag down your robot delivery vehicle when it arrives. Just like regular Uber, you’ll get a notification through the app when it’s time. The food should be in the trunk. If you get paired with a Waymo, your delivery tip will be refunded. Waymo doesn’t explain how the restaurant side of things will work, but inevitably, some poor food server will need to run outside when the Waymo arrives.

It seems pretty wasteful to have a 2-ton, crash-tested vehicle designed to seat five humans delivering a small bag of food, but at least the Jaguar I-Pace Waymos are all-electric. It’s a shame Waymo’s smaller “Firefly” cars were retired. There are smaller, more purpose-built food delivery bots out there—Uber Eats is partnered with Serve Robotics for smaller robot delivery—but these are all sidewalk-cruising, walking-speed robots that can only go a few blocks. The Nuro R3 (Nuro is also partnered with Uber) seems like a good example of what a road-going delivery vehicle should look like—it’s designed for food and not people, and it comes with heated or cooled food compartments. Waymo is still the industry leader in automated driving, though.
