A School District Tried to Help Train Waymos to Stop for School Buses

Original link: https://www.wired.com/story/a-school-district-tried-to-help-train-waymos-to-stop-for-school-buses-it-didnt-work/

Waymo's self-driving cars repeatedly failed to properly yield to school buses in Austin, Texas, even as the buses flashed their warning lights and extended their stop arms, a critical safety issue. Despite the technology's promise of learning from fleet-wide experience, Waymo vehicles continued to illegally pass school buses *after* a federal recall and a software update intended to fix the problem. The Austin Independent School District (AISD) documented at least 19 incidents, some occurring dangerously close to students. AISD worked with Waymo and even hosted a data-collection event, but the violations persisted. Experts believe the underlying problem is that the software struggles to recognize flashing lights and extended safety devices, an issue that compounds as the fleet drives more. The National Transportation Safety Board and NHTSA are investigating, and AISD is weighing legal action to protect student safety. The case highlights potential "blind spots" in autonomous driving technology and calls into question the industry's ability to correct such problems effectively, even after they have been identified.

## Waymo and School Bus Safety: Summary

A recent WIRED article details Waymo's ongoing trouble with consistently and safely handling stopped school buses. A school district's attempt to help train Waymo's system was unsuccessful, and a preliminary National Transportation Safety Board (NTSB) report indicates that Waymo vehicles sometimes drove past stopped school buses, in some cases because of erroneous guidance from remote human assistants.

Discussion on Hacker News highlighted concerns about Waymo's compliance with traffic laws, including speeding and improper lane use. Commenters questioned how liability is assigned (who gets the ticket for a Waymo's violation?) and whether fleets should face penalties comparable to those for human drivers, such as license suspension. Many argued that Waymo needs to operate the way a human driver does and adapt to real-world traffic flow, even if that means exceeding the speed limit. Others countered that current fines are negligible for a company as large as Google and proposed stiffer penalties, possibly including fleet-wide license suspension, to incentivize safer programming. The core debate is whether Waymo should be held to a higher standard than human drivers, or should merely operate within the existing, often imperfect, rules of the road.

Original Article

One of the purported advantages of self-driving car tech is that every car can learn from one vehicle’s mistakes. Here’s how Waymo puts it on its website: “The Waymo Driver learns from the collective experiences gathered across our fleet, including previous hardware generations.”

But in Austin, Waymo’s vehicles struggled for months to learn how to stop for school buses as drivers picked up and dropped off children. An official with the Austin Independent School District (AISD) alleged that the vehicles had, in at least 19 instances, “illegally and dangerously” passed the district’s school buses while their red lights were flashing and their stop arms were extended rather than coming to complete stops, as the law requires.

In early December, Waymo even issued a federal recall related to the incidents, acknowledging at least 12 of them to federal regulators at the National Highway Traffic Safety Administration (NHTSA), which oversees road safety. According to federal filings, engineers with the self-driving vehicle company had “developed software changes to address the behavior” weeks before.

But even after the recall, the school-bus-passing incidents continued, according to school officials and a report from the National Transportation Safety Board (NTSB), an independent federal safety watchdog that’s also investigating the situation.

Now, email and text messages between school officials and Waymo representatives, obtained by WIRED through a public records request, show the lengths that the Austin public school district and Waymo went to try to solve the problem. AISD even hosted a half-day “data collection” event in a school parking lot in mid-December, the documents show, with several employees pulling together school buses and stop-arm signals from across the fleet so the self-driving car company could collect information related to vehicles and their flashing lights.

Still, by mid-January, over a month later, the school district reported at least four more school-bus-passing incidents had taken place in Austin. “The data we collected from the beginning of the school year to the end of the semester shows that about 98 percent of people that receive one violation do not receive another,” an official with the school’s police department told the local NBC affiliate that month. “That tells us that the person is learning, but it does not appear the Waymo automated driver system is learning through its software updates, its recall, what have you, because we are still having violations.”

The situation raises questions about self-driving technology's curious blind spots and the industry's ability to compensate for them even after they've been spotted.

Self-driving software has long struggled with recognizing flashing emergency lights and road safety devices with long, thin arms, including gates and stop-arms, says Missy Cummings, who researches autonomous vehicles at George Mason University and served as a safety adviser to the NHTSA during the Biden administration. “If [the company] didn't fix this a few years ago, the more they drive, the more it’s going to be a problem,” she says. “That’s exactly what’s happening here.”

Waymo did not respond to WIRED’s requests for comment. A spokesperson for the Austin Independent School District referred WIRED to the NTSB while the incidents are under investigation. A spokesperson for the NTSB declined to answer WIRED’s questions while its investigation continues.

By midwinter of 2025, AISD officials were frustrated. In one of the 19 incidents alleged by a lawyer for the district in a letter later released by federal road safety regulators, a Waymo passed a school bus letting off children “only moments after a student crossed in front of the vehicle, and while the student was still in the road.”

“Alarmingly,” the lawyer wrote, five of the alleged incidents had occurred after Waymo had assured the district that it had updated its software to fix the problem. Federal regulators with the NHTSA had already launched a probe into the behavior. “Austin ISD is evaluating all potential legal remedies at its disposal and intends to take whatever action is necessary to protect the safety of its students, if required,” the lawyer warned.
