2020-10-24 11:31

Tesla’s "Full Self-Driving" beta is here, and it looks scary as hell



Illustration by Alex Castro / The Verge


This week, Tesla began pushing its “Full Self-Driving” (FSD) update to a select group of customers, and the first reactions are now beginning to roll in. The software, which enables drivers to use many of Autopilot’s advanced driver-assist features on local, non-highway streets, is still in beta. As such, it requires constant monitoring while in operation. Or as Tesla warns in its introductory language, “it may do the wrong thing at the worst time.”


Frankly, this looks terrifying — not because it seems erratic or malfunctioning, but because of the way it will inevitably be misused.


Early reactions to the software update range from “that was a little scary” to full-throated enthusiasm for CEO Elon Musk’s willingness to let his customers beta-test features that aren’t ready for wide release. This willingness has helped Tesla maintain its position at the forefront of electric and autonomous vehicle technology, but it also presents a huge risk to the company, especially if those early tests go wrong.


A Tesla owner who goes by the handle “Tesla Raj” posted a 10-minute video on Thursday that purports to show his experience with FSD. He says he used the feature while driving down “a residential street... with no lane markers” — something Tesla’s Autopilot previously was unable to do.


Right off the bat, there are stark differences in how FSD is presented to the driver. The visuals displayed on the instrument cluster look more like training footage from an autonomous vehicle, with transparent orange boxes outlining parked cars and other vehicles on the road and icons that represent road signs. The car’s path is depicted as blue dots stretching out in front of the vehicle. And various messages pop up that tell the driver what the car is going to do, such as “stopping for traffic control in 75 ft.”


The car also made several left- and right-hand turns on its own, which Raj described as “kind of scary, because we’re not used to that.” He also said the turns were “human like” insofar as the vehicle inched out into the opposite lane of traffic to assert itself before making the turn.


Another Tesla owner who lives in Sacramento, California, and tweets under the handle @brandonee916 posted a series of short videos that claim to show a Tesla vehicle using FSD to navigate a host of tricky driving scenarios, including intersections and a roundabout. These videos were first reported by Electrek.


The vehicles in both Tesla Raj and @brandonee916’s tests are driving at moderate speeds, between 25 and 35 mph, a range that has been very challenging for Tesla. Musk has said Tesla Autopilot can handle high-speed driving with its Navigate on Autopilot feature and low speeds with its Smart Summon parking feature. (How well Smart Summon works is up for debate, given the number of Tesla owners reporting bugs in the system.) The company has yet to allow its customers hands-off driving on highways, as Cadillac does with its Autopilot competitor Super Cruise. But these medium speeds, where the vehicle is more likely to encounter traffic signals, intersections, and other complexities, are where Tesla has encountered a lot of difficulties.


For now, FSD is only available to Tesla owners in the company’s early access beta-testing program, but Musk has said he expects a “wide release” before the end of 2020. The risk, obviously, is that Tesla’s customers will ignore the company’s warnings and misuse FSD to record themselves performing dangerous stunts — much like they have done for years and continue to do on a regular basis. This type of rule-breaking is to be expected, especially in a society where clout-chasing has become a way of life for many people.


Tesla has said Autopilot should only be used by attentive drivers with both hands on the wheel. But the feature is designed to assist a driver, and it’s not foolproof: there have been several high-profile incidents in which drivers have engaged Autopilot, crashed, and died.


“Public road testing is a serious responsibility and using untrained consumers to validate beta-level software on public roads is dangerous and inconsistent with existing guidance and industry norms,” said Ed Niedermeyer, communications director for Partners for Automated Vehicle Education, a group that includes nonprofits and AV operators like Waymo, Argo, Cruise, and Zoox. “Moreover, it is extremely important to clarify the line between driver assistance and autonomy. Systems requiring human driver oversight are not self-driving and should not be called self-driving.”

