


Being a Tesla self-driving tester seems like the most stressful job ever

If a job description listed “must remove all sense of legal duty, moral ethics and humanity” under responsibilities, would you apply? According to a new report, that’s essentially the role of a test driver at Tesla.

Autonomous driving vehicles are the future. If tech companies have their way, self-driving cars would be the only future. However, many of us live in reality and understand that software is nowhere near where it should be for this sci-fi level of automation. But those with the funds will continue to try to speed up the process, even if it means beta testing on public roads with all of us as guinea pigs.

Business Insider has revealed details of a specialist team of Tesla test drivers dubbed Project Rodeo. The team is trained to push the limits of the carmaker’s Full Self-Driving (FSD) and Autopilot software. What exactly is the limit? Crashing. The general rule seems to be to get as close as possible to colliding with something or someone. The scarier the situation, the better.

“You’re riding on adrenaline for the entire eight-hour shift,” said one former test driver. “There’s this feeling that you’re on the verge of something going really wrong.”

Nine current and former Project Rodeo drivers and three Autopilot engineers in California, Florida and Texas were interviewed; most asked to remain anonymous. The situations they describe are revealing but not surprising. Although FSD-related crashes are well documented, none of the interviewees were involved in any.

Project Rodeo is a testing group made up of smaller teams. The “golden manual” team, for example, drives by the book, obeys the rules of the road and uses no driver-assistance features. At the opposite end of the spectrum is the “critical intervention” team. More passengers than drivers, critical-intervention testers let the software handle every aspect of driving. They only step in, or “intervene,” to avoid a collision.

Part of the reason test drivers wait until the last possible moment to take manual control is that it gives the software time to react and make the right, or wrong, decision. The more data collected, especially in real-world scenarios, the easier it is for engineers to adjust and update the software.

“We want the data to know what led the car to make that decision,” said a former Autopilot engineer. “If you keep intervening too early, we don’t really get to the exact moment where we’re like, OK, we understand what happened.”

However, this leads to vehicles being allowed to run red lights, cross double yellow lines, ignore stop signs and exceed speed limits, all on public roads. Even if a situation became uncomfortable for the driver, supervisors would say they took over too soon. As a result, Project Rodeo drivers, even those outside the critical-intervention teams, felt pressured to prolong risky driving situations, or sometimes create them outright, to test the software and keep their jobs.

John Bernal, a former test pilot and data analyst, said he was never asked to deliberately break the law when it came to data collection, but it was strongly implied. “My training was to wait for the wheels to touch the white line before I could brake,” he said.

Additionally, some drivers were used solely to train the software to recognize and adapt to “vulnerable road users,” such as pedestrians, cyclists and wheelchair users. While riding with his trainer, a former tester said their vehicle came within about a meter of a cyclist before he braked.

“I vividly remember this guy jumping off his bike,” he said. “He was terrified. The car lunged at him, and all I could do was slam on the brakes.” Apparently his trainer was pleased, telling him that his delayed reaction was “perfect” and exactly what they wanted him to do. “It felt like the goal was almost to simulate an accident and then prevent it at the last second.”

Cruise and Waymo are also developing self-driving cars, but say they conduct rigorous software testing in controlled environments or believe their autonomous systems are “fundamentally different” from Tesla’s. So why are these companies having the same problems with vehicles that don’t read the room, so to speak? In the case of Uber’s now-shuttered self-driving division, the results were sometimes deadly.

“If you have a parent who holds the bike all the time, they never learn,” said a former Tesla engineer. Ultimately, data is king. For these autonomous-technology companies now at the mercy of shareholders, it is a high-risk, high-reward environment that the public never signed up for.