Cruise CEO says SF ‘should be rolling out the red carpet’ for robotaxis, threatens to maybe leave town

In his first major public interview since the DMV cut Cruise’s San Francisco fleet in half, CEO Kyle Vogt said “we cannot expect perfection” from the self-driving cars, and vaguely threatened to leave town if regulators curtail them any further.

  • chakan2@lemmy.world · +106 / −10 · 1 year ago

    You MUST DEMAND perfection from self-driving cars. Mistakes cost lives.

    Fuck this guy.

    • GenderNeutralBro@lemmy.sdf.org · +47 / −3 · 1 year ago

      I don’t know about “perfection”, but we should at least aim to be better than most human drivers.

      I’d be comfortable holding robot drivers to the same standard as human drivers if there were similar levels of accountability. That said, I think the current standards for licensing human drivers are far too low. Tons of people on the road are simply not capable of driving safely, consistently, and legally. I would support measures to raise the bar for human drivers as well, but since that is extremely unlikely, we can at least establish better standards for the future.

      • Rolder@reddthat.com · +18 / −6 · 1 year ago

        Just hold the CEO directly liable for any deaths or injuries. Like someone gets hit? That’s a reckless driving charge for the CEO. They would get perfect real quick.

      • Psythik@lemm.ee · +10 / −2 · 1 year ago

        As a pedestrian, I’d sooner trust a self-driving car to ID and stop for me than I’d trust a human to do the same. Humans make way more mistakes than these cars do. It just doesn’t make the news when humans fuck up cause we do it all the damn time. But accidents are so rare for self-driving cars that every time one happens, it makes headlines, and then a bunch of idiots show up in the comments to throw shade at them when they’re much worse drivers themselves.

        And then more idiots show up and upvote them.

      • DeadlineX@lemm.ee · +8 / −2 · 1 year ago

        Yeah a lot of people drive selfishly and dangerously. Until we get alternative transportation, however, more stringent licensing will just condemn poorer folks to worse poverty and possibly being cast to the streets.

        We need better public transportation before we can cripple people’s ability to get where they need to be. Including work.

      • guacupado@lemmy.world · +3 · 1 year ago

        I’ve always thought that self-driving cars won’t go mainstream until local departments of transportation actively help these vehicles recognize their surroundings. Cities will need to maintain their paint much more often so that yellow and white lines are easier for the AI to pick out. We also need more of those hooded LED street lights so the signal colors stand out better. I’m sure there are ways to make signs more readable to AI as well, but all of this needs to be done with help from local governments. Autonomous vehicles get better the more other autonomous vehicles are on the road.
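
        As an illustration of why paint contrast matters to the machines, here is a minimal classical lane-line detection sketch in Python/OpenCV. It is not Cruise’s actual perception stack (that is proprietary and learning-based); it is just the textbook pipeline, and it shows where faded markings fall apart: the edge detector finds fewer edges to fit lines to. The file name is hypothetical.

        ```python
        # Minimal classical lane-line sketch (illustrative only).
        # Faded paint -> weaker contrast -> fewer Canny edges -> fewer Hough lines.
        import cv2
        import numpy as np

        def detect_lane_lines(image_path: str):
            img = cv2.imread(image_path)                  # BGR road image
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            blur = cv2.GaussianBlur(gray, (5, 5), 0)      # suppress sensor noise
            edges = cv2.Canny(blur, 50, 150)              # faded paint yields few edges
            lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                    threshold=50, minLineLength=40, maxLineGap=20)
            return [] if lines is None else [l[0] for l in lines]

        # segments = detect_lane_lines("road.jpg")  # hypothetical input image
        ```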

        • GenderNeutralBro@lemmy.sdf.org · +5 / −1 · 1 year ago

          I’d want it to be regulated like other safety features. If they shipped a car with faulty brakes or any other safety defects, it would be a legal issue. Fines, recalls, etc. Ideally it should be enough that half-assing it would put them out of business.

    • Son_of_dad@lemmy.world · +8 · 1 year ago

      Perfection right out of the gate is impossible, but I think SF is too big for these kinds of tests. Use smaller towns, if anything.

    • just another dev@lemmy.my-box.dev · +11 / −3 · 1 year ago

      As the other guy said: demanding perfection is insane - we don’t demand that from human drivers either. As long as it’s better than humans (preferably by a long shot), I’m all in favour.

        • just another dev@lemmy.my-box.dev · +1 · edited · 1 year ago

          Yeah, legislation and requirements for a self-driving car to be allowed on the road will have to be updated. But an automated car can’t drink and drive, or make the intentional decision to run someone over because it hates them. I don’t see how vehicular homicide would apply.

          If somebody reprograms a car to murder someone, they are at fault. In all other cases - accidents - liability and insurance would have to shift from the driver to the manufacturer.

      • supercriticalcheese@feddit.it · +3 · 1 year ago

        We don’t even know if they are better than humans in more demanding driving environments: higher-speed roads and so on.

        It is insane to think the slow-speed tests are representative of all possible scenarios. They might fail at things like roundabouts or merging onto motorways much more often than humans do, or at who knows what other edge cases.

        • just another dev@lemmy.my-box.dev · +2 · 1 year ago

          I agree. That will need to be proven. But when they are better than, say, 90% of all drivers, it would make sense to switch. Waiting until they’re “perfect” (which is the requirement I object to) just needlessly costs lives.

          • supercriticalcheese@feddit.it · +1 · 1 year ago

            Depends on what happens when they make errors. Are their errors comparable to human errors, or are they prone to making mistakes with worse consequences than humans on average?

            They might be 99.99% perfect but, in the remaining 0.01% of cases, cause massive pileups on motorways (for example) for reasons nobody anticipated.

            A proper risk analysis, and a controlled transition based on it, should come first.
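
            As a back-of-the-envelope illustration (all numbers made up), here is the kind of comparison I mean: a fleet that errs far less often, but fails catastrophically when it does, can still produce more expected harm.

            ```python
            # Toy expected-harm comparison with entirely made-up numbers.
            TRIPS_PER_YEAR = 1_000_000

            human_error_rate = 0.001    # 1 incident per 1,000 trips (assumed)
            human_harm_per_error = 0.5  # avg injuries per incident (assumed)

            av_error_rate = 0.0001      # 1 per 10,000 trips: "99.99% perfect"
            av_harm_per_error = 8.0     # rare failures are pileups (assumed)

            human_harm = TRIPS_PER_YEAR * human_error_rate * human_harm_per_error
            av_harm = TRIPS_PER_YEAR * av_error_rate * av_harm_per_error

            print(f"humans: {human_harm:.0f} expected injuries/year")  # 500
            print(f"AVs:    {av_harm:.0f} expected injuries/year")     # 800
            ```

            The specific numbers don’t matter; the point is that error frequency alone doesn’t settle it, the severity distribution matters too.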

            • just another dev@lemmy.my-box.dev · +1 · 1 year ago

              Yups, fully agreed.

              When it comes down to it, I’d much rather have the mass pileup you describe once every few years (which can then be analysed and remedied thanks to the telemetry involved) than the 3,000-plus traffic deaths a day we have now.

      • chakan2@lemmy.world · +6 / −3 · edited · 1 year ago

        We demand perfection in a lot of fields, and we are a hell of a lot closer to it than the wild west of AI alphas we have driving around.

        Aviation, medicine, space travel, etc.

        We can get to extreme levels of quality when lives are at risk. Driverless cars put lives at risk.

        Humans are a terribly low bar to use for a quality measure. Also, a human will (usually) do their best to mitigate damage in an accident.

        In the case of Tesla…fuck it…I’m going through that parked semi at 80mph.

        • just another dev@lemmy.my-box.dev · +3 / −3 · 1 year ago

          None of those fields have achieved perfection. Airplanes crash; people die in hospitals and in space shuttles. If anything, computer assistance has made those fields safer than before.

          If (when) robocars are safer than human drivers, fewer people will die in traffic accidents. It’s not a perfect bar to settle on, but it’s better than the current standard.

          Again, denying improvements because they’re less than perfect is just insane.

          • chakan2@lemmy.world · +2 / −2 · 1 year ago

            Denying “improvements” that cost innocent bystanders their lives is the only responsible choice.

            I was game for the great experiment 10 years ago. But the tech just hasn’t gotten better, and arguably is worse today.

            It’s time to say enough is enough and restrict driverless tech to controlled areas.

            Being simply better than the average human isn’t enough here.

            • just another dev@lemmy.my-box.dev · +2 / −1 · 1 year ago

              I never said better than the average driver; I said better than human drivers (preferably by a long shot).

              So let’s say that means better than 90% of all drivers. That isn’t going to cost lives, it’s going to save them. Not to mention improve traffic flow.
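
              A rough sketch of why (distribution and parameters entirely assumed): crash risk is concentrated among the worst drivers, so if every vehicle performed like a driver who is safer than 90% of the pool, total crashes would drop by far more than 10%.

              ```python
              # Illustrative only: assumes per-driver crash rates follow a
              # right-skewed lognormal distribution (made-up parameters).
              import numpy as np

              rng = np.random.default_rng(0)
              n_drivers = 1_000_000
              crash_rates = rng.lognormal(mean=-4.0, sigma=1.0, size=n_drivers)

              current_total = crash_rates.sum()          # status quo
              p10_rate = np.percentile(crash_rates, 10)  # safer than 90% of drivers
              av_total = p10_rate * n_drivers            # everyone at that rate

              print(f"crash reduction: {100 * (1 - av_total / current_total):.0f}%")
              # roughly an 80%+ reduction with these assumed parameters
              ```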

              • chakan2@lemmy.world · +1 / −2 · 1 year ago

                Unlikely… making an AI car safer than 90% of human drivers means it will respect the speed limit.

                That alone causes traffic jams and unsafe conditions around the car as people try to get around it.

                A human driver will somewhat go with the flow of traffic.

                An AI vehicle just won’t work until it’s a nearly perfect driver that can make human decisions.

                That’s not going to happen for a long time. Musk, with his revolving door of low-cost engineers, is actually making it all worse.

                Pull the plug on this experiment and put it back on the test track.