• @Showroom7561@lemmy.ca
    89
    4 days ago

    I’m reading the comments about this video, and I think people are missing the point.

    It’s not about the Tesla running into the kid. It’s about the Tesla completely ignoring the FLASHING FUCKING STOP SIGN at the side of the bus, which resulted in it hitting the kid dummy.

    This could have been a pedestrian crossing, railroad stop, intersection, etc.

    These vehicles aren’t “smart” and should not be allowed on the road. Any idiot can have greater awareness than a Tesla.

    • @outhouseperilous@lemmy.dbzer0.com
      18
      4 days ago

      Yeah, it might kill the kid, it might not.

      I’m still gonna stick to my Ford F50000 Fleshreaper (BLOOD FOR THE CAR GOD!™), driven by a good old-fashioned human, to get the job done.

      Besides, it avoids the whole mess of theological issues about who gets Moloch’s love.

    • @doomi@discuss.tchncs.de
      4
      4 days ago

      Oh, now I get it. I didn’t know it’s not allowed to pass the bus even when it’s on the other side of the street. In our country, we teach kids not to run across the street when they get off the bus.

      • @hikaru755@lemmy.world
        9
        3 days ago

        Kids will do stupid things sometimes, no avoiding that. In Germany you can pass a stopped bus on the other side of the road, but if it has its hazards on, you can’t go faster than walking speed.

    • @varyingExpertise@feddit.org
      1
      2 days ago

      Also a fairly impressive interception of a small moving target, especially considering it’s hidden behind the bus for most of its initial trajectory.

  • @hark@lemmy.world
    19
    3 days ago

    It’s fine, they’ll fix these issues in time for the robotaxi rollout ten years from now.

    What’s that? They’re planning on launching the robotaxis at the end of this month? Well then.

      • Caveman
        1
        3 days ago

        The guy who wrote this is an idiot. Self-driving vehicles should always follow traffic law to the letter, regardless of what normal human behavior is. Human drivers do an immense amount of stupid shit in cars, so imitating them with a computer is extremely dangerous.

        If anything, self-driving cars should face extra safety requirements, such as “always 5 km/h below the speed limit”, “a full 2 s stop at stop signs”, and “slow to 10 km/h below the speed limit before intersections”.
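
        To make that concrete, a toy sketch in Python (the margins are just the numbers proposed above, not anything a regulator actually mandates):

        ```python
        SPEED_MARGIN_KMH = 5           # "always 5 km/h below the limit"
        INTERSECTION_MARGIN_KMH = 10   # "10 km/h below the limit near intersections"
        STOP_SIGN_HOLD_S = 2.0         # "a full 2 s stop at stop signs"

        def target_speed_kmh(posted_limit_kmh: float, near_intersection: bool) -> float:
            """Speed the car should aim for under the proposed margins."""
            margin = INTERSECTION_MARGIN_KMH if near_intersection else SPEED_MARGIN_KMH
            return max(0.0, posted_limit_kmh - margin)

        # 50 km/h zone: cruise at 45, slow to 40 approaching an intersection.
        assert target_speed_kmh(50, near_intersection=False) == 45.0
        assert target_speed_kmh(50, near_intersection=True) == 40.0
        ```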

    • @ChickenLadyLovesLife@lemmy.world
      12
      3 days ago

      I’m a school bus driver, and this year we finally got the automatic cameras that catch people going past our red flashers and stop signs. My camera has captured about two to three drivers per day doing this. I would rather have had the automatic machine guns, but the camera is a fine second choice.

      Edit: the funniest thing I’ve had happen with the camera so far is one person who came flying past my reds, noticed the lights and stop sign as they were passing me, slammed on their brakes, and then backed up past me again while mouthing “I’m so sorry” to me. Yes, they received two tickets for this - and I had nothing to do with it, as the cameras are completely automated.

          • @squaresinger@lemmy.world
            4
            3 days ago

            Do people in the US just get away with not paying tickets?

            Over here, if you don’t pay fines, it gets escalated to the point of seizure, and if you have nothing else to seize, they will take your car.

            Not paying isn’t really an option.

            • @ChickenLadyLovesLife@lemmy.world
              1
              3 days ago

              Apparently, the issue with mail-in tickets specifically is that while the camera can catch the license plate number, it can’t really prove who was driving the car. So whereas an in-person ticket from a cop for passing a school bus will result in points on your license, a mail-in ticket from a camera like this won’t. The same problem applies to people who just don’t pay the mail-in tickets - the state doesn’t really know who to go after specifically.

              • @utopiah@lemmy.world
                1
                2 days ago

                “a mail-in ticket from a camera like this won’t”

                Damn, now imagine if one could put such a camera anywhere, and it would

                • either be moved away if it’s below some violation threshold after a set period, e.g. 1 week,
                • or stay as long as it pays for itself; then, once it has paid for a second one, that new camera would be placed randomly nearby.
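
                Roughly this, as a toy simulation (cost, fine, and threshold numbers all invented):

                ```python
                import random

                # Toy simulation of the self-funding camera idea above.
                # Cost, fine, and threshold numbers are all made up.
                CAMERA_COST, FINE = 5000, 250
                MIN_WEEKLY_HITS, TRIAL_WEEKS = 5, 1

                def simulate(weeks: int) -> int:
                    cams = [{"age": 0, "earned": 0}]
                    for _ in range(weeks):
                        new = []
                        for c in cams:
                            hits = random.randint(0, 20)  # stand-in for real traffic
                            c["age"] += 1
                            c["earned"] += hits * FINE
                            if c["age"] >= TRIAL_WEEKS and hits < MIN_WEEKLY_HITS:
                                c["age"] = c["earned"] = 0        # relocated elsewhere
                            elif c["earned"] >= 2 * CAMERA_COST:  # paid for itself + one more
                                c["earned"] -= CAMERA_COST
                                new.append({"age": 0, "earned": 0})  # placed randomly nearby
                        cams += new
                    return len(cams)

                print(simulate(12), "cameras after 12 weeks")
                ```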
              • @squaresinger@lemmy.world
                1
                3 days ago

                Those are some weird legal loopholes.

                Over here, if you own a car, you are responsible for who drives it. So if you get a mail-in ticket, you can either name the driver (and, in case of a dispute, prove that they were the driver) or you yourself are counted as the driver.

            • @FireRetardant@lemmy.world
              1
              2 days ago

              Most automated camera tickets for buses, red lights, or speeding only result in municipal fines and no insurance complications. The camera can reliably get the license plate, but the tech and bureaucracy aren’t ready to prove who was driving each car.

    • @Showroom7561@lemmy.ca
      18
      4 days ago

      The only difference is that a driver would get out of their car, check for damage to their vehicle, and then get mad at the kid! /s

  • @CrowAirbrush@lemmy.world
    6
    3 days ago

    I’ll give you my uneducated findings: self-driving cars are not ready.

    I doubt they will ever be truly ready; they’ll eventually just be considered “ready enough”. No software always works without flaws, and when that software controls a car, a minor flaw might mean 20 deaths.

      • @FireRetardant@lemmy.world
        2
        2 days ago

        Comparing self-driving cars to American driving standards is kind of a moot point, because American safety standards are so low that death and injury are considered the cost of doing business.

        I’d be curious to see how well Waymo would perform in a country with far safer road designs and drivers who are better trained and respect the rules of the road more consistently.

      • @nandeEbisu@lemmy.world
        6
        3 days ago

        Waymo is also operating in a fairly small, fixed area that is highly mapped.

        Not saying that’s a bad thing; they are doing things the right way: slowly and cautiously.

      • @Furbag@lemmy.world
        5
        3 days ago

        Waymo cars use much better technology than Tesla’s.

        Nobody is disputing that a machine that is never distracted and has reaction times down to fractions of a second would make a better driver than even the most skilled human, but Tesla’s FSD hardware and software aren’t there yet and probably never will be.

    • @AlteredEgo@lemmy.ml
      2
      3 days ago

      40,000 deaths by traffic accident per year (in the US). Only 20 deaths would be a major improvement. Obviously “cars” is a highly irrational discussion, though.

      And it’s not just the victims who could be spared their lives; it’s also the mental toll on those who kill people by accident. Being able to blame it on a flaw in the software, one that can be found and permanently fixed, is great.

      I say let the mechanized, reduced slaughter begin!

    • Pyr
      1
      3 days ago

      Not unless governments adjust infrastructure to accommodate them. Like QR codes on every sign giving each passing car specific instructions or whatever.
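
      Hypothetically, such a sign payload could be as simple as a bit of JSON in the QR code (completely made-up schema; no such standard exists):

      ```python
      import json

      # Entirely hypothetical payload a "smart sign" QR code could carry.
      # No such standard exists; the schema is invented for illustration.
      sign_payload = {
          "sign_type": "school_bus_stop_arm",
          "rule": "stop_while_lights_flashing",
          "applies_to": "both_directions",
      }
      qr_text = json.dumps(sign_payload)  # the string the QR code would encode
      print(qr_text)
      ```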

  • @jaybone@lemmy.zip
    13
    4 days ago

    Didn’t I just read this like a few weeks ago? But there’s a Jun 15 date in the article. So did this happen again?

    • @NotMyOldRedditName@lemmy.world
      9
      3 days ago

      It’s the same story making the rounds.

      Edit: They also did it in Austin and somewhere else, so it’s the same situation in 2 different spots, generating like 4-5x the stories as each one gets repeated in the news cycle.

  • @PlagueShip@lemmy.world
    15
    4 days ago

    If I worked at Tesla, I would very much be doing a crappy job and slipping bad ideas into what looks like good code. The Lord’s work.

    • Krudler
      8
      4 days ago

      How would you know where to put it among all the other shitty code?

      • @squaresinger@lemmy.world
        3
        3 days ago

        Because, as we know, US developers are so much better than anyone else around the world. Which is why they gave us excellent software like Windows 11, MS Teams, Google Hangouts, and so on.

        Take your xenophobia and stick it where the sun doesn’t shine.

  • WolfmanEightySix
    3
    4 days ago

    I’ve seen some comments elsewhere about how it can be trusted.

    Edit: like really straight-up trusted. The delusion is unreal.

    • KT-TOT
      9
      4 days ago

      That’s shrimply not true. The numbers Tesla releases are heavily cooked.

      Had a quick look around, but I didn’t manage to find any numbers that weren’t either using Tesla’s numbers or guessing.

      But it’s pretty well known that FSD sucks (I’ve been in a car using it … terrifying af) and that it’ll turn itself off before an accident to pass accountability to the driver.

      • @NotMyOldRedditName@lemmy.world
        2
        3 days ago

        I love how this keeps getting repeated by everyone everywhere:

        “it’ll turn itself off before an accident to pass accountability to the driver.”

        But both Tesla (5 seconds) and the NHTSA (30 seconds) count any incident where an L2 system was on within that window before the accident as having happened with the system active. So no, they do not use it for that purpose.

        You know that video going around a few weeks ago where some dude with FSD on darted across the road into a tree? Well, he got the car’s data, and it turns out FSD had been disabled by torque on the steering wheel, which is one of the ways you disengage it. He probably nudged the wheel too hard by mistake, or there was a mechanical failure that disabled it, but in the report he got from Tesla the accident still counted as FSD being ON, even though it was OFF by the time he started going out of his lane.

        So please just stop it with that nonsense.
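
        For reference, a minimal sketch of what that attribution window means in practice (the 5 s and 30 s windows are the figures above; everything else, including the 2 s example, is invented for illustration):

        ```python
        # A crash counts as "system active" if the L2 system was engaged at
        # any point within the window before impact (5 s in Tesla's own
        # stats, 30 s under NHTSA's reporting rules).
        TESLA_WINDOW_S = 5.0
        NHTSA_WINDOW_S = 30.0

        def counts_as_active(seconds_disengaged_before_impact, window_s):
            """None means the system was still engaged at impact."""
            if seconds_disengaged_before_impact is None:
                return True
            return seconds_disengaged_before_impact <= window_s

        # e.g. FSD disengaging ~2 s before impact still counts as ON under both.
        assert counts_as_active(2.0, TESLA_WINDOW_S)
        assert counts_as_active(2.0, NHTSA_WINDOW_S)
        ```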

        • KT-TOT
          3
          3 days ago

          I may be buying into the foolishness of the masses, but your anecdotes are only as good as mine.

          • @NotMyOldRedditName@lemmy.world
            1
            2 days ago

            Which part don’t you like?

            I could source the crash report and a video explaining what likely happened, but if you simply don’t believe that Tesla truly stands by the 5-second rule in their self-reported data even with the crash report, then that’s another matter entirely.

        • @TriflingToad@sh.itjust.works
          2
          3 days ago

          What about this one? https://youtu.be/V2u3dcH2VGM

          I don’t know anything about self-driving, but I can’t imagine why it would turn off right before a crash instead of keeping the brakes held.

          (Also, I know the driver is a total idiot and it’s 100% their fault; I just want to know why it turned off.)

          • @NotMyOldRedditName@lemmy.world
            2
            3 days ago

            My guess is that Automatic Emergency Braking (AEB) is a separate feature from Autopilot.

            AEB is always on and watching even if AP/FSD is off, but it’s only intended (originally, anyway) to kick in when a crash is imminent, to reduce impact forces rather than prevent the crash. So my guess is: AP fails to detect the cop car because radar couldn’t see it; vision finally sees it and tries to brake, but it’s too late; AEB determines the crash is inevitable and kicks in, which turns AP off.

            It’d actually be nice if we got a straight answer from Tesla as to why it happens this way. This one would have been reported as ON, since it was within 5 seconds, and obviously the NHTSA was investigating it as ON.
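
            In pseudo-Python, the interaction I’m guessing at would look something like this (pure speculation, not Tesla’s actual logic):

            ```python
            # Speculative sketch of the AP/AEB interaction guessed at above.
            # NOT Tesla's actual code; it just illustrates the hypothesis that
            # AEB is an independent, always-on system that preempts Autopilot.
            def control_step(ap_engaged: bool, crash_imminent: bool) -> dict:
                if crash_imminent:
                    # AEB takes over regardless of AP state; AP disengages as a
                    # side effect, which is why AP reads "off" at impact.
                    return {"autopilot": False, "aeb_braking": True}
                return {"autopilot": ap_engaged, "aeb_braking": False}

            print(control_step(ap_engaged=True, crash_imminent=False))
            print(control_step(ap_engaged=True, crash_imminent=True))
            ```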

    • @ano_ba_to@sopuli.xyz
      2
      3 days ago

      Despite doors blowing off, Boeing planes are safer than human drivers tbh. You’d think tech fans would understand the importance of logic in computers. Red means stop.