cross-posted from: https://fedia.io/m/fuckcars@lemmy.world/t/2201156

In case you were worried about the roads being too safe, you can rest easy knowing that Teslas will be rolling out with unsupervised “Full Self Driving” in a couple of days.

It doesn’t seem to be going great, even in supervised mode. This one couldn’t safely drive down a simple, perfectly straight road in broad daylight :( It veered off the road for no good reason. Glad nobody got badly hurt.

We analyze the onboard camera footage, and try to figure out what went wrong. Turns out, a lot. We also talk through how camera-only autonomous cars work, Tesla’s upcoming autonomous taxi rollout, and how AI hallucinations figure into everything.

  • @boydster@sh.itjust.works

    There’s a lot wrong with Tesla’s implementation here, so I’m going to zoom in on one thing in particular. It is outright negligent to decide against using LIDAR on something like a car that you want to be autonomous. Maybe if this car had sensors to map out 3D space, that would help it move more successfully through 3D space?
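
    Rough sketch of what “sensors to map out 3D space” buys you: a LIDAR return is already a distance plus a beam direction, so turning it into a 3D point is plain trigonometry, with no depth inference needed. This is illustrative Python under assumed beam-angle conventions, not any vendor’s actual API.

    ```python
    import math

    def lidar_to_points(ranges_m, azimuths_rad, elevations_rad):
        """Convert raw LIDAR returns (range + beam angles) into 3D points."""
        points = []
        for r, az, el in zip(ranges_m, azimuths_rad, elevations_rad):
            x = r * math.cos(el) * math.cos(az)  # forward
            y = r * math.cos(el) * math.sin(az)  # left
            z = r * math.sin(el)                 # up
            points.append((x, y, z))
        return points

    # One return: 12.4 m away, 5 degrees to the left, level with the sensor
    print(lidar_to_points([12.4], [math.radians(5)], [0.0]))
    ```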

    • Ulrich

      You and I are (mostly) able to safely navigate a vehicle with 3D stereoscopic vision. It’s not a sensor issue; it’s a computation issue.
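
      A worked example of why the sensing side of stereo vision is, in principle, sufficient: once you know which pixel in the left image matches which pixel in the right image (the disparity), depth falls out of the classic pinhole relation Z = f · B / d. Finding that match reliably is the hard, computational part. Illustrative Python with made-up numbers:

      ```python
      def stereo_depth_m(focal_length_px, baseline_m, disparity_px):
          """Pinhole stereo: depth Z = focal length * baseline / disparity."""
          if disparity_px <= 0:
              raise ValueError("disparity must be positive")
          return focal_length_px * baseline_m / disparity_px

      # 800 px focal length, 12 cm camera baseline, 16 px disparity -> 6.0 m
      print(stereo_depth_m(800, 0.12, 16))
      ```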

      • @TwanHE@lemmy.world

        In theory maybe, but our brains are basically a supercomputer on steroids when it comes to interpreting and improving the “video feed” our eyes give us.

        Could it be done with just cameras? Probably, some time in the future. But why the fuck wouldn’t you use a depth sensor now, and keep it even in the future as redundancy?
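
        One way to read “depth sensor as redundancy”: even if the camera stack stays primary, a direct range measurement gives a cheap cross-check that can flag when the vision estimate shouldn’t be trusted. A minimal sketch, assuming hypothetical per-obstacle depth estimates from each sensor and made-up tolerances:

        ```python
        def depth_disagrees(camera_depth_m, lidar_range_m,
                            rel_tol=0.15, abs_tol_m=0.5):
            """Flag when a camera-inferred depth and a LIDAR range disagree.

            A disagreement doesn't say which sensor is wrong, but it is a
            cheap signal to fall back to caution (slow down, hand control
            back) instead of trusting a single estimate.
            """
            allowed = max(abs_tol_m, rel_tol * lidar_range_m)
            return abs(camera_depth_m - lidar_range_m) > allowed

        # Camera thinks the obstacle is 40 m away, LIDAR says 18 m
        print(depth_disagrees(40.0, 18.0))  # True -> treat scene as uncertain
        ```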

        • Ulrich

          our brains are basically a supercomputer on steroids

          Yeah I mean that’s what I said.

      • @this_1_is_mine@lemmy.world

        I can also identify a mirror. A Tesla smashed into one head-on. If you can’t effectively understand the image, then it’s not enough information.

        • Ulrich

          I can also identify a mirror.

          My point exactly.

          If you can’t effectively understand the image then it’s not enough information.

          No, it’s just not able to process the information it has.

          • @this_1_is_mine@lemmy.world

            My point is that it costs literally nothing compared to the trove of additional data they’d be collecting, data that could be better interpreted right now. And as you said:

            No, it’s just not able to process the information it has.

            If it doesn’t understand the data now, then it wasn’t ready yesterday, and it definitely wasn’t ready before that, when he dropped the additional sensors, which was… in 2020… Fuck me, it’s almost been the better part of a decade already. There is no other argument. Fix your shit, Elon.

      • Computation NOW cannot replicate what humans do with our rather limited senses.

        “Self-driving” cars are being made NOW.

        That means it’s the NOW computation we worry about, not some hypothetical future computation capabilities. And the NOW computation cannot do the job safely with just vision.

        • Ulrich

          Did I give you the impression that I was talking about some other time than NOW?

            • Ulrich

              You were babbling about non-existent computing horsepower, yes.

              Yes, that was the point.

              I don’t know why you’re upset when we seem to be in agreement.

      • @arrakark@lemmy.ca

        I get what you are saying. But adding a new form of data input to the system would probably improve performance, not degrade it. I don’t think it makes sense not to add LIDAR to Teslas.

        All of this feels like Elon was asked to justify not putting a rather expensive (at the time) set of sensors into the Teslas, and he just doubled down and said that they would compensate with software.
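
        To make the “adding an input shouldn’t hurt” intuition concrete: if the fusion rule for an emergency-braking decision is deliberately conservative (take the nearest obstacle estimate from any sensor), an extra sensor can only make the car brake sooner, never later. A toy sketch with hypothetical distance estimates, not Tesla’s or anyone’s actual planner:

        ```python
        def assumed_gap_m(estimates_m):
            """Fuse obstacle-distance estimates by taking the most
            conservative (smallest) valid one."""
            valid = [d for d in estimates_m if d is not None and d > 0]
            if not valid:
                return 0.0  # no usable estimate: assume the worst, brake
            return min(valid)

        # Camera-only vs. camera + LIDAR
        print(assumed_gap_m([35.0]))        # 35.0
        print(assumed_gap_m([35.0, 22.5]))  # 22.5 -> brake earlier
        ```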

    • @6nk06@sh.itjust.works

      I’ll go further and say that LIDAR should be mandatory by law whenever camera vision is used. A lot of industries have regulations, and yet this guy gets to slap shitty webcams on his cars. That’s a big problem.