• @APassenger@lemmy.world · 50 points · 10 months ago

        It’s this. When boards and non-tech-savvy managers start making decisions based on a slick slide deck and a few visuals, enough of them will bite that people will be laid off. It’s already happening.

        There may be a reckoning after, but Wall Street likes it when you cut too deep and then bounce back to the “right” (lower) headcount. Even if you’ve broken the company and they just don’t see the glide path.

        It’s gonna happen. I hope it’s rare. I’d argue it’s already happening, but I doubt enough people see it underpinning recent lay offs (yet).

    • @tias@discuss.tchncs.de · 39 points · 10 months ago

      AI as a general concept probably will at some point. But LLMs have all but reached the end of the line and they’re not nearly smart enough.

      • @li10@feddit.uk · 15 points · 10 months ago

        LLMs have already reached the end of the line 🤔

        I don’t believe that. At least from an implementation perspective we’re extremely early on, and I don’t see why the tech itself can’t be improved either.

        Maybe its current iteration has hit a wall, but I don’t think anyone can really say what the future holds for it.

        • @jacksilver@lemmy.world · 25 points · edited · 10 months ago

          LLMs have been around since roughly 2016/2017 (a comment below corrected me that the Attention paper was 2017). While scaling them up has improved their performance and capabilities, there are fundamental limitations to the approach itself. Behind the scenes, LLMs (even multimodal ones like GPT-4) are trying to predict what is most expected; that can be powerful, but it means they can never innovate or be truth systems.

          For years we used things like tf-idf to vectorize words, then embeddings, now transformers (souped-up embeddings). Each approach has its limits, and LLMs are no different. The results we see now are surprisingly good, but they don’t overcome the baseline limitations of the underlying model.
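          To make that lineage concrete, here is a toy version of the oldest approach mentioned above (an illustrative sketch, not production code): tf-idf weights a term higher the more often it appears in a document, discounted by how many documents contain it, which is all that early “vectorizing” amounted to.

```python
import math
from collections import Counter

def tfidf(docs):
    """Return one {term: weight} dict per document (toy tf-idf)."""
    n = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    # document frequency: how many docs contain each term
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({
            term: (count / len(toks)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return vectors

docs = ["the cat sat", "the dog sat", "the cat ran"]
vecs = tfidf(docs)
# "the" appears in every document, so its idf (and thus its weight) is zero
```

          A term that shows up everywhere carries no signal, which is exactly the weakness embeddings and then transformers were brought in to address: they capture meaning and context rather than just counts.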

        • @mashbooq@infosec.pub · 5 points · 10 months ago

          I’m not trained in formal computer science, so I’m unable to evaluate the quality of this paper’s argument, but there’s a preprint out that claims to prove that current computing architectures will never be able to advance to AGI, and that rather than accelerating, improvements are only going to slow down due to the exponential increase in resources necessary for any incremental advancement (because it’s an NP-hard problem). That doesn’t prove LLMs are the end of the line, but it does suggest that additional improvements are likely to be marginal.

          Reclaiming AI as a theoretical tool for cognitive science

        • @Wooki@lemmy.world · 1 point · edited · 10 months ago

          we’re extremely early on

          Oh really! The analysis has been established since the ’80s. It’s so far from early on that the statement is comical.

      • Optional · 6 points · 10 months ago

        “at some point” being like 400 years in the future? Sure.

        Ok that’s probably a little bit of an exaggeration. 250 years.

    • @assembly@lemmy.world · 4 points · 10 months ago

      The one thing that LLMs have done for me is to make summarizing and correlating data in documents really easy. Take 20 docs of notes about a project and have it summarize where they are at so I can get up to speed quickly. Works surprisingly well. I haven’t had luck with code requests.
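      A minimal sketch of that summarize-my-notes workflow (the directory layout, file naming, prompt wording, and character budget are all assumptions for illustration, not the commenter’s actual setup): collect the note files, concatenate them into one prompt, and hand that to whatever chat model you use.

```python
from pathlib import Path

def build_summary_prompt(note_dir: str, max_chars: int = 12000) -> str:
    """Concatenate project notes into a single summarization prompt.

    Truncates to max_chars so the prompt stays within the model's
    context window (the actual limit is model-dependent).
    """
    notes = []
    for path in sorted(Path(note_dir).glob("*.txt")):
        notes.append(f"--- {path.name} ---\n{path.read_text()}")
    body = "\n\n".join(notes)[:max_chars]
    return (
        "Summarize the current state of this project from the notes below, "
        "highlighting open issues and next steps.\n\n" + body
    )

# The resulting string would then be sent to whichever LLM API you use;
# the summarization itself happens on the model side, not in this code.
```

      The only real engineering here is the truncation: for more than a handful of documents you would need chunked summaries or a retrieval step, since 20 docs of notes can easily exceed a model’s context window.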

    • @ikidd@lemmy.world · 39 points · edited · 10 months ago

      I’m pretty sure I could write a bot right now that just regurgitates pop science bullshit and how it relates to Line Go Up business philosophy.

      Edit: did it, thanks ChatJippity

      import sys

      def main():
          # Check that exactly one argument was provided
          if len(sys.argv) != 2:
              print("Usage: python script.py <PopScienceBS>")
              sys.exit(1)
          # Get the input from the command line
          PopScienceBS = sys.argv[1]
          # Assign the input variable to the output variable
          LineGoUp = PopScienceBS
          # Print the output
          print(f"Line Go Up if we do: {LineGoUp}")

      if __name__ == "__main__":
          main()
      
  • @casmael@lemm.ee · 101 points · 10 months ago

    I know just enough about this to confirm that this statement is absolute horseshit

  • dinckel · 85 points · 10 months ago

    I’ll take “things business people don’t understand” for $100.

    No one hires software engineers to code. You’re hired to solve problems. All of this AI bullshit has zero capability to solve your problems, because it can only spit out what it’s already seen (read: stolen from) somewhere else.

    • @HakFoo@lemmy.sdf.org · 16 points · 10 months ago

      It can also throw things against the wall with no concern for fitness-for-purpose. See “None pizza, left beef”.

    • @breckenedge@lemmy.world · 8 points · 10 months ago

      I’ve worked with a few PMs over my 12-year career who think devs are really only there to code like trained monkeys.

      • @wizardbeard@lemmy.dbzer0.com · 3 points · 10 months ago

        I’m at the point where what I work on requires such a depth of knowledge that I just manage my own projects. Doesn’t help that my work’s PM team consistently brings in new hires only to toss them on the difficult projects no one else is willing to take. They see a project is doomed to fail so they put their least skilled and newest person on it so the seniors don’t suffer any failures.

        Simplifying things to a level that is understandable for the PMs just leads to overlooked footguns. Trying to explain a small subset of the footguns just leads to them wildly misinterpreting what is going on, causing more work for me to sort out what terrible misconceptions they’ve blasted out to everyone else.

        If you can’t actually be a reliable force multiplier, or even someone I can rely on to get accurate information from other teams, just get out of my way please.

  • @linearchaos@lemmy.world · 75 points · 10 months ago

    Guys that are putting billions of dollars into their AI companies making grand claims about AI replacing everyone in two years. Whoda thunk it

  • @SomeGuy69@lemmy.world · 66 points · 10 months ago

    But coding never was the difficult part. It’s understanding a concept, identifying a problem, and solving it with the available methods. An AI just makes the coding part faster and gives me options to identify a possible solution more quickly. Thankfully there’s a never-ending pile of projects, issues, todos and stakeholder wants, so I don’t see how we’d need fewer programmers. Maybe we need more to deal with AI, as now people can do a lot more in-house instead of outsourcing, but as soon as that threshold is reached, companies will again contact large software companies. If people want to put AI into everything, you need people feeding the AI with company-specific data and instructing people on how to use it.

    All I see is middle management getting replaced, because instead of a boring meeting, I could just ask an AI.

    • @curry@programming.dev · 14 points · 10 months ago

      I dread meetings and I can’t wait for AIs to replace those managers. Or perhaps we’ll have even more meetings, because management will want to know why we’re so late despite the AI happily churning out meaningless code that looks so awesome, like all that CSI VB GUI crap.

  • @Kekzkrieger@feddit.org · 62 points · 10 months ago

    CEOs without a clue how things work think they know how things work.

    I swear, if we had no CEOs from today on, the only impact would be that we’d have less gibberish being spoken.

    • @CaptPretentious@lemmy.world · 30 points · 10 months ago

      If AI could replace anyone… it’s those dingbats. I mean, what would you say, in this given example, the CEO does… exactly? Make up random bullshit? AI does that. Write a speech? AI does that. I love how these overpaid people think they can replace the talent, but they… they are absolutely required and couldn’t possibly be replaced! Talent and AI can’t buy and enjoy the extra-big yacht, or private jets, or overpriced cars, or a giant oversized mansion… no, you need people for that.

  • @qarbone@lemmy.world · 61 points · 10 months ago

    If, 24 months from now, most people aren’t coding, it’ll be because people like him cut jobs to make a quicker buck. Or nickel.

    • @Cringe2793@lemmy.world · 1 point · edited · 10 months ago

      Well, if it works, it means that job wasn’t that important, and the people doing that job should improve themselves to stay relevant.

      Edit: wow what a bunch of hypersensitive babies. I swear, y’all just allergic to learning or something. I just said people need to improve themselves to stay relevant, and people freak out and send me death threats. What a joke.

      • @UnderpantsWeevil@lemmy.world · 15 points · 10 months ago

        job wasn’t that important

        I keep telling you that changing out the battery in the smoke alarm isn’t worth the effort and you keep telling me that the house is currently on fire, we need to get out of here immediately, and I just roll my eyes because you’re only proving my point.

        • @Cringe2793@lemmy.world · 1 point · 10 months ago

          Sure, believe what you want to believe. You can either adapt to what’s happening, or just get phased out. AI is happening whether you like it or not. You may as well learn to use it.

          • @TheGalacticVoid@lemm.ee · 5 points · 10 months ago

            You can adapt, but how you adapt matters.

            AI in tech companies is like a hammer or drill. You can either get rid of your entire construction staff and replace them with a few hammers, or you can keep your staff and give each worker a hammer. In the first scenario, nothing gets done, yet jobs are replaced. In the second scenario, people keep their jobs, their jobs are easier, and the house gets built.

            • @Cringe2793@lemmy.world · 1 point · 10 months ago

              Yup. Most of us aren’t CEOs, so we don’t have a lot of say about how most companies are run. All we can do is improve ourselves.

              For some reason, a lot of people seem to be against that. They prefer to whine.

      • @qarbone@lemmy.world · 8 points · 10 months ago

        Define “works”?

        If you’re a CEO, cutting all your talent, enshittifying your product, and pocketing the difference in new, lower costs vs standard profits might be considered as “working”.

        • @Cringe2793@lemmy.world · 1 point · 10 months ago

          Hmmm maybe you’re misunderstanding me.

          What I mean is “coding” is basically the grunt work of development. The real skill is understanding the requirements and building something efficiently. Tbh, I hate coding.

          What tools like Gemini or ChatGPT bring to the table is the ability to create small, efficient snippets of code that work. We can then just modify them to meet our more specific requirements.

          This makes things much faster, for me at least. If the time comes when the AI can generate more efficient code, making my job easier, I’d count that as “works” for me.

      • @prole@lemmy.blahaj.zone · 2 points · edited · 10 months ago

        Define “works.”

        Because the goals of a money-hungry CEO don’t always align with those of the workers in the company itself (or often, even the consumer). I imagine this guy will think it worked just fine as he’s enjoying his golden parachute.

  • @hark@lemmy.world · 61 points · 10 months ago

    This will be used as an excuse to try to drive down wages while demanding more responsibilities from developers, even though this is absolute bullshit. However, if they actually follow through with their delusions and push to build platforms on AI-generated trash code, then soon after they’ll have to hire people to fix such messes.

  • @yokonzo@lemmy.world · 56 points · 10 months ago

    How many times does the public have to learn: if the CEO says it, he probably doesn’t know what he’s talking about. If the devs say it, listen.

    • @woodgen@lemm.ee · 9 points · 10 months ago

      Let’s wait for an LLM to do a single successful MR on GitHub before starting a project on its own. I’m not aware of any.

    • @utopiah@lemmy.world · 4 points · 10 months ago

      there isn’t a single serious project written exclusively or mostly by an LLM? There isn’t a single library or remotely original application

      IMHO “original” here is the key. Finding yet another clone of a Web framework ported from one language to another in order to push online a basic CMS slightly faster, I can imagine this. In fact I even bet that LLM, because they manipulate words in languages and that code can be safely (even thought not cheaply) tested within containers, could be an interesting solution for that.

      … but that is NOT really creating value for anyone, unless that person is technically very savvy and thus able to understand why a framework in one language over another creates new opportunities (say safety, performance, etc.). So… for somebody who is not that savvy, “just” relying on the numerous already existing open-source projects providing exactly the value they expect, there is no incentive to re-invent.

      For anything that is genuinely original, i.e. something that is not a port to another architecture, a translation to another language, or a slight optimization, but rather something that needs a bit of reasoning and evaluation against the value created, I’m very skeptical, all the more so with fewer resources poured in, EVEN with a radical drop in costs.

    • @Hawk@lemmynsfw.com · 1 point · 10 months ago

      My last employer had many internal tools that were fine.

      They had only a moderate amount of oversight.

      I had to find a new job, I’m actually thinking of walking away from software development now that there are so few jobs :(

      It sucks but there’s no sense pretending this won’t have a large impact on the job landscape.

        • @Hawk@lemmynsfw.com · 1 point · 10 months ago

          Mostly internal data cleaning stuff, close etc, which I accept is less in scope than your original comment.

          • jeeva · 1 point · 10 months ago

            The things you are describing sound like if-statement levels of automation, GitHub Actions with preprogrammed responses rather than LLM whatever.

            If you’re worrying about being replaced by that… Go find the code, read it, and feel better.

            • @Hawk@lemmynsfw.com · 1 point · 10 months ago

              The code was non trivial and relatively sophisticated. It performed statistical analysis on ingested data and the approach taken was statistically sound.

              I was replaced by that. So was my colleague.

              The job market is exceptionally tough right now, and a large part of that is certainly LLMs.

              I think taking people with statistical training out of the equation is quite dangerous, but it’s happening. In my area, everybody doing applied mathematics, statistics or analysis has been laid off.

              In saying that, the produced program was quite good.

              • jeeva · 1 point · 10 months ago

                Certainly sounds more interesting than my original read of it! Sorry about that, I was grumpy.

                • @Hawk@lemmynsfw.com · 1 point · 10 months ago

                  All good man.

                  I think the point is that LLMs can replace people and they are quite good.

                  But they absolutely shouldn’t replace people, yet, or possibly ever.

                  But that’s what’s happening and it’s a massive problem because it’s leading to mediocre code in important spaces.

    • @RagingRobot@lemmy.world · 42 points · 10 months ago

      Yeah, writing the code isn’t really the hard part. It’s knowing what code to write and how to structure it to work with your existing code or potential future code. Knowing where things might break so you can add the correct tests or alerts. Giving time estimates on how long it will take to build the parts of the system, and building in phases to meet your team’s needs.

      • @floofloof@lemmy.caOP · 10 points · 10 months ago

        I’ve always thought that design and maintenance are the difficult and gruelling parts, and writing code is when you get to relax for a bit. Most of the time you’re in maintenance mode, and it’s harder than writing new code.

      • beefbot · 6 points · 10 months ago

        This. I’m learning a new skill right now & hardly any of it is actual writing— it’s how to arrange the pieces someone else wrote (& which sometimes AI can decently reproduce.)

        When you use a computer you don’t start by mining iron, because the thing is already built

  • @riodoro1@lemmy.world · 52 points · 10 months ago

    Today’s news: rich assholes in suits are idiots and don’t know how their own companies work. Make sure to share what they’re saying.