There’s a very long history of extremely effective labor-saving tools in software.

Writing in C rather than Assembly, especially for more than 1 platform.

Standard libraries. Unix itself. More recently, developing games in Unity or Unreal instead of rolling your own engine.

And when any of these tools came on the scene, what happened was a mad gold rush to develop products that weren’t feasible before. Not layoffs, not “we don’t need to hire junior developers any more”.

Rank and file vibe coders seem to perceive Claude Code (for some reason, mostly just Claude Code) as something akin to the advantage of using C rather than Assembly. They are legit excited to code new things they couldn’t code before.

Boiling the rivers to give them an occasional morale boost with “You are absolutely right!” is completely fucked up and I dread the day I’ll have to deal with AI-contaminated codebases, but apart from that, they have something positive going for them, at least in this brief moment. They seem to be sincerely enthusiastic. I almost don’t want to shit on their parade.

The AI enthusiast bigwigs on the other hand, are firing people, closing projects, talking about not hiring juniors any more, and got the media to report on it as AI layoffs. They just gleefully go on about how being 30% more productive means they can fire a bunch of people.

The standard answer is that they hate having employees. But they always hated having employees. And there were always labor-saving technologies.

So I have a thesis here, or a synthesis perhaps.

The bigwigs who tout AI (while acknowledging that it needs humans for now) don’t see AI as ultimately useful in the way the C compiler was useful. Even if it’s useful in some context, they still don’t. They don’t believe it can be useful. They see it as more powerfully useless. Each new version is meant to be a bit more like AM or (clearly AM-inspired, but more familiar) GLaDOS, one that will get rid of all the employees once and for all.

  • bitofhope · 2 months ago

    Fair enough, I concede. It’s true that C really came into its own on the PDP-11, whose design was a big influence on the language itself and the PDP-7 NB can’t really be considered a true C compiler.

    In light of the points you brought up I’d like to amend my earlier disparagement of C by admitting that actually it wasn’t even all that light. Sure, it wasn’t as much of a behemoth as PL/I or ALGOL 68, but indeed a Fortran or a Pascal could probably have fit on the same systems where C was born.

    • The Penguin of Evil@mastodon.social · 2 months ago

      @bitofhope and to be clear, I just care about the history. I’ve no problem with the idea that the software world would have been a better place if Unix had been built on, and spread, a language with a less relaxed attitude to types and pointers.