So at one point Macs were the developer laptop. They gave you a nice desktop experience but with UNIX underneath that was very close to the Linux servers you'd deploy to.
The direction of travel has been to bring the UI closer and closer to touch devices, often to the detriment of the developer experience (IMO). Snow Leopard / Lion was Mac OS X at its best. Once we left the cats behind it started going wrong.
Intel has given way to Arm, moving things even further from those deployment servers. I've come to the realisation that what I actually need most is x86 Linux adjacency, and nothing provides that better than running x86 Linux itself.
Can you give some examples of how they're making it worse for developers? I've never used macOS before, so I have no clue what's different about it.