Little bit of everything!

Avid Swiftie (come join us at !taylorswift@poptalk.scrubbles.tech )

Gaming (Mass Effect, Witcher, and too much Satisfactory)

Sci-fi

I live for 90s TV sitcoms

  • 98 Posts
  • 3.51K Comments
Joined 2 years ago
Cake day: June 2nd, 2023

  • Okay I’m going to go against the grain, and will probably get downvoted to hell, but this is not new. This is PC gaming. This has always been PC gaming. Hot take - you don’t need 4k@60fps to be able to have fun playing games.

    New games require top-of-the-line hardware (or hardware that doesn’t even exist yet) for high-to-ultra settings. Always have, always will. (Hell, we had an entire meme about ‘can it run Crysis’, a game that for a few years would only run at low-to-medium settings even on the highest-end machines.) Game makers want their games not just to work now, but to look great in 5 years too. Unless you’ve shelled out over a grand this year for the absolute latest GPU, you shouldn’t expect any new game to run at great settings.

    In fact, I do keep my PC fairly bleeding edge and I can’t drive more than High settings on most games - and that’s okay. Eventually I’ll play them on Ultra, when hardware catches up. It’s fine.

    And as for low- to mid-level hardware, I was there too - and that’s just PC gaming, friend. I played Borderlands and Left4Dead the year they came out on a very old Radeon card at 640x480 in windowed mode, medium settings, at about 40fps.

    Again, this is just what PC gaming is. If you want crisp ultra graphics, you’re gonna have to shell out the ultra payments. Otherwise, fine-tuning low-to-medium settings and making peace with sub-60fps is all fairly normal.

    Personally, when I upgrade I find great joy in going back and “rediscovering” some of the older games and playing them on ultra for the first time.