I’ve been wondering this recently. I grew up on Atari/NES/SNES, and of course almost all of those games (pretty sure all of them) were written in assembly and are rock solid smooth and responsive for the most part. I wonder if this has affected how I cannot stand to play badly optimized games with even a hint of a laggy feel to them. I’ve always been drawn to Quake and CS for that reason: damn smooth. And no, it doesn’t just need to be FPS games either. I can’t play Beat Saber with even a modicum of lag or I suck massively, but others can play just fine and not even notice the lag.

It’s odd. I feel like a complainer, but maybe I just notice it more easily than others?

  • ThunderComplex@lemmy.today · 1 day ago

    No. The thing is, AAA games are now being released in an unoptimized state way too often. Even if you still get good FPS, microstutters and short lag spikes still occur frequently.
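
    To put rough numbers on that (the frame times below are made up for illustration): an FPS counter averages over many frames, so one long frame barely moves the average even though you feel it as a hitch.

    ```python
    # Illustrative only: 60 frames, one of them a 100 ms hitch.
    frame_times_ms = [16.7] * 59 + [100.0]

    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    print(f"average: {avg_fps:.0f} FPS")             # ~55 FPS, the counter looks fine
    print(f"worst frame: {max(frame_times_ms)} ms")  # one 100 ms frame is a visible stutter
    ```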

    Of course this can make you wonder if this is a you problem and you’ve just gotten too sensitive.

    Nope, this is an industry problem. Why would you optimize a game? No, I’m legitimately asking. It doesn’t affect sales numbers, and it often doesn’t significantly tank your Steam review score (which most publishers don’t care about anyway), so there are practically no downsides to not optimizing your game.
    But if you do value optimization, it lowers dev velocity, requires more training/awareness for devs and artists, and you won’t be able to ship as fast anymore. And on top of all that you get… nothing. A few more sales, maybe?

    • NuXCOM_90Percent@lemmy.zip · 1 day ago

      I’m going to push back on that a fair bit.

      I used to agree that it was an “optimization” problem. And there are definitely some games/engines with those issues (I love Team Ninja but… god damn).

      But it is also that mindsets have changed. Most people know the “can it run Crysis?” meme… if only from Jensen. But it was a question for a reason, because Crysis (and other games) genuinely pushed the envelope of what desktop computers could handle. It was an era when you really would put a LOT of effort into figuring out which settings would get you what framerate, and “ultra” was something only the super rich or the people who had JUST built a new computer could expect to run.

      But around the launch of the PS4/XBONE, that all changed. Consoles were just PCs for all intents and purposes, and basically all games “worth playing” were cross-platform. So rather than taking advantage of the latest nVidia card or going sicko mode for the people who got the i7 with crazy single-thread performance, devs just targeted what the consoles could run. So when people did their mid-gen PC upgrades… suddenly “ultra” and “epic” were what we began defaulting to. Just crank that shit up, turn off whatever you don’t like, check your framerate, and go from there.

      The refresh SKU consoles bumped up the baseline, but… not by all that much, since those games still had to run on a base XBONE. And then we got the PS5/XSEX, which… you know how it is never a good time to build a new PC? It was REALLY not a good time to build a new console, as ray tracing and upscaling/framegen rapidly became the path forward in the hardware/graphics space. But also? Those launched during COVID, so the market share of the previous gen remained very large and all those third parties continued to target the previous gen anyway.

      Which gets back to PC gaming. Could more effort be put in to improve performance? Yeah, definitely. But we are also getting reminded of what things were actually like until the mid-2010s, when you might only play a game on Medium or High, and wanting that new game to be gorgeous was what motivated you to drive down to Best Buy and get a new GPU.

      But instead it is the devs’ fault that we can’t play every game on maxed-out Epic settings at 4K/240Hz… because this generation never knew any different.

      • ThunderComplex@lemmy.today · 1 day ago

        I get what you’re trying to say, but I’ve definitely experienced performance problems even on the lowest settings.
        The issue isn’t that everyone tries to run the game maxed out. The issue is that fundamental problems are often left in games that you can’t just fix by lowering quality settings.

        • NuXCOM_90Percent@lemmy.zip · 1 day ago (edited)

          And there is a reason the +/- (?) buttons literally changed the render window in DOOM and the like. Like… those iconic HUDs existed specifically so that someone playing on a 640×480 monitor might actually only have to render a 640×360 game view, and so forth.

          Same with those of us who played games like Unreal Tournament at 18–24 FPS at 800×600.
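
          Rough napkin math on why shrinking the render window helped, assuming per-pixel cost is what dominates (the resolutions are the ones mentioned above):

          ```python
          # DOOM-style render window: the HUD covers part of the screen,
          # so the engine only has to shade the smaller 3D view.
          full_view = 640 * 480  # whole screen
          hud_view = 640 * 360   # 3D view, with the status bar taking the rest

          fewer = 1 - hud_view / full_view
          print(f"{hud_view} vs {full_view} pixels ({fewer:.0%} fewer to shade)")  # 25% fewer
          ```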

          Like I said, there are definitely some problem children (again: Team Ninja). But it is also worth remembering that most games are still targeting a previous gen console SKU at 1080p. And, ironically, the optimizations are going to be geared more towards that.

          Which… is why upscaling is such a big deal. Yeah, “AI upscaling” is a great buzzword. But it really is no different than when we used to run OFP at a lower resolution for the helicopter missions. It is just that now we get “shockingly good” visuals while doing it, rather than thinking Viktor Troska looks extra blocky.
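
          It is the same trade dressed up: render at a fraction of native resolution and let the upscaler fill in the output. Another napkin sketch; the 0.67 per-axis scale here is my assumption for a typical “quality” mode, not any specific upscaler’s number:

          ```python
          # Pixel work scales with area, so a per-axis scale factor counts twice.
          native_w, native_h = 3840, 2160  # 4K output
          scale = 0.67                     # assumed "quality" render scale

          internal_w = round(native_w * scale)
          internal_h = round(native_h * scale)
          print(f"internal render: {internal_w}x{internal_h}")  # 2573x1447, near 1440p
          print(f"pixel work: {scale ** 2:.0%} of native")      # ~45%
          ```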

          Like, I’ll always crap on Team Ninja’s PC ports because they are REALLY bad… even though PC is my preferred platform. But it took maybe 2 minutes of futzing about (once I got to Yokohama proper and had my game slow to sub-20 FPS…) to get the game to look good and play at a steady 60 FPS. No, it wasn’t at Epic (or whatever they call it), but most of the stuff was actually on High. Is it the same as just hitting auto-detect and defaulting to everything maxed out? Of course not. But that gets back to “Can it run Crysis?”