I’ve been wondering this recently. I grew up on atari/nes/snes and so of course almost all of those games (pretty sure all) are written in assembly and are rock solid smooth and responsive for the most part. I wonder if this has affected how I cannot stand to play badly optimized games with even a hint of a laggy feel to them. I’ve always been drawn to Quake and CS for that reason: damn smooth. And no, it doesn’t just need to be FPS games either. I can’t play Beat Saber with a modicum of lag or I suck massively, but others can play just fine and not even notice the lag.
It’s odd. I feel like a complainer, but maybe I just notice it more easily than others?
No. The thing is, AAA games are now being released in an unoptimized state way too often. Even if you still get good FPS, microstuttering and short lag spikes still occur frequently.
Of course this can make you wonder if it’s a you problem and you’ve just gotten too sensitive.
Nope, this is an industry problem. Why would you optimize a game? No, legitimately asking. It doesn’t affect sales numbers, it often doesn’t significantly tank your Steam review score (which most publishers don’t care about anyway), and there are practically no downsides to not optimizing your game.
But if you do value optimization, it lowers dev velocity, requires more training/awareness for devs and artists, and you won’t be able to ship as fast anymore. And on top of that you get… nothing. A few more sales, maybe?

I’m going to push back on that a fair bit.
I used to agree it was “optimization” problems. And there are definitely some games/engines with those (I love Team Ninja but… god damn).
But it is also that mindsets have changed. Most people know of the “can it run Crysis?” meme… if only from Jensen. But it was a question for a reason because Crysis (and other games) genuinely pushed the envelope of what desktop computers could handle. It was an era where you really would put a LOT of effort into figuring out what settings would get you what framerate and “ultra” was something that only the super rich or the people who JUST built a new computer could expect to run.
But around the launch of the PS4/XBONE, that all changed. Consoles were just PCs for all intents and purposes and basically all games “worth playing” were cross-platform. So rather than taking advantage of the latest nVidia card or going sicko mode for the people who got the crazy powerful single-thread-performance i7, they just targeted what the consoles could run. So when people did their mid-gen upgrades of PCs… suddenly “ultra” and “epic” were what we began defaulting to. Just crank that shit up, turn off whatever you don’t like, check your framerate, and go from there.
The refresh SKU consoles bumped up the baseline but… not all that much since those games still had to run on a base XBONE. And then we got the PS5/XSEX which… you know how it is never a good time to build a new PC? It was REALLY not a good time to build a new console as ray tracing and upscaling/framegen rapidly became the path forward in the hardware/graphics space. But also? Those launched during COVID so the market share of the previous gen remained very large and all those third parties continued to target the previous gen anyway.
Which gets back to PC gaming. Could more effort be put in to improve performance? Yeah, definitely. But we are also getting reminded of what things were actually like until the mid-10s, when you might only play a game on Medium or High, and wanting that new game to be gorgeous was what motivated you to drive down to Best Buy and get a new GPU.
But instead it is the devs’ fault that we can’t play every game on maxed-out Epic settings at 4K/240Hz… because this generation never knew any different.
I get what you’re trying to say but I’ve definitely experienced performance problems even on lowest settings.
The issue isn’t that everyone tries to run the game maxed out. The issue is that fundamental problems are often left in the games that you can’t just fix by lowering quality settings.

And there is a reason the +/- (?) buttons literally changed the render window for DOOM and the like. Like… those iconic HUDs were specifically so that those playing on a 640×480 monitor might actually only have to worry about a 640×360 game and so forth.
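For a rough sense of what that render-window trick buys, here’s a back-of-the-envelope check using the 640×480 vs 640×360 figures above (illustrative only; actual games used different internal resolutions):

```python
# Pixel-count saving from shrinking the render window behind the HUD,
# using the figures quoted above.
full_screen = 640 * 480
behind_hud = 640 * 360
print(f"rendered area: {behind_hud / full_screen:.0%} of the full screen")  # 75%
```

So pulling the render window up to the HUD line alone cuts a quarter of the pixels the renderer has to fill every frame.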
Same with those of us who played games like Unreal Tournament at 18-24 FPS at 800×600.
Like I said, there are definitely some problem children (again: Team Ninja). But it is also worth remembering that most games are still targeting a previous gen console SKU at 1080p. And, ironically, the optimizations are going to be geared more towards that.
Which… is why upscaling is such a big deal. Yeah “AI Upscaling” is a great buzzword. But it really is no different than when we used to run OFP at a lower resolution on the helicopter missions. It is just that now we can get “shockingly good” visuals while doing that rather than thinking Viktor Troska looks extra blocky.
Like, I’ll always crap on Team Ninja’s PC ports because they are REALLY bad… even if that is my preferred platform. But it took maybe 2 minutes of futzing about (once I got to Yokohama proper and had my game slow to sub 20 FPS…) to get the game to look good and play at a steady 60 FPS. No, it wasn’t at Epic (or whatever they use) but most of the stuff was actually on High. Is it the same as just hitting auto-detect and defaulting to everything maxed out? Of course not. But that gets back to “Can it run Crysis?”
If you folks want to have a really hard time, find a way to play the NES version of Mike Tyson’s Punch-Out on original hardware with a CRT monitor and then play it on any emulator on a modern monitor. You will feel like you’ve aged 80 years.
Been wondering this, or something like this.
I used to be good at Mario 1, but I cannot play it on emulators. It feels like there’s a delay. It feels a little like Mario is on ice, much like the ice levels of Mario 2. Mario is running, and I want to jump or stop, but there’s a noticeable delay and it makes me feel like my old ass has lost my touch. But playing any modern game, my reflexes are good enough. In a Nintendo to Nintendo comparison, I play Animal Crossing on the Switch, and sure enough, if I’m running and pull back on the stick, my villager skids at exactly the time I want them to. But on that same Switch with the same controller, I can’t control Mario in Mario 1 worth a damn. I do just fine in Super Mario Wonder, though.
(Side note, more to do with Animal Crossing than older games, but I’ve noticed a wired controller, plugged into the Switch dock via USB, with the Switch on the dock, gets more latency than the Switch in handheld mode, which I’m pretty sure uses Bluetooth to connect to its controllers, even if they’re physically connected — not 100% sure on that. But for one example, fishing — even the five-star rarity fish — is quite easy in handheld. But, with the wired connection, I mash A as soon as the fish bites, and it still slips my hook. Maybe the latency isn’t from the controller to the dock to the Switch, maybe it’s from the Switch to the dock to the TV (and speakers since I close my eyes and listen for the sound, which most animal crossers agree is the best way to fish).)
It’s mostly the TV. The input difference between wired and BT should be very small, though the Switch is not optimized for wired controllers. The variability of TV response times, on the other hand, is massive in comparison. Especially modern TVs with heavy post-processing that think they’re being clever trying to interpolate frames, or other shit like bad HDR implementations, etc. HDMI DRM also adds latency.
All that causes most TVs to be subpar for gaming. I still game on TV, mostly cozy games. But I accept that nothing competitive will come out of gaming on a TV.
On the one hand, we’re more accustomed to better hardware latency. On the other hand… we played first-person shooters on 56K modems. The lag was legendary.
Wasn’t prediction baked into the netcode very early in the FPS genre? I wasn’t playing multiplayer in the Doom days, but by the late 90s, you wouldn’t have latency so much as you’d have rubberbanding. Games also use very little bandwidth, so 56K was no different than broadband, from my recollection.
Yes and no.
Different games (really engines) had different models for it. Some games you would feel things grind to a halt while you waited for a packet. Others you would have rubber banding where the prediction of what your opponent would do was wrong and they teleport 2 meters to the right. And a select few would result in endless double kills as you both killed the predictions.
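A minimal sketch of how that prediction-plus-correction produces the warp (hypothetical names and numbers, not any particular engine’s netcode):

```python
# Client-side prediction + reconciliation, reduced to one axis.
from dataclasses import dataclass

@dataclass
class State:
    x: float    # position along one axis, in meters
    vx: float   # velocity, in meters per tick

def predict(state: State, ticks: int) -> State:
    # The client keeps simulating with the last known velocity while it
    # waits for the next authoritative packet.
    return State(state.x + state.vx * ticks, state.vx)

def visible_warp(predicted: State, authoritative: State) -> float:
    # When the server update finally arrives, the client snaps (or lerps)
    # to the authoritative position; this is how far the opponent appears
    # to teleport on your screen.
    return abs(authoritative.x - predicted.x)

# Opponent was last seen running right at 0.1 m/tick; 12 ticks of updates go missing.
last_known = State(x=10.0, vx=0.1)
predicted = predict(last_known, ticks=12)

# The server says they actually stopped and backtracked instead.
authoritative = State(x=9.2, vx=-0.1)
print(f"warp: {visible_warp(predicted, authoritative):.1f} m")  # ~2.0 m
```

The longer the client has to extrapolate before the authoritative update lands, the bigger that snap gets, which is why it was so much more noticeable on flaky connections.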
The big difference was that arena shooters (which DOOM effectively was) tended to have encounters where you might have 3 or 4 players all shooting each other at once with a high enough TTK that it was very easy to lose track of one enemy because you saw a more immediate threat. So it was a lot easier to just assume the rubber banding was a you problem or not notice it at all.
Then we had CoD and it all became about super short TTK and 1on1 fights. And now? Now it was incredibly obvious when someone warped because they were your only concern.
Back in the day, my games were UT (mostly the good one, sometimes 2k4), Jedi Knight 2, Tribes 2, and Operation Flashpoint. I was a cool kid… But even then, it was almost never perceptible in UT even though the Unreal Engine had “the worst netcode”. Also not OFP since your encounter ranges were so long and you were squinting through iron sights so you had no idea if you missed because of lag or what. But JK2 and Tribes 2 were VERY obvious when the network was acting up because you were generally dueling someone or taking out a lone flag carrier while skiing across a field.
First multiplayer FPS I played was Jedi Knight: Dark Forces II (released in '97). In that game, you had to lead your shots to a silly degree to actually hit anyone. But I think you’re right; by then most games weren’t suffering from that problem as much.
I played using a cell phone connected by USB with a 14k data connection. It was slow af but I got unlimited data for $5 a month and it didn’t tie up the land line.
There are so many things that go into whether a game feels responsive or not. Your experience could be explained by anything from access to stable Internet to trends in game design philosophy, and can vary from game to game based on implementation.
Here’s one of my favorite GDC talks that looks at just one small part of what goes into making a game feel responsive: https://youtu.be/h47zZrqjgLc
I was actually not thinking about online games when posting this. Too many variables there.
Sure, this is just an example of how complex “feel” can get in game development. The video includes several examples where player perception changes drastically from very minor gameplay design changes.
The other way around. I grew up playing games on PCs that were quite underpowered for a long time. I played Doom like this. Hell, I had to reduce screen size even in Wolfenstein 3D. I loved fog in GTA San Andreas because it reduced draw distance, and when it was raining in Las Venturas, I had to look at my feet like I was speedrunning GoldenEye. I played through Oblivion in a 640 x 480 window and thought it looked amazing. I still have to fight not to turn off AA completely the first time I run a game on my RTX 3080, because it was the first thing to go for so long.
All of this trained my brain, so now I have built-in antialiasing and frame generation. I don’t give a shit. Give me good art direction and a gameplay loop and I can just generate smooth graphics in my head.
I had a super underpowered PC I grew up with and it influenced my imagination. For a long time stuff I’d imagine also ran at like 15-20FPS. Really weird effect.
You obviously did not play on PC back when, if you didn’t have the newest graphics card, everything was laggy but still playable.
Agreed, also CRTs ruined the future for me as well.
I feel the opposite when I hear people complain about load times… “We want you to buy our SSD so your game will boot in 11 seconds instead of 19 seconds!”
Son, let me tell you about loading games from cassette tape.
You’d start it loading, get up and go have dinner with the family. After 30 minutes, maybe it would be done. Maybe.
Maybe it hit an error 5 minutes after you walked away and now you need to re-wind and try again.
When did they have games on tape?
C64, for one!
Oh, you sweet summer child…
Up to the 90s, my friend. Then 3.5" floppies took over (1.44 MEGAbytes!), then came Zip disks (100MB) but only for rich people, then it became the era of CD and later DVD burning. Internet was not measured in Mbits back then, and most of the time not even in Kbits. The internet was not a valid delivery system; it was slow and very expensive. The first memory cards (CF) also showed up around the millennium, and from there it went on to the 10s, and around there you got the pivot to what we have now.
Tape is still around in computing; it’s cheap, it’s cheerful, dependable, and has quite a lot of throughput. Seeking on it is still horrible though. But anyway, watching a real mechanised tape library do its thing backing up computer systems is still mesmerizing.
You left out 5 1/4" floppy disks that were actually floppy. Yes, I know there were 8" floppies, but those were mostly business use with specialized drives that you didn’t really get in the home computer market. Atari, Commodore, Radio Shack, etc. all had 5 1/4" floppy drives, and when I got my first box of floppies, it was $50 of early 1980s money for 10 disks. And on my Atari they held about 90K worth of space.
Definitely also a thing in Germany. Alongside magazines printing the source code of games for you to type in.
Nobody had this; it was way too expensive for what it was. Everybody just kept saving for an MSX or Commodore and skipped this.
The generation of Amstrad, Spectrum, etc. had their games on tape. I would say they were the closest thing to a console pre-NES, so 1980s. I had an Amstrad that was handed down to me by a friend of an older sister and it had tapes like this.
Late 1970s / early 1980s.
Beginning of file not found… Shit, didn’t rewind it far enough
An effect you may be noticing is motion smoothing, or the lack of it.
If you play Pong on an old console, it likely moves the paddle at full speed the moment it gets input to move. Acceleration is instant. This is very precise, but it also feels unnatural.
Modern versions will usually have some acceleration time that smooths out movement. It can be a very small effect, but it feels more natural and most people prefer it. It’s also less precise. People generally learn to compensate for it over time.
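A minimal sketch of the two approaches (hypothetical numbers, not taken from any actual Pong code):

```python
# "Classic" paddle movement hits full speed the instant input arrives;
# "smoothed" movement eases velocity toward the target over a short
# acceleration time.
MAX_SPEED = 300.0   # pixels per second
ACCEL_TIME = 0.15   # seconds to reach full speed in the smoothed version
DT = 1.0 / 60.0     # one frame at 60 Hz

def classic_step(velocity: float, input_dir: float) -> float:
    # Full speed on the very first frame the button is held.
    return input_dir * MAX_SPEED

def smoothed_step(velocity: float, input_dir: float) -> float:
    # Move current velocity toward the target by a capped amount per frame.
    target = input_dir * MAX_SPEED
    max_delta = (MAX_SPEED / ACCEL_TIME) * DT
    delta = max(-max_delta, min(max_delta, target - velocity))
    return velocity + delta

v_classic = v_smooth = 0.0
for frame in range(4):              # player holds "up" for four frames
    v_classic = classic_step(v_classic, 1.0)
    v_smooth = smoothed_step(v_smooth, 1.0)
    print(frame, round(v_classic), round(v_smooth))
# classic is at 300 px/s immediately; smoothed takes ~9 frames to get there
```

That handful of frames of ramp-up is exactly the imprecision described above, and also why people stop noticing it once they’ve learned to compensate.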
It’s so weird to me that no one uses the term “slowdown” any more. Lag and latency meant networking delays back in the days you’re talking about. Not a complaint, just an observation that I’ve been wondering about the last few years.
But yeah, as others said, slowdown/lag was pretty common. I immediately think of the ninjas jumping out of the water in TMNT3, the beginning of Top Man’s stage in Mega Man 3, and the last boss of The Guardian Legend, but there were many more. Early 3D is shocking too, with more sub-30-FPS games than you remember. Some capped themselves at 20, even. [Edit: Now that I think about it, even some NES games capped at 20. Strange times.]
“Lag” does indeed come from network/signal theory and does indeed refer to networking. Been a minute, but I want to say lag is the round trip delay and latency is A to B but don’t quote me on that.
That said? Nobody cared. “Lag” was always the time between action and response. Some of that might be input delay. Some of that might be display delay (which has always been over-exaggerated but…). And a lot of that really was network delay. These days it tends to be more rendering/logic delay because people who are playing on shitty internet connections know it.
I believe OP is referring to input latency, which isn’t so much a result of the system slowing down due to increased load as it is running in a consistently slowed-down state, causing a delay in your inputs being reflected on-screen. There are several reasons why this is happening more often lately.
Part of it has to do with the displays we use nowadays. In the past, most players used a CRT TV/monitor to play games, which have famously fast response times (the time between receiving the video signal and rendering that signal on the screen is nearly zero). But modern displays, while having a much crisper picture, often tend to be slower at the act of actually firing pixels on the screen, causing that delay between pressing Jump and seeing your character begin jumping.
Some games also strain their systems so hard that, after various layers of post-processing effects get applied to every rendered frame, the displayed frames are already “old” before they’re even sent down the HDMI cable, resulting in a laggier feel for the player. You’ll see this difference in action with games that have a toggle for a “performance/quality” mode in the graphics settings. Usually this setting will enable/disable certain visual effects, reducing the load on the system and allowing your inputs to be registered faster.
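As a rough illustration of why shedding one heavy pass helps, here’s a toy button-to-photon budget (the stage names and millisecond values are made up for illustration, not measurements from any real game or mode):

```python
# Toy input-to-photon latency budget. "Quality" mode keeps the heavy
# post-processing pass; "performance" mode drops it.
stages_ms = {
    "input polling": 4.0,
    "game/sim update": 8.0,
    "render": 10.0,
    "post-processing": 6.0,            # the pass performance mode skips
    "display processing/scanout": 10.0,
}

quality = sum(stages_ms.values())
performance = quality - stages_ms["post-processing"]
print(f"quality mode:     ~{quality:.0f} ms button-to-photon")
print(f"performance mode: ~{performance:.0f} ms button-to-photon")
```

The numbers are invented, but the shape is the point: every extra pass sits between your button press and the photons, so anything the performance toggle removes comes straight off that total.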
You’re right. Yes, there are slowdowns in a lot of older games, but not necessarily input lag. The slowdowns hardly bother me at all. I think you hit right on it!
There are only a few reasons I can surmise that this would be the case:
CRTs don’t add any input lag
There’s no extra latency from being connected to the internet
There’s no latency from bluetooth/wireless on the controller
Because most older games are extremely badly optimised by today’s standards. The original Metroid slows to an absolute crawl when there’s more than about 4 sprites on the screen; the dragon boss in Mega Man (2, I think) was such a laggy, slippery mess that I gave up trying to beat the game; Ocarina of Time runs at 20FPS (worse if you’re in a PAL territory like I am), and that’s one of the better playing N64 games.
I think you’re either noticing one of these extra sources of delay, or you’re blinded by nostalgia.
If you’re measuring display lag the same way we measure it with modern LCDs, then yes, CRTs do have lag.
Unless it’s an HD one, there’s no input buffer so it’s impossible for a CRT to have more than a frame of input lag. And the console needs a frame to notice your input anyway.
You measure display lag to the point where the frame is halfway down the screen, and a CRT draws the picture top to bottom over a full refresh. By that measure, CRTs have input lag of half a refresh period. For NTSC, that’s about 8ms. For PAL, 10ms.
Incidentally, a modern gaming LCD has a 2ms average pixel response time. Which is about the same as the difference between NTSC and PAL.
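Those figures are just half a refresh period; a quick sanity check (assuming ~59.94 Hz for NTSC and 50 Hz for PAL):

```python
# Half a refresh period = the mid-screen display lag figure quoted above.
for name, hz in [("NTSC", 59.94), ("PAL", 50.0)]:
    half_frame_ms = 1000.0 / hz / 2.0
    print(f"{name}: ~{half_frame_ms:.1f} ms")   # NTSC ~8.3 ms, PAL 10.0 ms
```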
Classics still had lag. DK Country 3’s final boss was so laggy it’d affect the boss music.
Not quite the super classics you mentioned, but a chunk of the speedrun tech around Super Mario 64 is about how to optimize the camera to avoid lag from certain effects (the sunshine leading to the Wing Cap, the tower at the top of Whomp’s Fortress, the sub in Dire, Dire Docks).
Also OOT only ran at 20 fps
Ocarina of Time ran at 20 fps as a compromise for having the largest draw distance of any game on the Nintendo 64.
Oh absolutely
I say that less as a knock on the game and more to point out that there were technical compromises made back in the day as well. Nostalgia sometimes hits hard and people assume everything ran blazing fast.
I grew up on atari/nes/snes and so of course almost all of those games (pretty sure all) are written in assembly and are rock solid smooth and responsive for the most part.
HA!
Older games were laggy as all fuck and had very significant input delay.
But ignoring the rose tinted glasses: I DO think there is some element of truth to this: My formative years of online gaming were 56k and an ATI Rage. I probably logged at least a thousand hours of UT at 20-ish FPS and my ping was regularly in the hundreds. I can definitely appreciate lower latency games, but I mostly just need VRR (for screen tearing and the like) and I am set. Whereas one of the younglings from work pretty much can’t play anything below 60 FPS… and we have tested this.
I’m not certain what input delay you’re referring to. It is likely very dependent on the games I play as well. Of course, some of the older games pushing the hardware to the max were laggy when a lot of sprites etc. were loading.
I think so – gamers these days complain about having 50 ping or less than 120 FPS. There’s certainly a point at which it seriously impacts your gameplay, but I find it laughable when they can’t deal with performance better than anything that even existed 15 years ago.