Way back in the day, every game had its logic tied to its framerate, as anyone who's ever tried to run an eighties PC game on a nineties PC, only to watch it run at 20x speed and become completely unplayable, can tell you.
But in the modern day this is less common. Generally the game keeps chugging along at the same pace no matter how fast or slow the frames are being presented (unless, of course, everything is bogged down so hard that even the game logic is struggling).
And yet, you'll still find a few. Any Dark Souls fan who played the Prepare to Die Edition back when it first came to PC will remember how unlocking the framerate could cause collision bugs and send you into the void. And I recently watched a video of a gent who massively overclocked a Nintendo Switch OLED and got Tears of the Kingdom running at 60 FPS… except everything was, indeed, running in fast-forward rather than just being smoother.
This makes me wonder: is there some unseen advantage to keeping game logic and framerate tied together? Perhaps something that only really shows on weaker hardware? Or is it just devs going "well, the hardware we're targeting won't really go over this speed, and we don't really give a fuck about anything else" and not bothering to decouple them?
Nearly every game I know of that ties things to the frame rate is one that requires frame-perfect inputs for many of its mechanics, so I assume it's just a lot easier to keep the timings predictable and achievable across a wide range of hardware that way than it would be to base everything on the internal clock or some other time measurement that can vary from machine to machine.
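To make that concrete, here's a rough sketch of the trade-off (not from any particular engine; the names and numbers are purely illustrative). A frame-locked update expresses speeds in per-frame units, which is perfectly predictable as long as the game actually runs at the targeted rate, while a delta-time update scales by measured elapsed time and stays consistent in real time even when the frame rate wobbles:

```cpp
#include <chrono>

// Hypothetical game state; nothing here comes from a real engine.
struct Player { double x = 0.0; };

// Frame-locked update: speed is "per frame", assuming the game always hits
// its target rate (say 60 FPS). Timings are exact and predictable, but if the
// frame rate changes, the whole game speeds up or slows down with it.
void update_frame_locked(Player& p) {
    constexpr double speed_per_frame = 0.1;  // units per frame at the target 60 FPS
    p.x += speed_per_frame;
}

// Delta-time update: speed is "per second", scaled by the real time elapsed
// since the last frame, so the game runs at the same pace regardless of frame
// rate. The cost is that the step size now varies from frame to frame.
void update_delta_time(Player& p, double dt_seconds) {
    constexpr double speed_per_second = 6.0;  // same speed as above at 60 FPS
    p.x += speed_per_second * dt_seconds;
}

int main() {
    using clock = std::chrono::steady_clock;
    Player a, b;
    auto last = clock::now();
    for (int frame = 0; frame < 600; ++frame) {
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;
        update_frame_locked(a);    // races ahead if the loop runs faster than 60 FPS
        update_delta_time(b, dt);  // stays consistent in real time
    }
}
```

The downside of the delta-time version is exactly what the frame-perfect crowd cares about: the step size changes every frame, so "this attack comes out in exactly 12 frames" stops being a fixed, testable quantity.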
My only experience with programming time-based things is in mission creation and scripting for ARMA, but it has very much the same kind of issue: if you want everyone playing on the same server to have the same timings for events and such, you can't just use the game timer. It would be several milliseconds off between each client and the server, making it unpredictable and desynchronized from every other player.
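If it helps, here's a sketch of the usual workaround, written as made-up C++ rather than ARMA's actual scripting language: instead of every client counting down on its own local game timer, the server announces a single "event fires at server time T", and each client converts that into its own clock using a measured offset. Everything below (the offset handshake, the names) is an assumption for illustration only:

```cpp
#include <chrono>

using seconds = double;

// Assume we've already measured this client's offset from the server clock,
// e.g. via a ping/pong handshake: offset ~= server_time - (local_send + rtt / 2).
struct ClockSync {
    seconds offset_to_server = 0.0;  // server_time - local_time
};

// Local monotonic time since the program started.
seconds local_time_now() {
    using clock = std::chrono::steady_clock;
    static const auto start = clock::now();
    return std::chrono::duration<seconds>(clock::now() - start).count();
}

// Convert a server-announced fire time into this client's local clock.
seconds local_fire_time(seconds server_fire_time, const ClockSync& sync) {
    return server_fire_time - sync.offset_to_server;
}

int main() {
    ClockSync sync{0.042};             // pretend we measured a 42 ms offset
    seconds server_fire_time = 120.0;  // server says: event fires at t = 120 s, server time
    seconds when = local_fire_time(server_fire_time, sync);
    // Each client waits until local_time_now() >= when, so everyone fires the
    // event at roughly the same real moment despite their clocks differing.
    (void)when;
    return 0;
}
```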