A few months ago I fell into the optimization rabbit hole because I had a slow quant finance library to take care of. So far my most successful optimizations have been using local memory allocators (see my C++ post; I also played with mimalloc, which helped, but custom local allocators are even better) and rethinking class layouts in a more “data-oriented” way, mostly going from array-of-structs to struct-of-arrays layouts whenever it’s advantageous to do so (see for example this talk).
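
To make the AoS → SoA part concrete, here’s a minimal C++ sketch (the struct and field names are made up for illustration, not taken from any real library): a loop that only needs one field touches far less memory when that field is stored contiguously.

```cpp
#include <vector>

// Array-of-structs: each record carries all its fields together, so a loop
// that only needs `notional` still drags the other fields through the cache.
struct TradeAoS {
    double notional;
    double rate;
    int    counterparty_id;
    bool   is_live;
};

double total_notional_aos(const std::vector<TradeAoS>& trades) {
    double sum = 0.0;
    for (const TradeAoS& t : trades) sum += t.notional;  // strided access
    return sum;
}

// Struct-of-arrays: each field lives in its own contiguous vector, so the
// same loop streams through one tightly packed array.
struct TradesSoA {
    std::vector<double> notional;
    std::vector<double> rate;
    std::vector<int>    counterparty_id;
    std::vector<bool>   is_live;
};

double total_notional_soa(const TradesSoA& trades) {
    double sum = 0.0;
    for (double n : trades.notional) sum += n;  // contiguous access
    return sum;
}
```

With the SoA layout the hot loop reads one densely packed array, which is friendlier to the cache and to auto-vectorization.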

What are some of your preferred optimizations that yielded sizeable gains in speed and/or memory usage? I realize that many optimizations aren’t necessarily specific to any given language so I’m asking in [email protected].

  • Alex@programming.dev · 11 months ago

    More than ten years ago I was a team lead at a game development company, and during yet another crunch I was badly overloaded with tasks, including code review. And of course, I missed some things. We were optimizing a game that didn’t even manage 30 fps on the minimum-spec target machine, and the target was at least 40 fps. Over the course of a month we jointly optimized everything and reached an almost stable 40 frames, but it still wasn’t enough: periodically there were drops down to 15–20 fps. Everyone was in a panic, there were no ideas left, we had optimized everything.

    No, I understand that there is no a priori limit to optimization, but I mean reasonable optimizations, without rewriting everything in assembler separately for every supported architecture.

    So, one night I looked into the event loop initialization and found that it happened twice. Twice. Two parallel event loops, that is, two parallel cycles of logic and state updates. That night, by deleting one line, I improved the game’s performance by more than 100%. 🤦🏻‍♂️

    I investigated and found out that this second initialization had been left in by a junior “for debugging” and then forgotten in the crunch. And I had missed it in the review.

    I’ve kept this story a secret all these years, and even now I’m not revealing any names. Otherwise it could have a dramatic impact on the careers of many.

  • lysdexic@programming.dev · 11 months ago

    The optimization I’m most proud of was on a legacy project whose end-to-end builds took around an hour. After spending some time working on its architecture, project layout, and build system, I managed to get full end-to-end builds down to 10 minutes and incremental builds to be almost instant.

    What makes me proudest of this work is that the technical debt plaguing that legacy project was, directly and indirectly, the reason half a dozen of my team members burned out and quit the company. After that point my remaining teammates were far less stressed and team velocity skyrocketed, simply because the thought of iterating on a bugfix and posting a pull request no longer cost at least an hour, and sometimes two or three.

      • lysdexic@programming.dev · 10 months ago

        > Surely you got a bonus and a raise out of it, right? Right??

        The only reward I got from it was recognition from my team members, which was already more than what I was expecting to get.

        My manager was praised for the higher team velocity and improvements in the team’s burndown chart. The hallmark of having done good work is seeing others trying to take credit for it.

  • Mikina@programming.dev · 10 months ago

    I was working on a pretty well-known game, porting it to consoles.

    On PS4 we started getting OOM crashes after you’d played a few levels, because the PS4 doesn’t have that much memory. I was fairly new to the project and didn’t know it very well, so I started profiling.

    It turned out that all the levels were saved as pretty verbose JSON files, and all of them were referenced from Unity ScriptableObjects. So even if you aren’t playing a given level, they all get loaded into memory, because once something references a ScriptableObject, it gets loaded immediately. That was 1.7 GB of JSON strings loaded into memory as soon as the game started, and it stayed there for the whole play session.

    I wrote a build script that compresses the JSON strings with gzip, and the game decompresses them only when loading the actual level.

    It reduced the memory used by all the levels from 1.7 GB down to 46 MB, while also cutting the game’s load time by around 5 seconds.
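
    The actual game was Unity/C#, but the core idea — keep each level’s JSON compressed in memory and only inflate the one you’re about to load — can be sketched in C++ with zlib (note: zlib’s compress/uncompress use the zlib wrapper rather than the gzip container, but the effect is the same; all names here are illustrative):

    ```cpp
    #include <stdexcept>
    #include <string>
    #include <vector>
    #include <zlib.h>

    // Build step: compress a level's JSON once and keep only the compressed
    // bytes (plus the original size, needed to allocate the output buffer).
    struct CompressedLevel {
        std::vector<unsigned char> bytes;
        uLong original_size = 0;
    };

    CompressedLevel compress_level(const std::string& json) {
        CompressedLevel out;
        out.original_size = static_cast<uLong>(json.size());
        uLongf dest_len = compressBound(out.original_size);
        out.bytes.resize(dest_len);
        if (compress(out.bytes.data(), &dest_len,
                     reinterpret_cast<const Bytef*>(json.data()),
                     out.original_size) != Z_OK)
            throw std::runtime_error("compress failed");
        out.bytes.resize(dest_len);  // shrink to the actual compressed size
        return out;
    }

    // Load step: inflate only the level the player is about to enter.
    std::string decompress_level(const CompressedLevel& level) {
        std::string json(level.original_size, '\0');
        uLongf dest_len = level.original_size;
        if (uncompress(reinterpret_cast<Bytef*>(&json[0]), &dest_len,
                       level.bytes.data(),
                       static_cast<uLong>(level.bytes.size())) != Z_OK)
            throw std::runtime_error("uncompress failed");
        json.resize(dest_len);
        return json;
    }
    ```

    The trade-off is a little CPU at level-load time in exchange for keeping tens of megabytes resident instead of the full 1.7 GB.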