• 3 Posts
  • 116 Comments
Joined 1 year ago
Cake day: June 23rd, 2023






  • Yea, I wasn't saying it's always bad in every scenario - but we used to have this kinda deployment at a professional company. It's pretty bad if this is still how you're doing it in an enterprise scenario.

    But for a personal project, it's alright-ish. There are easier setups, though - for example, configuring an automated deployment from Github/Gitlab. You can check out other peoples' deployment configs, since all that stuff is part of the repos, in the .github folder. So probably all you have to do is find a project that's similar to yours, like "static file upload to sftp" - and copy-paste the script into your own repo.

    (for example: a script that publishes a website to github pages)
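
    To give a rough idea of what such a script looks like (this is only a sketch, not taken from any specific repo - the branch name and the ./public build folder are assumptions), a minimal GitHub Actions workflow that publishes a static site to GitHub Pages is something like:

    ```yaml
    # .github/workflows/deploy.yml - hedged sketch, assumes the built site ends up in ./public
    name: Deploy static site
    on:
      push:
        branches: [main]
    permissions:
      contents: read
      pages: write
      id-token: write
    jobs:
      deploy:
        runs-on: ubuntu-latest
        environment:
          name: github-pages
        steps:
          - uses: actions/checkout@v4
          - uses: actions/upload-pages-artifact@v3
            with:
              path: ./public        # assumption: wherever your build output lands
          - uses: actions/deploy-pages@v4
    ```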


  • I suppose in the days of 'Cloud Hosting' a lot of people (hopefully) don't just randomly upload new files onto a server by hand anymore.

    Even if you're still using normal servers like this, a better practice would be to have a build server: whenever you check code into the main branch, it creates a build for the server, and you deploy from there - instead of compiling locally, opening FileZilla, and uploading by hand.

    If you're using 'Cloud Hosting' - for example AWS - and you're on VMs or bare metal, you'd maybe upload a new Application version or Machine Image to Elastic Beanstalk and deploy that in a more managed way. Or if you're using Docker, you just push a new Docker image to a registry and deploy that.
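
    The Docker route really is just a couple of commands (sketch only - the registry name and tag are placeholders):

    ```sh
    # build the image, tag it with a version, and push it to your registry
    docker build -t registry.example.com/myapp:1.2.0 .
    docker push registry.example.com/myapp:1.2.0
    # whatever runs it (ECS, Kubernetes, Beanstalk's Docker platform, ...) then pulls
    # that tag and rolls it out, instead of you copying files onto the server
    ```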



  • Hmm, well the first round(s) are doable for beginners. If you want to get into programming, these kinda games are a good way to start, since you’re getting visual feedback of what your bot is actually doing.

    And you can participate in loads of languages, so you can pick anything that you’re somewhat familiar with.

    However, once you get into the higher rounds, ranks, and leagues, you'll be playing against other people's bots. So obviously, if you have zero experience it'll be way harder to beat people with loads of experience who understand which algorithms are suitable, etc.

    But I'd say go ahead and try it out. It's free. Maybe it turns out to be too difficult, maybe you'll manage.
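
    For a sense of what "your bot" actually is on these sites (a generic sketch - the input and action formats here are made up, every puzzle defines its own): each turn the platform feeds your program the game state on stdin and expects one action on stdout.

    ```python
    import sys

    # generic turn loop: read the state, decide, print an action
    for line in sys.stdin:
        x, y = map(int, line.split())      # pretend the state is just a target position
        print(f"MOVE {x} {y}", flush=True) # simplest possible "strategy": go straight there
    ```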





  • Defragging an SSD on a modern OS just runs a TRIM command. So probably, when you wanted to shrink the Windows partition, there was still a bunch of garbage data on the SSD that was "marked for deletion" but hadn't fully gone through the SSD's delete cycle yet.

    So "Windows being funky" was just it making you run a "defragmentation" for the purpose of trimming, to prepare the drive for partitioning. Though I don't really see why they don't just run a TRIM inside the partitioning process, instead of making you do it manually through defrag.
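
    For what it's worth, you can trigger that retrim yourself instead of going through the defrag UI (standard Windows commands, run from an elevated PowerShell prompt):

    ```
    defrag C: /L                                      # /L retrims the volume instead of a real defrag
    Optimize-Volume -DriveLetter C -ReTrim -Verbose   # same thing via the Storage module
    ```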




  • Well, to be clear, this was not supposed to be a jab at gitflow, or me complaining specifically about gitflow. I merely used "gitflow" as an example of a set of conventions and standardizations that comes nicely packaged as one whole.

    But there's nothing wrong with gitflow. I was just saying - it's not a set of rules carved in stone that you must follow religiously. If you're using it and it seems more practical to adapt the flow to your own use-case, don't worry that it'd be considered wrong not to stick to it strictly.


  • I think a common misconception is that there’s a “right way to do git” - for example: “we must use Gitflow, that’s the way to do it”.

    There are no strict rules for how you should use git; it's just a tool, with some guidelines for what would probably work best in certain scenarios. And it's fine to diverge from those guidelines, adding or removing some steps depending on what kinda project or team structure you're working in.

    If you're new to Git, you probably shouldn't just look up Gitflow, structure your branches like that, and stick strictly to it. It's gonna be a bit of trial-and-error and altering the flow to create a setup that works best for you - something like the pared-down sketch below.
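
    For example, a stripped-down flow (the branch names are just an illustration) can be as simple as:

    ```sh
    git checkout -b feature/login main   # branch straight off main, no separate develop branch
    # ...commit your work...
    git push -u origin feature/login     # open a merge/pull request from this branch
    git checkout main && git pull        # after the merge, main is what gets built and deployed
    ```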