• 0 Posts
  • 1.82K Comments
Joined 1 year ago
Cake day: June 30th, 2023

  • Here’s one I just realized: we’re closer to 2070 than we are to 1970.

    In general, if someone is over 50, we’re closer to their 100th birthday than to their birth year. The present moment is equidistant between 1974 and 2074, and that pivot year goes up by one every year.

    In 1990, we were equidistant from 1940 and 2040.

    And the bit you mentioned might make you feel old, but consider that anyone born before 1976 still spent more time in the 20th century than the 21st, or more time in the 2nd millennium than the 3rd. The ones who can’t say what you said are the really old ones.

    And both people in the running for president haven’t even spent a third of their lives in the current millennium.
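    The equidistance arithmetic above can be checked with a quick sketch (assuming “currently” means 2024, per the 1974/2074 example):

```python
CURRENT_YEAR = 2024  # assumption: the "currently" in the comment above

def closer_to_100th(birth_year, year=CURRENT_YEAR):
    """True when `year` is at least as close to the person's 100th
    birthday as to their birth year, i.e. they are 50 or older."""
    return (birth_year + 100) - year <= year - birth_year

assert closer_to_100th(1974)      # exactly equidistant: 50 years each way
assert closer_to_100th(1960)      # over 50: closer to 2060 than to 1960
assert not closer_to_100th(1990)  # under 50: still closer to birth year
```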



  • Because our brains interpret colours and shading relative to their surroundings. That specific blue is on the opposite side of the colour wheel from red, so that relative lack of blue can be interpreted by our brains as red.

    Remember that white is all colours present, so white next to white will have more red than white next to blue.

    You’d get a similar effect if you stare at a bright blue version of the can for a while and then look at a blank white page or close your eyes. The afterimage isn’t the same colour as the thing you were staring at; it’s the inverse of that colour.
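    The “inverse of that colour” bit can be sketched as a simple RGB complement (a rough model assuming 8-bit channels; the eye’s opponent channels aren’t literally RGB, but the complement is a decent approximation of an afterimage):

```python
def inverse_rgb(r, g, b):
    """Complement of an 8-bit RGB colour: invert each channel."""
    return (255 - r, 255 - g, 255 - b)

# Stare at pure blue, and the afterimage is roughly its complement: yellow.
assert inverse_rgb(0, 0, 255) == (255, 255, 0)
# A red's complement is cyan-ish, which is why a relative lack of blue
# in the surround can push a neutral colour toward looking red.
assert inverse_rgb(255, 0, 0) == (0, 255, 255)
```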



  • That does depend on who is making those memes. Bad-faith propaganda meant to further divide the population targets all sides of every issue, and this is another very divisive one. Any time I see someone making what should be a good point, but in a way that will instead increase resistance, I suspect that’s what’s going on. It’s not a sure way to tell; anyone who has read How to Win Friends and Influence People knows that our instincts about persuasion are bad even before any bad faith is involved.


  • It’s the same mindset that led to using dispersants on the oil spilled by Deepwater Horizon. It’s not about science; it’s about dealing with a problem that has no easy, good solution, so instead of a good solution, something is just done.

    Oil companies probably thought that people would be more resistant to buying oil if it needed special effort to dispose of properly. Maybe they didn’t even have a good way of dealing with it at the time and just hadn’t dumped enough of it yet to realize that it would eventually run down into the water table. Though going by how they handled the realization that burning oil at all was going to have a huge effect on the climate, they likely wouldn’t have cared even if they had known.

    Just as Deepwater Horizon wasn’t an environmental problem for BP but a PR one, they selected solutions that made it look like they were trying, like they shouldn’t be liquidated to fund a real cleanup effort, and like new deepwater oil wells were still worth the risk. Think of all the retirees they’re holding hostage because their funds bought BP stock and derivatives!


  • That’s a part of it. Another part is that it looks for patterns that it can apply in other places, which is how it ends up hallucinating functions that don’t exist and things like that.

    Like it can see that English has the verbs add, sort, and climb. And it will see a bunch of code that has functions like add(x, y) and sort(list), and it might conclude that there must also be a climb(thing) function, because that follows the pattern of functions being verb(objects). It doesn’t know what code is, or even verbs for that matter. It can generate text explaining them, because such explanations are definitely part of its training, but it understands them the same way a dictionary understands words or an encyclopedia understands the concepts contained within.
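    The verb-pattern hallucination above can be sketched with a toy example (all names here are hypothetical, not any real API):

```python
# A model that has seen add(x, y) and sort(items) used may infer by analogy
# that a climb(thing) function must also exist, even though it never did.
KNOWN_FUNCTIONS = {
    "add": lambda x, y: x + y,
    "sort": lambda items: sorted(items),
}

def call(name, *args):
    """Look up a function by name; fail if it was only ever 'hallucinated'."""
    if name not in KNOWN_FUNCTIONS:
        raise NameError(f"hallucinated function: {name}()")
    return KNOWN_FUNCTIONS[name](*args)

assert call("add", 2, 3) == 5
assert call("sort", [3, 1, 2]) == [1, 2, 3]
# call("climb", "hill") raises NameError: the pattern suggests it exists,
# but the codebase never defined it.
```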



  • Then each QA human will be paired with a second AI that will catch those mistakes the human ignores. And another human will be hired to watch that AI and that human will get an AI assistant to catch their mistakes.

    Eventually they’ll need a rule that you can only communicate with the human/AI directly above you or below you in the chain to avoid meetings with entire countries of people.


  • I think the same about anyone who fears LGBT+ people trying to convert their kids, as if someone could be convinced to be gay rather than just convinced to accept their own sexuality.

    Like, I don’t see any problem with being gay, but it’s not for me. I sometimes think dating would be easier if I were bi, but that’s about as appealing as knowing it would be easier to fill my stomach if I ate sawdust.

    So it’s very telling when someone talks about gays tempting them or that they worry about a gay agenda of turning everyone gay like it’s a realistic possibility.