• 12 Posts
• 539 Comments
Joined 4 years ago
Cake day: May 31st, 2020

  • I guess it becomes more unusual once you’re old enough to buy your own candy. At that point, if you don’t put any effort in, it might come across to some neighbors like you’re freeloading.

    But as others said, if you put on a costume and you’re clearly enjoying the process (maybe you even make it a friend group activity), then it’s easy to believe that you’re doing it for fun. It’s not like you’d get rich off of freeloading candy in any scenario anyway.



  • I’d say I’m primarily a very low-volume gamer: I don’t play a lot of games, and when I do, I don’t play them for long. That certainly makes it easy to look at the news of a game releasing and think, yeah, that’s probably neat, but if I’m buying another game it’d be Undertale or Baba Is You or such, and it definitely doesn’t look as neat as those…



  • I mean, it is good to be empirical about things, but it would fit well with the other evidence we have.

    The warmer air means there’s more energy kicking about in the atmosphere, and to my knowledge we have pretty clear evidence that this causes more extreme weather events. For example, hurricanes become more likely.
    We’ll probably see those on the weather radar and avoid them, but at that point it would be weird to me if lighter winds weren’t also becoming more common in the places we don’t avoid.

    I guess a reduction in turbulence injuries might’ve happened independently, because our instruments for predicting turbulence are getting better, but the turbulence itself would still have become more frequent.




  • I mean, yeah, but that difference is quite crucial.

    People have always wanted to be the top search result without putting effort in, because that brings in ad money.
    But without putting effort in, their articles were generally short, had typos, and there were relatively few of them.

    Now, LLMs let these same people pump out a hundred times as much garbage, consisting of lengthy articles in many languages. And because LLMs are specifically trained to produce human-like text, it’s difficult for search engines to filter out these low-quality results.