• 0 Posts
  • 43 Comments
Joined 6 months ago
Cake day: October 22nd, 2025

  • The result of all this may be catastrophic. Should a worst-case scenario ever occur — a cyberattack, a natural disaster, an internet outage — there may be no human workers left with the skills that once kept food on the shelves.

    Very nerdy of me, but this reminds me of the Stargate SG-1 episode “The Sentinel.” The team travels to a planet whose civilization relies on fully automated technology. The people don’t (normally) have to operate or maintain it, so their society has completely forgotten how. In the episode, one set of antagonists sabotages their defense system, and another set sees the opportunity and invades. The protagonists then have to figure out the defense system and fix it.

    We don’t live in a TV series. There aren’t benevolent outsiders who will swoop down and save our systems in the nick of time when they break down. We’re headed in a bad direction.





  • Determinism means performing the same way every time it is run with the same inputs. It doesn’t mean it follows your mental model of how it should run. The article you cite talks about aggressive compiler optimizations causing unexpected crashes. Unexpected, not unpredictable. The author found the root cause and addressed it. Nothing there was nondeterministic. It was just not what the developer expected, or personally thought was an appropriate implementation, but it performed the same way every time. I think you keyed on the word “randomly” and missed “seemed to,” which completely changes the meaning of the sentence.

    LLMs, by contrast, often behave truly nondeterministically. You can create a fresh session, feed it exactly the same prompt, and get a different output. With dynamic LLM code generation in the pipeline, that unpredictability is poison for producing a quality, maintainable product.
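    The contrast can be sketched in a few lines. This is illustrative only: `deterministic` stands in for ordinary compiled code, and `sampled_completion` is a hypothetical stand-in for temperature-based LLM sampling, not any real model API.

```python
import random

def deterministic(x):
    # Ordinary code: the same input always yields the same output,
    # even when the result surprises the developer.
    return (x * 31 + 7) % 97

assert deterministic(42) == deterministic(42)  # holds on every run

def sampled_completion(prompt):
    # Stand-in for an LLM sampling tokens at nonzero temperature:
    # a fresh, unseeded RNG models a "fresh session", so identical
    # prompts can produce different outputs across runs.
    rng = random.Random()
    tokens = ["alpha", "beta", "gamma", "delta"]
    return prompt + " " + " ".join(rng.choice(tokens) for _ in range(5))

print(sampled_completion("same prompt"))  # output varies run to run
```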


  • It’s a lot harder to perpetuate historical knowledge when you don’t get support from the educational system. The government sets educational standards and subject matter, so it’s not surprising they de-emphasize the record of their own actions against the public they are teaching.

    Universities are more independent (but definitely not completely, and they come with their own set of problems), so students there tend to be more exposed to topics like this. But then you get political movements villainizing universities.


  • Bluesky is one, single platform. It stores the complete data for any given user post in its databases and provides that through its data stream and APIs. This means every different client someone writes has access to all the same data as every other client, because they’re all going through Bluesky. This also means if Bluesky doesn’t support some feature, no clients can either.

    The architecture of the Fediverse is different. Forgetting ActivityPub for a moment, Mastodon is one platform and Pixelfed is another. Each one has its own data model, internal storage architecture, and streams/APIs. Because they were built for different purposes, they support different features. I don’t use either, but I expect there are image-related features in Pixelfed that are simply not possible in a Mastodon client, not because nobody has written a client capable of it, but because Mastodon doesn’t have the internal data storage or API to support it in any client.

    Where ActivityPub comes in is as a unified message language. When a post appears on a platform, that platform has the complete data and translates as much of it as it can into an ActivityPub message to send to other platforms. Some platforms haven’t yet figured out how to pack all of their relevant data into an ActivityPub message, so some data may be lost in the sending. And different platforms may not support storing all the data in a given ActivityPub message they receive, especially if it comes from a feature they don’t provide, so some data may be lost in the receiving.

    Ultimately this means that even with ActivityPub linking things together, the data flow isn’t perfect or complete. Different data is available to even a theoretical ideal Mastodon client than to a Pixelfed client, because the backend platforms are different. Their APIs expose different data in different, often incompatible ways, so even if someone wrote an image-focused client for Mastodon, it couldn’t do everything an image-focused client for Pixelfed can, because the backend platforms focus on different things.
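    The lossy two-step translation described above (pack on send, filter on receive) can be sketched like this. The field names are purely illustrative, not the real Mastodon or Pixelfed schemas, and the ActivityStreams object is simplified.

```python
# A Pixelfed-style internal post with image-specific metadata.
pixelfed_post = {
    "text": "Sunset over the bay",
    "image_url": "https://example.com/sunset.jpg",
    "camera_exif": {"iso": 100, "f_stop": 1.8},  # sender-only feature
    "album_id": "vacation-2025",                 # sender-only feature
}

def to_activitypub(post):
    """Sending side: pack what fits into a Note-like ActivityStreams object.
    Fields with no standard slot (EXIF, album) are lost in the sending."""
    return {
        "type": "Note",
        "content": post["text"],
        "attachment": [{"type": "Image", "url": post["image_url"]}],
    }

def receiving_platform_ingest(message):
    """Receiving side: keep only fields the local data model supports.
    Anything the platform has no feature for is lost in the receiving."""
    supported = {"type", "content", "attachment"}
    return {k: v for k, v in message.items() if k in supported}

stored = receiving_platform_ingest(to_activitypub(pixelfed_post))
# "camera_exif" and "album_id" never arrive at the receiving platform,
# so no client of that platform can display them, however well written.
```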





  • I want to start by saying I generally agree with the theme of the article that the average American already gets enough protein without needing to specifically target it in fast food. However, I think this is not entirely accurate:

    Overall recommendations have consistently hovered between 50-70 grams [of protein] per day, depending on weight.

    That sounds low to me. I’ve seen nutritionist-recommended minimums in the 50-70 gram range depending on weight, height, gender, and age, but recommended targets are higher. Especially for older men, who are at higher risk of muscle loss with age, these targets can be above 90 grams.

    Edit: I’m getting several downvotes, so let’s add some sources.

    0.8g per kg of weight, which comes out to about 55g per day for a 150 lb person, is a minimum, not an average: https://doi.org/10.3945/an.116.013821

    Aging men may need to consume as much as 2g per kg of weight, which comes out to about 135g for a 150 lb man: https://doi.org/10.3390/nu10030360
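    As a back-of-the-envelope check, the per-kilogram figures above convert like this (just unit conversion and multiplication; the 150 lb example weight matches the numbers in the comment):

```python
LB_TO_KG = 0.45359237  # exact pound-to-kilogram conversion factor

def protein_grams_per_day(weight_lb, grams_per_kg):
    """Convert a g-per-kg recommendation into grams per day for a given weight."""
    return weight_lb * LB_TO_KG * grams_per_kg

minimum = protein_grams_per_day(150, 0.8)       # RDA-style floor
aging_target = protein_grams_per_day(150, 2.0)  # upper figure for aging men

print(round(minimum))       # prints 54 (the "about 55 g" figure)
print(round(aging_target))  # prints 136 (the "about 135 g" figure)
```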