  • I hear you that it’s tiring and intimidating dealing with fascists. That said, I don’t think it’s factual to say they only need to win once, and believing so creates a strategic disadvantage.

    Factually, World War 2 is the classic example of fascists needing to win continually and being unable to do it. The Nazis had a good showing in an election, Hitler was made chancellor, and then they used that foot in the door to take over the government and seize many countries. But they lost in the end, and that was a result of resistance: not just military resistance, but the sum of every individual act of opposition.

    There’s a concept of anticipatory obedience. Corporations and local governments sometimes fell over themselves to do what they thought the fascist government would ask before the actual ask. Even if Trump seized power, that wouldn’t be the end. They need us to cooperate. And by resisting in a concrete way (not just #resist posting of course) we will stop fascism.

    It’s never over. Fascism is destined to lose. It’s a question of how much suffering and injustice we can avoid by defeating it sooner.

    And believing, as they want us to believe, that it’s all over is a strategic disadvantage. If we believe we’re beaten or that victory is impossible, we’ll act that way. Believe that we can win, and spread that belief, and we’ll act that way.


  • Is there some Linux equivalent to “ctrl + alt + del”? I get that killing a process from the terminal is preferred, but one of the few things I like about Windows is that if the GUI freezes up, I can pretty much always kill the process by pressing ctrl+alt+del and finding it in Task Manager. On Linux, if I don’t already have a terminal open, there are plenty of times I just force-restart the computer because I don’t know what else to do.
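
    For reference, here’s a minimal sketch of the kill-it-by-name step that the terminal route usually boils down to. It’s only an illustration: the process name “frozen-app” is made up, and it assumes the third-party psutil package is installed.

      # Hypothetical sketch: find a frozen GUI process by name and ask it to exit.
      # "frozen-app" is a made-up name; requires the psutil package.
      import signal
      import psutil

      for proc in psutil.process_iter(["name"]):
          if proc.info["name"] == "frozen-app":
              proc.send_signal(signal.SIGTERM)   # polite request to exit
              # proc.kill()                      # SIGKILL, if SIGTERM is ignored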


  • You’re right, cameras can be tricked. As Descartes pointed out, there’s very little we can truly be sure of besides the fact that we ourselves exist. And I think deepfakes are going to be a pretty challenging development for being confident about lots of things.

    I could imagine something like photographers with a news agency using cameras that generate cryptographically signed photos, to ward off claims that newsworthy events are fake. It would place a higher burden on naysayers, and it would also become a story in itself if it could be shown that a signed photo had been faked: it would become a cause for further investigation, and it would threaten the news agency’s reputation.

    Going further, I think one way we might trust people we aren’t personally standing in front of would be a cryptographic circle of trust. I “sign” that I know and trust my close circle of friends, and they all do the same. When someone posts something online, I could see “oh, this person is a second-degree connection, that seems fairly likely to be true” vs “this is a really crazy story if true, but I have no second-, third-, or fourth-degree connections with them, so it needs further investigation.” (There’s a rough sketch of this degree-of-connection idea at the end of this comment.)

    I’m not saying any of this will happen, just that it’s potentially a way to deal with uncertainty from AI content.
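
    As for the degree-of-connection idea mentioned above, here’s a rough sketch of how it could be computed over a “who signed whom” trust graph. The names and graph are made up purely for illustration.

      # Hypothetical sketch: breadth-first search over a signed-trust graph
      # to find how many hops separate me from the person who posted.
      from collections import deque

      # Made-up graph: each person lists the people whose keys they have signed.
      trust_graph = {
          "me": ["alice", "bob"],
          "alice": ["carol"],
          "bob": ["dave"],
          "carol": ["poster"],
          "dave": [],
      }

      def degree_of_connection(graph, start, target):
          """Return the number of hops from start to target, or None if unconnected."""
          seen = {start}
          queue = deque([(start, 0)])
          while queue:
              person, depth = queue.popleft()
              if person == target:
                  return depth
              for friend in graph.get(person, []):
                  if friend not in seen:
                      seen.add(friend)
                      queue.append((friend, depth + 1))
          return None

      print(degree_of_connection(trust_graph, "me", "poster"))  # -> 3, a third-degree connection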


  • Well, as I said, I think there’s a collection of things we already use for judging what’s true; this would just be one more tool.

    A cryptographic signature (in the original sense, not just the Bitcoin sense) means that only someone who possesses a certain digital key is able to sign something. In the case of a digitally signed photo, it verifies “hey, I, the key holder, am signing this file”. And if the file is edited, the signature won’t match the tampered version. (There’s a rough sketch of that sign-and-verify flow at the end of this comment.)

    Is it possible someone could hack and steal such a key? Yes. We see this with certificates for websites, where some bad actor is able to impersonate a trusted website. (And of course when NFT holders get their apes stolen.)

    But if something like that happened, it would be a cause for investigation, and it would leave a trail which authorities could look into. Not perfect, but right now there’s not even a starting point for “did this image come from somewhere real?”
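
    And here’s the rough sketch of the sign-and-verify flow mentioned above, using the third-party Python cryptography package with Ed25519 keys. The file name “photo.jpg” is made up, and a real signed-camera scheme would involve more (key provisioning, timestamps, metadata).

      # Sketch: sign a file with a private key, verify it with the public key,
      # and show that verification fails once the file is edited.
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      private_key = Ed25519PrivateKey.generate()   # held only by the key holder
      public_key = private_key.public_key()        # published so anyone can verify

      photo = open("photo.jpg", "rb").read()       # made-up file name
      signature = private_key.sign(photo)          # "I, key holder, am signing this file"

      public_key.verify(signature, photo)          # untouched file: verification succeeds

      tampered = photo[:-1] + b"\x00"              # edit a single byte
      try:
          public_key.verify(signature, tampered)
      except InvalidSignature:
          print("File does not match the signature - edited after signing.")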


  • In this case, digitally signing an image verifies that the image was generated by a specific camera (not just any camera of that brand) and that the image generated by that camera looks such and such a way. If anyone further edits the image, the hash won’t match the one in the signature, so it will be apparent that it was tampered with. (There’s a tiny sketch of that hash check at the end of this comment.)

    What it can’t do is tell you if someone pasted a printout of some false image over the lens, or in some other sophisticated way presented a doctored scene to the camera. But nothing prevents that kind of trickery today either.

    The question was about deepfakes, right? So this is one tool to address that, but certainly not the only one the legal system would want to use.
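
    And as noted above, here’s a tiny sketch of that hash check using Python’s standard hashlib; the file name is made up.

      # Sketch: a single edited byte changes the hash, so it no longer matches
      # whatever hash was recorded in the camera's signature.
      import hashlib

      original = open("photo.jpg", "rb").read()    # made-up file name
      edited = original[:-1] + b"\x00"             # simulate a one-byte edit

      print(hashlib.sha256(original).hexdigest())  # the hash the camera signed
      print(hashlib.sha256(edited).hexdigest())    # different -> tampering is apparent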