• 0 Posts
  • 123 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • I thought the SGP was generally a respectable party, but in their motion they literally just say “Yep, accept whatever Israel puts on this piece of paper.” Isn’t that beyond absurd when it comes to designating people and organizations as terrorists? That’s what we have our own intelligence services for, to substantiate those kinds of decisions — not to blindly copy them from a country that has a gigantic conflict of interest in silencing criticism and takes human rights far less seriously.

    Yes, what happened in Amsterdam is far from acceptable. But let’s not pretend there aren’t perfectly sane people who are far from happy with Israel’s way of doing things.

    In my eyes this is no different from Erdogan asking us, or Germany, to convict or extradite someone because they said some mean words about him online. Come with evidence that it’s illegal here, and then we can talk further; until then, get lost with your unfounded interference.



  • People differentiate AI (the technology) from AI (the product being peddled by big corporations) without making that nuance clear (or they mean just LLMs, or they aren’t even aware the technology has grassroots adoption outside of those big corporations). It will take time, and the bubble bursting might very well be a good thing for the technology going forward. If something is only known for its capitalistic exploits, it will continue to be seen unfavorably even after it has proven its value to those who care to look at it with an open mind. I read it mostly as those people rejoicing over big corporations getting shafted for their greedy practices.



  • Stop trying to force your interpretation on my words; it’s not what I said, period. I’m not limiting my scope to two choices. The US constitution does that for the matter of which party is in office. There are very obvious other choices, and most of them call for massive human suffering like civil war or political violence, which I’m not going to elaborate on for obvious reasons. Nowhere do I deny the existence of those choices; I’m just presenting the obvious conclusion of trying to change the system in a peaceful manner.




  • You can blame both, honestly. The US has always had the same political game as ever; people should be wise enough to understand how to play it. If you ever want to get to a more stable democracy, one that ditches the stupid two-party system that prevents any form of real representative democracy and actually lets you choose from a selection of parties that represent you, the choice should be obvious.

    At least with Harris they could try to work with her and convince her side to change their views for the future as they governed. Trump will call you a left-wing lunatic and slam the door in your face. Zero influence and no chance of progress (maybe even regression) versus some influence and some chance of progress.



  • First of all, I understand your point of view. And I’ve watched artists being undervalued, as your potential client did to you, for decades, before AI was even a thing. So I definitely feel you on that point, and I wish it were different. That said, here’s my response. (It’s a bit long, so I put it in spoiler tags)

    I told him he wasn’t looking for a composer, but rather a programmer or something

    spoiler

    Yes, but maybe also no. Do you use computer software to compose or assist you in composing? Like FL Studio, Audacity? Or maybe you use a microphone to record the played version of your composition?

    I know maybe one or two composers, and they wouldn’t go without those tools while I worked with them. But I’m sure you can agree that using them does not make you a programmer. It just takes a composer with a more technical mindset and experience with those tools. I don’t deny there are composers that do without, and maybe you are one of them. If so, rock on, but I’m sure you can see that using computer tools does not stop you from being a composer; it enhances your work. Now if you were to never learn anything about composing and just use AI blindly, then I would agree with you.

    But AI used in that manner is no different, and like those other pieces of software it still requires expertise to make something actually good. However, judging from the way your client spoke to you, I think the issue wasn’t that you weren’t making good music; it’s that your music was too expensive for the value he wanted to derive from it. That’s sadly how the free market goes, and I agree that it has disproportionately screwed over artists, because their work gets systematically undervalued. However, AI is not the cause of that; it merely made it more apparent, and it will not stop with the next thing after AI. The only real fix is tackling the root cause: giving artists better protections that don’t end up empowering the same people who undervalue them. That is quite nuanced to get right, and the current system already makes it worse. This is what I fight for instead.

    _

    I could also tell you about the written assignments that students hand in, for which I can identify in less than 30 seconds which ones have been produced by AI (students overestimate their writing skills; it’s often laughable).

    spoiler

    Students are probably the worst example of this, though, because that’s basically what students were known for before AI was even a thing. The average student has no conception or feeling yet for what has artistic value, and most will not go into creative fields. Students used to hand in fully plagiarized works they just downloaded or took from other students, and that is indeed laughable for anyone who actually wants to make it somewhere in their field. So yes, if that’s the majority of AI-produced work you’ve encountered, I can totally understand your point of view, but I implore you to broaden your horizon to people who actually work in the field — those who have already built up the artistic mindset.

    _

    As I tell them, those who have used chatgpt have “learned” to use AI, those who have done the work have learned to carry out research, to synthesize their ideas and to structure, articulate and present them.

    spoiler

    But these people have not learned how to use AI proficiently, only very shallowly. They have learned how to be lazy. Which, mind you, is the same laziness you learn from plagiarizing directly. This has literally been the reality of people growing up for the entirety of human existence. You’re right that the ones who did go through the effort learned more, but that does not mean they could not also benefit from enhancing that process with other tools. And you wouldn’t even know the ones that did, because they will not hand in something that looks like it came directly out of ChatGPT. They might have only used it for brainstorming, or proofreading, or to make a boring passage more entertaining. Someone who understands why their own effort and sense of ownership matter would never hand in something they had zero say in; that’s what lazy people do. And we have no shortage of those.

    A small subset of your students will go the extra mile, and realize that they need to get better themselves to produce things with more artistic value. They too will see what AI can help them with, and what it can’t. Some students that are lazy now will eventually see the light too, and realize that they’re lagging behind. That’s life - maturity takes time to develop.

    But just because lazy people can play the guitar by randomly strumming the strings doesn’t mean a competent guitar player can’t create an incredibly intricate banger with the same guitar. AI is no different.

    _

    One last thing. As far as innovation is concerned, AI can endlessly produce pieces that sound like Bach, but it took Bach to exist in the first place, and Glenn Gould to revolutionize the interpretation of his scores for this to be possible.

    spoiler

    You’re right that AI requires existing material. But you said it yourself: Glenn Gould would not have been able to make his work without Bach. And likewise, Bach had inspirations of his own; Bach as we know him would not exist without them. And if paper did not exist, Bach could not have written down his pieces for us to remember and learn from now. In the same way, artists of any kind in the future will not exist without their influences and tools, of which AI could be one.

    AI can indeed produce endless pieces that sound like Bach, but only a human could use AI to produce a piece that evokes feelings, passion, thoughts - anything to be considered real art. A machine cannot produce the true definition of art on its own, but it can be invoked by an artist to do work in furtherance of their art. Because it takes a creative mind to spot, transform, extend, and also know when to discard, what an AI has produced. Just like we discard sources we perceive as low in value, and take sources that are high in value as inspiration.

    _

    EDIT: Just want to add to this:

    I have no interest in replacing this practice by entering prompts into an algorithm, even if I could make easy money from it.

    That’s not something anyone should do. Because that’s not using it as a tool. That’s making it the entire process. That’s not the kind of AI usage I’m advocating for either. And you’re free to forego AI completely. Just like there are probably some instruments you never use, or some genre you never visit. I don’t like taking the easy way either, that’s why I make creative stuff as a living too. If I just wanted money I would go elsewhere too.



  • Most things produced by AI or assisted by AI are still human creations, as it requires a human to guide the tool toward what it’s making. Human innovation is also very much based on mixing material we’ve seen before in new creative manners. Almost no material is truly innovative. Ask any honest artist about their inspirations and they can tell you which parts of their creations were inspired by what. Our world has explored the depths of most art forms, so there is more than a lifetime’s worth of art to mix and match. Often the real reason things feel fresh and new is because they are fresh and new to us, but they already existed in some form out there before they came to our attention.

    That AI can match this is easily proven by the fact that AI can create material no human would realistically make (like AI-generated QR codes, or ‘cursed’ AI), very proficient style mixing that would take a human extensive study of both styles to pull off (eg. Pokemon and real life), or real-looking images that could not realistically, financially, or conscionably be made using normal methods (eg. a bus full of Greek marble statues).

    Nobody is saying you have to like AI art, and depending on your perspective, some or most of it will still be really low effort and not worth paying attention to, but that was already the state of art before AI. Lifetimes of art are being uploaded every day, but nobody has the time to view it all. So I would really keep an open mind that good AI art and AI assisted art exists out there, and you might one day come to like it but not realize you’re seeing it, because good AI usage is indistinguishable from normal art.


  • This kind of AI approaches art in a way that finally kinda makes sense for my brain, so it’s frustrating seeing it shot down by people who don’t actually understand it. Stop using this stuff for tasks it wasn’t meant for (unless it’s a novelty “because we could” kind of way) and it becomes a lot more palatable.

    Preach! I’m surprised to hear it works for people with aphantasia too, and that’s awesome. I personally have a very vivid mind’s eye and can often already imagine what I want something to look like, but I could never put it to paper in a satisfying way that didn’t cost an excruciating amount of time. GenAI allows me to do that, still with a decent amount of touch-up work, but in a much more reasonable timeframe. I’m making more creative work than I ever have because of it.

    It’s crazy to me that some people at times completely refuse to even acknowledge such positives about the technology, refuse to interact with it in a way that would reveal those positives, refuse to look at the more nuanced opinions of people who did interact with it, refuse even simple facts about how we learn from and interact with other art and material, and refuse legal realities like the freedom to analyze that allows this technology to exist (sometimes even actively fighting to restrict those legal freedoms, which would hurt more artists and creatives than it would help, and give even more power to corporations and those with enough capital to sustain AI model creation themselves).

    It’s tiring, but luckily it seems to be mostly an issue on the internet. Talking to people (including artists) in real life about it shows that it’s a very tiny fraction that holds that opinion. Keep creating 👍


  • Totally second the latter part - it’s the self destructive nature of being blindly anti-AI. Pretty much everyone would support giving more rights and benefits to people displaced by AI, but only a fraction of that group would support an anti-AI mentality. If you want to work against the negative effects of AI in a way that can actually change things, the solution is not to push against the wall closing in on you, but to find the escape.


  • Yeah, and honestly, this is largely a reasonable standard for anyone running an email server. If you don’t have SPF, DKIM and DMARC, basically anyone can spoof your emails and you’d be none the wiser. It also makes spam much harder to send without, well, sacrificing IP addresses to the many spam lists. I wouldn’t be surprised if some people setting up their own mail server were only made aware of these things by being blocked.
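    A quick way to see what those records actually look like: SPF and DMARC policies are just DNS TXT records published on your domain. Below is a minimal Python sketch (the record strings are made-up examples, not real lookups) that parses the two fields receivers care about most:

```python
# Minimal sketch: parsing the DNS TXT records that SPF and DMARC
# policies are published in. Real mail servers fetch these via DNS;
# here we just parse example record strings.

def parse_dmarc(record: str) -> dict:
    """Parse a DMARC TXT record like 'v=DMARC1; p=reject; ...' into tags."""
    tags = {}
    for part in record.split(";"):
        key, sep, value = part.strip().partition("=")
        if sep:
            tags[key.strip()] = value.strip()
    return tags

def spf_policy(record: str) -> str:
    """Return the 'all' qualifier of an SPF record:
    -all = hard fail, ~all = soft fail, ?all = neutral, +all = pass anything."""
    for mech in record.split():
        if mech.endswith("all"):
            return {"-": "fail", "~": "softfail", "?": "neutral"}.get(mech[0], "pass")
    return "none"

print(parse_dmarc("v=DMARC1; p=reject; rua=mailto:dmarc@example.com")["p"])  # reject
print(spf_policy("v=spf1 mx a:mail.example.com -all"))  # fail
```

    A domain whose SPF ends in `-all` and whose DMARC policy is `p=reject` is telling receivers to drop anything that fails authentication, which is exactly why spoofing such a domain gets mail binned immediately.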


  • There is so much wrong with this…

    AI is a range of technologies. So yes, you can build surveillance with it, just like you can write a computer program that’s a virus. But obviously not all computer programs are viruses or exist for surveillance. What a weird generalization. AI is used extensively in medical research, so your life might literally be saved by it one day.

    You’re most likely talking about “Chat Control”, a controversial EU proposal to scan, either on people’s devices or on the providers’ end, for dangerous and illegal content like CSAM. This is obviously a dystopian way to achieve that, as it sacrifices literally everyone’s privacy to do it, and there is plenty to be said about that without randomly dragging AI into it. You can do this scanning without AI as well, and that doesn’t change anything about how dystopian it would be.
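    To illustrate that last point: the classic non-AI way this kind of scanning is done is plain hash matching, comparing a file’s fingerprint against a database of known illegal material. A minimal sketch (the blocklist entry is obviously made up):

```python
# Hash-based scanning sketch: no AI involved, just comparing a file's
# cryptographic fingerprint against a set of known-bad hashes.
import hashlib

# In reality this set is a large database maintained by authorities;
# here it's a single made-up entry.
blocklist = {hashlib.sha256(b"known-bad-file-bytes").hexdigest()}

def is_flagged(data: bytes) -> bool:
    """True if this exact content appears on the blocklist."""
    return hashlib.sha256(data).hexdigest() in blocklist

print(is_flagged(b"known-bad-file-bytes"))    # True
print(is_flagged(b"harmless vacation photo")) # False
```

    Real deployments use perceptual hashes so that slightly altered copies still match, but the privacy problem is identical either way: every private message has to be inspected.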

    You should be using end-to-end encryption regardless, and a VPN is a good investment for making your traffic harder to discern, but if Chat Control is passed to operate at the device level you are kind of boned without circumventing that software, which would potentially be outlawed or made very difficult. It’s clear on its own that Chat Control is a bad thing; you don’t need some kind of conspiracy theory about ‘the true purpose of AI’ to see that.


  • You don’t solve a dystopia by adding more dystopian elements. Yes, some companies are scum and should rightfully be targeted and taken down. But the way you do that is by targeting those scummy companies specifically, and creatives aren’t the only industry suffering from them. There is broad-spectrum legislation to do so, such as income-based measures (proportional taxing and fining) or further regulation. But you don’t do it by changing fundamental rights that every artist so far has enjoyed to learn their craft, and that made society what it is today. Your idea would KILL scientific progress, because almost all of it depends on for-profit businesses (not per se the scummy ones) and on the freedom to analyze works without a license (something you seem to want to get rid of), and the vast majority of that analysis is computer-driven. You are arguing in favor of shooting yourself in the foot if it means “owning the ~~libs~~ big companies”, when there are clearly better solutions, and guess what, we already have pretty bad luck getting those things passed as is.

    And you think most artists and creatives don’t see this? Most of us are honest about the fact of how we got to where we are, because we’ve learned how to create and grow our skill set this same way. By consuming (and so, analyzing) a lot of media, and looking a whole lot at other people making things. There’s a reason “good artists copy, great artists steal” is such a known line, and I’d argue against it because I feel it frames even something like taking inspiration as theft, but it’s the same argument people are making in reverse for AI.

    But this whole conversation shouldn’t be about the big companies; it should be about the small ones. If you’re not in the industry you might just not know that AI is everywhere in small companies too. And they’re not using the big companies’ offerings if they can help it. There’s open source AI that’s free to download and use, and that holds true to open information everyone can benefit from. Pretending those don’t exist and proposing an unreasonable ban on the means denies a future to those without the capital and ability to build their own (licensed) datasets, while those with the means have no problem, and can even leverage their own licenses far more efficiently than any small company or individual could. And if AI does get too good to ignore, there will be the artists who learned how to use AI, forced to work for corporations, and the ones who didn’t and can’t compete. So far it’s only been optional, since using AI well is actually quite hard, and only dumb CEOs would trust it to replace a human. It will speed up your workflow and make certain tasks faster, but it doesn’t replace work in large pieces unless you’re really just making the most generic stuff ever for a living, like marketing material.

    Never heard of Cara. I don’t doubt it exists somewhere, but I’m wholly uninterested in it or in putting any work I make there. I will fight tooth and nail for what I made to be mine and to allow me to profit off it, but I’m not going to argue for taking away from others the freedom that allowed me to become who I am, or the freedom of people to make art in any way they like. Freedom of expression is sacred to me. I will support other, more broadly appealing and far more likely to succeed alternatives that will put these companies in their place, and anything sensible that doesn’t also cause casualties elsewhere. But I’m not going to be the “freedom of expression police” against my colleagues and friends, or anyone for that matter, over what tools they can or cannot use to funnel their creativity into. That is a downright insidious mentality in my eyes, and so far most people I’ve had a good talk about AI with have shared that distaste, while agreeing that it is being abused by big companies.

    Again, they can use whatever they want, but Nightshade (and Glaze) are not proven to be effective, in case you didn’t know. They rely on misunderstandings, hypothetically only work under extremely favorable conditions, and assume the people collecting the dataset are really, really dumb. That’s why I call it snake oil. And it’s not just me saying exactly this.


  • If you think I’m being optimistic about UBI, I can only question how optimistic you are about your own position receiving widespread support. So far not even most artists stand behind anti-AI standpoints; it’s just a very vocal minority and their supporters, who even threaten and bully other artists that don’t share their views.

    It’s not about “analysis” but about for-profit use. Public domain still falls under Fair Use.

    I really don’t know what you’re trying to say here. Public domain works are free of any copyright, so you don’t need a fair use exemption to use them at all. And for-profit use is not a factor in whether analysis is allowed. Even if it were, it would stagnate society’s ability to invent and advance, since the most frequent use is for profit. And even then, one company could produce the dataset or the model as a non-profit, and another company could use it for profit. It doesn’t hold up.

    As it stands, artists are already forming their own walled off communities to isolate their work from being publicly available

    If you want to avoid being trained on by AI, that’s a pretty good way to do it yes. It can also be combined with payment. So if that helps artists, I’m all for it. But I have yet to hear any of that from the artists I know, nor seen a single practical example of it that wasn’t already explicitly private (eg. commissions or a patreon). Most artists make their work to be seen, and that has always meant accepting that someone might take your work and be inspired by it. My ideas have been stolen blatantly, and I cannot do a thing about it. That is the compromise we make between creative freedom and ownership, since the alternative would be disastrous. Even if people pay for access, once they’ve done so they can still analyze and learn from it. But yes, if you don’t want your ideas to be copied, never sharing it is a sure way to do that, but that is antithetical to why most people make art to begin with.

    creating software to poison LLMs.

    These tools are horribly ineffective though. They waste artists’ time and/or degrade the artwork to the point humans don’t enjoy it either. It’s an artist’s right to use them, but they’re essentially snake oil that plays on artists’ fears of AI. But that’s a whole other discussion.

    So either art becomes largely inaccessible to the public, or some form of horrible copyright action is taken because those are the only options available to artists.

    I really think you are being unrealistic and hyperbolic here. Neither of these has happened, nor has much of a chance of happening. There are billions of people producing works that could be considered art, and with making art comes the desire to share it. Sure, there might only be millions that make great art, but if they mobilized together it would be world news, when a workers’ strike in Hollywood involving a significantly smaller number of artists can manage that.

    Ultimately, I’d like a licensing system put in place. Academics have to cite their sources for research. That way, if they’ve used stuff that they legally shouldn’t, it can be proven.

    The reason we have sources in research is not licensing; it is to support legitimacy and to build upon the work of others. I wouldn’t be against sourcing, but it’s a moot point because companies that make AI models don’t typically throw their datasets out there. So those datasets might very well be sourced. One well-known public dataset, LAION-5B, does include source URLs. But again, because analysis can be performed freely, this is not a requirement.

    Creating a requirement to license data for analysis is what you are arguing for here. I can already hear every large corporation salivating in the back at the idea. Every creator in existence would have to pay a license to some big company because they interacted with their works at some point in their life and something they made looked somewhat similar. And copyright is already far more a tool for big corporations than for small creators. This is a dystopian future to desire.


  • I think you are making the mistake of assuming disagreement with your stance means someone would say no to these questions. Simply put - it’s a strawman.

    Most (yes, even corporations, albeit much less so the larger ones) would say “yes” to this question at face value, because they would want the same for their own “sweat of the brow”. But certain uses after the work is created no longer have a definitive “yes” as their answer, which is why your ‘simple question’ is not an accurate representation: it draws no distinction between those uses. You cannot stop your publicly posted work from being analyzed, by human or computer. This is firmly established. As others have put it in this thread, restricting analysis would be detrimental to artists and everyone else alike. It would quite literally slow society’s ability to advance, if not halt it completely, since most research requires analysis of existing data, and most of that is computer-assisted.

    Artists have always been undervalued, I will give you that. But to mitigate that, we should give artists better protections that don’t rely on breaking down other freedoms - for example, UBI. I wish people who are against AI would focus on that, since it is something most of society could actually agree on, and it would actually help artists. Fighting against a technology that, besides its negatives, also provides great positives is a losing battle.


  • You’re confusing LLMs with other AI models; LLMs are orders of magnitude more energy-demanding than most other AI. It’s easy to see why if you’ve ever looked at self-hosting AI: you need a cluster of top-of-the-line datacenter GPUs to run modern LLMs, while an image generator can run at home on most consumer 3000- or 4000-series Nvidia GPUs. Generating an image costs about as much energy as playing a modern video game, and only while it’s generating.
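    For a rough sense of that scale difference, weight memory alone (parameters times bytes per parameter, ignoring activations and caches) already tells the story. The parameter counts below are ballpark assumptions, not exact figures:

```python
# Back-of-the-envelope sketch: memory needed just to hold model weights.
# fp16 = 2 bytes per parameter; activations and KV cache come on top.

def weight_vram_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """GB of memory required to hold the weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1e9

llm_gb = weight_vram_gb(70)     # a 70B-parameter LLM (ballpark)
image_gb = weight_vram_gb(3.5)  # an SDXL-class image model (~3.5B params)

print(f"70B LLM weights:     ~{llm_gb:.0f} GB")    # ~140 GB -> multiple datacenter GPUs
print(f"Image model weights: ~{image_gb:.0f} GB")  # ~7 GB  -> fits one consumer card
```

    Quantization shrinks both, but the ratio stays: an LLM that needs a GPU cluster at fp16 still needs tens of GB at 4-bit, while the image model drops to a few.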