• 0 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • First of all, I understand your point of view. I’ve watched artists like your potential client be undervalued for decades, since before AI was even a thing. So I definitely feel you on that point, and I wish it were different. That said, here’s my response. (It’s a bit long, so I put it in spoiler tags.)

    I told him he wasn’t looking for a composer, but rather a programmer or something


    Yes, but maybe also no. Do you use computer software to compose, or to assist you in composing, like FL Studio or Audacity? Or maybe you use a microphone to record the played version of your composition?

    I know maybe one or two composers, and while I worked with them they wouldn’t go without those tools. But I’m sure you can agree that using those things does not make you a programmer; it just takes a composer with a more technical mindset and experience with those tools. I don’t deny there are composers who do without them, and maybe you are one of them. If so, rock on, but I’m sure you can see that using computer tools does not stop you from being a composer, it just enhances your work. Now, if you were to never learn anything about composing and just use AI blindly, then I would agree with you.

    But AI in that manner is no different, and like those other pieces of software it still requires expertise to make something actually good. However, judging from the way your client spoke to you, I think the issue wasn’t that you weren’t making good music; it’s that your music was too expensive for the value he wanted to derive from it. That’s sadly how the free market goes, and I agree that it has disproportionately screwed over artists, because their work gets systematically undervalued. However, AI is not the cause of that; it merely made it more apparent, and it will not stop with the next thing after AI unless we tackle the root cause by giving artists better protections that don’t end up empowering the same people who undervalue them. That is really quite nuanced to get right, and the system we currently have already makes things worse. This is what I fight for instead.

    _

    I could also tell you about the written assignments that students hand in, and for which I can identify in less than 30 seconds which ones have been produced by AI (students overestimate their writing skills, it’s often laughable).


    Students are probably the worst example of this, though, because that’s basically what students were known for before AI was even a thing. The average student has no sense yet of what has artistic value and what doesn’t, and most will not go into creative fields. Students used to hand in fully plagiarized works they simply downloaded or took from other students, and that is indeed laughable to anyone who actually wants to make it somewhere in their field. So yes, if that’s the majority of AI-produced work you’ve encountered, I can totally understand your point of view, but I implore you to broaden your horizons to people who actually work in the field - those who have already built up the artistic mindset.

    _

    As I tell them, those who have used ChatGPT have “learned” to use AI; those who have done the work have learned to carry out research, to synthesize their ideas, and to structure, articulate and present them.


    But those people have not learned how to use AI proficiently, only very shallowly. They have learned how to be lazy - which, mind you, is the same laziness you learn from plagiarizing directly. This has been the reality of people growing up for the entirety of human existence. You’re right that the ones who did put in the effort learned more, but that does not mean they could not also benefit from enhancing that process with other tools. And you wouldn’t even know which ones did, because they will not hand in something that looks like it came straight out of ChatGPT. They might have used it only for brainstorming, or proofreading, or to make a boring passage more entertaining. Someone who understands why their own effort and sense of ownership matter would never hand in something they had zero say in; that’s what lazy people do, and we have no shortage of those.

    A small subset of your students will go the extra mile and realize that they need to get better themselves to produce things with more artistic value. They too will see what AI can help them with and what it can’t. Some students who are lazy now will eventually see the light too, and realize that they’re lagging behind. That’s life - maturity takes time to develop.

    But just because lazy people can play the guitar by randomly strumming the strings doesn’t mean a competent guitar player can’t create an incredibly intricate banger with the same guitar. AI is no different.

    _

    One last thing. As far as innovation is concerned, AI can endlessly produce pieces that sound like Bach, but it took Bach to exist in the first place, and Glenn Gould to revolutionize the interpretation of his scores for this to be possible.


    You’re right that AI requires existing material. But you said it yourself: Glenn Gould would not have been able to make his work without Bach. And likewise, Bach had his own inspirations; Bach as we know him would not exist without them. And if paper did not exist, Bach could not have written down his pieces for us to remember and learn from now. In the same way, an artist of any kind in the future will not exist without their influences and tools, of which AI could be one.

    AI can indeed produce endless pieces that sound like Bach, but only a human could use AI to produce a piece that evokes feelings, passion, thoughts - anything that qualifies as real art. A machine cannot produce the true definition of art on its own, but it can be invoked by an artist to do work in furtherance of their art, because it takes a creative mind to spot, transform, extend, and also know when to discard, what an AI has produced - just as we discard sources we perceive as low in value and take the high-value ones as inspiration.

    _

    EDIT: Just want to add to this:

    I have no interest in replacing this practice by entering prompts into an algorithm, even if I could make easy money from it.

    That’s not something anyone should do, because that’s not using it as a tool; that’s making it the entire process. That’s not the kind of AI usage I’m advocating for either. And you’re free to forego AI completely, just like there are probably some instruments you never use, or some genre you never visit. I don’t like taking the easy way either; that’s why I make creative stuff for a living too. If I just wanted money, I would go elsewhere.


  • Again - if this is your argument - then the vast majority of things humans do would be “pushing down harder on the gas pedal”. Excluding AI, more people are born every year, and water and electricity usage also increase every year. Even if you got rid of all AI right now, you would have to overcome those much more significant increases to make a difference. It just doesn’t even make a dent - and it has to, if you want it to actually reduce the impact of climate change and resource depletion.

    The world does not stand still, even if we did everything we should to stop climate change. Technology that can change the world and facilitate happier, healthier humans is not a bad thing at a reasonable price. And as I just explained in detail, that price is not that significant in the grand scheme of things - hence there is no significant public outrage over this.

    If you’re going to hold this position, you should really stick to the biggest polluters, which, as you agree, are not getting enough pushback. I agree with that as well, and I would happily stand by your side there. But if someone is handing out pie and you think everyone should be angry at someone taking roughly 0.006% of the pie, that is horribly ineffective at actually getting the change we need, since basically nobody reasonable is going to agree with you. For the largest polluters, on the other hand, it is self-evident that they need pushback, and we still have a lot of trouble even with that.

    And Sam Altman went to the Emirates asking for trillions to scale LLMs. All of this for a little more convenience when tackling mostly mundane tasks.

    I don’t know how many times I have to say this, but I don’t like the big tech companies’ use of AI. That does not say anything about the technology at large, though. Screw OpenAI and Sam Altman. If your criticism is purely aimed at wasteful conduct by big companies, I’m all there with you. But there are so many smaller companies that also use AI and LLMs.



  • I already know the stats you speak of, and they’re not nothing. But as I explain below, on the scale of other pollutants, and for how widespread its usage is, it is a footnote. There are far bigger fish to fry first when it comes to reducing water and energy usage.

    Let’s look up some stats together. Most sources agree that we use about 4 trillion cubic meters of water every year worldwide (although that stat is most likely from 2015, so it will be bigger now). In 2022, using the stats here, Microsoft used 1.7 billion gallons of water and Google 5.56 billion gallons. Combined, that’s only about 27.5 million cubic meters, or roughly 0.0007% of worldwide water usage. Meanwhile, agriculture uses 70%. Granted, not every country uses 70%, but a 1% efficiency gain there overshadows any current and even future AI usage.

    And even if we just look at the US, since that’s where Google and Microsoft are based, which uses 322 billion gallons of water every day, or about 445 billion cubic meters per year, that’s still only about 0.006%. So it’s not hugely significant inside the US either; we could add roughly 160 more Googles and Microsofts before even topping a single percent. (A quick sketch of this arithmetic is at the end of this comment.)

    There are plenty of other hobbies that are also terrible for the environment, but we don’t tackle them because, like AI, they are a drop in the bucket. Honestly ask yourself: if AI were such a big issue and threatened lives, why is nobody actually taking serious measures against it? Why are there no public campaigns to get people to stop using AI to save water? It’s because the stats are concerning, but not alarmingly so, and if the stats don’t actually back up the position that AI usage is slowly sucking us dry, people don’t stand behind it. If the advancement of AI allows us to save more lives through medical research, or allows us to farm more efficiently, it can easily pay back its investment. And those things outweigh the negatives.

    However, as I’ve stated multiple times below, I despise the wasteful usage of AI, and therefore you won’t see me praising Bing or Google for using AI in ways that make no sense and just waste resources. But the technology exists for everyone, not just those companies. There are more sustainable LLMs than GPT-4, and OpenAI can rightfully be criticized for prioritizing bigger, more power-hungry models over efficiency.

    EDIT: Added US comparison. EDIT2: Double-checked some of the math.
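
    For anyone who wants to double-check these numbers, here is a minimal back-of-the-envelope sketch in Python. It assumes US gallons and simply takes the figures quoted above (1.7 and 5.56 billion gallons per year, roughly 4 trillion cubic meters worldwide, and 322 billion gallons per day in the US) as given:

    ```python
    # Rough verification of the water-usage percentages quoted above.
    GALLON_TO_M3 = 0.003785  # one US gallon in cubic meters

    microsoft_gal = 1.7e9   # gallons per year (2022)
    google_gal = 5.56e9     # gallons per year (2022)
    combined_m3 = (microsoft_gal + google_gal) * GALLON_TO_M3

    world_m3 = 4e12                       # ~4 trillion m3 of water used per year worldwide
    us_m3 = 322e9 * 365 * GALLON_TO_M3    # 322 billion gallons/day converted to m3/year

    print(f"Combined Google + Microsoft: {combined_m3 / 1e6:.1f} million m3/year")  # ~27.5
    print(f"Share of worldwide usage: {combined_m3 / world_m3:.5%}")                # ~0.00069%
    print(f"Share of US usage: {combined_m3 / us_m3:.4%}")                          # ~0.0062%
    print(f"Companies this size to reach 1% of US usage: {0.01 * us_m3 / combined_m3:.0f}")  # ~162
    ```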



  • Most things produced by AI or assisted by AI are still human creations, as it takes a human to guide the tool toward what it’s making. Human innovation is also very much based on mixing material we’ve seen before in new, creative ways. Almost no material is truly original. Ask any honest artist about their inspirations and they can tell you which parts of their creations were inspired by what. Our world has explored the depths of most art forms, so there is more than a lifetime’s worth of art to mix and match. Often the real reason things feel fresh and new is that they are fresh and new to us, but they already existed in some form before they came to our attention.

    That AI can match this is easily shown by the fact that AI can create material no human would realistically make (like AI-generated QR codes, or ‘cursed’ AI), very proficient style mixing that would take a human extensive study of both styles to pull off (e.g. Pokemon and real life), or realistic-looking images that could not realistically, financially, or conscionably be made using normal methods (e.g. a bus full of Greek marble statues).

    Nobody is saying you have to like AI art, and depending on your perspective, some or most of it will still be really low-effort and not worth paying attention to - but that was already the state of art before AI. Lifetimes of art are being uploaded every day, yet nobody has the time to view it all. So I would really keep an open mind: good AI art and AI-assisted art exists out there, and you might one day come to like it without realizing you’re seeing it, because good AI usage is indistinguishable from normal art.


  • And Google, as one of the largest companies in the world, should be held fully responsible for that. But Google isn’t everyone. Google also has a huge number of computers under its name, but does that mean everyone with a computer should be held to the same standards because it’s the same technology?

    If sweeping conclusions are going to be made about the technology, it has to be looked at outside of the context of a specific company and how they implement and use AI. Otherwise, it should be specified that this is a criticism of specific companies and how they use AI, and I’d be totally there agreeing with you in the case of Google.


  • The earth has been warming for decades at this point, even before AI. We know what causes climate change, and AI has so far been a footnote on a long list of unsolved problems. The moment that changes, I’ll be right there with you, but I’m far more interested in taking down the companies that are largely responsible for it. By this logic, just living life as a human is unethical, because you can’t be environmentally neutral in today’s society - and I reject that notion.


  • Yes, exactly. And I don’t even disagree with making things better for the environment. It’s why I dislike LLMs being pushed into random things that don’t really need them. And if more efficient models exist, they should preferably be used.

    But using AI to make your life a little better also brings positives that outweigh that cost. And as a society we have much bigger polluters to take care of, ones that use orders of magnitude more electricity and water, and we already struggle massively with just that. So it really feels more like misdirected anger towards the unfairness in society (which the societal cost they mentioned also seems to point to) than a real criticism of AI. And I can understand that anger, but not the mindset behind the conclusions derived from it.


  • This kind of AI approaches art in a way that finally kinda makes sense for my brain, so it’s frustrating seeing it shot down by people who don’t actually understand it. Stop using this stuff for tasks it wasn’t meant for (unless it’s a novelty “because we could” kind of way) and it becomes a lot more palatable.

    Preach! I’m surprised to hear it works for people with aphantasia too, and that’s awesome. I personally have a very vivid mind’s eye and can often already imagine what I want something to look like, but I could never put it to paper in a satisfying way that didn’t cost an excruciating amount of time. GenAI lets me do that, still with a decent amount of touch-up work, but in a much more reasonable timeframe. I’m making more creative work than I ever have because of it.

    It’s crazy to me that some people at times completely refuse to even acknowledge such positives about the technology, refuse to interact with it in a way that would reveal those positives, refuse to look at the more nuanced opinions of people who did interact with it, and refuse to accept even simple facts about how we learn from and interact with other art and material. Some even refuse legal realities like the freedom to analyze, which allows this technology to exist - sometimes actively fighting to restrict those legal freedoms, which would hurt more artists and creatives than it would help and give even more power to corporations and those with enough capital to self-sustain AI model creation.

    It’s tiring, but luckily it seems to be mostly an issue on the internet. Talking to people (including artists) in real life about it shows that it’s a very tiny fraction that holds that opinion. Keep creating 👍


  • Totally second the latter part - it’s the self-destructive nature of being blindly anti-AI. Pretty much everyone would support giving more rights and benefits to people displaced by AI, but only a fraction of that group would support an anti-AI mentality. If you want to work against the negative effects of AI in a way that can actually change things, the solution is not to push against the wall closing in on you, but to find the escape.


  • This is like saying you can’t play video games because they cost electricity and you can go without them. You can say that about literally everything that isn’t strictly necessary to live. AI isn’t just LLMs, and only LLMs have a high environmental cost - and unless you are literally wasting the output like the big tech companies are, even that can be justified for the right reasons.