…LLM users displayed the weakest connectivity. Cognitive activity scaled down in relation to external tool use… LLM users also struggled to accurately quote their own work. While LLMs offer immediate convenience, our findings highlight potential cognitive costs. Over four months, LLM users consistently underperformed at neural, linguistic, and behavioral levels.
Outsourcing thinking from your brain to an AI literally makes you dumber, less confident in the output, and teaches you nothing.
Call me a Luddite or a hater, but if you’re one of the people who uses AI as a shortcut to actual thought or learning, I will judge you and disregard your output and opinions. Form your own basis of understanding and knowledge instead of relying on a teaspoon-deep summary that is frequently incorrect.
They say that, when making an Anki deck, using it is only half the battle because a lot of the learning comes from the act of making it yourself. That advice is older than these LLMs and it really showcases a big reason why they suck. Personally, I haven’t even used autocorrect since 2009.
Being a Luddite, I feel, requires a strict abstinence-only approach. Knowing what is worth off-loading and what is worth doing yourself is just being smart. I’m really glad that I don’t need to know every detail of modern life, but I still take a lot of pride in knowing how quite a lot of it works.
Genuinely the weirdest flex I have ever seen.
Oh for sure, I won’t argue that, but it does explain my point. Even when I use a program with the squiggly red line I correct it myself so that I can reinforce the correct spelling.
Same with using a calculator, no? Or not memorising log tables.
A better comparison would be putting a song on the radio and saying “see, I can produce music.” You still don’t know anything about music production in the end.
Personally, I don’t think that’s a good comparison. I would say it’s more like taking a photo and claiming you know how to paint. You’re still actually creating something, but using a digital tool that does it for you. You choose the subject and fiddle with settings to get an image closer to what you want, and then you can take it into software to edit it further.
It’s art in its own right, but you shouldn’t directly compare it to painting.
Even that is a bad analogy, it’s like commissioning a painter to paint something for you, and then claiming you know how to paint. You told an entity that knows how to do stuff what you wanted, and it gave it to you. Sure, you can ask for tweaks here and there, but in terms of artistic knowledge, you didn’t need any and didn’t provide any, and you didn’t really directly create anything. Taking a decent photo requires more knowledge than generating something on ChatGPT. Not to mention actually being in front of the thing you want a photo of.
I think my analogy is more accurate.
Care to explain? I think your analogy gives the credit of art creation to someone who didn’t create art, and thus is flawed.
I mean, I think I explained myself quite well already, and not to be insulting to you, but I don’t think you’re willing to accept any argument I would make that goes against what you already believe, since your argument against it is simply you asserting your own beliefs (that AI art isn’t art) as an immutable fact.
Oh, I’m not saying AI art isn’t art. It is. I’m just saying that the person writing the prompt didn’t create it, or do anything remotely skilled or artistic to get the result.