Honestly, I don't understand how it can be unethical use if it's referencing millions of different data points, any more than an artist is unethical for studying famous works of a style they want to use, or an author is unethical for looking at how other authors write certain scenes. The vast majority of art is built on what came before; this is just the next technological extension of that. Obviously, if you exclusively train it on a single artist it'll just imitate them, but once you've got several dozen in the mix, can it really be said to be taking enough from any single one to be stealing?
If materials are copyrighted or legally protected, and the AI isn't developed to respect such protections, then yes, it is effectively stealing.
And that is the crux of the issue: where is the line drawn for AI to pull publicly available data that is not legally protected?
Sure, humans have been reinventing the wheel for an extremely long time, and have also taken actions to protect their creations for just as long. AI (and those that created such an AI system) should be held just as accountable for the consequences of using protected data as any human being would be held accountable for such use.
Copyright isn't nearly as clear-cut as all that; or do you think all those fanart Patreons and similar accounts are operating strictly within the bounds of IP laws and fair use? And to a certain degree, the argument that the AI isn't legitimate because it's removing hands-on steps could be leveled at every art program that incorporates time-saving features. Obviously there are differences, but at the same time there's always an element of gatekeeping to the "if a program is doing some of the work, it's not real art" position, at least for a few years until it becomes mostly normalized. I'm not saying the question of copyrights and originality should simply be ignored for AI works, but these absolute declarations that they cannot be anything but plagiarism seem more like knee-jerk reactions than considered positions.
It's not a question of whether the AI sampled enough to qualify as copyright infringement. It's from the other end: did the AI's creator have the rights to sample that content?
If something is in the library for you to read for free, it's because the library was given permission to loan out that material. You were not given permission to use it in an AI program.
So asking the AI to write you a short story in the style of, let's say, H.P. Lovecraft would not be possible, because the AI does not have permission to sample or read his stories.
Actually, Hasbro would be the only legal entity that could use the official D&D material to create an AI with it.
I do not think AI is included in any open source agreements.
And if they do come to an agreement about some third-party AI program, I bet they demand a payment every single time the AI is used. Just think of all the times someone uses the free content of D&DB. Imagine if D&DB had to pay someone for every single use. Every single page reference.
I will say I do not feel there is an ethical use for "AI"; its very nature is to be unethical. JMHO.
What's so fundamental and intrinsic to AI use that it's unethical and can't be changed to make it ethical? Every criticism I've heard has been about how it's used (basically copyright infringement) rather than AI per se. The only thing I can see is that artists etc. might lose their jobs, but while that's a shame and I have sympathy for them, given the number of jobs lost to automation in our lifetimes, it'd be oddly selective to criticise AI for that.
I don't have much to add to what has been posted between our posts.
Well, in point of fact, Lovecraft's works are in the public domain. And I do take your underlying point, but as I previously pointed out, the actual practical application of copyrights and fair use, even for transactional circumstances, has a distinct grey area, if only by precedent. Again, it's not that there's no cause for regulation, but the reality is that sampling from someone else for your own work that you then make money off of is something that does happen on a relatively wide scale. Obviously there are significant differences if this starts coming into play at the corporate level, but in important ways the difference between what happens with AI and what's been happening on various art sites for years and years is just a compression of time.
Plenty of examples IRL, like coin clipping, deposit smurfing, and movies like Superman III, Office Space, et al.
Where is the threshold for theft to become legal? That is the premise, isn't it: as long as you don't steal "too" much at a time, it is OK?
My ethics say theft is theft, no matter whether the theft is detectable or not.
That is the trouble with "ethics": mine are mine and yours are yours, and if ours aren't similar enough, problems arise.
How much sampling does it take to trigger a court settlement?
There is no defined threshold; it's just an element in the four-factor test for fair use. Hashing out what all of that means for generative AI is pretty much guaranteed to produce some interesting lawsuits.
AI means different things to different people. That's a big problem when trying to discuss things that use that phrase.
Models. Algorithms. Deep learning. Neural network. Training data. Generative. Predictive. Etc.: There are the specific technical definitions, and there are the definitions assumed by people who want to discuss a technology they either don't understand, fear, or hope to exploit, without any care for how accurate their assumptions are.
As for ethics, I have a rule-of-thumb:
If you think you can get permission from artists, writers, or performers to have algorithms use their works to generate something they didn't create, even just for free distribution, then get permission.
If you don't think you can get permission for it, then be respectful of the artists', writers', and performers' wishes and don't use their works to generate something they didn't create, even just for free distribution.
Just because someone thinks someone else should be okay with it doesn't immediately make it okay with that other person.
Think of it like music copyright. How much sampling does it take to trigger a court settlement? AI has to be held below that threshold; otherwise the programmers need permission.
Just like when a paper is written and published, all references must be listed and included; otherwise it's called plagiarism.
So as long as a court won't hear the case, it is ethical to steal?
No
A case is filed under specific clauses and reasons. If the judge refuses to try the case, it usually means it was filed for the wrong reasons, or just cited the wrong cases and legal rulings. It does not in any way mean something is legal to do.
The first case will set the tone for the rest of the industry. And Hasbro has millions to defend its IP.
What if AI was being used to create music? These cases would already be in court.
Hasbro (and RPGs in general) are a bit player in this whole dispute; it's going to be artists and publishers against big tech. Universal Music Group sued Anthropic last October. The case is ongoing.
TRIVIA: It has already been done, a few years ago, but it was done with Over the Bridge getting permission from the estates and rights holders of deceased performers. Google's Magenta was a major factor in generating the content... again, with permission.
Checking up on the info, I just discovered that, almost a decade ago, Benoit Carré and François Pachet used something called Flow Machines as part of a research project in generating music in the style of the Beatles. The result is something called Daddy's Car (can be found on SONY CSL Paris' YT channel). ...again, done with Sony's permission (as they own much of the Beatles' works).
https://venturebeat.com/games/how-hasbro-is-jumping-on-the-game-opportunity-chris-cocks-interview/
A fair bit on AI, but not really that much. If Cocks really wants to make money, he should look at ways a DM can use AI to run a campaign. Imagine how much time you could save by having the AI handle the rolls, narration, and adventure setup. I mean, the DM could sit back, let the game go, and not even have to be there the whole time. Even players could save time and have AI make their choices and not even be present. We could unleash so much creativity and save so much time with AI. Dead Internet theory has nothing on this.
We've had well over 35 years for someone somewhere to develop an AI to run D&D as a GM/DM, but I don't see one anywhere near as capable as a human.
And IMO it will be a long time till anything made can come close.
An AI running a simple dungeon crawl is probably possible in the near future, if not already. That is just a matter of laying out rooms, populating them with traps, monsters, and puzzles, then being able to react to changing targeting parameters and such. We already have video games that do that; AI would just be able to expand on it with better reactions to player decisions.
I also expect we are pretty close to an AI which can run fairly basic campaigns. There is certainly a market for that, both at the one-shot length (Mansions of Madness) and even the longer-form campaign length (Gloomhaven). While we are nowhere close to an AI which can replace a human DM entirely, an AI probably could create a Gloomhaven-like experience: something which is a pale facsimile of D&D, but still finds success.
Rogue first came out in 1980.
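For what it's worth, the "lay out rooms and populate them" part really is the easy bit; a toy generator along those lines fits in a few lines of code. Below is a minimal, purely illustrative sketch (every name in it is made up for the example, and it has nothing to do with any actual D&D Beyond or Hasbro tooling). The genuinely hard part for an AI DM, improvising reactions to player decisions, is exactly what this does not attempt.

```python
import random

# Hypothetical toy example: lay out a chain of rooms and randomly populate
# each one with traps, monsters, puzzles, or treasure. Purely illustrative;
# not taken from any product mentioned in this thread.

CONTENT_TABLE = ["trap", "monster", "puzzle", "treasure", "empty"]


def generate_dungeon(room_count=6, seed=None):
    """Return a list of rooms, each a dict with a number and random contents."""
    rng = random.Random(seed)
    rooms = []
    for number in range(1, room_count + 1):
        # Each room gets one to three distinct entries from the content table.
        contents = rng.sample(CONTENT_TABLE, k=rng.randint(1, 3))
        rooms.append({"room": number, "contents": contents})
    return rooms


if __name__ == "__main__":
    for room in generate_dungeon(seed=42):
        print(f"Room {room['room']}: {', '.join(room['contents'])}")
```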
So, Cocks claimed that they'd be able to do it with their own IP since they have so much of it, and to be honest, they'd have to anyway: the point of the DM is to play D&D, and they don't want you to start roaming around Mos Eisley.
Why is that unethical? That's without getting into the debate of why me looking at a picture and copying the style is fine, but an AI doing something similar isn't (remember, WotC caught flak because an artist used AI to do touch-ups, not to create the artwork in the first place). Why is it inherently unethical?
As for environmental concerns, it would be interesting to see the actual numbers involved, specifically how much it would increase things like water consumption compared to what we're using now just by using DDB.