Getting a bit frustrated over the builder and 5.5E, I tried using AI.
It turned out to be frustratingly wrong in interpreting the (new) rules.
So I'm now back to using pen, paper sheets and the books.
Just for my curiosity: do you use AI?
Absolutely not! 1) Hate the copyright implications of training LLMs and AI. 2) Hate the energy consumption for even menial tasks, let alone image generation.
This ^^^
But also, you’re using a tool that simply doesn’t understand the rules. These models are programmed to please you and will tell you outright lies if they think that’s what you want to hear. Repeated tests have shown they’re not simply hallucinating answers the way earlier LLMs did; many modern ones will literally ignore evidence in order to give the answer they think you want. That’s absolutely useless when you’re asking it to tell you the rules.
A character? Absolutely not. I have more than enough character ideas, and the scope of a single character is so small that I can't imagine ever needing AI to help. It would be just as much effort to train the AI to be of use as it would be to make the character off my own bat.
A campaign? I haven't yet. I've always found enough prompts and so forth from players on top of written adventures that there's just not been any need. I wouldn't say I'd never use it...but at the moment I don't envisage the need.
If you're not willing or able to discuss in good faith, then don't be surprised if I don't respond; there are better things in life for me to do than humour you. This signature is that response.
No.
Thou shalt not make a machine in the likeness of a human mind.
LLMs (mistakenly referred to as AI by tech marketing campaigns) cannot actually know a ruleset (this even applies to basic math). All they can do is predict which word they calculate is most likely to appear next in a sequence, which they do by plundering data samples of content they don't have permission to use.
Not only that, but each query you submit to an LLM reportedly consumes about a gallon of fresh drinking water, and LLM data centers consume inordinate amounts of electricity, which is projected to result in rolling blackouts across the US as early as next year if use continues as is.
Don't use LLMs.
You're giving them way too much credit.
They're not "telling lies" or "ignoring evidence" -- they have no conception of truth, nor are they capable of such a thing.
As for the original question:
I'd rather give up the game.
Beyond even the environmental concerns, the ethical concerns, and the fact that it can't actually do the thing. (Which are all extremely real problems with the tech, don't get me wrong.)
This is a game about being creative. If you offload that to a computer, what even is the point?
It's unethical tech.
It's a scam sold by money piles.
& it doesn't work.
DM, player & homebrewer(Current homebrew project is an unofficial conversion of SBURB/SGRUB from Homestuck into DND 5e)
Once made Maxwell's Silver Hammer come down upon Strahd's head to make sure he was dead.
Always study & sharpen philosophical razors. They save a lot of trouble.
I also voted no. Not because it doesn’t understand the rules, but for all the other reasons given above.
Another thought: I don't even believe the LLMs out there have full training access to the D&D rules, precisely because the full rules are not available for free. The thing is, nobody knows how all these greedy AI corpos actually get data into their LLMs, or what data they feed on.
In addition to all the other issues already raised, studies have shown evidence that reliance on LLMs lowers a person's creative ability.
Find your own truth, choose your enemies carefully, and never deal with a dragon.
"Canon" is what's factual to D&D lore. "Cannon" is what you're going to be shot with if you keep getting the word wrong.
Voted no, and not just because it doesn't understand the rules:
It guzzles drinkable water
It steals the work of others
It enriches some of the most diabolical people
It runs in data centers that destroy communities and the environment
It has been used to sue the artists it copies
It causes false arrests and people to be falsely denied services
It has direct negative impacts on the ability to learn and store information and critically think in those that use it
It causes negative mental health outcomes ("AI psychosis"), which have directly resulted in murders and suicides
It has been used to mass generate misinformation, disinformation, and abusive material including that of minors
It is actively killing the internet (look up Dead Internet Theory)
It is causing a massive financial bubble which, when it collapses, won't harm the people responsible
There is no ethical means or reason to use consumer facing generative AI, least of all as a replacement for reading the rules in a TTRPG
Find my D&D Beyond articles here
It would be possible to build an AI that was actually a useful rules assistant, but it would need to be specially trained for the task. An AI trained by gobbling up as much data as it can find on the internet might answer rules queries with the actual rules, but it might also produce rules from a different edition or game system, someone's house rules, random internet discussion, or just invent an answer.
No, using AI would remove all the fun of character development I have!
No.
Oh, it's pretty well known. The exact contents may not be, but many of the big datasets used are well known, and at least one of them (libgen) contains (IIRC) hundreds of thousands of pirated books. A quick check for "Jeremy Crawford" indicates that at least some of the 5e rulebooks are in it. (Authorship data may not be complete, and libgen is not the only dataset used.)
I don't think one could do it with an LLM, even with dedicated training; they aren't designed that way. (Also, the amount of text needed to make them work is far too large to train on D&D materials alone, so you'd end up with a lot of non-D&D text involved.)
It might well be possible with a dedicated machine learning system, but that's a lot of specialized work, possibly more than it'd take to make a BG3-style game engine.
I looked at the poll options and wanted to comment that the poll needs a "no" option for a host of ethical and creative reasons, but y'all are already hammering that point home. Points to Plaguescarred for focusing on fun; I know for some of us Forever DMs D&D can be a lot of work, but it's still fun work. I wouldn't want to use a bot to skip all the fun of character and campaign building even if AI weren't destructive to individuals, communities, professions, and the environment.
Absolutely not.
This is a game of imagination. So use your imagination! Simple as that.
Anzio Faro. Protector Aasimar light cleric. Lvl 18.
Viktor Gavriil. White dragonborn grave cleric. Lvl 20.
Ikram Sahir ibn-Malik al-Sayyid Ra'ad. Brass dragonborn draconic sorcerer Lvl 9. Fire elemental devil.
Wrangler of cats.
No and you should not use it either. If you don't want to tangle with the character builder on DDB, use the pregens.
DM mostly, Player occasionally | Session 0 form | He/Him/They/Them
EXTENDED SIGNATURE!
Doctor/Published Scholar/Science and Healthcare Advocate/Critter/Trekkie/Gandalf with a Glock
Try DDB free: Free Rules (2024), premade PCs, adventures, one shots, encounters, SC, homebrew, more
Answers: physical books, purchases, and subbing.
Check out my life-changing
AI should not be used for anything technical. The character builder already does pretty much all of the work and AI just gets it wrong.
AI can work well as a prompt for story ideas. It's not that great at making them fit a campaign, and it seems to want to write stories rather than situations, but it can be useful to ask it to throw out 50 random taverns with some flavour, or a bunch of simple NPCs that you can flesh out when they're used.
LLMs are fuzzy. I don't mean to say they're incapable, but the current Large Language Models are not taught, they're trained. They don't learn by being corrected every time; they learn by ingesting every piece of information possible, so even if you trained one on the D&D books, it would include 10,000 bad takes from Reddit, ENWorld, the forums here, etc. And even then, it would only get it 'mostly' right. Even the weakest rules lawyer could beat it.
TL;DR: It can be useful for flavour and flowery text, but it's woeful at accurate technical tasks.