Getting a bit frustrated over the builder and 5.5E, I tried using AI. It turned out to be frustratingly wrong in interpreting the (new) rules. So I'm now back to using pen, paper sheets and the books. Just for my curiosity: do you use AI?
Absolutely not! 1) Hate the copyright implications of training LLMs and AI. 2) Hate the energy consumption for even menial tasks, let alone image generation.
This ^^^
But also, you're using a tool that simply doesn't understand the rules. These systems are tuned to please you and will tell you outright lies if that's what they think you want to hear. Repeated tests have shown it's not simply hallucinating answers the way earlier LLMs did; many modern ones are effectively ignoring evidence in order to give the answer they think you want. That's absolutely useless when you're asking it to tell you the rules.
A character? Absolutely not. I have more than enough character ideas, and the scope of a character itself is so small-scale that I can't imagine ever needing AI to help. It would be just as much effort to train the AI to be of use as it would be to just make the character myself.
A campaign? I haven't yet. I've always found enough prompts and so forth from players on top of written adventures that there's just not been any need. I wouldn't say I'd never use it...but at the moment I don't envisage the need.
If you're not willing or able to discuss in good faith, then don't be surprised if I don't respond; there are better things in life for me to do than humour you. This signature is that response.
LLMs (mistakenly referred to as AI by tech marketing campaigns) cannot actually know a ruleset (this even applies to basic math). All they can do is predict what they calculate to be the most likely word to appear next in a sequence, which they do by plundering data samples from content they don't have permission to use.
Not only that, but by some estimates each query you submit to an LLM consumes about a gallon of fresh drinking water, and LLM data centers consume inordinate amounts of electricity, projected to result in rolling blackouts across the US as early as next year if use continues as is.
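The "predict the next word" point above can be illustrated with a toy sketch. This is a deliberately simplified bigram model (real LLMs use neural networks over enormous corpora, and the corpus here is made up), but the objective is the same shape: count what tends to follow what, then emit the most likely continuation. Nothing in the process "knows" a rule; it only continues text.

```python
from collections import Counter, defaultdict

# Toy next-token predictor: count which word most often follows each
# word in a tiny training text, then always emit the most common one.
def train_bigrams(text):
    words = text.split()
    follows = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1
    return follows

def predict_next(follows, word):
    # Returns the statistically most likely next word, or None if the
    # word was never seen. There is no lookup of facts or rules here.
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

corpus = "the wizard casts a spell the wizard casts fireball the wizard rolls a d20"
model = train_bigrams(corpus)
print(predict_next(model, "wizard"))  # → casts (seen twice, vs "rolls" once)
```

The prediction is purely statistical: "casts" wins because it appeared more often after "wizard", not because the model understands wizards or casting.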
You're giving them way too much credit.
They're not "telling lies" or "ignoring evidence" -- they have no conception of truth, nor are they capable of such a thing.
As for the original question:
I'd rather give up the game.
Beyond even the environmental concerns, the ethical concerns, and the fact that it can't actually do the thing. (Which are all extremely real problems with the tech, don't get me wrong.)
This is a game about being creative. If you offload that to a computer, what even is the point?
Another thought: I don't believe the LLMs out there even have full training access to the D&D rules, precisely because the full rules are not freely available. The thing is, nobody knows how all these greedy AI corpos actually get data into their LLMs, or what data they feed on.
It would be possible to build an AI that was actually a useful rules assistant, but it would need to be specially trained for the task. An AI trained by gobbling up as much data as it can find on the internet might answer rules queries with the actual rules, but it might just as easily produce rules from a different edition or game system, someone's house rules, random internet discussion, or a pure invention.
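A minimal sketch of what "specially built for the task" could mean: answer only from a vetted rules text and refuse when nothing matches, rather than generating a plausible-sounding invention. The `RULES` entries below are hypothetical placeholder text standing in for a licensed rules source, and the keyword matching is deliberately naive.

```python
# Hypothetical grounded lookup: answers come only from a vetted source
# text, and an unmatched query gets an explicit refusal instead of a
# fabricated answer.
RULES = {
    "grappled": "A grappled creature's speed becomes 0.",  # placeholder text
    "prone": "A prone creature's only movement option is to crawl.",  # placeholder text
}

def answer(query):
    hits = [text for term, text in RULES.items() if term in query.lower()]
    if not hits:
        return "No matching rule found in the source text."
    return " ".join(hits)

print(answer("What happens when I'm grappled?"))
print(answer("Can I dual-wield halberds?"))  # → No matching rule found in the source text.
```

The design point is the refusal path: a system grounded in a fixed source can say "I don't know", which is exactly what a pure next-word predictor cannot do.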
No.
Thou shalt not make a machine in the likeness of a human mind.
Don't use LLMs.
It's unethical tech.
It's a scam sold by money piles.
& it doesn't work.
DM, player & homebrewer (current homebrew project: an unofficial conversion of SBURB/SGRUB from Homestuck into D&D 5e)
Once made Maxwell's Silver Hammer come down upon Strahd's head to make sure he was dead.
Always study & sharpen philosophical razors. They save a lot of trouble.
I also voted no. Not because it doesn’t understand the rules, but for all the other reasons given above.
In addition to all the other issues already raised, studies have shown evidence that reliance on LLMs lowers a person's creative ability.
Find your own truth, choose your enemies carefully, and never deal with a dragon.
"Canon" is what's factual to D&D lore. "Cannon" is what you're going to be shot with if you keep getting the word wrong.
Voted no, but not just because it doesn't understand the rules:
It guzzles drinkable water
It steals the work of others
It enriches some of the most diabolical people
It runs in data centers that destroy communities and the environment
It has been used to sue the artists it copies
It causes false arrests and people to be falsely denied services
It has direct negative impacts on the ability to learn and store information and critically think in those that use it
It causes negative mental health outcomes ("AI psychosis"), which have directly resulted in murders and suicides
It has been used to mass generate misinformation, disinformation, and abusive material including that of minors
It is actively killing the internet (look up Dead Internet Theory)
It is causing a massive financial bubble that, when it collapses, won't harm the people responsible
There is no ethical means or reason to use consumer facing generative AI, least of all as a replacement for reading the rules in a TTRPG
Find my D&D Beyond articles here
No, using AI would remove all the fun I have with character development!
No.