I am a bit more mathematically inclined than the average RP gamer and one thing I have noted is the term "Bounded Accuracy" is thrown around a lot in these fora. I think I know what that means, but I want to see if I really know by asking the folks that haunt this discussion board.
So who can give me a hand with it? Thanks.
Cum catapultae proscriptae erunt tum soli proscripti catapultas habebunt ("When catapults are outlawed, only outlaws will have catapults")
I'm pretty sure bounded accuracy is the term that's used to describe the d20 combat/roll system in 5e: you can only ever get a certain bonus to your rolls, and thus the accuracy has a boundary that can't be crossed when playing the game. I never really bothered to look up the actual definition, though, so this might be wrong.
I know what you're thinking: "In that flurry of blows, did he use all his ki points, or save one?" Well, are ya feeling lucky, punk?
Previous editions of D&D allowed the bonuses one applied to a d20 roll to climb without any practical cap - you gained attack bonus with every level (3e's Base Attack Bonus, for instance), and other methods could drive the total vastly higher. The d20 was almost insignificant next to your bonus for a given action.
What this meant is that content had to be tightly calibrated to player level. Only content within an exceptionally narrow band of Numbermancy would provide an appropriate challenge to a party of PCs. Too low and the PCs were gods among ants; too high and the PCs were ants among gods. A third-level goblin, for example, was effectively incapable of harming a seventh-level PC, while a twelfth-level goblin (if such a thing were made to exist) would one-shot most seventh-level PCs with overwhelming brutality.
In Fifth Edition, a core change to the game was the elimination of this arms race of ever-increasing bonuses. Instead, the system was capped, with 30 as the highest ability score (under normal circumstances) any creature could have for any relevant d20-modifying stat, and the idea was that accuracy was bounded - that is, the d20 was always the biggest contributor to a given roll, and stat bonuses could not overshadow the weight of the die. A first-level goblin is capable of scoring a hit on a twentieth-level PC and dealing damage, a fact which would've been absolutely unthinkable in previous editions of D&D. Under Bounded Accuracy, no enemy is too weak to affect the PCs or too strong to be affected by them.
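To put rough numbers on this, here's a minimal sketch of the hit math (the goblin's +4 to hit is the Monster Manual value; the plate-armor AC and the "+1 AC per level" legacy-style scaling are my own illustrative assumptions, not figures from any edition's books):

```python
# Chance to hit on a d20: you need to roll at least (ac - attack_bonus).
# In 5e a natural 20 always hits and a natural 1 always misses, so the
# result is clamped to the 5%..95% range.
def hit_chance(attack_bonus, ac):
    raw = (21 + attack_bonus - ac) / 20
    return min(max(raw, 0.05), 0.95)

# 5e: a goblin attacks at +4; a 20th-level fighter in plate is AC 18.
print(hit_chance(4, 18))   # 0.35 - the goblin still threatens a max-level PC

# Hypothetical legacy-style scaling: if defenses grew by roughly +1 per
# level instead, that same fighter would sit near AC 38, and only a
# natural 20 would connect.
print(hit_chance(4, 38))   # 0.05
```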
The idea is a strong net positive and it brought a lot of good to the game, but there are unfortunate holes in it. A lot of folks miss being able to no-sell weak enemies that gave them trouble in the early game, and it does mean the game math is far more sensitive to DMs who accidentally overtune the numeric bonuses on their items or boons. It's not as sensitive as some folks think, but there can be issues for those who don't understand bounded accuracy.
Rodney Thompson wrote this on the Wizards website back in 2012:
"The basic premise behind the bounded accuracy system is simple: we make no assumptions on the DM’s side of the game that the player’s attack and spell accuracy, or their defenses, increase as a result of gaining levels. Instead, we represent the difference in characters of various levels primarily through their hit points, the amount of damage they deal, and the various new abilities they have gained. Characters can fight tougher monsters not because they can finally hit them, but because their damage is sufficient to take a significant chunk out of the monster’s hit points; likewise, the character can now stand up to a few hits from that monster without being killed easily, thanks to the character’s increased hit points. Furthermore, gaining levels grants the characters new capabilities, which go much farther toward making your character feel different than simple numerical increases.
Now, note that I said that we make no assumptions on the DM’s side of the game about increased accuracy and defenses. This does not mean that the players do not gain bonuses to accuracy and defenses. It does mean, however, that we do not need to make sure that characters advance on a set schedule, and we can let each class advance at its own appropriate pace. Thus, wizards don’t have to gain a +10 bonus to weapon attack rolls just for reaching a higher level in order to keep participating; if wizards never gain an accuracy bonus, they can still contribute just fine to the ongoing play experience.
This extends beyond simple attacks and damage. We also make the same assumptions about character ability modifiers and skill bonuses. Thus, our expected DCs do not scale automatically with level, and instead a DC is left to represent the fixed value of the difficulty of some task, not the difficulty of the task relative to level."
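Here's a small worked example of the quoted design, as I read it (the ogre's AC 11 and 59 hp are Monster Manual values; the fighter builds are typical assumptions of mine, not numbers from the article): the hit chance barely moves between levels, while damage output grows enormously.

```python
# Fixed-accuracy scaling: level mostly buys damage, not hit chance.
def hit_chance(bonus, ac):
    return min(max((21 + bonus - ac) / 20, 0.05), 0.95)

ogre_ac, ogre_hp = 11, 59

# Level 1 greatsword fighter: +5 to hit, 1 attack, ~10 average damage.
# Level 11 fighter: +9 to hit, 3 attacks, ~12 average damage each.
for name, bonus, attacks, dmg in [("level 1", 5, 1, 10), ("level 11", 9, 3, 12)]:
    dpr = hit_chance(bonus, ogre_ac) * attacks * dmg
    print(f"{name}: ~{dpr:.1f} damage/round, ~{ogre_hp / dpr:.1f} rounds to win")
# Hit chance only moves from 75% to 95%, but damage per round more than
# quadruples - exactly the kind of scaling Thompson describes.
```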
Want to start playing but don't have anyone to play with? You can try these options: [link].
Well then, it turns out I didn't understand bounded accuracy as the term is used in D&D 5th ed. I thought it was a little more "scientific". We would call this empirical in my world.
Cum catapultae proscriptae erunt tum soli proscripti catapultas habebunt
My experience in D&D is much more binary than that of the more active participants here: I've played only AD&D and 5e. At this point I have about equal time in the seat for each, although I spent some time as a DM for one campaign in AD&D, and I wouldn't even consider DMing 5e unless all the players were brand new. There are so many more class combinations in the current system that I wouldn't feel comfortable running a game where essentially every player knew their character much, much better than I did.
And so when you allude to a view that "... in terms of To Hit, D&D has, throughout most editions, kept a progression table of bonuses due to class level which was based on +1 per level for fighters, +0.75 for 'Intermediate classes' and +0.5 per level for magic-users," it just doesn't square with my memory of AD&D. As a matter of fact, what I glean from this is that AD&D was a bounded accuracy model too. I'm not saying it is a better model than 5e, just that it appears to me the same thing folks cite as a positive in 5e - bounded accuracy as it is described here - existed back in AD&D, and it appears the game found that preferable to the model that was tried in the editions between these two. And in AD&D we never got to 20th level; that was just crazy high.
I actually don't think it is a bad thing for many creatures to be untouchable by low-level characters. And likewise, I don't think it is undesirable for high-level characters to be able to swat away some low-level threats. But I appreciate the sentiment that the "curve" on that relationship might have been too steep. For a party of four or five second-level characters, I think it is appropriate that an Old Dragon could one-shot the party. I also think it is appropriate that a twelfth-level Paladin can swat skeletons dead with the flat of his sword. But I don't think that captures the extremes you wish to discuss.
Anyway, based on the numerically deterministic world I live in, bounded accuracy is an empirical concept and not an analytical approach. I thought there was more substance to it, based on the way the term is used in these conversations.
Cum catapultae proscriptae erunt tum soli proscripti catapultas habebunt
AD&D’s THAC0 progressions absolutely fell into the approximate paradigm of “+1 per level for fighters, +0.75 for ‘Intermediate classes’ and +0.5 per level for magic-users."
I'm not sure it's clear that it's "empirical" vs "analytical." I'm actually not sure what you mean by those terms, in this context. It's part of design. Design is all about trade-offs, and is usually a mix of both form and function.
If you google "bounded accuracy" you can find a few useful explanations or discussions:
Those contain a fair amount of analysis, at least, based on the experiences of the article writers (and, presumably, the experiences of the designers).
AD&D’s THAC0 progressions absolutely fell into the approximate paradigm of “+1 per level for fighters, +0.75 for ‘Intermediate classes’ and +0.5 per level for magic-users."
Saga, are you sure you want to have this conversation? I have my AD&D DMG right here at my fingertips. And remember the admonition from George Bernard Shaw ...
I learned long ago, never to wrestle with a pig. You get dirty, and besides, the pig likes it.
I make my living using math. Could we just agree that AD&D's system was pretty close to the current concept of Bounded Accuracy and leave it there?
Cum catapultae proscriptae erunt tum soli proscripti catapultas habebunt
Anyway, based on the numeric deterministic world I live in, bounded accuracy is an empirical concept and not an analytical approach. I thought there was more substance to it based on the way the term is used in these conversations.
There are ways to study the fundamental math of D&D 5E empirically. For example, the DMG makes it very clear that at all PC levels except 9, PCs fighting their expected CR should, on average, hit 65% of the time, assuming they have 16 in their hit stat at level 1, 18 at level 4, and 20 at level 8 (at level 9 they briefly spike to 70%). It's definitely not all that well specified, though - the DMG's saving throw guidelines, by contrast, are so sparse as to be nearly useless. Someone reading the DMG guidelines for making an NPC would have no clue that PC classes are uniformly organized to grant one "good" save proficiency (Dex, Con, or Wis) and one "bad" one (Str, Int, or Cha), and, laughably, the DMG also claims a monster's Athletics and Acrobatics bonuses have no bearing whatsoever on its CR.
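If you want to verify that 65% figure yourself, here's a quick sketch combining the standard proficiency progression, the 16/18/20 stat schedule quoted above, and the AC column of the DMG's Monster Statistics by Challenge Rating table:

```python
# Verify the ~65% hit-rate claim: PC attack bonus vs. the DMG's
# suggested monster AC at a CR equal to the PC's level.
def proficiency(level):
    return 2 + (level - 1) // 4          # +2 at levels 1-4 ... +6 at 17-20

def stat_mod(level):
    return 3 if level < 4 else 4 if level < 8 else 5   # 16 / 18 / 20 schedule

def monster_ac(cr):
    # AC column of the DMG's Monster Statistics by Challenge Rating table
    if cr <= 3: return 13
    if cr == 4: return 14
    if cr <= 7: return 15
    if cr <= 9: return 16
    if cr <= 12: return 17
    if cr <= 16: return 18
    return 19

for level in range(1, 21):
    bonus = proficiency(level) + stat_mod(level)
    chance = (21 + bonus - monster_ac(level)) / 20
    print(f"level {level:2}: +{bonus} vs AC {monster_ac(level)} -> {chance:.0%}")
# Prints 65% at every level except 9, which spikes to 70% - matching
# the claim above.
```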
One of the big departures in 5E from previous editions, at least for me, is how few rules there actually are - I routinely can't get through a single 5E session without my DM needing to houserule something because the rulebook contradicts itself, makes no sense, or simply doesn't cover the situation. For example, my DM is still deciding how he wants it to work when I throw a fork at someone in a bar, since the PHB offers two mutually contradictory rules for this. It makes reasoning about the game in the abstract challenging, to say the least, since the answer to almost anything you want to do is "your DM will need to house rule that for you."
AD&D’s THAC0 progressions absolutely fell into the approximate paradigm of “+1 per level for fighters, +0.75 for ‘Intermediate classes’ and +0.5 per level for magic-users."
Saga, are you sure you want to have this conversation? I have my AD&D DMG right here at my fingertips. And remember the admonition from George Bernard Shaw ...
I learned long ago, never to wrestle with a pig. You get dirty, and besides, the pig likes it.
I make my living using math. Could we just agree that AD&D's system was pretty close to the current concept of Bounded Accuracy and leave it there?
Math threats are a thing now? On a forum for, let's be honest, still a pretty geeky hobby? Ok then... :p
AD&D was very different from current editions when it comes to hitting, and 1e even predates THAC0, but basically the hit tables went like this. To hit AC 19 (ACs went down instead of up, but AC 10 meant the same thing: unarmored, no defense, no Dex):
Fighters, Paladins, Rangers, Bards (who had almost nothing in common with post-2e bards)
This is not bounded accuracy. Bounded accuracy separates how well you can hit things from your level, while THAC0 is entirely level-based. 5E has the proficiency bonus, but it doesn't scale with level to anywhere near the same extent, and it isn't a blanket bonus, since it only applies to whatever you're proficient with. THAC0 is a lot closer to 3rd edition's BAB (base attack bonus) than to anything in the current edition.
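For concreteness, here's a sketch of the 2e THAC0 arithmetic (the warrior and wizard rates below are the 2e progressions; this illustrates the level-driven scaling rather than reconstructing the 1e matrices mentioned above):

```python
# 2e THAC0: the d20 roll needed to hit = THAC0 - target AC (descending AC).
# Warriors improve THAC0 by 1 per level; wizards by 1 per 3 levels.
def warrior_thac0(level):
    return 21 - level            # 20 at level 1, down to 1 at level 20

def wizard_thac0(level):
    return 20 - (level - 1) // 3

def roll_needed(thac0, ac):
    return thac0 - ac            # e.g. THAC0 11 vs AC 2 needs a 9+

# A 10th-level warrior vs. plate-and-shield (AC 2):
print(roll_needed(warrior_thac0(10), 2))   # needs 9+ on the d20
# A 10th-level wizard against the same target:
print(roll_needed(wizard_thac0(10), 2))    # needs 15+
# The needed roll is driven almost entirely by level - the opposite of 5e,
# where the flat +2 to +6 proficiency curve keeps the d20 dominant.
```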