This has implications for the statistics (probability distribution) we might use for our underlying task-resolution mechanic (built into any tables, for instance). Many probability distributions can be described as "location-scale families", in that they have exactly 2 parameters -- one indicating the position (mean), and one for the likely spread around that center (variance). Perhaps it's important to think really carefully about which of these 2 parameters should be affected by increasing levels/skill/abilities/etc, and what that implies for our probability distributions.
Intensity Versus Accuracy
Again with the initial question: Are most mechanical resolutions in D&D (attacks, spells, skills, saves, etc.) simulating an act of intensity or one of accuracy? Let's consider some examples.
Melee attacks: More important to hit "hard" (in any random place) or to hit "right" (on the enemy, in a vulnerable spot)? I would argue the latter; slipping past the enemy's shield/armor/defenses/dodging is the key. This is reflected in increasing the attack bonus with level; the character is getting more skilled and accurate, not swinging harder and harder. (Obviously using the Strength bonus to-hit indicates there is some intensity that is important, perhaps smashing through a shield, etc., but I think it's a minority part of the task).
What about missile attacks? Obviously a case of accuracy -- the missile has to be shot in the right location, and no amount of special "ferocity" on the part of the shooter will help. What about casting spells? In classic D&D, again a case of accuracy -- "The energy flow is not from the caster per se, it is from the utterance of the sounds, each of which is charged with energy which is loosed when the proper formula and/or ritual is completed with their utterance" [Gygax in 1E AD&D DMG, p. 40]. It is the "proper formula" which is important, and a particularly bombastic rendition of it by the caster will not help things.
What about traditional thief skills? Open locks, remove traps, move silently, etc. -- Obviously these are all things that require carefulness and dexterity. Being too far left or too far right would equally result in failure; smashing tools into the lock harder than anyone else does not help.
What about expanded skills (non-weapon proficiencies)? Looking at the 3E skill list as an example, practically all of them seem to be more accuracy-based than intensity-based -- the ability to appraise, craft, disable devices, forge documents, intuit direction, recall knowledge, perform, search, speak languages, tumble, etc. -- there's a "right" and balanced way (to name another such skill) to do these things that would be the goal of any practitioner.
Perhaps the only exceptions I can see are Strength-based items: Maybe climb, jump, and swim would benefit from an exceptional "burst" of effort. Perhaps, more generally, any raw application of an ability score would qualify as seeking special intensity -- like Olympic-style events of running, lifting, throwing for distance. But even with these some would argue that there is a "correct form" that is more important than anyone's raw Strength.
Thus, speaking generally, I think that the great majority of tasks simulated in a D&D-like RPG succeed based on accuracy (landing in the "right place"), and not on intensity (sheer power/distance). For most stuff, doing it doubly hard would be a disaster, not a benefit.
Mean Versus Variance
So, granted that for most tasks it is accuracy that is key (having a "correct" place to be and landing there), does that mean that the important consideration is location or scale? Or in other words, is it the mean or the variance of results that is most affected by increasing skill level?
Here's an example that I use in my statistics class: Consider two basketball players, each taking three shots at the hoop. The positions of the three shots are shown for each below.
Notice: They're both aiming at the same spot -- if you average the positions of the three shots, the result is "5" for each player (that being indeed within the rim); which is to say that the mean (central location) is the same. But which player's shots are bunched up closer together, that is, have less variance (spread)? It's Player B. And which player has more shots going in the hoop? Again, Player B (2 shots in, versus Player A's 1).
So we can see that it's really variance which dictates accuracy. Assuming that you're aiming anywhere near the target in the first place, then increasing your accuracy is really a matter of reducing variance. (Or technically: accuracy and variance are inversely related.) Which is to say, your training and skill acquisition are making the result more predictable, and closer to the "right" result, more of the time.
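As a tiny numerical illustration of that point (a sketch with hypothetical shot positions, not the actual ones from the figure): give both players the same mean aim point, shrink only the spread, and the lower-variance player lands inside the "rim" far more often.

from random import gauss, seed

seed(0)
rim_low, rim_high = 4.0, 6.0                     # hypothetical "rim" interval centered on position 5
for name, sd in (("Player A", 2.0), ("Player B", 0.5)):
    shots = [gauss(5.0, sd) for _ in range(100000)]   # same mean of 5, different spread
    in_rim = sum(rim_low <= s <= rim_high for s in shots) / len(shots)
    print(name, round(in_rim, 2))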
Again, throwing the ball extra super-hard and getting, say, a +15 bonus on your shot position (mean location) would be ruinous; every shot would miss wildly, for every player.
As an aside -- You see the same observation in modern portfolio theory -- granted that you've picked a particular target return rate, the real work then is to "reduce the total variance of the portfolio return" [Wikipedia], i.e., make the return as predictable as possible. (And that's done through diversification, i.e., increasing sample size, which reduces variance.) And if you watch some cable TV high-stakes poker shows, for really enormous pots you'll usually see the pros "run it twice" (or more) for the exact same reason.
Modeling with a Normal Curve
For a variety of good reasons, mechanical and muscular variance (i.e., "error") is most frequently modeled with a normal distribution (i.e., Gaussian; the z-curve; bell-shaped). As one example, see this abstract on "Analysis of Small-bore Shooting Scores":
For a competitor with a given average score, a calculation model based on the central circular bivariate normal distribution has been used to calculate the expected distribution of the displacements of shots from the point of aim, and hence the expected variation in the competitor's scores... [Journal of the Royal Statistical Society. Series C (Applied Statistics), Vol. 21, No. 3, p. 292]
For our purpose here, results off on the tail of the normal curve are not good (whether too far left or too far right). Our "target" is that central position (mean), like the center of a basketball hoop, or an archery target. So, we can draw a zone where our attempts count as being "on target"; the area of that zone reflects our probability of success. Increasing skill will reduce the spread/variance, narrowing the curve of possible results, and thereby getting more shots/attempts in the "target zone". Specifically, the probability of a hit/success is given by the common technique of a standardized table of areas for the normal curve (or alternatively: software like Excel or any spreadsheet program; or maybe you can do complex numerical integration in your head).
Here's an example of what we might try: Say that any result within z-score +/-2 on the normal curve counts as being "on target". At the top, for what we'll call "success level" 20, assume the standard deviation (square root of variance) is simply 1. Say we increase that error/variation by +12% (multiplying by 1.12) for each step to a lower level. Then we can use a spreadsheet to easily calculate the probability at each level of landing "on target", and translate that to a success target on a d20. (The previous graph shows the curve/success shape at the 10th level, in fact.)
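To make that arithmetic concrete, here's a minimal sketch in Python (standard library only; the +/-2 target zone, the level-20 standard deviation of 1, and the 12% growth factor are just the values assumed above) that generates the same level-by-level table:

from math import erf, sqrt

def phi(x):                                      # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

for level in range(1, 21):
    sd = 1.12 ** (20 - level)                    # error grows 12% per step below level 20
    p = phi(2 / sd) - phi(-2 / sd)               # chance of landing in the +/-2 "target zone"
    print(level, round(p, 3), 21 - round(p * 20))  # last column: d20 roll needed to succeed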
Observations: This probability distribution has many very nice qualities. First, the numbers throughout levels 1-20 are broadly similar to the OD&D numbers for success at hitting AC 0, or making a save vs. spells, or succeeding at a thief skill. Second, the numbers are "smooth", in that they don't jump at coarse increments. Third, throughout the "meat" of our level progression (7th-18th), the numbers conveniently increase by 1 per level (see "The 5% Principle"). Fourth, at the bottom, we don't have the problem of very abruptly switching from possible to impossible: improving on AD&D's 6 repeated 20's, here we see an even "softer landing" -- 2 copies each of 15-17; 3 18's; 4 19's; and a long string of 10 20's before success is effectively impossible. All of these desirable features automatically pop out for us by using the right model to reflect known physical systems of success and failure.
To use this, now we really would need to commit to using the resolution table all the time. (The results are not linear as with classic D&D/ d20 System/ Target 20, so there's no simple arithmetic shortcut to the procedure.) For attacks, take the character's "fighting level", add in the opponent's AC (classic descending), plus any other bonuses or modifiers; then find that "level" in the table, and look across to see what roll of d20 counts as success. Saving throws, thief abilities, special skills, etc., can all work in a similar fashion. In other words, all modifiers must be made to the "level" value (equivalent to moving up or down rows in the table); no modifiers are ever (EVER!) applied to the resulting "to hit" score.
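As a sketch of that procedure (the function and variable names here are my own shorthand, not anything official), one might wrap the lookup like so, with every modifier applied to the success level and the raw d20 roll left untouched:

from math import erf, sqrt

def phi(x):                                      # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def d20_target(success_level):
    sd = 1.12 ** (20 - success_level)            # narrower error curve at higher success levels
    p = phi(2 / sd) - phi(-2 / sd)               # chance of landing "on target"
    return 21 - round(p * 20)

def attack_hits(fighting_level, descending_ac, modifiers, roll):
    # Every modifier shifts the table row (the success level); the d20 roll itself is never adjusted.
    return roll >= d20_target(fighting_level + descending_ac + modifiers)

print(d20_target(1 + 2))                         # Fighter level 1 vs. AC 2 -> 16, as in the comments below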
Can I Accomplish The Same Thing By Rolling Many Dice?
No! Although it's a common mechanic to roll several dice and add them (generating a bell-shaped-like probability distribution), if you do this and compare to a minimum target number, then you're actually doing the exact opposite of our procedure.
Compare the graph above to the one on the right; in this proposed process, it is the dice results (not success results) which are bell-shaped. The target is not in the "center"; instead, it is all the extremely high results out in the tail. Bonuses and modifiers (regardless of whether we apply them to the dice roll or the target number) will now shift the mean/center location, when our argument all along has been that we need to keep that fixed and alter the variance, or the spread of the curve.
Quick example: Say you roll 3d6 (range 3-18) for your resolution mechanic, and see the table to the right. Note that for our "normal resolution" process, the flattest spot was in the middle (levels 7-18), where every step was consistently a 1-in-20 difference in success; but here the opposite is true -- the very center is actually where the wildest fluctuations occur (a single step around target 8-14 changes success by the equivalent of 2 or 3-in-20). The more dice you add, the spikier the distribution gets, and the more this disturbing effect will be exacerbated. And you certainly don't get the long "soft landing" effect of many 20's as shown in the AD&D DMG.
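For anyone who wants to check those 3d6 numbers, here's a quick enumeration (mine, not the article's spreadsheet): the chance of rolling at or above each target, and how much one step of target moves that chance, expressed in twentieths.

from itertools import product

rolls = [sum(d) for d in product(range(1, 7), repeat=3)]     # all 216 outcomes of 3d6
for target in range(3, 19):
    p = sum(r >= target for r in rolls) / len(rolls)
    step = p - sum(r >= target + 1 for r in rolls) / len(rolls)
    print(target, round(p, 3), round(step * 20, 1))          # step size expressed in twentieths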
Conclusions
In summary, for almost any kind of skill you can think of (hitting/ shooting/ dodging/ picking locks, etc.) character level shouldn't change the mean result (i.e., the target); it should change the variance of the result (i.e., get closer to the desired target, reflecting increased accuracy). And, the standard normal (bell-shaped) curve would be an excellent choice for use as a model of "error level".
A summary table (without the normal-curve statistics on display) is shown to the right. An Excel spreadsheet of the original calculations is here if you want to confirm or play with the numbers involved.
Would I use this myself? Actually, I don't expect to. I like the Target 20 system's freedom from tables (saving table space), and honestly, the results are "close enough" in the key central part of the chart (levels 1-20) that it's approximately correct most of the time anyway. I might, however, use this in the future as a basis to analyze (for example) archery ranges; but if you use it in a game yourself, be sure to tell me how it went!
It seems you could do the same thing by rolling lots of dice, if you counted both high and low rolls as less good than center-of-curve rolls. For many things that wouldn't be very interesting (who really cares if your shot misses to the left or right), but it might be fun for spells, where under-rolling would result in a weakened or fizzled spell but over-rolling would create disastrous unintended effects (perhaps with a player option to add or subtract a fixed value). I might play around with this a bit; it would give me an excuse to dig out some of my old-WoD Mage materials.
Joshua, my guess is that there wouldn't be any very good way (interfaces with D&D level/bonuses; steps at desired sizes) of setting the success rates while being locked into existing dice sizes (i.e., without going through a tabled math function as shown here). At any rate, we agree that a standard D&D dice sum of ndN fails to do the job.
Yeah, I think you're right. It might be workable at a Chainmail level of granularity (a "man" MU rolls 3d6, a "hero" rolls 2d6+3, a "super-hero" rolls 1d6+7, or thereabouts, with an optimal range of 9-12 -- and you'd still want a table for the bands).
Right, think I agree with that.
So you feel that the table is the only good way to get these results, and that something like 3d6 is actually inferior, for D&D, to d20. OK, but I really wish there were a way to effect this kind of arrangement without moving to tables.
What about a more thorough change, where you roll, say, 3d6 or 2d10, but you are trying to roll in-between numbers: success is 7-14. Bonuses widen the target range (6-14 or 7-15) and minuses shrink the target range (8-14 or 7-13).
Methinks I'm making this more complicated than a table...
Hey Brandon -- I actually did have the following in my handwritten notes for this article (but snipped them to stay on focus)...
Option: Take a multi-dice roll, subtract the mean, divide by some function of level, and call anything within +/-N success.
Example: Roll 3d6, subtract 10 (the true mean is 10.5), divide by the square root of level (?), call anything within +/-2 a "hit".
The main thing I would emphasize -- variance (standard deviation) is fundamentally a multiplicative entity, not an additive one. (Swap my "divide the roll by level" for "multiply the target range", if you like.)
If you just add bonuses to the target range, then within 3 steps you abruptly switch to certain or impossible results. Variance inherently needs to be in the context of a multiply/divide process to smoothly scale and avoid that.
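If it's useful, here's a rough sketch of that option (my own enumeration; the divide-by-square-root-of-level scaling and the +/-2 window are just the example values above):

from itertools import product
from math import sqrt

rolls = [sum(d) for d in product(range(1, 7), repeat=3)]     # all 3d6 outcomes
for level in (1, 4, 9, 16):
    # Subtract the mean (10.5), divide by a function of level, call anything within +/-2 a "hit".
    hits = sum(abs((r - 10.5) / sqrt(level)) <= 2 for r in rolls)
    print(level, round(hits / len(rolls), 3))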
Delta, have you compared your resolution chart to the "to hit" progression for fighters in B/X D&D?
Compare (vs. AC 2):
Level 1: BX 17, Delta's System 17
Level 4: BX 15, Delta's System 16
Level 7: BX 12, Delta's System 14
Level 10: BX 10, Delta's System 11
Level 13: BX 8, Delta's System 8
Insofar as I can tell, then, the B/X to-hit progression for fighters is an admirable model of your system. This strikes me as not coincidental.
Hey Alexander -- What I usually do is look at the OD&D charts, and fortunately those were copied without change into the B/X rules. :) Most of the rulesets differ by only a point or two at most in places.
Some more examples (in my "normalized" system, remember to add level to AC: Fighter Level 1 + AC 2 = success level 3 = to-hit 16):
D&D Fighter To-Hit vs. AC2
(Level -- OD&D, AD&D, 2E, Rules Cyclopedia, Delta's Normalized)
1 -- 17,18,18,17,16
4 -- 15,16,15,15,15
7 -- 12,12,12,13,12
10 -- 10,10,9,11,9
13 -- 8,6,6,9,6
16 -- 5,4,3,7,3
XLS Spreadsheet here.
Deleted a comment as off-topic.
A very interesting article which chimes with my actual experience of historical combat styles.
The simple mechanism I use in my home-brewed system is multiple dice, take the highest.
E.g., 3, 4, 2 results in a 4.
So, the more dice you roll, the less the variance.
(subscribing to comments)
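As a quick check on the "take the highest" claim above (my own enumeration, assuming plain d6's): the variance of the highest die does shrink as more dice are added, though the mean also climbs toward the die's maximum.

from itertools import product

for n in (1, 2, 3, 4):
    maxes = [max(d) for d in product(range(1, 7), repeat=n)]  # "roll n d6, keep the highest"
    mean = sum(maxes) / len(maxes)
    var = sum((m - mean) ** 2 for m in maxes) / len(maxes)
    print(n, round(mean, 2), round(var, 2))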
Do you have an equation for this which I can plug into Excel?
stdev = POWER(1.12, 20 - level)
success = NORMDIST(2, 0, stdev, TRUE) - NORMDIST(-2, 0, stdev, TRUE)
d20 = 21 - ROUND(success * 20, 0)
With the bit relevant to your point being the "subtrahend" in the success formula.
There you go. Always the best when the questioner answers themselves! :-)
I'm not sure if changing the level is the same as adjusting the difficulty. (It could be, but I'm not familiar enough with the math to say for sure.) Wouldn't the target area (+/-2 above) be the thing to adjust for difficulty?
In fiddling with this stuff, TSR's love for certain colorful tables is starting to make more sense. Check out the "aiming size" column in the Combat Results table.
Hey, Alex -- I think I see what you're saying. You'd be right that, say, narrowing the target area by 5% doesn't produce exactly the same results as widening the skill curve by 5% (since it's nonlinear). But for game-ability I'm assuming that for whatever difficulty is desired, there's some "level" modifier that is the closest match to it. It's not trying to strictly simulate any particular mechanical skill -- the biggest lesson is perhaps to what degree the success chances trail off at the ends.
I had been thinking similarly: as one increases his/her expertise, in addition to becoming better they would also become more consistent. In other words, as you state, the variance would decrease.
This is what I had developed: To succeed on a check you had to roll lower than some difficulty value. As your "ability level" increases, the die you roll decreases in size. And like you described, modifiers would increase/decrease the "ability level".
However, your system has the advantage of only needing one die of one size.
Another interesting idea. I think this might be one case where consulting a table (for a distinctly non-linear probability function) might be warranted. Although, as I say, for game-pacing purposes I prefer to be divested of tables.
Delete"no modifiers are ever (EVER!) applied to the resulting "to hit" score."
This is just me thinking out loud. So take the case of a non-combat skill, say climbing a cliff. As the climber's skill improves, the variance in his attempts decreases, agreed. You are also saying that if the cliff is more difficult it will increase the variance of the climbing attempt, rather than requiring a greater impact. I could see it argued either way.
True, you might have a point with that one; for simplicity (in theory) I'd go with the system being consistent. The primary effect would be that e.g. at level 19 a +1 bonus wouldn't make success automatic; there would still be some variation in the skill possibly causing failure (success on roll of 2+).
Well, here's what I'm thinking. If all the modifiers that are intrinsic to the character were applied to the level on the variance table, then the target number from the variance table could be written on the character sheet. All external circumstances would be modifiers to the target. Then table lookup would occur less often.
One could say that if the character were suffering from a negative condition such as being fatigued or blinded, then it would be an intrinsic condition and affect the variance level. But even so, table lookup would occur less often.
Anyway sorry to keep re-resurrecting your post.
Thanks.
I don't mind the resurrection; it's good for me to see if my defense is coherent. :-) I do think that even extrinsic factors are more frequently a matter of precision, or how narrow the range is for success -- e.g., a further-away shooting target, a trickier lock to pick, a more convoluted spell formula to decode, etc. And symmetrically, the game-effect here is to give a "long near-20" series before the task becomes impossible, which seems desirable.