2016-01-29
OD&D is Back at D&D Classics
Great news! The Original D&D little brown books (LBBs) are once again officially available in PDF form on RPGNow's D&D Classics site. Nicely timed to celebrate the 42nd birthday of their release (link).
It's been a frustrating number of years since OD&D was last available for legal purchase. I got my digital copy on RPGNow back in 2007 -- shortly before I fell in love with it as my favorite, purest, and most playable edition, and started this blog dedicated to it. (To my amazement, the RPGNow system still remembers that purchase and gives me re-download access to the current release -- bravo.) Hopefully it will remain available for the foreseeable future, so that we can refer fellow players back to the classic if that's the ruleset we're using for our games -- and I recommend that we do.
Of course, a number of OSR publications in the intervening years had as their raison d'être the need to fill in for the original game being out of print in any form. Notably, my own Book of Spells was partly motivated by this fact -- I could run everything in the game as DM, with players basically responding with real-world intuition, except for players of wizard characters, who at a minimum did need rules text for the magic spells. That's fine, and I think the "missing rules" generated a lot of creativity -- in particular, I really do think that having spells removed to their own book is a better organizing principle, and the Original D&D spells were very sketchy indeed, so it pays to have the necessary interpretations written down in an official form. (As D&D Classics Product Historian Shannon Appelcline writes: "Clerics and magic-users cast spells, but the rules for doing so are quite terse, which caused some confusion about how the rules worked in the early days...".)
Bonus content: Check out Zenopus Archives' blog for a handout re-presenting the top-level Balrog monster, which appeared in the 1st printing of OD&D but was struck from later printings (such as the version available at D&D Classics). Fight on!
2016-01-25
Silver Weapons
Silver weapons are a pretty good staple of myth and fantasy. Silver bullets disposing of witches and werewolves date to at least the 1700s. A silver dagger narrowly saves the day for Fafhrd & the Gray Mouser against a necromancer, when all other weapons have failed, in their origin story by Fritz Leiber, "Ill Met in Lankhmar" (here illustrated by artist Mike Mignola):
This may hurt a bit.
- OD&D (1973): The basic equipment table in Vol-1 includes a "silver tipped arrow" for 5 gp, and nothing else; obviously this looks like Gygax's medieval analog to the silver bullet. A few of the monsters in Vol-2 require at least silver weapons to hit; for wights and wraiths this is phrased as "silver-tipped arrows will score normal damage", which may look like a weirdly specific restriction, unless you realize that's the only such weapon in the preceding equipment list. But lycanthropes refer to "silver weapons" in the more general phrasing.
- Holmes Basic D&D (1977): The equipment list is effectively identical to OD&D; the "silver tipped arrow" is included and no other type. The restrictions for wights, wraiths, and lycanthropes use the same language as OD&D.
- AD&D 1E (1978): No change here: the silver arrow is included in the expanded equipment list, and it's still the only such weapon. Both wights and wraiths have had their vulnerability text changed to the more general term "silver weapons", even though only the one type exists in the equipment list. Some other lower-planes types have also been added that require at least silver to hit (ghosts, imps, night hags, and greater devils).
- Moldvay Basic D&D (1981): This is the first ruleset that has more than the single silver arrow; Moldvay adds the "silver dagger" for 30 gp (a la Leiber above). This finally gives an option to characters in melee combat; note what a bad scene it would be in prior rules for a silver-required creature to close to hand-to-hand range when the only silver weapon available is the arrow.
- AD&D 2E (1989): Not only is the list of silver weapons not expanded at this point, but as far as I can tell they're removed from the equipment list entirely. The phrase "silver weapon" (or "silver arrow", or "silver dagger") appears nowhere in my copy of the 2E PHB. Yet those wights, wraiths, lycanthropes, etc., in the Monstrous Manual still have the same silver-minimum requirement to hit them. The DMG has a section on "Silver Weapons", but in standard 2E style, it is entirely a rumination on optional limitations the DM could put on any such weapons (needing an entire blade of pure silver, breakage with regular use); no prices or availability rules are given. Good luck, you princes of role-play, you characters of 2E!
- D&D 3E (2000): Expanding the list for the first time in two decades, the PHB has a section on "Special and Superior Items" which includes silvered arrows, bolts, and bullets, as well as the silvered dagger. The monster defenses have been jostled around a bit; lycanthropes have "damage reduction 15", bypassed by silver or magic weapons (i.e., normal weapons are not totally impotent, but have their damage reduced by 15 points), whereas wights and wraiths have had the defense removed entirely (although wraiths now require at least magic to hit, as they have become fully incorporeal at this point).
- D&D 3.5 (2003): The status of silver weapons gets even further shuffled around in this revision to 3rd Edition. Specifically, other weapon materials come into play for some monsters (like adamantine and cold iron), and now any weapon may be constructed out of silver or any of the other special materials. In addition, magic weapons no longer serve to trump those resistances.
In my Book of War game, the silver-weapon restriction is a big reason why I've avoided adding monsters such as wights, wraiths, and lycanthropes (even if they are a staple of the mass battles seen in Tolkien, say). Obviously, it requires the addition of silver weapons, or else these unit types would automatically massacre normal men without recourse. And we basically need to price the lycanthropes (or whatever) such that they are sure to lose against men pre-armed with silver weapons (so as to balance against the fact that they automatically win if they face off against men without such weapons). But even if we price silver weapons at a minimum of 1 gold piece per figure, it then turns out that those men will be so under-powered as to generally lose against any other normal type (the Book of War prices are that sensitively balanced). So in playtests, if we include these options, the entire Book of War system immediately collapses into a game of rock-paper-scissors: the only relevant choice is whether one selects (1) lycanthropes, (2) men with silver weapons, or (3) men without silver weapons -- and the game is effectively decided as soon as the players reveal what units they have brought to the table.
Personally, I do like the appearance of silver arrows as a witty stand-in for silver bullets; and the silver dagger gives an option to those in melee combat against the forces of darkness (otherwise our friends Fafhrd & the Gray Mouser would have had a very short career indeed!). But allowing larger types like swords and battle axes in 3rd Edition seems like lazy, mindlessly-abstracted, "we no longer care about the concrete details of our world" game design. That is: in reality, a large weapon made of soft silver would quickly bend and break. And I rather think that the real-world detail of being reduced to a less powerful weapon adds commendably to the sense of desperation in fights against these kinds of supernatural monsters.
What do you allow for silver weapons in your games? Is it arrows-only as in OD&D and AD&D? Do you permit daggers like Moldvay (and Leiber) did? Or is it anything-goes like in 3rd Edition and later?
Edit: Several commentators have pointed out that while the early editions have no silver daggers in the starting equipment lists, there are numerous instances of them given as possible treasures. Zenopus Archives notes there is a silver dagger in the Holmes Basic Sample Dungeon, and also in Gygax's Caves of Chaos (B2). Marathon Recaps & Professor Oats note that a silver dagger is included in the original robe of useful items from Dragon #26, and so was reprinted with that inclusion in the 1E AD&D DMG Appendix P. Excellent finds!
2016-01-18
XP: The Big Switch, Pt. 3
One final observation on the difference between the monster XP award systems of Original Vol-1 (linear in HD) and Revised Sup-I (parabolic in HD up to name level). Let's run our updated Arena simulator program under both of these XP award systems. Parameters in use here are: 10,000 fighters, 200 cycles of one-on-one man-vs-monster combat, expected treasure awards added, no special abilities simulated (so: somewhat safer for the men), and a -2 modifier to the random monster level die.
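To make that last parameter concrete: the "-2 modifier to the random monster level die" simply shifts the roll on the monster-level matrix down, skewing encounters toward weaker monsters. Here's a tiny, self-contained Java check of what that does to the distribution of monster levels faced -- a sketch only; the d6 follows the Vol-3 matrix roll, but the floor at level 1 is my own assumption about how underflow is handled, not a quote of the Arena code:

    import java.util.Random;

    public class MonsterLevelDieCheck {
        public static void main(String[] args) {
            Random rng = new Random();
            int[] counts = new int[7];                     // tally monster levels 1..6
            for (int i = 0; i < 1_000_000; i++) {
                int roll = rng.nextInt(6) + 1 - 2;         // d6 roll with the -2 modifier
                int level = Math.max(1, roll);             // assumed floor at monster level 1
                counts[level]++;
            }
            for (int level = 1; level <= 6; level++) {
                System.out.printf("Level %d: %.1f%%%n", level, 100.0 * counts[level] / 1_000_000);
            }
        }
    }

Run as-is, this shows about half of all encounters landing on level 1, with levels 5-6 never appearing -- which is the intended effect of the modifier on the men's survival odds.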
Here's the resulting fighter population with the Revised (Sup-I) XP system:
And here's the population generated under the Original (Vol-1) system:
So this reiterates the expected finding that the Original (linear) XP system is much more generous to low-level characters (by a factor of fully ×10 in XP value), so we get somewhat more low-level characters successfully leveling up; in other words, the curve for advancing levels is noticeably more gentle.
Interpolating the curve at lower levels for the Revised system gives an advancement function of about y = k e^(-1.4x) (where x is the level, k the 0-level men-at-arms population, and y the population at any other level), or approximately dividing by 4 at each sequential level. In contrast, the Original system generates a simpler advancement function of close to y = k e^(-x), or approximately dividing by 3 at each sequential level.
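As a quick sanity check on those two fits, here's a small self-contained Java snippet printing the populations the two curves predict at each level, starting from k = 10,000 men-at-arms (the exponents are just the fitted values quoted above):

    public class AdvancementCurves {
        public static void main(String[] args) {
            double k = 10_000;
            System.out.println("level  revised  original");
            for (int x = 1; x <= 10; x++) {
                double revised = k * Math.exp(-1.4 * x);    // Sup-I fit: divide by ~4 per level
                double original = k * Math.exp(-1.0 * x);   // Vol-1 fit: divide by ~3 per level
                System.out.printf("%5d  %7.0f  %8.0f%n", x, revised, original);
            }
        }
    }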
Moreover, looking at the Original XP system, compare several possibilities for a modifier to the "Monster Determination and Level of Monster Matrix" (Vol-3, p. 10), versus the prescribed higher-level leader proportions from encounters with Men (Vol-2, p. 5):
We see that a modifier of -2 to the random-monster die roll in the Arena gives the best match (least sum-squared difference) to the leadership proportions given in Vol-2 (whereas previously, under the Revised XP system, we saw that a modifier of -3 or even -4 from the book value was necessary). So in other words: taken as a whole, the Original XP awards come closer to presenting an overall rational system of demographics in D&D.
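"Best match" here means the least sum of squared differences between the Arena's leader proportions and the Vol-2 figures. The comparison itself is trivial to code; the arrays below are placeholder values only (the actual proportions are in the table above), included just to make the measure explicit:

    public class LeaderMatch {
        // Sum of squared differences between two proportion arrays of equal length.
        static double sumSquaredDiff(double[] a, double[] b) {
            double total = 0;
            for (int i = 0; i < a.length; i++) {
                double d = a[i] - b[i];
                total += d * d;
            }
            return total;
        }

        public static void main(String[] args) {
            double[] vol2Leaders = {0.02, 0.01, 0.005};    // placeholder proportions only
            double[] arenaMinus2 = {0.022, 0.012, 0.004};  // placeholder proportions only
            System.out.println("SSD = " + sumSquaredDiff(vol2Leaders, arenaMinus2));
        }
    }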
I suppose the most important consideration is: How fast do you want your PCs leveling up, in terms of number of encounters faced? (And this was in fact used as the starting axiom for the 3E XP system, for example.) If you use the Original linear system, you will have PCs advancing through the lower levels somewhat more rapidly than many of us are accustomed to in D&D. In the period when Gygax was running pickup games in his basement every night of the week, perhaps he legitimately felt the need to slow down advancement, so that people weren't gaining multiple levels within a week of play. But for those of us who now play somewhat less frequently, perhaps the original system from Vol-1 -- which is actually a better representation of the real "value" of a monster, and also a more generally coherent demographic system, and also a lot simpler -- is still the more encouraging one.
Get the updated Java source code to the Arena program (v. 106) here.
2016-01-11
XP: The Big Switch, Pt. 2
In the last blog we looked at the two different systems for awarding XP for defeated monsters in D&D: (1) in the original white box set, Vol-1, a simple linear progression of 100 XP per HD, and (2) in the revised Supplement-I (and later editions), a much-depressed parabolic progression up to about 9 HD, and then effectively the same linear 100 XP per HD after that. So which is really better?
Now, one of my major game-design principles is that finding the right in-game value or "price" for different unit types is very hard. And by very hard I mean: (1) A table or formula of added components for this purpose is certain to be insufficient for the job (cf. Traveller spaceship construction, War Machine unit values, D&D 3E magic items, etc.); the only way to properly gauge the interaction of all the different moving parts is through actual playtesting of each unit as a coherent individual piece. And (2) this correct playtest-balancing is by far the hardest part of game design -- like, a whole order of magnitude more work than all the rest of the design put together.
So given that, I'd like to use my extensive playtests for Book of War as a resource, since over the years I've lost track of how many hundreds of millions, or billions, of computer-simulated combats I've run for each unit type; that is, I'm pretty confident about the relative "cost" values for each unit there. And even if you don't trust that (although you could verify with the simulator program here), note that the cost values generated eerily match those found in the Chainmail Fantasy Supplement, or original D&D Vol-3 Men-At-Arms costs, so you could just reference Gygax there if you prefer.
The original XP table for D&D only considers Hit Dice, but of course other factors can make a monster much weaker or more powerful. Greater movement, armor class, attacks, and damage can enormously change the value of a type. That's even before we consider "special abilities" like flying, poison, paralysis, turn-to-stone, fire breath, regeneration, etc. Even just having a ranged attack will at least double the value of a piece (e.g., see the Vol-3 Men-At-Arms costs), because that effectively gives many more attacks on the table. Let's reduce our sample space by looking only at BOW units that are foot, melee-only, and have the identical armor value of 5 (same as AC 5, chain mail). Here's what that part of the BOW database looks like:
And let's chart the Hit Dice versus the Cost values:
That's an interesting chart, because there's a very clear outlier: Trolls, the very example given in Vol-1 which allows us to extrapolate any kind of XP awards in the first place. (Again I'll point out that both my Book of War game and Gygax's Chainmail Fantasy agree that Trolls should have a cost of about 70 [75 in Chainmail], with Hill Giants at the lower cost of 50, etc.) If we remove the Troll outliers, then the rest of the chart is basically a linear progression, as shown below:
So this says that for those simple medium-foot units, you could approximate the proper cost in BOW by just taking the hit dice and multiplying by 6 or 7 in each case, and this would account for over 98% of the variation from the mean for those types.
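As a rule of thumb from that fit -- a sketch only, since 6.5 is just the midpoint of the "6 or 7" range noted above, not a separately fitted constant:

    public class FootCostRule {
        static long approxCost(double hitDice) {
            return Math.round(6.5 * hitDice);    // cost ~ 6.5 x HD for plain medium foot
        }

        public static void main(String[] args) {
            System.out.println("1 HD: " + approxCost(1));   // ~7
            System.out.println("4 HD: " + approxCost(4));   // ~26
            System.out.println("8 HD: " + approxCost(8));   // ~52 (close to the ~50 giant cost noted above)
        }
    }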
In other words: The value of raw Hit Dice is really linear -- it's not geometric or parabolic or higher-powered in any way. I find this unsurprising for a few reasons: (1) Taking higher-HD creatures generally reduces the number of attacks on a per-HD basis (10 orcs get 10 attacks, but a 10-HD giant only gets one attack; in most cases I'd prefer the former when attacking). (2) We also know that higher individual hit dice are effectively devalued in hits taken, because it presents a simpler "packing problem" to the attacker in applying damage (see here). Now, that two-fold devaluation for higher hit dice is somewhat offset by higher to-hit chances; but you also need some other assumed improvements in movement, damage, and minor special abilities (e.g., throwing rocks) just to maintain parity on a per-HD basis (i.e., to maintain even a linear price increase).
So that argues for the original Vol-1 XP system (linear) over the Sup-I alternative (parabolic). However: What the Vol-1 system was certainly blind to was the need to account for special abilities in some way -- with a canonical case in the very Troll that Vol-1 used for its example; it certainly needs to be worth more than 700 XP no matter how you slice or re-slice it (they are very dangerous in practice). Of course, as we saw in the last post, the Sup-I "special ability" awards were pretty close to the base awards themselves; you don't really need a new table for that, as even in Vol-1 you could broadly just "double" (or: add a like amount to the base XP) the award for a powerful special ability like regeneration, and, for example, give out 1,400 XP for defeating a Troll. (That's just about the proportional difference we see between Troll and Giant costs in the table above.) And it would be justified to likewise give this same doubling adjustment for other statistical improvements like high armor class, ranged attacks (at least for numerous low-HD creatures), etc., etc.
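Expressed as code, that house adjustment is about as simple as it gets -- a sketch of the proposal above, not a rule printed in Vol-1:

    public class Vol1XpWithSpecials {
        static int monsterXp(int hitDice, boolean powerfulSpecialAbility) {
            int base = 100 * hitDice;                       // the linear Vol-1 award
            return powerfulSpecialAbility ? 2 * base : base; // double for regeneration, etc.
        }

        public static void main(String[] args) {
            System.out.println("8 HD giant: " + monsterXp(8, false));                     // 800
            System.out.println("Troll (as 7th level, regenerates): " + monsterXp(7, true)); // 1,400
        }
    }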
Thus, it seems that the Vol-1 linear system really presents the best true "value" per Hit Die (with appropriate special ability adjustments, as for Trolls). But here are a few possible counterarguments to this theory: One is that in Original D&D's fairly small monster list, high-HD creatures almost uniformly had more deadly special abilities (spells, petrification, breath weapon, swallowing whole, etc.); so perhaps that was accounted for in the system in the first place, and only needed to be disentangled when a greater variety of monsters appeared later on (although I think that's disproven by the relative value of Giants and Trolls in Chainmail, which gets flip-flopped in Vol-1 XP in the absence of any such modifier).
A second counterargument is that the Sup-I adjustment (parabolic through level 9, then linear) may be intended to reflect the progression in the class XP tables: geometric through about Name Level, and then a constant addition for higher levels. This is a somewhat stronger counterargument. Let's look at the relative "number of monsters defeated to gain a level":
The Vol-1 system presents a sliding scale: A single 1st-level fighter in this system must defeat about 20 1st-level monsters to level up (ignoring treasure considerations); and this number slowly increases up to around 100 after name level. But the Sup-I system is more consistent in this sense: At almost every level the fighter needs to kill about 100 monsters to level up (ignoring the anomalous 200 monster requirement at 1st level).
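A worked check of the first row of that comparison, using the fighter's 2,000 XP requirement for 2nd level and the low-level awards from the two systems (100 XP for a 1 HD monster under Vol-1, versus roughly 10 XP under the Sup-I approximation from the last post):

    public class MonstersPerLevel {
        public static void main(String[] args) {
            int xpForSecondLevel = 2_000;                  // fighter's XP requirement for 2nd level
            System.out.println("Vol-1: " + xpForSecondLevel / 100 + " one-HD monsters");  // 20
            System.out.println("Sup-I: " + xpForSecondLevel / 10 + " one-HD monsters");   // 200
        }
    }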
But on the other hand: The depressed Sup-I monster XP implies that about 95% of earned XP will come from treasure at 1st level, sliding to some 70% at level 21+ (link). By my calculations, the original Vol-1 XP system has a more constant proportion in that regard, with around 65-80% of XP coming from treasure at any level; in that way, monster XP is still the smaller part, but not nearly negligible as it is in the later system.
So the relative merits seem to boil down to this:
- Original Vol-1 XP System: This linear system is easier to use, requiring no table. It better reflects the actual added "value" per raw monster hit die (although manual additions are needed for "special abilities"). It supports a convenient linear conversion to unit values shown in Chainmail Fantasy, Men-At-Arms costs, Book of War costs, etc. It allows lower-level characters to level up after defeating a fairly reasonable 10 or 20 monsters (instead of hundreds in the Sup-I system, excluding treasure). And it has a more even and constant monster/treasure XP split, in a ratio of about 1:3 across all levels.
- Revised Sup-I XP System: This piecewise-parabolic system better reflects the class XP tables, and so generates a nearly constant number of monsters needed to gain the next level in each case (although this number is very high, approximately 100 monsters per level; or in other words, monster XP is nearly negligible compared to treasure XP). And it's compatible and familiar to players of any later edition of D&D.
At the moment, to my eye, it appears that the original, simple Vol-1 system has more qualities in its favor. What do you think?
2016-01-04
XP: The Big Switch, Pt. 1
Experience awards from monsters saw a big switch between Original D&D (Vol-1) and the first Supplement (Sup-I, Greyhawk). In short, original Vol-1 implied a simple linear award of 100 XP per monster Hit Die, while Sup-I changed this to a geometric-like award, increasing cumulatively per Hit Die -- and this was the system maintained in later editions like AD&D, Holmes and Moldvay Basic D&D, etc.
Let's consider the details. The very first volume of Original D&D is famously cryptic, to say the least, about XP awards. This is what it says:
As characters meet monsters in mortal combat and defeat them, and when they obtain various forms of treasure (money, gems, jewelry, magical items, etc.), they gain "experience". This adds to their experience point total, gradually moving them upwards through the levels. Gains in experience points will be relative; thus an 8th level Magic-User operating on the 5th dungeon level would be awarded 5/8 experience. Let us assume he gains 7,000 Gold Pieces by defeating a troll (which is a 7th level monster, as it has over 6 hit dice). Had the monster been only a 5th level one experience would be awarded on a 5/8 basis as already stated, but as the monster guarding the treasure was a 7th level one experience would be awarded on a 7/8 basis thus; 7,000 G.P. + 700 for killing the troll = 7,700 divided by 8 = 962.5 x 7 = 6,737.5. Experience points are never awarded above a 1 for 1 basis, so even if a character defeats a higher level monster he will not receive experience points above the total of treasure combined with the monster's kill value. It is also recommended that no more experience points be awarded for any single adventure than will suffice to move the character upwards one level.
So the truth is, OD&D fails to give an explicit rule for XP, and the best we can do is interpret this single example of defeating a troll and taking its treasure. It appears that 1 GP of treasure is worth 1 XP (and of course this is familiar from all later editions). And it also appears that 1 HD of defeated monster is worth 100 XP (as per this troll, whose 6+3 Hit Dice are assessed as "7th level", i.e., 700 XP). Of course, even this brief passage manages to be self-contradictory: initially it says that the warlock (i.e., the 8th-level magic-user) operating on the 5th dungeon level should have XP scaled by 5/8; but then this is overruled a few sentences later, where defeating a creature of the 7th monster level scales XP by 7/8.
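Reading that example as a general rule gives something like the following sketch -- one interpretation of a single worked example, not an explicit rule in the text (only the cap at a 1:1 ratio is stated directly in the quote above):

    public class Vol1XpInterpretation {
        // 1 XP per GP of treasure, 100 XP per monster level (hit die), the whole
        // award scaled by monsterLevel/characterLevel and capped at a 1:1 ratio.
        static double award(int treasureGp, int monsterLevel, int characterLevel) {
            double base = treasureGp + 100.0 * monsterLevel;
            double ratio = Math.min(1.0, (double) monsterLevel / characterLevel);
            return base * ratio;
        }

        public static void main(String[] args) {
            // The book's own example: an 8th-level magic-user defeats a 7th-level
            // troll guarding 7,000 GP.
            System.out.println(award(7_000, 7, 8));   // prints 6737.5
        }
    }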
But then in Supplement-I, Gygax makes the great XP overhaul. He archly writes:
Guidelines for Awarding Experience Points for Monster Slaying: (Addition)
The awarding of experience points is often a matter of discussion, for the referee must make subjective judgments. Rather than the (ridiculous) 100 points per level for slain monsters, use the table below, dividing experience equally among all characters in the party involved. (Sup-I, p. 12)
Below I've recreated that table and a chart to visualize the new awards. Here I've converted Hit Dice such as "2+1" to a decimal fraction of 2.3 (granted original d6-based hit dice, a 1-pip adjustment is worth 1/3.5 ≈ 0.286 or about 0.3 of a full hit die). In the chart, base value XP are in blue, and added points for special abilities in orange:
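For reference, that conversion is simple enough to express as a small helper; the handling of plain "+N" strings is mine, and Hit Dice with minuses or other notations would need more cases:

    public class HitDiceValue {
        // Each "+1 pip" on a d6-based hit die counts as roughly 0.3 of a die.
        static double toDecimal(String hd) {
            String[] parts = hd.split("\\+");
            double dice = Double.parseDouble(parts[0].trim());
            int pips = parts.length > 1 ? Integer.parseInt(parts[1].trim()) : 0;
            return dice + 0.3 * pips;
        }

        public static void main(String[] args) {
            System.out.printf("2+1 -> %.1f%n", toDecimal("2+1"));   // 2.3
            System.out.printf("6+3 -> %.1f%n", toDecimal("6+3"));   // 6.9 (the troll, roughly "7th level")
        }
    }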
This visualization clarifies something that's otherwise easy to miss. There are two separate sections to the Sup-I table that should be treated distinctly (what we'd call a "piecewise function" mathematically): the low-HD section, up to about 9 HD, where there is a notable curve to the chart; and the high-HD section after that, where it's pretty obvious that the progression becomes more like a straight line. Part of the reason this is easy to misinterpret is that the XP table keeps adding apparently larger numbers in the later rows; but since those rows span multiple Hit Dice, it actually works out to basically a constant addition per Hit Die (i.e., linear).
A few side notes: In D&D, this is the same arithmetic illusion as in the Saving Throw tables, which makes people think that Magic-Users are better at saves vs. spells than Fighters, for example, when on a level-for-level basis they're actually not. And mathematically, this serves as a good case study showing that one should visualize one's data before blindly running statistical procedures on it (like regressions or correlations) -- something which, I must admit, due to time constraints I don't completely enforce in the statistics classes that I myself teach.
Having noted the switch between the two parts of the XP table, let's make a best-fit model for each section:
We see from the "Low HD" chart that the increasing numbers that Gygax selected there are essentially parabolic; that is, a formula that looks like y = kx² accounts for better than 99% of the variation from the mean. The other thing we see is that the "Additional Points for Special Abilities" is, relatively speaking, not very far off from the "Base Value" of XP. Massaging the numbers above a bit, as we are fond of doing here, you could roughly compute the base value of XP as y = 10x² (where x is the number of hit dice), and the special ability award as about y = 9x² (or, to simplify matters completely, just add a like value as per the base XP award).
On the other hand, we note that in the "High HD" chart, a straight linear regression does indeed account for more than 99% of the variation from the mean in the base award, and over 96% of the variation from the mean for the special ability award. Again simplifying the numbers from the true regression, we could approximate the base XP award as about y = 120x − 400, and the special ability award as about y = 100x − 400.
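Putting the two fits together as the piecewise function described above -- keeping in mind that the breakpoint at 9 HD and the exact coefficients are my own rounding of the regressions, so this only approximates the actual Sup-I table:

    public class SupIApproximation {
        static double baseXp(double hd) {
            return hd < 9 ? 10 * hd * hd : 120 * hd - 400;     // parabolic, then near-linear
        }

        static double specialAbilityXp(double hd) {
            return hd < 9 ? 9 * hd * hd : 100 * hd - 400;      // roughly "add a like amount"
        }

        public static void main(String[] args) {
            for (double hd : new double[] {1, 4, 8, 11, 20}) {
                System.out.printf("HD %4.1f: base ~%5.0f, special ~%5.0f%n",
                    hd, baseXp(hd), specialAbilityXp(hd));
            }
        }
    }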
Or, to make a significant observation that would provide an even greater simplification -- The numbers in the "High HD" part of the table are very nearly a simple 100 XP per Hit Die (e.g., 9 HD gives 900 XP, 11 HD gives 1100 XP, 20 HD gives 2000 XP, etc.). Exactly the same as the original Vol-1 rule!
So we might interpret this observation as saying: The altered XP rules in OD&D Sup-I (and hence throughout later rulesets, like the entire B/X and AD&D lines) really only change things for lower-level monsters and PCs, i.e., those up to about 9 HD, greatly depressing the monster XP awards there -- while leaving awards at higher levels very nearly the same. Then the question we might ask is: Which approach is really better? More next time.