Let's consider the details. The very first volume of Original D&D is famously cryptic about XP awards. This is what it says:
As characters meet monsters in mortal combat and defeat them, and when they obtain various forms of treasure (money, gems, jewelry, magical items, etc.), they gain "experience". This adds to their experience point total, gradually moving them upwards through the levels. Gains in experience points will be relative; thus an 8th level Magic-User operating on the 5th dungeon level would be awarded 5/8 experience. Let us assume he gains 7,000 Gold Pieces by defeating a troll (which is a 7th level monster, as it has over 6 hit dice). Had the monster been only a 5th level one experience would be awarded on a 5/8 basis as already stated, but as the monster guarding the treasure was a 7th level one experience would be awarded on a 7/8 basis thus; 7,000 G.P. + 700 for killing the troll = 7,700 divided by 8 = 962.5 x 7 = 6,037.5. Experience points are never awarded above a 1 for 1 basis, so even if a character defeats a higher level monster he will not receive experience points above the total of treasure combined with the monster's kill value. It is also recommended that no more experience points be awarded for any single adventure than will suffice to move the character upwards one level.
So the truth is, OD&D never states an explicit rule for XP; the best we can do is interpret this one worked example of defeating a troll and taking its treasure. It appears that 1 GP of treasure is worth 1 XP (familiar, of course, from all later editions). It also appears that 1 HD of defeated monster is worth 100 XP (as per this troll, whose 6+3 Hit Dice are assessed as being "7th level", i.e., 700 XP). Even this brief passage manages to be self-contradictory: initially it says that the 8th-level Magic-User (a "warlock" by OD&D's level titles) operating on the 5th dungeon level should have XP scaled by 5/8; but this is overruled a few sentences later, where defeating a creature of the 7th monster level scales XP by 7/8 instead.
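As a sanity check on that interpretation, here's a minimal sketch in Python. To be clear, the function and the decision to scale by the higher of dungeon level and monster level are my own reading of the passage, not anything the text states as a general rule:

```python
def vol1_xp(treasure_gp, monster_level, char_level, dungeon_level):
    """Vol-1 XP as interpreted: 1 XP per GP of treasure, plus 100 XP per
    monster level, scaled by level/char_level and capped at 1-for-1."""
    # Per the passage's second reading, the monster's level overrides the
    # dungeon level when it is higher.
    scale = min(1.0, max(monster_level, dungeon_level) / char_level)
    return (treasure_gp + 100 * monster_level) * scale

# The troll example: 7,000 GP treasure, 7th-level monster, 8th-level PC
# operating on the 5th dungeon level: (7000 + 700) * 7/8
print(vol1_xp(7000, 7, 8, 5))  # → 6737.5
```

Note the 1-for-1 cap means a low-level character fighting far above their station still gets at most the raw treasure-plus-kill total, per the passage's last sentences.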
But then in Supplement-I, Gygax makes the great XP overhaul. He archly writes:
Guidelines for Awarding Experience Points for Monster Slaying: (Addition)
The awarding of experience points is often a matter of discussion, for the referee must make subjective judgments. Rather than the (ridiculous) 100 points per level for slain monsters, use the table below, dividing experience equally among all characters in the party involved. (Sup-I, p. 12)
Below I've recreated that table and a chart to visualize the new awards. Here I've converted Hit Dice such as "2+1" to a decimal value of 2.3 (given the original d6-based hit dice, a 1-pip adjustment is worth 1/3.5 ≈ 0.286, or about 0.3 of a full hit die). In the chart, base value XP are in blue, and added points for special abilities in orange:
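That conversion can be mechanized like this (a small helper of my own, using the rounded 0.3-per-pip approximation above):

```python
def hd_to_decimal(hd_str):
    """Convert a Hit Dice string like '2+1' or '6+3' to a decimal value,
    counting each pip adjustment as about 0.3 of a d6-based hit die."""
    for sign, sep in ((1, '+'), (-1, '-')):
        if sep in hd_str:
            base, pips = hd_str.split(sep)
            return int(base) + sign * 0.3 * int(pips)
    return float(hd_str)

print(hd_to_decimal('2+1'))  # → 2.3
print(hd_to_decimal('6+3'))  # → 6.9
```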
This visualization clarifies something that's otherwise easy to miss. There are two separate sections to the Sup-I table that should be treated distinctly (what we'd call a "piecewise function" mathematically): the low-HD section, up to about 9 HD, where there is a notable curve to the chart; and the high-HD section after that, where the progression becomes essentially a straight line. Part of the reason this is easy to misinterpret is that the XP table keeps adding larger numbers in its later rows; but since those rows span multiple Hit Dice, it works out to roughly a constant addition per Hit Die (i.e., linear).
A few side notes: In D&D, this is the same arithmetic illusion as in the Saving Throw tables, which makes people think that Magic-Users are better at saves vs. spells than Fighters, for example, when on a level-for-level basis they're actually not. And mathematically, this serves as a good case study for why one should visualize one's data before blindly running statistical procedures on it (like regressions or correlations) -- something which, I must admit, I don't completely enforce in the statistics classes that I myself teach, due to time constraints.
Having noted the switch between the two parts of the XP table, let's make a best-fit model for each section:
We see from the "Low HD" chart that the increasing numbers that Gygax selected there are essentially parabolic; that is, a formula that looks like y = kx² accounts for better than 99% of the variation from the mean. The other thing we see is that the "Additional Points for Special Abilities" is, relatively speaking, not very far off from the "Base Value" of XP. Massaging the numbers above a bit, as we are fond of doing here, you could roughly compute the base value of XP as y = 10x² (where x is the number of hit dice), and the special ability award as about y = 9x² (or, to simplify matters completely, just add a like value as per the base XP award).
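If you want to reproduce this kind of fit yourself, a one-parameter model y = kx² has a closed-form least-squares solution: k = Σx²y / Σx⁴. Here's a sketch; since I'm not reprinting the full Sup-I table here, the sample data below is generated from the simplified 10x² rule purely to exercise the code, not taken from the book:

```python
def fit_kx2(points):
    """Least-squares fit of y = k*x^2 to (x, y) pairs.
    Returns (k, r_squared), where r_squared is the fraction of
    variation from the mean accounted for by the model."""
    k = sum(x * x * y for x, y in points) / sum(x ** 4 for x, _ in points)
    mean_y = sum(y for _, y in points) / len(points)
    ss_tot = sum((y - mean_y) ** 2 for _, y in points)
    ss_res = sum((y - k * x * x) ** 2 for x, y in points)
    return k, 1 - ss_res / ss_tot

# Illustrative data following 10x^2 exactly (so the fit should be perfect):
sample = [(x, 10 * x * x) for x in range(1, 10)]
print(fit_kx2(sample))  # → (10.0, 1.0)
```

Run against the actual Sup-I base values, this is the calculation behind the "better than 99% of the variation" figure above.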
On the other hand, we note that in the "High HD" chart, a straight linear regression does indeed account for more than 99% of the variation from the mean in the base award, and over 96% of the variation from the mean for the special ability award. Again simplifying the numbers from the true regression, we could approximate the base XP award as about y = 120x − 400, and the special ability award as about y = 100x − 400.
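Putting the two pieces together, here's one hedged sketch of the whole piecewise model. The 9-HD breakpoint and the coefficients are my roundings of the regressions described above, not figures printed in the book:

```python
def sup1_xp_estimate(hd, special=False):
    """Approximate Sup-I monster XP: roughly 10*HD^2 up to the ~9 HD
    breakpoint, then roughly linear beyond it; the special-ability
    bonus tracks the base award closely in both regimes."""
    if hd <= 9:
        base, bonus = 10 * hd * hd, 9 * hd * hd
    else:
        base, bonus = 120 * hd - 400, 100 * hd - 400
    return base + (bonus if special else 0)

print(sup1_xp_estimate(5))                 # → 250
print(sup1_xp_estimate(20, special=True))  # → 2000 + 1600 = 3600
```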
Or, to make a significant observation that provides an even greater simplification -- the numbers in the "High HD" part of the table are very nearly a simple 100 XP per Hit Die (e.g., 9 HD gives 900 XP, 11 HD gives 1,100 XP, 20 HD gives 2,000 XP, etc.). Exactly the same as the original Vol-1 rule!
So we might interpret this observation as saying: The altered XP rules in OD&D Sup-I (and hence throughout later rulesets, like the entire B/X and AD&D lines) really only change things for lower-level monsters and PCs, i.e., those up to about 9 HD, greatly depressing the monster XP awards there -- while leaving awards at higher levels very nearly the same. Then the question we might ask is: Which approach is really better? More next time.