The math is straightforward
With advantage you take the best of two rolls. To figure your new odds, multiply the chances of FAILURE together to find the new chance of failure. For example, if you need an 11+ to hit, rolling two dice and taking the best means that instead of a 50% chance of failing you have only a 25% chance of failing (.5 times .5).
For disadvantage, where you take the worst of two rolls, multiply the chances of SUCCESS together to find the new odds. For example, if you need an 11+ to hit, your chance of success drops from 50% to 25% (.5 times .5).
Advantage on a 15+ to hit goes from a 30% chance of success to a 51% chance of success (failure: .7 times .7 = .49).
Disadvantage on a 15+ to hit goes from a 30% chance of success to a 9% chance of success (.3 times .3).
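The multiply-the-failures / multiply-the-successes rule is easy to sketch in a few lines of Python (the function names here are mine, purely for illustration):

```python
# Sketch of the advantage/disadvantage arithmetic described above.
# Function names are illustrative, not from any rulebook.

def success_chance(target):
    """Chance of rolling target or higher on a single d20."""
    return (21 - target) / 20

def with_advantage(target):
    """Best of two dice: multiply the failure chances together."""
    fail = 1 - success_chance(target)
    return 1 - fail * fail

def with_disadvantage(target):
    """Worst of two dice: multiply the success chances together."""
    return success_chance(target) ** 2

for t in (11, 15, 19):
    print(f"{t:>2}+: plain {success_chance(t):.0%}, "
          f"advantage {with_advantage(t):.2%}, "
          f"disadvantage {with_disadvantage(t):.2%}")
```

Running this reproduces the worked examples: an 11+ goes from 50% to 75% with advantage and down to 25% with disadvantage.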
The general rule of thumb is that in the midrange of the d20 (success on a 9+ to 12+), advantage grants roughly the equivalent of a +5 bonus and disadvantage a -5 penalty. The increase and decrease in odds tapers off as the target number approaches 1 or 20. For example, with advantage on a 19+ your chance of failure drops from 90% to 81%, so your chance of success rises from 10% to 19%, not quite a +2 bonus on a d20.
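The rule of thumb can be checked by sweeping every target number and converting the change in odds back into d20 points (a quick sketch; one point on a d20 is worth 5 percentage points of success chance):

```python
# Equivalent flat bonus granted by advantage at each target number.
# One point on a d20 is worth 5 percentage points of success chance.
for target in range(2, 21):
    plain = (21 - target) / 20        # success chance on one die
    adv = 1 - (1 - plain) ** 2        # success chance with advantage
    bonus = (adv - plain) / 0.05      # equivalent d20 bonus
    print(f"{target:>2}+: {plain:.0%} -> {adv:.2%} (about +{bonus:.1f})")
```

The table this prints peaks at +5 when you need an 11, stays near +5 through the midrange, and falls below +1 at the extremes.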
An interesting property of the system is that there is always a chance of success and always a chance of failure, unlike modifier systems, where stacking enough modifiers can mean automatic success or automatic failure (unless you rule that a 20 is an automatic success and a 1 an automatic failure).
A useful application of knowing the odds is that you can convert advantage into a straight bonus when rolling for a large number of NPCs. For a bunch of goblins with advantage from surprise that need 13+ to hit the players, you can just apply a +4 bonus (or +5 if you round up) instead of rolling a second die. They have a 60% chance of failure on a 13+; .6 times .6 yields .36, a 24-point drop in the chance of failure. Not quite a +5 bonus on a d20.
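The goblin conversion works out like this (a sketch; the 13+ target and the surprise advantage come from the example above):

```python
# Goblins with advantage who need 13+ to hit (example from the post).
target = 13
plain = (21 - target) / 20       # 40% success, 60% failure
fail = 1 - plain                 # 0.6
adv = 1 - fail * fail            # 1 - .36 = 64% with advantage
bonus = (adv - plain) / 0.05     # 24-point gain on a d20
print(round(bonus, 1))           # about +4.8: use +4 or +5 flat
```

Rounding down to +4 slightly shortchanges the goblins; rounding up to +5 slightly favors them. Either is close enough for a table full of mooks.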
6 comments:
A lot of people are forgetting, including Mike Mearls, that not rolling both dice as you suggest potentially cheats the roller out of a critical hit :)
It is a judgement call when dealing with large numbers of NPCs.
If you really want to use criticals, then compute the odds and factor them in. For example, a critical on a natural 20 with advantage is roughly equivalent to a 19+ on a single die (.95 x .95 = .9025, about a +1 improvement on a d20).
Excellent. I'd seen people saying that it broke down to a +5 in many situations, but never the math behind it (or the bonus conversion for large groups of NPCs).
I think I'll submit that with my feedback to WotC, a request for a "math of the mechanics" section in the rules.
I like this sort of mechanic. The first time I remember seeing it was in Barbarians of Lemuria, but they used it with a 2d6 base, rolling 3d6 keeping best/worst for ad/disad. Working that kind of probability math on a bell curve makes me itch, but I do like the feel of it enough to use the basic idea for my house ruled task resolution system in OD&D.
I think I prefer the +5/+2/-2/-5 modifiers of 4e. They allow you to differentiate between a big modifier and a small modifier. With advantage you always give a big modifier if they need an average roll, and you always give a small modifier if they need a high or low roll. If somebody needs a 19 to hit and his opponent is blind, I want him to be able to get a +5 modifier rather than a +0.8 modifier. The flat modifiers may create autosuccess or autofailure, but they can only do that when the odds were going that way anyway (and assuming you don't rule a 1 an autofailure and a 20 an autosuccess).
@Philo, in my heart, I 100% agree with you. But in play, I was surprised at how well it worked out.