I'm having trouble understanding why my method is wrong here.
The question:
Throw | Outcome
  1   | Lose $3.00
  2   | Lose $2.00
  3   | Lose $1.00
  4   | No effect
  5   | Win $1.00
  6   | Win $5.00
The results of throwing a single die in a certain gambling game are shown in the table above. What is the probability that a player will have won at least $5.00 after two throws?
A) 1/36, B) 1/12, C) 1/9, D) 5/36, E) 1/6
My method was:
To finish at least $5.00 ahead, you must roll a 6 on at least one of the two throws (a 1 in 6 chance), and the other throw must be a 4, 5, or 6 so that it doesn't drag the total back below $5.00 (a 3 in 6 chance). So I computed the probability as (1/6)*(3/6) = 3/36 = 1/12.
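To double-check myself, I also tried brute-forcing it: with only 36 equally likely ordered outcomes for two throws, a short script can just enumerate them all (the `payoff` dict below is just my encoding of the table above):

```python
from fractions import Fraction
from itertools import product

# Net payoff for each face, taken from the table above
payoff = {1: -3, 2: -2, 3: -1, 4: 0, 5: 1, 6: 5}

# Count ordered pairs of throws whose combined payoff is at least $5.00
wins = sum(1 for a, b in product(payoff, repeat=2)
           if payoff[a] + payoff[b] >= 5)

print(Fraction(wins, 36))  # prints 5/36
```

This prints 5/36 (answer D), not the 1/12 my method gives, so I know my method is wrong; I just can't see where.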
Would someone mind pointing out where my logic is flawed, please?
Thanks!