Conditional Probability in Quant Interviews
"Given X happened, what's the probability of Y?" — the second-most-common framing in quant interviews.
TRY A PROBLEM NOW · NO SIGNUP
In the classical secretary problem, as n → ∞, what fraction should you reject outright before accepting the next best-so-far candidate?
WHAT IT IS
Conditional probability, P(A | B) = P(A ∩ B) / P(B), is the machinery for updating beliefs when new information arrives. In quant interviews it appears in two modes: as the explicit problem ("given that at least one coin is heads, what's the probability both are?") and as the implicit solving tool ("condition on the first roll, then the rest simplifies"). Candidates who handle conditional probability well do three things reflexively: they identify what's given vs. what's asked, they write out the conditioning carefully before computing, and they recognise when the intuitive answer is subtly wrong. Problems are deliberately designed to catch careless conditioning: the Monty Hall problem, the two-children problem, Bertrand's box paradox — all rely on candidates conflating P(A | B) with P(A ∩ B) or with P(B | A).
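The coin example above can be checked by enumerating the restricted sample space directly. A minimal Python sketch, using exact fractions:

```python
from fractions import Fraction
from itertools import product

# Two fair coins: HH, HT, TH, TT are equally likely.
outcomes = list(product("HT", repeat=2))

# Condition on B = "at least one heads" by restricting the sample space.
given = [o for o in outcomes if "H" in o]              # 3 outcomes survive
both = [o for o in given if o == ("H", "H")]           # A ∩ B: only HH

p = Fraction(len(both), len(given))
print(p)  # 1/3
```

Counting within the restricted space is exactly P(A ∩ B) / P(B) when all outcomes are equally likely.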
WHEN IT APPEARS IN INTERVIEWS
Everywhere. Conditional probability questions appear in virtually every quant interview, from the first screen to the final round. Jane Street in particular favours open-ended conditional-probability discussions as its signature style; SIG and Optiver tend toward shorter, higher-volume conditional-probability brainteasers.
SAMPLE PROBLEMS
I have two children. At least one is a boy. What's the probability both are boys?
Sample space conditional on 'at least one boy' is {BB, BG, GB} — three equally likely outcomes. Only BB has two boys, so P = 1/3, not 1/2. The subtle point: being told 'at least one boy' excludes the GG case but keeps three equally likely remaining cases, not two.
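A Monte Carlo check (a Python sketch, seeded for reproducibility) makes the conditioning concrete as rejection sampling: generate families, discard those that violate the given information, and count among the survivors.

```python
import random

random.seed(0)

# Sample two-child families uniformly; condition on "at least one boy"
# by rejecting families with no boys.
kept = both_boys = 0
while kept < 100_000:
    kids = [random.choice("BG") for _ in range(2)]
    if "B" not in kids:
        continue  # conditioning: this family is outside the event B
    kept += 1
    both_boys += kids == ["B", "B"]

print(both_boys / kept)  # ≈ 1/3, not 1/2
```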
You pick door 1 of 3. Monty (who knows where the car is) opens door 3 to reveal a goat. Should you switch to door 2?
Switching wins with probability 2/3, not 1/2. The conditioning is subtle: Monty's action is not random — he always reveals a goat — which breaks the symmetry between the two remaining doors.
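If the 2/3 feels wrong, a short simulation settles it. This Python sketch plays the game both ways; the host's tie-break (which goat door to open when the contestant picked the car) does not affect the unconditional win rates.

```python
import random

random.seed(1)

def play(switch):
    car = random.randrange(3)
    pick = random.randrange(3)
    # Host knowingly opens a goat door that isn't the contestant's pick.
    opened = next(d for d in range(3) if d != pick and d != car)
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == car

n = 100_000
wins_switch = sum(play(True) for _ in range(n)) / n
wins_stay = sum(play(False) for _ in range(n)) / n
print(f"switch: {wins_switch:.3f}, stay: {wins_stay:.3f}")  # ≈ 0.667 vs ≈ 0.333
```

Switching wins exactly when the initial pick was wrong, which happens 2/3 of the time.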
A disease affects 1 in 1000 people. A test has 99% sensitivity (true positive rate) and 99% specificity (true negative rate). You test positive. What's the probability you have the disease?
Bayes' theorem: P(disease | positive) = P(pos | disease) P(disease) / P(pos). Compute: 0.99 × 0.001 / (0.99 × 0.001 + 0.01 × 0.999) ≈ 9.0%. The counter-intuitive result — most candidates guess >50% — is why this problem comes up so often.
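The arithmetic is worth doing exactly. A Python sketch with the `fractions` module avoids rounding and shows each term of Bayes' theorem:

```python
from fractions import Fraction

prior = Fraction(1, 1000)   # P(disease)
sens = Fraction(99, 100)    # sensitivity, P(pos | disease)
spec = Fraction(99, 100)    # specificity, P(neg | no disease)

# Law of total probability for the denominator P(pos):
p_pos = sens * prior + (1 - spec) * (1 - prior)

posterior = sens * prior / p_pos
print(posterior, float(posterior))  # 11/122 ≈ 0.0902
```

The false positives from the 999-in-1000 healthy population swamp the true positives from the 1-in-1000 sick population, which is why the posterior lands near 9% rather than 99%.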
I have two children. At least one is a boy born on a Tuesday. What's the probability both are boys?
The extra conditioning on day-of-week changes the answer. Out of 196 equally likely (gender, day) × (gender, day) combinations, 27 have at least one boy-on-Tuesday, of which 13 have two boys. P = 13/27 ≈ 48%, which surprises most candidates expecting either 1/3 or 1/2.
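The 13/27 is easy to verify by exhaustive enumeration of the 196 equally likely families. A Python sketch (day index 1 stands in for Tuesday; the label is arbitrary):

```python
from fractions import Fraction
from itertools import product

children = list(product("BG", range(7)))       # 14 equally likely (sex, day) combos
families = list(product(children, repeat=2))   # 196 equally likely families

# Condition on "at least one boy born on day 1 (Tuesday)".
tuesday_boy = [f for f in families if ("B", 1) in f]
both_boys = [f for f in tuesday_boy if f[0][0] == "B" and f[1][0] == "B"]

print(len(tuesday_boy), len(both_boys),
      Fraction(len(both_boys), len(tuesday_boy)))  # 27 13 13/27
```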
SOLVING STRATEGIES
- Write P(A | B) = P(A ∩ B) / P(B) before doing anything else. The formula keeps you honest.
- Build the sample space restricted to the conditioning event. Count favourable outcomes within that restricted space.
- Use a tree diagram when the problem has multiple stages. Label each branch with its conditional probability.
- Swap the conditioning direction via Bayes when the problem gives you P(B | A) but asks for P(A | B).
- Sanity-check with limiting cases: what if the conditioning event is certain? What if it has probability zero?
COMMON VARIATIONS
- Bayes updates: computing P(hypothesis | evidence).
- Multi-step conditioning: P(A | B, C, D) — often simplifies via conditional independence.
- Conditional expectation: E[X | Y], which leads to the tower property E[E[X | Y]] = E[X].
- Continuous conditional distributions: conditioning on a continuous random variable introduces density-based analogues.
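The tower property can be verified exactly on a small example. A Python sketch with two fair dice, where X is the sum and Y is the first die:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

# E[X] directly: average of the sum over all outcomes.
ex = Fraction(sum(a + b for a, b in outcomes), len(outcomes))

# E[X | Y = y] for each face y, then average over the uniform distribution of Y.
cond = {y: Fraction(sum(a + b for a, b in outcomes if a == y),
                    sum(1 for a, _ in outcomes if a == y))
        for y in range(1, 7)}
tower = Fraction(sum(cond.values()), 6)          # E[E[X | Y]]

print(ex, tower, ex == tower)  # 7 7 True
```

Here E[X | Y = y] = y + 7/2, and averaging over y recovers E[X] = 7, as the tower property promises.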
FAQ
How does Bayes' theorem relate to conditional probability?
Bayes' theorem is a specific application of conditional probability: it tells you how to flip the conditioning direction (from P(A | B) to P(B | A)). All Bayes problems are conditional probability problems, but not vice versa.
How do I avoid mistakes on conditional probability problems?
Write out the restricted sample space explicitly. Don't rely on intuition for problems involving 'at least' or 'given that I told you X'. The wording of the conditioning event matters far more than candidates expect.
Does the Monty Hall problem come up in interviews?
Not as the literal problem — too well-known — but its structural cousins (deliberate-information vs. random-information conditioning) appear regularly. Recognising that pattern is the useful skill.
How does Jane Street test conditional probability?
Jane Street makes conditional probability problems open-ended — they want you to justify every step, propose generalisations, and discuss edge cases. Short correct answers are fine but don't distinguish you.
RELATED TECHNIQUES
The machinery for updating beliefs. A staple of quant interviews, often in disguised form.
The single most common technique in quant interviews. Every problem reduces to it eventually.
If the problem has states and transitions, it's a Markov chain — and quant interviewers love them.
CLASSIC CONDITIONAL PROBABILITY PROBLEMS
Deep walkthroughs of named problems that test conditional probability.
Not ready to sign up? Get 5 hand-picked quant interview problems every week. No spam, unsubscribe any time.
Drill conditional probability problems now
400+ problems, adaptive selection, AI-generated alternative explanations. Free tier covers 20 attempts.
Start practicing free