Quantum computers need error correction to work reliably. Physical qubits are noisy, so we encode logical qubits using surface codes — 2D lattices of physical qubits with redundancy that lets us detect and correct errors.
A decoder is the algorithm that reads error syndrome measurements and decides which corrections to apply. The industry standard is MWPM (Minimum Weight Perfect Matching), which assumes errors are independent. But real quantum hardware has correlated noise — errors that cluster together — and MWPM doesn't exploit this structure.
Your goal: build a decoder that beats MWPM by exploiting correlated noise patterns on surface codes.
A single Python file (.py) that defines one function:
```python
def build_decoder(point: ParameterPoint) -> Decoder:
    """Return a decoder for the given parameter point.

    point.L  -- code distance (3, 5, or 7)
    point.p  -- physical error rate (0.005 or 0.01)
    point.xi -- correlation length (0, 2, 5, or 10)
    """
    return MyDecoder(point)
```

Your `Decoder` must implement `decode(syndrome_array)`, where the input is a `(shots, num_detectors)` uint8 array and the output is a `(shots,)` uint8 array of predicted logical corrections.
Allowed imports: numpy, scipy, pymatching, stim. No file I/O, no network access, no subprocess calls.
Your decoder is tested on 24 parameter points, every combination of:

- code distance L in {3, 5, 7}
- physical error rate p in {0.005, 0.01}
- correlation length xi in {0, 2, 5, 10}
At each point, we generate syndrome data using Stim surface code circuits with correlated Bernoulli noise, run your decoder, and count logical errors. Each submission is evaluated with a unique random seed.
Your score is logical errors per million shots (lower is better):
score = total_errors * 1,000,000 / total_shots
All 24 parameter points are aggregated — total errors across all points divided by total shots across all points. This means harder parameter points (higher p, higher xi, larger L) contribute more to the score.
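The pooled aggregation above can be made concrete with a small worked example (the error counts and shot counts here are made up for illustration):

```python
def aggregate_score(errors_per_point, shots_per_point):
    """Errors per million shots, pooled across all parameter points."""
    total_errors = sum(errors_per_point)
    total_shots = sum(shots_per_point)
    return total_errors * 1_000_000 / total_shots

# Two hypothetical points: 12 errors and 48 errors, 2M shots each.
print(aggregate_score([12, 48], [2_000_000, 2_000_000]))  # -> 15.0
```

Because errors are pooled before dividing, a point contributing 48 errors pulls the score four times harder than one contributing 12, which is why the hard (high p, high xi, large L) points dominate.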
The baseline is Minimum Weight Perfect Matching, the most widely used QEC decoder. MWPM finds minimum-weight corrections by modeling errors as a graph matching problem. It works well for independent noise (xi=0) but ignores error correlations.
As xi increases (more correlated noise), MWPM's performance degrades because it treats each error independently. Your decoder can beat it by learning or exploiting the spatial correlation structure of errors.
Submissions are validated with a 7-layer pipeline:
Among other checks, the pipeline verifies that:

- the submission is a single .py file, max 200KB
- it defines build_decoder(point)
- the object returned by build_decoder(point) implements a decode() method