Dilated RNNs Induce Long-Range Correlations in Quantum States

Researchers introduce a simple geometric modification to autoregressive recurrent neural network wave functions that injects an explicit long-range inductive bias while preserving efficient sampling. The paper shows analytically that dilation alters the correlation geometry and can generate power-law scaling in a linearized perturbative regime. Numerically, dilated RNN ansätze recover the expected power-law connected two-point correlations for the critical 1D transverse-field Ising model, where standard RNN architectures show exponential decay. The method also accurately represents the one-dimensional Cluster state, a case with long-range conditional correlations that has previously challenged RNN-based neural quantum states. The proposal offers a low-overhead alternative to transformer-style self-attention for capturing long-range dependencies in Neural Quantum States.
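The connected two-point correlator whose decay distinguishes the architectures is straightforward to estimate from exact samples. Below is a minimal numpy sketch; the mapping of binary samples to ±1 spins and the translation averaging are our illustrative assumptions, not the paper's code:

```python
import numpy as np

def connected_correlator(samples, r):
    """Estimate C(r) = <s_i s_{i+r}> - <s_i><s_{i+r}> from sampled
    configurations, averaged over all site pairs at distance r.

    samples: (n_samples, n_sites) array of 0/1 spin configurations
    r: separation between the two sites
    """
    s = 2.0 * samples - 1.0          # map {0, 1} bits to {-1, +1} spins
    a, b = s[:, :-r], s[:, r:]       # all pairs (i, i + r)
    return (a * b).mean() - a.mean() * b.mean()
```

Fitting `connected_correlator(samples, r)` against `r` on a log-log plot is the standard way to distinguish power-law from exponential decay in such experiments.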
What happened
The paper "Geometry-Induced Long-Range Correlations in Recurrent Neural Network Quantum States" proposes a simple architectural fix, the dilated RNN, which gives autoregressive recurrent wave functions an explicit long-range inductive bias while preserving favorable forward-pass scaling. The authors prove analytically that dilation modifies the geometry of correlations and can induce power-law correlation decay in a linearized, perturbative limit, and they validate this claim numerically on two benchmark quantum states.
Technical details
The work replaces dense short-range recurrence with dilated connections, so that recurrent units access distant lattice sites through structured skips. This keeps the model autoregressive, with efficient exact sampling, unlike global self-attention. Key technical claims and results include:
- Analytic derivation showing that dilation changes the underlying correlation geometry and supports power-law scaling in a simplified linear regime.
- Numerical experiments on the critical 1D transverse-field Ising model, where the dilated RNN reproduces the expected power-law connected two-point correlations, in contrast to the exponential decay of conventional RNN ansätze.
- Successful approximation of the one-dimensional Cluster state, demonstrating the ability to capture long-range conditional correlations known to defeat standard RNN wave functions.
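The structured-skip recurrence described above can be sketched in a few lines. The following is a minimal numpy illustration of stacked dilated recurrent layers, where layer l reads its own hidden state from d_l steps back instead of the immediately preceding step; the tanh cell, the weight shapes, and the dilation schedule (1, 2, 4) are our illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def dilated_rnn(x, Ws, Us, dilations):
    """Forward pass through a stack of dilated recurrent layers.

    x: (T, d_in) input sequence, e.g. one-hot spin configurations
    Ws, Us: per-layer input and recurrent weight matrices
    dilations: per-layer skip distances, e.g. [1, 2, 4]
    """
    h = x
    for W, U, d in zip(Ws, Us, dilations):
        T = h.shape[0]
        out = np.zeros((T, U.shape[0]))
        for t in range(T):
            # recurrent input comes from d steps back (zeros before t = d)
            prev = out[t - d] if t - d >= 0 else np.zeros(U.shape[0])
            out[t] = np.tanh(h[t] @ W.T + prev @ U.T)
        h = out
    return h
```

With dilations doubling per layer, the receptive field of a unit grows geometrically with depth while the per-step cost stays constant, which is the low-overhead long-range bias the paper exploits.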
Context and significance
Neural Quantum States (NQS) built from RNN wave functions are attractive because they allow exact sampling without Markov-chain autocorrelation. However, standard RNNs are biased toward finite-length correlations, motivating expensive fixes such as transformer-style self-attention with heavy computational and memory cost. The paper positions dilation as a low-cost geometric inductive bias that addresses long-range dependency failure modes in NQS, trading minimal architectural complexity for substantial representational gain. This bridges ideas from dilated convolutional networks and sequence modeling into quantum many-body state representation.
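The exact-sampling property mentioned above comes from the autoregressive factorization p(s) = ∏ᵢ p(sᵢ | s₍₍<i₎₎): each spin is drawn from its conditional in sequence, so samples are independent and no Markov chain (hence no autocorrelation) is involved. A minimal sketch, assuming a hypothetical `cond_prob` callable standing in for the RNN's conditional output:

```python
import numpy as np

def sample_autoregressive(cond_prob, n_sites, rng):
    """Draw one spin configuration exactly from p(s) = prod_i p(s_i | s_<i).

    cond_prob(prefix) returns the probability that the next spin is 1
    given the already-sampled prefix (hypothetical interface standing in
    for one forward step of the RNN wave function).
    """
    s = []
    for _ in range(n_sites):
        p1 = cond_prob(s)                      # conditional from the model
        s.append(1 if rng.random() < p1 else 0)  # exact draw, no rejection
    return s
```

Each call returns an exact, independent sample, which is the advantage over Metropolis-style sampling that the dilated architecture is designed to preserve.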
What to watch
Validate these results on larger system sizes, higher-dimensional lattices, and within variational optimization loops. Also compare wall-clock training and sampling cost to transformer-based NQS and gauge robustness to noise during optimization.
Scoring Rationale
This is a solid methodological contribution within Neural Quantum States and sequence-modeling architectures, offering a low-cost fix for long-range correlations. It is niche to quantum many-body simulation rather than a general ML paradigm shift. Recent publication timing slightly reduces immediacy.
