Google Research Releases TimesFM 2.5 Time-Series Model Update
Google Research today released TimesFM 2.5, an updated pretrained time-series foundation model that cuts the parameter count to 200 million while expanding the maximum context length to 16,000 time points. The March 31, 2026 update also adds an optional 30M-parameter continuous quantile head supporting forecasts up to 1,000 steps ahead, updated inference APIs, and restored covariate (XReg) support. Checkpoints are available on Hugging Face, and BigQuery integration is provided.
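The continuous quantile head mentioned above produces per-step forecasts at multiple quantile levels rather than a single point estimate. As a minimal, illustrative sketch of what such quantile output looks like, the snippet below computes quantiles over synthetic forecast sample paths with NumPy; it does not call TimesFM itself, and the horizon and quantile levels are assumptions chosen for illustration, not values prescribed by the library.

```python
import numpy as np

# Illustrative only: simulate sample paths a probabilistic forecaster
# might produce. TimesFM is NOT invoked here; data is synthetic.
rng = np.random.default_rng(0)
horizon = 1000                  # source states the head supports up to 1,000 steps
n_samples = 500                 # hypothetical number of Monte Carlo paths
trend = np.linspace(100.0, 120.0, horizon)
paths = trend + rng.normal(scale=5.0, size=(n_samples, horizon))

# A quantile head emits, for each future step, values at chosen quantile
# levels instead of one point forecast. Levels below are illustrative.
levels = [0.1, 0.5, 0.9]
q = np.quantile(paths, levels, axis=0)  # shape: (len(levels), horizon)

# Row 1 (the 0.5 quantile) is the median forecast; rows 0 and 2 bound
# an 80% prediction interval, and quantiles are ordered at every step.
print(q.shape)
```

Reading the output this way, downstream code can plot the median path with a shaded band between the outer quantiles, which is the usual way interval forecasts are consumed.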
Scoring Rationale
Significant product update from Google Research with high novelty (16k context, quantile head) and practical usability; credibility is high given official checkpoints and an ICML paper. The score is slightly boosted for source authority and freshness, and modestly reduced because docs and infrastructure updates are still ongoing.
Sources
- google-research/timesfm (github.com): TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.


