# ChatTS: Understanding & Reasoning about Time Series
ChatTS is a native Time Series Multimodal LLM (TS-MLLM) for conversational analysis and reasoning on multivariate time series. It preserves numerical values, supports flexible lengths and dimensions, and integrates smoothly into existing LLM pipelines.
## Key Features

- **Native TS Modality**: Built for multivariate time series as a first-class input.
- **Flexible Inputs**: Handles variable sequence lengths and dimensions in a single prompt.
- **Conversational Reasoning**: Interactive dialogue to explore trends, anomalies, and statistics.
- **Value-Preserving Encoding**: Answers numeric questions precisely (e.g., spike magnitude at time t).
- **Easy Integration**: Works with Transformers and vLLM; supports an OpenAI-compatible server (see the sketch after this list).
- **Demo & Tools**: Public web demo, training data/code, and evaluation suites.
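
To make the Transformers integration concrete, here is a minimal sketch of loading a ChatTS checkpoint and asking a value-level question about two series of different lengths. The model ID, the `<ts><ts/>` placeholder, and the processor's `timeseries` argument are assumptions based on typical `trust_remote_code` custom-processor conventions, not a confirmed interface; consult the model card for the exact usage.

```python
# Minimal sketch (not the official API): load a ChatTS checkpoint with
# Transformers and ask a numeric question about a multivariate input.
# MODEL_ID, the <ts><ts/> placeholder, and the `timeseries` argument are
# assumptions; check the Hugging Face model card for the exact interface.
import numpy as np
import torch
from transformers import AutoModelForCausalLM, AutoProcessor, AutoTokenizer

MODEL_ID = "bytedance-research/ChatTS-14B"  # assumed Hugging Face repo name

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, trust_remote_code=True, torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)

# Two series of different lengths in one prompt (flexible inputs).
cpu = np.sin(np.linspace(0, 6 * np.pi, 256)) * 10 + 50
cpu[100] += 40  # inject a spike so we can ask about its magnitude
mem = np.linspace(30, 70, 128) + np.random.normal(0, 0.5, 128)

prompt = (
    "I have two metrics from a host. "
    "CPU usage (length 256): <ts><ts/> "
    "Memory usage (length 128): <ts><ts/> "
    "Is there a spike in CPU usage, and roughly how large is it?"
)

inputs = processor(text=[prompt], timeseries=[cpu, mem], padding=True, return_tensors="pt")
inputs = {k: (v.to(model.device) if hasattr(v, "to") else v) for k, v in inputs.items()}
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

The series are passed as arrays rather than serialized into the prompt text, which is what lets the value-preserving encoding answer numeric questions such as the spike magnitude above from the actual data.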
## Quick Start

### Interactive Web Demo

Try ChatTS in the public interactive web demo.
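
### OpenAI-Compatible Server

Since ChatTS also supports an OpenAI-compatible server (e.g., served through vLLM), a quick-start query could look like the sketch below. The base URL, served model name, and the mechanism for attaching raw time-series values to a request are deployment-specific assumptions; see the serving documentation for the exact payload format.

```python
# Minimal sketch, assuming ChatTS is already running behind an
# OpenAI-compatible endpoint (for example via `vllm serve`).
# The base URL and model name are assumptions; attaching raw time-series
# values follows the ChatTS serving docs and is not shown here.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # assumed local server

response = client.chat.completions.create(
    model="ChatTS-14B",  # assumed served model name
    messages=[
        {
            "role": "user",
            "content": (
                "Here is a week of hourly request-rate data for one service. "
                "Summarize the overall trend and point out any anomalies."
            ),
        }
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```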
## Resources

- **Paper (VLDB ’25)**: *ChatTS: Aligning Time Series with LLMs via Synthetic Data*.
- **Model Weights**: ChatTS-14B checkpoints on Hugging Face.
- **Quantized Model**: GPTQ Int4 variant for efficient inference.
- **Training Dataset**: Synthetic alignment, SFT, and IFT data.
- **Evaluation Datasets**: Zenodo bundle of real and synthetic evaluation data.
- **Training Scripts**: End-to-end training scripts (based on LLaMA-Factory).
## Showcase

Example use cases are illustrated in the repository's figures folder.