Long-term Forecasting using Tensor-Train RNNs
Abstract
We present Tensor-Train RNN (TT-RNN), a novel family of neural sequence architectures for multivariate forecasting in environments with nonlinear dynamics. Long-term forecasting in such systems is highly challenging, since they exhibit long-term temporal dependencies, higher-order correlations, and sensitivity to error propagation. Our proposed tensor recurrent architecture addresses these issues by learning the nonlinear dynamics directly, using higher-order moments and higher-order state transition functions. Furthermore, we decompose the higher-order structure using the tensor-train (TT) decomposition to reduce the number of parameters while preserving model performance. We theoretically establish the approximation guarantees of Tensor-Train RNNs for general sequence inputs; such guarantees are not available for standard RNNs. We also demonstrate significant long-term prediction improvements over standard RNN and LSTM architectures on a range of simulated environments with nonlinear dynamics, as well as on real-world climate and traffic data.
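The abstract describes a higher-order recurrent transition whose weight tensor is compressed with a tensor-train factorization. Below is a minimal NumPy sketch of how such a cell could look, assuming a polynomial transition of order P over L lagged hidden states and uniform TT ranks; the names and shapes (`tt_cell`, `W_x`, `cores`) are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of a tensor-train (TT) recurrent cell (assumed shapes/names,
# not the authors' code): the order-P transition weight tensor is stored as a
# chain of TT cores and contracted against the lagged hidden-state vector.
import numpy as np

rng = np.random.default_rng(0)

H, D = 8, 3          # hidden size, input size
L, P = 2, 3          # number of lagged hidden states, polynomial order
R = 4                # TT rank (uniform, for simplicity)
n = 1 + L * H        # size of lagged-state vector s = [1, h_{t-1}, ..., h_{t-L}]

# TT cores G_k with shape (r_{k-1}, n, r_k); boundary ranks r_0 = 1, r_P = H
ranks = [1] + [R] * (P - 1) + [H]
cores = [0.1 * rng.standard_normal((ranks[k], n, ranks[k + 1])) for k in range(P)]
W_x = 0.1 * rng.standard_normal((H, D))   # input-to-hidden weights
b = np.zeros(H)

def tt_cell(x_t, h_hist):
    """One step: h_t = tanh(W_x x_t + TT-contracted polynomial of lagged states)."""
    s = np.concatenate([[1.0]] + list(h_hist))   # bias entry plus L lagged states, shape (n,)
    v = np.ones(1)                               # running contraction, shape (r_0,)
    for G in cores:
        v = v @ np.einsum('i,rij->rj', s, G)     # contract s into each core in turn
    return np.tanh(W_x @ x_t + v + b)            # v ends with shape (r_P,) = (H,)

# Roll the cell over a toy sequence
T = 10
xs = rng.standard_normal((T, D))
h_hist = [np.zeros(H) for _ in range(L)]         # h_{t-1}, ..., h_{t-L}
for t in range(T):
    h_t = tt_cell(xs[t], h_hist)
    h_hist = [h_t] + h_hist[:-1]
print(h_t.shape)  # (8,)
```

Storing the order-P transition as TT cores reduces the parameter count from O(n^P) for a dense tensor to O(P n R^2), which is the compression the abstract refers to.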
Attached Files

Name | Size | MD5
---|---|---
Submitted - 1711.00073.pdf | 2.2 MB | 94c6fafeebe8f7f17a86bc6896c69ba7
Additional details
- Eprint ID: 92672
- Resolver ID: CaltechAUTHORS:20190205-113450468
- Created: 2019-02-05 (from EPrint's datestamp field)
- Updated: 2023-06-02 (from EPrint's last_modified field)