Abstract

Multivariate time series forecasting is an important machine learning problem across many domains, including prediction of solar plant energy output, electricity consumption, and traffic congestion. Temporal data arising in these real-world applications often involve a mixture of long-term and short-term patterns, for which traditional approaches such as autoregressive models and Gaussian processes may fail. In this paper, we propose a novel deep learning framework, namely the Long- and Short-term Time-series network (LSTNet), to address this open challenge. LSTNet uses a Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN) to extract short-term local dependency patterns among variables and to discover long-term patterns in time series trends. Furthermore, we leverage a traditional autoregressive model to tackle the scale-insensitivity problem of the neural network model. In our evaluation on real-world data with complex mixtures of repetitive patterns, LSTNet achieved significant performance improvements over several state-of-the-art baseline methods. All the data and experiment code are available online.
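The abstract describes the architecture only at a high level: a CNN for short-term local dependencies, an RNN for longer-term trends, and a linear autoregressive (AR) term added to the neural output to handle scale changes. The sketch below is a minimal, illustrative PyTorch-style rendering of that idea under assumed settings; the class name LSTNetSketch and parameters such as cnn_channels, rnn_hidden, and ar_window are hypothetical choices, not the authors' released implementation (which the abstract says is available online).

```python
import torch
import torch.nn as nn


class LSTNetSketch(nn.Module):
    """Illustrative LSTNet-style model: a CNN for short-term local patterns,
    a GRU for longer-term temporal dependencies, and a linear autoregressive
    (AR) component added to the nonlinear prediction."""

    def __init__(self, n_vars, cnn_channels=50, kernel_size=6,
                 rnn_hidden=50, ar_window=24):
        super().__init__()
        self.ar_window = ar_window
        # Convolution over the time axis, spanning all input variables at once.
        self.conv = nn.Conv2d(1, cnn_channels, kernel_size=(kernel_size, n_vars))
        self.gru = nn.GRU(cnn_channels, rnn_hidden, batch_first=True)
        self.fc = nn.Linear(rnn_hidden, n_vars)
        # AR component: a shared linear model over the last ar_window steps.
        self.ar = nn.Linear(ar_window, 1)

    def forward(self, x):
        # x: (batch, window, n_vars)
        c = torch.relu(self.conv(x.unsqueeze(1)))        # (batch, channels, t', 1)
        c = c.squeeze(3).transpose(1, 2)                 # (batch, t', channels)
        _, h = self.gru(c)                               # h: (1, batch, rnn_hidden)
        nonlinear = self.fc(h.squeeze(0))                # (batch, n_vars)
        # Linear AR on the most recent ar_window steps of each variable.
        ar_in = x[:, -self.ar_window:, :].transpose(1, 2)   # (batch, n_vars, ar_window)
        linear = self.ar(ar_in).squeeze(2)               # (batch, n_vars)
        return nonlinear + linear


if __name__ == "__main__":
    model = LSTNetSketch(n_vars=8)
    y = model(torch.randn(4, 24 * 7, 8))  # one-step-ahead forecast for 8 series
    print(y.shape)                        # torch.Size([4, 8])
```

The AR term in this sketch reflects the abstract's point about scale insensitivity: because the linear component passes recent raw values through directly, the combined output can follow changes in the input scale that a purely nonlinear network may miss.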

Keywords

Leverage (statistics), Computer science, Autoregressive model, Recurrent neural network, Time series, Artificial intelligence, Deep learning, Artificial neural network, Term (time), Machine learning, Gaussian process, Dependency (UML), Convolutional neural network, Data modeling, Data mining, Gaussian, Econometrics

Publication Info

Year
2018
Type
Conference paper
Pages
95-104
Citations
1848
Access
Closed

Citation Metrics

OpenAlex citations: 1848
Influential citations: 206

Cite This

Guokun Lai, Wei-Cheng Chang, Yiming Yang et al. (2018). Modeling Long- and Short-Term Temporal Patterns with Deep Neural Networks. The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, 95-104. https://doi.org/10.1145/3209978.3210006

Identifiers

DOI
10.1145/3209978.3210006
arXiv
1703.07015
