Paper Title

Expressing Multivariate Time Series as Graphs with Time Series Attention Transformer

Paper Authors

Ng, William T., Siu, K., Cheung, Albert C., Ng, Michael K.

Paper Abstract

A reliable and efficient representation of multivariate time series is crucial in various downstream machine learning tasks. In multivariate time series forecasting, each variable depends on its historical values, and there are inter-dependencies among variables as well. Models have to be designed to capture both intra- and inter-relationships among the time series. To move towards this goal, we propose the Time Series Attention Transformer (TSAT) for multivariate time series representation learning. Using TSAT, we represent both temporal information and inter-dependencies of multivariate time series in terms of edge-enhanced dynamic graphs. The intra-series correlations are represented by nodes in a dynamic graph; a self-attention mechanism is modified to capture the inter-series correlations by using the super-empirical mode decomposition (SMD) module. We apply the embedded dynamic graphs to time series forecasting problems, including two real-world datasets and two benchmark datasets. Extensive experiments show that TSAT clearly outperforms six state-of-the-art baseline methods across various forecasting horizons. We further visualize the embedded dynamic graphs to illustrate the graph representation power of TSAT. We share our code at https://github.com/RadiantResearch/TSAT.
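To make the graph representation described in the abstract concrete, below is a minimal, hypothetical sketch and not the authors' TSAT implementation (see the linked repository for that). It assumes each series becomes one graph node, a plain Pearson correlation matrix stands in for the edge features that the paper derives with its SMD module, and a single-head self-attention layer is biased by those edge features. The class name EdgeBiasedSelfAttention, the linear node embedding, and the toy dimensions are illustrative choices, not part of the paper.

```python
# Hypothetical sketch of the abstract's idea: series -> graph nodes,
# pairwise correlations -> edge features, attention biased by edges.
# Not the authors' TSAT code; SMD is replaced here by torch.corrcoef.
import torch
import torch.nn as nn

class EdgeBiasedSelfAttention(nn.Module):
    """Single-head self-attention whose scores are biased by scalar edge features."""
    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, nodes: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # nodes: (num_series, d_model) node embeddings, one per time series
        # edges: (num_series, num_series) inter-series edge features
        scores = self.q(nodes) @ self.k(nodes).T * self.scale + edges
        attn = torch.softmax(scores, dim=-1)
        return attn @ self.v(nodes)

# Toy usage: 4 series, each observed over a window of 96 time steps.
num_series, window, d_model = 4, 96, 32
x = torch.randn(num_series, window)          # multivariate series (series x time)
node_embed = nn.Linear(window, d_model)(x)   # intra-series temporal info -> node features
edge_feat = torch.corrcoef(x)                # inter-series correlations -> edge features
out = EdgeBiasedSelfAttention(d_model)(node_embed, edge_feat)
print(out.shape)  # torch.Size([4, 32])
```

In this sketch, the edge matrix is added directly to the attention scores, so strongly correlated series attend to each other more; the actual paper uses an edge-enhanced dynamic graph with SMD-derived features rather than a raw correlation matrix.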
