
Uploaded 2021-01-24 06:00:06 · PDF, 1.14 MB · 10 views

Multi-Faceted Representation Learning with Hybrid Architecture for Time Series Classification

Time series classification problems exist in many fields and have been explored for a couple of decades. However, they remain challenging, and their solutions need further improvement for real-world applications in terms of both accuracy and efficiency. In this paper, we propose a hybrid neural architecture, called Self-Attentive Recurrent Convolutional Networks (SARCoN), to learn multi-faceted representations for univariate time series. SARCoN is the synthesis of long short-term memory networks with self-attentive mechanisms and Fully Convolutional Networks, which work in parallel to learn the representations of univariate time series from different perspectives. The component modules of the proposed architecture are trained jointly in an end-to-end manner, and they classify the input time series cooperatively. Due to its domain-agnostic nature, SARCoN generalizes across a diversity of domain tasks. Our experimental results show that, compared to state-of-the-art approaches for time series classification, the proposed architecture achieves remarkable improvements on a set of univariate time series benchmarks from the UCR repository. Moreover, the self-attention and global average pooling in the proposed architecture enable visible interpretability by facilitating the identification of the regions of the original time series that contribute to classification. An overall analysis confirms that multi-faceted representations of time series aid in capturing deep temporal correlations within complex time series, which is essential for improving time series classification performance. Our work provides a novel angle that deepens the understanding of time series classification, qualifying our proposed model as an ideal choice for real-world applications.
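To make the two-branch design concrete, the following is a minimal, illustrative NumPy sketch of the hybrid forward pass, not the authors' implementation: the recurrent branch is simplified from an LSTM to a plain tanh RNN with dot-product self-attention pooling, the convolutional branch to a single valid 1-D convolution with ReLU and global average pooling, and all function names, weight shapes, and dimensions (`fcn_branch`, `rnn_branch`, hidden size, filter count) are assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    """Valid 1-D convolution of series x (length T) with kernel w (length k)."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def fcn_branch(x, filters):
    """Convolutional branch: conv -> ReLU -> global average pooling."""
    maps = np.stack([np.maximum(conv1d(x, w), 0.0) for w in filters])  # (F, T-k+1)
    return maps.mean(axis=1)  # GAP over time -> (F,)

def rnn_branch(x, Wx, Wh, q):
    """Recurrent branch: tanh RNN over the series, then attention pooling."""
    h = np.zeros(Wh.shape[0])
    states = []
    for t in x:                       # scalar input per step (univariate series)
        h = np.tanh(Wx * t + Wh @ h)
        states.append(h)
    Hs = np.stack(states)             # (T, H) hidden states
    scores = Hs @ q                   # attention scores per time step
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()              # attention weights sum to 1
    return alpha @ Hs                 # attention-weighted context -> (H,)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def hybrid_forward(x, p):
    """Concatenate both branch representations and classify jointly."""
    rep = np.concatenate([fcn_branch(x, p["filters"]),
                          rnn_branch(x, p["Wx"], p["Wh"], p["q"])])
    return softmax(rep @ p["Wout"] + p["b"])

T, F, k, H, C = 32, 4, 5, 8, 3        # series length, filters, kernel, hidden, classes
params = {
    "filters": rng.normal(size=(F, k)),
    "Wx": rng.normal(size=H),
    "Wh": rng.normal(size=(H, H)) * 0.1,
    "q": rng.normal(size=H),
    "Wout": rng.normal(size=(F + H, C)),
    "b": np.zeros(C),
}
x = np.sin(np.linspace(0, 4 * np.pi, T))
probs = hybrid_forward(x, params)
print(probs.shape, float(probs.sum()))
```

The attention weights `alpha` and the per-filter GAP activations are the quantities the abstract refers to for interpretability: both assign a score to regions of the input series, so high-weight regions can be read off as the parts that drive the classification.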

