
[Repost] [Computer Science] [2019] Time-Series Classification Using One-Dimensional Convolutional Neural Networks

Posted 2021-3-25 16:34 | Category: Research Notes | Source: Repost


This is a 176-page master's thesis from the University of Oslo, Norway (author: Sharanan Kulam).

 

In recent years, research in machine intelligence has made great progress, with neural network models contributing significantly to fields such as image classification and language understanding. Recurrent neural networks (RNNs) are usually the preferred approach for tasks such as language understanding and time-series analysis. However, a known problem is their inefficiency in capturing long-term dependencies, which gave rise to alternative recurrent architectures: long short-term memory (LSTM) and gated recurrent units (GRU) solve this problem, but at the cost of increased computation. As a result, convolutional neural networks (CNNs) have been widely explored for sequence modelling in recent years and have been shown to outperform RNNs. However, this efficiency has been examined by only a few comparative studies, most of which focus primarily on language tasks. Such studies are even scarcer in the time-series classification domain, where traditional classification methods are often used. To address this shortcoming and to better understand the roles of CNNs and RNNs in time-series classification, this thesis evaluates two shallow networks, a CNN and an LSTM. We extend the few existing comparisons through an experimental approach and provide a baseline comparison of the two for the time-series classification domain, where such studies are almost absent.
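As an illustration only, since the thesis's exact architectures and hyperparameters are not given in this summary, a minimal PyTorch sketch of the two kinds of shallow models being compared, a 1D CNN and an LSTM classifier, might look like this (layer sizes are assumptions):

```python
# Illustrative sketch of a shallow 1D-CNN and a shallow LSTM classifier
# for multivariate time series; not the thesis's actual architectures.
import torch
import torch.nn as nn

class ShallowCNN(nn.Module):
    def __init__(self, n_channels, n_classes, hidden=64, kernel_size=5):
        super().__init__()
        self.conv = nn.Conv1d(n_channels, hidden, kernel_size, padding=kernel_size // 2)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):             # x: (batch, channels, time)
        h = torch.relu(self.conv(x))  # convolve along the time axis
        h = h.mean(dim=-1)            # global average pooling over time
        return self.head(h)

class ShallowLSTM(nn.Module):
    def __init__(self, n_channels, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):             # x: (batch, time, channels)
        _, (h_n, _) = self.lstm(x)    # last hidden state summarises the sequence
        return self.head(h_n[-1])
```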

 

To do so, we created an easily extensible system for running experiments and evaluated our models on three different datasets using cross-validation. We classify depressed patients using motor activity, predict the energy demand of electric vehicles, and classify the readiness of football players. The system was used to evaluate the CNN and LSTM separately on each dataset and generalises to multiple neural network models for similar comparative studies. We show that a simple CNN achieves the same performance as the LSTM and is faster to train. For two of our use cases, the CNN is more than 30 times faster in wall-clock time, but there is a trade-off between training time and iterations, since the CNN uses more training iterations. We conclude that for time-series classification, CNNs should be preferred over LSTMs because of their effectiveness in both performance and faster training.
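A hedged sketch of the evaluation protocol described above: k-fold cross-validation on a dataset, recording accuracy, wall-clock training time, and the number of training iterations for each model. The helpers `train_model` and `evaluate` are hypothetical placeholders, not code from the thesis:

```python
# Sketch of comparing several models with cross-validation while timing training.
import time
import numpy as np
from sklearn.model_selection import KFold

def compare(models, X, y, n_splits=5):
    """models: dict mapping a name to a zero-argument model constructor."""
    results = {name: {"acc": [], "seconds": [], "iters": []} for name in models}
    for train_idx, test_idx in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        for name, build in models.items():
            model = build()                                  # fresh model per fold
            start = time.time()
            iters = train_model(model, X[train_idx], y[train_idx])  # hypothetical helper
            results[name]["seconds"].append(time.time() - start)
            results[name]["iters"].append(iters)
            results[name]["acc"].append(evaluate(model, X[test_idx], y[test_idx]))  # hypothetical helper
    # average the per-fold numbers for each model
    return {name: {k: float(np.mean(v)) for k, v in r.items()} for name, r in results.items()}
```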

 

In recent years, research in machine intelligence has gained increased momentum, where neural network models have made significant contributions in various fields, like image classification and language understanding. Recurrent neural networks (RNNs) are often the preferred approach for tasks like language understanding and time-series analysis. However, a known problem is their inefficiency in capturing long-term dependencies, giving rise to alternative RNNs. Long short-term memory (LSTM) and gated recurrent units (GRU) solve this problem, but at the expense of computational effort. As a result, convolutional neural networks (CNNs) have been explored for sequence modelling in recent years and shown to outperform RNNs in general. This efficiency, however, is examined only by a few comparative studies, most of which primarily focus on language tasks. Similar studies are far scarcer in the time-series classification domain, where traditional methods are often used. To address this shortcoming and further understand the effects of CNNs and RNNs in the time-series classification domain, we evaluate two shallow networks in this thesis, a CNN and an LSTM. We extend the few existing comparisons through an experimental approach and provide a baseline comparison of both for the time-series classification domain, where such studies are almost absent. To do so, we created an easily extensible system for running experiments and evaluated our models on three different datasets using cross-validation. We classify depressed patients using motor activity, predict the energy demand of Electric Vehicles (EVs) and classify readiness of football players. The system was used to evaluate the CNN and LSTM separately for each dataset and is generalisable to multiple neural network models that can be used for similar comparative studies. We show that a simple CNN achieves the same performance as the LSTM and is faster to train. For two of our use cases, the CNN is more than 30 times faster in terms of seconds used, but we see a trade-off between training time in seconds and iterations, as the CNN uses more training iterations. We conclude that for time-series classification, CNNs should be the preferred choice over LSTMs, because of their effectiveness in performance and faster training.

 

1. Introduction

2. Project Background

3. Methodology

4. Experiments

5. Conclusion






https://wap.sciencenet.cn/blog-69686-1278564.html
