[2020-2024, continuously updated] A roundup of Echo State Network and reservoir computing papers: classic ESN, DeepESN, hybrid ESN, and surveys

This post collects Echo State Network (ESN) and reservoir computing papers from 2020 onward, organized into surveys, model variants, open-source work, and applications, and is updated on an ongoing basis.

Keywords: ESN, Echo State Network, Reservoir Computing
Last updated: 2024

Contents

  • 1 Surveys
  • 2 ESN Model Taxonomy
    • 2.1 ESN
    • 2.2 DeepESN
    • 2.3 Hybrid ESN
  • 3 Papers with Open-Source Code
  • 4 Reservoir Computing Research
  • 5 Applications

1 Surveys

  1. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  2. Sun, Chenxi et al. “A Systematic Review of Echo State Networks From Design to Application.” IEEE Transactions on Artificial Intelligence 5 (2024): 23-37.
  3. Soltani, Rebh et al. “Echo State Network Optimization: A Systematic Literature Review.” Neural Processing Letters 55 (2023): 10251-10285.
  4. Xu Y. A review of machine learning with echo state networks[J]. Proj. Rep, 2020.
  5. Margin D A, Dobrota V. Overview of Echo State Networks using Different Reservoirs and Activation Functions[C]//2021 20th RoEduNet Conference: Networking in Education and Research (RoEduNet). IEEE, 2021: 1-6.
  6. Sun, Chenxi et al. “A Review of Designs and Applications of Echo State Networks.” ArXiv abs/2012.02974 (2020): n. pag.

2 ESN Model Taxonomy

2.1 ESN

A typical ESN consists of an input layer, a recurrent layer (the reservoir, a large pool of sparsely connected neurons), and an output layer. This subsection collects papers on the classic ESN and on structural improvements to it.
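To make this input-reservoir-readout structure concrete, here is a minimal NumPy sketch of our own (not taken from any paper listed here; all function names and hyper-parameter values are illustrative assumptions). The reservoir weights stay fixed after random initialization, and only the linear readout is trained, by ridge regression:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, density=0.1):
    """Random sparse reservoir, rescaled to the desired spectral radius."""
    W = rng.standard_normal((n_res, n_res)) * (rng.random((n_res, n_res)) < density)
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    return W_in, W

def run_reservoir(W_in, W, U, leak=1.0):
    """Collect reservoir states for an input sequence U of shape (T, n_in)."""
    x = np.zeros(W.shape[0])
    states = []
    for u in U:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, Y, ridge=1e-6):
    """Ridge-regression readout: the only trained part of an ESN."""
    return np.linalg.solve(states.T @ states + ridge * np.eye(states.shape[1]),
                           states.T @ Y)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
W_in, W = make_reservoir(1, 200)
X = run_reservoir(W_in, W, u[:-1])
W_out = train_readout(X[200:], u[201:])  # discard a 200-step washout
pred = X[200:] @ W_out
```

On this toy task the readout fits one-step-ahead prediction of the sine almost exactly; in practice the spectral radius, leak rate, and ridge penalty are the main hyper-parameters to tune, which is what many of the papers below study.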

  1. Manneschi, Luca et al. “Exploiting Multiple Timescales in Hierarchical Echo State Networks.” Frontiers in Applied Mathematics and Statistics (2021).
  2. Fourati R, Ammar B, Jin Y, et al. EEG feature learning with intrinsic plasticity based deep echo state network[C]//2020 international joint conference on neural networks (IJCNN). IEEE, 2020: 1-8.
  3. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  4. Akrami, Abbas et al. “Design of a reservoir for cloud-enabled echo state network with high clustering coefficient.” EURASIP Journal on Wireless Communications and Networking 2020 (2020): 1-14.
  5. Arroyo, Diana Carolina Roca. “A Modified Echo State Network Model Using Non-Random Topology.” (2023).
  6. Fu, Jun et al. “A double-cycle echo state network topology for time series prediction.” Chaos 33 9 (2023): n. pag.
  7. Yang, Cuili and Zhanhong Wu. “Multi-objective sparse echo state network.” Neural Computing and Applications 35 (2022): 2867-2882.
  8. Tortorella, Domenico et al. “Spectral Bounds for Graph Echo State Network Stability.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-8.
  9. Zheng, Shoujing et al. “Improved Echo State Network With Multiple Activation Functions.” 2022 China Automation Congress (CAC) (2022): 346-350.
  10. Morra, Jacob and Mark Daley. “Imposing Connectome-Derived Topology on an Echo State Network.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-6.
  11. McDaniel, Shane et al. “Investigating Echo State Network Performance with Biologically-Inspired Hierarchical Network Structure.” 2022 International Joint Conference on Neural Networks (IJCNN) (2022): 1-8.
  12. Yao, Xianshuang et al. “A stability criterion for discrete-time fractional-order echo state network and its application.” Soft Computing 25 (2021): 4823 - 4831.
  13. Mu, Xiaohui and Lixiang Li. “Memristor-based Echo State Network and Prediction for Time Series.” 2021 International Conference on Neuromorphic Computing (ICNC) (2021): 153-158.
  14. Mahmoud, Tarek A. and Lamiaa M. Elshenawy. “TSK fuzzy echo state neural network: a hybrid structure for black-box nonlinear systems identification.” Neural Computing and Applications 34 (2022): 7033 - 7051.
  15. Maksymov, Ivan S. et al. “Neural Echo State Network using oscillations of gas bubbles in water: Computational validation by Mackey-Glass time series forecasting.” Physical Review E 105 (2021): 044206.
  16. Wang, Lei et al. “Design of sparse Bayesian echo state network for time series prediction.” Neural Computing and Applications 33 (2020): 7089 - 7102.
  17. Gong, Shangfu et al. “An Improved Small-World Topology for Optimizing the Performance of Echo State Network.” 2020 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom) (2020): 1413-1419.
  18. Iacob, Stefan et al. “Delay-Sensitive Local Plasticity in Echo State Networks.” 2023 International Joint Conference on Neural Networks (IJCNN) (2023): 1-8.
  19. Jordanou, Jean P. et al. “Investigation of Proper Orthogonal Decomposition for Echo State Networks.” Neurocomputing 548 (2022): 126395.
  20. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8. (source code available)
  21. Liu, Junxiu, et al. “Echo state network optimization using binary grey wolf algorithm.” Neurocomputing 385 (2020): 310-318.
  22. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020. (source code available)
  23. Hart, Allen, James Hook, and Jonathan Dawes. “Embedding and approximation theorems for echo state networks.” Neural Networks 128 (2020): 234-247.
  24. Na, Xiaodong, Weijie Ren, and Xinghan Xu. “Hierarchical delay-memory echo state network: A model designed for multi-step chaotic time series prediction.” Engineering Applications of Artificial Intelligence 102 (2021): 104229.

2.2 DeepESN

DeepESN (Deep Echo State Network) stacks multiple reservoirs in a deep-learning-style architecture: an input layer, a hierarchy of stacked reservoir components, and an output layer.
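To make the stacked-reservoir idea concrete, here is a small self-contained sketch (our own illustrative code, not any cited paper's implementation): each reservoir's state sequence drives the next layer, and the readout would see the concatenation of all layers' states.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_layer(n_in, n_res, rho=0.9):
    """One reservoir layer: fixed random input and recurrent weights."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.standard_normal((n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius
    return W_in, W

def run_deep_esn(layers, U, leak=0.3):
    """Drive each reservoir with the previous layer's state sequence and
    return the concatenation of all layers' states."""
    all_states, inp = [], U
    for W_in, W in layers:
        x = np.zeros(W.shape[0])
        states = []
        for u in inp:
            x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
            states.append(x.copy())
        inp = np.array(states)      # this layer's output feeds the next
        all_states.append(inp)
    return np.concatenate(all_states, axis=1)

# Three stacked reservoirs of 100 units each over a scalar input sequence.
layers = [make_layer(1, 100), make_layer(100, 100), make_layer(100, 100)]
U = np.sin(np.linspace(0, 8 * np.pi, 500))[:, None]
S = run_deep_esn(layers, U)  # shape (500, 300)
```

A linear readout trained on S, exactly as in a shallow ESN, closes the model; a recurring finding in the DeepESN literature (e.g. the Gallicchio-Micheli survey above) is that deeper layers tend to develop slower dynamics.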

  1. Bouazizi, Samar et al. “Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network.” JUCS - Journal of Universal Computer Science (2023): n. pag.
  2. Margin, Dan-Andrei et al. “Deep Reservoir Computing using Echo State Networks and Liquid State Machine.” 2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom) (2022): 208-213.
  3. Wang, Yuanhui et al. “A Weight Optimization Method of Deep Echo State Network Based on Improved Knowledge Evolution.” 2022 China Automation Congress (CAC) (2022): 395-400.
  4. Yang, Xiaojian et al. “An improved deep echo state network inspired by tissue-like P system forecasting for non-stationary time series.” Journal of Membrane Computing 4 (2022): 222 - 231.
  5. Kanda, Keiko and Sou Nobukawa. “Feature Extraction Mechanism for Each Layer of Deep Echo State Network.” 2022 International Conference on Emerging Techniques in Computational Intelligence (ICETCI) (2022): 65-70.
  6. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  7. Hu, Ruihan et al. “Ensemble echo network with deep architecture for time-series modeling.” Neural Computing and Applications 33 (2020): 4997 - 5010.
  8. Ma, Qianli, Lifeng Shen, and Garrison W. Cottrell. “DeePr-ESN: A deep projection-encoding echo-state network.” Information Sciences 511 (2020): 152-171.
  9. Song, Zuohua, Keyu Wu, and Jie Shao. “Destination prediction using deep echo state network.” Neurocomputing 406 (2020): 343-353.
  10. Barredo Arrieta, Alejandro, et al. “On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification.” Neural Computing and Applications (2022): 1-21. (source code available)

2.3 Hybrid ESN

Papers that combine ESNs with deep learning, other machine-learning models, or special data structures.

  1. Lien, Justin. “Hypergraph Echo State Network.” ArXiv abs/2310.10177 (2023): n. pag.
  2. Deng, Lichi and Yuewei Pan. “Machine Learning Assisted Closed-Loop Reservoir Management using Echo State Network.” (2020).
  3. Trierweiler Ribeiro, Gabriel, et al. “Bayesian optimized echo state network applied to short-term load forecasting.” Energies 13.9 (2020): 2390.

3 Papers with Open-Source Code

ESN and reservoir computing papers with released source code, not restricted to 2020-2024.

  1. Cerina L, Santambrogio M D, Franco G, et al. EchoBay: design and optimization of echo state networks under memory and time constraints[J]. ACM Transactions on Architecture and Code Optimization (TACO), 2020, 17(3): 1-24.
  2. Lukoševičius M, Uselis A. Efficient implementations of echo state network cross-validation[J]. Cognitive computation, 2021: 1-15.
  3. Sun C, Hong S, Song M, et al. Te-esn: Time encoding echo state network for prediction based on irregularly sampled time series data[J]. arXiv preprint arXiv:2105.00412, 2021.
  4. Özdemir A, Scerri M, Barron A B, et al. EchoVPR: Echo state networks for visual place recognition[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 4520-4527.
  5. Li Z, Liu Y, Tanaka G. Multi-Reservoir Echo State Networks with Hodrick–Prescott Filter for nonlinear time-series prediction[J]. Applied Soft Computing, 2023, 135: 110021.
  6. Barredo Arrieta A, Gil-Lopez S, Laña I, et al. On the post-hoc explainability of deep echo state networks for time series forecasting, image and video classification[J]. Neural Computing and Applications, 2022: 1-21.
  7. Robust optimization and validation of echo state networks for learning chaotic dynamics.
  8. Gallicchio, Claudio and Alessio Micheli. “Deep Echo State Network (DeepESN): A Brief Survey.” ArXiv abs/1712.04323 (2017): n. pag.
  9. Steiner, Peter, Azarakhsh Jalalvand, and Peter Birkholz. “Cluster-based input weight initialization for echo state networks.” IEEE Transactions on Neural Networks and Learning Systems (2022).
  10. Bianchi, Filippo Maria et al. “Bidirectional deep-readout echo state networks.” The European Symposium on Artificial Neural Networks (2017).
  11. Maat, Jacob Reinier et al. “Efficient Optimization of Echo State Networks for Time Series Datasets.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-7.
  12. Heim, Niklas and James E. Avery. “Adaptive Anomaly Detection in Chaotic Time Series with a Spatially Aware Echo State Network.” ArXiv abs/1909.01709 (2019): n. pag.
  13. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  14. Lukoševičius, Mantas and Arnas Uselis. “Efficient Cross-Validation of Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  15. Verzelli, Pietro et al. “Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere.” Scientific Reports 9 (2019): n. pag.
  16. Rodriguez, Nathaniel et al. “Optimal modularity and memory capacity of neural reservoirs.” Network Neuroscience 3 (2017): 551 - 566.
  17. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  18. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  19. Strock, Anthony et al. “A Simple Reservoir Model of Working Memory with Real Values.” 2018 International Joint Conference on Neural Networks (IJCNN) (2018): 1-8.
  20. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  21. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  22. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  23. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag.
  24. Paassen, Benjamin et al. “Tree Echo State Autoencoders with Grammars.” 2020 International Joint Conference on Neural Networks (IJCNN) (2020): 1-8.
  25. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag.
  26. Trouvain, Nathan, et al. “Reservoirpy: an efficient and user-friendly library to design echo state networks.” International Conference on Artificial Neural Networks. Cham: Springer International Publishing, 2020.
  27. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021).
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature Communications 12.1 (2021): 5564.
  29. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  30. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (hands-on tutorial)

4 Reservoir Computing Research

  1. Margin D A, Ivanciu I A, Dobrota V. Deep Reservoir Computing using Echo State Networks and Liquid State Machine[C]//2022 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom). IEEE, 2022: 208-213.
  2. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  3. Chattopadhyay, Ashesh et al. “Data-driven prediction of a multi-scale Lorenz 96 chaotic system using deep learning methods: Reservoir computing, ANN, and RNN-LSTM.” (2019).
  4. Steiner, Peter, et al. “PyRCN: A toolbox for exploration and application of Reservoir Computing Networks.” Engineering Applications of Artificial Intelligence 113 (2022): 104964.
  5. Zhang, Yuanzhao and Sean P. Cornelius. “Catch-22s of reservoir computing.” Physical Review Research (2022): n. pag.
  6. Gallicchio, Claudio and Alessio Micheli. “Reservoir Topology in Deep Echo State Networks.” International Conference on Artificial Neural Networks (2019).
  7. Manjunath, G. “Memory-Loss is Fundamental for Stability and Distinguishes the Echo State Property Threshold in Reservoir Computing & Beyond.” ArXiv abs/2001.00766 (2020): n. pag.
  8. Gonon, Lukas et al. “Infinite-dimensional reservoir computing.” ArXiv abs/2304.00490 (2023): n. pag.
  9. Sun, Xiaochuan et al. “Towards Fault Tolerance of Reservoir Computing in Time Series Prediction.” Inf. 14 (2023): 266.
  10. Lee, Kundo and Tomoki Hamagami. “Reservoir Computing for Scalable Hardware with Block-Based Neural Network.” IEEJ Transactions on Electrical and Electronic Engineering 16 (2021): n. pag.
  11. Ren, Bin and Huanfei Ma. “Global optimization of hyper-parameters in reservoir computing.” Electronic Research Archive (2022): n. pag.
  12. Storm, Lance et al. “Constraints on parameter choices for successful reservoir computing.” ArXiv abs/2206.02575 (2022): n. pag.
  13. Bendali, Wadie et al. “Optimization of Deep Reservoir Computing with Binary Genetic Algorithm for Multi-Time Horizon Forecasting of Power Consumption.” Journal Européen des Systèmes Automatisés (2022): n. pag.
  14. Bacciu, Davide et al. “Federated Reservoir Computing Neural Networks.” 2021 International Joint Conference on Neural Networks (IJCNN) (2021): 1-7.
  15. Mattheakis, Marios et al. “Unsupervised Reservoir Computing for Solving Ordinary Differential Equations.” ArXiv abs/2108.11417 (2021): n. pag. (source code available)
  16. Love, Jake et al. “Task Agnostic Metrics for Reservoir Computing.” ArXiv abs/2108.01512 (2021): n. pag.
  17. Heyder, Florian et al. “Generalizability of reservoir computing for flux-driven two-dimensional convection.” Physical Review E 106 (2021): 055303.
  18. Honda, Hirotada. “A novel framework for reservoir computing with inertial manifolds.” 2021 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) (2021): 347-352.
  19. Hart, Allen G. “(Thesis) Reservoir Computing With Dynamical Systems.” (2021). (nicely visualized)
  20. Doan, Nguyen Anh Khoa et al. “Auto-Encoded Reservoir Computing for Turbulence Learning.” ArXiv abs/2012.10968 (2020): n. pag.
  21. Gallicchio, Claudio et al. “Frontiers in Reservoir Computing.” The European Symposium on Artificial Neural Networks (2020).
  22. Evanusa, Matthew et al. “Hybrid Backpropagation Parallel Reservoir Networks.” ArXiv abs/2010.14611 (2020): n. pag. (source code available)
  23. Kleyko, Denis, et al. “Integer echo state networks: Efficient reservoir computing for digital hardware.” IEEE Transactions on Neural Networks and Learning Systems 33.4 (2020): 1688-1701.
  24. Huhn, Francisco, and Luca Magri. “Gradient-free optimization of chaotic acoustics with reservoir computing.” Physical Review Fluids 7.1 (2022): 014402.
  25. Alomar, Miquel L., et al. “Efficient parallel implementation of reservoir computing systems.” Neural Computing and Applications 32 (2020): 2299-2313.
  26. Manneschi, Luca, Andrew C. Lin, and Eleni Vasilaki. “SpaRCe: Improved learning of reservoir computing systems through sparse representations.” IEEE Transactions on Neural Networks and Learning Systems (2021).
  27. Damicelli, Fabrizio, Claus C. Hilgetag, and Alexandros Goulas. “Brain connectivity meets reservoir computing.” PLoS Computational Biology 18.11 (2022): e1010639.
  28. Gauthier, Daniel J., et al. “Next generation reservoir computing.” Nature Communications 12.1 (2021): 5564. (source code available)
  29. Gallicchio, Claudio. “Sparsity in reservoir computing neural networks.” 2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA). IEEE, 2020.
  30. Vlachas, Pantelis R., et al. “Backpropagation algorithms and reservoir computing in recurrent neural networks for the forecasting of complex spatiotemporal dynamics.” Neural Networks 126 (2020): 191-217.
  31. Cucchi, Matteo, et al. “Hands-on reservoir computing: a tutorial for practical implementation.” Neuromorphic Computing and Engineering 2.3 (2022): 032002. (source code available; hands-on tutorial)
  32. Lim, Soon Hoe, et al. “Predicting critical transitions in multiscale dynamical systems using reservoir computing.” Chaos: An Interdisciplinary Journal of Nonlinear Science 30.12 (2020).

5 Applications

  1. Bouazizi S, Benmohamed E, Ltifi H. Enhancing EEG-based emotion recognition using PSD-Grouped Deep Echo State Network[J]. JUCS: Journal of Universal Computer Science, 2023, 29(10).
  2. Valencia C H, Vellasco M M B R, Figueiredo K. Echo State Networks: Novel reservoir selection and hyperparameter optimization model for time series forecasting[J]. Neurocomputing, 2023, 545: 126317.
  3. Viehweg J, Worthmann K, Mäder P. Parameterizing echo state networks for multi-step time series prediction[J]. Neurocomputing, 2023, 522: 214-228.
  4. Bai, Yu-ting et al. “Nonstationary Time Series Prediction Based on Deep Echo State Network Tuned by Bayesian Optimization.” Mathematics (2023): n. pag.
  5. Bianchi, Filippo Maria et al. “Reservoir Computing Approaches for Representation and Classification of Multivariate Time Series.” IEEE Transactions on Neural Networks and Learning Systems 32 (2018): 2169-2179.
  6. Özdemir, Anil et al. “EchoVPR: Echo State Networks for Visual Place Recognition.” IEEE Robotics and Automation Letters PP (2021): 1-1.
  7. Gao, Ruobin et al. “Dynamic ensemble deep echo state network for significant wave height forecasting.” Applied Energy (2023): n. pag.
  8. Liu, Qianwen et al. “Memory augmented echo state network for time series prediction.” Neural Computing and Applications (2023): 1-16.
  9. Deng, Lichi and Yuewei Pan. “Machine-Learning-Assisted Closed-Loop Reservoir Management Using Echo State Network for Mature Fields under Waterflood.” Spe Reservoir Evaluation & Engineering 23 (2020): n. pag.
  10. Mandal, Swarnendu and Manish Dev Shrimali. “Learning unidirectional coupling using echo-state network.” Physical review. E 107 6-1 (2023): 064205 .
  11. Koprinkova-Hristova, Petia D. et al. “Echo state network for features extraction and segmentation of tomography images.” Computer Science and Information Systems (2023): n. pag.
  12. Soltani, Rebh et al. “Optimized Echo State Network based on PSO and Gradient Descent for Chaotic Time Series Prediction.” 2022 IEEE 34th International Conference on Tools with Artificial Intelligence (ICTAI) (2022): 747-754.
  13. Caremel, Cedric et al. “Echo State Network for Soft Actuator Control.” J. Robotics Mechatronics 34 (2022): 413-421.
  14. Ren, Weijie et al. “Time series prediction based on echo state network tuned by divided adaptive multi-objective differential evolution algorithm.” Soft Computing 25 (2021): 4489 - 4502.
  15. Na, Yongsu et al. “Near real-time predictions of tropical cyclone trajectory and intensity in the northwestern Pacific Ocean using echo state network.” Climate Dynamics 58 (2021): 651 - 667.
  16. Gandhi, Manjunath. “An Echo State Network Imparts a Curve Fitting.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2596-2604.
  17. Jere, Shashank et al. “Channel Equalization Through Reservoir Computing: A Theoretical Perspective.” IEEE Wireless Communications Letters 12 (2023): 774-778.
  18. Jordanou, Jean P. et al. “Echo State Networks for Practical Nonlinear Model Predictive Control of Unknown Dynamic Systems.” IEEE Transactions on Neural Networks and Learning Systems 33 (2021): 2615-2629.
  19. Kim, Taehwan and Brian R. King. “Time series prediction using deep echo state networks.” Neural Computing and Applications (2020): 1-19.
  20. Simov, Kiril Ivanov et al. “A Reservoir Computing Approach to Word Sense Disambiguation.” Cognitive Computation 15 (2020): 1409 - 1418.
  21. Cossu, Andrea, et al. “Continual learning with echo state networks.” arXiv preprint arXiv:2105.07674 (2021). (source code available)
  22. Fourati, Rahma, et al. “EEG feature learning with intrinsic plasticity based deep echo state network.” 2020 international joint conference on neural networks (IJCNN). IEEE, 2020.
  23. Fourati, Rahma, et al. “Unsupervised learning in reservoir computing for eeg-based emotion recognition.” IEEE Transactions on Affective Computing 13.2 (2020): 972-984.




