  Gradient based hyperparameter optimization in Echo State Networks

Thiede, L. A., & Parlitz, U. (2019). Gradient based hyperparameter optimization in Echo State Networks. Neural Networks, 115, 23-29. doi:10.1016/j.neunet.2019.02.001.

Item Permalink: http://hdl.handle.net/21.11116/0000-0003-99C3-4
Version Permalink: http://hdl.handle.net/21.11116/0000-0003-99C4-3
Genre: Journal Article


Creators

Thiede, L. A., Author
Parlitz, Ulrich (1), Author
Affiliations:
(1) Research Group Biomedical Physics, Max Planck Institute for Dynamics and Self-Organization, Max Planck Society, ou_2063288

Content

Free keywords: Echo State Network; Hyperparameters; Reservoir computing
Abstract: Like most machine learning algorithms, Echo State Networks possess several hyperparameters that have to be carefully tuned to achieve the best performance. For minimizing the error on a specific task, we present a gradient-based optimization algorithm for the input scaling, the spectral radius, the leaking rate, and the regularization parameter.
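The abstract names four hyperparameters. As a hedged illustration of where each one enters an Echo State Network (this is NOT the paper's gradient-based tuning algorithm, only a conventional ESN with a ridge-regression readout; the function name `esn_fit_predict` and all numeric values are illustrative assumptions):

```python
import numpy as np

def esn_fit_predict(u, washout=100, n_res=200, input_scaling=0.5,
                    spectral_radius=0.9, leaking_rate=0.3, ridge=1e-6, seed=0):
    """One-step-ahead prediction of a scalar series with a minimal ESN.

    The four hyperparameters named in the abstract appear as arguments:
    input_scaling, spectral_radius, leaking_rate, and ridge (regularization).
    """
    rng = np.random.default_rng(seed)
    W_in = input_scaling * rng.uniform(-1.0, 1.0, size=n_res)    # input weights
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))              # reservoir weights
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # rescale spectral radius

    # Drive the reservoir with leaky-integrator updates
    x = np.zeros(n_res)
    states = []
    for t in range(len(u) - 1):
        x = (1 - leaking_rate) * x + leaking_rate * np.tanh(W_in * u[t] + W @ x)
        states.append(x)
    X = np.array(states[washout:])   # collected reservoir states, shape (T, n_res)
    y = u[washout + 1:]              # targets: the next value of the series

    # Linear readout trained by ridge (Tikhonov) regression
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    return X @ W_out, y

# Fit to a sine wave and measure the one-step prediction error
u = np.sin(0.1 * np.arange(2000))
pred, target = esn_fit_predict(u)
mse = np.mean((pred - target) ** 2)
```

In the paper's setting, the gradient of the task error with respect to these four scalars is what gets optimized; in this sketch they are simply fixed keyword arguments.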

Details

Language(s): eng - English
 Dates: 2019-03-08, 2019-07
 Publication Status: Published in print
 Rev. Method: Peer
 Identifiers: DOI: 10.1016/j.neunet.2019.02.001


Source 1

Title: Neural Networks
Source Genre: Journal
Volume / Issue: 115
Start / End Page: 23 - 29