Search results for: recurrent neural net

Number of results: 508769

1991
Jürgen Schmidhuber

Do you want your neural net algorithm to learn sequences? Do not limit yourself to conventional gradient descent (or approximations thereof). Instead, use your sequence learning algorithm (any will do) to implement the following method for history compression. No matter what your final goals are, train a network to predict its next input from the previous ones. Since only unpredictable inputs c...
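A minimal sketch of the history-compression idea described above, assuming a toy integer sequence and a simple online bigram count table standing in for the trained predictor network: only the inputs the predictor fails to anticipate are passed up to the next level.

```python
import numpy as np

# History compression: learn to predict the next input from the previous one
# and pass only the unpredictable inputs upward. A bigram count table stands
# in for the predictor network (an assumption for illustration only).
def compress_history(seq, n_symbols):
    counts = np.ones((n_symbols, n_symbols))   # Laplace-smoothed bigram counts
    surprising = []                            # inputs the predictor got wrong
    for prev, cur in zip(seq[:-1], seq[1:]):
        predicted = counts[prev].argmax()      # most likely next symbol
        if predicted != cur:                   # unpredictable -> pass upward
            surprising.append(cur)
        counts[prev, cur] += 1                 # online update of the predictor
    return surprising

# A highly repetitive sequence compresses to a much shorter "surprise" stream.
seq = [0, 1, 2, 3] * 50 + [7, 1, 2, 3] * 5
print(len(seq), "->", len(compress_history(seq, n_symbols=8)))
```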

2018
Xiaopeng Li Zhourong Chen Nevin L. Zhang

Sparse connectivity is an important factor behind the success of convolutional neural networks and recurrent neural networks. In this paper, we consider the problem of learning sparse connectivity for feedforward neural networks (FNNs). The key idea is that a unit should be connected to a small number of units at the next level below that are strongly correlated. We use Chow-Liu’s algorithm to ...
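A rough sketch of the connectivity-selection step under stated assumptions: unit activations are given as a matrix, absolute Pearson correlation is used as the pairwise weight in place of the mutual information that Chow-Liu's algorithm actually maximizes, and the maximum-weight spanning tree is computed with scipy.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

# Keep only the edges of a maximum-weight spanning tree over the pairwise
# dependencies between units, as a stand-in for Chow-Liu's algorithm.
def sparse_connections(activations):
    corr = np.abs(np.corrcoef(activations, rowvar=False))   # (n_units, n_units)
    np.fill_diagonal(corr, 0.0)
    # Maximum spanning tree == minimum spanning tree on negated weights.
    tree = minimum_spanning_tree(-corr).toarray()
    return list(zip(*np.nonzero(tree)))

rng = np.random.default_rng(0)
acts = rng.normal(size=(500, 6))
acts[:, 1] = acts[:, 0] + 0.1 * rng.normal(size=500)   # strongly correlated pair
print(sparse_connections(acts))   # the pair (0, 1) should appear among kept edges
```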

Journal: Journal of Artificial Intelligence in Electrical Engineering

The main objective of this paper is to introduce a new intelligent optimization technique that uses a prediction-correction strategy supported by a recurrent neural network for finding a near-optimal solution of a given objective function. Recently there have been attempts to use artificial neural networks (ANNs) in optimization problems, and some types of ANNs such as the Hopfield network and Boltzm...
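The abstract is truncated, so the following is only a generic sketch of recurrent, gradient-flow dynamics used as an optimizer, in the spirit of Hopfield-style optimization networks; the quadratic objective, the step size and the simple predict-then-correct update are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

# Recurrent state update that settles on the minimizer of 0.5 x'Qx + b'x:
# first extrapolate the state (prediction), then take a gradient step (correction).
def rnn_optimise(Q, b, x0, lr=0.05, steps=200):
    grad = lambda x: Q @ x + b
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(steps):
        x_pred = x + (x - x_prev)                  # prediction: extrapolate the state
        x_prev, x = x, x_pred - lr * grad(x_pred)  # correction: gradient step
    return x

Q = np.array([[3.0, 1.0], [1.0, 2.0]])             # positive definite -> unique minimum
b = np.array([-1.0, 0.5])
x_star = rnn_optimise(Q, b, x0=np.zeros(2))
print(x_star, np.linalg.solve(Q, -b))              # compare with the exact minimizer
```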

2005
Colin Molter Utku Salihoglu Hugues Bersini

While perceiving recurrent neural networks as brain-like information storing and retrieving machines, it is fundamental to explore these storing, indexing and retrieving capacities as fully as possible. This paper reviews an efficient Hebbian learning rule used to store both static and cyclic patterns in the dynamical attractors of a Hopfield neural network. A key improvement will be presented which consis...
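A minimal sketch of Hebbian storage and retrieval in a Hopfield network using the standard outer-product rule; the paper's improved rule and its handling of cyclic patterns are not reproduced here, and the pattern sizes are arbitrary.

```python
import numpy as np

def train_hopfield(patterns):
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / n   # Hebbian outer-product rule
    np.fill_diagonal(W, 0.0)                        # no self-connections
    return W

def recall(W, cue, steps=20):
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)                          # synchronous threshold update
        s[s == 0] = 1.0
    return s

rng = np.random.default_rng(1)
patterns = rng.choice([-1.0, 1.0], size=(3, 64))    # three random +/-1 patterns
W = train_hopfield(patterns)
noisy = patterns[0] * rng.choice([1.0, -1.0], size=64, p=[0.8, 0.2])  # ~20% flips
print((recall(W, noisy) == patterns[0]).mean())     # fraction of bits recovered
```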

1993
Pekka Orponen

We prove that polynomial-size discrete synchronous Hopfield networks with hidden units compute exactly the class of Boolean functions PSPACE/poly, i.e., the same functions as are computed by polynomial space-bounded nonuniform Turing machines. As a corollary to the construction, we observe also that networks with polynomially bounded interconnection weights compute exactly the class of functions...

Journal: Neural Computation 2006
Randall D. Beer

A fundamental challenge for any general theory of neural circuits is how to characterize the structure of the space of all possible circuits over a given model neuron. As a first step in this direction, this letter begins a systematic study of the global parameter space structure of continuous-time recurrent neural networks (CTRNNs), a class of neural models that is simple but dynamically unive...
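A short sketch of the standard CTRNN equations the letter studies, tau_i dy_i/dt = -y_i + sum_j w_ij sigma(y_j + theta_j) + I_i, integrated with forward Euler; the two-neuron parameter values below are illustrative, not taken from the paper.

```python
import numpy as np

def simulate_ctrnn(W, theta, tau, I, y0, dt=0.01, steps=5000):
    sigma = lambda x: 1.0 / (1.0 + np.exp(-x))      # logistic activation
    y = y0.copy()
    trace = []
    for _ in range(steps):
        dy = (-y + W @ sigma(y + theta) + I) / tau  # CTRNN vector field
        y = y + dt * dy                             # forward Euler step
        trace.append(y.copy())
    return np.array(trace)

# An illustrative two-neuron configuration with self-excitation and cross-coupling.
W = np.array([[4.5, -1.0], [1.0, 4.5]])
trace = simulate_ctrnn(W, theta=np.array([-2.75, -1.75]),
                       tau=np.array([1.0, 1.0]), I=np.zeros(2),
                       y0=np.array([0.1, 0.1]))
print(trace[-1])   # state after the transient
```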

2017
P. Corbett J. Boyle

Chemical named entity recognition has traditionally been dominated by CRF (Conditional Random Fields)-based approaches, but given the success of the artificial neural network techniques known as "deep learning", we decided to examine them as an alternative to CRFs. We present here three systems. The first system translates the traditional CRF-based idioms into a deep learning framework, using ric...

1994
Michael Gschwind Valentina Salapura Oliver Maischberger

We show how field-programmable gate arrays can be used to efficiently implement neural nets. By implementing the training phase in software and the actual application in hardware, conflicting demands can be met: training benefits from a fast edit-debug cycle, and once the design has stabilized, a hardware implementation results in higher performance. While neural nets have been implemented in hardware...
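A sketch of the software/hardware split described above: weights are learned in floating point in software and then converted to fixed-point integers of the kind a gate-array datapath would use. The 8-bit Q4.4 format and the integer multiply-accumulate are assumptions for illustration, not the representation used in the paper.

```python
import numpy as np

def to_fixed_point(w, frac_bits=4, total_bits=8):
    # Quantize floating-point values to signed fixed point (Q4.4 by default).
    scale = 1 << frac_bits
    lo, hi = -(1 << (total_bits - 1)), (1 << (total_bits - 1)) - 1
    return np.clip(np.round(w * scale), lo, hi).astype(np.int8)

def fixed_point_forward(x_q, w_q, frac_bits=4):
    # Integer multiply-accumulate as the hardware would do it, then rescale
    # the accumulator back into the same fixed-point format.
    acc = x_q.astype(np.int32) @ w_q.astype(np.int32)
    return acc >> frac_bits

w = np.random.default_rng(2).normal(scale=0.5, size=(4, 3))   # "trained" weights
w_q = to_fixed_point(w)
x_q = to_fixed_point(np.array([1.0, 0.5, -0.25, 0.75]))
print(w_q)
print(fixed_point_forward(x_q, w_q))
```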

Ahmad Jafarian Raheleh Jafari Safa Measoomy nia

Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper mainly intends to offer a novel method for finding a solution of a fuzzy equation that supposedly has a real solution. For this purpose, we applied an architecture of fuzzy neural networks such that the corresponding connection weights are real numbers. The ...
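A highly simplified sketch of the idea: search for a real x satisfying a fuzzy linear equation A*x + B = C, with the fuzzy numbers represented by a few alpha-cut intervals. The particular equation, the alpha-cut encoding and the plain gradient-descent update (used here in place of the paper's fuzzy neural network architecture) are illustrative assumptions only.

```python
import numpy as np

# Fuzzy numbers A ("about 2"), B ("about 1"), C ("about 7") as (lower, upper)
# interval endpoints at the alpha-cuts 0.0, 0.5 and 1.0 (rows, top to bottom).
A = np.array([[1.0, 3.0], [1.5, 2.5], [2.0, 2.0]])
B = np.array([[0.0, 2.0], [0.5, 1.5], [1.0, 1.0]])
C = np.array([[3.0, 11.0], [5.0, 9.0], [7.0, 7.0]])

x = 0.0
for _ in range(500):
    r = A * x + B - C                 # endpoint-wise residual (valid for x > 0)
    x -= 0.01 * 2.0 * np.sum(r * A)   # gradient step on the squared residual
print(x)                              # approaches 3, since 2*3 + 1 = 7
```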

Chart of the number of search results per year
