Search results for: recurrent neural network
Number of results: 942527
This paper presents our Recurrent Control Neural Network (RCNN), a model-based approach to data-efficient modelling and control of reinforcement learning problems in discrete time. Its architecture is based on a recurrent neural network (RNN), which is extended by an additional control network. The latter has the particular task of learning the optimal policy. This method has the advan...
A procedure that defines values of constraint weight parameters of single-layer relaxation-type recurrent neural networks for establishing stability of all solutions for an optimization problem is introduced. Application to the Traveling Salesman optimization problem, using the discrete dynamics Hopfield network as the recurrent neural network algorithm, is shown to illustrate the procedure. Si...
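The abstract above refers to the discrete-dynamics Hopfield network. As a minimal sketch of that update rule (with illustrative Hebbian weights rather than the paper's constraint-derived weights), each unit takes the sign of its weighted input minus a threshold:

```python
import numpy as np

def hopfield_step(s, W, theta):
    """One synchronous update of a discrete Hopfield network:
    s_i <- sign(sum_j W_ij s_j - theta_i).
    W and theta here are illustrative, not the paper's constraint-derived values."""
    return np.where(W @ s - theta >= 0, 1, -1)

# Toy usage: Hebbian weights storing one pattern, recovered from a noisy state.
p = np.array([1, -1, 1, -1])
W = np.outer(p, p) - np.eye(4, dtype=int)  # zero diagonal
noisy = np.array([1, -1, 1, 1])            # one bit flipped
print(hopfield_step(noisy, W, np.zeros(4)))  # [ 1 -1  1 -1]
```

For the TSP application described in the abstract, the weights would instead encode tour-validity constraints, which is exactly what the proposed procedure sets.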
This paper explores the possibility of using multiplicative gates to build two recurrent neural network structures. These two structures, called the Deep Simple Gated Unit (DSGU) and the Simple Gated Unit (SGU), are designed for learning long-term dependencies. Compared to the traditional Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), both structures require fewer parameters and le...
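The multiplicative gating the abstract refers to is the mechanism used by units such as the GRU. A minimal single-step sketch of a standard GRU cell (parameter names `Wz`, `Uz`, etc. are our own, not from the paper):

```python
import numpy as np

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One step of a standard GRU cell. The update gate z and reset gate r
    are the multiplicative gates that let the unit keep or overwrite state."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h)               # update gate
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde           # gated interpolation

# Toy usage with random weights.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
params = [rng.standard_normal((d_h, d_in)) if i % 2 == 0
          else rng.standard_normal((d_h, d_h)) for i in range(6)]
h = gru_cell(rng.standard_normal(d_in), np.zeros(d_h), *params)
print(h.shape)  # (4,)
```

DSGU and SGU, per the abstract, achieve similar gating with fewer parameters than this.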
An efficient algorithm for recurrent neural network training is presented. The approach increases the training speed for tasks where a length of the input sequence may vary significantly. The proposed approach is based on the optimal batch bucketing by input sequence length and data parallelization on multiple graphical processing units. The baseline training performance without sequence bucket...
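The bucketing idea described above can be sketched in a few lines: sort sequences by length, then batch consecutive runs so padding is bounded by the in-batch length spread rather than the global maximum (batch size and data here are illustrative):

```python
def bucket_by_length(sequences, batch_size):
    """Group sequences into batches of similar length to reduce padding."""
    order = sorted(range(len(sequences)), key=lambda i: len(sequences[i]))
    return [[sequences[i] for i in order[k:k + batch_size]]
            for k in range(0, len(sequences), batch_size)]

# Toy usage: sequences of lengths 7, 2, 9, 3, 5, 8 in batches of 2.
seqs = [[0] * n for n in (7, 2, 9, 3, 5, 8)]
batches = bucket_by_length(seqs, 2)
print([[len(s) for s in b] for b in batches])  # [[2, 3], [5, 7], [8, 9]]
```

The paper's contribution is an optimal choice of bucket boundaries combined with multi-GPU data parallelism; this sketch shows only the basic grouping.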
This work presents two different translation models using recurrent neural networks. The first is a word-based approach using word alignments. Second, we present phrase-based translation models that are more consistent with phrase-based decoding. Moreover, we introduce bidirectional recurrent neural models to the problem of machine translation, allowing us to use the full source sentence in ...
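The bidirectional idea mentioned above can be sketched generically: run any RNN cell left-to-right and right-to-left, then concatenate the two states at each position so every position sees the full source sentence (the cell and dimensions below are illustrative):

```python
import numpy as np

def bidirectional_rnn(xs, step, h0):
    """Encode a sequence with forward and backward passes of an RNN cell
    `step(x, h) -> h`, concatenating the states at each position."""
    fwd, h = [], h0.copy()
    for x in xs:
        h = step(x, h)
        fwd.append(h)
    bwd, h = [], h0.copy()
    for x in reversed(xs):
        h = step(x, h)
        bwd.append(h)
    bwd.reverse()
    return [np.concatenate([f, b]) for f, b in zip(fwd, bwd)]

# Toy usage with a trivial tanh cell.
states = bidirectional_rnn([np.ones(2), -np.ones(2), np.ones(2)],
                           lambda x, h: np.tanh(x + h), np.zeros(2))
print(len(states), states[0].shape)  # 3 (4,)
```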
Recurrent neural networks have been within the scope of the machine learning community for many years. In the current paper we discuss the Historical Consistent Recurrent Neural Network and its extension to the complex-valued case. We give some insights into complex-valued backpropagation and its application to complex-valued recurrent neural network training. Finally we present the results for the ...
In this work, we investigate the memory capability of recurrent neural networks (RNNs), where this capability is defined as a function that maps an element in a sequence to the current output. We first analyze the system function of a recurrent neural network (RNN) cell, and provide analytical results for three RNNs. They are the simple recurrent neural network (SRN), the long short-term memory...
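The simple recurrent network (SRN) cell that the abstract analyzes has the classic Elman form h_t = tanh(W_x x_t + W_h h_{t-1} + b); its hidden state is the nonlinear summary of the sequence whose memory capability is being studied. A minimal sketch (weights and dimensions are our own illustration):

```python
import numpy as np

def srn_step(x, h, W_x, W_h, b):
    """One Elman/SRN step: h_t = tanh(W_x x_t + W_h h_{t-1} + b)."""
    return np.tanh(W_x @ x + W_h @ h + b)

def run_srn(xs, W_x, W_h, b):
    """Fold a sequence into the final hidden state, the cell's 'memory'."""
    h = np.zeros(W_h.shape[0])
    for x in xs:
        h = srn_step(x, h, W_x, W_h, b)
    return h

# Toy usage with random weights over a length-5 sequence.
rng = np.random.default_rng(1)
W_x, W_h, b = rng.standard_normal((4, 3)), rng.standard_normal((4, 4)), np.zeros(4)
h = run_srn([rng.standard_normal(3) for _ in range(5)], W_x, W_h, b)
print(h.shape)  # (4,)
```

The LSTM the abstract goes on to compare adds gated cell state on top of this basic recurrence.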
Convolutional and bidirectional recurrent neural networks have achieved considerable performance gains as acoustic models in automatic speech recognition in recent years. Latest architectures unify long short-term memory, gated recurrent unit and convolutional neural networks by stacking these different neural network types on each other, and providing short and long-term features to different ...
An adaptive input-output linearization method for general nonlinear systems is developed without using the states of the system. Another key feature of this structure is that it does not need a model of the system. In this scheme, the neurolinearizer has few weights, so it is practical in adaptive situations. Online training of the neurolinearizer is compared to model predictive recurrent training...
This paper proposes a discrete recurrent neural network model to implement the winner-take-all function. The network model has a simple organization and clear dynamic behaviour. The dynamic properties of the proposed winner-take-all networks are studied in detail. Simulation results are given to show the network performance. Since the network model is formulated as a discrete-time system, it has advan...
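Winner-take-all dynamics of the kind the abstract describes can be sketched with discrete-time self-excitation and mutual inhibition; the parameters below are illustrative, not the paper's exact model:

```python
import numpy as np

def winner_take_all(u, self_w=1.0, inhib=0.2, steps=50):
    """Discrete-time WTA iteration: each unit keeps its own activity and
    inhibits the others; all but the largest input decay to zero."""
    u = np.array(u, dtype=float)
    for _ in range(steps):
        total = u.sum()
        u = np.maximum(0.0, self_w * u - inhib * (total - u))
    return u

# Toy usage: the unit with the largest input (index 1) wins.
out = winner_take_all([0.3, 0.9, 0.5])
print(out.argmax())  # 1
```

Because the update is a discrete-time map, it runs in simple synchronous steps, which is the implementation advantage the abstract points to.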