Search results for: recurrent neural net

Number of results: 508769

Journal: International Journal of Energy and Environmental Engineering, 2011
Roozbeh Zomorodian Mohsen Rezasoltani Mohammad Bagher Ghofrani

In this paper, the application of neural networks to the simulation and optimization of cogeneration systems is presented. The CGAM problem, a benchmark in cogeneration systems, is chosen as a case study. The thermodynamic model includes precise modeling of the whole plant. For simulation of the steady-state behavior, a static neural network is applied; then, using a dynamic neural network, the plant is...
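The abstract gives no implementation details; the sketch below only illustrates the general idea of a static (feedforward) network used as a surrogate for a steady-state plant model. The toy target function, network size, and training settings are assumptions for illustration, not the CGAM model or the networks from the paper.

```python
import numpy as np

# Toy "plant": steady-state output as a function of two operating inputs
# (a placeholder for a thermodynamic model such as CGAM; purely illustrative).
def plant_steady_state(x):
    return np.sin(x[:, :1]) + 0.5 * x[:, 1:2] ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 2))    # sampled operating points
Y = plant_steady_state(X)                # simulated plant responses

# One-hidden-layer static (feedforward) network as a surrogate model.
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05

for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)             # hidden activations
    P = H @ W2 + b2                      # network prediction
    err = P - Y
    # Backpropagation for the squared-error loss.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("surrogate MSE:", float(((P - Y) ** 2).mean()))
```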

2018
Valentin Khrulkov Alexander Novikov

Deep neural networks are surprisingly efficient at solving practical tasks, but the theory behind this phenomenon is only starting to catch up with the practice. Numerous works show that depth is the key to this efficiency. A certain class of deep convolutional networks – namely those that correspond to the Hierarchical Tucker (HT) tensor decomposition – has been proven to have exponentially hi...

1994
Sreerupa Das Michael C. Mozer

Although recurrent neural nets have been moderately successful in learning to emulate finite-state machines (FSMs), the continuous internal state dynamics of a neural net are not well matched to the discrete behavior of an FSM. We describe an architecture, called DOLCE, that allows discrete states to evolve in a net as learning progresses. DOLCE consists of a standard recurrent neural net trained...
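As a rough illustration of the idea of pulling continuous hidden states toward a small set of discrete states, here is a minimal sketch in which each recurrent update is snapped to the nearest of a few candidate state vectors. The weights and centroids are random placeholders, not DOLCE's learned parameters or its actual clustering procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random placeholder parameters for a tiny vanilla RNN (2 inputs, 4 hidden units).
Wx = rng.normal(0, 1, (2, 4))
Wh = rng.normal(0, 1, (4, 4))
b = np.zeros(4)

# A handful of candidate discrete states (stand-ins for the centroids a
# DOLCE-style clustering mechanism would learn during training).
centroids = rng.uniform(-1, 1, (3, 4))

def step(h, x, quantize=True):
    """One recurrent update, optionally snapped to the nearest discrete state."""
    h_cont = np.tanh(x @ Wx + h @ Wh + b)
    if not quantize:
        return h_cont
    nearest = np.argmin(((centroids - h_cont) ** 2).sum(axis=1))
    return centroids[nearest]

# Run a short symbol sequence (one-hot encoded) through the net and record
# which discrete state it visits -- a crude FSM-like trace.
symbols = [0, 1, 1, 0, 1]
h = np.zeros(4)
trace = []
for s in symbols:
    h = step(h, np.eye(2)[s])
    trace.append(int(np.argmin(((centroids - h) ** 2).sum(axis=1))))
print("discrete state trace:", trace)
```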

1993
Sreerupa Das Michael C. Mozer

Although recurrent neural nets have been moderately successful in learning to emulate finite-state machines (FSMs), the continuous internal state dynamics of a neural net are not well matched to the discrete behavior of an FSM. We describe an architecture, called DOLCE, that allows discrete states to evolve in a net as learning progresses. DOLCE consists of a standard recurrent neural net train...

1998
Sepp Hochreiter

Recurrent nets are in principle capable of storing past inputs to produce the currently desired output. This property of recurrent nets is used in time series prediction and process control. Practical applications involve temporal dependencies spanning many time steps between relevant inputs and desired outputs. In this case, however, gradient descent learning methods take too much time. The learning ...
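The difficulty with long time lags can be illustrated numerically: in a plain tanh recurrence the Jacobian of each step is diag(1 - h^2) @ W, so the gradient linking distant time steps is a long product of such factors and tends to shrink geometrically. A minimal sketch, with an arbitrarily chosen weight matrix of spectral norm below 1 (an assumption picked to make the effect visible, not a value from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Recurrent weight matrix rescaled to spectral norm 0.9.
W = rng.normal(0, 1, (8, 8))
W *= 0.9 / np.linalg.svd(W, compute_uv=False)[0]

h = rng.normal(0, 1, 8)
grad = np.eye(8)                      # d h_t / d h_0, accumulated step by step
norms = []
for t in range(100):
    h = np.tanh(W @ h)
    # Jacobian of one tanh recurrence: diag(1 - h^2) @ W
    grad = np.diag(1 - h ** 2) @ W @ grad
    norms.append(np.linalg.norm(grad))

print("|dh_10/dh_0| ~ %.2e,  |dh_100/dh_0| ~ %.2e" % (norms[9], norms[-1]))
```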

Journal: :CoRR 2017
Heng Wang Zengchang Qin Tao Wan

In this paper, we propose a model using a generative adversarial net (GAN) to generate realistic text. Instead of using a standard GAN, we combine a variational autoencoder (VAE) with a generative adversarial net. The use of high-level latent random variables helps to learn the data distribution and solves the problem that a generative adversarial net always emits similar data. We propose the VGA...
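A minimal sketch of how a VAE objective and a GAN objective can be combined for sequence generation, using generic modules and a crude pooled representation purely for illustration; none of the module sizes, pooling choices, or loss weights are taken from the paper.

```python
import torch
import torch.nn as nn

# Toy dimensions -- illustrative assumptions, not the paper's configuration.
vocab, emb, latent = 100, 32, 16

class Encoder(nn.Module):               # q(z | x): sequence -> latent Gaussian
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(emb, 32, batch_first=True)
        self.mu = nn.Linear(32, latent)
        self.logvar = nn.Linear(32, latent)
    def forward(self, x):
        _, h = self.rnn(x)
        h = h.squeeze(0)
        return self.mu(h), self.logvar(h)

enc = Encoder()
decoder = nn.Sequential(nn.Linear(latent, 64), nn.ReLU(), nn.Linear(64, emb))
discriminator = nn.Sequential(nn.Linear(emb, 64), nn.ReLU(), nn.Linear(64, 1))
bce = nn.BCEWithLogitsLoss()

# One illustrative forward pass on random "embedded text".
x = torch.randn(8, 20, emb)                              # batch of embedded sequences
mu, logvar = enc(x)
z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()     # reparameterization trick
fake = decoder(z)                                        # generated representation
real = x.mean(dim=1)                                     # crude pooled "real" representation

# Generator side: reconstruction + KL (the VAE part) + adversarial term.
recon = ((fake - real) ** 2).mean()
kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).mean()
adv_g = bce(discriminator(fake), torch.ones(8, 1))
loss_g = recon + kl + adv_g

# Discriminator side: tell pooled real sequences apart from generated output.
adv_d = bce(discriminator(real), torch.ones(8, 1)) + \
        bce(discriminator(fake.detach()), torch.zeros(8, 1))
print(float(loss_g), float(adv_d))
```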

2000
Stanislav Kurkovsky Rasiah Loganantharaj

In this paper we study how the framework of Petri nets can be extended and applied to the study of recurrent events. We use possibility theory to realistically model the temporal properties of the recurrent processes modeled by an extended Petri net. Such temporal properties include timestamps stored in tokens and the durations of transition firings. We apply our method to model the recurrent behavi...
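A minimal sketch of the data structure this setup suggests: tokens that carry timestamps and transitions that take time to fire. Crisp numbers stand in here for the possibilistic (fuzzy) time intervals of the paper, and the example net is invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    timestamp: float            # when the token became available

@dataclass
class Transition:
    name: str
    inputs: list                # names of input places
    outputs: list               # names of output places
    duration: float             # firing duration (crisp stand-in for a fuzzy value)

@dataclass
class PetriNet:
    places: dict = field(default_factory=dict)      # place name -> list of Tokens
    def enabled(self, t: Transition) -> bool:
        return all(self.places.get(p) for p in t.inputs)
    def fire(self, t: Transition):
        # Consume one token per input place; firing starts once all are present.
        start = max(self.places[p][0].timestamp for p in t.inputs)
        for p in t.inputs:
            self.places[p].pop(0)
        for p in t.outputs:
            self.places.setdefault(p, []).append(Token(start + t.duration))

# A two-step recurrent process: "work" repeats by feeding a token back to "ready".
net = PetriNet({"ready": [Token(0.0)]})
work = Transition("work", inputs=["ready"], outputs=["done"], duration=2.0)
reset = Transition("reset", inputs=["done"], outputs=["ready"], duration=0.5)
for _ in range(3):
    if net.enabled(work):
        net.fire(work)
    if net.enabled(reset):
        net.fire(reset)
print("next cycle starts at t =", net.places["ready"][0].timestamp)
```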

Journal: :CoRR 2018
Md. Zahangir Alom Mahmudul Hasan Chris Yakopcic Tarek M. Taha Vijayan K. Asari

Deep learning (DL) based semantic segmentation methods have been providing state-of-the-art performance in the last few years. More specifically, these techniques have been successfully applied to medical image classification, segmentation, and detection tasks. One deep learning architecture, U-Net, has become one of the most popular for these applications. In this paper, we propose a Recurrent Co...
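A minimal sketch of a recurrent convolutional block of the kind used in recurrent U-Net variants: the same convolution is applied repeatedly, with each pass re-using the block input. Channel counts and the number of recurrence steps are arbitrary choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

class RecurrentConvBlock(nn.Module):
    """Recurrent convolutional block: one shared convolution applied several
    times, each pass refining features while re-injecting the block input."""
    def __init__(self, channels: int, steps: int = 2):
        super().__init__()
        self.steps = steps
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x)
        for _ in range(self.steps):
            h = self.conv(x + h)       # recurrent refinement of the features
        return h

# Example: one block applied to a dummy 16-channel feature map.
block = RecurrentConvBlock(channels=16)
features = block(torch.randn(1, 16, 64, 64))
print(features.shape)                  # torch.Size([1, 16, 64, 64])
```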

2017
Vincent Pollet Enrico Zovato Sufian Irhimeh Pier Domenico Batzu

Bidirectional recurrent neural nets have demonstrated state-of-the-art performance for parametric speech synthesis. In this paper, we introduce a top-down application of recurrent neural net models to unit-selection synthesis. A hierarchical cascaded network graph predicts context phone duration, speech unit encoding and frame-level log-F0 information that serves as targets for the search of unit...
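A minimal sketch of the underlying setup: a bidirectional recurrent net mapping frame-level linguistic features to acoustic targets such as log-F0. Feature sizes and layer widths are illustrative assumptions; this is not the hierarchical cascaded graph described in the paper.

```python
import torch
import torch.nn as nn

class BiRNNAcousticModel(nn.Module):
    def __init__(self, n_linguistic: int = 40, n_targets: int = 1):
        super().__init__()
        self.rnn = nn.LSTM(n_linguistic, 64, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * 64, n_targets)   # 2x for both directions
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.rnn(x)          # (batch, frames, 2 * hidden)
        return self.out(h)          # per-frame target, e.g. log-F0

model = BiRNNAcousticModel()
frames = torch.randn(4, 200, 40)    # 4 utterances, 200 frames, 40 features each
logf0 = model(frames)
print(logf0.shape)                   # torch.Size([4, 200, 1])
```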

1995
Jozef Sajda

A hybrid recurrent neural network is shown to learn small initial Mealy machines (which can be thought of as translation machines translating input strings to corresponding output strings, as opposed to recognition automata that classify strings as either grammatical or non-grammatical) from positive training samples. A well-trained neural net is then presented once again with the training set ...
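A Mealy machine in the "translation machine" sense can be written down in a few lines; the toy machine below is invented for illustration and is not one of the machines learned in the paper.

```python
class MealyMachine:
    """String-to-string translator: each transition consumes one input symbol
    and emits one output symbol."""
    def __init__(self, start, transitions):
        # transitions: (state, input_symbol) -> (next_state, output_symbol)
        self.start = start
        self.transitions = transitions
    def translate(self, inputs: str) -> str:
        state, out = self.start, []
        for sym in inputs:
            state, emitted = self.transitions[(state, sym)]
            out.append(emitted)
        return "".join(out)

# Two-state machine that outputs '1' whenever the current input repeats the
# previous symbol (the initial state pretends the previous symbol was '0').
m = MealyMachine("seen0", {
    ("seen0", "0"): ("seen0", "1"),
    ("seen0", "1"): ("seen1", "0"),
    ("seen1", "0"): ("seen0", "0"),
    ("seen1", "1"): ("seen1", "1"),
})
print(m.translate("00110"))   # -> "11010"
```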
