Search results for: encoder neural networks
Number of results: 643,221
Although neural machine translation (NMT) with the encoder-decoder framework has achieved great success in recent times, it still suffers from some drawbacks: RNNs tend to forget old information that is often useful, and the encoder operates over words without considering the relationships between them. To solve these problems, we introduce a relation network (RN) into NMT to refine the encoding re...
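As a rough illustration of how a relation network can refine RNN encoder states, the sketch below (Python/NumPy, with made-up shapes and parameter names, not the paper's actual model) augments each position with an aggregate of its pairwise relations to every other position:

```python
# Hypothetical relation-network refinement of encoder states (illustrative only).
import numpy as np

def relation_refine(H, W_g, W_f):
    """H: (T, d) encoder states; returns refined states of shape (T, d_out)."""
    T, d = H.shape
    refined = []
    for i in range(T):
        # pairwise relation g([h_i; h_j]) aggregated over all positions j
        pairs = np.concatenate([np.repeat(H[i:i+1], T, axis=0), H], axis=1)  # (T, 2d)
        rel = np.tanh(pairs @ W_g).mean(axis=0)                              # (d_g,)
        refined.append(np.tanh(np.concatenate([H[i], rel]) @ W_f))
    return np.stack(refined)

# toy usage with random parameters
T, d, d_g, d_out = 5, 8, 16, 8
H = np.random.randn(T, d)
W_g = np.random.randn(2 * d, d_g) * 0.1
W_f = np.random.randn(d + d_g, d_out) * 0.1
print(relation_refine(H, W_g, W_f).shape)  # (5, 8)
```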
We propose the Neural Responding Machine (NRM), a neural-network-based response generator for Short-Text Conversation. NRM adopts the general encoder-decoder framework: it formalizes the generation of a response as a decoding process based on the latent representation of the input text, while both encoding and decoding are realized with recurrent neural networks (RNNs). The NRM is trained with a large ...
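The general encoder-decoder scheme described here can be illustrated with a minimal NumPy sketch: a vanilla RNN encodes the input tokens into a latent vector, and another RNN greedily decodes a response from it. All names and sizes are illustrative, not the NRM implementation:

```python
# Minimal encoder-decoder sketch with vanilla RNN cells (illustrative, not NRM).
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    return np.tanh(x @ Wx + h @ Wh + b)

def encode(tokens, E, Wx, Wh, b):
    h = np.zeros(Wh.shape[0])
    for t in tokens:                      # tokens are integer word ids
        h = rnn_step(E[t], h, Wx, Wh, b)
    return h                              # latent representation of the input text

def decode(h, E, Wx, Wh, b, Wo, bos=0, eos=1, max_len=10):
    out, prev = [], bos
    for _ in range(max_len):
        h = rnn_step(E[prev], h, Wx, Wh, b)
        prev = int(np.argmax(h @ Wo))     # greedy choice of the next word
        if prev == eos:
            break
        out.append(prev)
    return out

V, d = 20, 16
E = np.random.randn(V, d) * 0.1
Wx, Wh, b = np.random.randn(d, d) * 0.1, np.random.randn(d, d) * 0.1, np.zeros(d)
Wo = np.random.randn(d, V) * 0.1
print(decode(encode([3, 7, 5], E, Wx, Wh, b), E, Wx, Wh, b, Wo))
```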
Deep Neural Networks trained as image auto-encoders have recently emerged as a promising direction for advancing the state of the art in image compression. The key challenge in learning such networks is twofold: to deal with quantization, and to control the trade-off between reconstruction error (distortion) and entropy (rate) of the latent image representation. In this paper, we focus on the l...
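The rate-distortion trade-off mentioned above can be written as a single loss. The sketch below uses reconstruction MSE as the distortion, the empirical entropy of a quantized latent code as the rate, and the common uniform-noise proxy for rounding during training; this is a generic illustration, not necessarily this paper's formulation:

```python
# Generic rate-distortion objective for a learned image auto-encoder (illustrative).
import numpy as np

def quantize_train(z, rng):
    return z + rng.uniform(-0.5, 0.5, size=z.shape)   # differentiable proxy for rounding

def entropy_bits(z_hat):
    symbols, counts = np.unique(np.round(z_hat), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())              # empirical entropy of the code

def rd_loss(x, x_rec, z_hat, lam=0.01):
    distortion = float(np.mean((x - x_rec) ** 2))       # reconstruction error
    rate = entropy_bits(z_hat)                          # code length estimate
    return distortion + lam * rate                      # lam trades rate vs. distortion

rng = np.random.default_rng(0)
x = rng.random((8, 8))
z = rng.normal(size=32)                                 # stand-in for an encoder output
z_hat = quantize_train(z, rng)
x_rec = x + rng.normal(scale=0.05, size=x.shape)        # stand-in for a decoder output
print(rd_loss(x, x_rec, z_hat))
```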
In this paper we consider the problem of estimating a dense depth map from a set of sparse LiDAR points. We use techniques from compressed sensing and the recently developed Alternating Direction Neural Networks (ADNNs) to create a deep recurrent auto-encoder for this task. Our architecture internally performs an algorithm for extracting multi-level convolutional sparse codes from the input whi...
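As a simplified stand-in for the unrolled sparse-coding computation such architectures perform, the sketch below runs plain ISTA iterations to extract a sparse code z with D @ z ≈ x; ADNNs unroll an ADMM-style alternating scheme, so this is only a rough relative, with illustrative dimensions:

```python
# ISTA sparse coding as a simplified stand-in for unrolled ADMM/ADNN layers.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(x, D, lam=0.1, n_iter=100):
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the data-fit gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)
        z = soft_threshold(z - grad / L, lam / L)
    return z

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 128)) / 8.0       # overcomplete dictionary
z_true = np.zeros(128)
z_true[rng.choice(128, 5, replace=False)] = 1.0
x = D @ z_true
z = ista(x, D)
print(np.count_nonzero(np.abs(z) > 1e-3))  # the recovered code is sparse
```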
Neural machine translation is a recently proposed approach that has shown results competitive with traditional MT approaches. Standard neural MT is an end-to-end neural network where the source sentence is encoded by a recurrent neural network (RNN) called the encoder and the target words are predicted using another RNN known as the decoder. Recently, various models have been proposed which replace the ...
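A minimal sketch of how such a decoder RNN is typically trained: teacher-forced steps in which each target word's probability comes from a softmax over the decoder state, minimizing cross-entropy. Parameter names and shapes are illustrative, not any particular system's:

```python
# Teacher-forced decoder loss for an encoder-decoder NMT model (illustrative).
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def decoder_nll(src_vec, tgt_ids, E, Wx, Wh, b, Wo):
    """Negative log-likelihood of the target sentence given the encoded source."""
    h, nll, prev = src_vec, 0.0, 0          # 0 = <bos>
    for y in tgt_ids:
        h = np.tanh(E[prev] @ Wx + h @ Wh + b)
        p = softmax(h @ Wo)
        nll -= np.log(p[y] + 1e-12)
        prev = y                            # teacher forcing: feed the gold word
    return nll

V, d = 30, 16
rng = np.random.default_rng(0)
E, Wo = rng.normal(size=(V, d)) * 0.1, rng.normal(size=(d, V)) * 0.1
Wx, Wh, b = rng.normal(size=(d, d)) * 0.1, rng.normal(size=(d, d)) * 0.1, np.zeros(d)
print(decoder_nll(rng.normal(size=d), [4, 9, 2], E, Wx, Wh, b, Wo))
```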
We describe the Alopex algorithm as a universal learning algorithm for neural networks. The algorithm is stochastic and can be used for learning in networks of any topology, including those with feedback. The neurons can contain any transfer function, and the learning can involve minimization of any error measure. The efficacy of the algorithm is investigated by applying it to multilayer ...
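A simplified sketch of the Alopex-style update: each weight moves in steps of ±δ, and the direction for the next step is chosen stochastically from the correlation between the previous weight change and the change in error, which is why any error measure and topology can be used. The temperature handling here (set to the mean absolute correlation) is a simplification of the published annealing scheme:

```python
# Simplified Alopex: correlation-driven stochastic weight perturbation.
import numpy as np

def alopex_minimize(error_fn, w, delta=0.01, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    dw = rng.choice([-delta, delta], size=w.shape)
    prev_err = error_fn(w)
    for _ in range(n_iter):
        w = w + dw
        err = error_fn(w)
        corr = dw * (err - prev_err)                 # per-weight correlation with error change
        T = np.mean(np.abs(corr)) + 1e-12            # crude annealing: mean |correlation|
        p_flip = 1.0 / (1.0 + np.exp(-corr / T))     # positive correlation => likely flip
        flip = rng.random(size=w.shape) < p_flip
        dw = np.where(flip, -dw, dw)
        prev_err = err
    return w

# toy usage: tune the weights of a single tanh unit against noiseless targets
rng = np.random.default_rng(1)
X, w_true = rng.normal(size=(50, 3)), np.array([1.0, -2.0, 0.5])
y = np.tanh(X @ w_true)
err = lambda w: float(np.mean((np.tanh(X @ w) - y) ** 2))
w_hat = alopex_minimize(err, np.zeros(3))
print(err(w_hat))
```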
The attention-based encoder-decoder is an effective architecture for neural machine translation (NMT); it typically relies on recurrent neural networks (RNNs) to build the blocks that are later attended to by the decoder during the decoding process. This design of encoder yields a relatively uniform composition of the source sentence, despite the gating mechanism employed in the encoding RNN. On the...
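The attentive-reader step referred to above can be sketched in a few lines: at each decoding step the decoder state scores every encoder state, and a softmax over the scores weights the context vector. Dot-product scoring is assumed here; the models in question often use an additive (MLP) scorer instead:

```python
# Dot-product attention over encoder states (illustrative scoring choice).
import numpy as np

def attend(dec_state, enc_states):
    """dec_state: (d,), enc_states: (T, d) -> context (d,), weights (T,)."""
    scores = enc_states @ dec_state              # one score per source position
    w = np.exp(scores - scores.max())
    w /= w.sum()                                 # attention weights over source words
    return w @ enc_states, w

rng = np.random.default_rng(0)
H = rng.normal(size=(6, 8))                      # encoder states for a 6-word source
s = rng.normal(size=8)                           # current decoder state
context, weights = attend(s, H)
print(weights.round(3), context.shape)
```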
Machine-learning-based anomaly detection approaches have long training and validation cycles. With IoT devices rapidly proliferating, training models on a per-device basis is impractical. This work explores the "transferability" of a pre-trained autoencoder model across devices of similar and different nature. We hypothesized that devices of a similar nature would share the high-level feature characteristics represented by the initial layers of the autoen...
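A minimal sketch of the transfer setup being described: the initial (encoder) layers of an autoencoder trained on one device are copied into the model for another device and kept frozen, so only the remaining layers need training. Layer sizes and the two-layer transfer depth are illustrative assumptions:

```python
# Transferring the initial layers of a pre-trained autoencoder to a new device's model.
import numpy as np

def make_autoencoder(sizes, rng):
    return [rng.normal(scale=0.1, size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def forward(x, layers):
    for W in layers:
        x = np.tanh(x @ W)
    return x

rng = np.random.default_rng(0)
sizes = [32, 16, 8, 16, 32]                      # encoder: 32->16->8, decoder: 8->16->32
source_model = make_autoencoder(sizes, rng)      # stands in for a model trained on device A

target_model = make_autoencoder(sizes, rng)      # fresh model for device B
n_transfer = 2                                   # reuse the first two (encoder) layers
for i in range(n_transfer):
    target_model[i] = source_model[i].copy()     # transferred and kept frozen
trainable = target_model[n_transfer:]            # only these would be updated on device B

x = rng.normal(size=32)
print(forward(x, target_model).shape)
```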
A time-domain analog spatial compressed sensing encoder for neural recording applications is proposed. Owing to advances in MEMS technologies, the number of channels on a silicon neural probe array has doubled roughly every 7.4 years, and therefore a greater number of recording channels and a higher density of front-end circuitry are required. Since neural signals such as action potentials (APs) have wid...
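Spatial compressed sensing of this kind can be sketched in software terms: samples from many recording channels are mixed by a random ±1 measurement matrix into far fewer outputs, reducing the data that must leave the probe, with sparse recovery done off-chip. The dimensions below are illustrative, not the circuit's actual parameters:

```python
# Spatial compressed sensing across recording channels (illustrative dimensions).
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_measurements = 128, 32             # 4x compression across channels
Phi = rng.choice([-1.0, 1.0], size=(n_measurements, n_channels))  # random measurement matrix

x = np.zeros(n_channels)                         # one time sample across all channels
x[rng.choice(n_channels, 4, replace=False)] = rng.normal(size=4)  # few channels active at once

y = Phi @ x                                      # compressed measurement sent off-chip
print(x.size, "->", y.size)
```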