Search results for: encoder neural networks
Number of results: 643221
One of the challenges in modeling cognitive events from electroencephalogram (EEG) data is finding representations that are invariant to inter- and intra-subject differences, as well as the inherent noise associated with EEG data collection. Herein, we explore the capabilities of the recent deep neural architectures for modeling cognitive events from EEG data. In this paper, we present recent ach...
Current end-to-end machine reading and question answering (Q&A) models are primarily based on recurrent neural networks (RNNs) with attention. Despite their success, these models are often slow for both training and inference due to the sequential nature of RNNs. We propose a new Q&A architecture that does not require recurrent networks: Its encoder consists exclusively of convolution and self-...
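For context, the kind of recurrence-free encoder described above can be sketched as a block that combines a depthwise-separable convolution (for local structure) with multi-head self-attention (for global interactions). The layer sizes, normalization placement, and class name below are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class ConvSelfAttentionBlock(nn.Module):
    """Recurrence-free encoder block: convolution for local context,
    self-attention for global context (hyperparameters are illustrative)."""
    def __init__(self, d_model=128, kernel_size=7, n_heads=8):
        super().__init__()
        # Depthwise-separable 1D convolution over the sequence dimension
        self.depthwise = nn.Conv1d(d_model, d_model, kernel_size,
                                   padding=kernel_size // 2, groups=d_model)
        self.pointwise = nn.Conv1d(d_model, d_model, 1)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):                      # x: (batch, seq_len, d_model)
        # Convolution sub-layer with residual connection
        y = self.pointwise(self.depthwise(self.norm1(x).transpose(1, 2)))
        x = x + y.transpose(1, 2)
        # Self-attention sub-layer with residual connection
        y, _ = self.attn(self.norm2(x), self.norm2(x), self.norm2(x))
        return x + y

tokens = torch.randn(4, 50, 128)               # toy batch of embedded tokens
print(ConvSelfAttentionBlock()(tokens).shape)  # torch.Size([4, 50, 128])
```

Because nothing in the block is sequential over time, the whole sequence can be processed in parallel, which is the speed argument the snippet makes against RNN encoders.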
This paper presents a character-level encoder-decoder modeling method for question answering (QA) from large-scale knowledge bases (KB). This method improves the existing approach [9] in three aspects. First, long short-term memory (LSTM) structures are adopted to replace the convolutional neural networks (CNN) for encoding the candidate entities and predicates. Second, a new strategy of gene...
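A character-level LSTM encoder of the kind this snippet describes can be sketched roughly as follows: each candidate entity or predicate string is read character by character and the LSTM's final hidden state serves as its fixed-size representation. The vocabulary, dimensions, and padding scheme below are assumptions for illustration.

```python
import torch
import torch.nn as nn

class CharLSTMEncoder(nn.Module):
    """Encode a string character-by-character with an LSTM; the final hidden
    state is used as the fixed-size representation (sizes are illustrative)."""
    def __init__(self, n_chars=128, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_chars, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

    def forward(self, char_ids):                # (batch, max_len) int64 ids
        _, (h_n, _) = self.lstm(self.embed(char_ids))
        return h_n[-1]                          # (batch, hidden_dim)

# Toy usage: encode the ASCII characters of two hypothetical candidate predicates
strings = ["birth_place", "author_of"]
max_len = max(len(s) for s in strings)
ids = torch.tensor([[ord(c) for c in s] + [0] * (max_len - len(s))
                    for s in strings])
print(CharLSTMEncoder()(ids).shape)             # torch.Size([2, 64])
```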
This paper extends fully-convolutional neural networks (FCN) for the clothing parsing problem. Clothing parsing requires higher-level knowledge of clothing semantics and contextual cues to disambiguate fine-grained categories. We extend the FCN architecture with a side-branch network, which we refer to as the outfit encoder, to predict a consistent set of clothing labels, encouraging combinatorial preference, ...
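One plausible reading of the side-branch outfit encoder is an image-level multi-label head that shares the FCN backbone with the per-pixel parsing head: backbone features are globally pooled and mapped to garment-presence logits. The sketch below follows that reading; the toy backbone, sizes, and names are assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

class FCNWithOutfitBranch(nn.Module):
    """Per-pixel parsing head plus a side branch predicting which clothing
    labels appear in the image (rough sketch; the backbone is a placeholder)."""
    def __init__(self, n_labels=20):
        super().__init__()
        self.backbone = nn.Sequential(            # stand-in for an FCN backbone
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU())
        self.pixel_head = nn.Conv2d(64, n_labels, 1)   # per-pixel label scores
        self.outfit_head = nn.Linear(64, n_labels)     # image-level label set

    def forward(self, img):                       # (batch, 3, H, W)
        feat = self.backbone(img)
        seg = self.pixel_head(feat)                # (batch, n_labels, H, W)
        pooled = feat.mean(dim=(2, 3))             # global average pooling
        outfit = self.outfit_head(pooled)          # multi-label logits
        return seg, outfit

img = torch.randn(2, 3, 64, 48)
seg, outfit = FCNWithOutfitBranch()(img)
print(seg.shape, outfit.shape)                    # (2, 20, 64, 48) (2, 20)
```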
Limits on synaptic efficiency are characteristic of biological neural networks. In this paper, weight limitation constraints are applied to the spike time error-backpropagation (SpikeProp) algorithm for temporally encoded networks of spiking neurons. A novel solution to the problem raised by non-firing neurons is presented which makes the learning algorithm converge reliably and efficiently. In...
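A weight-limitation constraint of the kind mentioned above is commonly imposed by clamping synaptic weights to a bounded range after each learning step. The sketch below shows that generic pattern on top of a plain gradient update; it is not the SpikeProp-specific rule from the paper, and the bounds and learning rate are arbitrary.

```python
import numpy as np

def limited_weight_update(w, grad, lr=0.01, w_min=0.0, w_max=1.0):
    """Gradient step followed by a hard clamp, mimicking a bound on synaptic
    efficacy (one simple way to impose weight limitation; the paper's exact
    constraint may differ)."""
    w = w - lr * grad
    return np.clip(w, w_min, w_max)

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 1.0, size=(5, 3))        # synaptic weights
grad = rng.normal(size=(5, 3))                # error gradient (e.g. from SpikeProp)
w = limited_weight_update(w, grad)
print(w.min() >= 0.0 and w.max() <= 1.0)      # True: weights stay within limits
```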
Unsupervised image translation, which aims at translating between two independent sets of images, is challenging because the correct correspondences must be discovered without paired data. Existing works build upon the Generative Adversarial Network (GAN) such that the distribution of the translated images is indistinguishable from the distribution of the target set. However, such set-level constraints cannot learn t...
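The set-level constraint referred to here is the standard adversarial objective: a discriminator sees only unpaired samples, so it can push translated images toward the target distribution but says nothing about instance-level correspondences. A minimal sketch of that objective, with a placeholder discriminator, might look like this.

```python
import torch
import torch.nn as nn

# Set-level (distribution-matching) constraint: the discriminator D only sees
# unpaired samples, so it enforces that translated images look like the target
# set without knowing which source image maps to which target image.
bce = nn.BCEWithLogitsLoss()

def discriminator_loss(D, real_target, translated):
    real_logits = D(real_target)
    fake_logits = D(translated.detach())
    return (bce(real_logits, torch.ones_like(real_logits)) +
            bce(fake_logits, torch.zeros_like(fake_logits)))

def generator_loss(D, translated):
    fake_logits = D(translated)
    return bce(fake_logits, torch.ones_like(fake_logits))

# Toy usage with a trivial stand-in discriminator
D = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 1))
real = torch.randn(4, 3, 32, 32)
fake = torch.randn(4, 3, 32, 32, requires_grad=True)
print(discriminator_loss(D, real, fake).item(), generator_loss(D, fake).item())
```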
When a vehicle travels on a road, different parts of the vehicle vibrate because of road roughness. This paper proposes a method to predict road roughness from vertical acceleration using neural networks. To this end, the suspension system and road roughness are first expressed mathematically. The suspension system model is then identified using neural networks. The results of this step sho...
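A minimal sketch of the identification/prediction step could be a small feedforward network regressing a roughness value from a window of vertical-acceleration samples. The window length, layer sizes, and synthetic data below are assumptions purely for illustration, not the paper's setup.

```python
import torch
import torch.nn as nn

# Feedforward regressor: a window of vertical acceleration -> roughness estimate
window = 50
model = nn.Sequential(nn.Linear(window, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

accel = torch.randn(256, window)           # synthetic acceleration windows
roughness = torch.randn(256, 1)            # synthetic roughness targets

for _ in range(100):                       # short illustrative training loop
    optimizer.zero_grad()
    loss = loss_fn(model(accel), roughness)
    loss.backward()
    optimizer.step()

print(loss.item())                         # final training loss on the toy data
```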
Time series account for a large proportion of the data stored in financial, medical and scientific databases. The efficient storage of time series is therefore important in practical applications. In this paper, we propose a novel lossy compression scheme for time series. The encoder and decoder are both composed of recurrent neural networks (RNNs) such as long short-term memory (LSTM). There is an autoen...
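An LSTM autoencoder for this purpose can be sketched as an encoder that squeezes the series into a short latent code and a decoder that reconstructs the series from it; a real lossy codec would additionally quantize and entropy-code the latent, which is omitted here. Sizes and names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    """Compress a time series into a small latent code with an LSTM encoder
    and reconstruct it with an LSTM decoder (quantization/entropy coding of a
    real lossy codec omitted; sizes are illustrative)."""
    def __init__(self, code_dim=16, hidden_dim=64):
        super().__init__()
        self.encoder = nn.LSTM(1, hidden_dim, batch_first=True)
        self.to_code = nn.Linear(hidden_dim, code_dim)
        self.from_code = nn.Linear(code_dim, hidden_dim)
        self.decoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, 1)

    def forward(self, x):                       # x: (batch, seq_len, 1)
        _, (h, _) = self.encoder(x)
        code = self.to_code(h[-1])               # the compressed representation
        dec_in = self.from_code(code).unsqueeze(1).repeat(1, x.size(1), 1)
        y, _ = self.decoder(dec_in)
        return self.out(y), code

series = torch.randn(8, 120, 1)
recon, code = LSTMAutoencoder()(series)
print(recon.shape, code.shape)                   # (8, 120, 1) (8, 16)
```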
In this paper we present an extension of our previously described neural machine translation-based system for punctuated transcription. This extension allows the system to map from per-frame acoustic features to word-level representations by replacing the traditional encoder in the encoder-decoder architecture with a hierarchical encoder. Furthermore, we show that a system combining lexical and...
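A hierarchical encoder of the kind described can be sketched as a lower LSTM over acoustic frames whose states are pooled within each word segment, followed by an upper LSTM over the resulting word-level vectors. The sketch below assumes word boundaries are given; dimensions and names are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    """Lower LSTM over acoustic frames, mean-pooled per word segment, upper
    LSTM over the resulting word-level vectors (boundaries assumed given)."""
    def __init__(self, feat_dim=40, hidden_dim=128):
        super().__init__()
        self.frame_lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.word_lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)

    def forward(self, frames, boundaries):       # frames: (1, n_frames, feat_dim)
        frame_states, _ = self.frame_lstm(frames)
        # Pool frame states inside each (start, end) word segment
        word_vecs = torch.stack([frame_states[0, s:e].mean(dim=0)
                                 for s, e in boundaries]).unsqueeze(0)
        word_states, _ = self.word_lstm(word_vecs)
        return word_states                        # (1, n_words, hidden_dim)

frames = torch.randn(1, 30, 40)                   # 30 toy acoustic frames
print(HierarchicalEncoder()(frames, [(0, 10), (10, 22), (22, 30)]).shape)
# torch.Size([1, 3, 128])
```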