Search results for: encoder neural networks
Number of results: 643,221
Neural machine translation is a relatively new approach to statistical machine translation based purely on neural networks. The neural machine translation models often consist of an encoder and a decoder. The encoder extracts a fixed-length representation from a variable-length input sentence, and the decoder generates a correct translation from this representation. In this paper, we focus on a...
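Below is a minimal sketch of such an encoder-decoder, assuming a GRU-based implementation in PyTorch; the vocabulary sizes, dimensions, and the `EncoderDecoder` name are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Minimal GRU encoder-decoder: the encoder compresses a variable-length
    source sentence into a fixed-length vector that conditions the decoder."""
    def __init__(self, src_vocab, tgt_vocab, emb=128, hid=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.proj = nn.Linear(hid, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encoder: the final hidden state is the fixed-length sentence representation.
        _, h = self.encoder(self.src_emb(src_ids))        # h: (1, batch, hid)
        # Decoder: generate target tokens conditioned on that representation.
        out, _ = self.decoder(self.tgt_emb(tgt_ids), h)   # (batch, tgt_len, hid)
        return self.proj(out)                             # logits over target vocab

# Illustrative usage with toy sizes (vocabularies and lengths are assumptions).
model = EncoderDecoder(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (4, 12))   # batch of 4 source sentences, length 12
tgt = torch.randint(0, 1200, (4, 10))   # teacher-forced target inputs, length 10
logits = model(src, tgt)                # (4, 10, 1200)
```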
In this research, an artificial neural network is presented for predicting the safety factor values and the critical factor of safety of non-homogeneous earth dams, taking into account the effect of earthquake inertial force. The model inputs include dam height, upstream slope angle, seismic coefficient, water height, and the strength parameters of the core and shell, and its output is the safety factor. The most important parameter in slope stability analysis is obtaining the factor of safety. In this research ...
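As an illustration only (not the study's actual model), a small feed-forward regression network over these inputs could look like the following PyTorch sketch; the feature count, feature order, and layer sizes are assumptions.

```python
import torch
import torch.nn as nn

# Input features as described in the abstract (order and count are illustrative):
# dam height, upstream slope angle, seismic coefficient, water height,
# core cohesion, core friction angle, shell cohesion, shell friction angle.
N_FEATURES = 8

safety_factor_net = nn.Sequential(
    nn.Linear(N_FEATURES, 32),
    nn.ReLU(),
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, 1),          # regression output: predicted factor of safety
)

x = torch.rand(5, N_FEATURES)          # 5 hypothetical dam configurations
predicted_fs = safety_factor_net(x)    # shape (5, 1)
```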
Generative adversarial networks (GANs) have demonstrated to be successful at generating realistic real-world images. In this paper we compare various GAN techniques, both supervised and unsupervised. The effects on training stability of different objective functions are compared. We add an encoder to the network, making it possible to encode images to the latent space of the GAN. The generator,...
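As a hedged sketch of the general idea (not the paper's architecture), an encoder that maps images into a GAN's latent space could look like this; the 64x64 input size, latent dimension, and layer widths are assumptions.

```python
import torch
import torch.nn as nn

LATENT_DIM = 100  # assumed size of the GAN latent space

class ImageEncoder(nn.Module):
    """Maps a 64x64 RGB image to a point in the generator's latent space,
    so that G(E(x)) can approximate x (reconstruction through the GAN)."""
    def __init__(self, latent_dim=LATENT_DIM):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1),    nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  nn.LeakyReLU(0.2),
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        )
        self.fc = nn.Linear(256 * 8 * 8, latent_dim)

    def forward(self, x):
        h = self.conv(x).flatten(1)
        return self.fc(h)

encoder = ImageEncoder()
images = torch.rand(4, 3, 64, 64)
z = encoder(images)   # (4, 100): latent codes that a generator G can decode
# Training would minimize a reconstruction loss such as ||G(z) - images||.
```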
Reducing the computational complexity is desired in speech coding algorithms. In this paper, three neural gain predictors are proposed which can function as the backward gain adaptation module of the low-delay code-excited linear prediction (LD-CELP) G.728 encoder, recommended by the International Telecommunication Union-Telecom sector (ITU-T, formerly CCITT). Elman, multilayer perceptron (MLP) and fuzzy A...
The digital surface model (DSM) is an important product in the field of photogrammetry and remote sensing and has a variety of applications in this field. Existing techniques require more than one image for DSM extraction, and this paper investigates and analyzes the possibility of DSM extraction from a single satellite image. In this regard, an algorithm based on deep convolutional...
Machine transliteration is the process of automatically transforming the script of a word from a source language to a target language, while preserving pronunciation. Sequence to sequence learning has recently emerged as a new paradigm in supervised learning. In this paper a character-based encoder-decoder model has been proposed that consists of two Recurrent Neural Networks. The encoder is a ...
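A minimal sketch of a character-based encoder-decoder built from two recurrent networks, assuming PyTorch GRUs and greedy decoding; the character inventories, dimensions, and the example word are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

# Character inventories are illustrative; a real system would build them from data.
SRC_CHARS = list("abcdefghijklmnopqrstuvwxyz") + ["<pad>"]
TGT_CHARS = list("ابپتثجچحخدذرزسشصضطظعغفقکگلمنوهی") + ["<sos>", "<eos>", "<pad>"]

class CharSeq2Seq(nn.Module):
    """Two recurrent networks: an encoder RNN reads the source characters and a
    decoder RNN emits target-script characters one at a time."""
    def __init__(self, hid=128):
        super().__init__()
        self.src_emb = nn.Embedding(len(SRC_CHARS), 64)
        self.tgt_emb = nn.Embedding(len(TGT_CHARS), 64)
        self.encoder = nn.GRU(64, hid, batch_first=True)
        self.decoder = nn.GRUCell(64, hid)
        self.out = nn.Linear(hid, len(TGT_CHARS))

    def transliterate(self, src_ids, max_len=20):
        _, h = self.encoder(self.src_emb(src_ids))   # fixed-length source code
        h = h.squeeze(0)
        tok = torch.full((src_ids.size(0),), TGT_CHARS.index("<sos>"), dtype=torch.long)
        result = []
        for _ in range(max_len):                     # greedy decoding
            h = self.decoder(self.tgt_emb(tok), h)
            tok = self.out(h).argmax(dim=-1)
            result.append(tok)
        return torch.stack(result, dim=1)

model = CharSeq2Seq()
word = torch.tensor([[SRC_CHARS.index(c) for c in "salam"]])   # one source word
print(model.transliterate(word).shape)                          # (1, 20) character ids
```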
The problem of extracting buildings from mono optical aerial imagery with high spatial resolution is considered an important challenge in map preparation. The goal of the current research is to take advantage of the semantic segmentation of mono optical aerial imagery to extract buildings, which is realized based on the combination of deep convolutional neural networks (DCNN) an...
Encoder-decoder neural networks have been used for many NLP tasks, such as neural machine translation. They have also been applied to constituent parsing by using bracketed tree structures as a target language, translating input sentences into syntactic trees. A more commonly used method to linearize syntactic trees is the shift-reduce system, which uses a sequence of transition-actions to buil...
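For illustration, one common way to linearize a bracketed tree into transition actions is a top-down NT/SHIFT/REDUCE scheme, sketched below; the exact transition set varies between shift-reduce systems.

```python
def tree_to_actions(tree):
    """Convert a bracketed constituency tree, e.g.
    (S (NP (DT the) (NN cat)) (VP (VBZ sleeps))), into a top-down
    transition sequence of NT(label), SHIFT, and REDUCE actions."""
    tokens = tree.replace("(", " ( ").replace(")", " ) ").split()
    actions, i = [], 0
    while i < len(tokens):
        if tokens[i] == "(":
            label = tokens[i + 1]
            i += 2
            # Pre-terminal like (DT the): treat the word as a single SHIFT.
            if tokens[i] not in ("(", ")"):
                actions.append("SHIFT")   # the terminal word
                i += 2                    # skip the word and its closing ")"
            else:
                actions.append(f"NT({label})")
        elif tokens[i] == ")":
            actions.append("REDUCE")
            i += 1
        else:
            i += 1
    return actions

print(tree_to_actions("(S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))"))
# ['NT(S)', 'NT(NP)', 'SHIFT', 'SHIFT', 'REDUCE', 'NT(VP)', 'SHIFT', 'REDUCE', 'REDUCE']
```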
The variational encoder-decoder (VED) encodes source information as a set of random variables using a neural network, which in turn is decoded into target data using another neural network. In natural language processing, sequence-to-sequence (Seq2Seq) models typically serve as encoder-decoder networks. When combined with a traditional (deterministic) attention mechanism, the variational latent ...
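A minimal sketch of a variational encoder-decoder, assuming a GRU Seq2Seq backbone in PyTorch: the encoder yields a mean and log-variance, a latent sample is drawn with the reparameterization trick, and the decoder conditions on it. Names, sizes, and the shared embedding are assumptions; the (deterministic) attention mechanism mentioned above is omitted for brevity.

```python
import torch
import torch.nn as nn

class VariationalEncoderDecoder(nn.Module):
    """Encode the source into latent random variables (mean and log-variance),
    sample with the reparameterization trick, and decode conditioned on the sample."""
    def __init__(self, vocab=1000, emb=64, hid=128, latent=32):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb)          # shared embedding (assumption)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.to_mu = nn.Linear(hid, latent)
        self.to_logvar = nn.Linear(hid, latent)
        self.latent_to_h = nn.Linear(latent, hid)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)

    def forward(self, src_ids, tgt_ids):
        _, h = self.encoder(self.emb(src_ids))
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        h0 = torch.tanh(self.latent_to_h(z)).unsqueeze(0)
        dec_out, _ = self.decoder(self.emb(tgt_ids), h0)
        logits = self.out(dec_out)
        # KL term regularizes the latent variables toward a standard normal prior.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return logits, kl

model = VariationalEncoderDecoder()
src = torch.randint(0, 1000, (2, 8))
tgt = torch.randint(0, 1000, (2, 6))
logits, kl = model(src, tgt)   # logits: (2, 6, 1000); kl: scalar
```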
We propose a selective encoding model to extend the sequence-to-sequence framework for abstractive sentence summarization. It consists of a sentence encoder, a selective gate network, and an attention equipped decoder. The sentence encoder and decoder are built with recurrent neural networks. The selective gate network constructs a second level sentence representation by controlling the informa...
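A hedged sketch of the selective gate idea: a sigmoid gate computed from each encoder hidden state and a whole-sentence vector rescales the encoder outputs before they are passed to the attention-equipped decoder. The BiGRU encoder, dimensions, and names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SelectiveGate(nn.Module):
    """Build a second-level sentence representation: a sigmoid gate, computed from
    each encoder hidden state and a whole-sentence vector, rescales that state so
    the decoder mainly sees information useful for the summary."""
    def __init__(self, hid=256):
        super().__init__()
        self.w_h = nn.Linear(hid, hid, bias=False)
        self.w_s = nn.Linear(hid, hid, bias=True)

    def forward(self, enc_states, sent_vec):
        # enc_states: (batch, src_len, hid); sent_vec: (batch, hid)
        gate = torch.sigmoid(self.w_h(enc_states) + self.w_s(sent_vec).unsqueeze(1))
        return enc_states * gate   # gated, second-level representation

# Illustrative usage: a BiGRU encoder whose final states form the sentence vector.
encoder = nn.GRU(64, 128, batch_first=True, bidirectional=True)
gate = SelectiveGate(hid=256)
x = torch.rand(3, 15, 64)                   # 3 sentences, 15 token embeddings each
states, h = encoder(x)                      # states: (3, 15, 256)
sent_vec = torch.cat([h[0], h[1]], dim=-1)  # (3, 256): concat fwd/bwd final states
gated_states = gate(states, sent_vec)       # fed to the attention-equipped decoder
```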