Search results for: neural document embedding

Number of results: 520398

Journal: :IET Software 2021

Classifying test executions automatically as pass or fail remains a key challenge in software testing and is referred to as the oracle problem. Attempts have been made to solve this problem with supervised learning over execution traces. A programme is instrumented to gather traces as sequences of method invocations, and a small fraction of the programme's traces is labelled with verdicts. Execution traces are then embedded into fixed-length vectors using neu...
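
The abstract is truncated, so the following is only a minimal, hypothetical sketch of the general recipe it describes: represent each variable-length trace of method invocations as a fixed-length vector and train a small neural classifier on the labelled fraction. The toy traces, labels, and bag-of-invocations representation are assumptions, not the paper's setup.

    # Hypothetical sketch: pass/fail classification of execution traces from a
    # small labelled subset, using a bag-of-invocations vector representation.
    from sklearn.feature_extraction.text import HashingVectorizer
    from sklearn.neural_network import MLPClassifier

    # Each trace is a space-separated sequence of method names (toy data).
    traces = [
        "open read parse close",
        "open read parse validate close",
        "open read close",
        "open write flush close",
    ]
    labels = ["pass", "pass", "fail", "fail"]  # verdicts for the labelled fraction

    # Fixed-length vectors for variable-length traces.
    vectorizer = HashingVectorizer(n_features=64, alternate_sign=False)
    X = vectorizer.transform(traces)

    # A small feed-forward network stands in for the paper's embedding + classifier.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X, labels)

    print(clf.predict(vectorizer.transform(["open read parse close"])))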

2008
Igor MOKRIŠ Lenka SKOVAJSOVÁ

The aim of this paper is to survey feed-forward and self-organizing neural networks for text document retrieval models, which retrieve text documents in natural language. These models are based on linguistic and conceptual approaches to text document analysis, where the problems of document representation and document database creation are addressed. The proposed structure of the feed-f...

2015
Kuan-Yu Chen Shih-Hung Liu Hsin-Min Wang Berlin Chen Hsin-Hsi Chen

Owing to the rapidly growing multimedia content available on the Internet, extractive spoken document summarization, with the purpose of automatically selecting a set of representative sentences from a spoken document to concisely express the most important theme of the document, has been an active area of research and experimentation. On the other hand, word embedding has emerged as a newly fa...
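
As a rough illustration of embedding-based extractive summarization (the abstract cuts off before the method details), the sketch below ranks sentences by cosine similarity between an averaged sentence vector and the averaged document vector. The toy vocabulary and random vectors stand in for real pre-trained word embeddings.

    # Hypothetical sketch of embedding-based extractive summarization: select the
    # sentences whose averaged word vectors are closest to the document vector.
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy word embeddings; in practice these would be pre-trained vectors.
    vocab = {w: rng.normal(size=50) for w in
             "the spoken document is summarized by selecting representative sentences".split()}

    def embed(text):
        vecs = [vocab[w] for w in text.split() if w in vocab]
        return np.mean(vecs, axis=0)

    sentences = ["the spoken document is summarized",
                 "representative sentences are selected",
                 "the document is spoken"]
    doc_vec = embed(" ".join(sentences))

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    ranked = sorted(sentences, key=lambda s: cosine(embed(s), doc_vec), reverse=True)
    print(ranked[:2])  # a two-sentence extractive summary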

2006
Nitin N. Pise

The paper starts with the need for classification. Then the reasons why neural networks are suitable for document classification are explained. The paper continues with the details of the most commonly used topologically organized network model proposed by Kohonen (1982), referred to as the self-organizing map (SOM). The general idea proposed is to display the contents of a document library by ...
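
A Kohonen SOM can be written from scratch in a few lines; the sketch below arranges TF-IDF document vectors on a small 2-D grid. The grid size, learning-rate schedule, and toy documents are illustrative assumptions rather than the paper's configuration.

    # A small from-scratch SOM (Kohonen map) sketch for mapping documents to grid cells.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = ["neural network training", "document retrieval model",
            "self organizing map", "text document classification"]
    X = TfidfVectorizer().fit_transform(docs).toarray()

    rng = np.random.default_rng(0)
    grid_w, grid_h, dim = 3, 3, X.shape[1]
    weights = rng.normal(size=(grid_w * grid_h, dim))          # one prototype per cell
    coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], float)

    for t in range(200):                                       # online training
        lr, sigma = 0.5 * (1 - t / 200), 1.5 * (1 - t / 200) + 0.1
        x = X[rng.integers(len(X))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))      # best matching unit
        dist = ((coords - coords[bmu]) ** 2).sum(axis=1)
        influence = np.exp(-dist / (2 * sigma ** 2))[:, None]
        weights += lr * influence * (x - weights)              # pull neighbourhood toward x

    for d, v in zip(docs, X):                                  # map each document to a cell
        print(d, "->", coords[np.argmin(((weights - v) ** 2).sum(axis=1))])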

2010
Lenka Skovajsová Igor Mokriš

The paper deals with text document retrieval from a given document collection by using neural networks, namely a cascade neural network model, linear and nonlinear Hebbian neural networks, and a linear autoassociative neural network. By using neural networks it is possible to reduce the dimensionality of the search space while preserving the highest retrieval accuracy.
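
Since a linear autoassociative network trained to reconstruct its input performs a PCA-like projection, the hedged sketch below uses truncated SVD (LSA) as a stand-in to illustrate retrieval in a reduced space; it is not the paper's cascade or Hebbian model, and the toy documents and query are assumptions.

    # Stand-in sketch: retrieve documents in a reduced space built by truncated SVD,
    # which plays the role a linear autoassociative network would play in the paper.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    docs = ["neural networks for text retrieval",
            "hebbian learning reduces dimensionality",
            "cascade neural network model",
            "autoassociative networks reconstruct inputs"]
    query = ["dimensionality reduction with neural networks"]

    vec = TfidfVectorizer().fit(docs)
    svd = TruncatedSVD(n_components=2, random_state=0).fit(vec.transform(docs))

    D = svd.transform(vec.transform(docs))          # documents in the reduced space
    q = svd.transform(vec.transform(query))         # query in the same space
    scores = cosine_similarity(q, D)[0]
    print(sorted(zip(scores, docs), reverse=True)[0])   # best-matching document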

2016
Ivan Vulic Anna Korhonen

A shared bilingual word embedding space (SBWES) is an indispensable resource in a variety of cross-language NLP and IR tasks. A common approach to the SBWES induction is to learn a mapping function between monolingual semantic spaces, where the mapping critically relies on a seed word lexicon used in the learning process. In this work, we analyze the importance and properties of seed lexicons f...
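
A common concrete instance of such a mapping is the orthogonal Procrustes solution fitted on seed translation pairs; the sketch below uses toy random vectors in place of real monolingual embeddings and illustrates the general idea rather than the authors' exact method.

    # Hedged sketch of seed-lexicon-based mapping between monolingual embedding spaces:
    # learn an orthogonal transform on translation pairs and project source words into
    # the target space.
    import numpy as np
    from scipy.linalg import orthogonal_procrustes

    rng = np.random.default_rng(0)
    dim, n_seed = 50, 200
    X_src = rng.normal(size=(n_seed, dim))          # source-language vectors (seed lexicon)
    R_true = np.linalg.qr(rng.normal(size=(dim, dim)))[0]
    X_tgt = X_src @ R_true + 0.01 * rng.normal(size=(n_seed, dim))  # their "translations"

    R, _ = orthogonal_procrustes(X_src, X_tgt)      # mapping learned from the seed lexicon
    mapped = X_src @ R                              # source words in the shared target space
    print(np.linalg.norm(mapped - X_tgt) / np.linalg.norm(X_tgt))   # small residual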

Journal: :Proceedings of the AAAI Conference on Artificial Intelligence 2019

Journal: :Data Science and Engineering 2023

Network embedding aims to map the nodes of a network to low-dimensional vector representations. Graph neural networks (GNNs) have received much attention and achieved state-of-the-art performance in learning node representations. Using fundamental sociological theories (status theory and balance theory) to model signed networks, and basing GNNs on them, has become a hot topic in signed network embedding. However, most GNNs fail to use e...
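
The following is a deliberately simplified, hypothetical sketch of one signed-GNN layer in the spirit of balance theory: positive and negative neighbours are aggregated separately and combined with the node's own features. The edge list, feature sizes, and single-layer design are assumptions for illustration only, not the paper's architecture.

    # Toy single layer of a signed GNN: aggregate positive and negative neighbours
    # separately, concatenate with the node's own features, then transform.
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, dim = 5, 8
    X = rng.normal(size=(n_nodes, dim))             # initial node features
    edges = [(0, 1, +1), (1, 2, -1), (2, 3, +1), (3, 4, -1), (4, 0, +1)]  # (src, dst, sign)

    def aggregate(node, sign):
        nbrs = [X[v] for u, v, s in edges if u == node and s == sign]
        return np.mean(nbrs, axis=0) if nbrs else np.zeros(dim)

    W = rng.normal(size=(3 * dim, dim))             # transform for [self | pos | neg]
    H = np.zeros((n_nodes, dim))
    for i in range(n_nodes):
        z = np.concatenate([X[i], aggregate(i, +1), aggregate(i, -1)])
        H[i] = np.maximum(z @ W, 0.0)               # ReLU

    print(H.shape)  # new node representations, one row per node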

Journal: :Journal of Machine Learning Research 2017
Stanislas Lauly Yin Zheng Alexandre Allauzen Hugo Larochelle

We present an approach based on feed-forward neural networks for learning the distribution of textual documents. This approach is inspired by the Neural Autoregressive Distribution Estimator (NADE) model, which has been shown to be a good estimator of the distribution of discrete-valued high-dimensional vectors. In this paper, we present how NADE can successfully be adapted to the case of textu...
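
For context, a NADE-style document model factorizes the probability of a word sequence autoregressively, with each word conditioned on the preceding ones through a shared hidden state. The numpy sketch below illustrates that factorization with toy, randomly initialized parameters; the shapes and the sigmoid/softmax choices are assumptions, not the paper's exact formulation.

    # Minimal sketch of an autoregressive document model, p(v) = prod_i p(v_i | v_<i),
    # where the hidden state for position i accumulates embeddings of preceding words.
    import numpy as np

    rng = np.random.default_rng(0)
    V, H = 100, 16                                 # vocabulary size, hidden units
    W = rng.normal(scale=0.1, size=(H, V))         # input embeddings (one column per word)
    U = rng.normal(scale=0.1, size=(V, H))         # output weights
    b, c = np.zeros(V), np.zeros(H)

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def log_likelihood(doc):
        """doc: list of word indices; returns log p(doc) under the toy model."""
        ll, acc = 0.0, np.zeros(H)
        for v_i in doc:
            h = 1.0 / (1.0 + np.exp(-(c + acc)))   # hidden state from preceding words
            ll += np.log(softmax(b + U @ h)[v_i])  # p(v_i | v_<i)
            acc += W[:, v_i]                       # add current word's embedding
        return ll

    print(log_likelihood([3, 17, 42, 7]))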

Chart: number of search results per year