Search results for: mlff neural network

Number of results: 1,611,241

2016
Asli Eyecioglu Bill Keller

A growing body of research has recently been conducted on semantic textual similarity using a variety of neural network models. While recent research focuses on word-based representation for phrases, sentences and even paragraphs, this study considers an alternative approach based on character n-grams. We generate embeddings for character n-grams using a continuous-bag-of-n-grams neural network...
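A minimal sketch of the general idea (not the authors' exact architecture): character n-grams are extracted with boundary padding and trained in a CBOW-style setup where a target n-gram is predicted from the average of its neighbours' embeddings. The toy corpus, padding character, dimensions, and plain-softmax objective below are all illustrative assumptions.

```python
# Toy continuous-bag-of-n-grams over character n-grams (illustrative only).
import numpy as np

def char_ngrams(text, n=3):
    """Overlapping character n-grams with '_' as a boundary marker."""
    padded = "_" + text + "_"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

corpus = ["neural network", "neural nets", "network models"]   # toy data
grams = [char_ngrams(s) for s in corpus]
vocab = sorted({g for seq in grams for g in seq})
idx = {g: i for i, g in enumerate(vocab)}

rng = np.random.default_rng(0)
dim, V = 16, len(vocab)
E_in = rng.normal(scale=0.1, size=(V, dim))    # context (input) embeddings
E_out = rng.normal(scale=0.1, size=(V, dim))   # output (softmax) embeddings

def cbow_step(seq, center, window=2, lr=0.05):
    """One update: predict the centre n-gram from its averaged neighbours."""
    ctx = [idx[g] for j, g in enumerate(seq)
           if j != center and abs(j - center) <= window]
    if not ctx:
        return
    h = E_in[ctx].mean(axis=0)
    scores = E_out @ h
    p = np.exp(scores - scores.max())
    p /= p.sum()
    p[idx[seq[center]]] -= 1.0                 # softmax cross-entropy gradient
    grad_h = E_out.T @ p
    E_out[:] -= lr * np.outer(p, h)
    E_in[ctx] -= lr * grad_h / len(ctx)

for _ in range(50):                            # a few passes over the toy data
    for seq in grams:
        for c in range(len(seq)):
            cbow_step(seq, c)

print(E_in[idx["net"]][:4])                    # learned vector for n-gram "net"
```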

2009
P. D. Sreekanth N. Geethanjali P. D. Sreedevi Shakeel Ahmed N. Ravi Kumar Kamala Jayanthi

P. D. Sreekanth*, N. Geethanjali, P. D. Sreedevi, Shakeel Ahmed, N. Ravi Kumar and P. D. Kamala Jayanthi. National Research Centre for Cashew, Puttur 574 202, India; Sri Krishnadevaraya University, Anantapur 515 003, India; National Geophysical Research Institute, Hyderabad 500 007, India; Central Plantation Crops Research Institute, Kasaragod 671 124, India; Indian Institute of Horticulture Researc...

2007
Peng Dai Guangyou Xu

Computer understanding of human actions and interactions is the key research issue in human computing. Meanwhile, context has been considered to play an essential role in the understanding of human behavior during group interactions. This paper proposes a novel Event Driven Dynamic Context Model underlying the context sensing framework to support online analysis of group interactions in meeting scen...

2004
Jie Yin Xiaoyong Chai Qiang Yang

Plan recognition has traditionally been developed for logically encoded application domains with a focus on logical reasoning. In this paper, we present an integrated plan-recognition model that combines low-level sensory readings with high-level goal inference. A two-level architecture is proposed to infer a user’s goals in a complex indoor environment using an RF-based wireless network. The no...
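A hedged sketch of the two-level idea only: low-level signal readings are mapped to discrete locations, and the resulting location sequence is scored against candidate goals. The fingerprint table, goal definitions, and nearest-neighbour/overlap scoring below are invented stand-ins, not the model described in the snippet.

```python
# Two-level pipeline sketch: RSSI vector -> location -> goal (illustrative).
import numpy as np

# Level 1: signal-strength fingerprints -> locations (assumed calibration data)
fingerprints = {
    "hallway":  np.array([-60.0, -75.0, -80.0]),
    "office_A": np.array([-45.0, -70.0, -85.0]),
    "printer":  np.array([-70.0, -50.0, -65.0]),
}

def infer_location(rssi):
    """Nearest-neighbour match against the fingerprint table."""
    return min(fingerprints, key=lambda loc: np.linalg.norm(rssi - fingerprints[loc]))

# Level 2: location sequence -> most plausible goal (assumed goal models)
goal_paths = {
    "print_document": {"hallway", "printer"},
    "attend_meeting": {"hallway", "office_A"},
}

def infer_goal(locations):
    """Score each goal by how much of its typical path has been visited."""
    scores = {g: len(set(locations) & locs) / len(locs)
              for g, locs in goal_paths.items()}
    return max(scores, key=scores.get), scores

readings = [np.array([-58, -74, -81]), np.array([-68, -52, -66])]
trace = [infer_location(r) for r in readings]
print(trace, infer_goal(trace))
```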

2015
Andreas Guta Tamer Alkhouli Jan-Thorsten Peter Joern Wuebker Hermann Ney

We propose a conversion of bilingual sentence pairs and the corresponding word alignments into novel linear sequences. These are joint translation and reordering (JTR) uniquely defined sequences, combining interdepending lexical and alignment dependencies on the word level into a single framework. They are constructed in a simple manner while capturing multiple alignments and empty words. JTR s...
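The snippet does not define the JTR token inventory, so the sketch below only illustrates the general conversion: walking the target side left to right and emitting joint lexical tokens, a NULL source for unaligned target words, and explicit jump tokens for non-monotone alignment moves. All token names and conventions here are assumptions.

```python
# Illustrative conversion of an aligned sentence pair into one linear sequence.
def to_joint_sequence(src, tgt, alignment):
    """alignment: set of (src_index, tgt_index) pairs."""
    tgt_to_src = {}
    for s, t in alignment:
        tgt_to_src.setdefault(t, []).append(s)
    seq, prev_src = [], -1
    for t, tgt_word in enumerate(tgt):
        src_positions = sorted(tgt_to_src.get(t, []))
        if not src_positions:
            seq.append(("NULL", tgt_word))        # unaligned target word
            continue
        for s in src_positions:
            jump = s - (prev_src + 1)
            if jump != 0:
                seq.append(f"JUMP{jump:+d}")      # explicit reordering step
            seq.append((src[s], tgt_word))        # joint lexical token
            prev_src = s
    return seq

src = ["er", "hat", "das", "buch", "gelesen"]
tgt = ["he", "read", "the", "book"]
alignment = {(0, 0), (1, 1), (4, 1), (2, 2), (3, 3)}
print(to_joint_sequence(src, tgt, alignment))
```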

2017
Zhipeng Xie

This paper proposes a neural model for closed-set Chinese word segmentation. The model follows the character-based approach which assigns a class label to each character, indicating its relative position within the word it belongs to. To do so, it first constructs shallow representations of characters by fusing unigram and bigram information in limited context window via an element-wise maximum...
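A minimal sketch of the fusion step as described: unigram and bigram vectors inside a small context window are combined by an element-wise maximum to give one shallow representation per character. The random embedding tables and window size are placeholders, and the per-character tag classifier (e.g. BMES labels) that would sit on top is omitted.

```python
# Element-wise max fusion of unigram/bigram embeddings in a context window.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
uni_emb = {}   # lazily created unigram embeddings (random stand-ins)
bi_emb = {}    # lazily created bigram embeddings (random stand-ins)

def emb(table, key):
    if key not in table:
        table[key] = rng.normal(scale=0.1, size=dim)
    return table[key]

def shallow_repr(chars, i, window=2):
    """Element-wise max over unigram/bigram vectors in a window around i."""
    lo, hi = max(0, i - window), min(len(chars), i + window + 1)
    vecs = [emb(uni_emb, chars[j]) for j in range(lo, hi)]
    vecs += [emb(bi_emb, chars[j] + chars[j + 1]) for j in range(lo, hi - 1)]
    return np.max(np.stack(vecs), axis=0)

sentence = list("中文分词")
features = [shallow_repr(sentence, i) for i in range(len(sentence))]
print(len(features), features[0].shape)   # one fused vector per character
```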

2014
Xingxing Zhang Mirella Lapata

We propose a model for Chinese poem generation based on recurrent neural networks which we argue is ideally suited to capturing poetic content and form. Our generator jointly performs content selection (“what to say”) and surface realization (“how to say”) by learning representations of individual characters, and their combinations into one or more lines as well as how these mutually reinforce ...
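For illustration only, a character-by-character sampling loop with a vanilla RNN; the weights here are random placeholders rather than parameters learned jointly for content selection and surface realization as in the paper, and the character inventory is a toy one.

```python
# Character-level sampling with a vanilla RNN (random, untrained weights).
import numpy as np

rng = np.random.default_rng(0)
chars = list("床前明月光疑是地上霜")          # toy character inventory
V, H = len(chars), 16
Wxh = rng.normal(scale=0.1, size=(H, V))
Whh = rng.normal(scale=0.1, size=(H, H))
Why = rng.normal(scale=0.1, size=(V, H))

def sample_line(seed_idx, length=5):
    """Emit `length` characters, feeding each sample back as the next input."""
    h = np.zeros(H)
    x = np.eye(V)[seed_idx]
    out = [chars[seed_idx]]
    for _ in range(length - 1):
        h = np.tanh(Wxh @ x + Whh @ h)
        p = np.exp(Why @ h)
        p /= p.sum()
        nxt = rng.choice(V, p=p)
        out.append(chars[nxt])
        x = np.eye(V)[nxt]
    return "".join(out)

print(sample_line(seed_idx=0))
```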

2009
Haza Nuzly Abdul Hamed Nikola K. Kasabov Zbynek Michlovský Siti Mariyam Hj. Shamsuddin

This paper proposes a novel method for string pattern recognition using an Evolving Spiking Neural Network (ESNN) with Quantum-inspired Particle Swarm Optimization (QiPSO). This study reveals an interesting concept of QiPSO by representing information as binary structures. The mechanism optimizes the ESNN parameters and relevant features using the wrapper approach simultaneously. The N-gram ker...
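A hedged sketch of the quantum-inspired binary representation: each bit is held as an angle whose squared sine gives the probability of sampling 1, candidate bit strings are obtained by "collapsing" these angles, and the angles are then rotated toward the personal and global best strings. The rotation step size and the placeholder fitness (count of ones) stand in for the ESNN-wrapper evaluation used in the paper.

```python
# Binary quantum-inspired PSO sketch with a placeholder fitness function.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_bits, n_iters = 6, 12, 30
theta = np.full((n_particles, n_bits), np.pi / 4)   # equal superposition

def collapse(angles):
    """Sample a binary string per particle: P(bit = 1) = sin^2(theta)."""
    return (rng.random(angles.shape) < np.sin(angles) ** 2).astype(int)

def fitness(bits):
    return bits.sum()                                # placeholder objective

best_bits = collapse(theta)                          # personal bests
best_fit = np.array([fitness(b) for b in best_bits])
g_best = best_bits[best_fit.argmax()].copy()         # global best

for _ in range(n_iters):
    bits = collapse(theta)
    fits = np.array([fitness(b) for b in bits])
    improved = fits > best_fit
    best_bits[improved], best_fit[improved] = bits[improved], fits[improved]
    if fits.max() > fitness(g_best):
        g_best = bits[fits.argmax()].copy()
    # rotate each qubit angle toward the personal and global best bit values
    delta = 0.05 * (best_bits - 0.5) + 0.05 * (g_best - 0.5)
    theta = np.clip(theta + delta, 0.01, np.pi / 2 - 0.01)

print(g_best, fitness(g_best))
```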

2015
Yuhui Cao Zhao Chen Ruifeng Xu Tao Chen Lin Gui

Topic-based sentiment analysis for Chinese microblog aims to identify the user attitude on specified topics. In this paper, we propose a joint model by incorporating Support Vector Machines (SVM) and deep neural network to improve the performance of sentiment analysis. Firstly, an SVM classifier is constructed using N-gram, NPOS and sentiment lexicon features. Meanwhile, a convolutional neural ...
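A rough sketch of the late-fusion idea, assuming scikit-learn is available: an SVM over word n-gram counts and a small neural classifier (an MLP standing in for the convolutional network) are trained separately and their scores averaged. The toy texts, feature set, and equal weighting are illustrative, not the paper's configuration.

```python
# SVM over n-gram counts plus a neural classifier, fused by score averaging.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.neural_network import MLPClassifier

texts = ["great phone, love the topic", "terrible update, very slow",
         "love this brand", "slow and terrible service",
         "great camera", "very disappointing phone",
         "love it", "terrible experience"]
labels = [1, 0, 1, 0, 1, 0, 1, 0]               # 1 = positive, 0 = negative

vec = CountVectorizer(ngram_range=(1, 2))        # unigram + bigram counts
X = vec.fit_transform(texts)

svm = LinearSVC().fit(X, labels)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X.toarray(), labels)

def joint_score(text):
    """Positive score -> positive sentiment; simple 50/50 late fusion."""
    x = vec.transform([text])
    s_svm = svm.decision_function(x)[0]              # signed SVM margin
    s_mlp = mlp.predict_proba(x.toarray())[0, 1] - 0.5
    return 0.5 * s_svm + 0.5 * s_mlp

print(joint_score("love the new phone"), joint_score("terrible and slow"))
```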

2017
X. Chen Anton Ragni X. Liu Mark J. F. Gales

Recurrent neural network language models (RNNLMs) are powerful language modeling techniques. Significant performance improvements have been reported in a range of tasks including speech recognition compared to n-gram language models. Conventional n-gram and neural network language models are trained to predict the probability of the next word given its preceding context history. In contrast, bi...
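For reference, the standard conditioning contrast mentioned above, written out: an n-gram model truncates the history to the previous n−1 words, while an RNNLM conditions on the full preceding context summarized in a recurrent state (notation assumed here; h_{i-1} denotes the hidden state after reading w_1…w_{i-1}).

```latex
P_{\text{n-gram}}(w_1^N) \approx \prod_{i=1}^{N} P\!\left(w_i \mid w_{i-n+1}^{\,i-1}\right),
\qquad
P_{\text{RNN}}(w_1^N) = \prod_{i=1}^{N} P\!\left(w_i \mid w_1^{\,i-1}\right)
                      = \prod_{i=1}^{N} P\!\left(w_i \mid h_{i-1}\right).
```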

[Chart: number of search results per year]