Search results for: lexical coverage
Number of results: 115,790
It is very costly to build up lexical resources and domain ontologies. Especially when confronted with a new application domain, lexical gaps and poor coverage of domain concepts are a problem for the successful exploitation of natural-language document-analysis systems that need and exploit such knowledge sources. In this paper we report on ongoing experiments with ‘bootstrapping technique...
Due to the data sparseness problem, the lexical information from a treebank for a lexicalized parser could be insufficient. This paper proposes an approach to learn head-modifier pairs from a raw corpus, and to integrate them into a lexicalized dependency parser to parse a Chinese Treebank. Experimental results show that this approach not only enlarged the coverage of bi-lexical dependency, but...
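The snippet above describes learning head-modifier pairs from a raw corpus to supplement treebank-derived bilexical statistics. A minimal sketch of the counting step is shown below; the sentence encoding, the toy corpus, and the smoothed relative-frequency score are all illustrative assumptions, not the paper's actual method.

```python
from collections import Counter

def collect_head_modifier_pairs(parsed_sentences):
    """Count (head, modifier) pairs from dependency-annotated sentences.

    Each sentence is a list of (word, head_index) tuples, where
    head_index is the 0-based position of the word's head in the
    sentence, or -1 for the root (hypothetical encoding).
    """
    counts = Counter()
    for sent in parsed_sentences:
        for word, head_idx in sent:
            if head_idx >= 0:
                head = sent[head_idx][0]
                counts[(head, word)] += 1
    return counts

def bilexical_score(counts, head, modifier, smoothing=0.5):
    """Smoothed relative frequency of a head-modifier pair,
    usable as one feature inside a lexicalized dependency parser."""
    total = sum(counts.values())
    vocab = len(counts) + 1
    return (counts[(head, modifier)] + smoothing) / (total + smoothing * vocab)

# Toy "raw corpus" that has already been automatically parsed.
corpus = [
    [("eats", -1), ("dog", 0), ("food", 0)],   # dog <- eats -> food
    [("eats", -1), ("cat", 0), ("fish", 0)],   # cat <- eats -> fish
]
pairs = collect_head_modifier_pairs(corpus)
```

In a real system the counts would come from millions of automatically parsed sentences, and the scores would be interpolated with the treebank-derived model to enlarge bilexical coverage.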
What is the role of lexical information in robust parsing of unrestricted texts? In this paper we provide experimental evidence showing that, in order to strike the balance between robustness and coverage needed for practical NLP applications, judicious use of positive lexical evidence given a text should be complemented with a battery of dynamic parsing strategies aimed at solving local constr...
This paper reports on our Recognizing Textual Entailment (RTE) system developed for participation in the Text Analysis Conference RTE 2009 competition. The development of the system is based on the lexical entailment between two text excerpts, namely the hypothesis and the text. To extract atomic parts of hypotheses and texts, we carry out syntactic parsing on the sentences. We then utilize Wor...
In this talk I would like to address some issues of major importance in lexical semantics. In particular, I will discuss four topics relating to current research in the field: methodology, descriptive coverage, adequacy of the representation, and the computational usefulness of representations. In addressing these issues, I will discuss what I think are some of the central problems facing the l...
One method for evaluating a wide-coverage parser involves measuring how accurately it identifies dependency relations. The construction of a grammar which outputs dependency relations requires a lexicon with detailed information on subcategorization and dependency. We discuss how such a lexicon can be constructed automatically for a wide-coverage lexicalist grammar for Dutch by extracting the r...
This paper presents a study on the interpretation and bracketing of noun compounds (“NCs”), based on lexical semantics. Our primary goal is to develop a method to automatically interpret NCs through the use of semantic relations. Our NC interpretation method matches an NC against tagged NCs using lexical similarity measures derived from WordNet. We apply the interpretation meth...
In this paper, we introduce our Recognizing Textual Entailment (RTE) system developed on the basis of Lexical Entailment between two text excerpts, namely the hypothesis and the text. To extract atomic parts of hypotheses and texts, we carry out syntactic parsing on the sentences. We then utilize WordNet and FrameNet lexical resources for estimating lexical coverage of the text on the hypothesi...
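The two RTE snippets above both hinge on estimating the lexical coverage of the text over the hypothesis. A minimal sketch of such a coverage score is given below; the token-overlap rule and the small `synonyms` map (standing in for WordNet/FrameNet lookups) are assumptions for illustration, not the authors' actual procedure.

```python
def lexical_coverage(text_tokens, hypothesis_tokens, synonyms=None):
    """Fraction of hypothesis tokens 'covered' by the text.

    A hypothesis token counts as covered if it occurs in the text
    verbatim, or if one of its listed synonyms does. The `synonyms`
    dict is a toy stand-in for a WordNet/FrameNet resource lookup.
    """
    synonyms = synonyms or {}
    text = {t.lower() for t in text_tokens}
    covered = 0
    for tok in hypothesis_tokens:
        tok = tok.lower()
        if tok in text or any(s in text for s in synonyms.get(tok, ())):
            covered += 1
    return covered / len(hypothesis_tokens) if hypothesis_tokens else 0.0

# Usage: high coverage suggests the text may entail the hypothesis.
score = lexical_coverage(
    ["a", "feline", "sleeps"],
    ["a", "cat", "sleeps"],
    synonyms={"cat": {"feline"}},
)
```

An RTE system would threshold or combine such a score with syntactic evidence to decide entailment.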
One project aim was to establish a theory-neutral lexical representation of mwes so that the resulting lexical database could be used in various nlp systems in an effective way. We set up a small scale task-based evaluation to find out to what extent the wide-coverage Alpino parser can benefit from using the ecm lexical database. In the remainder, we first explain some needed notions. Further, ...
Standardized lexical resources are an important prerequisite for the development of robust and wide-coverage natural language processing applications. Therefore, we applied the Lexical Markup Framework, a recent ISO initiative towards standards for designing, implementing and representing lexical resources, on a test bed of data for an Arabic full-form lexicon. Besides minor structural accommoda...
[Chart: number of search results per year]