Pertinent Background Knowledge for Learning Protein Grammars
Authors
Abstract
We are interested in using Inductive Logic Programming (ILP) to infer grammars representing sets of protein sequences. ILP takes as input both examples and background knowledge predicates. This work is a first step in optimising the choice of background knowledge predicates for predicting the function of proteins. We propose methods to obtain different sets of background knowledge and then study the impact of these sets on inference results through a hard protein function inference task: the prediction of the coupling preference of GPCR proteins. All but one of the proposed sets of background knowledge are statistically shown to have a positive impact on the predictive power of the inferred rules, either directly or through interactions with other sets. In addition, this work provides further confirmation, after the work of Muggleton et al. (2001), that ILP can help to predict protein functions.
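As a purely illustrative aside (not taken from the paper), the Python sketch below shows one way sequence-level background knowledge predicates of the kind ILP systems consume can be expressed. The residue classes, the predicate names (residue_class, has_motif) and the toy sequence are assumptions made here for illustration only, not the background knowledge sets studied in the paper.

```python
# Illustrative sketch of "background knowledge predicates" over protein
# sequences. The physicochemical classes and predicate names below are
# assumptions for this example, not the paper's actual predicate sets.

# Coarse physicochemical classes of amino acids (one-letter codes).
HYDROPHOBIC = set("AVLIMFWC")
POLAR = set("STNQY")
CHARGED = set("DEKRH")


def residue_class(aa: str) -> str:
    """Map a residue to a coarse physicochemical class (a unary background predicate)."""
    if aa in HYDROPHOBIC:
        return "hydrophobic"
    if aa in POLAR:
        return "polar"
    if aa in CHARGED:
        return "charged"
    return "other"


def has_motif(sequence: str, classes: list) -> bool:
    """True if some window of the sequence matches the given class pattern,
    e.g. ["hydrophobic", "hydrophobic", "charged"]; this is the kind of
    feature an inferred grammar rule could test on a sequence."""
    k = len(classes)
    return any(
        [residue_class(aa) for aa in sequence[i:i + k]] == classes
        for i in range(len(sequence) - k + 1)
    )


if __name__ == "__main__":
    seq = "MKTLLVAADERK"  # toy sequence, not a real GPCR
    print(residue_class("L"))                                        # hydrophobic
    print(has_motif(seq, ["hydrophobic", "hydrophobic", "charged"]))  # True
```

In an ILP setting such predicates would be supplied as background knowledge alongside positive and negative example sequences, and the learner would search for rules (here, grammar productions) built from them.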
Similar resources
Learning Semantic Functions of Attribute Grammars
Attribute grammars can be considered as an extension of context-free grammars, where the attributes are associated with grammar symbols, and the semantic rules define the values of the attributes. This formalism is widely applied for the specification and implementation of the compilation-oriented languages. The paper presents a method for learning semantic functions of attribute grammars which i...
On Temporal Evolution in Data Streams
The future of CiteSeer: CiteSeerˣ p. 2 Learning to have fun p. 3 Winning the DARPA grand challenge p. 4 Challenges of urban sensing p. 5 Learning in one-shot strategic form games p. 6 A selective sampling strategy for label ranking p. 18 Combinatorial Markov random fields p. 30 Learning stochastic tree edit distance p. 42 Pertinent background knowledge for learning protein gramma...
FCGlight: Making the bridge between Fluid Construction Grammars and main-stream unification grammars by using feature constraint logics
Fluid Construction Grammars (FCGs) are a flavour of Construction Grammars, which themselves are unification-based grammars. Its syntax is only up to a certain extent similar to other unification-based grammars. However, it lacks a declarative semantics, while its procedural semantics is truly particular compared to other unification-based grammar formalisms. Here we propose the re-definition o...
Learning Computational Grammars
This paper reports on the LEARNING COMPUTATIONAL GRAMMARS (LCG) project, a postdoc network devoted to studying the application of machine learning techniques to grammars suitable for computational use. We were interested in a more systematic survey to understand the relevance of many factors to the success of learning, esp. the availability of annotated data, the kind of dependencies in the dat...
Implicit Learning of Recursive Context-Free Grammars
Context-free grammars are fundamental for the description of linguistic syntax. However, most artificial grammar learning experiments have explored learning of simpler finite-state grammars, while studies exploring context-free grammars have not assessed awareness and implicitness. This paper explores the implicit learning of context-free grammars employing features of hierarchical organization...
Journal title:
Volume / Issue
Pages -
Publication year 2006