Search results for: regression problems

Number of results: 883874

2014
Selim G. Akl, Robert Benkoczi, Daya Ram Gaur, Hossam S. Hassanein, Shahadat Hossain, Mark Thom

Article history: Received 15 April 2014; Received in revised form 29 September 2014; Accepted 28 October 2014; Available online 4 November 2014

Journal: Applied Mathematics and Computation, 2012
Leopold Koczan, Pawel Zaprawa

Let F denote the class of all functions univalent in the unit disk and convex in the direction of the real axis. In the paper we discuss the functions of the class F which are n-fold symmetric, where n is a positive even integer. For the class of such functions we find the Koebe set as well as the covering set, i.e. $\bigcap_{f \in F} f(D)$ and $\bigcup_{f \in F} f(D)$. Moreover, the Koebe constant and the covering consta...

Journal: J. Comb. Theory, Ser. A, 1998
Yair Caro, Raphael Yuster

For every fixed graph H, we determine the H-covering number of $K_n$, for all $n > n_0(H)$. We prove that if h is the number of edges of H, and $\gcd(H) = d$ is the greatest common divisor of the degrees of H, then there exists $n_0 = n_0(H)$ such that for all $n > n_0$, $C(H,K_n) = \left\lceil \frac{dn}{2h} \left\lceil \frac{n-1}{d} \right\rceil \right\rceil$, unless d is even, $n \equiv 1 \pmod{d}$ and $n(n-1)/d + 1 \equiv 0 \pmod{2h/d}$, in which case $C(H,K_n) = \left\lceil \binom{n}{2}/h \right\rceil + 1$. Our m...
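
As a worked instance of this formula (added here for illustration; it is not part of the cited abstract), take $H = K_3$, so $h = 3$ and $d = 2$. For large n the theorem gives

\[
C(K_3, K_n) \;=\; \left\lceil \frac{2n}{6}\left\lceil \frac{n-1}{2}\right\rceil \right\rceil \;=\; \left\lceil \frac{n}{3}\left\lceil \frac{n-1}{2}\right\rceil \right\rceil,
\]

which matches the classical value for the minimum number of triangles needed to cover all edges of $K_n$.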

Journal: CoRR, 2015
Vipul K. Dabhi, Sanjay Chaudhary

This paper describes the Postfix-GP system, a postfix notation based Genetic Programming (GP) approach for solving symbolic regression problems. It presents an object-oriented architecture of the Postfix-GP framework and assists the user in understanding the implementation details of its various components. Postfix-GP provides a graphical user interface which allows the user to configure the experiment, t...
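
To make the postfix representation concrete, here is a minimal sketch (hypothetical names such as eval_postfix and sse; this is not code from the Postfix-GP system) of how a GP individual stored as a postfix token list can be evaluated with a stack for symbolic regression:

# Minimal sketch (assumed, not taken from the Postfix-GP source) of evaluating
# a postfix-encoded GP individual for symbolic regression.
import operator

# binary operators; terminals are the variable name "x" or numeric constants
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_postfix(genome, x):
    """Evaluate a postfix token list, e.g. ['x', 'x', '*', 1.5, '+'] -> x*x + 1.5."""
    stack = []
    for token in genome:
        if token in OPS:                  # operator: pop two operands, push result
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[token](a, b))
        elif token == "x":                # input variable
            stack.append(x)
        else:                             # numeric constant
            stack.append(float(token))
    return stack.pop()

def sse(genome, xs, ys):
    """Fitness of an individual = squared error against the regression targets."""
    return sum((eval_postfix(genome, x) - y) ** 2 for x, y in zip(xs, ys))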

2015
Ju Wu

Purpose: a preliminary discussion of model prediction precision in the partial least squares regression analysis method. Method: introduce the current state of development of partial least squares regression analysis, analyze the problems of traditional regression analysis methods such as multiple linear regression, and introduce the mathematical principle and modeling method of the partial least squ...
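
To make the contrast with ordinary multiple linear regression concrete, here is a small illustrative sketch (generic scikit-learn usage, not code from the cited work) in which PLS stays stable while least squares coefficients become unstable on nearly collinear predictors:

# Illustrative only: partial least squares regression vs. ordinary least squares
# when the predictors are nearly collinear (a classical weakness of OLS).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=200)])  # nearly collinear columns
y = 2.0 * x1 + rng.normal(scale=0.1, size=200)

ols = LinearRegression().fit(X, y)
pls = PLSRegression(n_components=1).fit(X, y)    # one latent component suffices here
print("OLS coefficients:", ols.coef_)            # typically large, opposite-signed values
print("PLS coefficients:", pls.coef_.ravel())    # the effect is shared across both columns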

2005
Don L. Mcleish

Abstract: Suppose Y is a response variable, possibly multivariate, with a density function f(y|x, v;β) conditional on the covariates (x, v) where x and v are vectors and β is a vector of unknown parameters. The authors consider the problem of estimating β when data on the covariate vector v are available for all observations while data on the covariate x are missing at random. They compare seve...

1998
J. Tin-Yau Kwok

In this paper, we study the incorporation of the support vector machine (SVM) into the (hierarchical) mixture of experts model to form a support vector mixture. We show that, in both classification and regression problems, the use of a support vector mixture leads to quadratic programming (QP) problems that are very similar to those for an SVM, with no increase in the dimensionality of the QP pr...
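
For context (standard SVM background rather than material from the cited paper), the QP referred to has the same form as the usual SVM dual: for labels $y_i \in \{\pm 1\}$, kernel K and box constant C,

\[
\max_{\alpha}\; \sum_i \alpha_i - \tfrac{1}{2}\sum_{i,j}\alpha_i\alpha_j y_i y_j K(x_i,x_j)
\quad\text{s.t.}\quad 0 \le \alpha_i \le C,\;\; \sum_i \alpha_i y_i = 0,
\]

and the abstract's point is that the mixture's subproblems keep this dimensionality, i.e. one dual variable per training point.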

2010
Jim E. Griffin, Philip J. Brown

This paper considers the effects of placing an absolutely continuous prior distribution on the regression coefficients of a linear model. We show that the posterior expectation is a matrix-shrunken version of the least squares estimate where the shrinkage matrix depends on the derivatives of the prior predictive density of the least squares estimate. The special case of the normal-gamma prior, ...
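
A familiar special case illustrates the "matrix-shrunken" form (the normal/ridge prior, shown here only for orientation; the paper analyses the more general normal-gamma prior). With $y = X\beta + \varepsilon$, $\varepsilon \sim N(0,\sigma^2 I)$, and prior $\beta \sim N(0,\tau^2 I)$, the posterior expectation is

\[
E[\beta \mid y] \;=\; \Bigl(X^{\top}X + \tfrac{\sigma^{2}}{\tau^{2}} I\Bigr)^{-1} X^{\top}X\, \hat{\beta}_{LS},
\qquad \hat{\beta}_{LS} = (X^{\top}X)^{-1}X^{\top}y,
\]

i.e. the least squares estimate premultiplied by a shrinkage matrix.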

Journal: Expert Syst. Appl., 2012
Amir Ahmad, Sami M. Halawani, Ibrahim A. Albidewi

Problem statement: Regression via Classification (RvC) is a method in which a regression problem is converted into a classification problem. A discretization process is used to convert continuous target values into classes. The discretized data can then be used with classifiers as a classification problem. Approach: In this study, we use a discretization method, Extreme Randomized Discretization (ERD), ...
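
A generic RvC pipeline can be sketched as follows (equal-width binning with a decision tree, shown only to make the idea concrete; the cited study uses the ERD discretizer instead):

# Generic Regression-via-Classification sketch: discretize the continuous target,
# train a classifier on the bin labels, then map predicted classes back to values.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rvc_fit_predict(X_train, y_train, X_test, n_bins=10):
    edges = np.linspace(y_train.min(), y_train.max(), n_bins + 1)
    classes = np.digitize(y_train, edges[1:-1])          # bin index (0..n_bins-1) per target
    # representative value for each class = mean of the training targets in that bin
    reps = np.array([y_train[classes == c].mean() if np.any(classes == c) else edges[c]
                     for c in range(n_bins)])
    clf = DecisionTreeClassifier().fit(X_train, classes)
    return reps[clf.predict(X_test)]                      # predicted class -> numeric prediction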

Journal: JCP, 2006
Sotiris B. Kotsiantis, Dimitris Kanellopoulos, Panayiotis E. Pintelas

Numerous data mining problems involve an investigation of associations between features in heterogeneous datasets, where different prediction models can be more suitable for different regions. We propose a technique of boosting localized weak learners; rather than having constant weights attached to each learner (as in standard boosting approaches), we allow weights to be functions over the inp...
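
One simple way to realize input-dependent learner weights (an illustrative scheme, not the exact construction of the cited paper) is to weight each weak learner by a kernel centred on the region it was trained on:

# Sketch of the "localized weights" idea: each weak learner h_m gets a weight
# function w_m(x) over the input space instead of a single constant weight.
import numpy as np

def localized_ensemble_predict(x, learners, centres, bandwidth=1.0):
    """learners: list of callables h_m(x); centres: region centre c_m for each learner."""
    w = np.array([np.exp(-np.sum((x - c) ** 2) / (2 * bandwidth ** 2)) for c in centres])
    w = w / w.sum()                              # normalised weight functions w_m(x)
    return sum(wi * h(x) for wi, h in zip(w, learners))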
