Search results for: principal component regression

Number of results: 998116

Journal: Biomass Conversion and Biorefinery, 2022

Abstract: Gasification represents a potential technology for the conversion of biomass into usable energy. The influence of the main gasification parameters, i.e. the type used and its composition, as well as the composition of the outlet gas, was studied by multivariate statistical analysis based on principal component analysis (PCA) and partial least squares (PLS) regression models in order to identify correlations between them co...
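
As an illustration only, here is a minimal sketch of the kind of PCA/PLS workflow such a study might use, assuming scikit-learn; the variables are synthetic placeholders standing in for feedstock and outlet-gas compositions, not the study's data or code.

```python
# Hedged sketch: PCA to explore predictor structure, PLS regression to relate
# (hypothetical) feedstock variables to (hypothetical) outlet-gas composition.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(60, 6))   # assumed predictors, e.g. C, H, O, moisture, ash, volatiles
Y = X @ rng.normal(size=(6, 3)) + 0.1 * rng.normal(size=(60, 3))  # assumed responses, e.g. H2, CO, CH4

Xs = StandardScaler().fit_transform(X)

# PCA: which combinations of predictors carry most of the variance?
pca = PCA(n_components=2).fit(Xs)
print("explained variance ratio:", pca.explained_variance_ratio_)

# PLS regression: relate latent predictor components directly to the responses.
pls = PLSRegression(n_components=2).fit(Xs, Y)
print("PLS R^2:", pls.score(Xs, Y))
```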

Journal: Proceedings of the ISCIE International Symposium on Stochastic Systems Theory and its Applications, 2013

2010
Han Lin Shang

This paper uses half-hourly electricity demand data in South Australia as an empirical study of nonparametric modeling and forecasting methods for prediction from half-hour ahead to one year ahead. A notable feature of the univariate time series of electricity demand is the presence of both intraweek and intraday seasonalities. An intraday seasonal cycle is apparent from the similarity of the d...
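
For orientation, a rough sketch of one simple nonparametric way to exploit intraday and intraweek seasonality in half-hourly demand (a weekly seasonal-mean profile on synthetic data); this is an assumption-laden illustration, not the paper's own method.

```python
# Hedged sketch: seasonal-mean nonparametric forecast of half-hourly demand.
import numpy as np

rng = np.random.default_rng(7)
half_hours_per_day, days = 48, 28 * 4                 # ~4 months of synthetic history
t = np.arange(half_hours_per_day * days)
demand = (1000
          + 200 * np.sin(2 * np.pi * t / half_hours_per_day)        # intraday cycle
          + 100 * np.sin(2 * np.pi * t / (7 * half_hours_per_day))  # intraweek cycle
          + 30 * rng.normal(size=t.size))

# Nonparametric seasonal estimate: average the demand observed at the same
# (day-of-week, half-hour) slot across the history.
slot = t % (7 * half_hours_per_day)                   # position within the week
profile = np.array([demand[slot == s].mean()
                    for s in range(7 * half_hours_per_day)])

# Forecast any horizon by looking up the corresponding slot in the weekly profile.
next_slot = (t[-1] + 1) % (7 * half_hours_per_day)
print("half-hour-ahead forecast:", profile[next_slot])
```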

Journal: علوم آب و خاک (Water and Soil Science), 2005
امین شیروانی, سید محمد جعفر ناظم‌السادات

Since fluctuations of the Persian Gulf sea surface temperature (PGSST) have a significant effect on winter precipitation, water resources, and agricultural production in the southwestern parts of Iran, the possibility of predicting the winter SST was evaluated with a multiple regression model. The time series of PGSSTs for all seasons during 1947-1992 were considered as predictors, an...
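
A minimal sketch of the regression setup described, assuming scikit-learn; the arrays are synthetic placeholders in place of the 1947-1992 PGSST observations.

```python
# Hedged sketch: seasonal PGSST series as predictors of a winter target
# in an ordinary multiple linear regression.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n_years = 46                                   # 1947-1992
# Hypothetical predictors: spring, summer, autumn SST anomalies per year.
X = rng.normal(size=(n_years, 3))
# Hypothetical target: the following winter's anomaly.
y = 0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.2 * rng.normal(size=n_years)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)
print("in-sample R^2:", model.score(X, y))
```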

2011
Antoni Wibowo, Mohammad Ishak Desa

In recent years, many algorithms based on kernel principal component analysis (KPCA) have been proposed, including kernel principal component regression (KPCR). KPCR can be viewed as a non-linearization of principal component regression (PCR), which uses ordinary least squares (OLS) to estimate its regression coefficients. We use PCR to dispose of the negative effects of multicollinearity in ...
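
A minimal sketch of PCR and a kernel variant, assuming scikit-learn; the synthetic, deliberately collinear data below only illustrates the pipeline and is not the authors' implementation.

```python
# Hedged sketch: PCR = project onto a few principal components, then OLS on
# the scores; KPCR swaps PCA for kernel PCA to capture non-linear structure.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 5:] = X[:, :5] + 0.01 * rng.normal(size=(200, 5))  # strong multicollinearity
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

# PCR: PCA scores followed by ordinary least squares.
pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pcr.fit(X, y)

# KPCR: kernel PCA scores followed by ordinary least squares.
kpcr = make_pipeline(KernelPCA(n_components=5, kernel="rbf", gamma=0.1),
                     LinearRegression())
kpcr.fit(X, y)

print("PCR  R^2:", pcr.score(X, y))
print("KPCR R^2:", kpcr.score(X, y))
```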

2017
Manolis C. Tsakiris René Vidal

We consider the problem of outlier rejection in single subspace learning. Classical approaches work with a direct representation of the subspace, and are thus efficient when the subspace dimension is small. Our approach works with a dual representation of the subspace and hence aims to find its orthogonal complement; as such it is particularly suitable for high-dimensional subspaces. We pose th...
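
As a loose illustration of the dual-representation idea (estimating a normal to the inlier subspace rather than the subspace itself), here is a plain SVD-based sketch on synthetic data; it is not the authors' optimization-based algorithm.

```python
# Hedged sketch: estimate one direction of the orthogonal complement of a
# high-dimensional (hyperplane) inlier subspace and flag points with a large
# component along it as outliers.
import numpy as np

rng = np.random.default_rng(1)
D, d, n_in, n_out = 30, 29, 500, 50            # ambient dim, subspace dim, counts
basis = np.linalg.qr(rng.normal(size=(D, d)))[0]
inliers = basis @ rng.normal(size=(d, n_in))    # points on a hyperplane
outliers = rng.normal(size=(D, n_out))          # points scattered in ambient space
X = np.concatenate([inliers, outliers], axis=1)
X /= np.linalg.norm(X, axis=0, keepdims=True)   # normalize columns to the sphere

# The left singular vector with the smallest singular value approximates the
# hyperplane normal, i.e. a basis vector of the orthogonal complement.
U, s, _ = np.linalg.svd(X, full_matrices=False)
normal = U[:, -1]

# Outliers tend to have a larger component along the normal direction.
scores = np.abs(normal @ X)
flagged = scores > np.median(scores) + 2 * scores.std()
print("flagged outliers:", flagged[n_in:].sum(), "of", n_out)
print("false alarms    :", flagged[:n_in].sum(), "of", n_in)
```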

[Chart: number of search results per year]