Search results for: batch and online learning
Number of results: 16,981,315
Hand gesture recognition methods that compute a space of eigenvectors by Principal Component Analysis (PCA) traditionally require a batch computation step, in which the only way to update the subspace when new samples arrive is to rebuild it from scratch. In this paper, we introduce a new approach to gesture recognition based on an online PCA algorithm with adaptive...
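The batch-versus-online contrast in the abstract above can be sketched with scikit-learn, whose `IncrementalPCA` updates the eigenvector subspace chunk by chunk via `partial_fit` instead of rebuilding it. This is a minimal illustration on synthetic data, not the paper's own algorithm:

```python
import numpy as np
from sklearn.decomposition import PCA, IncrementalPCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))  # stand-in for gesture feature vectors

# Batch PCA: needs all samples at once; new data forces a full rebuild.
batch = PCA(n_components=3).fit(X)

# Incremental PCA: the subspace is updated chunk by chunk via partial_fit,
# so new samples can be folded in without recomputing from scratch.
inc = IncrementalPCA(n_components=3)
for chunk in np.array_split(X, 4):
    inc.partial_fit(chunk)

# Both yield a 3-dimensional eigenvector subspace of the 10-D data.
print(batch.components_.shape, inc.components_.shape)
```

The paper's contribution (an *adaptive* online PCA) would replace the fixed chunking above with a data-driven update rule.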
This study was conducted to investigate the impact of portfolio assessment, as a process-oriented assessment mechanism, on Iranian EFL students' English writing and its subskills of focus, elaboration, organization, conventions, and vocabulary. Out of ninety juniors majoring in English literature and translation at the University of Isfahan, sixty-one who were at the same level of writing...
Ensemble learning methods train combinations of base models, which may be decision trees, neural networks, or others traditionally used in supervised learning. Ensemble methods have gained popularity because many researchers have demonstrated their superior prediction performance relative to single models on a variety of problems, especially when the correlations of the errors made by the base m...
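The claim about error correlations can be demonstrated numerically: when the base models' errors are independent, averaging their predictions shrinks the error variance roughly by the ensemble size. A minimal synthetic sketch (not from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
y_true = np.zeros(10_000)  # true target, taken as 0 for simplicity

# Each "base model" predicts the truth plus its own independent noise.
n_models = 25
preds = y_true + rng.normal(scale=1.0, size=(n_models, y_true.size))

single_mse = np.mean(preds[0] ** 2)             # one model alone
ensemble_mse = np.mean(preds.mean(axis=0) ** 2)  # average of 25 models

# With uncorrelated errors, averaging 25 models cuts the MSE ~25x.
print(single_mse, ensemble_mse)
```

If the noise terms were perfectly correlated instead, averaging would give no reduction at all, which is why error decorrelation matters for ensembles.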
In this paper, we study the problem of large-scale Kernel Logistic Regression (KLR). A straightforward approach is to apply stochastic approximation to KLR. We refer to this approach as a non-conservative online learning algorithm because it updates the kernel classifier after every received training example, leading to a dense classifier. To improve the sparsity of the KLR classifier, we propose...
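The "non-conservative" baseline the abstract describes can be sketched directly: a stochastic-gradient update on the logistic loss adds every received example as a support vector with a nonzero coefficient, so the kernel expansion grows densely. A toy sketch under that reading (the RBF kernel and step size are illustrative choices, not the paper's):

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian RBF kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)  # linearly separable toy labels

# Non-conservative online KLR: every example joins the expansion with a
# nonzero coefficient, so the classifier is dense (one term per example).
support, alpha = [], []
eta = 0.5
for x_t, y_t in zip(X, y):
    f_t = sum(a * rbf(s, x_t) for s, a in zip(support, alpha))
    # Stochastic-gradient step on the logistic loss log(1 + exp(-y f)).
    grad = -y_t / (1.0 + np.exp(y_t * f_t))
    support.append(x_t)
    alpha.append(-eta * grad)

print(len(support))  # one kernel term per training example seen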
Classification is one of the data mining techniques that analyses a given data set and induces a model for each class based on the features present in the data. Bagging and boosting are heuristic approaches to developing classification models. These techniques generate a diverse ensemble of classifiers by manipulating the training data given to a base learning algorithm. They are very successfu...
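Bagging's "manipulating the training data" amounts to fitting each base learner on a bootstrap resample and voting over the results. A minimal sketch with scikit-learn decision trees as the base learner (the data and hyperparameters are illustrative):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # simple nonlinear (XOR-style) target

# Bagging: each base learner sees a bootstrap resample of the training set,
# which is what induces diversity in the ensemble.
trees = []
for _ in range(15):
    idx = rng.integers(0, len(X), size=len(X))  # sample with replacement
    trees.append(DecisionTreeClassifier(max_depth=3).fit(X[idx], y[idx]))

# Majority vote over the ensemble's predictions.
votes = np.stack([t.predict(X) for t in trees])
y_hat = (votes.mean(axis=0) > 0.5).astype(int)
print((y_hat == y).mean())  # ensemble training accuracy
```

Boosting differs in that resampling (or reweighting) is not uniform: each round concentrates on the examples the previous classifiers got wrong.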
The promise of learning to learn for robotics rests on the hope that, by extracting some information about the learning process itself, we can speed up subsequent similar learning tasks. Here, we introduce a computationally efficient online meta-learning algorithm that builds and optimizes a memory model of the optimal learning rate landscape from previously observed gradient behaviors. While per...
Low-rank tensor learning has many applications in machine learning. A series of batch learning algorithms have achieved great success. However, in many emerging applications, such as climate data analysis, we are confronted with large-scale tensor streams, which pose significant challenges to existing solutions. In this paper, we propose an accelerated online low-rank tensor learning algorithm...
Convolutional sparse representations are a form of sparse representation with a structured, translation invariant dictionary. Most convolutional dictionary learning algorithms to date operate in batch mode, requiring simultaneous access to all training images during the learning process, which results in very high memory usage, and severely limits the training data that can be used. Very recent...
The overarching goal of this paper is to derive excess risk bounds for learning from exp-concave loss functions in passive and sequential learning settings. Exp-concave loss functions encompass several fundamental problems in machine learning, such as squared loss in linear regression, logistic loss in classification, and negative logarithm loss in portfolio management. In the batch setting, we obtai...
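For readers unfamiliar with the term, exp-concavity has a compact standard definition; the following is the usual textbook formulation, not a statement taken from this particular paper:

```latex
A loss $\ell : \mathcal{W} \to \mathbb{R}$ is \emph{$\alpha$-exp-concave} for some
$\alpha > 0$ if the map
\[
  w \mapsto \exp\bigl(-\alpha\,\ell(w)\bigr)
\]
is concave on $\mathcal{W}$. The losses named in the abstract are standard examples:
the squared loss $(y - w^\top x)^2$ and the logistic loss
$\log\bigl(1 + e^{-y\,w^\top x}\bigr)$ are exp-concave on suitably bounded domains,
and the negative log loss $-\log(w^\top x)$ of portfolio selection is $1$-exp-concave.
```

Exp-concavity is stronger than convexity but weaker than strong convexity, which is what makes the fast rates studied in this line of work possible.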
Modern applications in sensitive domains such as biometrics and medicine frequently require the use of non-decomposable loss functions such as precision@k and the F-measure. Compared to point loss functions such as the hinge loss, these offer much more fine-grained control over predictions, but at the same time present novel challenges in terms of algorithm design and analysis. In this work we initiate...
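The non-decomposability the abstract refers to is easy to see for precision@k: the metric depends on the joint ranking of all scores, so it cannot be written as a sum of per-example losses the way the hinge loss can. A small self-contained sketch:

```python
import numpy as np

def precision_at_k(y_true, scores, k):
    """Fraction of true positives among the k highest-scored examples."""
    top_k = np.argsort(scores)[::-1][:k]  # indices of the k largest scores
    return y_true[top_k].mean()

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.6, 0.2, 0.1])

# Top-3 scored examples have labels 1, 0, 1 -> precision@3 = 2/3.
print(precision_at_k(y_true, scores, 3))
```

Note that perturbing a single score can push a different example into the top-k and change the metric, even though that example's own label and score are untouched; this coupling across examples is what complicates gradient-style algorithm design.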