Search results for: experts mixture
Number of results: 160772
A useful strategy to deal with complex classification scenarios is the “divide and conquer” approach. The mixture of experts (MOE) technique makes use of this strategy by jointly training a set of classifiers, or experts, that are specialized in different regions of the input space. A global model, or gate function, complements the experts by learning a function that weights their relevance in d...
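The gate-weighted combination described in this abstract can be sketched as follows. This is a minimal illustrative forward pass, not the code of any cited paper: the experts are assumed to be linear maps and the gate a softmax over a linear projection of the input; all names and shapes are assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(x, expert_weights, gate_weights):
    """Mixture-of-experts forward pass (illustrative sketch).

    x:              (N, D_in) batch of inputs
    expert_weights: list of K matrices, each (D_in, D_out)
    gate_weights:   (D_in, K) parameters of the softmax gate
    """
    # each expert produces its own prediction for every input
    expert_outputs = np.stack([x @ W for W in expert_weights])  # (K, N, D_out)
    # the gate assigns per-input mixing weights that sum to 1 over experts
    gate = softmax(x @ gate_weights)                            # (N, K)
    # output is the gate-weighted sum of expert predictions
    return np.einsum("nk,knd->nd", gate, expert_outputs)       # (N, D_out)
```

Because the gate weights sum to one, if all experts were identical the mixture would reduce to a single expert; specialization arises only when training pushes the experts apart on different regions of the input space.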
The mixture-of-experts model is a static neural network architecture in that it learns input-output mappings where the output is directly influenced by the current input but not previous inputs. We explore a dynamic version of the mixture-of-experts model by introducing feedback into the architecture, enabling it to learn temporal behaviour. The model’s ability to decompose a task into static a...
In this paper, a deep mixture of diverse experts algorithm is developed for seamlessly combining a set of base deep CNNs (convolutional neural networks) with diverse outputs (task spaces), e.g., such base deep CNNs are trained to recognize different subsets of tens of thousands of atomic object classes. First, a two-layer (category layer and object class layer) ontology is constructed to achiev...
Estimating motion in scenes containing multiple motions remains a difficult problem for computer vision. Here we describe a novel recurrent network architecture which solves this problem by simultaneously estimating motion and segmenting the scene. The network is comprised of locally connected units which carry out simple calculations in parallel. We present simulation results illustrating the su...
The hierarchical mixture of experts (HME) architecture is a powerful tree-structured architecture for supervised learning. In this paper, an efficient one-pass algorithm to solve the M-step of the EM iterations while training the HME network to perform classification tasks is first described. This substantially reduces the training time compared to using the IRLS method to solve the M-step. Further,...
We propose a new learning algorithm for regression modeling. The method is especially suitable for optimizing neural network structures that are amenable to a statistical description as mixture models. These include mixture of experts, hierarchical mixture of experts (HME), and normalized radial basis functions (NRBF). Unlike recent maximum likelihood (ML) approaches, we directly minimize the (...
A model for view-independent face recognition, based on mixture of experts (ME), is presented. Instead of allowing ME to partition the face space automatically, it is directed to adapt to a particular partitioning corresponding to predetermined views. Experimental results show that this model performs well in recognizing faces of intermediate unseen views. There is neurophysiological evidence ...
Although clustering data into mutually exclusive partitions has been an extremely successful approach to unsupervised learning, there are many situations in which a richer model is needed to fully represent the data. This is the case in problems where data points actually simultaneously belong to multiple, overlapping clusters. For example a particular gene may have several functions, therefore...
The hierarchical mixture of experts architecture provides a flexible procedure for implementing classification algorithms. The classification is obtained by a recursive soft partition of the feature space in a data-driven fashion. Such a procedure enables local classification where several experts are used, each of which is assigned the task of classification over some subspace of the feat...
The connections of the alternative model for mixture of experts (ME) to the normalized radial basis function (NRBF) nets and extended normalized RBF (ENRBF) nets are established, and the well-known expectation-maximization (EM) algorithm for maximum likelihood learning is applied to the two types of RBF nets. This new learning technique determines the parameters of the input layer (including ...
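Several of the abstracts above rely on EM training of a mixture of experts. The E-step, which computes each expert's posterior responsibility for a training pair, can be sketched as below. This is an illustrative Gaussian-output variant with a shared, fixed noise scale; the function and argument names are assumptions, not the notation of any cited paper.

```python
import numpy as np

def em_responsibilities(y, expert_means, gate_probs, sigma=1.0):
    """E-step of EM for a Gaussian mixture-of-experts (illustrative sketch).

    y:            (N, D) observed targets
    expert_means: (K, N, D) per-expert predictions mu_k(x_n)
    gate_probs:   (N, K) prior mixing weights g_k(x_n) from the gate
    Returns (N, K) posterior responsibilities h_k proportional to
    g_k(x) * N(y | mu_k(x), sigma^2 I), normalized over experts.
    """
    # squared error of each expert on each sample
    sq = ((y[None] - expert_means) ** 2).sum(axis=-1)    # (K, N)
    log_lik = -0.5 * sq / sigma**2                       # Gaussian log-likelihood (up to a constant)
    logits = np.log(gate_probs.T + 1e-12) + log_lik      # (K, N) unnormalized log posterior
    logits -= logits.max(axis=0, keepdims=True)          # stabilize before exponentiating
    h = np.exp(logits)
    return (h / h.sum(axis=0, keepdims=True)).T          # (N, K), rows sum to 1
```

In the subsequent M-step, each expert is refit with its samples weighted by these responsibilities, and the gate is refit to predict them; the responsibilities concentrate on whichever expert best explains a given sample.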