Search results for: experts mixture
Number of results: 160772
During face-to-face conversation, people naturally integrate speech, gestures and higher level language interpretations to predict the right time to start talking or to give backchannel feedback. In this paper we introduce a new model called Latent Mixture of Discriminative Experts which addresses some of the key issues with multimodal language processing: (1) temporal synchrony/asynchrony betw...
Several experiments were conducted in order to investigate the usefulness of mixture of experts approach to an online internet system assisting in real estate appraisal. All experiments were performed using real-world datasets taken from a cadastral system. The analysis of the results was performed using statistical methodology including nonparametric tests followed by post-hoc procedures desig...
High-throughput methods can directly detect the set of interacting proteins in yeast but the results are often incomplete and exhibit high false positive and false negative rates. A number of researchers have recently presented methods for integrating direct and indirect data for predicting interactions. However, due to missing data and the high redundancy among the features used, different sam...
The human brain can be described as containing a number of functional regions. These regions, as well as the connections between them, play a key role in information processing in the brain. However, most existing multi-voxel pattern analysis approaches either treat multiple regions as one large uniform region or several independent regions, ignoring the connections between them. In this paper ...
This article introduces a new tool for exploratory data analysis and data mining called Scale-Sensitive Gated Experts (SSGE) which can partition a complex nonlinear regression surface into a set of simpler surfaces (which we call features). The set of simpler surfaces has the property that each element of the set can be efficiently modeled by a single feedforward neural network. The degree to ...
In this study we present a Deep Mixture of Experts (DMoE) neural-network architecture for single microphone speech enhancement. In contrast to most speech enhancement algorithms, which overlook the speech variability mainly caused by phoneme structure, our framework comprises a set of deep neural networks (DNNs), each one of which is an 'expert' in enhancing a given speech type corresponding to a...
Face alignment, which is the task of finding the locations of a set of facial landmark points in an image of a face, is useful in widespread application areas. Face alignment is particularly challenging when there are large variations in pose (in-plane and out-of-plane rotations) and facial expression. To address this issue, we propose a cascade in which each stage consists of a mixture of regr...
We introduce an LSTM-based method for dynamically integrating several wordprediction experts to obtain a conditional language model which can be good simultaneously at several subtasks. We illustrate this general approach with an application to dialogue where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise info...
Viswanath Ramamurti and Joydeep Ghosh Department of Electrical and Computer Engineering The University of Texas at Austin, Austin, TX 78712-1084. E-mail: {viswa,ghosh}@pine.ece.utexas.edu Abstract The mixture of experts architecture provides a modular approach to function approximation. Since different experts get attuned to different regions of the input space during the course of training, and ...
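The modular architecture this abstract describes can be illustrated with a minimal sketch: a softmax gating network assigns each input a responsibility over a set of experts, and the mixture output is the gate-weighted sum of the expert outputs. The linear experts, dimensions, and weight initialization below are illustrative assumptions, not details from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 3 linear experts mapping R^4 -> R^1.
n_experts, d_in, d_out = 3, 4, 1
W_experts = rng.normal(size=(n_experts, d_in, d_out))  # one linear map per expert
W_gate = rng.normal(size=(d_in, n_experts))            # gating-network weights

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_predict(x):
    """Combine expert outputs using the gate's softmax responsibilities."""
    gate = softmax(x @ W_gate)                           # (batch, n_experts)
    expert_out = np.einsum('bi,eio->beo', x, W_experts)  # (batch, n_experts, d_out)
    return np.einsum('be,beo->bo', gate, expert_out)     # (batch, d_out)

x = rng.normal(size=(5, d_in))
y = moe_predict(x)
```

During training, the gate's responsibilities concentrate on whichever expert fits a given region of the input space best, which is the specialization effect the abstract refers to.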