Search results for: experts mixture

Number of results: 160772

2007
Kukjin Kang Jong-Hoon Oh

We study the generalization capability of a modular neural network, the 'mixture of experts', that learns from examples generated by another network with the same architecture. When the number of examples is smaller than a critical value, the network shows a symmetric phase where the roles of the experts are not specialized. Upon crossing the critical point, the system undergoes a continuous phase tr...
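
For readers unfamiliar with the architecture referenced above, the following is a minimal NumPy sketch of a mixture of experts: a softmax gating network weights the outputs of several expert networks. The sizes and the linear experts are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; not taken from the paper.
n_experts, d_in, d_out = 4, 8, 1

# Each expert is a linear map; the gate scores experts from the same input.
expert_W = rng.normal(size=(n_experts, d_in, d_out))
gate_W = rng.normal(size=(d_in, n_experts))

def moe_forward(x):
    """Weight each expert's output by a softmax gating distribution."""
    scores = x @ gate_W                               # (n_experts,)
    gates = np.exp(scores - scores.max())
    gates /= gates.sum()                              # softmax over experts
    expert_out = np.einsum('i,eio->eo', x, expert_W)  # (n_experts, d_out)
    return gates @ expert_out                         # mixture prediction, (d_out,)

x = rng.normal(size=d_in)
print(moe_forward(x))
```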

2010
Magdalena Graczyk Tadeusz Lasota Zbigniew Telec Bogdan Trawinski

Several experiments were conducted to investigate the usefulness of the mixture of experts (ME) approach in an online internet system assisting in real estate appraisal. All experiments were performed using 28 real-world datasets composed of data taken from a cadastral system and GIS data derived from a cadastral map. The analysis of the results was performed using recently proposed statist...

2000
Ron Meir Ran El-Yaniv Shai Ben-David

We introduce and analyze LocBoost, a new boosting algorithm, which leads to the incremental construction of a mixture of experts type architecture. We provide upper bounds on the expected loss of such models in terms of the smoothness properties of the gating functions appearing in the mixture of experts model. Furthermore, an incremental algorithm is proposed for the construction of the classi...
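
The abstract does not spell out LocBoost itself; as a rough illustration of the kind of gated mixture it constructs, the sketch below combines two toy experts with smooth, localized gating functions (the smoothness of such gates is what the paper's loss bounds are stated in terms of). The experts, centers, and width here are illustrative assumptions, not the LocBoost algorithm.

```python
import numpy as np

# Two toy "experts": constant predictors (illustrative assumptions).
experts = [lambda x: np.full_like(x, -1.0),
           lambda x: np.full_like(x, 1.0)]

# Smooth gating functions localized around two centers; normalizing them
# gives soft responsibilities for each expert at every input point.
centers, width = [-1.5, 1.5], 1.0

def gates(x):
    g = np.stack([np.exp(-((x - c) / width) ** 2) for c in centers])
    return g / g.sum(axis=0)

def mixture_predict(x):
    """Gate-weighted combination of the experts' predictions."""
    g = gates(x)                             # (n_experts, n_points)
    h = np.stack([e(x) for e in experts])    # (n_experts, n_points)
    return (g * h).sum(axis=0)

print(mixture_predict(np.array([-2.0, 0.0, 2.0])))
```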

Journal: :Proceedings of the ... AAAI Conference on Artificial Intelligence 2023

The long-tailed video recognition problem is especially challenging, as videos tend to be long and untrimmed, and each may contain multiple classes, causing frame-level class imbalance. Previous methods tackle the problem only through sampling for re-balancing, without distinguishing the feature representations of head and tail classes. To improve the feature representation of tail classes, we modulate features with an auxiliary distillation loss to reduce...
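
As a rough illustration of the auxiliary distillation idea mentioned above, the snippet below adds a feature-matching term to a classification loss. The shapes, the mean-squared-error form, and the weighting factor alpha are assumptions for illustration, not details from the paper.

```python
import numpy as np

def distillation_loss(student_feat, teacher_feat):
    """Mean squared distance between student and teacher feature maps."""
    return np.mean((student_feat - teacher_feat) ** 2)

def total_loss(cls_loss, student_feat, teacher_feat, alpha=0.5):
    """Classification loss plus the weighted auxiliary distillation term."""
    return cls_loss + alpha * distillation_loss(student_feat, teacher_feat)

rng = np.random.default_rng(1)
student = rng.normal(size=(4, 16))   # hypothetical batch of 4 feature vectors
teacher = rng.normal(size=(4, 16))
print(total_loss(1.3, student, teacher))
```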

Journal: :IEEE Transactions on Industrial Informatics 2023

Accurate estimation of multiple quality variables is critical for building industrial soft sensor models, which have long been confronted with data-efficiency and negative-transfer issues. Methods sharing backbone parameters among tasks address the data-efficiency issue; however, they still fail to mitigate the negative-transfer problem. To address this issue, a balanced Mixture-of-Experts (BMoE) is proposed in this work, which consists of a multi-gate mixtu...
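
The abstract mentions a multi-gate mixture-of-experts backbone; the sketch below shows one plausible form of that idea, with shared experts, one softmax gate per task, and per-task output heads. The dimensions, linear experts, and head layout are illustrative assumptions, not the paper's BMoE.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical sizes for a two-task soft sensor; not from the paper.
n_experts, n_tasks, d_in, d_hidden = 3, 2, 10, 6

expert_W = rng.normal(size=(n_experts, d_in, d_hidden))  # shared linear experts
gate_W = rng.normal(size=(n_tasks, d_in, n_experts))     # one gate per task
head_W = rng.normal(size=(n_tasks, d_hidden))            # per-task output head

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def mmoe_forward(x):
    """Shared experts, task-specific gates, one prediction per task."""
    expert_out = np.einsum('i,eih->eh', x, expert_W)      # (n_experts, d_hidden)
    preds = []
    for t in range(n_tasks):
        g = softmax(x @ gate_W[t])                        # (n_experts,)
        mixed = g @ expert_out                            # (d_hidden,)
        preds.append(mixed @ head_W[t])                   # scalar per task
    return np.array(preds)

print(mmoe_forward(rng.normal(size=d_in)))
```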

Chart of the number of search results per year
