Boosted Pre-loaded Mixture of Experts for low-resolution face recognition

Authors

  • Reza Ebrahimpour
  • Naser Sadeghnejad
  • Saeed Masoudnia
  • Seyed Ali Asghar AbbasZadeh Arani
Abstract

This paper presents a modified version of Boosted Mixture of Experts (BME) for low-resolution face recognition. Most methods developed for low-resolution face recognition focus on improving the resolution of face images and/or on special feature-extraction methods that can deal effectively with the low-resolution problem. In this paper, we focus instead on the classification step of the face recognition process. Combining Neural Networks (NNs) is an efficient approach to complex classification problems such as low-resolution face recognition, which involves high-dimensional feature sets and highly overlapped classes. Mixture of Experts (ME) and boosting are two of the most popular and interesting NN combining methods, and both have great potential for improving classification performance. We present a modified combining approach based on features of both ME and boosting in order to deal with this complex classification problem efficiently. Previous works [1,2] attempted to incorporate the complementary features of the boosting method into the ME training algorithm to boost performance; these approaches, called Boosted Mixture of Experts (BME), have some drawbacks. Based on an analysis of the problems of the previous approaches, several modifications are suggested in this paper. A modification of the pre-loading (initialization) procedure of ME is proposed to address the limitations of the previous approaches and overcome them with a two-stage pre-loading procedure. In our approach, both error and confidence measures are used as the difficulty criteria in the boosting-based partitioning of the problem space. Given the nature of this approach, we call the proposed method Boosted Pre-loaded Mixture of Experts (BPME). The proposed method is tested on a low-resolution face recognition problem and compared to other variations of ME and the boosting method. The experiments are conducted on low-resolution variations of two common face databases, ORL and Yale. The experimental results show that BPME achieves significantly better recognition rates than the other combining methods under various test conditions, including different quality grades of face images and different training-set sizes.
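The combination scheme the abstract builds on can be made concrete with a short sketch. The code below shows a minimal, illustrative Mixture of Experts in plain Python/numpy, together with a boosting-style difficulty score that combines an error term with a confidence (margin) term, in the spirit of the criteria mentioned above. This is not the authors' BPME: the linear experts, the linear gate, the margin-based confidence measure, and names such as MixtureOfExperts and difficulty are illustrative assumptions, and the paper's exact formulas may differ.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfExperts:
    def __init__(self, n_features, n_classes, n_experts):
        # Each expert is a linear classifier; the gate is another
        # linear map that produces one mixing weight per expert.
        self.W_experts = rng.normal(0.0, 0.01, (n_experts, n_features, n_classes))
        self.W_gate = rng.normal(0.0, 0.01, (n_features, n_experts))

    def forward(self, X):
        # Expert opinions: shape (n_samples, n_experts, n_classes).
        expert_out = softmax(np.einsum('nf,efc->nec', X, self.W_experts))
        # Gating weights: shape (n_samples, n_experts), summing to 1,
        # i.e. a soft partition of the input space among the experts.
        gate = softmax(X @ self.W_gate)
        # Final posterior: gate-weighted sum of the expert outputs.
        return np.einsum('ne,nec->nc', gate, expert_out)

def difficulty(y_true, y_prob):
    # Boosting-style difficulty score (assumed form, not the paper's):
    # misclassified samples and samples classified with a small margin
    # between the two largest posteriors count as "hard".
    err = (y_prob.argmax(axis=1) != y_true).astype(float)
    top2 = np.sort(y_prob, axis=1)[:, -2:]
    margin = top2[:, 1] - top2[:, 0]      # small margin = low confidence
    return err + (1.0 - margin)

# Illustrative usage: 64-dimensional face features, 40 identities
# (the number of subjects in ORL), 4 experts.
me = MixtureOfExperts(n_features=64, n_classes=40, n_experts=4)
X = rng.normal(size=(8, 64))
y = rng.integers(0, 40, size=8)
scores = difficulty(y, me.forward(X))    # higher score = harder sample

In a boosting-based partitioning of the kind described above, samples with high difficulty scores would be emphasized when pre-loading (initializing) the experts, so that each expert specializes in a hard region of the problem space.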


Similar articles

Mixture of Experts for Persian handwritten word recognition

This paper presents the results of Persian handwritten word recognition based on the Mixture of Experts technique. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, we used a Mixture of Experts of Multilayer Perceptrons with a momentum term, in the classification ...


Facial expression recognition based on Local Binary Patterns: A comprehensive study

Automatic facial expression analysis is an interesting and challenging problem, and impacts important applications in many areas such as human–computer interaction and data-driven animation. Deriving an effective facial representation from original face images is a vital step for successful facial expression recognition. In this paper, we empirically evaluate facial representation based on stat...


Teacher-directed learning in view-independent face recognition with mixture of experts using overlapping eigenspaces

A model for view-independent face recognition based on Mixture of Experts (ME) is presented. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our proposed model, the ME is directed to adapt to a particular partitioning corresponding to predetermined views. To force an exper...


Teacher-Directed Learning with Mixture of Experts for View-Independent Face Recognition

We propose two new models for view-independent face recognition, which fall under the category of multiview approaches. We use the so-called "mixture of experts" (MOE), in which the problem space is divided into several subspaces for the experts, and the outputs of the experts are then combined by a gating network to form the final output. Basically, our focus is on the way that the face space is p...


Teacher-directed learning in view-independent face recognition with mixture of experts using single-view eigenspaces

We propose a new model for view-independent face recognition using a multiview approach. We use the so-called "mixture of experts" (ME), in which the problem space is divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. In our model, instead of leaving the ME to partition the face space automatically, the ME is directed to adapt to a particul...



Journal:
  • Int. J. Hybrid Intell. Syst.

Volume 9, Issue 

Pages  -

Publication date: 2012