Extreme learning machine versus classical feedforward network

Authors

Abstract

Our research is devoted to answering whether randomisation-based learning can be fully competitive with classical feedforward neural networks trained using the backpropagation algorithm for classification and regression tasks. We chose the extreme learning machine (ELM) as an example of randomisation-based networks. The models were evaluated with reference to training time and achieved efficiency. We conducted an extensive comparison of these two methods on various tasks in two scenarios: with comparable network capacity and with architectures tuned for each model. The comparison was performed on multiple datasets from public repositories and on some artificial datasets created for this research. Overall, the experiments covered more than 50 datasets, and suitable statistical tests supported the results. They confirm that on relatively small datasets, extreme learning machines are better than networks trained by the backpropagation algorithm. But on demanding image datasets like ImageNet, ELM is not competitive with modern networks trained by backpropagation; therefore, in order to properly address current practical needs of pattern recognition, ELM requires further development. Based on our experience, we postulate developing smart algorithms for inverse matrix calculation, so that determining the output weights on challenging datasets becomes feasible and memory efficient. There is also a need to create specific mechanisms that avoid keeping the whole dataset in memory to compute the weights. These are the most problematic elements of ELM processing and establish the main obstacle to its widespread application.
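For context, a minimal sketch of the ELM training step the abstract refers to: the hidden-layer weights and biases are drawn at random, and only the output weights are computed, via a linear least-squares solve with the pseudo-inverse of the hidden-layer activation matrix. The layer size, activation, and the toy regression data below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def elm_train(X, T, n_hidden=100, rng=None):
    """Train a single-hidden-layer ELM: random hidden layer, least-squares output weights."""
    rng = rng or np.random.default_rng(0)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activation matrix
    beta = np.linalg.pinv(H) @ T                 # output weights via Moore-Penrose pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression example on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
T = np.sin(X.sum(axis=1, keepdims=True))
W, b, beta = elm_train(X, T, n_hidden=50, rng=rng)
pred = elm_predict(X, W, b, beta)
print("training MSE:", np.mean((pred - T) ** 2))
```

The pseudo-inverse step is exactly where the abstract's concerns arise: for large datasets the activation matrix H grows with the number of samples, so both the memory footprint and the inverse computation become the bottleneck.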



Related articles

Learning of a single-hidden layer feedforward neural network using an optimized extreme learning machine

This paper proposes a learning framework for single-hidden layer feedforward neural networks (SLFN) called optimized extreme learning machine (O-ELM). In O-ELM, the structure and the parameters of the SLFN are determined using an optimization method. The output weights, like in the batch ELM, are obtained by a least squares algorithm, but using Tikhonov’s regularization in order to improve the ...
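The Tikhonov-regularized output-weight computation mentioned above amounts to a ridge-style least-squares solve in place of the plain pseudo-inverse; a hedged sketch follows, where the regularization strength and variable names are assumptions for illustration.

```python
import numpy as np

def regularized_output_weights(H, T, lam=1e-3):
    """Output weights with Tikhonov (ridge) regularization:
    beta = (H^T H + lam * I)^(-1) H^T T, numerically more stable than a plain pseudo-inverse."""
    n_hidden = H.shape[1]
    A = H.T @ H + lam * np.eye(n_hidden)
    return np.linalg.solve(A, H.T @ T)
```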


Distributed Extreme Learning Machine for Nonlinear Learning over a Network

Distributed data collection and analysis over a network are ubiquitous, especially over a wireless sensor network (WSN). To our knowledge, the data model used in most of the distributed algorithms is linear. However, in real applications, the linearity of systems is not always guaranteed. In nonlinear cases, the single hidden layer feedforward neural network (SLFN) with radial basis function (R...


Extreme Learning Machine

The slow speed of feedforward neural networks has been hampering their growth for the past decades. Unlike traditional algorithms, the extreme learning machine (ELM) [5][6] for single-hidden-layer feedforward networks (SLFN) chooses the input weights and hidden biases randomly and determines the output weights through linear algebraic manipulations. We propose ELM as an auto associative neural network (AANN) and i...
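The auto-associative use of ELM mentioned in this snippet can be sketched by training the same random-hidden-layer network to reconstruct its own input (targets equal inputs), with the per-sample reconstruction error available as, for instance, a novelty score; the function and its defaults below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def elm_autoencoder_error(X, n_hidden=50, seed=0):
    """Auto-associative ELM: random hidden layer, output weights fitted to reconstruct X itself."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ X                # targets are the inputs themselves
    recon = H @ beta
    return np.mean((recon - X) ** 2, axis=1)    # per-sample reconstruction error
```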


Extreme Learning Machine: A Review

Feedforward neural networks (FFNN) have been utilised in a wide range of machine learning research and have gained wide acceptance. However, their training is often slower than needed, which has created critical bottlenecks in their applications. Extreme Learning Machines (ELM) were suggested as alternative learn...



Journal

Journal title: Neural Computing and Applications

Year: 2021

ISSN: 0941-0643, 1433-3058

DOI: https://doi.org/10.1007/s00521-021-06402-y