Hyperparameter optimization of tapping center machines model using robust whale optimization algorithm

Authors

Abstract

To build a synchronization error prediction model for the machine tool efficiently, the robust whale optimization algorithm (RWOA) proposed in this study is applied to the hyperparameter optimization of the model. The RWOA integrates non-linear time-invariant inertia weighting (NTIW) and Taguchi-based adaptive parameter exploration (ATPE) to improve the performance of the whale optimization algorithm (WOA) and promote its robustness. NTIW can improve other algorithms, so it is applied to the WOA; in addition, the Taguchi method obtains an excellent combination of variables with optimal values and stable performance, making the algorithm robust. First, to verify the validity of the method, 13 benchmark functions were tested in this study. The results of the function tests include the mean, the standard deviation, and the p-value of a t-distribution test, and show that 11 of the functions differ significantly. For the functions with non-significant differences, the means and standard deviations obtained by the RWOA are still considerably better. Since the production cost of machine tools is high, a prediction model that can be built effectively can reduce that cost. Therefore, in this study, the RWOA was used to explore the best hyperparameters of the model. From the results, the model's average MAPE (mean absolute percentage error) is 7.2604% on the training data and 9.2603% on the testing data over 30 modeling runs. For one of the models, the MAPE is 6.8384% and 6.7372% on the training and testing data, respectively. This model was also introduced into an actual experiment and showed a MAPE of 6.3447%. The proposed RWOA effectively explores suitable hyperparameters for the model of the tapping center machine.
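The abstract names the building blocks (a WOA search loop, a non-linear inertia weight, MAPE as the reported error metric) without giving their formulas. The following is a minimal, hedged Python sketch of a WOA loop with a generic non-linear inertia weight plus a MAPE helper; the weight schedule, the `woa_minimize` routine, and the sphere test function are illustrative assumptions only, not the authors' RWOA, NTIW, or ATPE procedure.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, the metric reported in the abstract."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def nonlinear_inertia_weight(t, t_max, w_max=0.9, w_min=0.4):
    """Hypothetical non-linear inertia weight decaying from w_max to w_min.
    The paper's exact NTIW formula is not given in the abstract; this is one
    common non-linear schedule, used here for illustration only."""
    return w_min + (w_max - w_min) * (1.0 - (t / t_max) ** 2)

def woa_minimize(objective, dim, bounds, n_whales=30, t_max=200, seed=0):
    """Minimal WOA sketch with an inertia weight applied to the guiding
    position (an assumption, not the authors' RWOA update rule)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_whales, dim))
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[fitness.argmin()].copy()

    for t in range(t_max):
        w = nonlinear_inertia_weight(t, t_max)
        a = 2.0 * (1.0 - t / t_max)              # standard WOA control parameter
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            p, l = rng.random(), rng.uniform(-1, 1)
            if p < 0.5:
                if np.all(np.abs(A) < 1):        # encircle the best solution
                    X[i] = w * best - A * np.abs(C * best - X[i])
                else:                            # explore around a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = w * rand - A * np.abs(C * rand - X[i])
            else:                                # spiral (bubble-net) update
                D = np.abs(best - X[i])
                X[i] = D * np.exp(l) * np.cos(2 * np.pi * l) + w * best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < objective(best):
            best = X[fitness.argmin()].copy()
    return best, objective(best)

if __name__ == "__main__":
    # Stand-in benchmark function; a real use would minimize the model's MAPE
    # as a function of its hyperparameters.
    sphere = lambda x: float(np.sum(x ** 2))
    best, val = woa_minimize(sphere, dim=2, bounds=(-10, 10))
    print(best, val)
    print(mape([100, 200], [110, 190]))          # 7.5 (% error)
```

In a hyperparameter-tuning setting, `objective` would train the synchronization error model with the candidate hyperparameters and return its validation MAPE; the benchmark comparison described in the abstract would repeat such runs and compare means, standard deviations, and t-test p-values across algorithms.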

Similar Articles

Nonlinear regression model generation using hyperparameter optimization

An algorithm for inductive model generation and model selection is proposed to solve the problem of automatic construction of regression models. A regression model is an admissible superposition of smooth functions given by experts. Coherent Bayesian inference is used to estimate model parameters. It introduces hyperparameters, which describe the distribution function of the model parameters...


Applying Model-Based Optimization to Hyperparameter Optimization in Machine Learning

This talk will cover the main components of sequential model-based optimization algorithms. Algorithms of this kind represent the state-of-the-art for expensive black-box optimization problems and are getting increasingly popular for hyper-parameter optimization of machine learning algorithms, especially on larger data sets. The talk will cover the main components of sequential model-based optim...


Optimization of e-Learning Model Using Fuzzy Genetic Algorithm

The e-learning model is examined in three major dimensions, and each dimension has a range of indicators that are effective in optimization and modeling. In many optimization problems, the target function or constraints may change over time, so the optima of these problems can also change. If any of these undetermined events are considered in the optimization process, t...


Practical Hyperparameter Optimization

Recently, the bandit-based strategy Hyperband (HB) was shown to yield good hyperparameter settings of deep neural networks faster than vanilla Bayesian optimization (BO). However, for larger budgets, HB is limited by its random search component, and BO works better. We propose to combine the benefits of both approaches to obtain a new practical state-of-the-art hyperparameter optimization metho...



Journal

Journal title: Advances in Mechanical Engineering

Year: 2023

ISSN: 1687-8132, 1687-8140

DOI: https://doi.org/10.1177/16878132231175000