A Hyperparameter Optimization for Galaxy Classification
Abstract
In this study, morphological galaxy classification was carried out with a hybrid approach. Since galaxies may contain detailed information about the formation of the universe, their classification remains a current research topic. Researchers have divided the more than 100 billion known galaxies into ten different classes, and it is not always possible to tell which class a given galaxy belongs to. However, Artificial Intelligence (AI) can be used for successful classification. There are studies on the automatic classification of a small number of classes, but as the number of classes increases, the success of these methods decreases. Based on the literature, Convolutional Neural Networks (CNNs) perform better on this task. Three meta-heuristic algorithms were used to find an optimal CNN architecture: the Grey Wolf Optimizer (GWO), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC) algorithm. A CNN with nine hidden layers and two fully connected layers was used, and the number of neurons in the fully connected layers, the learning rate, and the batch size were optimized. The accuracy of the resulting model is 85%, with the best results obtained by GWO. Manual hyperparameter optimization is difficult; the GWO algorithm helps automate it.
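The loop below is a minimal sketch of how GWO can drive this kind of search, not the paper's actual implementation: the bounds, wolf count, and the `fitness` stand-in (which replaces training the nine-layer CNN and returning 1 - validation accuracy) are all illustrative assumptions.

```python
import numpy as np

# Search space from the abstract: neurons in the two fully connected
# layers, learning rate, and batch size. These bounds are illustrative
# assumptions, not the ranges used in the paper.
LOWER = np.array([32.0, 32.0, 1e-4, 16.0])
UPPER = np.array([1024.0, 1024.0, 1e-1, 256.0])

def fitness(x):
    # Placeholder for the real objective, which would train the
    # nine-layer CNN with these hyperparameters and return
    # 1 - validation accuracy. A synthetic bowl keeps the sketch
    # runnable without a GPU.
    target = np.array([512.0, 128.0, 1e-3, 64.0])
    return float(np.sum(((x - target) / (UPPER - LOWER)) ** 2))

def grey_wolf_optimizer(n_wolves=10, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(LOWER, UPPER, size=(n_wolves, LOWER.size))
    best_pos, best_score = None, np.inf
    for t in range(n_iters):
        scores = np.array([fitness(w) for w in wolves])
        order = np.argsort(scores)
        # The three best wolves lead the pack this iteration.
        alpha, beta, delta = (wolves[j].copy() for j in order[:3])
        if scores[order[0]] < best_score:
            best_score, best_pos = scores[order[0]], alpha.copy()
        a = 2.0 * (1.0 - t / n_iters)  # shrinks from 2 to 0: explore, then exploit
        for i in range(n_wolves):
            new_pos = np.zeros_like(wolves[i])
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(LOWER.size), rng.random(LOWER.size)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                D = np.abs(C * leader - wolves[i])  # distance to this leader
                new_pos += (leader - A * D) / 3.0   # average pull of the three best
            wolves[i] = np.clip(new_pos, LOWER, UPPER)
    return best_pos

best = grey_wolf_optimizer()
print(f"fc1={int(round(best[0]))}, fc2={int(round(best[1]))}, "
      f"lr={best[2]:.2e}, batch={int(round(best[3]))}")
```

PSO and ABC can be swapped in by replacing the position-update rule; in all three cases the fitness call is the expensive part, which is why only a handful of hyperparameters are exposed to the search.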
Similar Resources
Hyperparameter Optimization: A Spectral Approach
We give a simple, fast algorithm for hyperparameter optimization inspired by techniques from the analysis of Boolean functions. We focus on the high-dimensional regime where the canonical example is training a neural network with a large number of hyperparameters. The algorithm, an iterative application of compressed sensing techniques for orthogonal polynomials, requires only uniform sampling ...
Surrogate Benchmarks for Hyperparameter Optimization
Since hyperparameter optimization is crucial for achieving peak performance with many machine learning algorithms, an active research community has formed around this problem in the last few years. The evaluation of new hyperparameter optimization techniques against the state of the art requires a set of benchmarks. Because such evaluations can be very expensive, early experiments are often per...
Practical Hyperparameter Optimization
Recently, the bandit-based strategy Hyperband (HB) was shown to yield good hyperparameter settings of deep neural networks faster than vanilla Bayesian optimization (BO). However, for larger budgets, HB is limited by its random search component, and BO works better. We propose to combine the benefits of both approaches to obtain a new practical state-of-the-art hyperparameter optimization metho...
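As a rough illustration of the Hyperband side of that combination, the sketch below implements successive halving, Hyperband's core subroutine. The `evaluate` signature, the budget schedule, and the toy loss are assumptions for the example; the BO component, which would propose `configs` instead of sampling them at random, is omitted.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    # Hyperband's core subroutine: try every config on a small budget,
    # keep the best 1/eta fraction, grow the budget by eta, repeat.
    budget = min_budget
    while len(configs) > 1:
        scored = sorted(configs, key=lambda c: evaluate(c, budget))
        configs = scored[:max(1, len(configs) // eta)]
        budget *= eta
    return configs[0]

# Toy usage: "configs" are learning rates and the loss after `budget`
# epochs is simulated; a BO model would normally propose these points.
def fake_loss(lr, budget):
    return abs(lr - 1e-3) / budget + random.gauss(0, 1e-4)

random.seed(0)
candidates = [10 ** random.uniform(-5, -1) for _ in range(27)]
print("chosen lr:", successive_halving(candidates, fake_loss))
```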
Hyperparameter optimization with approximate gradient
Most models in machine learning contain at least one hyperparameter to control for model complexity. Choosing an appropriate set of hyperparameters is both crucial in terms of model accuracy and computationally challenging. In this work we propose an algorithm for the optimization of continuous hyperparameters using inexact gradient information. An advantage of this method is that hyperparamete...
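In the same spirit, the fragment below sketches a hypergradient step for a case where the inner solution is available in closed form: ridge regression, with the validation-loss gradient with respect to the regularizer lambda obtained by implicit differentiation. The data, step size, and iteration count are synthetic stand-ins, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
X_tr, X_val = rng.normal(size=(80, 10)), rng.normal(size=(40, 10))
w_true = rng.normal(size=10)
y_tr = X_tr @ w_true + 0.5 * rng.normal(size=80)
y_val = X_val @ w_true + 0.5 * rng.normal(size=40)

def val_loss_and_hypergrad(lam):
    # Inner problem: w(lam) = argmin ||X_tr w - y_tr||^2 + lam ||w||^2,
    # solved in closed form via the normal equations.
    A = X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1])
    w = np.linalg.solve(A, X_tr.T @ y_tr)
    resid = X_val @ w - y_val
    loss = resid @ resid / (2 * len(y_val))
    grad_w = X_val.T @ resid / len(y_val)  # dL_val/dw
    dw_dlam = -np.linalg.solve(A, w)       # implicit differentiation of A w = b
    return loss, grad_w @ dw_dlam

lam = 1.0
for _ in range(200):
    loss, g = val_loss_and_hypergrad(lam)
    lam = max(lam - 5.0 * g, 1e-8)         # gradient descent on the hyperparameter
print(f"lambda = {lam:.4g}, validation loss = {loss:.4f}")
```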
Hyperparameter Search Space Pruning - A New Component for Sequential Model-Based Hyperparameter Optimization
The optimization of hyperparameters is often done manually or exhaustively but recent work has shown that automatic methods can optimize hyperparameters faster and even achieve better final performance. Sequential model-based optimization (SMBO) is the current state of the art framework for automatic hyperparameter optimization. Currently, it consists of three components: a surrogate model, an ac...
Journal
Journal title: Computers, Materials & Continua
Year: 2023
ISSN: 1546-2218, 1546-2226
DOI: https://doi.org/10.32604/cmc.2023.033155