Predicting vehicle prices via quantum-assisted feature selection
Abstract
Feature selection is a technique used to reduce complexity and improve machine learning model performance with respect to generalization, fit, and prediction accuracy. A central challenge in this process is that the search over the space of features for an optimal subset of size k is a known NP-hard problem. In this work, we study metrics for encoding this combinatorial search as a binary quadratic model, such as the Generalized Mean Information Coefficient and Pearson correlation, with application to the underlying regression problem of vehicle price prediction. We compare predictive models leveraging quantum-assisted versus classical subroutines for the search, using the minimum-redundancy maximal-relevance (mRMR) heuristic in our approach. We cross-validate the models on a real-world price-prediction task and show an improvement in mean absolute error for our method (1471.02 ± 135.6) over similar methodologies: greedy selection (1707.2 ± 168), recursive feature elimination (1678.3 ± 143.7), and using all features (1546.2 ± 154). Our findings show that quantum-assisted routines yield solutions that increase the quality of predictive output while reducing input dimensionality to the learning algorithm on both synthetic and real-world data.
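The mRMR-as-binary-quadratic-model encoding described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: it uses Pearson correlation for both relevance and redundancy, and an exhaustive classical solver stands in for the quantum-assisted subroutine (a quantum annealer would instead add a penalty term to enforce the size-k constraint).

```python
# Sketch: mRMR feature selection cast as a QUBO (assumed encoding,
# not the paper's code). Diagonal terms reward relevance to the target;
# off-diagonal terms penalize redundancy between features.
import itertools
import numpy as np

def mrmr_qubo(X, y):
    """Build QUBO matrix Q: Q[i,i] = -|corr(f_i, y)| (relevance),
    Q[i,j] + Q[j,i] = |corr(f_i, f_j)| (redundancy)."""
    n = X.shape[1]
    Q = np.zeros((n, n))
    for i in range(n):
        Q[i, i] = -abs(np.corrcoef(X[:, i], y)[0, 1])
        for j in range(i + 1, n):
            r = abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
            Q[i, j] = Q[j, i] = r / 2
    return Q

def best_subset(Q, k):
    """Exhaustive minimizer of x^T Q x over binary x with exactly k ones
    (stand-in for a quantum-assisted solver)."""
    n = Q.shape[0]
    best, best_e = None, np.inf
    for idx in itertools.combinations(range(n), k):
        x = np.zeros(n)
        x[list(idx)] = 1
        e = x @ Q @ x
        if e < best_e:
            best, best_e = idx, e
    return best

# Synthetic demo: feature 3 is a near-copy of feature 0, so redundancy
# should prevent both from being selected together.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)
y = 2 * X[:, 0] + X[:, 1] + 0.1 * rng.normal(size=200)
sel = best_subset(mrmr_qubo(X, y), k=2)
print(sel)  # tuple of selected feature indices
```

For realistic feature counts the exhaustive search is intractable; that is exactly where the paper substitutes a quantum-assisted sampler for the minimization step.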
Similar articles
Feature Selection via Discretization
Discretization can turn numeric attributes into discrete ones. Feature selection can eliminate some irrelevant and/or redundant attributes. Chi2 is a simple and general algorithm that uses the χ² statistic to discretize numeric attributes repeatedly until some inconsistencies are found in the data. It achieves feature selection via discretization. It can handle mixed attributes, work with mul...
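The χ²-driven interval merging behind this family of methods can be sketched in a ChiMerge-style loop (an illustrative assumption, not the Chi2 paper's exact algorithm, which adjusts the threshold automatically): adjacent intervals with the lowest χ² statistic are merged until every remaining pair exceeds a significance threshold.

```python
# ChiMerge-style discretization sketch (assumed simplification of the
# Chi2 approach): merge adjacent intervals with low chi-square statistics.
import numpy as np

def chi2_stat(a, b):
    """Chi-square statistic for the 2-row contingency table formed by
    two adjacent intervals' per-class counts."""
    table = np.array([a, b], dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()
    expected[expected == 0] = 1e-9  # guard against division by zero
    return float(((table - expected) ** 2 / expected).sum())

def chimerge(values, labels, threshold=3.84):  # 3.84 ~ chi2(df=1), p=0.05
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    classes = np.unique(labels)
    # Start with one interval per distinct value, tracking class counts.
    intervals = []
    for v in np.unique(values):
        counts = np.array([np.sum((values == v) & (labels == c))
                           for c in classes])
        intervals.append((v, counts))
    while len(intervals) > 1:
        stats = [chi2_stat(intervals[i][1], intervals[i + 1][1])
                 for i in range(len(intervals) - 1)]
        i = int(np.argmin(stats))
        if stats[i] >= threshold:
            break  # all adjacent pairs are significantly different
        merged = (intervals[i][0], intervals[i][1] + intervals[i + 1][1])
        intervals = intervals[:i] + [merged] + intervals[i + 2:]
    return [v for v, _ in intervals]  # lower bound of each interval

vals = np.array([1., 2., 3., 10., 11., 12.])
labs = np.array([0, 0, 0, 1, 1, 1])
cuts = chimerge(vals, labs)
print(cuts)
```

On this toy input the two class-pure value runs collapse into two intervals, because merging across the class boundary produces a χ² statistic above the threshold.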
Feature Selection and Predicting CardioVascular Risk
No gold standard exists for assessing the risk of individual patients in cardiovascular medicine. The medical data used for such purposes is, itself, inconsistent over a history of patients at any one clinical site, and not always immediately useable. In this paper the clustering of data using Self Organizing Maps (SOM) is described. This method is an unsupervised neural network developed by Te...
Feature Selection via Dependence Maximization
We introduce a framework of feature selection based on dependence maximization between the selected features and the labels of an estimation problem, using the Hilbert-Schmidt Independence Criterion. The key idea is that good features should be highly dependent on the labels. Our approach leads to a greedy procedure for feature selection. We show that a number of existing feature selectors are ...
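The greedy, dependence-maximizing procedure this snippet describes can be sketched with a biased HSIC estimate, trace(KHLH)/(n-1)², here with linear kernels for brevity (an illustrative assumption; the cited framework supports arbitrary kernels and backward elimination as well):

```python
# Sketch (not the paper's code): greedy forward feature selection that
# scores each candidate by an HSIC estimate between feature and labels.
import numpy as np

def hsic(x, y):
    """Biased HSIC estimate trace(K H L H) / (n-1)^2 with linear
    kernels K = x x^T and L = y y^T."""
    n = len(x)
    K = np.outer(x, x)
    L = np.outer(y, y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2

def greedy_hsic(X, y, k):
    """Pick k features one at a time, each maximizing HSIC with y."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        scores = [hsic(X[:, j], y) for j in remaining]
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Demo: y depends almost entirely on feature 2.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 2] + 0.1 * rng.normal(size=100)
print(greedy_hsic(X, y, 2))  # feature 2 is picked first
```

The key idea from the snippet — good features should be highly dependent on the labels — shows up directly: the informative feature dominates the HSIC score and is selected first.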
Feature Selection via Probabilistic Outputs
This paper investigates two feature-scoring criteria that make use of estimated class probabilities: one method proposed by Shen et al. (2008) and a complementary approach proposed below. We develop a theoretical framework to analyze each criterion and show that both estimate the spread (across all values of a given feature) of the probability that an example belongs to the positive class. Base...
Journal
Journal title: International Journal of Information Technology
Year: 2023
ISSN: 2511-2112, 2511-2104
DOI: https://doi.org/10.1007/s41870-023-01370-z