Fast Rates for Contextual Linear Optimization

Authors

Abstract

Incorporating side observations in decision making can reduce uncertainty and boost performance, but it also requires that we tackle a potentially complex predictive relationship. Although one may use off-the-shelf machine learning methods to separately learn a predictive model and plug it in, a variety of recent methods instead integrate estimation and optimization by fitting the model to directly optimize downstream decision performance. Surprisingly, in the case of contextual linear optimization, we show that the naïve plug-in approach actually achieves regret convergence rates that are significantly faster than methods that directly optimize downstream decision performance. We show this by leveraging the fact that specific problem instances do not have arbitrarily bad near-dual-degeneracy. Although there are other pros and cons to consider, as we discuss and illustrate numerically, our results highlight a nuanced landscape for the enterprise of integrating estimation and optimization. Our results are overall positive for practice: predictive models are easy and fast to train using existing tools; simple to interpret; and, as we show, lead to decisions that perform very well. This paper was accepted by Hamid Nazerzadeh, data science.
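The plug-in approach the abstract contrasts with end-to-end methods has two decoupled steps: fit a model for the conditional expected cost E[c | x], then solve the linear optimization with the predicted cost plugged in. A minimal sketch under assumed toy data (a synthetic linear cost model and a simplex feasible region, neither taken from the paper):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.optimize import linprog

# Hypothetical toy instance: contexts x in R^p, random cost vectors c in R^d,
# decisions z constrained to the probability simplex {z >= 0, sum(z) = 1}.
rng = np.random.default_rng(0)
n, p, d = 200, 3, 4
X = rng.normal(size=(n, p))
W = rng.normal(size=(p, d))
C = X @ W + 0.1 * rng.normal(size=(n, d))  # observed context-dependent costs

# Step 1 (estimation): separately fit a plug-in model for E[c | x].
model = LinearRegression().fit(X, C)

# Step 2 (optimization): for a new context, plug the predicted cost into
# the linear program  min_z  c_hat(x)^T z  over the feasible polytope.
x_new = rng.normal(size=(1, p))
c_hat = model.predict(x_new).ravel()
res = linprog(c_hat, A_eq=np.ones((1, d)), b_eq=[1.0], bounds=[(0, 1)] * d)
z_star = res.x  # optimal decision for the predicted cost
```

Because the objective is linear, the plug-in decision lands at a vertex of the polytope; for the simplex this means all mass on the coordinate with the smallest predicted cost.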


Related articles

Fast Rates for Bandit Optimization with Upper-Confidence Frank-Wolfe

We consider the problem of bandit optimization, inspired by stochastic optimization and online learning problems with bandit feedback. In this problem, the objective is to minimize a global loss function of all the actions, not necessarily a cumulative loss. This framework allows us to study a very general class of problems, with applications in statistics, machine learning, and other fields. T...


Gradient algorithms for quadratic optimization with fast convergence rates

We propose a family of gradient algorithms for minimizing a quadratic function f(x) = (Ax, x)/2 − (x, y) in R or a Hilbert space, with simple rules for choosing the step-size at each iteration. We show that when the step-sizes are generated by a dynamical system with ergodic distribution having the arcsine density on a subinterval of the spectrum of A, the asymptotic rate of convergence of the a...


Fast optimization of Multithreshold Entropy Linear Classifier

Multithreshold Entropy Linear Classifier (MELC) is a density-based model which searches for a linear projection maximizing the Cauchy-Schwarz Divergence of the dataset's kernel density estimation. Despite its good empirical results, one of its drawbacks is the optimization speed. In this paper we analyze how one can speed it up through solving an approximate problem. We analyze two methods, both simil...


Fast rates for Noisy Clustering

The effect of errors in variables in empirical minimization is investigated. Given a loss l and a set of decision rules G, we prove a general upper bound for an empirical minimization based on a deconvolution kernel and a noisy sample Z_i = X_i + ε_i, i = 1, …, n. We apply this general upper bound to give the rate of convergence for the expected excess risk in noisy clustering. A recent bound...


A FAST FUZZY-TUNED MULTI-OBJECTIVE OPTIMIZATION FOR SIZING PROBLEMS

The most recent approaches to multi-objective optimization apply meta-heuristic algorithms, for which parameter tuning is still a challenge. The present work hybridizes swarm intelligence with fuzzy operators to extend crisp values of the main control parameters into special fuzzy sets that are constructed based on a number of prescribed facts. Such parameter-less particle ...



Journal

Journal: Management Science

Year: 2022

ISSN: 0025-1909, 1526-5501

DOI: https://doi.org/10.1287/mnsc.2022.4383