
Sample Complexity of Sample Average Approximation for Conditional Stochastic Optimization

In this paper, we study a class of stochastic optimization problems, referred to as \emph{Conditional Stochastic Optimization} (CSO), of the form $\min_{x \in \mathcal{X}} \mathbb{E}_{\xi}\,f_\xi\big(\mathbb{E}_{\eta|\xi}[g_\eta(x,\xi)]\big)$, which covers a wide spectrum of applications, including portfolio selection, reinforcement learning, robust learning, and causal inference. Assuming the availability of samples from the distribution $\mathbb{P}(\xi)$ and from the conditional distribution $\mathbb{P}(\eta|\xi)$, we establish the sample complexity of the sample average approximation (SAA) for CSO under a variety of structural assumptions, such as Lipschitz continuity, smoothness, and error bound conditions. We show that the total sample complexity improves from $\mathcal{O}(d/\varepsilon^4)$ to $\mathcal{O}(d/\varepsilon^3)$ when the outer function is assumed to be smooth, and further to $\mathcal{O}(1/\varepsilon^2)$ when the empirical function satisfies the quadratic growth condition. We also establish the sample complexity of a modified SAA when $\xi$ and $\eta$ are independent. Several numerical experiments further support our theoretical findings.

Keywords: stochastic optimization, sample average approximation, large deviations theory
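To make the SAA construction concrete, below is a minimal sketch (not the authors' implementation) of the nested estimator $\hat F(x) = \frac{1}{n}\sum_{i=1}^n f_{\xi_i}\big(\frac{1}{m}\sum_{j=1}^m g_{\eta_{ij}}(x,\xi_i)\big)$ on a hypothetical toy instance: $\xi \sim \mathcal{N}(0,1)$, $\eta|\xi \sim \mathcal{N}(\xi,1)$, $g_\eta(x,\xi) = x - \eta$, and $f_\xi(u) = u^2$, so the true minimizer is $x^\star = 0$. The choices of $f$, $g$, $n$, and $m$ are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal SAA sketch for CSO:  min_x  E_xi[ f_xi( E_{eta|xi}[ g_eta(x, xi) ] ) ].
# Toy instance (illustrative assumptions, not from the paper):
#   xi ~ N(0, 1),  eta | xi ~ N(xi, 1),  g_eta(x, xi) = x - eta,  f_xi(u) = u^2,
# so the population objective is E[(x - xi)^2] = x^2 + 1, minimized at x* = 0.

rng = np.random.default_rng(0)

def f(u, xi):
    # Outer function f_xi; smooth here, matching the smooth-f regime.
    return u ** 2

def g(x, eta, xi):
    # Inner function g_eta(x, xi).
    return x - eta

def saa_objective(x, xis, etas):
    # Nested SAA estimator: the inner conditional expectation is replaced by
    # an m-sample average per xi_i, then f is averaged over the outer samples.
    inner = [g(x, eta_i, xi_i).mean() for xi_i, eta_i in zip(xis, etas)]
    return np.mean([f(u, xi) for u, xi in zip(inner, xis)])

n, m = 1000, 50                                      # outer / inner sample sizes
xis = rng.normal(size=n)                             # xi_1..n ~ P(xi)
etas = [rng.normal(loc=xi, size=m) for xi in xis]    # eta_{i,1..m} ~ P(eta | xi_i)

res = minimize(saa_objective, x0=np.array([1.0]), args=(xis, etas))
print("SAA minimizer:", res.x)                       # should be close to x* = 0
```

Because the inner average appears inside $f$, the estimator is biased for finite $m$; the rates quoted above reflect how large $n$ and $m$ (and hence the total number of samples $nm$) must be to reach an $\varepsilon$-accurate solution under each structural assumption.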
