Near-Optimal Design of Safe Output Feedback Controllers from Noisy Data

21 May 2021 · Luca Furieri, Baiwei Guo, Andrea Martin, Giancarlo Ferrari-Trecate

As we transition towards the deployment of data-driven controllers for black-box cyberphysical systems, complying with hard safety constraints becomes a primary concern. Two key aspects should be addressed when input-output data are corrupted by noise: how much uncertainty can one tolerate without compromising safety, and to what extent is the control performance affected? By focusing on finite-horizon constrained linear-quadratic problems, we provide an answer to these questions in terms of the model mismatch incurred during a preliminary identification phase. We propose a control design procedure based on a quasiconvex relaxation of the original robust problem and we prove that, if the uncertainty is sufficiently small, the synthesized controller is safe and near-optimal, in the sense that the suboptimality gap increases linearly with the model mismatch level. Since the proposed method is independent of the specific identification procedure, our analysis holds in combination with state-of-the-art behavioral estimators beyond standard least-squares. The main theoretical results are validated by numerical experiments.
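To make the pipeline described in the abstract concrete, here is a minimal, hedged sketch of its two ingredients: a preliminary identification step from noisy data, followed by a finite-horizon constrained linear-quadratic design on the estimated model with a safety margin that grows with the model mismatch. For simplicity the sketch assumes full state measurements and a least-squares estimator, whereas the paper treats output feedback from input-output data and more general behavioral estimators; the system matrices, noise level, horizon, and the tightening rule `margin = 5 * eps` are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch: identification from noisy data + constrained LQ design on the
# estimated model. All constants and the tightening rule are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# --- Ground-truth system (unknown to the designer) --------------------------
A_true = np.array([[0.9, 0.2], [0.0, 0.8]])
B_true = np.array([[0.0], [1.0]])
n, m = A_true.shape[0], B_true.shape[1]

# --- Step 1: identification from noisy data via ordinary least squares ------
N = 200                                      # number of one-step transitions
X = rng.normal(size=(n, N))                  # sampled states
U = rng.normal(size=(m, N))                  # persistently exciting inputs
W = 0.01 * rng.normal(size=(n, N))           # noise corrupting the data
Xp = A_true @ X + B_true @ U + W             # noisy successor states

# Least-squares estimate [A_hat, B_hat] = Xp * pinv([X; U])
Theta = Xp @ np.linalg.pinv(np.vstack([X, U]))
A_hat, B_hat = Theta[:, :n], Theta[:, n:]
eps = np.linalg.norm(np.hstack([A_hat - A_true, B_hat - B_true]), 2)
print(f"model mismatch (spectral norm): {eps:.4f}")

# --- Step 2: finite-horizon constrained LQ design on the nominal model ------
T = 15
x0 = np.array([1.0, 0.0])
x = cp.Variable((n, T + 1))
u = cp.Variable((m, T))

cost = 0
constraints = [x[:, 0] == x0]
margin = 5.0 * eps                           # ad-hoc tightening, grows with eps
for t in range(T):
    cost += cp.sum_squares(x[:, t]) + 0.1 * cp.sum_squares(u[:, t])
    constraints += [x[:, t + 1] == A_hat @ x[:, t] + B_hat @ u[:, t]]
    constraints += [cp.norm(x[:, t + 1], "inf") <= 1.0 - margin]  # tightened safety set
    constraints += [cp.norm(u[:, t], "inf") <= 2.0]
cost += cp.sum_squares(x[:, T])

prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()
print(f"nominal cost with tightened constraints: {prob.value:.3f}")
```

In the paper, the tightening and the resulting suboptimality bound are derived from the robust reformulation rather than chosen ad hoc; this sketch only illustrates how a smaller identification error leaves a larger feasible safety set and hence a smaller performance loss.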
