Fig. 10 | Advanced Modeling and Simulation in Engineering Sciences

From: A comparison of mixed-variables Bayesian optimization approaches

Sketch of Rockafellar’s augmented Lagrangian for \(\rho \approx 0\) in blue and \(\rho >0\) in red. \(x^1\) is infeasible, \(x^2\) is feasible (with \(g(x^2) < -\lambda /\rho \)), and \(x^\star \) is an optimum with \(g(x^\star )=0\). The black highlighted curves are the approximation to the dual function, \({\widehat{D}}(\lambda )\) for \({{X}} = \{x^1,x^2,x^\star \}\), for \(\rho \approx 0\) and \(\rho >0\). With the blue set of curves there is no saddle point and there is a duality gap, in that \(x^\star \notin \arg \min _x L_A(x;\lambda ^\star ,\rho \approx 0)\) and \({\widehat{D}}(\lambda ^\star ) = \min _x L_A(x;\lambda ^\star ,\rho \approx 0) < L_A(x^\star ;\lambda ^\star ,\rho \approx 0)\), i.e., minimizing the augmented Lagrangian does not solve the original problem. However, as \(\rho \) increases, the y-intercepts of the infeasible points increase, so that one always reaches a state where \(x^\star = \arg \min _x L_A(x;\lambda ^\star ,\rho )\), as in the red set of curves. A similar illustration can be made for the augmented Lagrangian with an equality constraint: \(f(x)+\rho /2\, h^2(x)\) is the y-intercept and \(h(x)\) is the slope of the augmented Lagrangian associated with \(x\). The main difference is that all points then contribute linearly in \(\lambda \) to \(L_A(x;\lambda ,\rho )\)
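The mechanism in the caption can be reproduced numerically. The sketch below uses hypothetical \((f, g)\) values for three points playing the roles of \(x^1\), \(x^2\), and \(x^\star\) (the numbers are illustrative, not from the article), evaluates Rockafellar's augmented Lagrangian \(L_A(x;\lambda,\rho)\) for an inequality constraint \(g(x) \le 0\), and checks that the approximate dual \({\widehat{D}}(\lambda^\star)\) undercuts \(L_A(x^\star;\lambda^\star,\rho)\) for \(\rho \approx 0\) (a duality gap) but that \(x^\star\) becomes the argmin once \(\rho\) is large enough:

```python
# Hypothetical (f, g) values for the three points in the figure
# (illustrative numbers only, not taken from the article):
points = {
    "x1":    {"f": 0.5, "g":  0.4},   # infeasible: g > 0
    "x2":    {"f": 2.0, "g": -0.8},   # feasible, g < -lam/rho for moderate rho
    "xstar": {"f": 1.0, "g":  0.0},   # optimum on the constraint boundary
}

def L_A(p, lam, rho):
    """Rockafellar's augmented Lagrangian for one inequality constraint g(x) <= 0.

    Quadratic-linear in g while the multiplier update stays active
    (lam + rho*g >= 0); constant (a plateau in lam) otherwise.
    """
    if lam + rho * p["g"] >= 0:
        return p["f"] + lam * p["g"] + 0.5 * rho * p["g"] ** 2
    return p["f"] - lam ** 2 / (2 * rho)

def D_hat(lam, rho):
    """Approximate dual function: min of L_A over the finite set X."""
    return min(L_A(p, lam, rho) for p in points.values())

lam_star = 1.0
# rho ~ 0: the infeasible point x1 undercuts xstar -> duality gap
print(D_hat(lam_star, 1e-6) < L_A(points["xstar"], lam_star, 1e-6))  # True
# larger rho raises x1's y-intercept; xstar becomes the argmin
print(min(points, key=lambda k: L_A(points[k], lam_star, 5.0)))      # xstar
```

With \(\rho = 5\), the y-intercept of the infeasible point rises by \(\rho/2\, g^2(x^1)\), which is exactly the effect the caption attributes to increasing \(\rho\) in the red set of curves.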