A wealth of interesting problems in engineering, control, finance, and statistics can be formulated as optimization problems involving the eigenvalues of a matrix function. These very challenging problems cannot usually be solved via traditional techniques for nonlinear optimization. However, they have been addressed in recent years by a combination of deep, elegant mathematical analysis and ingenious algorithmic and software development. In this workshop, three leading experts will discuss applications along with the theoretical and algorithmic aspects of this fascinating topic.
Remark: This workshop was held on October 7, 2004 as part of the Computational Sciences Lecture Series (CSLS) at the University of Wisconsin-Madison.
By Prof. Stephen Boyd (Stanford University, USA)
Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)
ABSTRACT: In semidefinite programming (SDP) a linear function is minimized subject to the constraint that the eigenvalues of a symmetric matrix are nonnegative. While such problems were studied in a few papers in the 1970s, the relatively recent development of efficient interior-point algorithms for SDP has spurred research in a wide variety of application fields, including control system analysis and synthesis, combinatorial optimization, circuit design, structural optimization, finance, and statistics. In this overview talk I will cover the basic properties of SDP, survey some applications, and give a brief description of interior-point methods for their solution.
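As a concrete illustration of the problem class Boyd describes, here is a minimal SDP sketch using the CVXPY modeling library (the library choice, the data matrix C, and the trace normalization are assumptions of this writeup, not material from the talk): a linear function of a symmetric matrix variable is minimized subject to the matrix having nonnegative eigenvalues, and the solver handles it with an interior-point-style method.

```python
# Minimal SDP sketch (illustrative; assumes CVXPY with an SDP-capable solver
# installed): minimize trace(C @ X) subject to X positive semidefinite,
# i.e. all eigenvalues of the symmetric matrix X nonnegative.
import cvxpy as cp
import numpy as np

n = 3
C = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])   # symmetric cost matrix (made-up data)

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,            # eigenvalues of X nonnegative
               cp.trace(X) == 1]  # normalization so the problem is bounded
prob = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
prob.solve()

print("optimal value:", prob.value)
print("eigenvalues of optimal X:", np.linalg.eigvalsh(X.value))
```

This particular instance recovers the smallest eigenvalue of C: minimizing trace(CX) over unit-trace positive semidefinite X is a classic textbook SDP whose optimum is attained at the rank-one projector onto the corresponding eigenvector.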
By Prof. Adrian Lewis (Cornell University, USA)
Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)
ABSTRACT: The eigenvalues of a symmetric matrix are Lipschitz functions with elegant convexity properties, amenable to efficient interior-point optimization algorithms. By contrast, for example, the spectral radius of a nonsymmetric matrix is neither a convex function, nor Lipschitz. It may indicate practical behaviour much less reliably than in the symmetric case, and is more challenging for numerical optimization (see Overton's talk). Nonetheless, this function does share several significant variational-analytic properties with its symmetric counterpart. I will outline these analogies, discuss the fundamental idea of Clarke regularity, highlight its usefulness in nonsmooth chain rules, and discuss robust regularizations of functions like the spectral radius. (Including joint work with James Burke and Michael Overton.)
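The failure of Lipschitz continuity that Lewis contrasts with the symmetric case is easy to see numerically. The short sketch below (an illustration added here, not material from the talk) perturbs a 2x2 Jordan block: the spectral radius grows like the square root of the perturbation size, so the ratio of spectral-radius change to matrix change is unbounded near zero.

```python
# Numerical illustration of non-Lipschitz behaviour of the spectral radius:
# the eigenvalues of [[0, 1], [eps, 0]] are +/- sqrt(eps), so the spectral
# radius is sqrt(eps) exactly, and no Lipschitz bound can hold at eps = 0.
import numpy as np

def spectral_radius(A):
    """Maximum modulus of the eigenvalues of A."""
    return max(abs(np.linalg.eigvals(A)))

for eps in [1e-2, 1e-4, 1e-6, 1e-8]:
    A = np.array([[0.0, 1.0],
                  [eps, 0.0]])   # Jordan block with perturbed (2,1) entry
    rho = spectral_radius(A)
    print(f"eps = {eps:.0e}  spectral radius = {rho:.1e}  ratio = {rho/eps:.1e}")
```

The printed ratio blows up as eps shrinks, which is exactly the unbounded local sensitivity the abstract refers to.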
By Prof. Michael Overton (Courant Institute of Mathematical Sciences, New York University, USA)
Slides of talk [PDF] (Not yet available.) | Video [WMV] (Not yet available.)
ABSTRACT: Stability measures arising in systems and control are typically nonsmooth, nonconvex functions. The simplest examples are the abscissa and radius maps for polynomials (maximum real part, or modulus, of the roots) and the analogous matrix measures, the spectral abscissa and radius (maximum real part, or modulus, of the eigenvalues). More robust measures include the distance to instability (smallest perturbation that makes a polynomial or matrix unstable) and the $\epsilon$-pseudospectral abscissa or radius of a matrix (maximum real part or modulus of the $\epsilon$-pseudospectrum). When polynomials or matrices depend on parameters it is natural to consider optimization of such functions. We discuss an algorithm for locally optimizing such nonsmooth, nonconvex functions over parameter space and illustrate its effectiveness, computing, for example, locally optimal low-order controllers for challenging problems from the literature. We also give an overview of variational analysis of stability functions in polynomial and matrix space, expanding on some of the issues discussed in Lewis's talk. (Joint work with James V. Burke and Adrian S. Lewis.)
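To make the quantities in the abstract concrete, here is a brute-force numerical sketch added for this writeup; it is emphatically not the local optimization algorithm the talk presents. It computes the spectral abscissa of a matrix, then samples random perturbations of spectral norm at most eps to obtain a lower bound on the $\epsilon$-pseudospectral abscissa (the true value is the supremum over all such perturbations).

```python
# Sketch: spectral abscissa and a sampled lower bound on the
# eps-pseudospectral abscissa (Monte Carlo, for illustration only).
import numpy as np

rng = np.random.default_rng(0)

def spectral_abscissa(A):
    """Maximum real part of the eigenvalues of A."""
    return max(np.linalg.eigvals(A).real)

def pseudo_abscissa_lower_bound(A, eps, trials=2000):
    """Sample perturbations E with ||E||_2 <= eps. Since the true
    eps-pseudospectral abscissa is the sup over all such E (including
    E = 0), the best sampled value is a valid lower bound."""
    n = A.shape[0]
    best = spectral_abscissa(A)
    for _ in range(trials):
        E = rng.standard_normal((n, n))
        E *= eps / np.linalg.norm(E, 2)   # rescale to spectral norm eps
        best = max(best, spectral_abscissa(A + E))
    return best

A = np.array([[-0.5, 5.0],
              [0.0, -0.5]])               # stable but highly nonnormal
print("spectral abscissa:", spectral_abscissa(A))
print("sampled eps-pseudospectral abscissa (eps = 0.1):",
      pseudo_abscissa_lower_bound(A, 0.1))
```

For this nonnormal example the spectral abscissa is -0.5 (nominally stable), yet small perturbations push eigenvalues into the right half-plane, so the sampled pseudospectral abscissa is positive; this gap is why the more robust measures in the abstract matter.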