This course covers the basic models and solution techniques for problems of sequential decision making under uncertainty (stochastic control). We will consider optimal control of a dynamical system over both a finite and an infinite number of stages (finite and infinite horizon). We will also discuss some approximation methods for problems involving large state spaces. Applications of dynamic programming in a variety of fields will be covered in recitations.
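Since the course centers on finite-horizon dynamic programming, a minimal sketch of the backward-induction (Bellman) recursion may help orient readers. The small Markov decision process below, its costs, and its transition probabilities are illustrative assumptions and are not taken from the course material.

```python
# Minimal sketch: finite-horizon dynamic programming by backward induction
# on a small, hypothetical Markov decision process (illustrative data only).
import numpy as np

n_states, n_actions, horizon = 3, 2, 5

rng = np.random.default_rng(0)
# P[u][i, j]: probability of moving from state i to state j under action u.
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
# g[i, u]: expected stage cost of taking action u in state i.
g = rng.uniform(0.0, 1.0, size=(n_states, n_actions))
# Terminal cost J_N(i); zero here for simplicity.
J = np.zeros(n_states)

# Bellman recursion: J_k(i) = min_u [ g(i, u) + sum_j P_u(i, j) * J_{k+1}(j) ].
policy = np.zeros((horizon, n_states), dtype=int)
for k in reversed(range(horizon)):
    Q = g + np.stack([P[u] @ J for u in range(n_actions)], axis=1)  # Q[i, u]
    policy[k] = Q.argmin(axis=1)   # greedy action at stage k
    J = Q.min(axis=1)              # optimal cost-to-go from stage k

print("Optimal cost-to-go from each initial state:", J)
```

The same recursion underlies the infinite-horizon and approximate methods discussed in the course, with the finite sum replaced by discounted or averaged costs and with the exact cost-to-go replaced by a parametric approximation when the state space is large.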
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/c750544145210308ce3047ff4db3ff2a/
Course Home: http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-231-dynamic-programming-and-stochastic-control-fall-2008