This work discusses the theory of control processes. The extremely rapid growth of the theory, tied closely to the continuing trend toward automation, makes it imperative that courses of this nature rest on a broad basis. The work covers the fundamentals of the calculus of variations, dynamic programming, discrete control processes, use of the digital computer, and functional analysis. Introductory courses in control theory are essential for training the modern graduate student in pure and applied mathematics, engineering, mathematical physics, economics, biology, operations research, and related fields. The work also describes the dual approaches of the calculus of variations and dynamic programming in the scalar case and illustrates ways to tackle multidimensional optimization problems.
Introduction to the Mathematical Theory of Control Processes: Linear Equations and Quadratic Criteria v. 1
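The scalar dynamic-programming approach mentioned above can be sketched in a few lines. The following is an illustrative example only, not taken from the book: for a discrete-time scalar system x_{k+1} = a x_k + b u_k with quadratic cost, assuming a quadratic value function V_k(x) = p_k x² gives a backward Riccati recursion and a linear feedback law u_k = -k_k x_k. All variable names and parameter values here are hypothetical.

```python
# Illustrative sketch (not the book's code): dynamic programming for a
# scalar linear-quadratic control problem.
#   dynamics: x_{k+1} = a*x_k + b*u_k
#   cost:     sum over k of (q*x_k^2 + r*u_k^2), plus qf*x_N^2 at the end
# Assuming V_k(x) = p_k * x^2, minimizing over u at each stage yields a
# scalar Riccati recursion for p_k, run backward in time.

def lq_gains(a, b, q, r, qf, N):
    """Return feedback gains k_0..k_{N-1} so that u_k = -k_k * x_k."""
    p = qf                      # terminal value-function coefficient p_N
    gains = []
    for _ in range(N):          # backward recursion from stage N-1 to 0
        k = (p * a * b) / (r + p * b * b)
        p = q + p * a * a - (p * a * b) ** 2 / (r + p * b * b)
        gains.append(k)
    gains.reverse()             # gains[0] now applies at stage 0
    return gains

def simulate(a, b, gains, x0):
    """Roll the closed-loop system forward from the initial state x0."""
    x, traj = x0, [x0]
    for k in gains:
        u = -k * x              # optimal linear feedback at this stage
        x = a * x + b * u
        traj.append(x)
    return traj

if __name__ == "__main__":
    # Hypothetical unstable plant (a > 1) stabilized by the LQ feedback.
    gains = lq_gains(a=1.2, b=1.0, q=1.0, r=1.0, qf=1.0, N=20)
    traj = simulate(1.2, 1.0, gains, x0=5.0)
    print(traj[-1])             # state driven close to zero
```

The same backward-recursion idea carries over to the multidimensional case, where p_k becomes a matrix and the recursion is the matrix Riccati equation; the scalar version above only illustrates the structure of the argument.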