Part One deals with the fundamentals of modern stability theory: general results concerning stability and instability, sufficient conditions for the stability of linear systems, methods for determining the stability or instability of systems of various types, and theorems on stability under random disturbances.
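By way of illustration, a minimal sketch of a classical sufficient condition of the kind Part One covers (standard Lyapunov theory, stated here only to indicate the subject matter, not an excerpt from the book):

```latex
% For the linear time-invariant system
\[
  \dot{x} = A x, \qquad x \in \mathbb{R}^{n},
\]
% suppose that for some symmetric positive definite matrix $Q$ the
% Lyapunov equation
\[
  A^{\mathsf{T}} P + P A = -Q
\]
% has a symmetric positive definite solution $P$. Then
% $V(x) = x^{\mathsf{T}} P x$ decreases along trajectories, since
\[
  \dot{V}(x) = x^{\mathsf{T}} \bigl( A^{\mathsf{T}} P + P A \bigr) x
             = - x^{\mathsf{T}} Q x < 0 \qquad (x \neq 0),
\]
% so the origin is asymptotically stable; equivalently, every
% eigenvalue of $A$ satisfies $\operatorname{Re}\,\lambda_i(A) < 0$.
```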
The many topics covered in Mathematical Theory of Control Systems Design are spread over an Introduction and four parts. Each chapter concludes with a brief review of its main results and formulae, and each part ends with an exercise section. Part One treats the fundamentals of modern stability theory. Part Two is devoted to the optimal control of deterministic systems. Part Three is concerned with the control of systems under random disturbances of their parameters, and Part Four outlines modern numerical methods of control theory. The many examples illustrate the main assertions and teach readers the skills needed to construct models of relevant phenomena, to design nonlinear control systems, to explain the qualitative differences between various classes of control systems, and to apply what they have learned to the investigation of particular systems. Audience: This book will be valuable to both graduate and postgraduate students in such disciplines as applied mathematics, mechanics, engineering, automation and cybernetics.
I. Continuous and Discrete Deterministic Systems.
II. Stability of Stochastic Systems.
III. Description of Control Problems.
IV. The Classical Calculus of Variations and Optimal Control.
V. The Maximum Principle.
VI. Linear Control Systems.
VII. Dynamic Programming Approach. Sufficient Conditions for Optimal Control.
VIII. Some Additional Topics of Optimal Control Theory.
IX. Control of Stochastic Systems. Problem Statements and Investigation Techniques.
X. Optimal Control on a Time Interval of Random Duration.
XI. Optimal Estimation of the State of the System.
XII. Optimal Control of the Observation Process.
XIII. Linear Time-Invariant Control Systems.
XIV. Numerical Methods for the Investigation of Nonlinear Control Systems.
XV. Numerical Design of Optimal Control Systems.
General References.
"Give, and it shall be given unto you." St. Luke, VI, 38.
The book is based on several courses of lectures on control theory and applications.