The application of dynamic programming to the description and optimal control of dynamic processes

Abstract

The demand for improved controllers creates the need for synthesis techniques which emphasize optimality. Optimality automatically implies stability. This thesis presents one such technique which is not limited to linear systems. The theory of dynamic programming is applied to the synthesis of optimal controllers, with emphasis on missile flight control systems. It is shown how the theory copes with the time dependence, nonlinearities, and random aspects which prevail in the missile field. The first chapter introduces the basic concepts and terminology of dynamic programming through its application to deterministic and stochastic versions of a first-order control process. The second chapter presents a general formulation of control processes. The third chapter discusses the description and optimization of trajectories. Under suitable assumptions concerning time dependence, the problems can be explicitly formulated in two dimensions, which permits their solution on existing digital computers. The final chapter treats homing guidance; a simple method of implementing control so that a missile seeks a low-drag ballistic trajectory terminating at intercept is deduced. These examples demonstrate the usefulness of dynamic programming as a tool for synthesizing and analyzing advanced flight control systems. With this theory, the synthesis of sophisticated controllers can be attacked with mathematical rigor rather than through trial and error.
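
Since the abstract only names the technique, a minimal sketch may help fix ideas: the following illustrates the backward Bellman recursion of dynamic programming for a deterministic first-order (scalar) control process, the kind of introductory example the first chapter describes. The dynamics x' = a*x + u, the quadratic costs, the grids, and the horizon N here are hypothetical choices for illustration, not the process actually treated in the thesis.

```python
import numpy as np

# A minimal sketch of the backward dynamic-programming (Bellman) recursion
# for a scalar, discrete-time control process. The dynamics x' = a*x + u,
# the quadratic costs, the grids, and the horizon are illustrative
# assumptions, not the thesis's missile model.

a = 0.9                            # assumed plant coefficient
N = 20                             # assumed horizon length
xs = np.linspace(-2.0, 2.0, 81)    # discretized state grid
us = np.linspace(-1.0, 1.0, 41)    # discretized control grid

V = xs ** 2                        # assumed terminal cost V_N(x) = x^2
policy = np.zeros((N, xs.size))

for k in range(N - 1, -1, -1):
    # Try every control at every grid state; interpolate the cost-to-go
    # at the successor state (np.interp clamps beyond the grid ends).
    x_next = a * xs[:, None] + us[None, :]            # shape (nx, nu)
    stage = xs[:, None] ** 2 + us[None, :] ** 2       # quadratic stage cost
    total = stage + np.interp(x_next.ravel(), xs, V).reshape(x_next.shape)
    best = total.argmin(axis=1)                       # minimizing control index
    policy[k] = us[best]                              # feedback law u_k(x) on the grid
    V = total[np.arange(xs.size), best]               # cost-to-go V_k(x)

# Closed-loop simulation under the computed feedback law.
x = 1.5
for k in range(N):
    x = a * x + np.interp(x, xs, policy[k])
```

The stored array policy[k] gives the control as a function of state at each stage, which is the feedback form of controller synthesis the abstract describes; the stochastic version replaces the interpolated cost-to-go with its expectation over the disturbance.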
