

3.3.1 Control system

In the notation announced at the end of the previous section, control systems that we want to study take the form

$\displaystyle \dot x=f(t,x,u),\qquad x(t_0)=x_0$ (3.18)

which is an exact copy of (1.1). Here $ x\in\mathbb{R}^n$ is the state, $ u\in U\subset \mathbb{R}^m$ is the control, $ t\in\mathbb{R}$ is the time, $ t_0$ is the initial time, and $ x_0$ is the initial state. Both $ x$ and $ u$ are functions of time: $ x=x(t)$ , $ u=u(t)$ . The control set $ U$ is usually a closed subset of $ \mathbb{R}^m$ and can be the entire $ \mathbb{R}^m$ ; in principle it can also vary with time, but here we take it to be fixed.
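For concreteness, here is a simple example fitting this framework (an illustration of ours, not part of the original text): take $n=m=1$ and consider the scalar integrator

$\displaystyle \dot x=u,\qquad x(t_0)=x_0,\qquad U=[-1,1].$

Here the control directly sets the velocity of the state, and the control set is a closed, bounded subset of $\mathbb{R}$; taking $U=\mathbb{R}$ instead would give an example with unconstrained controls.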

We want to ensure that for every choice of the initial data $ (t_0,x_0)$ and every admissible control $ u(\cdot)$ , the system (3.18) has a unique solution $ x(\cdot)$ on some time interval $ [t_0,t_1]$ . If this property holds, we will say that the system is well posed. To guarantee local existence and uniqueness of solutions for (3.18), we need to impose some regularity conditions on the right-hand side $ f$ and on the admissible controls $ u$ . For the sake of simplicity, we will usually be willing to make slightly stronger assumptions than necessary; when we do this, we will briefly indicate how our assumptions can be relaxed.

Let us first consider the case of no controls:

$\displaystyle \dot x=f(t,x).$ (3.19)

First, to assure sufficient regularity of $ f$ with respect to $ t$ , we take $ f(\cdot,x)$ to be piecewise continuous for each fixed $ x$ . Here, by a piecewise continuous function we mean a function having at most a finite number of discontinuities on every bounded interval, and possessing the limits from the right and from the left at each of these discontinuities. For convenience, we assume that the value of such a function at each discontinuity is equal to one of these one-sided limits (i.e., the function is either left-continuous or right-continuous at each point). The assumption of a finite number of discontinuities on each bounded interval is actually not crucial; we can allow discontinuities to have accumulation points, as long as the function remains locally bounded (or at least locally integrable).
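As a concrete illustration (our example): the sign function, defined to equal $1$ at $0$, is piecewise continuous, with a single discontinuity at which it is right-continuous. On the other hand, the function

$\displaystyle g(t)=\begin{cases}\operatorname{sgn}(\sin(1/t)),&t\ne 0,\\ 0,&t=0\end{cases}$

has discontinuities accumulating at $t=0$, hence is not piecewise continuous in the above sense; but since $\vert g\vert\le 1$ it is locally bounded, and so is covered by the relaxation just mentioned.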

Second, we need to specify how regular $ f$ should be with respect to $ x$ . A standard assumption in this regard is that $ f$ is locally Lipschitz in $ x$ , uniformly over $ t$ . Namely, for every $ (t_0,x_0)$ there should exist a constant $ L$ such that we have

$\displaystyle \vert f(t,x_1)-f(t,x_2)\vert\le L\vert x_1-x_2\vert$

for all $ (t,x_1)$ and $ (t,x_2)$ in some neighborhood of $ (t_0,x_0)$ in $ \mathbb{R}\times\mathbb{R}^n$ . We can in fact be more generous and assume the following: $ f(t,\cdot)$ is $ \mathcal C^1$ for each fixed $ t$ , and $ {f}_{x}(\cdot,x)$ is piecewise continuous for each fixed $ x$ . It is easy to verify using the Mean Value Theorem that such a function $ f$ satisfies the previous Lipschitz condition. Note that here and below, we extend our earlier notation $ {f}_{x}:={\partial f}/{\partial x}$ to the vector case, so that $ {f}_{x}$ stands for the Jacobian matrix of partial derivatives of $ f$ with respect to $ x$ .
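To spell out the Mean Value Theorem argument (a standard computation, sketched here under the assumption that $f_x$ is bounded on a closed ball $B$ around $(t_0,x_0)$, which the stated regularity guarantees locally): for fixed $t$, and $x_1,x_2$ such that the segment joining them lies in $B$,

$\displaystyle f(t,x_1)-f(t,x_2)=\int_0^1 f_x\big(t,x_2+s(x_1-x_2)\big)(x_1-x_2)\,ds,$

and taking norms gives the Lipschitz bound with $L:=\sup_{(t,x)\in B}\Vert f_x(t,x)\Vert$.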

If $ f$ satisfies the above regularity assumptions, then on some interval $ [t_0,t_1]$ there exists a unique solution $ x(\cdot)$ of the system (3.19). Since we did not assume that $ f$ is continuous with respect to $ t$ , some care is needed in interpreting what we mean by a solution of (3.19). In the present situation, it is reasonable to call a function $ x(\cdot)$ a solution of (3.19) if it is continuous everywhere, $ \mathcal C^1$ almost everywhere, and satisfies the corresponding integral equation

$\displaystyle x(t)=x_0+\int_{t_0}^t f(s,x(s))ds.$

A function $ x(\cdot)$ that can be represented as an integral of another function $ g(\cdot)$ , and thus automatically satisfies $ \dot x=g$ almost everywhere, is called absolutely continuous. This class of functions generalizes the piecewise $ \mathcal C^1$ functions that we considered earlier. Basically, the extra generality here is that the derivative can be discontinuous on a set of points that has measure zero (e.g., a countable set) rather than at a finite number of points on a bounded interval, and can approach infinity near these points. If we insist that the derivative be locally bounded, we arrive at the slightly smaller class of locally Lipschitz functions.
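A simple example of the distinction (ours): the function $x(t)=\sqrt{t}$ is absolutely continuous on $[0,1]$, since

$\displaystyle x(t)=\int_0^t\frac{ds}{2\sqrt{s}}$

and the integrand, while unbounded near $s=0$, is integrable there; but $x$ is not Lipschitz near $0$, precisely because $\dot x$ blows up at that point.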

We are now ready to go back to the control system (3.18). To guarantee local existence and uniqueness of its solutions, we can impose assumptions on $ f$ and $ u$ that would let us invoke the previous existence and uniqueness result for the right-hand side

$\displaystyle \bar f(t,x):=f(t,x,u(t)).$ (3.20)

Here is one such set of assumptions which, although not the weakest possible, is adequate for our purposes: $ f$ is continuous in $ t$ and $ u$ and $ \mathcal C^1$ in $ x$ ; $ f_x$ is continuous in $ t$ and $ u$ ; and $ u(\cdot)$ is piecewise continuous as a function of $ t$ . Another, weaker set of hypotheses is obtained by replacing the assumptions of existence of $ f_x$ and its continuity with respect to all variables with the following Lipschitz property: for every bounded subset $ D$ of $ \mathbb{R}\times\mathbb{R}^n\times U$ , there exists an $ L$ such that we have

$\displaystyle \vert f(t,x_1,u)-f(t,x_2,u)\vert\le L\vert x_1-x_2\vert$

for all $ (t,x_1,u),(t,x_2,u)\in D$ . Note that in either case, differentiability of $ f$ with respect to $ u$ is not assumed.
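A quick example showing why this matters (our illustration): the scalar system $\dot x=-x+\vert u\vert$ has a right-hand side that is continuous in $u$ but not differentiable in $u$ at $u=0$; it is $\mathcal C^1$ in $x$ with $f_x\equiv -1$, and it satisfies $\vert f(x_1,u)-f(x_2,u)\vert=\vert x_1-x_2\vert$ for all $u$, so it meets both sets of hypotheses.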


Exercise   Verify that each of the two sets of hypotheses just described guarantees that the previous existence and uniqueness result applies to the right-hand side (3.20). Explain which hypotheses can be further relaxed.

When the right-hand side does not explicitly depend on time, i.e., when we have $ f=f(x,u)$ , a convenient way to guarantee that the above conditions hold is to assume that $ f$ is locally Lipschitz (as a function from $ \mathbb{R}^n\times U$ to $ \mathbb{R}^n$ ). In general, $ f$ depends on $ t$ in two ways: directly through the $ t$ argument, and indirectly through $ u$ . Regarding the first dependence, we are willing to be generous by assuming that $ f$ is at least continuous in $ t$ . In fact, we can always eliminate the direct dependence of $ f$ on $ t$ by introducing the extra state variable $ x_{n+1}:=t$ , with the dynamics $ \dot x_{n+1}=1$ . Note that in order for the new system obtained in this way to satisfy our conditions for existence and uniqueness of solutions, continuity of $ f$ in $ t$ is not enough and we need $ f$ to be $ \mathcal C^1$ in $ t$ (or satisfy an appropriate Lipschitz condition).
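As a worked instance of this state augmentation (our example), consider the scalar system $\dot x=tx+u$. Setting $x_2:=t$ (and renaming the original state $x_1$) gives the time-invariant system

$\displaystyle \dot x_1=x_2x_1+u,\qquad\dot x_2=1,$

whose right-hand side is locally Lipschitz in $(x_1,x_2)$ uniformly over $u$ in bounded sets; here the original $f$ is $\mathcal C^1$ in $t$, as required for the augmented system to inherit the existence and uniqueness conditions.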

On the other hand, as far as regularity of $ u$ with respect to $ t$ is concerned, it would be too restrictive to assume anything stronger than piecewise continuity. In fact, occasionally we may even want to relax this assumption and allow $ u$ to be a locally bounded function with countably many discontinuities. More precisely, the class of admissible controls can be defined to consist of functions $ u$ that are measurable and locally bounded. In view of the remarks made earlier about the system (3.19), local existence and uniqueness of solutions is still guaranteed for this larger class of controls. We will rarely need this level of generality, and piecewise continuous controls will be adequate for most of our purposes (with the exception of some of the material to be discussed in Sections 4.4 and 4.5--specifically, Fuller's problem and Filippov's theorem). Similarly to the case of the system (3.19), by a solution of (3.18) we mean an absolutely continuous function $ x(\cdot)$ satisfying

$\displaystyle x(t)=x_0+\int_{t_0}^t f(s,x(s),u(s))ds.$
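To see what the larger class of measurable, locally bounded controls adds (an example of ours, in the spirit of Fuller's problem mentioned above), let $t_k:=1-2^{-k}$ for $k=0,1,2,\dots$ and define

$\displaystyle u(t):=(-1)^k\ \text{ for }t\in[t_k,t_{k+1}),\qquad u(t):=0\ \text{ for }t\ge 1.$

This control is measurable and bounded by $1$, but its discontinuities accumulate at $t=1$, so it is not piecewise continuous; the corresponding solution $x(\cdot)$ is nonetheless a well-defined absolutely continuous function in the sense just described.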

In what follows, we will always assume that the property of local existence and uniqueness of solutions holds for a given control system. This of course does not guarantee that solutions exist globally in time. Typically, we will consider a candidate optimal trajectory defined on some time interval $ [t_0,t_1]$ , and then existence over the same time interval will be automatically ensured for nearby trajectories.

