A Dynamic Game Framework for Verification and Control of Stochastic Hybrid Systems
Jerry Ding and Maryam Kamgarpour and Sean Summers and Alessandro Abate and John Lygeros and Claire Tomlin
EECS Department, University of California, Berkeley
Technical Report No. UCB/EECS-2011-101
September 7, 2011
http://www2.eecs.berkeley.edu/Pubs/TechRpts/2011/EECS-2011-101.pdf
This report develops a framework for analyzing probabilistic reachability and safety problems for discrete-time hybrid systems in a stochastic game setting. In particular, we formulate these problems as zero-sum stochastic games between the control, whose objective is to reach a desired target set or remain within a given safe set, and a rational adversary, whose objective is opposed to that of the control. We show that the maximal probability of achieving the reachability or safety objective under worst-case adversary behavior can be computed through a suitable dynamic programming algorithm. Furthermore, there always exists an optimal control policy that achieves this worst-case probability regardless of the adversary's choice of disturbance strategy, and sufficient conditions for optimality of the policy can be derived in terms of the dynamic programming recursion. We provide several application examples from the domains of air traffic management and robust motion planning to demonstrate our modeling framework and solution approach.
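The safety case of such a max-min dynamic programming recursion admits a compact illustration. Below is a minimal sketch, assuming a finite-state abstraction of the hybrid system with transition probabilities stored as a tensor indexed by control and disturbance inputs; the function and variable names (max_min_safety_probability, P, safe) are hypothetical and not taken from the report. It computes the backward recursion V_k(x) = 1_K(x) * max_u min_d E[V_{k+1}(x')], whose value at k = 0 is the worst-case probability of remaining in the safe set K over the horizon.

```python
import numpy as np

def max_min_safety_probability(P, safe, N):
    # Hypothetical names, for illustration only.
    # P[u, d, x, y]: probability of moving from state x to state y
    #                under control input u and disturbance input d
    # safe[x]:       1.0 if state x lies in the safe set K, else 0.0
    # N:             finite time horizon
    n_u, n_d, n_x, _ = P.shape
    V = safe.astype(float)                 # terminal value V_N(x) = 1_K(x)
    policy = np.zeros((N, n_x), dtype=int)
    for k in reversed(range(N)):
        # Q[u, d, x] = E[ V_{k+1}(x') | x, u, d ]
        Q = np.einsum('udxy,y->udx', P, V)
        worst = Q.min(axis=1)              # adversary minimizes over d
        policy[k] = worst.argmax(axis=0)   # control maximizes over u
        V = safe * worst.max(axis=0)       # V_k(x) = 1_K(x) * max_u min_d Q
    return V, policy

# Tiny usage example with made-up numbers: 2 states, 2 control inputs,
# 2 disturbance inputs, horizon 3; only state 0 is in the safe set.
P = np.full((2, 2, 2, 2), 0.5)
safe = np.array([1.0, 0.0])
V0, policy = max_min_safety_probability(P, safe, 3)
```

The reach-avoid variant in the report follows the same pattern with a different terminal condition and a target-set indicator added inside the recursion; this sketch covers only the pure safety objective.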
BibTeX citation:
@techreport{Ding:EECS-2011-101,
    Author = {Ding, Jerry and Kamgarpour, Maryam and Summers, Sean and Abate, Alessandro and Lygeros, John and Tomlin, Claire},
    Title = {A Dynamic Game Framework for Verification and Control of Stochastic Hybrid Systems},
    Year = {2011},
    Month = {Sep},
    Url = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/2011/EECS-2011-101.html},
    Number = {UCB/EECS-2011-101},
    Abstract = {This report develops a framework for analyzing probabilistic reachability and safety problems for discrete time hybrid systems in a stochastic game setting. In particular, we formulate these problems as zero-sum stochastic games between the control, whose objective is to reach a desired target set or remain within a given safe set, and a rational adversary, whose objective is opposed to that of the control. It will be shown that the maximal probability of achieving the reachability and safety objectives subject to the worst-case adversary behavior can be computed through a suitable dynamic programming algorithm. Furthermore, there always exists an optimal control policy which achieves this worst-case probability, regardless of the choice of disturbance strategy, and sufficient conditions for optimality of the policy can be derived in terms of the dynamic programming recursion. We provide several application examples from the domains of air traffic management and robust motion planning to demonstrate our modeling framework and solution approach.},
}
EndNote citation:
%0 Report
%A Ding, Jerry
%A Kamgarpour, Maryam
%A Summers, Sean
%A Abate, Alessandro
%A Lygeros, John
%A Tomlin, Claire
%T A Dynamic Game Framework for Verification and Control of Stochastic Hybrid Systems
%I EECS Department, University of California, Berkeley
%D 2011
%8 September 7
%@ UCB/EECS-2011-101
%U http://www2.eecs.berkeley.edu/Pubs/TechRpts/2011/EECS-2011-101.html
%F Ding:EECS-2011-101