
Reasoning about Infinite Computations

Moshe Y. Vardi† (IBM Almaden Research Center) and Pierre Wolper‡ (Université de Liège)

Abstract

We investigate extensions of temporal logic by connectives defined by finite automata on infinite words. We consider three different logics, corresponding to three different types of acceptance conditions (finite, looping and repeating) for the automata. It turns out, however, that these logics all have the same expressive power and that their decision problems are all PSPACE-complete. We also investigate connectives defined by alternating automata and show that they do not increase the expressive power of the logic or the complexity of the decision problem.

1 Introduction

For many years, logics of programs have been tools for reasoning about the input/output behavior of programs. When dealing with concurrent or nonterminating processes (like operating systems) there is, however, a need to reason about infinite computations. Thus, instead of considering the first and last states of finite computations, we need to consider the infinite sequences of states that the program goes through. Logics to reason about such sequences include temporal logic [Pn77] and temporal-logic-based process logics [Ni80, HKP80]. In the propositional case, computations can be viewed as infinite sequences of propositional truth assignments. For reasoning about individual propositional truth assignments, propositional logic is a descriptively complete language, i.e., it can specify any set of propositional truth assignments. There is, however, no a priori robust notion of descriptive completeness for reasoning about sequences of propositional truth assignments.

A preliminary version of this paper, authored by P. Wolper, M.Y. Vardi, and A.P. Sistla, appeared in Proc. 24th IEEE Symp. on Foundations of Computer Science, 1983, pp. 185–194, under the title "Reasoning about Infinite Computation Paths".
† Address: IBM Almaden Research K53-802, 650 Harry Rd., San Jose, CA 95120-6099, USA, vardi@almaden.ibm.com
‡ Address: Institut Montefiore, B28, Université de Liège, B-4000 Liège Sart-Tilman, Belgium, pw@montefiore.ulg.ac.be


In [GPSS80] propositional temporal logic (PTL) was shown to be expressively equivalent to the monadic first-order theory of (N, <), the natural numbers with the less-than relation. This was taken as an indication that PTL and the process logics based on it are "descriptively complete". The above assumes that first-order notions are all we need to reason about infinite computations. But, the very fundamental notion of regularity of sets of event sequences is not first-order; even the simple assertion that "the proposition p holds at least in every other state on a path" is not expressible in PTL [Wo83]. In fact, PTL and the first-order theory of (N, <) are known to be expressively equivalent to star-free ω-regular languages [Lad77, Tho79, Tho81]. On the other hand, ω-regular sequences are a natural way of describing concurrent processes [Sh79, Mi80]; and furthermore, the ability to describe ω-regular sequences is crucial to the task of program verification [LPZ85].

There are several different ways to extend the expressive power of PTL. We could add some non-first-order construct, such as least-fixpoint or second-order quantification, but we prefer here to add an explicit mechanism for specifying ω-regular events as was done in [Wo83]. There, PTL is extended with a temporal connective corresponding to every nondeterministic finite automaton on infinite words.1,2 For example, if Σ = {a, b} and A is the automaton accepting all infinite words over Σ having a in every other position, then A is also a binary temporal connective, and the formula A(p, true) is satisfied by the paths where p holds in at least every other state. An important point that was not considered in [Wo83] is that to define ω-regular sequences one needs to impose certain repeating conditions on accepting automata runs. For example, Büchi automata, which define exactly the ω-regular languages, require that some accepting state occurs infinitely often in the run [Bu62, McN66].
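The "a in every other position" connective above can be made concrete with a small simulation. The following Python sketch is ours, not the paper's (the state names and the dictionary encoding of the transition function are illustrative assumptions): it checks whether the looping automaton has a surviving run over a finite prefix of a computation, where the letter a stands for "p holds here" and b for "true".

```python
# A sketch of the "a in every other position" automaton connective:
# we nondeterministically simulate the NFA on a finite prefix.
def step(states, letter, delta):
    """One nondeterministic step: the set of all successor states."""
    return {t for s in states for t in delta.get((s, letter), ())}

# From s0 the automaton insists on letter 'a'; from s1 anything is allowed.
DELTA = {("s0", "a"): {"s1"},
         ("s1", "a"): {"s0"},
         ("s1", "b"): {"s0"}}

def has_run(prefix, delta=DELTA, initial=("s0",)):
    """Looping acceptance restricted to a finite prefix:
    True iff some run of the automaton survives the whole prefix."""
    states = set(initial)
    for letter in prefix:
        states = step(states, letter, delta)
        if not states:          # every run died: prefix rejected
            return False
    return True
```

For instance, a prefix with a at every even position keeps a run alive, while a prefix lacking a at some even position kills all runs.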
Using Büchi automata to define temporal connectives gives rise to an extended temporal logic that we call ETLr. In addition to the notion of repeating acceptance, we also consider two other notions of acceptance. The first notion is finite acceptance: an automaton accepts an infinite word if it accepts some prefix of the word by the standard notion of acceptance for finite words. The second notion is looping acceptance: an automaton accepts an infinite word if it has some infinite run over the word (this is the notion used in [Wo83]). Using automata with finite and looping acceptance conditions to define temporal connectives gives rise to extended temporal logics that we call ETLf and ETLl, respectively. We should note that our interest in finite and looping acceptance is not due to their automata-theoretic

1 To be precise, the formalism used in [Wo83] is that of right-linear context-free grammars over infinite words.
2 Note that here the use of automata is conceptually different from the use of automata in dynamic logic (cf. [Ha84, HS84, Pr81]). In dynamic logic automata are used to describe flowchart programs, while here automata are used to describe temporal sequences. Thus, dynamic logic automata describe regular sequences of program statements, while temporal logic automata describe regular properties of state sequences.



significance,3 but due to their significance as specification constructs. Finite acceptance can be viewed as describing a liveness property, i.e., "something good will eventually happen", while looping acceptance can be viewed as a safety condition, i.e., "something bad will never happen".4 Thus, finite and looping acceptance can be viewed as extensions of the "Eventually" and "Always" constructs of PTL. The notions of finite and looping acceptance are incomparable and are strictly weaker than the notion of repeating acceptance. For example, the language (a*b)^ω can be defined by repeating acceptance but not by finite or looping acceptance. Thus, we could expect the logics ETLr, ETLf, and ETLl to have different expressive powers. One of the main results of this paper is that all these logics are expressively equivalent. They all have the expressive power of ω-regular expressions, which by [Bu62] is the same as that of the monadic second-order theory of (N, <), usually denoted S1S. We also consider the complexity of the decision problem for ETLf and ETLl. It turns out that both logics have the same complexity as PTL: polynomial space (cf. [HR83, SC85]). In contrast, the decision problem for S1S is nonelementary [Me75], as is the emptiness problem for regular expressions with complement [MS73].5

An important contribution of this paper is to show that temporal logic formulas can be directly compiled into equivalent Büchi automata. To make the construction and its proof of correctness easier and more systematic, we first define variants of Büchi automata that are geared towards recognizing models of temporal formulas and prove that these variants can be converted to Büchi automata. We then use our construction to obtain decision procedures.
Our approach is, starting with a formula, to build an equivalent Büchi automaton and then to check that this automaton is nonempty.6 Note that the construction of Büchi automata from temporal logic formulas is not only useful for obtaining decision procedures for the logic but is also the cornerstone of synthesis and verification methods based on temporal logic [MW84, PR89, VW86b, Wo89]. This construction is also the cornerstone for applications of temporal logic to the verification of probabilistic and real-time programs (cf. [AH90, Va85b]). Finally, to explore the full power of our technique, we introduce alternating finite automata connectives. These can be exponentially more succinct than nondeterministic automata connectives.

3 For an extensive study of acceptance conditions for ω-automata see [Ch74, Ka85, Lan69, MY88, Sta87, Tho90, Wa79]. Our three conditions correspond to the first three conditions in [Lan69]. It is known that these three conditions exhaust all the possibilities in the Landweber classification (cf. [Wa79]).
4 The notions of safety and liveness are due to Lamport [Lam77]. Our notion of liveness here corresponds to the notion of guarantee in [MP89].
5 In [SVW87] it is shown that the decision problem for ETLr is also PSPACE-complete. This result requires significantly more complicated automata-theoretic techniques that are beyond the scope of this paper. For other comments on the complexity of ETLr see Section 3.
6 The automata-theoretic approach described here can be viewed as a specialization of the automata-theoretic approach to decision problems of dynamic logic described in [VW86a] (see also [ES84, St82]). Note, however, that while the tree automata constructed in [ES84, St82, VW86a] accept only some models of the given formulas, the automata constructed here accept all models of the given formulas.


One may expect the introduction of alternating automata connectives to push the complexity of the logic up. Surprisingly, we show that the decision problems for these logics are still PSPACE-complete. We investigate both ATLf and ATLl, which are the alternating analogues of ETLf and ETLl. The framework developed here is a specialization of the tree-automata framework developed in [VW86a]. In view of the wide range of applications of temporal logic (cf. [BBP89]), and in order to make the paper self-contained, we describe the word-automata framework in detail.

2 Finite Automata on Infinite Words

In this section we define several classes of finite automata on infinite words and examine their nonemptiness problem.

2.1 Definitions

We start by making our notation for infinite words precise. We denote by ω the set of natural numbers {0, 1, ...}. An infinite word over an alphabet Σ is a function w : ω → Σ. The ith letter in an infinite word w is thus w(i); we will often denote it by wi and write w = w0 w1 .... We use the notation [i, j] for the interval {i, i + 1, ..., j} of ω.

A nondeterministic finite automaton (abbr. NFA) on infinite words is a tuple A = (Σ, S, S0, ρ, F) where:
- Σ is a finite alphabet of letters,
- S is a finite set of states,
- S0 ⊆ S is a set of initial states,
- ρ : S × Σ → 2^S is the transition function, mapping each state and letter to a set of possible successor states, and
- F ⊆ S is a set of accepting states.

The automaton is deterministic if |S0| = 1 and |ρ(s, a)| ≤ 1 for all s ∈ S and a ∈ Σ.

A run of A over an infinite word w = w0 w1 ... is an infinite sequence σ = s0 s1 ..., where s0 ∈ S0 and s_{i+1} ∈ ρ(si, wi) for all i ≥ 0. A finite run of A over an infinite word w = w0 w1 ... is a finite sequence σ = s0 s1 ... s_{n−1}, where s0 ∈ S0 and s_{i+1} ∈ ρ(si, wi) for all 0 ≤ i < n − 1.

Depending on the acceptance condition we impose, we get different types of automata:
1. In finite acceptance automata, a finite run σ = s0 s1 ... s_{n−1} is accepting if s_{n−1} ∈ F.
2. In looping acceptance automata, no condition is imposed by the set F; any run σ = s0 s1 ... is accepting.
3. In repeating acceptance automata (Büchi automata [Bu62]), a run σ = s0 s1 ... is accepting if there is some accepting state that repeats infinitely often, i.e., for some s ∈ F there are infinitely many i's such that si = s.

2. The monadic second-order theory of successor is the interpreted formalism which has individual variables ranging over the natural numbers. We are interested in the complexity of this problem as a function in the size 5 . and ! denotes countable repetition.2 The Nonemptiness Problem for Buchi Automata An important problem we will have to solve for Buchi automata is the nonemptiness problem. but not in Lfinite or in Lloop . Tho90.2: For the rest of this section. Ch74. the constant 0. 2. an in nite word w is accepted if there is some accepting run over w. 1. to determine whether it accepts at least one word. given a Buchi automaton. The set of in nite words accepted by an automaton A is denoted L(A). boolean connectives and quanti cation over both types of variables. which is. Ka85. Lloop and Lfinite . 3. intersection and complementation and is equivalent to the class of languages describable in the monadic second-order theory of one successor (S 1S ) and by !-regular expressions. Lan69. Lfinite and Lloop are incomparable.1: 1. bg is in Lloop but not in Lfinite. the successor function. The languages accepted by Buchi (repeating acceptance) automata are often called the !-regular languages. !-regular expressions are expressions of the form i i( i)! where the union is nite. It turns out that the class of languages accepted by repeating acceptance automata (Lrepeat) strictly contains the class of languages accepted by looping (Lloop) and nite acceptance (Lfinite) automata..= s0s1 : : : is accepting if there is some accepting state that repeats in nitely often. These last two classes are incomparable. The three di erent types of automata recognize di erent classes of languages. The language a?b(a b)! over the alphabet = fa. i. In all three types of automata. To establish the relations between Lrepeat. Lemma 2. bg is in Lrepeat . By results of Buchi Bu62] (see also McNaughton McN66]). bg is in Lfinite but not in Lloop . Lrepeat strictly contains Lfinite and Lloop . 
The language (a?b)! over the alphabet = fa. for some s 2 F there are in nitely many i's such that si = s. Corollary 2. The language a! over the alphabet = fa.e. we will only deal with Buchi automata. 2. Sta87. and are classical regular expressions. we state the following well-known facts (cf. Wa79]). this class of languages is closed under union. monadic predicate variables ranging over arbitrary sets of natural numbers. MY88.
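The three acceptance conditions can be tested mechanically on ultimately periodic words u·v^ω, which suffice to separate the languages of Lemma 2.1. The following Python sketch is ours (the dictionary encoding of transitions is an illustrative assumption): it folds positions into |u| + |v| classes so that the run graph over pairs (state, position class) is finite, and then decides finite, looping, and repeating acceptance by reachability and cycle search.

```python
def acceptance(u, v, S, S0, F, delta):
    """Check the ultimately periodic word u . v^omega (v nonempty) against an
    NFA; returns (finite, looping, repeating) acceptance verdicts."""
    n = len(u) + len(v)
    def letter(c):                       # letter of w at position class c
        return u[c] if c < len(u) else v[c - len(u)]
    def succ_class(c):                   # next position class (wraps into v)
        return c + 1 if c + 1 < n else len(u)
    edges = {(q, c): {(t, succ_class(c)) for t in delta.get((q, letter(c)), ())}
             for q in S for c in range(n)}
    # forward reachability from the initial nodes
    reach, todo = set(), [(q, 0) for q in S0]
    while todo:
        node = todo.pop()
        if node not in reach:
            reach.add(node)
            todo.extend(edges[node])
    def reaches(src, dst):               # dst reachable from src in >= 1 step?
        seen, todo = set(), list(edges[src])
        while todo:
            node = todo.pop()
            if node == dst:
                return True
            if node not in seen:
                seen.add(node)
                todo.extend(edges[node])
        return False
    finite = any(q in F for (q, _) in reach)            # some finite run ends in F
    looping = any(reaches(n_, n_) for n_ in reach)      # some infinite run exists
    repeating = any(n_[0] in F and reaches(n_, n_) for n_ in reach)
    return finite, looping, repeating
```

On the word a^ω, an automaton for "eventually b" has an infinite run but no accepting prefix, while on (ab)^ω an automaton that visits an accepting state on every b satisfies all three conditions.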

To solve the satisfiability problem for extended temporal logic, we will proceed as follows: build a Büchi automaton accepting the models of the formula and determine if that automaton is nonempty. The automata we will build are exponential in the size of the formula. However, as the fact that the nonemptiness problem is in NLOGSPACE indicates, it is possible to solve the satisfiability problem using only polynomial space. The argument is that it is not necessary to first build the whole automaton before applying the algorithm for nonemptiness. (A similar argument, though in a somewhat different framework, is used in [HR83, SC85, Wo83].)

Let A = (Σ, S, S0, ρ, F) be an automaton. We say that a state t is reachable from a state s if there are a finite word w = w1 ... wk in Σ* and a finite sequence s0, ..., sk of states in S such that s0 = s, sk = t, and s_{i+1} ∈ ρ(si, w_{i+1}) for 0 ≤ i ≤ k − 1.

Lemma 2.3: [TB73] A Büchi automaton accepts some word iff there is an accepting state of the automaton that is reachable from some initial state and is reachable from itself.

We can now prove the following:

Theorem 2.4: The nonemptiness problem for Büchi automata is logspace-complete for NLOGSPACE.

Proof: We first prove that the problem is in NLOGSPACE. Given Lemma 2.3, to determine if a Büchi automaton accepts some word, we only need to check if there is some accepting state reachable from some initial state and reachable from itself. To do this, we nondeterministically guess an initial state s and an accepting state r; we then nondeterministically attempt to construct a path from s to r and from r to itself. To construct a path from any state x to any other state y, we proceed as follows:
1. Make x the current state.
2. Choose a transition from the current state and replace the current state by the target of this transition.
3. If the current state is y, stop. Otherwise repeat from step (2).
At each point only three states are remembered; thus the algorithm requires only logarithmic space. To show NLOGSPACE hardness, it is straightforward to construct a reduction from the graph accessibility problem, proved to be NLOGSPACE-complete in [Jo75].

We now make this argument more precise. Let Σ be some fixed finite alphabet. A problem P is a subset of Σ*. An instance is an element x of Σ* for which we want to determine membership in P. Assume that we also use Σ to encode Büchi automata and their constituent elements (alphabet, states, ...).
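Lemma 2.3 also yields a simple deterministic algorithm (polynomial time rather than nondeterministic logspace). The following Python sketch is ours: since the alphabet is irrelevant to nonemptiness, the automaton is represented only by its initial states, accepting states, and a successor relation.

```python
def nonempty(S0, F, delta):
    """Lemma 2.3 as an algorithm: a Buchi automaton accepts some word iff some
    accepting state is reachable from an initial state and from itself.
    S0, F are sets of states; delta maps a state to its set of successors."""
    def reachable(sources):
        seen, todo = set(), list(sources)
        while todo:
            s = todo.pop()
            if s not in seen:
                seen.add(s)
                todo.extend(delta.get(s, ()))
        return seen
    # f is on a cycle iff f is reachable from its own successors
    return any(f in reachable(delta.get(f, ())) for f in F & reachable(S0))
```

An accepting state sitting on a reachable cycle witnesses an accepted word of the "lasso" form u·v^ω.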

Lemma 2.5: Let P be a problem, let f be a polynomial, and let α, σ, σ0, δ, φ be algorithms that associate with any instance x the encoding of a Büchi automaton A_x = (Σ_x, S_x, S0_x, ρ_x, F_x) in the following manner:
1. if y ∈ Σ_x or y ∈ S_x, then |y| ≤ f(|x|);
2. α, given y ∈ Σ*, determines whether y ∈ Σ_x using at most f(|x|) space;
3. σ, given y ∈ Σ*, determines whether y ∈ S_x using at most f(|x|) space;
4. σ0, given y ∈ Σ*, determines whether y ∈ S0_x using at most f(|x|) space;
5. δ, given u, v, w ∈ Σ*, determines whether u ∈ ρ_x(v, w) using at most f(|x|) space;
6. φ, given y ∈ Σ*, determines whether y ∈ F_x using at most f(|x|) space; and
7. x ∈ P iff A_x accepts some word.
Then there is a polynomial space algorithm for determining membership in P.

Proof: We use the algorithm described in the proof of Theorem 2.4 to check if A_x is nonempty. Given the assumptions we have made, each of the steps in the algorithm can be executed in polynomial space. Moreover, the states being remembered are also of size polynomial in the size of the problem. So, the whole algorithm requires polynomial space. This establishes that the problem is in NPSPACE and hence in PSPACE.

When constructing Büchi automata from temporal logic formulas, we will often have to take the intersection of several Büchi automata. The following lemma is a special case of Theorem A.1 in [VW86a] and extends a construction of [Ch74] (see also [ES84]).

Theorem 2.6: Let A0, ..., A_{k−1} be Büchi automata. There is a Büchi automaton A with k · ∏_{i=0}^{k−1} |A_i| states such that L(A) = L(A0) ∩ ... ∩ L(A_{k−1}).

Proof: Let A_i = (Σ, S^i, S0^i, ρ^i, F^i). Define A = (Σ, S, S0, ρ, F) as follows:
- S = S^0 × ... × S^{k−1} × {0, ..., k − 1},
- S0 = S0^0 × ... × S0^{k−1} × {0},
- F = F^0 × S^1 × ... × S^{k−1} × {0}, and
- (s'_0, ..., s'_{k−1}, j) ∈ ρ((s_0, ..., s_{k−1}, i), a) iff s'_l ∈ ρ^l(s_l, a) for 0 ≤ l ≤ k − 1, and either s_i ∉ F^i and i = j, or s_i ∈ F^i and j = (i + 1) mod k.

Intuitively, the automaton A consists of k copies of the cross product of the automata A_i. One starts by running the first copy of this cross product. One then jumps from the copy i to the copy (i + 1) mod k when a final state of A_i is encountered. The acceptance condition imposes that one goes infinitely often through a final state of A0 in the copy 0. This forces all the components of A to visit their accepting states in a cyclic order. We leave it to the reader to formally prove that L(A) = L(A0) ∩ ... ∩ L(A_{k−1}).
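The product construction of Theorem 2.6 can be sketched directly in code. The Python function below is our illustrative encoding, not the paper's notation: each automaton is a tuple (S, S0, delta, F) over a shared alphabet, with delta mapping (state, letter) to a set of states, and the built product carries the mod-k counter in its last component.

```python
from itertools import product

def intersect(automata):
    """Theorem 2.6 sketch: k copies of the cross product, with a counter that
    advances from copy i to copy (i+1) mod k when component i is accepting."""
    k = len(automata)
    letters = {a for (S, S0, d, F) in automata for (_, a) in d}
    S = list(product(*[A[0] for A in automata], range(k)))
    S0 = {t + (0,) for t in product(*[A[1] for A in automata])}
    # accept when component 0 is accepting and the counter has wrapped to 0
    F = {s for s in S if s[0] in automata[0][3] and s[-1] == 0}
    delta = {}
    for s in S:
        comps, i = s[:-1], s[-1]
        for a in letters:
            targets = [automata[l][2].get((comps[l], a), set())
                       for l in range(k)]
            j = (i + 1) % k if comps[i] in automata[i][3] else i
            delta[(s, a)] = {t + (j,) for t in product(*targets)}
    return S, S0, delta, F
```

With two one-state, always-accepting components, the counter simply alternates between the two copies on every step, which is the cyclic visiting order the proof describes.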

2.3 Subword Automata

To make the construction of Büchi automata from extended temporal logic formulas easier, we define more specialized classes of automata: subword automata and set-subword automata. They are the specialization to words of the subtree automata and set-subtree automata defined in [VW86b]. For completeness sake, we give a detailed treatment of this specialization.

Intuitively, a subword automaton checks that, starting at every position in an infinite word, there is a finite word accepted by some automaton. The automata accepting finite words at the various positions differ only by their initial state. The initial state corresponding to a position is determined by the symbol appearing at that position in the infinite word.

Formally, a subword automaton A is a tuple (Σ, S, λ, ρ, F), where Σ is the alphabet, S is the state set, λ : Σ → S is the labeling function, ρ : S × Σ → 2^S is the transition function, and F ⊆ S is the nonempty set of accepting states. An infinite word w ∈ Σ^ω is accepted by A if the following two conditions hold:
- labeling condition: λ(w_{i+1}) ∈ ρ(λ(wi), wi) for every i ∈ ω;
- subword condition: for every i ∈ ω, there exists some j ≥ i and a mapping φ : [i, j] → S such that φ(i) = λ(wi), φ(j) ∈ F, and for all k, i ≤ k < j, φ(k + 1) ∈ ρ(φ(k), wk).

The labeling condition requires the labeling of the word to be compatible with the transition function of A. The subword condition requires that from each position i in the word, there be a subword accepted by A viewed as an automaton on finite words with initial state λ(wi).

Using a construction similar to the flag construction of Choueka [Ch74], a subword automaton, even without the labeling condition, can be converted to a Büchi automaton with at most an exponential increase in size. We show now that because of the labeling condition we can do this conversion with only a quadratic increase in size. Before proving this we need a technical lemma.

Lemma 2.7: Let A = (Σ, S, λ, ρ, F) be a subword automaton. Then A accepts a word w : ω → Σ iff λ(w_{i+1}) ∈ ρ(λ(wi), wi) for every i ∈ ω, and, for every i ∈ ω, there exists some j > i and a mapping φ : [i, j] → S such that φ(i) = λ(wi), φ(j) ∈ F, and for all k, i ≤ k < j, φ(k + 1) ∈ ρ(φ(k), wk).

Proof: The only difference between the condition in the lemma and the standard condition of acceptance is the requirement that the interval [i, j] contain at least two points and hence that the accepted subword wi ... w_{j−1} be nonempty. Thus the "if" direction is trivial. For the other direction, let i ≥ 0, and consider an interval [i, j] and a mapping φ_i that satisfy the subword condition at i. If j > i, then we are done, so let us assume i = j. We know that there exists some k ≥ i + 1 and a mapping φ_{i+1} : [i + 1, k] → S that satisfies the subword condition at point i + 1. Because of the labeling condition, φ_{i+1}(i + 1) = λ(w_{i+1}) ∈ ρ(λ(wi), wi), so the mapping φ = φ_i ∪ φ_{i+1} satisfies the required conditions at i.

Theorem 2.8: Every subword automaton with m states is equivalent to a Büchi automaton with O(m²) states.

Proof: Let A = (Σ, S, λ, ρ, F) be a subword automaton. We now define two new transition functions ρ1, ρ2 : S × Σ → 2^S:
- ρ1(s, a) = ρ(s, a) if s = λ(a), and ρ1(s, a) = ∅ otherwise;
- ρ2(s, a) = ρ(λ(a), a) if s ∈ F, and ρ2(s, a) = ρ(s, a) otherwise.

These transition functions let us define two Büchi automata B1 = (Σ, S, S, ρ1, S) and B2 = (Σ, S, F, ρ2, F). Basically, B1 will take care of checking the labeling condition and B2 of checking the subword condition. Let us show that a word w : ω → Σ is accepted by A iff it is accepted by both B1 and B2.

Suppose first that w is accepted by B1 and B2. This means there are accepting runs σ1 = s10, s11, s12, ... and σ2 = s20, s21, s22, ... of B1 and B2 (resp.) on w. We verify first that the labeling property holds. Indeed, by the definition of ρ1, we have s1i = λ(wi) for every i ∈ ω, and hence λ(w_{i+1}) = s1,i+1 ∈ ρ1(s1i, wi) = ρ(λ(wi), wi). The labeling condition thus holds, so it remains to show the existence of the "right" subword. Let i ∈ ω. As the run σ2 is accepting, there is some j ≥ i such that s2j ∈ F. For the same reason, there is a k > j such that s2k ∈ F. Assume without loss of generality that s2l ∉ F for all j < l < k. We claim that the interval [i, k] satisfies the subword condition at i. Indeed, a suitable mapping φ : [i, k] → S can be defined in the following way: φ(l) = λ(wl) for l ∈ [i, j] and φ(l) = s2l for l ∈ [j + 1, k]. For i ≤ l < j we have φ(l + 1) = λ(w_{l+1}) ∈ ρ(λ(wl), wl) = ρ(φ(l), wl), by the labeling condition. For l = j we have φ(l + 1) = s2,l+1 ∈ ρ2(s2l, wl) = ρ(λ(wl), wl) = ρ(φ(l), wl), as s2j ∈ F. Finally, for j + 1 ≤ l < k, φ(l + 1) = s2,l+1 ∈ ρ2(s2l, wl) = ρ(s2l, wl) = ρ(φ(l), wl), as s2l ∉ F for j + 1 ≤ l < k. Moreover, φ(k) = s2k ∈ F.

We now have to show that if w is accepted by A then it is accepted by both B1 and B2. Consider the run σ1 = s10, s11, ... where s1i = λ(wi). By the labeling condition, s1,i+1 ∈ ρ1(s1i, wi), and, as all the states of B1 are accepting, σ1 is an accepting run of B1 on w.

We now define a computation σ2 = s20, s21, s22, ... of B2 by defining it on a sequence of increasing intervals [0, j1], [0, j2], ... such that for each ji, s2ji ∈ F. The first interval is [0, 0] and we define s20 = s, where s is an arbitrary member of F. Suppose now that σ2 is defined on the interval [0, jn]. By Lemma 2.7, there exists an interval [jn, j_{n+1}], j_{n+1} > jn, and a mapping φ : [jn, j_{n+1}] → S such that φ(jn) = λ(w_{jn}), φ(j_{n+1}) ∈ F, and for all k, jn ≤ k < j_{n+1}, φ(k + 1) ∈ ρ(φ(k), wk). Without loss of generality, we can assume that φ(k) ∉ F for jn < k < j_{n+1}. For jn < k ≤ j_{n+1}, we define s2k = φ(k). We show now that for every jn ≤ k < j_{n+1}, s2,k+1 ∈ ρ2(s2k, wk). Indeed, if k = jn, then s2k ∈ F and s2,k+1 = φ(k + 1) ∈ ρ(φ(k), wk) = ρ(λ(w_{jn}), wk) = ρ2(s2k, wk). If jn < k < j_{n+1}, then s2k = φ(k) ∉ F and s2,k+1 = φ(k + 1) ∈ ρ(φ(k), wk) = ρ2(s2k, wk). By induction, we have thus defined a run σ2 of B2. Finally, as ∪_{n≥1} [0, jn] = ω and, for each n, s2jn ∈ F, the run σ2 is accepting. We have shown that w is accepted by both B1 and B2.

By Theorem 2.6, we can construct an automaton B such that a word w is accepted by B iff it is accepted by both B1 and B2. It follows that the subword automaton A is equivalent to the Büchi automaton B.

Subword automata are more adequate than Büchi automata for the purpose of reducing satisfiability of formulas to nonemptiness of automata. Nevertheless, to facilitate our task in the rest of the paper, we now specialize the notion of subword automata even further. The words that we shall deal with are going to consist of sets of formulas, and the states of the automata that will accept these words are also going to be sets of formulas. Thus, we consider automata where the alphabet and the set of states are the same set and have the structure of a power set. We will call these automata set-subword automata.
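The two derived transition functions ρ1 and ρ2 in the proof of Theorem 2.8 are easy to compute explicitly. The following Python sketch is ours (dictionaries for λ and ρ are an illustrative encoding, not the paper's notation):

```python
def split_transitions(S, Sigma, lam, rho, F):
    """Theorem 2.8 sketch: from a subword automaton, derive rho1 (which only
    lets the labeling run proceed) and rho2 (which restarts a fresh subword,
    via the label of the current letter, whenever an accepting state is hit).
    lam maps letters to states; rho maps (state, letter) to a set of states."""
    rho1, rho2 = {}, {}
    for s in S:
        for a in Sigma:
            rho1[(s, a)] = rho.get((s, a), set()) if s == lam[a] else set()
            rho2[(s, a)] = (rho.get((lam[a], a), set()) if s in F
                            else rho.get((s, a), set()))
    return rho1, rho2
```

Note that ρ1 is empty outside the labeling run, which is what makes B1 check exactly the labeling condition, while ρ2 agrees with ρ except at accepting states, where it begins verifying the next subword.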

Formally, a set-subword automaton A is a pair (Γ, ρ), where Γ is a finite set of basic symbols (in fact these symbols will be just formulas of the logic). The power set 2^Γ serves both as the alphabet Σ and as the state set S. We will denote elements of 2^Γ by a, b, ... when viewed as letters from the alphabet Σ, and by s, s1, ... when viewed as elements of the state set S. The empty set serves as a single accepting state. Finally, ρ : S × Σ → 2^S is a transition function such that:
1. ρ(s, a) ≠ ∅ iff s ⊆ a;
2. ∅ ∈ ρ(∅, a);
3. if s ⊆ s0, s1 ⊆ s01, and s1 ∈ ρ(s0, a), then s01 ∈ ρ(s, a);
4. if s01 ∈ ρ(s1, a) and s02 ∈ ρ(s2, a), then s01 ∪ s02 ∈ ρ(s1 ∪ s2, a).

A word w : ω → 2^Γ is accepted by A if for every i ∈ ω and every f ∈ wi there exists a finite interval [i, j] and a mapping φ : [i, j] → S such that φ(i) = {f}, φ(j) = ∅, and for all i ≤ k < j, φ(k + 1) ∈ ρ(φ(k), wk).

Intuitively, a letter is a set of formulas that are "alleged" to be true, and a state is a set of formulas for which the automaton tries to verify the "allegation". The transitions of the automaton are meant to capture the fact that if a certain formula appears in wi, then certain formulas must appear in w_{i+1}. A transition of the automaton is thus a minimum requirement on the formulas in w_{i+1} given the formulas that the automaton is trying to verify at i. The acceptance condition requires that for each position i in the word and for each formula f in wi, the "right" formulas appear in the letters w_{i+1}, w_{i+2}, ....

The four conditions imposed on the transition relation can be explained as follows:
1. The formulas in the state of the automaton are formulas that the automaton is trying to verify. A minimal requirement is that these formulas appear in the letter of the scanned position of the word. As we will see, this condition is related to the labeling condition defined for subword automata.
2-3. Clearly, if there is nothing to verify at i, then nothing is required at i + 1 (condition (2)). Moreover, the transition is still legal if we try to verify fewer formulas at i or more formulas at i + 1 (condition (3)). These are what we call monotonicity conditions.
4. The union of two transitions is a legal transition. This is an additivity condition. It says that there is no interaction between the different formulas that the automaton is trying to verify at position i.

The acceptance condition requires that, for each position in the word, if we start the automaton in each of the singleton sets corresponding to the members of the letter of that position, it accepts a finite subword. As the following lemma shows, it is equivalent to require that the automaton accept when started in the state identical to the letter of that position.

Lemma 2.9: Let A = (Γ, ρ) be a set-subword automaton, let w : ω → 2^Γ be a word, and let i ∈ ω. The following are equivalent:
1. For every f ∈ wi, there exists some jf ≥ i and a mapping φf : [i, jf] → S such that φf(i) = {f}, φf(jf) = ∅, and, for all i ≤ k < jf, φf(k + 1) ∈ ρ(φf(k), wk).
2. There exists some j ≥ i and a mapping φ : [i, j] → S such that φ(i) = wi, φ(j) = ∅, and, for all i ≤ k < j, φ(k + 1) ∈ ρ(φ(k), wk).

Proof: (1) ⇒ (2). If wi = ∅, take j = i and we are done; otherwise take j = max{jf}. For every f ∈ wi, extend φf to [i, j] by defining φf(k) = ∅ for jf < k ≤ j; because j ≥ jf for each f ∈ wi, this is possible by condition (2) in the definition of set-subword automata. We then define φ(k) = ∪_{f ∈ wi} φf(k) for each k ∈ [i, j]. Clearly φ(i) = wi and φ(j) = ∅. Furthermore, if i ≤ k < j, then φ(k + 1) ∈ ρ(φ(k), wk), by condition (4) in the definition of set-subword automata.

(2) ⇒ (1). Let f ∈ wi. We take jf = j and define φf as follows: φf(i) = {f} and φf(k) = φ(k) for i < k ≤ jf. Clearly φf(jf) = ∅. We only have to show that φf(k + 1) ∈ ρ(φf(k), wk) for i ≤ k < jf. For k > i this is immediate; for k = i, it follows, since φf(i) ⊆ φ(i), by the monotonicity condition (3) in the definition of set-subword automata.

We will now prove that set-subword automata can be converted to subword automata without any increase in size. Consequently, by Theorem 2.8, a set-subword automaton can be converted to an equivalent Büchi automaton with only a quadratic increase in size.

Theorem 2.10: The set-subword automaton A = (Γ, ρ) is equivalent to the subword automaton A0 = (2^Γ, 2^Γ, λ, ρ, {∅}), where λ is the identity function.

Proof: By Lemma 2.9, it is immediate that if a word w is accepted by the automaton A0, then it is also accepted by A.

For the other direction, let w : ω → 2^Γ be a word accepted by the set-subword automaton A. By Lemma 2.9, for every i ∈ ω there exists some j ≥ i and a mapping φ : [i, j] → S such that φ(i) = wi, φ(j) = ∅, and, for all i ≤ k < j, φ(k + 1) ∈ ρ(φ(k), wk). Since λ is the identity mapping, φ(i) = λ(wi), so the subword condition of the subword automaton A0 is satisfied. It remains to show that λ satisfies the labeling condition, i.e., that w_{i+1} ∈ ρ(wi, wi) for every i ∈ ω.

Let i ∈ ω, and consider the mapping φ given above. If j = i, then wi = φ(i) = φ(j) = ∅; by condition (2) in the definition of set-subword automata, ∅ ∈ ρ(∅, wi), so, by the monotonicity condition, w_{i+1} ∈ ρ(wi, wi). Otherwise, i + 1 ∈ [i, j] and φ(i + 1) ∈ ρ(φ(i), wi) = ρ(wi, wi). Consider now i + 1. If φ(i + 1) = ∅, then clearly φ(i + 1) ⊆ w_{i+1}. Otherwise, ρ(φ(i + 1), w_{i+1}) ≠ ∅, so, by condition (1) in the definition of set-subword automata, φ(i + 1) ⊆ w_{i+1}. Thus, by the monotonicity condition, w_{i+1} ∈ ρ(wi, wi). It follows that the labeling condition holds, and w is accepted by A0.

3 Temporal Logic with Automata Connectives

3.1 Definition

We consider propositional temporal logic where the temporal operators are defined by finite automata, similarly to the extended temporal logic (ETL) of [Wo83]. More precisely, we consider formulas built from a set Prop of atomic propositions. The set of formulas is defined inductively as follows:
- Every proposition p ∈ Prop is a formula.
- If f1 and f2 are formulas, then ¬f1 and f1 ∧ f2 are formulas.
- For every nondeterministic finite automaton A = (Σ, S, S0, ρ, F), where Σ is the input alphabet {a1, ..., an}, if f1, ..., fn are formulas (n = |Σ|), then A(f1, ..., fn) is a formula. We call A an automaton connective.

We assume some standard encoding for formulas. This of course includes also the encoding of the automata connectives. The length of a formula is the length of its encoding.

A structure for our logic is an infinite sequence of truth assignments, i.e., a function π : ω → 2^Prop that assigns truth values to the atomic propositions in each state. Note that such a function is an infinite word over the alphabet 2^Prop. We will thus use the terms word and sequence interchangeably.

We now define satisfaction of formulas and runs of formulas A(f1, ..., fn) over sequences by mutual induction. Satisfaction of a formula f at a position i in a structure π is denoted π, i ⊨ f:
- For an atomic proposition p, π, i ⊨ p iff p ∈ π(i).
- π, i ⊨ ¬f iff not π, i ⊨ f.
- π, i ⊨ f1 ∧ f2 iff π, i ⊨ f1 and π, i ⊨ f2.
- π, i ⊨ A(f1, ..., fn) iff there is an accepting run σ = s0, s1, ... of A(f1, ..., fn) over π, starting at i.

A run of a formula A(f1, ..., fn), where A = (Σ, S, S0, ρ, F), over a structure π, starting at a point i, is a finite or infinite sequence σ = s0, s1, ... of states from S, where s0 ∈ S0 and, for all k, 0 ≤ k < |σ|, there is some aj ∈ Σ such that π, i+k ⊨ fj and s_{k+1} ∈ ρ(s_k, aj).

Depending on how we define accepting runs, we get three different versions of the logic:

- ETLf: A run σ is accepting iff some state s ∈ F occurs in σ (finite acceptance).
- ETLl: A run σ is accepting iff it is infinite (looping acceptance).
- ETLr: A run σ is accepting iff some state s ∈ F occurs infinitely often in σ (repeating or Büchi acceptance).

Example 3.1: Consider the automaton A = (Σ, S, S0, ρ, F), where Σ = {a, b}, S = {s0, s1}, S0 = {s0}, F = {s1}, ρ(s0, a) = {s0}, ρ(s0, b) = {s1}, and ρ(s1, a) = ρ(s1, b) = ∅. If we consider finite acceptance, it accepts the language a*b. It thus defines an ETLf connective such that A(f1, f2) is true of a sequence iff f1 is true until f2 is true. Thus A is equivalent to the "Until" connective of [GPSS80].

Example 3.2: Consider the automaton A1 = (Σ, S, S0, ρ, F), where Σ = {a, b}, S = {s0, s1}, S0 = {s0}, F = ∅, ρ(s0, a) = {s1}, ρ(s0, b) = ∅, ρ(s1, b) = {s0}, and ρ(s1, a) = ∅. If we consider looping acceptance, it accepts only one word: w = abababab.... It thus defines an ETLl connective such that A1(f1, f2) is true of a sequence iff f1 is true in every even state and f2 is true in every odd state of that sequence. It is shown in [Wo83] that this property is not expressible in PTL.

Every formula defines a set of sequences, namely, the set of sequences that satisfy it. We will say that a formula is satisfiable if this set is nonempty. The satisfiability problem is to determine, given a formula f, whether f is satisfiable.

We are also interested in the expressive power of the logics we have defined. Our yardstick for measuring this power is the ability of the logics to define sets of sequences. It is indeed not hard to see that all the extended temporal logics (ETLf, ETLl, and ETLr) are at least as expressive as PTL. It is also not hard to show that our logics are translatable into S1S; hence, by [Bu62] and [McN66], they define ω-regular sets of words. By Corollary 2.2, ETLr is at least as expressive as ETLf and ETLl. We cannot, however, use Corollary 2.2 to infer that ETLr is more expressive than ETLf and ETLl; similarly, Corollary 2.2 does not give any information about the relative expressive power of ETLf and ETLl.

3.2 Translations to Automata and Decision Procedure

As we have pointed out, the structures over which ETL is interpreted can be viewed as infinite words over the alphabet 2^Prop.
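To make the acceptance conditions concrete, the finite-acceptance semantics of Example 3.1 can be sketched in a few lines of Python. This is an illustration of ours, not part of the paper: the names `finitely_accepts` and `truth` are ours, and since structures are infinite, the search is truncated at a finite horizon.

```python
# A sketch (not from the paper): the automaton connective of Example 3.1,
# with letter a standing for argument f1 and letter b for f2, checked
# under finite acceptance.

SIGMA = ["a", "b"]
S0, F = {"s0"}, {"s1"}
RHO = {("s0", "a"): {"s0"}, ("s0", "b"): {"s1"},
       ("s1", "a"): set(), ("s1", "b"): set()}

def finitely_accepts(truth, horizon=100):
    """truth(i, letter) says whether the argument formula attached to
    `letter` holds at position i; we search runs up to `horizon` steps."""
    states = set(S0)
    for i in range(horizon):
        if states & F:
            return True             # some run already hit an accepting state
        nxt = set()
        for s in states:
            for a in SIGMA:
                if truth(i, a):     # letter a may be read iff f_a holds at i
                    nxt |= RHO[(s, a)]
        states = nxt
    return bool(states & F)

# f1 = "p", f2 = "q" over a word where q first holds at position 3:
word = [{"p"}, {"p"}, {"p"}, {"q"}]
truth = lambda i, a: i < len(word) and (("p" in word[i]) if a == "a" else ("q" in word[i]))
print(finitely_accepts(truth))   # True: p Until q holds here
```

Under looping acceptance one would instead check that the set of states reachable along the structure never becomes empty: since the run trees are finitely branching, a never-empty state set yields an infinite run by König's lemma.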

Since negation in front of automata connectives causes an exponential blow-up, we might have expected the complexity of the translation from our logics to automata to be nonelementary, as is the translation from S1S to Büchi automata [Bu62]. Not so: for each of the logics we have defined, given a formula f in one of the logics ETLf, ETLl, or ETLr, there is an exponential translation to Büchi automata; that is, one can build a Büchi automaton, of size exponential in the length of f, that accepts exactly the words satisfying f. This translation also yields a PSPACE decision procedure for the satisfiability problem. In this section we will give the translation for ETLf and ETLl; the translation for ETLr is given in [SVW87]. We start with the translation for ETLf and then outline the differences that occur when dealing with ETLl.

We first need to define the notion of the closure of an ETLf formula g, denoted cl(g). It is similar in nature to the closure defined for PDL in [FL79]. Intuitively, the closure of g consists of all subformulas of g and their negations. From now on we identify a formula ¬¬g′ with g′. Given an automaton A = (Σ, S, S0, ρ, F), for each s ∈ S we define A_s to be the automaton (Σ, S, {s}, ρ, F), and A_s(g1, ..., gn) is considered to be a subformula of A(g1, ..., gn). The closure cl(g) of an ETLf formula g is then defined as follows:

- g ∈ cl(g);
- g1 ∧ g2 ∈ cl(g) → g1, g2 ∈ cl(g);
- ¬g1 ∈ cl(g) → g1 ∈ cl(g);
- g1 ∈ cl(g) → ¬g1 ∈ cl(g);
- A(g1, ..., gn) ∈ cl(g) → g1, ..., gn ∈ cl(g);
- A(g1, ..., gn) ∈ cl(g) → A_s(g1, ..., gn) ∈ cl(g), for all s ∈ S.

Note that the cardinality of cl(g) can easily be seen to be at most 2l, where l is the length of g.

A structure π : ω → 2^Prop for an ETLf formula g can be extended to a sequence τ : ω → 2^cl(g) in a natural way: with each point i ∈ ω, we associate the formulas in cl(g) that are satisfied at that point. Sequences over 2^cl(g) that correspond to models of a formula satisfy some special properties. A Hintikka sequence for an ETLf formula g is a sequence τ : ω → 2^cl(g) that satisfies conditions 1-5 below.

To state these conditions, we use a mapping that, for an automaton connective A, associates states of A with elements of 2^cl(g). Precisely, given A(g1, ..., gn) ∈ cl(g) with A = (Σ, S, S0, ρ, F), we define a mapping δ_A : 2^cl(g) → 2^S such that δ_A(a) = {s ∈ ρ(s0, al) : s0 ∈ S0, al ∈ Σ, and gl ∈ a}. In the following conditions, whenever A(g1, ..., gn) ∈ cl(g) is mentioned, we assume that A = (Σ, S, S0, ρ, F). The conditions are then:
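The closure computation can be sketched as follows. This is an illustration of ours, not the paper's: the tuple encoding of formulas (`"prop"`, `"not"`, `"and"`, `"auto"`) and the function name `closure` are assumptions, with an automaton connective carrying its tuple of states and a start-state tag (`None` for A itself, a state name for the variant A_s).

```python
# A sketch (representation ours): computing cl(g) for a toy ETLf syntax.
#   ("prop", p) | ("not", f) | ("and", f, g) | ("auto", states, start, args)

def closure(g):
    cl = set()

    def strip(f):                      # identify ~~h with h
        while f[0] == "not" and f[1][0] == "not":
            f = f[1][1]
        return f

    def add(f):
        f = strip(f)
        if f in cl:
            return
        cl.add(f)
        if f[0] == "not":              # every formula comes with its negation
            add(f[1])
        else:
            add(("not", f))
        if f[0] == "and":
            add(f[1]); add(f[2])
        elif f[0] == "auto":
            states, _, args = f[1], f[2], f[3]
            for h in args:             # the arguments are subformulas ...
                add(h)
            for s in states:           # ... and so is every variant A_s
                add(("auto", states, s, args))

    add(g)
    return cl

g = ("and", ("prop", "p"),
     ("auto", ("s0", "s1"), None, (("prop", "p"), ("prop", "q"))))
print(len(closure(g)))   # 12 formulas, consistent with |cl(g)| <= 2l
```

The variants A_s generate the same set of variants again, so the recursion terminates, and the closure stays linear in the length of the formula.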

1. g ∈ τ(0);
2. h ∈ τ(i) iff ¬h ∉ τ(i);
3. h ∧ h′ ∈ τ(i) iff h ∈ τ(i) and h′ ∈ τ(i);
4. if A(g1, ..., gn) ∈ τ(i), then either S0 ∩ F ≠ ∅, or there exist some j > i and a mapping φ : [i, j] → 2^cl(g) such that:
   - φ(k) ⊆ τ(k) for i ≤ k ≤ j,
   - A_{s0}(g1, ..., gn) ∈ φ(i) for some s0 ∈ S0,
   - φ(j) = ∅,
   - and, for all i ≤ k < j, if A_s(g1, ..., gn) ∈ φ(k), then either s ∈ F or there is some t ∈ δ_{A_s}(τ(k)) such that A_t(g1, ..., gn) ∈ φ(k+1);
5. if A(g1, ..., gn) ∉ τ(i), then S0 ∩ F = ∅ and A_s(g1, ..., gn) ∉ τ(i+1) for all s ∈ δ_A(τ(i)).

Hintikka conditions 2 and 3 are intended to capture the semantics of the propositional connectives, while Hintikka conditions 4 and 5 are intended to capture the semantics of the automata connectives. Note that for each propositional connective we need one condition using a double implication ("iff"), while we need two conditions for automata connectives due to the use of a single implication ("if ... then"). The reason for doing this is that Hintikka condition 5 is weaker than the converse of Hintikka condition 4; this makes it easier to construct an automaton that checks these conditions.

Proposition 3.3: An ETLf formula g has a model iff it has a Hintikka sequence.

Proof: Only if: Let g be an ETLf formula and let π be a model of g, i.e., π, 0 ⊨ g. We define a Hintikka sequence τ for g as follows: for all i ≥ 0, τ(i) = {h ∈ cl(g) : π, i ⊨ h}. We now have to show that τ is indeed a Hintikka sequence. By the definition of a model, π, 0 ⊨ g, and this implies Hintikka condition 1. That Hintikka conditions 2, 3, and 5 hold follows immediately from the semantic definition of ETLf; condition 4 is obtained by building, for each A(g1, ..., gn) ∈ τ(i), the mapping φ from the finitely accepting runs of A over π starting at i.

If: Let g be an ETLf formula and let τ be a Hintikka sequence for g. Consider the structure π such that π(i) = {p ∈ Prop : p ∈ τ(i)}. We now show that π is a model of g. For this, we show by induction that, for all g′ ∈ cl(g) and i ∈ ω, g′ ∈ τ(i) iff π, i ⊨ g′; since g ∈ τ(0) by Hintikka condition 1, it then follows that π, 0 ⊨ g. For the base case (g′ ∈ Prop), this is immediate by construction. The inductive step for formulas of the form ¬h and h ∧ h′ follows directly from Hintikka conditions 2 and 3.

It remains to prove the inductive step for formulas of the form A(g1, ..., gn).

Suppose first that A(g1, ..., gn) ∈ τ(i). By Hintikka condition 4, either S0 ∩ F ≠ ∅, or there are finite sequences s0, ..., sk of states and j0, ..., j_{k-1} of indices, k ≥ 0, such that s0 ∈ S0, sk ∈ F, and, if 0 ≤ l < k, then g_{jl} ∈ τ(i+l) and s_{l+1} ∈ ρ(s_l, a_{jl}). By the inductive hypothesis, π, i+l ⊨ g_{jl}. Thus there is a finitely accepting run of A over π starting at i, so π, i ⊨ A(g1, ..., gn).

Suppose now that A(g1, ..., gn) ∉ τ(i) but π, i ⊨ A(g1, ..., gn). Then there is a finitely accepting run of A over π starting at i; that is, there are finite sequences s0, ..., sk and j0, ..., j_{k-1}, k ≥ 0, such that s0 ∈ S0, sk ∈ F, and, if 0 ≤ l < k, then π, i+l ⊨ g_{jl} and s_{l+1} ∈ ρ(s_l, a_{jl}). By Hintikka condition 5 and by induction on k, it follows that A_{sl}(g1, ..., gn) ∉ τ(i+l) for 0 ≤ l ≤ k. In particular, A_{sk}(g1, ..., gn) ∉ τ(i+k). But sk ∈ F, so Hintikka condition 5 is violated, a contradiction.

We now build a Büchi automaton over the alphabet 2^cl(g) that accepts precisely the Hintikka sequences for g. We do that by building two automata, AL and AE, such that L(AL) ∩ L(AE) is the set of Hintikka sequences for g. The first automaton AL, called the local automaton, checks the sequence locally: it checks Hintikka conditions 1-3 and 5. The second automaton AE, called the eventuality automaton, is a set-subword automaton that checks Hintikka condition 4. Finally, we convert the eventuality automaton to a Büchi automaton and combine it with the local automaton.

The Local Automaton

The local automaton is AL = (2^cl(g), 2^cl(g), Ng, ρL, 2^cl(g)). It is a Büchi automaton; its state set is the collection of all sets of formulas in cl(g). The set of starting states Ng consists of all sets s such that g ∈ s. For the transition relation ρL, we have that s′ ∈ ρL(s, a) iff a = s and:

- h ∈ s iff ¬h ∉ s;
- h ∧ h′ ∈ s iff h ∈ s and h′ ∈ s;
- if A(g1, ..., gn) ∉ s, then S0 ∩ F = ∅ and, for all t ∈ δ_A(a), A_t(g1, ..., gn) ∉ s′.

Clearly, AL accepts precisely the sequences that satisfy Hintikka conditions 1-3 and 5.

The Eventuality Automaton

The eventuality automaton is a set-subword automaton AE = (cl(g), 2^cl(g), ρE). This automaton ensures that for all eventualities (i.e., formulas of the form A(g1, ..., gn)), there is some finite word for which Hintikka condition 4 is satisfied. For the transition relation ρE, we have that v ∈ ρE(u, a) iff:

- if A(g1, ..., gn) ∈ u, then either S0 ∩ F ≠ ∅ or there is some t ∈ δ_A(a) such that A_t(g1, ..., gn) ∈ v;
- if A_t(g1, ..., gn) ∈ u, then either t ∈ F or there is some t′ ∈ δ_{A_t}(a) such that A_{t′}(g1, ..., gn) ∈ v.

It is immediate to check that conditions (1-4) of the definition of set-subword automata are satisfied for AE, and AE accepts precisely the sequences that satisfy Hintikka condition 4. To summarize, we have the following:

Proposition 3.4: Let g be an ETLf formula and let τ : ω → 2^cl(g) be a sequence. Then τ is a Hintikka sequence for g iff τ ∈ L(AL) ∩ L(AE).

By Theorems 2.8 and 2.10, we can build a Büchi automaton that accepts L(AE). By Theorem 2.6, we can then build a Büchi automaton accepting L(AL) ∩ L(AE); this automaton will have 2^(3|cl(g)|+1) states. The construction contains some redundancy, since the work done by the component that checks for the labeling condition (from the proof of Theorem 2.7) is subsumed by the work done by the local automaton. By eliminating this redundancy, we can build an equivalent automaton with 2^(2|cl(g)|) states.

The automaton we have constructed accepts words over 2^cl(g); however, the models of g are defined by words over 2^Prop. So the last step of our construction is to take the projection of our automaton on 2^Prop. This is done by mapping each element b ∈ 2^cl(g) into the element a ∈ 2^Prop such that b ∩ Prop = a. Equivalently, the resulting automaton runs over a word over 2^Prop, guesses a corresponding word over 2^cl(g), and verifies that the latter is a Hintikka sequence for g. We now have:

Theorem 3.5: Given an ETLf formula g built from a set of atomic propositions Prop, one can construct a Büchi automaton Ag (over the alphabet 2^Prop), whose size is 2^O(|g|), that accepts exactly the sequences satisfying g.

We can now give an algorithm and complexity bounds for the satisfiability problem for ETLf. To test satisfiability of an ETLf formula g, it is sufficient to test whether L(Ag) ≠ ∅. Moreover, it is not hard to see that the construction of Ag satisfies the conditions of Lemma 2.5, so this test can be done in nondeterministic logarithmic space in the size of Ag. Combining this with the fact that satisfiability for propositional temporal logic (a restriction of ETLf) is PSPACE-hard [SC85], we have:

Theorem 3.6: The satisfiability problem for ETLf is logspace complete for PSPACE.

Let us now consider ETLl. The construction proceeds similarly to the one for ETLf. The first difference is that Hintikka conditions 4 and 5 are replaced by the following:

4′) if A(g1, ..., gn) ∈ τ(i), then there is some s ∈ δ_A(τ(i)) such that A_s(g1, ..., gn) ∈ τ(i+1);

5′) if A(g1, ..., gn) ∉ τ(i), then there exist some j ≥ i and a mapping φ : [i, j] → 2^cl(g) such that:
- φ(k) ⊆ τ(k) for i ≤ k ≤ j,
- ¬A_{s0}(g1, ..., gn) ∈ φ(i) for all s0 ∈ S0,
- φ(j) = ∅,
- and, for all i ≤ k < j, if ¬A_s(g1, ..., gn) ∈ φ(k), then ¬A_t(g1, ..., gn) ∈ φ(k+1) for all t ∈ δ_{A_s}(τ(k)).

The analogue of Proposition 3.3 holds:

Proposition 3.7: An ETLl formula g has a model iff it has a Hintikka sequence.

Proof: As in Proposition 3.3, one direction is immediate. For the other direction, let g be an ETLl formula and let τ be a Hintikka sequence for g. Consider the structure π such that π(i) = {p ∈ Prop : p ∈ τ(i)}. We show by induction that, for all g′ ∈ cl(g) and i ∈ ω, g′ ∈ τ(i) iff π, i ⊨ g′. We only show the inductive step that corresponds to automata connectives.

Suppose first that A(g1, ..., gn) ∈ τ(i). By repeated application of Hintikka condition 4′ (to A and to its variants A_s), there are infinite sequences s0, s1, ... of states and j0, j1, ... of indices such that s0 ∈ S0 and, for all l ≥ 0, s_{l+1} ∈ ρ(s_l, a_{jl}) and g_{jl} ∈ τ(i+l). By the induction hypothesis, π, i+l ⊨ g_{jl} for all l ≥ 0. Thus s0, s1, ... is an infinite run of A(g1, ..., gn) over π starting at i, so π, i ⊨ A(g1, ..., gn).

Suppose now that A(g1, ..., gn) ∉ τ(i). If π, i ⊨ A(g1, ..., gn), then there is an infinite run σ = s0, s1, ... of A(g1, ..., gn) over π starting at i. More explicitly, s0 ∈ S0 and, for all k ≥ 0, there is some a_{jk} ∈ Σ such that π, i+k ⊨ g_{jk} and s_{k+1} ∈ ρ(s_k, a_{jk}). By the induction hypothesis, g_{jk} ∈ τ(i+k) for all k ≥ 0. But then Hintikka condition 5′ cannot hold, a contradiction. Thus π, i ⊭ A(g1, ..., gn), which completes the induction. Since g ∈ τ(0), it follows that π, 0 ⊨ g.

We again construct an automaton that recognizes Hintikka sequences in two parts: the local automaton and the eventuality automaton. This time, the local automaton deals with Hintikka conditions 1-3 and 4′, and the eventuality automaton deals with condition 5′. The local and eventuality automata differ from those for ETLf only by their transition functions, which are the following:

The Local Automaton: We have that s′ ∈ ρL(s, a) iff a = s and:

- h ∈ s iff ¬h ∉ s;
- h ∧ h′ ∈ s iff h ∈ s and h′ ∈ s;
- if A(g1, ..., gn) ∈ s, then, for some t ∈ δ_A(s), we have that A_t(g1, ..., gn) ∈ s′.

The Eventuality Automaton: We have that s′ ∈ ρE(s, a) iff: if ¬A(g1, ..., gn) ∈ s, then, for all t ∈ δ_A(a), we have that ¬A_t(g1, ..., gn) ∈ s′.

We now have the analogues of Proposition 3.4 and Theorems 3.5 and 3.6:

Proposition 3.8: Let g be an ETLl formula and let τ : ω → 2^cl(g) be a sequence. Then τ is a Hintikka sequence for g iff τ ∈ L(AL) ∩ L(AE).

Theorem 3.9: Given an ETLl formula g built from a set of atomic propositions Prop, one can construct a Büchi automaton Ag (over the alphabet 2^Prop), whose size is 2^O(|g|), that accepts exactly the sequences satisfying g.

Theorem 3.10: The satisfiability problem for ETLl is logspace complete for PSPACE.
using weak alternating automata is described in MSS88]. gn) 2 s0. we combine the local automata with the eventuality automata checking the satisfaction of eventualities. The construction described above simultaneously takes care of running automata in parallel. Our local automata correspond to those maximal models. This construction always yields automata whose size is exponential in the size of the formula. when we have an automata connective nested within another automata connective. However.If :A(g1.6: Proposition 3. and complementing automata. then is a Hintikka sequence for g i 2 L(AL) \ L(AE ).5 and 3. whose size is 2O(jgj). the size of the automaton Ag constructed in SVW87] for an ETLr formula g is 2O(jgj2).4 and Theorems 3. Thus. A direct construction of a Buchi automaton to check for eventualities would have to essentially use the constructions of Theorems 2. In that approach not only there 20 . Theorem 3. where a special construction is needed to complement Buchi automata. This technique can sometimes be more e cient than the maximal model technique. however. Thus. the exponent for ETLf and ETLl is linear. We could also construct our automata using the tableau technique of Pr80]. A major feature of our framework here is the use of set-subword automata to check for eventualities.10: The satis ability problem for ETLl is logspace complete for PSPACE.8: Let g be an ETLl formula and : ! ! 2cl(g) be a sequence. His approach eliminates the need for distinction between the local automaton and the eventuality automaton. Another approach. when we have a negated automata connective.9. the construction can be viewed as combining the classical subset construction of RS59] and Choueka's \ ag construction" in Ch74]. then for all s 2 A (a) we have that :As(g1. Using results by Safra Sa88]. and then one eliminates states whose eventualities are not satis ed. 
A major feature of our framework here is the use of set-subword automata to check for eventualities. To appreciate the simplicity of our construction using set-subword automata, the reader is encouraged to try to directly construct a Büchi automaton that checks Hintikka condition 4 for ETLf or Hintikka condition 5′ for ETLl. A direct construction of a Büchi automaton to check for eventualities would essentially have to use the constructions of Theorems 2.8 and 2.9; as a result, the construction can be viewed as combining the classical subset construction of [RS59] and Choueka's "flag construction" [Ch74].

This should be contrasted with the treatment of ETLr in [SVW87], where a special construction is needed to complement Büchi automata. As a result, while the constructions given here for ETLf and ETLl, as well as the construction given in [SVW87] for ETLr, are exponential, the size of the automaton Ag constructed in [SVW87] for an ETLr formula g is 2^O(|g|^2); using results by Safra [Sa88], this can be improved to 2^O(|g| log |g|). Thus the exponent for ETLf and ETLl is linear, which is provably optimal, while it is nonlinear for ETLr.

There are, however, other possible approaches to the translation of formulas to automata. It is interesting to compare our technique here to Pratt's model construction technique [Pr79]: we could also construct our automata using the tableau technique of [Pr80]. There one starts by building a maximal model, and then one eliminates states whose eventualities are not satisfied. Our local automata correspond to those maximal models; however, instead of eliminating states, we combine the local automata with the eventuality automata checking the satisfaction of eventualities. Streett [St90] suggested using so-called formula-checking automata; his approach eliminates the need for a distinction between the local automaton and the eventuality automaton, and this technique can sometimes be more efficient than the maximal model technique. Another approach, using weak alternating automata, is described in [MSS88]. In that approach not only is there no distinction between the local automaton and the eventuality automaton, but the automaton is constructed by a simple induction on the structure of the formula. The construction simultaneously takes care of running automata in parallel (when we have an automata connective nested within another automata connective) and of complementing automata (when we have a negated automata connective). Such a construction, however, always yields automata whose size is exponential in the size of the formula.

We believe, however, that the distinction between the local automaton, which checks local properties, and the eventuality automaton, which checks global properties, is fundamental. For ETLr, or for the temporal μ-calculus, the "unitary" approaches of [MSS88, St90] do not generalize: there, the construction of the local automaton is straightforward, and the main difficulty lies in the construction of the "global" automaton, which is based on nontrivial automata-theoretic results from [Sa88] [SVW87, Va88].

4 Translations Among the Logics

The results of the previous section show that the sets of sequences describable by ETLf, ETLl, or ETLr formulas are expressible by Büchi automata. In fact, ETLr has exactly the same expressive power as Büchi automata. Since the notions of finite and looping acceptance are weaker than the notion of repeating acceptance, it would be conceivable for ETLf and ETLl to be less expressive than ETLr, and hence than Büchi automata. We show that this is not the case, as there is a translation of ETLr formulas to ETLf and ETLl. Before giving these constructions, we show that there are straightforward translations between ETLf and ETLl. As we will see, these translations involve an exponential increase in the length of the formulas.

We start by going from ETLf to ETLl.

Theorem 4.1: Given an ETLf formula g of length m, one can construct an ETLl formula g′ of length 2^O(m) that is satisfied by exactly the same sequences as g.

Proof: To prove our theorem, we need to show how an ETLf formula A(f1, ..., fn), where A is defined by a nondeterministic finite acceptance automaton, can be translated into ETLl. The idea of this translation is that the sequences accepted by a finite acceptance automaton are those rejected by a closely related looping acceptance automaton.

To determinize finite acceptance automata, we can use the classical subset construction of [RS59]. The subset construction alone, however, does not assure sufficient determinism: even if the automaton A is deterministic, a formula A(f1, ..., fn) could be nondeterministic, because states in a structure might satisfy more than one formula fi. What we want is that, in a given state, exactly one argument formula is satisfied; then, if the automaton is deterministic, there is, given a structure, only one run of the deterministic automaton over that structure. To overcome this type of nondeterminism, we replace A = (Σ, S, S0, ρ, F) by an automaton over 2^Σ. The automaton is A′ = (2^Σ, S, S0, ρ′, F), where the transition relation ρ′ is defined as follows: si ∈ ρ′(sj, X) iff si ∈ ρ(sj, a) for some a ∈ X. Now, a formula A(f1, ..., fn) is equivalent to A′(g1, ..., g_{2^n}), where the formula gi corresponding to a set Xi ⊆ Σ is (∧_{aj ∈ Xi} fj) ∧ (∧_{aj ∉ Xi} ¬fj). Clearly, in a given state, exactly one gi is satisfied.
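The mutually exclusive argument formulas g_X of Theorem 4.1 can be sketched as follows. This is an illustration of ours, not the paper's notation: `exclusive_args` is a hypothetical name, and each g_X is represented schematically as a pair (positive conjuncts, negated conjuncts).

```python
# A sketch (ours): for each set X of letters, g_X asserts that exactly the
# argument formulas {f_a : a in X} hold, so exactly one g_X is true in any
# state of a structure.

from itertools import combinations

def exclusive_args(f):
    """f maps each letter to its argument formula; returns {X: g_X}."""
    sigma = sorted(f)
    subsets = [frozenset(c) for r in range(len(sigma) + 1)
               for c in combinations(sigma, r)]
    return {X: ([f[a] for a in sorted(X)],                     # /\_{a in X} f_a
                [("not", f[a]) for a in sigma if a not in X])  # /\ not f_a
            for X in subsets}

gs = exclusive_args({"a": ("prop", "p"), "b": ("prop", "q")})
print(len(gs))   # 2^|Sigma| = 4 argument formulas
```

The 2^n argument formulas are exactly where the exponential blow-up of the translation comes from.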

We can now give our translations. We first deal with ETLf. Our next step is to apply the subset construction to the automaton A′. This yields an automaton A″ = (2^Σ, 2^S, {S0}, ρ″, F″), where s′ ∈ ρ″(a, s) iff s′ = {t ∈ S | ∃s ∈ s such that t ∈ ρ′(s, a)}, and F″ = {s ∈ 2^S | s ∩ F ≠ ∅}. Now, for a given structure, there is a unique computation of A″(g1, ..., g_{2^n}) on that structure, and that computation is accepting iff it reaches a state in F″. In other words, the computation is nonaccepting iff it is a looping computation of the automaton A‴ = (2^Σ, 2^S − F″, {S0}, ρ″, ∅),(8) where ρ″ is restricted to the states in 2^S − F″. Hence our translation of A(f1, ..., fn) into ETLl is

¬A‴(g1, ..., g_{2^n}).  (1)

The size of (1) is clearly exponential in the size of A(f1, ..., fn), and hence the translation of a formula of length m will be of length 2^O(m).

We now turn to the translation from ETLl to ETLf.

Theorem 4.2: Given an ETLl formula g of length m, one can construct an ETLf formula g′ of length 2^O(m) that is satisfied by exactly the same sequences as g.

Proof: The proof is identical to that of Theorem 4.1, except that this time the automaton A‴ is a finite acceptance automaton defined from A″: the unique computation of A″ fails to loop iff it eventually reaches the empty set of states, a condition that finite acceptance can detect.

To translate from ETLr to ETLf and ETLl, we need to deal with connectives defined by nondeterministic Büchi automata. Here, too, we will use the determinization of the automaton defining a connective. However, the automata are now Büchi automata, and they cannot be determinized by the subset construction. It is possible to build a deterministic finite-state automaton corresponding to a Büchi automaton, but this automaton will have to use a more general type of accepting condition than the infinite repetition of some state in a set F [McN66, Ch74]. Rather than using this more general type of automaton, we will rely on a construction that partially determinizes Büchi automata.

Lemma 4.3 [Sa88]: Given a Büchi automaton A = (Σ, S, S0, ρ, F), where |S| = m, one can construct k ≤ m nondeterministic automata on finite words Ai, each of size at most O(m), and k deterministic Büchi automata Bi, each of size at most 2^O(m), such that the language L(A) accepted by A satisfies L(A) = ∪_{1 ≤ i ≤ k} L(Ai)L(Bi).

Theorem 4.4: Given an ETLr formula g of length m, one can construct an ETLf formula g′ of length 2^O(m) that is satisfied by exactly the same sequences as g.

(8) Remember that for looping automata the set of designated states is irrelevant.
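The subset construction used for A″ above can be sketched generically. This is an illustration of ours (the name `determinize` and the dictionary encoding of the transition relation are assumptions):

```python
# A sketch (ours): the classical subset construction of [RS59], as used for
# the automaton A'' in Theorems 4.1 and 4.2.

from itertools import chain

def determinize(sigma, s0, rho):
    """NFA (rho: (state, letter) -> set of states) to a DFA whose states
    are frozensets of NFA states, with start state frozenset(s0)."""
    start = frozenset(s0)
    states, trans, todo = {start}, {}, [start]
    while todo:
        t = todo.pop()
        for a in sigma:
            u = frozenset(chain.from_iterable(rho.get((s, a), ()) for s in t))
            trans[(t, a)] = u
            if u not in states:
                states.add(u)
                todo.append(u)
    return start, states, trans

# The a*b automaton of Example 3.1:
start, states, trans = determinize("ab", {"s0"},
                                   {("s0", "a"): {"s0"}, ("s0", "b"): {"s1"}})
print(len(states))   # {s0}, {s1} and the empty "dead" subset: 3 states
```

For finite acceptance, the accepting subsets are those meeting F, i.e., F″ = {t : t ∩ F ≠ ∅}; the empty subset acts as the rejecting sink.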

Proof: For convenience in the proof, we temporarily use the following definition. Given a set of ETLr formulas {f1, ..., fn}, a set L of infinite words over an alphabet Σ = {a1, ..., an} is satisfied by a sequence π iff there is a word w1w2... in L such that π, i ⊨ fj if wi = aj. Note that this definition coincides with the definition of automata connectives when the language L is the language accepted by an automaton.

To prove our theorem, we need to show how an ETLr formula A(f1, ..., fn), where A is defined by a nondeterministic Büchi automaton, can be translated into ETLf. The main difficulty is to express in ETLf the condition imposed by the repetition set of the nondeterministic Büchi automaton. To do this, we will replace the nondeterministic Büchi automaton A by the deterministic automata Bi of Lemma 4.3, and we will then show how repetition for deterministic Büchi automata can be expressed in ETLf.

Similarly to the proof of Theorem 4.1, we will want the deterministic automata Bi to be deterministic over the structure. This implies that we have to ensure that only one argument of the automaton connective is true in each state of the structure. Thus we first use the same construction as in Theorem 4.1 and replace the automaton A = (Σ, S, S0, ρ, F) by the automaton A′ = (2^Σ, S, S0, ρ′, F); the formula A(f1, ..., fn) is then equivalent to A′(g1, ..., g_{2^n}). Now, using Lemma 4.3 on the automaton A′ enables us to express the language accepted by A′ as

L(A′) = ∪_{1 ≤ i ≤ k} L(A′i)L(B′i).

Each of the terms of the union defines a language of infinite strings, and each of these languages is satisfied by a given set of sequences. Clearly, a sequence satisfies A′(g1, ..., g_{2^n}) iff it satisfies one of the languages L(A′i)L(B′i). Thus, if we build an ETLf formula fi corresponding to each of the terms of the union, the ETLf formula corresponding to A′(g1, ..., g_{2^n}) will be

∨_{1 ≤ i ≤ k} fi.

We now give the translation for each of the terms of the union. We start with the infinite part of the term, B′i. Given a deterministic Büchi automaton B′i = (2^Σ, S′i, S′0i, ρ′i, F′i), we want to construct an ETLf formula f′i such that a sequence satisfies f′i iff it satisfies B′i(g1, ..., g_{2^n}). We use the fact that, given an infinite sequence, there is exactly one run of B′i(g1, ..., g_{2^n}) over that sequence. That run is accepting iff it goes through some state in F′i infinitely often; it is nonaccepting iff, after some point, it never goes through a state in F′i. We will express the latter condition and then negate it. To do so, we express that B′i eventually reaches a state s after which it never goes through a state in F′i.

The sequences that cause B′i never to go through a state in F′i from the state s on are those rejected by the finite acceptance automaton B′i,s = (2^Σ, S′i, {s}, ρ′i, F′i), that is, B′i started in the state s. They are hence also those satisfying the formula

φs = ¬B′i,s(g1, ..., g_{2^n}).

To express that B′i eventually reaches s and satisfies φs, we use the finite acceptance automaton

B″i,s = (2^Σ ∪ {a}, S′i ∪ {e}, S′0i, ρ″i,s, {e}),

where a is a new letter, e is a new state, and the transition relation ρ″i,s is ρ′i extended with ρ″i,s(s, a) = {e}. The formula

f″i,s = B″i,s(g1, ..., g_{2^n}, φs)

then expresses that the automaton B′i eventually reaches the state s and from then on never goes through any state in F′i. Now, to state that the computation of the automaton B′i is accepting, we need to state, for all states s ∈ S′i, that f″i,s does not hold:

f′i = ∧_{s ∈ S′i} ¬f″i,s.

Finally, to construct fi, we need to express that there is some computation of A′i after which f′i holds. Let A′i be the automaton (2^Σ, T′i, T′0i, θ′i, G′i). We build the automaton

A″i = (2^Σ ∪ {a}, T′i ∪ {e}, T′0i, θ″i, {e}),

where a is a new letter, e is a new state, and the transition relation θ″i is θ′i extended with θ″i(s, a) = {e} for s ∈ G′i. We then finally have that

fi = A″i(g1, ..., g_{2^n}, f′i).

The size of each B′i is exponential in the size of the original automaton A, whereas the size of each A′i is linear. The formula fi contains at most one copy of A′i and 2|B′i| copies of B′i; it is thus exponential in the size of A. Hence the formula ∨_{1 ≤ i ≤ k} fi is also exponential in the size of A(f1, ..., fn). Thus, when translating an ETLr formula of length l to ETLf, the result will be of length 2^O(l).

We now consider the translation to ETLl.

Theorem 4.5: Given an ETLr formula g of length m, one can construct an ETLl formula g′ of length 2^O(m) that is satisfied by exactly the same sequences as g.

Proof: The proof is along the same lines as that of Theorem 4.4; however, we use looping acceptance automata instead of finite acceptance automata. We outline how this is done for each of the finite acceptance automata appearing in the proof of Theorem 4.4.

The first finite acceptance automaton used is B′i,s. We use instead the looping automaton

B′i,s,loop = (2^Σ, S′i − F′i, {s}, ρ′i, ∅),

that is, B′i,s with the set of designated states removed and the transition relation updated accordingly (recall that for looping automata the set of designated states is irrelevant). The formula φs then becomes

φs = B′i,s,loop(g1, ..., g_{2^n}).

The second finite acceptance automaton used is B″i,s. We replace it by a looping automaton B″i,s,loop, whose transition relation ρ″i,s,loop is defined so that B″i,s,loop is strictly deterministic. We then have

f″i,s = ¬B″i,s,loop(g1 ∧ ¬φs, ..., g_{2^n} ∧ ¬φs, g1 ∧ φs, ..., g_{2^n} ∧ φs).

Intuitively, we have constructed the operator B″i,s,loop so that it is true only if the formula φs is false each time B′i goes through the state s; thus its negation is satisfied iff at some point B′i goes through the state s with φs true.

The last finite acceptance operator appearing in the proof of Theorem 4.4 is A″i. As we have shown in Theorem 4.1, it is always possible to replace an ETLf operator by an ETLl operator at the cost of an exponential increase in size. This does not affect the exponential overall complexity of our translation, as the automaton A″i is of size linear in the length of the original formula.

5 Alternating Temporal Logic

The results of the preceding sections show that our technique is applicable to automata connectives and to the negation of automata connectives. This suggests that this technique can also deal with alternation. This is what we do here.

Given a set S of states, let us denote by BS the set of all Boolean formulas that use the states in S as variables. An alternating finite automaton [BL80, CKS81] (abbr. AFA) A is a tuple A = (Σ, S, φ0, ρ, F), where Σ is the input alphabet, S is the set of states {s1, ..., sm}, φ0 ∈ BS is the start formula, ρ : S × Σ → BS is the transition function that associates with each state and letter a Boolean formula in BS, and F ⊆ S is the set of accepting states.

Members of BS can be viewed as Boolean-valued functions on 2^S. Let φ ∈ BS and S′ ⊆ S; then φ(S′) is the Boolean value of φ when the states in S′ are assigned 1 and the states in S − S′ are assigned 0. We can extend ρ to BS: ρ(φ, a) is obtained by substituting ρ(sj, a) in φ for each sj, 1 ≤ j ≤ m. For example, if ρ(s1, a) = s3 ∨ s4 and ρ(s2, a) = s3 ∧ ¬s4, then ρ(¬s1 ∨ s2, a) = ¬(s3 ∨ s4) ∨ (s3 ∧ ¬s4) = (¬s3 ∧ ¬s4) ∨ (s3 ∧ ¬s4).

We first consider acceptance of finite words by AFA. The run of A on a finite word w = a1 ... al is the sequence φ0, φ1, ..., φl of formulas from BS, where φi = ρ(φ_{i−1}, ai). A accepts w if φl(F) = 1.
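The evaluation φ(S′) used in the acceptance test φl(F) = 1 can be sketched as follows (an illustration of ours; the tuple encoding of Boolean formulas and the name `value` are assumptions):

```python
# A sketch (representation ours): computing phi(S'), the Boolean value of a
# formula over states when exactly the states in S' are assigned 1.
# Formulas: a state name, or ("not" | "and" | "or", subformulas...).

def value(phi, s_true):
    if isinstance(phi, str):
        return phi in s_true
    op, args = phi[0], phi[1:]
    if op == "not":
        return not value(args[0], s_true)
    if op == "and":
        return all(value(g, s_true) for g in args)
    if op == "or":
        return any(value(g, s_true) for g in args)
    raise ValueError("unknown connective: " + op)

# rho(s1, a) = s3 \/ s4 from the example in the text:
print(value(("or", "s3", "s4"), {"s4"}))   # True
```

With this, A accepts a finite word w exactly when `value(phi_l, F)` is true for the last formula of the run.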

one can construct an 2n -state NFA that accepts the same language. Then there is a set S 00 S such that ('x. aj )(S 00) = 1 and for all ' 2 (S 00) there is a child y of x labeled by '. Nevertheless. Let x be an internal node of depth j labeled by 'x. An in nite run forest of A on an in nite word w = a1a2 : : : is a collection of in nite trees labeled by atomic formulas satisfying the following conditions: All branches are in nite. A accepts w if it has an accepting run forest on w. Repeating acceptance is also de ned by means of in nite run forests. A accepts w if it has a run forest on w. all branches of the run forest are not required to have the same length. Let x be an internal node of depth j . A accepts w if it has an accepting run forest on w. Finite acceptance is de ned by means of nite acceptance forests. Afa's de ne regular languages. given any n-state AFA. There is a set S 0 S such that '0(S 0) = 1 and for all ' 2 (S 0) there is a tree in the forest whose root is labeled by '. The condition is that along any branch of the run forest there are in nitely many accepting nodes. 0 j < l. Again. Then there is a set S 00 S such that ('x. The notion of alternation can also be extended to automata on in nite words MH84]. Let x be an internal node of depth j labeled by 'x. A node x labeled by 'x is accepting if 'x(F ) = 1. for each n there is an n-states AFA A. 26 . Then there is a set S 00 S such that ('x. aj )(S 00) = 1 and for all ' 2 (S 00) there is a child y of x labeled by '. Furthermore. labeled by 'x. as opposed to the de nition used in the case of nite words. Note that. The run forest is accepting if x is accepting whenever x is a leaf. the run forest is accepting if x is accepting whenever x is a leaf. 
A nite run forest of A on an in nite word w = a1a2 : : : is a collection of nite trees labeled by atomic formulas satisfying the following conditions: There is a set S 0 S such that '0(S 0) = 1 and for all ' 2 (S 0) there is a tree in the forest whose root is labeled by '. CKS81] that they can be exponentially more succinct than NFA's.The length of all branches is l. it follows from the results in Le81]. Looping acceptance is de ned by means of in nite run forests. aj )(S 00) = 1 and for all ' 2 (S 00) there is a child y of x labeled by '. There is a set S 0 S such that '0(S 0) = 1 and for all ' 2 (S 0) there is a tree in the forest whose root is labeled by '. That is. such that the language de ned by A is not de nable by any NFA with less than 2n states.
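As an illustration of these definitions, finite-word AFA acceptance can be checked by evaluating the transition formulas backward over the word; for positive transition formulas this coincides with the run-forest definition. The sketch below is our own, not the paper's; the state names and the example automaton are hypothetical.

```python
# A hedged sketch: finite-word AFA acceptance via backward evaluation.
# A formula in BS is modeled as a predicate on subsets of the state set S.

def afa_accepts(word, states, rho, phi0, accepting):
    """acc is the set of states that accept the remaining suffix; for the
    empty suffix these are exactly the accepting states (phi_x(F) = 1)."""
    acc = set(accepting)
    for letter in reversed(word):
        acc = {s for s in states if rho[s, letter](acc)}
    return phi0(acc)

# Hypothetical example: q0 branches universally on 'a' into q1 (which
# accepts every suffix) and q2 (which accepts only all-'a' suffixes), so
# the AFA accepts exactly the nonempty words consisting solely of 'a'.
states = {"q0", "q1", "q2"}
rho = {
    ("q0", "a"): lambda T: "q1" in T and "q2" in T,  # universal branching
    ("q0", "b"): lambda T: False,
    ("q1", "a"): lambda T: "q1" in T,
    ("q1", "b"): lambda T: "q1" in T,
    ("q2", "a"): lambda T: "q2" in T,
    ("q2", "b"): lambda T: False,
}
phi0 = lambda T: "q0" in T
F = {"q1", "q2"}
```

The universal conjunct in ρ(q0, a) is exactly where alternation differs from nondeterminism: both spawned states must accept the same remaining input.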

As is the case with finite words, alternation does not add any expressive power beyond nondeterminism. More precisely, a finite acceptance (resp. looping acceptance, repeating acceptance) AFA has the same expressive power as a finite acceptance (resp. looping acceptance, repeating acceptance) NFA [MH84]. Thus the only gain in using alternation is in succinctness.

We first deal with the alternating analog of ETLf. ATLf is defined analogously to ETLf, with AFA connectives replacing NFA connectives. To define the semantics of automata connectives we first adapt the definition of finite run forests. Notice the difference between the definitions of run forests for AFA and for AFA formulas. AFA read a complete description of the states (i.e., the input letter), while AFA formulas read only a partial description of the states (i.e., the input formula). In a run forest of an AFA all "spawned" automata read the same input. This is not the case in run forests of AFA formulas, since it is possible that more than one argument of the AFA connective holds at a point i of the structure σ. We believe that this definition is the appropriate one in our context. Furthermore, as is shown in [HRV90], using the standard definition of run forests for AFA causes an exponential increase in the complexity of the logic.

For an AFA A = (Σ, S, ρ, φ0, F) and a formula φ ∈ BS, we define A^φ to be the AFA (Σ, S, ρ, φ, F). A finite run forest of A(f1, ..., fn), where A = (Σ, S, ρ, φ0, F), on a structure σ starting at i is a collection of finite trees labeled by atomic formulas satisfying the following conditions:

- There is a set S′ ⊆ S such that φ0(S′) = 1 and for all φ ∈ S′ there is a tree in the forest whose root is labeled by φ.
- Let x be an internal node of depth j labeled by φx. Then there are an al ∈ Σ and a set S″ ⊆ S such that σ, i + j ⊨ fl, ρ(φx, al)(S″) = 1, and for all φ ∈ S″ there is a child y of x labeled by φ.

The run forest is accepting if x is accepting whenever x is a leaf. We can now define the semantics of ATLf:

- σ, i ⊨ p, for an atomic proposition p, iff p ∈ σ(i).
- σ, i ⊨ ¬f iff not σ, i ⊨ f.
- σ, i ⊨ f1 ∧ f2 iff σ, i ⊨ f1 and σ, i ⊨ f2.
- σ, i ⊨ A(f1, ..., fn) if and only if A(f1, ..., fn) has an accepting finite run forest on σ starting at i.
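The run-forest semantics of an ATLf connective can be prototyped directly: at point i + j every argument fl that holds supplies an enabled letter al, and a node may be a leaf only when its state is accepting. The sketch below is our own illustration, assuming positive transition formulas; `args_hold` is an assumed oracle for satisfaction of the arguments, `horizon` is a finite-prefix assumption, and all names are hypothetical.

```python
from itertools import combinations

def powerset(xs):
    xs = list(xs)
    return (frozenset(c) for r in range(len(xs) + 1)
            for c in combinations(xs, r))

def atlf_sat(states, rho, phi0, F, args_hold, i, horizon):
    """Check whether A(f1,...,fn) has an accepting finite run forest on the
    structure starting at point i, exploring branches only up to `horizon`.
    args_hold(j) is the set of letters al whose argument fl holds at j."""
    def good(j):
        # States that can root an accepting finite subtree from point j on.
        ok = set(F)                      # accepting states may be leaves
        if j < horizon:
            deeper = good(j + 1)
            for s in states:
                if s in ok:
                    continue
                for al in args_hold(j):
                    # a nonempty child set T with rho(s, al)(T) = 1
                    if any(rho[s, al](T) for T in powerset(deeper) if T):
                        ok.add(s)
                        break
        return ok
    return any(phi0(T) for T in powerset(good(i)))

# Hypothetical connective A(f1): q0 must read the letter a1 (f1 must hold
# now) and hand over to the accepting state q1.
states = {"q0", "q1"}
rho = {("q0", "a1"): lambda T: "q1" in T,
       ("q1", "a1"): lambda T: False}
phi0 = lambda T: "q0" in T
F = {"q1"}
```

Enumerating subsets is exponential in |S|; it is meant only to mirror the "there is a set S″" clause of the definition literally.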

Clearly, ATLf is at least as expressive as ETLf. Indeed, if we restrict the formulas in BS to be positive disjunctions, then ATLf reduces to ETLf. The interest in this logic is twofold. First, since the translation from ATLf to ETLf causes a one-exponential blow-up, one may expect ATLf to be of higher complexity than ETLf; surprisingly, it has the same complexity. Also, the study of AFA connectives reveals the full power of our techniques.

Theorem 5.1:
1. The satisfiability problem for ATLf is complete in PSPACE.
2. There is an exponential translation from ATLf to ETLf.

Proof: We prove here the first claim; the second claim follows as in Section 3. Given an ATLf formula g built from a set of atomic propositions Prop, one can construct a Büchi automaton Ag (over the alphabet 2^Prop), whose size is 2^O(|g|), that accepts exactly the sequences satisfying g. The proof parallels those given in Section 3: we are given a formula g, and we construct an automaton Ag which is the cross product of the local automaton and the eventuality automaton.

The notion of closure is similar to that in Section 3. For an ATLf formula g, cl(g) is defined as follows:

- g ∈ cl(g)
- g1 ∧ g2 ∈ cl(g) → g1, g2 ∈ cl(g)
- ¬g1 ∈ cl(g) → g1 ∈ cl(g)
- g1 ∈ cl(g) → ¬g1 ∈ cl(g)
- A(g1, ..., gn) ∈ cl(g) → g1, ..., gn ∈ cl(g)
- A(g1, ..., gn) ∈ cl(g) → A^s(g1, ..., gn) ∈ cl(g), for all s ∈ S
- A(g1, ..., gn) ∈ cl(g) → A^¬s(g1, ..., gn) ∈ cl(g), for all s ∈ S

Hintikka sequences for ATLf formulas are defined similarly to those used in Section 3, except for conditions 4 and 5. We again need a technical definition to state these conditions. In the following conditions, whenever A(g1, ..., gn) ∈ cl(g) is mentioned, we assume that A = (Σ, S, ρ, φ0, F). Given A(g1, ..., gn) ∈ cl(g), we define for each φ ∈ BS a mapping ρ̂A^φ : 2^cl(g) → 2^(2^S) such that ρ̂A^φ(a) = {T : for some al ∈ Σ, gl ∈ a and ρ(φ, al)(T) = 1}, where ρ(φ, al) denotes the formula obtained from φ by replacing each state s by ρ(s, al). The new conditions 4 and 5 are then:

4″) If A(g1, ..., gn) ∈ τ(i), then there exists some j ≥ i and a mapping ζ : [i, j] → 2^cl(g) such that:
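To make the closure rules concrete, here is a small sketch (our own; the tuple encoding of formulas is hypothetical) that computes cl(g) for formulas built from atomic propositions, conjunction, negation, and AFA connectives. Each connective contributes the variants A^s and A^¬s for every state s, so |cl(g)| stays linear in |g| and in |S|.

```python
# Sketch of the closure rules for ATLf.  An automaton connective is encoded
# as ('aut', name, start, states, args), where start is ('init',), ('s', s)
# for the variant A^s, or ('ns', s) for the variant A^(not s).

def closure(g):
    cl = set()

    def add(h):
        if h in cl:
            return
        cl.add(h)
        kind = h[0]
        if kind == 'and':
            add(h[1]); add(h[2])
        elif kind == 'not':
            add(h[1])
        elif kind == 'aut':
            _, name, start, states, args = h
            for arg in args:          # arguments g1, ..., gn
                add(arg)
            for s in states:          # state variants A^s and A^(not s)
                add(('aut', name, ('s', s), states, args))
                add(('aut', name, ('ns', s), states, args))

    add(g)
    # g1 in cl(g) implies (not g1) in cl(g); double negations identified.
    for h in list(cl):
        if h[0] != 'not':
            cl.add(('not', h))
    return cl
```

For instance, a connective A(p, ¬p) over two states yields five automaton formulas (the original plus four variants), their negations, and the two argument formulas.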

- there is a set S′ ⊆ S such that φ0(S′) = 1 and for all ψ ∈ S′ we have A^ψ(g1, ..., gn) ∈ ζ(i);
- ζ(j) = ∅;
- for all k, i ≤ k < j, if A^φ(g1, ..., gn) ∈ ζ(k), then either
  - φ(F) = 1, or
  - there is some S″ ∈ ρ̂A^φ(τ(k)) such that for all ψ ∈ S″ we have A^ψ(g1, ..., gn) ∈ ζ(k + 1).

5″) If ¬A(g1, ..., gn) ∈ τ(i), then φ0(F) = 0 and for all S′ ∈ ρ̂A^φ0(τ(i)) there exists some ψ ∈ S′ such that ¬A^ψ(g1, ..., gn) ∈ τ(i + 1).

The Local Automaton. The local automaton is L = (2^cl(g), 2^cl(g), Ng, δL). The set of starting states Ng consists of all sets s such that g ∈ s. For the transition relation δL, we have that s′ ∈ δL(s, a) iff a = s and:

- g1 ∈ s iff ¬g1 ∉ s;
- g1 ∧ g2 ∈ s iff g1 ∈ s and g2 ∈ s;
- if A^φ(g1, ..., gn) ∈ s, then either φ(F) = 1 or there is some S′ ∈ ρ̂A^φ(a) such that for all ψ ∈ S′ we have A^ψ(g1, ..., gn) ∈ s′;
- if ¬A^φ(g1, ..., gn) ∈ s, then φ(F) = 0 and for all S′ ∈ ρ̂A^φ(a) there exists some ψ ∈ S′ such that ¬A^ψ(g1, ..., gn) ∈ s′.

The Eventuality Automaton. The eventuality automaton is a set-subword automaton AE = (cl(g), 2^cl(g), δE). For the transition relation δE, we have that s′ ∈ δE(s, a) iff: if A^φ(g1, ..., gn) ∈ s, where φ is atomic, then either φ(F) = 1 or there is some S″ ∈ ρ̂A^φ(a) such that for all ψ ∈ S″ we have A^ψ(g1, ..., gn) ∈ s′.

Finally, Ag is the cross product of the local automaton and the eventuality automaton.

We now deal with the alternating analog of ETLl. ATLl is defined analogously to ETLl, with AFA connectives replacing NFA connectives. We now define the semantics of automata connectives in ATLl. We first need the following definition: an infinite run forest of A(f1, ..., fn), where A = (Σ, S, ρ, φ0, F), on a structure σ starting at i is a collection of infinite trees labeled by states of S or negations of states of S satisfying the following conditions:

- All branches are infinite.
- There is a set S′ ⊆ S such that φ0(S′) = 1 and for all φ ∈ S′ there is a tree in the forest whose root is labeled by φ.
- Let x be a node of depth j labeled by φx. Then there are an al ∈ Σ and a set S″ ⊆ S such that σ, i + j ⊨ fl, ρ(φx, al)(S″) = 1, and for all φ ∈ S″ there is a child y of x labeled by φ.

We can now define satisfaction for automata connectives in ATLl: σ, i ⊨ A(f1, ..., fn) if and only if A(f1, ..., fn) has an infinite run forest on σ starting at i.

Clearly, ATLl is at least as expressive as ETLl. Indeed, if we restrict the formulas in BS to be positive disjunctions, then ATLl reduces to ETLl. The proof of the following theorem is similar to the proof of Theorem 5.1 and is left to the reader.

Theorem 5.2:
1. The satisfiability problem for ATLl is complete in PSPACE.
2. There is an exponential translation from ATLl to ETLl.

Also, given an ATLl formula g built from a set of atomic propositions Prop, one can construct a Büchi automaton Ag (over the alphabet 2^Prop), whose size is 2^O(|g|), that accepts exactly the sequences satisfying g.

Finally, we describe the alternating analog of ETLr. Recall that an accepting node in a run forest is a node x labeled by φx such that φx(F) = 1. In ATLr, an infinite run forest is accepting if every branch has infinitely many accepting nodes. Satisfaction for automata connectives in ATLr is defined by: σ, i ⊨ A(f1, ..., fn) if and only if A(f1, ..., fn) has an accepting infinite run forest on σ starting at i. In [MH84] it is shown that alternation can be eliminated from repeating acceptance automata by an exponential construction. We conjecture that this construction can be combined with the techniques of [SVW87] to prove the analogue of Theorems 5.1 and 5.2 (though with a nonlinear exponent in the exponential bound) for ATLr.
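Looping acceptance asks for an infinite run forest with no stopping condition. On an ultimately periodic word u·v^ω this can be decided by a greatest-fixpoint computation: repeatedly discard state/position pairs that cannot spawn a nonempty set of surviving children. The sketch below is ours, under the same positive-formula assumption as before, and all names are hypothetical.

```python
from itertools import combinations

def powerset(xs):
    xs = list(xs)
    return (frozenset(c) for r in range(len(xs) + 1)
            for c in combinations(xs, r))

def afa_loops(states, rho, phi0, prefix, cycle):
    """Looping acceptance of an AFA on the word prefix + cycle^omega.
    `alive` shrinks to the pairs (s, p) from which an infinite tree exists;
    the set F plays no role under looping acceptance."""
    word = list(prefix) + list(cycle)
    first_of_cycle = len(prefix)
    succ = lambda p: p + 1 if p + 1 < len(word) else first_of_cycle
    alive = {(s, p) for s in states for p in range(len(word))}
    changed = True
    while changed:
        changed = False
        for s, p in sorted(alive):
            nxt = {t for t in states if (t, succ(p)) in alive}
            # every node needs a nonempty set of surviving children,
            # otherwise some branch would be finite
            if not any(rho[s, word[p]](T) for T in powerset(nxt) if T):
                alive.discard((s, p))
                changed = True
    roots = {s for s in states if (s, 0) in alive}
    return any(phi0(T) for T in powerset(roots))

# Hypothetical one-state example: q can keep running on 'a' but dies on 'b',
# so the automaton loops on a^omega but not on (ab)^omega.
states = {"q"}
rho = {("q", "a"): lambda T: "q" in T,
       ("q", "b"): lambda T: False}
phi0 = lambda T: "q" in T
```

The repeating variant would additionally track visits to accepting states along the cycle; this sketch handles only the looping condition.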

6 Concluding Remarks

There are other approaches to extending PTL. In [HP85] the language is extended by regular operators corresponding to concatenation and the Kleene star. This, however, pushes the decision problem for their language to nonelementary complexity. In [Si83], PTL is extended with quantifiers over propositions; for this extension, the complexity of the decision procedure is again nonelementary.

The extension of PTL closest to the one discussed in this paper is probably that of the temporal μ-calculus, in which PTL is extended with fixpoint operators. It was introduced in [EC80]; see also [BKP85, BKP86]. It has the same expressive power as ETL and, as is shown in [Va88], also has a PSPACE-complete decision problem (see also [BB89]). It is interesting to note, however, that this last result was obtained using an extension of the automata-theoretic techniques presented in this paper.

It should be noted that the automata-theoretic techniques developed here are also applicable to propositional dynamic logic; see [Va85a, VW86a]. Also, the temporal logics discussed here can be combined with dynamic logic to yield expressive process logics which are also amenable to automata-theoretic methods [VW83].

The relation between the various types of acceptance criteria for finite automata can also be presented within the framework of propositional dynamic logic (PDL) [FL79]. Since we are reasoning here about computation paths, we consider PDL with one deterministic atomic program (1DPDL). The finite acceptance condition corresponds to the diamond construct [FL79], the looping acceptance condition corresponds to the loop construct [HP79], and the repeating acceptance condition corresponds to the repeat construct [HP79]. It follows that adding to 1DPDL looping, repeating, fixpoint, and even quantification over propositions does not change the expressive power of the language and does not render it undecidable. This is in sharp contrast with what happens for propositional dynamic logic in general. It is known that PDL is less expressive than PDL + loop (Pratt, see [HS82]), which is less expressive than PDL + repeat [HS82], which is less expressive than PDL + fixpoint [Ko83], which can be shown to be less expressive than PDL + quantification. Furthermore, the last language can even be shown to be highly undecidable.

In conclusion, we note that our results have an interesting interpretation from a purely automata-theoretic point of view. The ability to have an automaton operator nested within another automaton operator is essentially equivalent to the ability of an automaton to consult an oracle. That is, when the automaton reaches certain states it asks the oracle whether it accepts the rest of the word, and its next move depends on the answer. Consider now the following hierarchy. Let A0 be the class of nondeterministic finite automata, for i > 0 let Ai be the class of nondeterministic finite automata with oracles from Ai-1, and let A be the union of the Ai for i ≥ 0. If C is a class of automata, then L(C) denotes the class of languages defined by automata in C. Now we have three hierarchies Af, Al, and Ar, depending whether we use finite, looping, or repeating acceptance.

Our results show that not only do these hierarchies collapse, but that they are also equivalent: L(Af) = L(Al) = L(Ar). However, we might have expected the complexity of the emptiness problem for automata in Af, Al, and Ar to be nonelementary, since each complementation causes at least an exponential blow-up. (This should be contrasted with the nonelementariness of the emptiness problem for regular expressions with complement [MS73].) Our results show that the problem is logspace complete for PSPACE for the three types of automata.

Acknowledgements. We wish to thank R. Fagin, R. Rosner, and L. Stockmeyer for helpful comments. We are also grateful to R. Streett and an anonymous referee for their careful reviews of the paper.

References

[AH90] R. Alur and T. Henzinger, "Real-time Logics: Complexity and Expressiveness", Proc. 5th IEEE Symp. on Logic in Computer Science, 1990, pp. 390-401.
[BB89] B. Banieqbal and H. Barringer, "Temporal Logic with Fixed Points", Temporal Logic in Specification, Lecture Notes in Computer Science 398, Springer-Verlag, 1989, pp. 62-74.
[BBP89] B. Banieqbal, H. Barringer, and A. Pnueli, eds., Temporal Logic in Specification, Lecture Notes in Computer Science 398, Springer-Verlag, 1989.
[BKP85] H. Barringer, R. Kuiper, and A. Pnueli, "A Compositional Temporal Approach to a CSP-like Language", in Formal Methods of Programming (E. Neuhold and G. Chroust, eds.), North Holland, 1985, pp. 207-227.
[BKP86] H. Barringer, R. Kuiper, and A. Pnueli, "A Really Abstract Concurrent Model and its Temporal Logic", Proc. 13th ACM Symp. on Principles of Programming Languages, St. Petersburgh, 1986, pp. 173-183.
[BL80] J. Brzozowski and E. Leiss, "On Equations for Regular Languages, Finite Automata, and Sequential Networks", Theoretical Computer Science 10(1980), pp. 19-35.
[Bu62] J.R. Büchi, "On a Decision Method in Restricted Second Order Arithmetic", Proc. Internat. Congr. on Logic, Method. and Philos. Sci. 1960, Stanford University Press, 1962, pp. 1-12.

[Ch74] Y. Choueka, "Theories of Automata on ω-Tapes: A Simplified Approach", J. Computer and System Sciences 8(1974), pp. 117-141.
[CKS81] A. Chandra, D. Kozen, and L. Stockmeyer, "Alternation", J. ACM 28(1981), pp. 114-133.
[EC80] E.A. Emerson and E.M. Clarke, "Characterizing Correctness Properties of Parallel Programs as Fixpoints", Proc. 7th Int. Colloquium on Automata, Languages and Programming, Lecture Notes in Computer Science 85, Springer-Verlag, 1980, pp. 169-181.
[ES84] E.A. Emerson and A.P. Sistla, "Deciding Full Branching Time Logic", Information and Control 61(1984), pp. 175-201.
[FL79] M. Fischer and R. Ladner, "Propositional Dynamic Logic of Regular Programs", J. Computer and System Sciences 18(2), 1979, pp. 194-211.
[GPSS80] D. Gabbay, A. Pnueli, S. Shelah, and J. Stavi, "The Temporal Analysis of Fairness", Proc. 7th ACM Symp. on Principles of Programming Languages, Las Vegas, 1980, pp. 163-173.
[Ha79] D. Harel, First-Order Dynamic Logic, Lecture Notes in Computer Science 68, Springer-Verlag, 1979.
[Ha84] D. Harel, "Dynamic Logic", in Handbook of Philosophical Logic, vol. 2 (D. Gabbay and F. Guenther, eds.), 1984, pp. 497-604.
[HP79] D. Harel and V. Pratt, "Nondeterminism in Logics of Programs", Proc. 5th ACM Symp. on Principles of Programming Languages, Tucson, 1978, pp. 203-213.
[HKP80] D. Harel, D. Kozen, and R. Parikh, "Process Logic: Expressiveness, Decidability, Completeness", J. Computer and System Sciences 25(2), 1982, pp. 144-170.
[HP85] D. Harel and D. Peleg, "Process Logic with Regular Formulas", Theoretical Computer Science 38(1985), pp. 307-322.
[HRV90] D. Harel, R. Rosner, and M.Y. Vardi, "On the Power of Bounded Concurrency III: Reasoning about Programs", Proc. 5th IEEE Symp. on Logic in Computer Science, 1990, pp. 478-488.
[HR83] J.Y. Halpern and J.H. Reif, "The Propositional Dynamic Logic of Deterministic, Well-Structured Programs", Theoretical Computer Science 27(1983), pp. 127-165.
[HS82] D. Harel and R. Sherman, "Looping vs. Repeating in Dynamic Logic", Information and Control 55(1982), pp. 175-192.
[HS84] D. Harel and R. Sherman, "Propositional Dynamic Logic of Flowcharts", Information and Control 64(1985), pp. 119-135.
[Jo75] N. Jones, "Space-bounded Reducibility among Combinatorial Problems", J. Computer and System Sciences 11(1975), pp. 68-85.
[Ka85] M. Kaminski, "A Classification of ω-Regular Languages", Theoretical Computer Science 36(1985), pp. 217-229.
[Ko83] D. Kozen, "Results on the Propositional μ-Calculus", Theoretical Computer Science 27(1983), pp. 333-354.
[Lad77] R. Ladner, "Application of Model-Theoretic Games to Discrete Linear Orders and Finite Automata", Information and Control 33(1977), pp. 281-303.
[Lam77] L. Lamport, "Proving the Correctness of Multiprocess Programs", IEEE Trans. on Software Engineering SE-3(1977), pp. 125-143.
[Lan69] L. Landweber, "Decision Problems for ω-Automata", Math. Systems Theory 3(1969), pp. 376-384.
[Le81] E. Leiss, "Succinct Representation of Regular Languages by Boolean Automata", Theoretical Computer Science 13(1981), pp. 323-330.
[LPZ85] O. Lichtenstein, A. Pnueli, and L. Zuck, "The Glory of the Past", Proc. Workshop on Logics of Programs, Brooklyn, Lecture Notes in Computer Science 193, Springer-Verlag, 1985, pp. 196-218.
[McN66] R. McNaughton, "Testing and Generating Infinite Sequences by a Finite Automaton", Information and Control 9(1966), pp. 521-530.
[Me75] A. Meyer, "Weak Monadic Second Order Theory of Successor is not Elementary Recursive", Proc. Logic Colloquium, Lecture Notes in Mathematics 453, Springer-Verlag, 1975, pp. 132-154.
[MH84] S. Miyano and T. Hayashi, "Alternating Automata on ω-Words", Theoretical Computer Science 32(1984), pp. 321-330.
[Mi80] R. Milner, A Calculus of Communicating Systems, Lecture Notes in Computer Science 92, Springer-Verlag, 1980.
[MSS88] D. Muller, A. Saoudi, and P. Schupp, "Weak Alternating Automata Give a Simple Explanation of Why Most Temporal and Dynamic Logics are Decidable in Exponential Time", Proc. 3rd IEEE Symp. on Logic in Computer Science, 1988, pp. 422-427.

[MS73] A. Meyer and L. Stockmeyer, "Non-elementary Word Problems in Automata and Logic", Proc. AMS Symp. on Complexity of Computation, April 1973.
[MY88] T. Moriya and H. Yamasaki, "Accepting Conditions for Automata on ω-Languages", Theoretical Computer Science 61(1988), pp. 137-147.
[MP89] Z. Manna and A. Pnueli, "The Anchored Version of the Temporal Framework", in Linear Time, Branching Time, and Partial Order in Logics and Models for Concurrency (J.W. de Bakker, W.P. de Roever, and G. Rozenberg, eds.), Lecture Notes in Computer Science 354, Springer-Verlag, 1989, pp. 201-284.
[Mu63] D. Muller, "Infinite Sequences and Finite Machines", Proc. 4th IEEE Symp. on Switching Circuit Theory and Logical Design, New York, 1963, pp. 3-16.
[MW84] Z. Manna and P. Wolper, "Synthesis of Communicating Processes from Temporal Logic Specifications", ACM Trans. on Programming Languages and Systems 6(1984), pp. 68-93.
[Ni80] H. Nishimura, "Descriptively Complete Process Logic", Acta Informatica 14(1980), pp. 359-369.
[Pn77] A. Pnueli, "The Temporal Logic of Programs", Proc. 18th IEEE Symp. on Foundations of Computer Science, Providence, 1977, pp. 46-57.
[PR89] A. Pnueli and R. Rosner, "On the Synthesis of a Reactive Module", Proc. 16th ACM Symp. on Principles of Programming Languages, Austin, 1989, pp. 179-190.
[Pr76] V.R. Pratt, "Semantical Considerations on Floyd-Hoare Logic", Proc. 17th IEEE Symp. on Foundations of Computer Science, October 1976, pp. 109-121.
[Pr79] V.R. Pratt, "Models of Program Logics", Proc. 20th IEEE Symp. on Foundations of Computer Science, San Juan, 1979, pp. 115-122.
[Pr80] V.R. Pratt, "A Near-Optimal Method for Reasoning about Action", J. Computer and System Sciences 20(1980), pp. 231-254.
[Pr81] V.R. Pratt, "Using Graphs to Understand PDL", Proc. Workshop on Logics of Programs, Yorktown-Heights (D. Kozen, ed.), Lecture Notes in Computer Science 131, Springer-Verlag, 1982, pp. 387-396.
[RS59] M. Rabin and D. Scott, "Finite Automata and their Decision Problems", IBM Journal of Research and Development 3(1959), pp. 114-125.

[Sa88] S. Safra, "On the Complexity of ω-Automata", Proc. 29th IEEE Symp. on Foundations of Computer Science, 1988, pp. 319-327.
[SC85] A.P. Sistla and E.M. Clarke, "The Complexity of Propositional Linear Temporal Logic", J. ACM 32(1985), pp. 733-749.
[Sh79] A. Shaw, "Software Specification Languages Based on Regular Expressions", Technical Report, ETH Zurich, June 1979.
[Si83] A.P. Sistla, "Theoretical Issues in the Design and Analysis of Distributed Systems", PhD Thesis, Harvard University, 1983.
[St82] R.S. Streett, "Propositional Dynamic Logic of Looping and Converse is Elementarily Decidable", Information and Control 54(1982), pp. 121-141.
[St90] R.S. Streett, personal communication, 1990.
[Sta87] L. Staiger, "Research in the Theory of ω-Languages", Electron. Inf. Verarbeit. Kybernetik 23(1987), pp. 415-439.
[SVW87] A.P. Sistla, M.Y. Vardi, and P. Wolper, "The Complementation Problem for Büchi Automata with Applications to Temporal Logic", Theoretical Computer Science 49(1987), pp. 217-237.
[TB73] B.A. Trakhtenbrot and Y.M. Barzdin, Finite Automata: Behavior and Synthesis, North-Holland, 1973.
[Tho79] W. Thomas, "Star-Free Regular Sets of ω-Sequences", Information and Control 42(1979), pp. 148-156.
[Tho81] W. Thomas, "A Combinatorial Approach to the Theory of ω-Automata", Information and Control 48(1981), pp. 261-283.
[Tho90] W. Thomas, "Automata on Infinite Objects", in Handbook of Theoretical Computer Science, Vol. B (J. van Leeuwen, ed.), Elsevier, 1990, pp. 135-191.
[VW86a] M.Y. Vardi and P. Wolper, "An Automata-Theoretic Approach to Automatic Program Verification", Proc. 1st IEEE Symp. on Logic in Computer Science, 1986, pp. 332-344.
[VW86b] M.Y. Vardi and P. Wolper, "Automata-Theoretic Techniques for Modal Logics of Programs", J. Computer and System Sciences 32(1986), pp. 183-221.
[Va85a] M.Y. Vardi, "The Taming of the Converse: Reasoning about Two-way Computations", Proc. Workshop on Logics of Programs, Brooklyn, Lecture Notes in Computer Science 193, Springer-Verlag, 1985, pp. 413-424.

[Va85b] M.Y. Vardi, "Automatic Verification of Probabilistic Concurrent Finite-State Programs", Proc. 26th IEEE Symp. on Foundations of Computer Science, Portland, Oct. 1985, pp. 327-338.
[Va88] M.Y. Vardi, "A Temporal Fixpoint Calculus", Proc. 15th ACM Symp. on Principles of Programming Languages, San Diego, 1988, pp. 250-259.
[VW83] M.Y. Vardi and P. Wolper, "Yet Another Process Logic", Proc. Workshop on Logics of Programs, Lecture Notes in Computer Science 164, Springer-Verlag, 1983, pp. 501-512.
[Wa79] K. Wagner, "On ω-Regular Sets", Information and Control 43(1979), pp. 123-177.
[Wo82] P. Wolper, "Synthesis of Communicating Processes from Temporal Logic Specifications", Ph.D. Thesis, Stanford University, 1982.
[Wo83] P. Wolper, "Temporal Logic Can Be More Expressive", Information and Control 56(1983), pp. 72-99.
[Wo89] P. Wolper, "On the Relation of Programs and Computations to Models of Temporal Logic", in Temporal Logic in Specification (B. Banieqbal, H. Barringer, and A. Pnueli, eds.), Lecture Notes in Computer Science 398, Springer-Verlag, 1989, pp. 75-123.
