
Monitorability of $ω$-regular languages (1006.3638v1)

Published 18 Jun 2010 in cs.FL

Abstract: Arguably, $\omega$-regular languages play an important role as a specification formalism in many approaches to system monitoring via runtime verification. However, since their elements are infinite words, not every $\omega$-regular language can sensibly be monitored at runtime when only a finite prefix of a word, modelling the system behaviour observed so far, is available. An $\omega$-regular language $L$ is therefore called monitorable if, for any finite word $u$ observed so far, it is possible to append another finite word $v$ such that $uv$ becomes a "finite witness" with respect to $L$; that is, either $uvw \in L$ for every infinite word $w$, or $uvw \notin L$ for every infinite word $w$. This notion has been studied by several authors, and it is known that the class of monitorable languages is strictly more expressive than, e.g., the commonly used class of so-called safety languages. But an exact categorisation of monitorable languages has, so far, been missing. Motivated by the use of linear-time temporal logic (LTL) in many approaches to runtime verification, this paper first determines the complexity of the monitorability problem when $L$ is given by an LTL formula. It then shows that this result in fact transfers to $\omega$-regular languages in general, i.e., regardless of whether they are given by an LTL formula, a nondeterministic Büchi automaton, or an $\omega$-regular expression.
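
Stated formally (a sketch following the abstract's wording, with a finite alphabet $\Sigma$ assumed as usual), monitorability of $L \subseteq \Sigma^{\omega}$ amounts to:

$$\forall u \in \Sigma^{*}\; \exists v \in \Sigma^{*}:\; \big(\forall w \in \Sigma^{\omega}.\ uvw \in L\big) \;\lor\; \big(\forall w \in \Sigma^{\omega}.\ uvw \notin L\big).$$

In other words, every observed prefix $u$ can be extended to a prefix $uv$ that is either a good prefix (all infinite continuations lie in $L$) or a bad prefix (none do).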

Citations (11)

Authors (1)
