CiRA: An Open-Source Python Package for Automated Generation of Test Case Descriptions from Natural Language Requirements (2310.08234v1)
Abstract: Deriving acceptance tests from high-level, natural language requirements that achieve full coverage is a major manual challenge at the interface between requirements engineering and testing. Conditional requirements (e.g., "If A or B then C.") imply causal relationships which, when extracted, allow these acceptance tests to be generated automatically. This paper presents a tool from the CiRA (Causality In Requirements Artifacts) initiative, which automatically processes conditional natural language requirements and generates a minimal set of test case descriptions achieving full coverage. We evaluate the tool on a publicly available data set of 61 requirements from the requirements specification of the German Corona-Warn-App. The tool infers the correct test variables in 84.5% and the correct variable configurations in 92.3% of all cases, which corroborates the feasibility of our approach.
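To illustrate the underlying idea of deriving a minimal set of test case descriptions from a conditional requirement such as "If A or B then C.", the following Python sketch enumerates configurations of the cause variables and keeps, for each cause, one pair of configurations that differ only in that cause and flip the effect. This is a hedged, simplified illustration of unique-cause coverage; the function names and the brute-force evaluation are assumptions for this example and do not reflect the actual CiRA package API or algorithm.

```python
from itertools import product

def minimal_test_set(causes, formula):
    """Sketch: derive a small set of test configurations for a conditional requirement.

    `causes` are variable names, `formula` a boolean expression over them
    (e.g. "A or B" for the requirement "If A or B then C."). For each cause
    we pick one pair of configurations that differ only in that cause and
    change the effect, then deduplicate. Illustrative only; not the CiRA
    implementation.
    """
    def effect(env):
        # Evaluate the cause formula under one variable assignment.
        # A real tool would parse the formula instead of using eval.
        return eval(formula, {}, dict(env))

    assignments = [dict(zip(causes, vals))
                   for vals in product([False, True], repeat=len(causes))]
    selected = []
    for cause in causes:
        for env in assignments:
            flipped = dict(env, **{cause: not env[cause]})
            if effect(env) != effect(flipped):
                for candidate in (env, flipped):
                    if candidate not in selected:
                        selected.append(candidate)
                break
    return [(env, effect(env)) for env in selected]

# "If A or B then C" yields three test cases instead of the full truth table:
# {A: False, B: False} => not C, {A: True, B: False} => C, {A: False, B: True} => C
for config, c in minimal_test_set(["A", "B"], "A or B"):
    print(config, "=> C" if c else "=> not C")
```

In this toy example, the three selected configurations already cover every cause's individual influence on the effect, which is why fewer test cases than the full truth table suffice for full coverage.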