Abstract

Blended modeling is an emerging paradigm involving seamless interaction between multiple notations for the same underlying modeling language. We focus on developing textual languages in a meta-model-based model-driven engineering (MDE) setting in order to improve the blended modeling capabilities of modeling tools. In this thesis, we propose an approach that supports the co-evolution of meta-models and grammars as language engineers develop textual languages in such a setting. First, we comprehensively report on the challenges and limitations of modeling tools that support blended modeling, as well as opportunities to improve them. Second, we demonstrate how language engineers can extend Xtext's generator capabilities according to their needs. Third, we propose a semi-automatic method to transform a language with a generated grammar into a Python-style language. Finally, we provide a solution (i.e., GrammarOptimizer) that supports rapid prototyping of languages in different styles and the co-evolution of meta-models and grammars of evolving languages.
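To make the idea of grammar optimization concrete, the following is a minimal, hypothetical sketch of applying configurable rewrite rules to a generated grammar. The rule names, the example Xtext-like rule, and the `optimize` API are assumptions for illustration only, not the actual GrammarOptimizer interface from the thesis.

```python
import re

def remove_token(pattern):
    """Build a hypothetical optimization rule that deletes grammar
    tokens matching `pattern` from a generated grammar line."""
    def rule(line):
        return re.sub(pattern, "", line)
    return rule

# Illustrative rules in the spirit of grammar optimization: strip the
# literal braces that a naive meta-model-to-grammar generator emits.
RULES = [
    remove_token(r"'\{'\s*"),   # drop literal '{' keyword
    remove_token(r"\s*'\}'"),   # drop literal '}' keyword
    remove_token(r"\s*','"),    # drop separator commas
]

def optimize(grammar_lines, rules=RULES):
    """Apply each configured rule to each line of a generated grammar."""
    out = []
    for line in grammar_lines:
        for rule in rules:
            line = rule(line)
        out.append(line)
    return out

# An Xtext-style rule as a generator might emit it from a meta-model:
generated = ["State: 'state' name=ID '{' actions+=Action* '}';"]
print(optimize(generated))
# → ["State: 'state' name=ID actions+=Action*;"]
```

A real implementation would operate on the grammar's parse tree rather than raw text, but the sketch shows the shape of the approach: a library of small, composable rules that language engineers select and reapply whenever the meta-model (and hence the generated grammar) evolves.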
