Multilevel Modelling and Domain-Specific Languages (1910.03313v2)

Published 8 Oct 2019 in cs.SE

Abstract: Modern software engineering deals with demanding problems that yield large and complex software. The area of Model-Driven Software Engineering tackles this issue by using models during the development process, but it does not address some of the communication problems among different stakeholders. Domain-Specific Modelling Languages (DSML) aim at involving domain experts with non-technical profiles in that process. DSMLs define concepts with different levels of abstraction, but traditional modelling does not allow enough flexibility to organise them adequately. Multilevel Modelling (MLM) approaches provide an unbounded number of levels of abstraction, plus other features that perfectly fit DSMLs. Their development can also benefit from Model Transformations (MT), especially when these encode the behaviour of DSMLs. MTs can be exploited by MLM, becoming a precise and reusable definition of behaviour. This thesis presents an MLM and Multilevel MT approach that tackles open issues in the field and compares it with the state of the art through literature review and experiments, providing its formalisation and its implementation in the tool MultEcore, together with case studies.

Citations (11)

Authors (1)