
Well-Behaved Model Transformations with Model Subtyping (1703.08113v1)

Published 23 Mar 2017 in cs.PL

Abstract: In model-driven engineering, models abstract the relevant features of software artefacts, and model transformations act on them, automating complex tasks of the development process. It is thus crucially important to provide pragmatic, reliable methods for verifying that model transformations guarantee the correctness of generated models, in order to ensure the quality of the final end product. In this paper, we build on an object-oriented algebraic encoding of metamodels and models, as defined in the standard Meta-Object Facility and in tools such as the Eclipse Modeling Framework, to specify a domain-specific language for representing the action part of model transformations. We introduce the big-step operational structural semantics of this language and its type system, which includes a notion of polymorphic model subtyping, showing that well-typed model transformations are well behaved: metamodel-conformant model transformations never go wrong.
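The core idea of model subtyping in the abstract can be illustrated with a toy sketch. This is not the paper's formalism (which is algebraic and operates over MOF/EMF encodings); it is a hypothetical, simplified reading in which a metamodel is a map from class names to required attribute sets, and one metamodel is a subtype of another if it structurally provides everything the supertype requires. A transformation type-checked against the supertype can then safely run on models conforming to any subtype.

```python
# Hypothetical sketch of structural model subtyping; the metamodel
# representation and the subtyping rule below are illustrative
# assumptions, not the paper's actual definitions.
from dataclasses import dataclass


@dataclass(frozen=True)
class Metamodel:
    # Maps each class name to the set of attribute names it must provide.
    classes: dict


def is_subtype(sub: Metamodel, sup: Metamodel) -> bool:
    """sub <: sup iff every class required by sup exists in sub
    with at least the same attributes (structural conformance)."""
    return all(
        cls in sub.classes and attrs <= sub.classes[cls]
        for cls, attrs in sup.classes.items()
    )


# A transformation typed against `base` accepts models of any subtype
# metamodel, e.g. one extended with new classes and attributes.
base = Metamodel({"Class": {"name"}})
extended = Metamodel({"Class": {"name", "stereotype"}, "Package": {"name"}})

print(is_subtype(extended, base))  # the extension is usable where base is expected
print(is_subtype(base, extended))  # but not the other way around
```

Under this reading, "well-typed transformations never go wrong" is the usual type-safety slogan: if the type system accepts a transformation against a metamodel, running it on any conformant model (including models of subtype metamodels) cannot fail due to a missing class or attribute.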

Citations (1)


Authors (1)
