
Optimization Modulo the Theories of Signed Bit-Vectors and Floating-Point Numbers (1905.02838v1)

Published 7 May 2019 in cs.LO

Abstract: Optimization Modulo Theories (OMT) is an important extension of SMT which allows for finding models that optimize given objective functions, typically consisting of linear-arithmetic or pseudo-Boolean terms. However, many SMT and OMT applications, in particular from SW and HW verification, require handling bit-precise representations of numbers, which in SMT are handled by means of the theory of Bit-Vectors (BV) for the integers and that of Floating-Point Numbers (FP) for the reals, respectively. Whereas an approach for OMT with (unsigned) BV objectives has been proposed by Nadel & Ryvchin, we are not aware of any existing approach for OMT with FP objectives. In this paper we fill this gap. We present a novel OMT approach, based on the novel concepts of attractor and dynamic attractor, which extends the work of Nadel & Ryvchin to signed-BV objectives and, most importantly, to FP objectives. We have implemented some OMT(BV) and OMT(FP) procedures on top of OptiMathSAT and tested the OMT(FP) ones on modified problems from the SMT-LIB repository. The empirical results support the validity and feasibility of the novel approach.
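To make the kind of query the abstract describes concrete, below is a minimal sketch of an OMT(BV) problem written against the Z3 Python bindings (z3-solver). This uses Z3's built-in νZ optimizer rather than OptiMathSAT, and it does not implement the attractor-based procedure from the paper; to the best of my knowledge Z3 treats bit-vector objectives as unsigned, which is exactly the setting the paper extends to signed BV and to FP. The variables, bounds, and constraint are illustrative only.

```python
# Minimal OMT(BV) sketch using Z3's optimizer (illustrative, not the paper's method).
from z3 import BitVec, Optimize, UGE, ULE, sat

x = BitVec('x', 8)   # 8-bit bit-vector variable
y = BitVec('y', 8)

opt = Optimize()
opt.add(UGE(x, 10), ULE(x, 200))   # unsigned bounds on x
opt.add(y == x ^ 0x5A)             # bit-level constraint linking x and y

h = opt.maximize(y)                # objective: maximize the bit-vector term y
if opt.check() == sat:
    print("optimum:", opt.upper(h))  # optimum value of the objective
    print("model:", opt.model())     # a model attaining it
```

An OMT(FP) query has the same shape, with FP-sorted variables and a floating-point term as the objective; handling such objectives is the main contribution of the paper.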

Citations (3)


