Polymonadic Programming (1406.2060v1)

Published 9 Jun 2014 in cs.PL

Abstract: Monads are a popular tool for the working functional programmer to structure effectful computations. This paper presents polymonads, a generalization of monads. Polymonads give the familiar monadic bind the more general type forall a,b. L a -> (a -> M b) -> N b, to compose computations with three different kinds of effects, rather than just one. Polymonads subsume monads and parameterized monads, and can express other constructions, including precise type-and-effect systems and information flow tracking; more generally, polymonads correspond to Tate's productoid semantic model. We show how to equip a core language (called lambda-PM) with syntactic support for programming with polymonads. Type inference and elaboration in lambda-PM allows programmers to write polymonadic code directly in an ML-like syntax--our algorithms compute principal types and produce elaborated programs wherein the binds appear explicitly. Furthermore, we prove that the elaboration is coherent: no matter which (type-correct) binds are chosen, the elaborated program's semantics will be the same. Pleasingly, the inferred types are easy to read: the polymonad laws justify (sometimes dramatic) simplifications, but with no effect on a type's generality.
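
To make the generalized bind concrete, here is a minimal Haskell sketch of the idea (the class name PolyBind, the Identity wrapper, and the instances below are illustrative assumptions, not the paper's lambda-PM language or its elaboration algorithm):

{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-}

-- A polymonadic bind relates three type constructors l, m, n:
-- it consumes an l-computation and an (a -> m b) continuation and
-- yields an n-computation, mirroring the type
-- forall a,b. L a -> (a -> M b) -> N b from the abstract.
class PolyBind l m n where
  pbind :: l a -> (a -> m b) -> n b

-- Identity wraps pure values, letting pure code participate in
-- polymonadic composition.
newtype Identity a = Identity { runIdentity :: a }

-- Ordinary monads are recovered as the "diagonal" case L = M = N.
instance PolyBind Maybe Maybe Maybe where
  pbind Nothing  _ = Nothing
  pbind (Just x) k = k x

-- A mixed instance: a pure Identity computation feeds a Maybe-producing
-- continuation, and the composite is a Maybe computation.
instance PolyBind Identity Maybe Maybe where
  pbind (Identity x) k = k x

-- Example use: evaluates to Just 4.
example :: Maybe Int
example = pbind (Identity 3) (\x -> Just (x + 1))

In the paper, such binds are not written by hand: type inference and elaboration insert them automatically, and the coherence theorem guarantees that any type-correct choice of binds yields the same program semantics.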

Citations (19)