Publications (3 of 3)
Lundén, D., Hummelgren, L., Kudlicka, J., Eriksson, O. & Broman, D. (2024). Suspension Analysis and Selective Continuation-Passing Style for Universal Probabilistic Programming Languages. In: Programming Languages and Systems - 33rd European Symposium on Programming, ESOP 2024, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2024, Proceedings. Paper presented at 33rd European Symposium on Programming, ESOP 2024, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2024, Luxembourg City, Luxembourg, Apr 6 2024 - Apr 11 2024 (pp. 302-330). Springer Nature, 14577
Suspension Analysis and Selective Continuation-Passing Style for Universal Probabilistic Programming Languages
2024 (English) In: Programming Languages and Systems - 33rd European Symposium on Programming, ESOP 2024, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2024, Proceedings, Springer Nature, 2024, Vol. 14577, p. 302-330. Conference paper, Published paper (Refereed)
Abstract [en]

Universal probabilistic programming languages (PPLs) make it relatively easy to encode and automatically solve statistical inference problems. To solve inference problems, PPL implementations often apply Monte Carlo inference algorithms that rely on execution suspension. State-of-the-art solutions enable execution suspension either through (i) continuation-passing style (CPS) transformations or (ii) efficient, but comparatively complex, low-level solutions that are often not available in high-level languages. CPS transformations introduce overhead due to unnecessary closure allocations—a problem the PPL community has generally overlooked. To reduce overhead, we develop a new efficient selective CPS approach for PPLs. Specifically, we design a novel static suspension analysis technique that determines parts of programs that require suspension, given a particular inference algorithm. The analysis allows selectively CPS transforming the program only where necessary. We formally prove the correctness of the analysis and implement the analysis and transformation in the Miking CorePPL compiler. We evaluate the implementation for a large number of Monte Carlo inference algorithms on real-world models from phylogenetics, epidemiology, and topic modeling. The evaluation results demonstrate significant improvements across all models and inference algorithms.
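To make the closure-allocation overhead concrete, here is a minimal Python sketch (a simplified illustration, not the paper's Miking CorePPL implementation) contrasting a direct-style probabilistic function with a fully CPS-transformed version; sample_bernoulli and its CPS counterpart are hypothetical stand-ins for a PPL's sampling primitive.

import random

# Direct style: every `sample` is a point where inference may need to
# pause and later resume execution.
def sample_bernoulli(p):          # hypothetical sampling primitive
    return random.random() < p

def geometric(p):
    # Number of Bernoulli(p) trials until the first success.
    if sample_bernoulli(p):
        return 1
    return 1 + geometric(p)

# Full CPS: every function takes an explicit continuation `k`, so an
# inference algorithm can capture `k` at a sample site and resume it
# later. The cost is a closure allocation at every call, even in code
# that can never suspend.
def sample_bernoulli_cps(p, k):
    return k(random.random() < p)

def geometric_cps(p, k):
    def after_sample(b):
        return k(1) if b else geometric_cps(p, lambda n: k(1 + n))
    return sample_bernoulli_cps(p, after_sample)

# Selective CPS, as described in the abstract, transforms only the
# functions that the suspension analysis marks as possibly reaching a
# suspension point; all other code keeps the cheaper direct style.
print(geometric(0.5), geometric_cps(0.5, lambda n: n))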

Place, publisher, year, edition, pages
Springer Nature, 2024
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 14577
Keywords
Continuation-passing style, Probabilistic programming, Static analysis
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-346543 (URN), 10.1007/978-3-031-57267-8_12 (DOI), 001281762600012 (ISI), 2-s2.0-85192176283 (Scopus ID)
Conference
33rd European Symposium on Programming, ESOP 2024, held as part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2024, Luxembourg City, Luxembourg, Apr 6 2024 - Apr 11 2024
Note

QC 20240521

Part of ISBN 978-3-031-57266-1

Available from: 2024-05-16 Created: 2024-05-16 Last updated: 2024-09-12. Bibliographically approved
Hummelgren, L., Palmkvist, V., Stjerna, L., Xu, X., Jaldén, J. & Broman, D. (2024). Trellis: A Domain-Specific Language for Hidden Markov Models with Sparse Transitions. In: Laemmel, R., Pereira, J. A. & Mosses, P. D. (Eds.), Proceedings of the 17th ACM SIGPLAN International Conference on Software Language Engineering, SLE 2024. Paper presented at 17th ACM SIGPLAN International Conference on Software Language Engineering (SLE), Oct 20-21, 2024, Pasadena, CA (pp. 196-209). Association for Computing Machinery (ACM)
Trellis: A Domain-Specific Language for Hidden Markov Models with Sparse Transitions
2024 (English) In: Proceedings of the 17th ACM SIGPLAN International Conference on Software Language Engineering, SLE 2024 / [ed] Laemmel, R., Pereira, J. A. & Mosses, P. D., Association for Computing Machinery (ACM), 2024, p. 196-209. Conference paper, Published paper (Refereed)
Abstract [en]

Hidden Markov models (HMMs) are frequently used in areas such as speech recognition and bioinformatics. However, implementing HMM algorithms correctly and efficiently is time-consuming and error-prone. Specifically, using model-specific knowledge to improve performance, such as sparsity in the transition probability matrix, ties the implementation to a particular model, making it harder to modify. Previous work has introduced high-level frameworks for defining HMMs, thus lifting the burden of efficiently implementing HMM algorithms from the user. However, existing tools are ill-suited for sparse HMMs with many states. This paper introduces Trellis, a domain-specific language for succinctly defining sparse HMMs that use GPU acceleration to achieve high performance. We show that Trellis outperforms previous work and is on par with a hand-written CUDA kernel implementation for a particular sparse HMM.
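As a rough illustration of why sparse transitions matter (a minimal Python sketch under assumed data structures, not the GPU code Trellis generates), the forward algorithm below stores, for each state, only its possible predecessors, so each time step costs on the order of the number of nonzero transitions rather than the square of the number of states.

import math

def logsumexp(xs):
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_sparse(obs, log_init, log_emit, preds):
    # preds[s] lists (prev_state, log_transition_prob) pairs, i.e. only
    # the nonzero entries of the transition matrix column for state s.
    alpha = {s: lp + log_emit(s, obs[0]) for s, lp in log_init.items()}
    for o in obs[1:]:
        new_alpha = {}
        for s, incoming in preds.items():
            terms = [alpha[p] + lt for p, lt in incoming if p in alpha]
            if terms:
                new_alpha[s] = log_emit(s, o) + logsumexp(terms)
        alpha = new_alpha
    return logsumexp(list(alpha.values()))   # log-likelihood of obs

# Tiny two-state example with an explicit sparse transition structure.
log_emit = lambda s, o: math.log(0.9 if s == o else 0.1)
preds = {0: [(0, math.log(0.7)), (1, math.log(0.4))],
         1: [(0, math.log(0.3)), (1, math.log(0.6))]}
print(forward_sparse([0, 0, 1], {0: math.log(0.5), 1: math.log(0.5)},
                     log_emit, preds))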

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2024
Keywords
Hidden Markov models, DSL, parallelization, GPU acceleration
National Category
Probability Theory and Statistics
Identifiers
urn:nbn:se:kth:diva-357515 (URN), 10.1145/3687997.3695641 (DOI), 001344239100017 (ISI), 2-s2.0-85210805499 (Scopus ID)
Conference
17th ACM SIGPLAN International Conference on Software Language Engineering (SLE), Oct 20-21, 2024, Pasadena, CA
Note

Part of ISBN 979-8-4007-1180-0

QC 20241209

Available from: 2024-12-09 Created: 2024-12-09 Last updated: 2025-05-27. Bibliographically approved
Lundén, D., Hummelgren, L., Kudlicka, J., Eriksson, O. & Broman, D. Suspension Analysis and Selective Continuation-Passing Style for Higher-Order Probabilistic Programming Languages.
Suspension Analysis and Selective Continuation-Passing Style for Higher-Order Probabilistic Programming Languages
(English) Manuscript (preprint) (Other academic)
Abstract [en]

Probabilistic programming languages (PPLs) make encoding and automatically solving statistical inference problems relatively easy by separating models from the inference algorithm. A popular choice for solving inference problems is to use Monte Carlo inference algorithms. For higher-order functional PPLs, these inference algorithms rely on execution suspension to perform inference, most often enabled through a full continuation-passing style (CPS) transformation. However, standard CPS transformations for PPL compilers introduce significant overhead, a problem the community has generally overlooked. State-of-the-art solutions either perform complete CPS transformations with performance penalties due to unnecessary closure allocations or use efficient, but complex, low-level solutions that are often not available in high-level languages. In contrast to prior work, we develop a new approach that is both efficient and easy to implement using higher-order languages. Specifically, we design a novel static suspension analysis technique that determines the parts of a program that require suspension, given a particular inference algorithm. The analysis result allows selectively CPS transforming the program only where necessary. We formally prove the correctness of the suspension analysis and implement both the suspension analysis and selective CPS transformation in the Miking CorePPL compiler. We evaluate the implementation for a large number of Monte Carlo inference algorithms on real-world models from phylogenetics, epidemiology, and topic modeling. The evaluation results demonstrate significant improvements across all models and inference algorithms.
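The suspension analysis itself can be pictured as a reachability fixpoint over the program's call structure. The Python sketch below is a heavily simplified illustration of that idea; the call-graph encoding and function names are hypothetical, and the paper's analysis is defined formally over a higher-order language rather than a first-order call graph.

def functions_needing_cps(calls, suspension_points):
    # A function needs the CPS transformation if it can transitively
    # reach a suspension point (e.g. `resample`) of the chosen
    # inference algorithm; everything else stays in direct style.
    needs = set(suspension_points)
    changed = True
    while changed:                      # fixpoint over the call graph
        changed = False
        for f, callees in calls.items():
            if f not in needs and needs & set(callees):
                needs.add(f)
                changed = True
    return needs

# Hypothetical call graph: only `model` and `step` can reach
# `resample`, so `distance` keeps its closure-free direct-style code.
calls = {"model": ["step", "distance"],
         "step": ["resample"],
         "distance": ["sqrt"]}
print(functions_needing_cps(calls, {"resample"}))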

National Category
Computer Sciences
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-324295 (URN)
Note

QC 20230227

Available from: 2023-02-25 Created: 2023-02-25 Last updated: 2023-03-02. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0009-0003-9325-8405
