Mapping Probability Word Problems to Executable Representations
Jan 1, 2021
Simon Suster
Pieter Fivez
Pietro Totis
Angelika Kimmig
Jesse Davis
Luc De Raedt
Walter Daelemans
Abstract
While solving math word problems automatically has received considerable attention in the NLP community, few works have addressed probability word problems specifically. In this paper, we employ and analyse various neural models for answering such word problems. In a two-step approach, the problem text is first mapped to a formal representation in a declarative language using a sequence-to-sequence model, and then the resulting representation is executed using a probabilistic programming system to provide the answer. Our best-performing model incorporates general-domain contextualised word representations that were fine-tuned using transfer learning on another in-domain dataset. We also apply end-to-end models to this task, which bring out the importance of the two-step approach in obtaining correct solutions to probability problems.
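To make the second (execution) step concrete, the sketch below shows how a toy probability problem could be encoded in a probabilistic programming system and solved by inference rather than by the neural model itself. It uses ProbLog through its Python API; the coin-toss problem and its encoding are illustrative assumptions, not the declarative representation language described in the paper.

from problog.program import PrologString
from problog import get_evaluatable

# Toy problem (illustrative): two fair coins are tossed independently;
# what is the probability that both land heads?
model = PrologString("""
0.5::heads(c1).
0.5::heads(c2).
two_heads :- heads(c1), heads(c2).
query(two_heads).
""")

# Compile the probabilistic program and evaluate the query.
result = get_evaluatable().create_from(model).evaluate()
for query, probability in result.items():
    print(query, probability)  # expected: two_heads 0.25

In the two-step approach, a sequence-to-sequence model would produce a formal representation like the program above from the problem text, and the probabilistic execution then guarantees that the final probability is computed exactly, which is what end-to-end models struggle to do.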
Publication
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021, Virtual Event / Punta Cana, Dominican Republic, 7-11 November, 2021