A paper presented as a poster at ICML. Abstract:
We introduce a new generative model for code called probabilistic higher-order grammar (PHOG). PHOG generalizes probabilistic context-free grammars (PCFGs) by allowing conditioning of a production rule beyond the parent non-terminal, thus capturing rich contexts relevant to programs. Even though PHOG is more powerful than a PCFG, it can be learned from data just as efficiently. We trained a PHOG model on a large JavaScript code corpus and show that it is more precise than existing models, while similarly fast. As a result, PHOG can immediately benefit existing programming tools based on probabilistic models of code.
The paper also comes with its reviews.
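To make the core idea concrete, here is a minimal, hypothetical sketch (in Python, not from the paper) of the difference between a PCFG and a PHOG-style model: both estimate production-rule probabilities from corpus counts, but the PHOG-style estimate is additionally keyed on a context extracted from the partial AST. The class name, method names, and the example contexts below are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict

# Hypothetical illustration: a PCFG scores a production rule given only the
# parent non-terminal, while a PHOG-style model additionally conditions on a
# context derived from the surrounding program (e.g. nearby node types or
# identifiers reached by a learned traversal of the partial AST).

class ConditionedGrammar:
    def __init__(self):
        # counts[(nonterminal, context)][rule] -> occurrences in the corpus
        self.counts = defaultdict(lambda: defaultdict(int))
        self.totals = defaultdict(int)

    def observe(self, nonterminal, context, rule):
        """Record one expansion observed in the training corpus."""
        self.counts[(nonterminal, context)][rule] += 1
        self.totals[(nonterminal, context)] += 1

    def prob(self, nonterminal, context, rule, alpha=1.0, vocab=1000):
        """Additively smoothed estimate of P(rule | nonterminal, context)."""
        key = (nonterminal, context)
        return (self.counts[key][rule] + alpha) / (self.totals[key] + alpha * vocab)

# A plain PCFG is the special case where the conditioning context is empty:
pcfg = ConditionedGrammar()
pcfg.observe("Expr", (), "Expr -> Ident")

# A PHOG-style model keys the same distribution on richer program context
# (here an invented example: the enclosing call expression), which is what
# makes its predictions more precise than a PCFG's:
phog = ConditionedGrammar()
phog.observe("Expr", ("CallExpr", "console.log"), "Expr -> StringLiteral")
```

The point of the sketch is only that learning stays as simple as for a PCFG (counting and normalizing), while the distribution being estimated is conditioned on far more informative context.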