A paper presented as a poster at ICML. Abstract:
We introduce a new generative model for code called probabilistic higher order grammar (PHOG). PHOG generalizes probabilistic context-free grammars (PCFGs) by allowing a production rule to be conditioned on more than the parent non-terminal, thus capturing rich contexts relevant to programs. Even though PHOG is more powerful than a PCFG, it can be learned from data just as efficiently. We trained a PHOG model on a large JavaScript code corpus and show that it is more precise than existing models while remaining similarly fast. As a result, PHOG can immediately benefit existing programming tools based on probabilistic models of code.
Also comes with reviews.
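To make the core idea from the abstract concrete, here is a minimal, hypothetical sketch (not the paper's implementation): a grammar whose rule distribution is keyed by a (non-terminal, context) pair. With an empty context this degenerates to a PCFG; with a richer context it behaves like a PHOG-style model. Learning is just counting observed expansions, mirroring the claim that the model can be learned as efficiently as a PCFG. The class name, context encoding, and example productions are invented for illustration.

```python
import random
from collections import defaultdict


class ConditionedGrammar:
    """Stores rule counts keyed by (non-terminal, context) and samples from them.

    This is an illustrative sketch, not the PHOG implementation from the paper.
    """

    def __init__(self):
        # (non_terminal, context) -> {production: count}
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, non_terminal, context, production):
        """Record one observed expansion; learning is just counting, as with a PCFG."""
        self.counts[(non_terminal, context)][production] += 1

    def sample(self, non_terminal, context):
        """Sample a production for `non_terminal` given the conditioning `context`."""
        rules = self.counts[(non_terminal, context)]
        total = sum(rules.values())
        if total == 0:
            raise KeyError("no productions observed for this context")
        r = random.uniform(0, total)
        acc = 0.0
        for production, count in rules.items():
            acc += count
            if r <= acc:
                return production
        return production  # guard against floating-point edge cases


# A PCFG is the special case where the conditioning context is always empty.
pcfg = ConditionedGrammar()
pcfg.observe("Expr", (), ("Expr", "+", "Expr"))
pcfg.observe("Expr", (), ("Ident",))

# A PHOG-like model conditions on extra program context, e.g. a nearby identifier
# (the concrete context features here are invented for illustration).
phog = ConditionedGrammar()
phog.observe("Expr", ("prev_ident=length",), ("Ident",))
phog.observe("Expr", ("prev_ident=i",), ("Expr", "+", "Expr"))

print(pcfg.sample("Expr", ()))
print(phog.sample("Expr", ("prev_ident=i",)))
```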