
esslli2019-graph-grammars

Materials for the course Graph Grammars for Natural Language Processing at ESSLLI 2019.

Course Info

This is the webpage for the course Graph Grammars for Natural Language Processing, to be held at ESSLLI 2019 at the University of Latvia in Riga. This page will be updated throughout the course.

Course Description

Grammar formalisms that describe languages of strings or trees (context-free grammars, regular tree grammars, tree-adjoining grammars, etc.) are well established in Natural Language Processing. Formalisms that operate on graphs are discussed less frequently, even though graphs offer an intuitive representation for natural language syntax and semantics. This course offers an introduction to graph grammars and their use in NLP. We start by examining graph-based representations in NLP, such as Abstract Meaning Representation (AMR). We then discuss context-free graph grammars, focusing on Hyperedge Replacement Grammars (HRG) and synchronous HRG. The course covers theoretical properties of graph grammars, including their relationship to other formalisms in NLP, as well as practical applications of graph grammars to natural language semantics. It emphasizes synchronous grammars for string-to-graph translation, which we use for language understanding, and presents algorithms for learning grammars from annotated corpora (graphbanks) and for statistical semantic parsing.

Schedule and Materials

The schedule is subject to change.

Day 1, Monday 8/12

Introduction and Overview. Graph-based representations in Natural Language Processing: Dependency structure with reentrancy, Abstract Meaning Representation (AMR), and others. Review of graph theoretic concepts and graph properties.
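As a small concrete preview (not part of the official course materials), the sketch below encodes an AMR-style analysis of "The boy wants to go" as Python triples; the triple encoding and the frame sense numbers are illustrative assumptions. The node for the boy is an argument of both *want* and *go*, which is exactly the kind of reentrancy that makes the structure a graph rather than a tree.

```python
# AMR-style graph for "The boy wants to go" as subject-relation-object triples
# (encoding and frame sense numbers are approximate, for illustration only).
amr = [
    ("w", "instance", "want-01"),
    ("b", "instance", "boy"),
    ("g", "instance", "go-01"),
    ("w", "ARG0", "b"),   # the boy is the wanter
    ("w", "ARG1", "g"),   # what is wanted: the going
    ("g", "ARG0", "b"),   # the boy is also the goer -> node "b" is reentrant
]

# Nodes that are the target of more than one non-instance relation are reentrant.
targets = [o for _, r, o in amr if r != "instance"]
print({node for node in targets if targets.count(node) > 1})  # {'b'}
```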

Day 2, Tuesday 8/13

Review of formal grammars and automata. Context-free graph grammars. Hyperedge Replacement Grammars (HRG). HRG languages and language hierarchy. Parsing algorithms for HRG.
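The core HRG operation, replacing a nonterminal hyperedge by a copy of a rule's right-hand side and fusing its external nodes with the edge's attachment nodes, can be sketched in a few lines of Python. The edge-list encoding and the toy chain grammar below are illustrative assumptions, not code from the course.

```python
from itertools import count

# Hypergraphs are lists of hyperedges (label, attachment_nodes); a rule maps a
# nonterminal label to (rhs_nodes, rhs_edges, external_nodes).
fresh = count()  # supplies fresh names for internal RHS nodes on each copy

def replace(host_edges, edge_index, rules):
    """One hyperedge replacement step: delete the nonterminal hyperedge at
    edge_index and glue in a fresh copy of its rule's right-hand side, fusing
    external node i of the RHS with attachment node i of the deleted edge."""
    label, attach = host_edges[edge_index]
    rhs_nodes, rhs_edges, external = rules[label]
    assert len(external) == len(attach), "rank mismatch"
    rename = dict(zip(external, attach))      # fuse externals with attachments
    for node in rhs_nodes:                    # every other RHS node gets a fresh name
        rename.setdefault(node, f"n{next(fresh)}")
    copied = [(lab, tuple(rename[n] for n in att)) for lab, att in rhs_edges]
    return host_edges[:edge_index] + copied + host_edges[edge_index + 1:]

# Toy grammar generating chains of "a"-edges:
#   S(x, y) -> an "a" edge from x to a new node z, followed by S(z, y)
rules = {"S": (["x", "z", "y"],
               [("a", ("x", "z")), ("S", ("z", "y"))],
               ("x", "y"))}

graph = [("S", ("start", "end"))]   # start graph: a single nonterminal hyperedge
graph = replace(graph, 0, rules)    # expand the S edge once
print(graph)  # [('a', ('start', 'n0')), ('S', ('n0', 'end'))]
```

Calling `replace` again on the new S edge grows the chain; a second rule whose right-hand side contains only terminal edges would terminate the derivation.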

Day 3, Wednesday 8/14

Algebraic perspective on graph rewriting. Variations of graph grammars, including node replacement. Applications to natural language syntax. Relationship to LCFRS and Tree Adjoining Grammars.

Day 4, Thursday 8/15

Synchronous Hyperedge Replacement Grammars (SHRG). Applications to natural language understanding, generation, and semantic decomposition. Comparison with other approaches to semantic construction (feature unification, lambda calculus). Lexicalized SHRG.
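The string-to-graph idea can be previewed with a toy synchronous derivation; the sentence, the rule notation, and all names below are invented for illustration and are not taken from the course materials.

```python
# In a synchronous HRG derivation, each rule rewrites a linked pair of
# nonterminals: one in the string, one as a hyperedge in the graph.
# Toy derivation of "the boy sleeps" paired with an AMR-like graph:
#
#   step 0:  string: S_1                 graph: S_1(r)
#   step 1:  S_1 -> "the boy" VP_2  |||  S_1(r) -> VP_2(r, b), b instance-of boy
#   step 2:  VP_2 -> "sleeps"       |||  VP_2(r, b) -> r instance-of sleep-01, r ARG0 b
#
# The final paired outputs of that derivation:
derived_string = ["the", "boy", "sleeps"]
derived_graph = [("r", "instance", "sleep-01"),
                 ("r", "ARG0", "b"),
                 ("b", "instance", "boy")]
print(" ".join(derived_string), "<->", derived_graph)
```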

Day 5, Friday 8/16

Learning SHRG rules from annotated corpora. String-to-graph alignments. Algorithms for statistical semantic parsing. Overview of software tools. Future directions.