CAPSTONE PROJECTS

Neural Models of Supertagging for Semantic Role Labeling and Beyond

Interactions

Student Team: Anusha Gouravaram, Diji Yang, and Julian Jakob Cremer

Project Mentor: Dr. John Chen, Interactions

Transformer-based deep neural network models have recently shown great success on a variety of NLP tasks, including sequence labeling. One sequence labeling task to which Transformers have not yet been applied is supertagging, which we investigate here. Supertagging, an "almost parsing" task, has previously been shown to help the task of semantic role labeling (SRL); we therefore also investigate how Transformer-based supertagging models can improve SRL models. Finally, supertagging has typically been framed as modeling syntactic structure alone. In contrast, we propose a new Dual Semantic and Syntactic (DSS) supertag grammar that also incorporates semantic information. We combine the DSS grammar with Transformer-based modeling in a three-step pipeline that labels semantic roles given raw input sentences. Our experiments demonstrate that this approach is helpful for semantic parsing: it achieves competitive results on the CoNLL 2009 shared task.
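A pipeline of this shape can be sketched structurally as follows. This is a minimal illustration only: the step boundaries, function names, tag labels, and the toy heuristics standing in for the neural models are all assumptions, not the project's actual components.

```python
# Illustrative three-step pipeline from raw tokens to semantic roles.
# Each step would be a learned (e.g. Transformer-based) model in a real
# system; here simple placeholder rules show the data flow only.

def supertag(tokens):
    # Step 1: assign each token a supertag carrying syntactic and
    # semantic information (toy rule standing in for a sequence labeler).
    return ["PRED" if t.endswith("s") else "ARG" for t in tokens]

def identify_predicates(tokens, supertags):
    # Step 2: read predicate positions off the supertag sequence.
    return [i for i, tag in enumerate(supertags) if tag == "PRED"]

def label_roles(tokens, supertags, predicates):
    # Step 3: attach a semantic role to each argument token, per predicate
    # (toy heuristic: tokens left of the predicate get A0, right get A1).
    roles = {}
    for p in predicates:
        roles[tokens[p]] = {t: ("A0" if i < p else "A1")
                            for i, t in enumerate(tokens) if i != p}
    return roles

sentence = "the cat chases mice".split()
tags = supertag(sentence)
preds = identify_predicates(sentence, tags)
roles = label_roles(sentence, tags, preds)
print(roles)  # {'chases': {'the': 'A0', 'cat': 'A0', 'mice': 'A1'}}
```

The point of the sketch is the interface between the steps: the supertag sequence produced in step 1 is the only input step 2 needs, which is what makes supertagging "almost parsing" useful as a front end to SRL.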