
Keynote: Neuralizing Regular Expressions for Slot Filling

This paper proposes a novel approach that neuralizes regular expressions, converting them into trainable neural networks, for the slot-filling task; it was published at EMNLP 2021 [Full Paper]. Interestingly, this work has a 'prequel,' 'Cold-Start and Interpretability: Turning Regular Expressions into Trainable Recurrent Neural Networks' [Full Paper], which I encountered a year earlier and found highly insightful. At the time, I anticipated that the approach could be extended to other NLP tasks, such as named entity recognition, and indeed the research team applied a similar technique to the slot-filling problem the following year.

The inspiration I gained from this approach lies not only in transforming regular expressions into rule-driven neural networks but, more importantly, in the formal equivalence between dynamic programming and recurrent neural networks. This implies that, in principle, any pattern-matching algorithm based on dynamic programming can be converted into an equivalent or approximately equivalent trainable neural network, making it applicable to real-world requirements such as knowledge injection, interpretability, and cold-start settings.
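The correspondence between dynamic programming over a finite automaton and a recurrent update can be illustrated with a toy example. The sketch below is my own illustration, not the paper's construction: it hand-builds an automaton for the regex `ab*c` and runs matching as a state-vector recurrence, which has the same form as an (unparameterized, untrained) RNN step.

```python
import numpy as np

# Toy automaton for the regex "ab*c" over the alphabet {a, b, c}:
# state 0 (start) -a-> 1,  1 -b-> 1,  1 -c-> 2 (accepting).
# This is an illustrative sketch, not the construction used in the paper.
n_states = 3

# One transition matrix per input symbol; T[sym][i, j] = 1 iff i -sym-> j.
T = {sym: np.zeros((n_states, n_states)) for sym in "abc"}
T["a"][0, 1] = 1.0
T["b"][1, 1] = 1.0
T["c"][1, 2] = 1.0

def matches(string: str) -> bool:
    """Match via dynamic programming written as a recurrence:
    h_t = h_{t-1} @ T[x_t], exactly the shape of an RNN update."""
    h = np.zeros(n_states)
    h[0] = 1.0                 # all mass on the start state
    for sym in string:
        h = h @ T[sym]         # recurrent state update
    return bool(h[2] > 0)     # accept iff any mass reaches state 2

print(matches("abbbc"))  # True: matches ab*c
print(matches("ac"))     # True: zero b's is allowed by b*
print(matches("abca"))   # False: trailing symbol after accept
```

Replacing the hard 0/1 entries of the transition matrices with learnable weights (and the sum-product with a differentiable semiring) is what makes such a rule-initialized recurrence trainable, which is the direction the paper pursues.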

Fig.1 Cover Slide
Fig.2 Background: Regular Expression for Slot Filling
Fig.3 Background: Finite-State Transducer - 1
Fig.4 Background: Finite-State Transducer - 2
Fig.5 Background: Finite-State Transducer - 3
Fig.6 Background: Finite-State Transducer - 3'
Fig.7 Methods
Fig.8 Methods: Converting RE to FST - 1
Fig.9 Methods: Converting RE to FST - 2
Fig.10 Methods: Converting RE to FST - 3
Fig.11 Methods: Converting RE to FST - 3'
Fig.12 Methods: Converting RE to FST - 3''
Fig.13 Methods: Converting RE to FST - 3'''
Fig.14 Methods: Converting RE to FST - 3''''
Fig.15 Methods: Converting RE to FST - 3''''
Fig.16 Methods: Converting RE to FST - 4
Fig.17 Methods: Converting RE to FST - 4'
Fig.18 Methods: Inference in FST - 1
Fig.19 Methods: Inference in FST - 2
Fig.20 Methods: Inference in FST - 2'
Fig.21 Methods: Independent FST - 1
Fig.22 Methods: FST to i-FST
Fig.23 Methods: FST to i-FST - 1
Fig.24 Methods: Parameter Tensor Decomposition
Fig.25 CP Decomposition (CPD)
Fig.26 Incorporating External Word Embedding & Experiments
Fig.27 Experiments