SLU paper
1 Introduction
This repository collects readings and code implementations of papers on Spoken Language Understanding (SLU).
The papers to be read are:
[Joint Modeling without Explicit Interaction]
- Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling
- BERT for Joint Intent Classification and Slot Filling

[Joint Modeling with Intent-to-Slot Interaction]
- A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding
- Slot-Gated Modeling for Joint Slot Filling and Intent Prediction
- A Self-Attention Joint Model for Spoken Language Understanding in Situational Dialog Applications
- A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding

[Joint Modeling with Bidirectional Interaction]
- A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling
- Joint Slot Filling and Intent Detection via Capsule Neural Networks
- Joint Intent Detection and Slot Filling Based on Continual Learning Model
- Focus on Interaction: A Novel Dynamic Graph Model for Joint Multiple Intent Detection and Slot Filling
- Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling
- A Two-Stage Selective Fusion Framework for Joint Intent Detection and Slot Filling
- Joint Intent Detection and Slot Filling with Wheel-Graph Attention Networks
- A Graph Attention Interactive Refine Framework with Contextual Regularization for Joint Intent Detection and Slot Filling
- A Joint Model Based on Interactive Gate Mechanism for Spoken Language Understanding
- Learning High-Order Semantic Representation for Intent Classification and Slot Filling on Low-Resource Language via Hypergraph
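All of the papers above address the same joint task: predicting one utterance-level intent label together with per-token slot labels (typically in a BIO scheme). The toy, rule-based sketch below only illustrates this input/output format; the city list, the `atis_flight` intent, and the `fromloc`/`toloc` slot names are hypothetical ATIS-style labels, and the hand-written rules stand in for what the models in these papers learn.

```python
def predict(tokens):
    """Toy joint SLU prediction: one intent label plus BIO slot tags per token.

    Purely illustrative rules; real models replace these with a learned
    encoder feeding an intent classifier and a slot tagger.
    """
    cities = {"boston", "denver"}  # hypothetical gazetteer
    intent = "atis_flight" if "flights" in tokens else "unknown"
    slots = []
    for i, tok in enumerate(tokens):
        if tok in cities:
            # Slot role depends on the preceding preposition.
            role = "fromloc" if i > 0 and tokens[i - 1] == "from" else "toloc"
            slots.append(f"B-{role}.city_name")
        else:
            slots.append("O")
    return intent, slots

intent, slots = predict("show flights from boston to denver".split())
print(intent)  # atis_flight
print(slots)   # ['O', 'O', 'O', 'B-fromloc.city_name', 'O', 'B-toloc.city_name']
```

The point of the joint formulation, and of the three interaction families above, is that the two outputs share one representation: intent evidence can guide slot tagging, slot evidence can guide intent classification, or both at once.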
2 Directory of Implemented Papers (under update)
The papers for which code is already available are:
3 Acknowledgments
Thank you for reading!
May your desk always be full of good papers!