AACL-IJCNLP2022_Efficient_Robust_KGC_Tutorial

Materials for the AACL-IJCNLP 2022 tutorial: Efficient and Robust Knowledge Graph Construction

Tutorial abstract [PDF]

Knowledge graph construction, which aims to extract knowledge from text corpora, has attracted considerable attention from NLP researchers. Previous decades have witnessed remarkable progress in knowledge graph construction based on neural models; however, these models often require massive computation or labeled data and suffer from unstable inference due to biased or adversarial samples. Recently, numerous approaches have been explored to mitigate the efficiency and robustness issues in knowledge graph construction, such as prompt learning and adversarial training. In this tutorial, we aim to bring interested NLP researchers up to speed on recent and ongoing techniques for efficient and robust knowledge graph construction. Our goal is also to provide a systematic and up-to-date overview of these methods and to reveal new research opportunities to the audience.
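
As a concrete illustration of the prompt-learning direction mentioned above, the sketch below (not part of the tutorial materials) casts relation extraction as a cloze task for a masked language model: the sentence is wrapped in a hand-written template and candidate relations are scored through label words, so no task-specific classification head has to be trained. The template wording, relation labels, and label words are hypothetical examples; the code assumes the HuggingFace transformers library and a bert-base-uncased checkpoint.

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Steve Jobs founded Apple in 1976."
head, tail = "Steve Jobs", "Apple"

# Cloze-style template: the [MASK] position is where a relation label word goes.
prompt = f"{sentence} In this sentence, {head} is the [MASK] of {tail}."

# Verbalizer: map each candidate relation to a single label word (hypothetical labels).
verbalizer = {"founder": "founder", "employee": "employee", "rival": "rival"}

inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]

# Score each relation by the masked-LM logit of its label word at the [MASK] position.
scores = {rel: logits[tokenizer.convert_tokens_to_ids(word)].item()
          for rel, word in verbalizer.items()}
print(max(scores, key=scores.get))  # expected: "founder"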

If you find this tutorial helpful for your work, please cite our paper.

@inproceedings{zhang-etal-2022-efficient-robust,
    title = "Efficient and Robust Knowledge Graph Construction",
    author = "Zhang, Ningyu  and
      Gui, Tao  and
      Nan, Guoshun",
    booktitle = "Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing: Tutorial Abstracts",
    month = nov,
    year = "2022",
    address = "Taipei",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.aacl-tutorials.1",
    pages = "1--7",
    abstract = "Knowledge graph construction which aims to extract knowledge from the text corpus, has appealed to the NLP community researchers. Previous decades have witnessed the remarkable progress of knowledge graph construction on the basis of neural models; however, those models often cost massive computation or labeled data resources and suffer from unstable inference accounting for biased or adversarial samples. Recently, numerous approaches have been explored to mitigate the efficiency and robustness issues for knowledge graph construction, such as prompt learning and adversarial training. In this tutorial, we aim to bring interested NLP researchers up to speed on the recent and ongoing techniques for efficient and robust knowledge graph construction. Additionally, our goal is to provide a systematic and up-to-date overview of these methods and reveal new research opportunities to the audience.",
}

Tutorial Materials

1. Slides [Introduction] [EfficientKGC] [RobustKGC] [Conclusion]

2. Video [AllParts]

3. Related Tutorials:

  • New Frontiers of Information Extraction. NAACL 2022 Tutorial [ppt]
  • Less Data, More ___? Data Augmentation and Semi-Supervised Learning for Natural Language Processing. ACL 2022 Tutorial [ppt]
  • Zero- and Few-Shot NLP with Pretrained Language Models. AACL 2022 Tutorial [ppt]
  • Data-Efficient Knowledge Graph Construction. CCKS2022 Tutorial [ppt]
  • Knowledge Informed Prompt Learning. MLNLP 2022 Tutorial (Chinese) [ppt][project]

4. Survey:

Knowledge Graph Construction

  • A Survey on Deep Learning for Named Entity Recognition (TKDE, 2022) [paper]
  • A Survey on Recent Advances in Named Entity Recognition from Deep Learning Models (COLING 2018) [paper]
  • A Survey on Neural Relation Extraction (Science China Technological Sciences, 2020) [paper]
  • What is Event Knowledge Graph: A Survey (TKDE, 2022) [paper]
  • Knowledge Extraction in Low-Resource Scenarios: Survey and Perspective (arXiv, 2022) [paper]
  • Generative Knowledge Graph Construction: A Review (EMNLP, 2022) [paper]

Efficient NLP

  • A Survey on Recent Approaches for Natural Language Processing in Low-Resource Scenarios (NAACL 2021) [paper]
  • Few-Shot Named Entity Recognition: An Empirical Baseline Study (EMNLP 2021) [paper]
  • A Survey on Low-Resource Neural Machine Translation (IJCAI 2021) [paper]
  • Efficient Methods for Natural Language Processing: A Survey (arXiv, 2022) [paper]
  • Delta Tuning: A Comprehensive Study of Parameter Efficient Methods for Pre-trained Language Models (arXiv, 2021) [paper]
  • Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing (ACM Computing Surveys 2021) [paper]

Low-resource Learning

  • Generalizing from a Few Examples: A Survey on Few-shot Learning (ACM Computing Surveys, 2021) [paper]
  • Knowledge-aware Zero-Shot Learning: Survey and Perspective (IJCAI 2021) [paper]
  • Low-resource Learning with Knowledge Graphs: A Comprehensive Survey (2021) [paper]

5. Reading list:

  • Template-free prompt tuning for few-shot NER, in NAACL 2022. [pdf]
  • Reasoning with Latent Structure Refinement for Document-Level Relation Extraction, in ACL 2020. [pdf]
  • Making Pre-trained Language Models Better Few-shot Learners, in ACL 2021. [pdf]
  • PTR: Prompt Tuning with Rules for Text Classification, in AI Open 2022. [pdf]
  • Label Verbalization and Entailment for Effective Zero- and Few-Shot Relation Extraction, in EMNLP 2021. [pdf]
  • RelationPrompt: Leveraging Prompts to Generate Synthetic Data for Zero-Shot Relation Triplet Extraction, in ACL 2022 (Findings). [pdf]
  • KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction, in WWW 2022. [pdf]
  • Towards Realistic Low-resource Relation Extraction: A Benchmark with Empirical Baseline Study, in EMNLP 2022 (Findings). [pdf]
  • Decoupling Knowledge from Memorization: Retrieval-augmented Prompt Learning, in NeurIPS 2022. [pdf]
  • AliCG: Fine-grained and Evolvable Conceptual Graph Construction for Semantic Search at Alibaba, in KDD 2021. [pdf]
  • Relation Extraction as Open-book Examination: Retrieval-enhanced Prompt Tuning, in SIGIR 2022. [pdf]
  • LightNER: A Lightweight Tuning Paradigm for Low-resource NER via Pluggable Prompting, in COLING 2022. [pdf]
  • One Model for All Domains: Collaborative Domain-Prefix Tuning for Cross-Domain NER, on arXiv, 2023. [pdf]
  • PALT: Parameter-Lite Transfer of Language Models for Knowledge Graph Completion, in EMNLP 2022. [pdf]
  • Unified Structure Generation for Universal Information Extraction, in ACL 2022. [pdf]
  • LasUIE: Unifying Information Extraction with Latent Adaptive Structure-aware Generative Language Model, in NeurIPS 2022. [pdf]
  • Sequence-to-Sequence Knowledge Graph Completion and Question Answering, in ACL 2022. [pdf]
  • From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer, in WWW 2022 (Poster) [pdf]
  • Generative Knowledge Graph Construction: A Review, in EMNLP 2022. [pdf]
  • FastRE: Towards Fast Relation Extraction with Convolutional Encoder and Improved Cascade Binary Tagging Framework, in IJCAI 2022. [pdf]
  • Long-tail relation extraction via knowledge graph embeddings and graph convolution networks, in NAACL 2019. [pdf]
  • Document-level Relation Extraction as Semantic Segmentation, in IJCAI 2021. [pdf]
  • Hybrid Transformer with Multi-level Fusion for Multimodal Knowledge Graph Completion, in SIGIR 2022. [pdf]
  • Good Visual Guidance Makes A Better Extractor: Hierarchical Visual Prefix for Multimodal Entity and Relation Extraction, in NAACL 2022 (Findings). [pdf]
  • Neuralizing Regular Expressions for Slot Filling, in EMNLP 2021. [pdf]
  • Event Extraction as Machine Reading Comprehension, in EMNLP 2020. [pdf]
  • RESIN: A Dockerized Schema-Guided Cross-document Cross-lingual Cross-media Information Extraction and Event Tracking System, in NAACL 2021 (demo). [pdf]
  • TextFlint: Unified Multilingual Robustness Evaluation Toolkit for Natural Language Processing, in ACL 2021. [pdf]
  • DeepKE: A Deep Learning Based Knowledge Extraction Toolkit for Knowledge Base Population, in EMNLP 2022 (Demo). [pdf]
  • OpenNRE: An Open and Extensible Toolkit for Neural Relation Extraction, in EMNLP 2019 (demo). [pdf]
  • OpenPrompt: An Open-source Framework for Prompt-learning, in ACL 2022 (demo). [pdf]
  • ZS4IE: A toolkit for Zero-Shot Information Extraction with simple Verbalizations, in NAACL 2022 (demo). [pdf]

Tutorial schedule

Local time (GMT) | Content | Presenter | Slides
09:00-10:00 | Introduction and Applications | Guoshun Nan | [Slides]
10:00-11:00 | Efficient Knowledge Graph Construction | Ningyu Zhang | [Slides]
11:00-11:50 | Robust Knowledge Graph Construction | Tao Gui | [Slides]
11:50-12:00 | Summary | Ningyu Zhang | [Slides]

Presenters

     

Ningyu Zhang · Tao Gui · Guoshun Nan

Ningyu Zhang is an associate professor at Zhejiang University. His main research interests include knowledge graphs and NLP. He has published papers in top international conferences and journals such as NeurIPS/ICLR/WWW/KDD/WSDM/AAAI/IJCAI/ACL/EMNLP/NAACL/COLING/SIGIR/TASLP/ESWA/KBS/Journal of Software/Nature Communications. Three of his papers have been selected as Paper Digest Most Influential Papers (KnowPrompt at WWW'22, DocuNet at IJCAI'21, AliCG at KDD'21). He has served as a PC member for NeurIPS/ICLR/ICML/KDD/AAAI/IJCAI/ACL/EMNLP/NAACL, and as a reviewer for TKDE/WWWJ/JWS/TALLIP/IEEE Transactions on Cybernetics/ESWA.

Tao Gui is an associate professor at the Institute of Modern Languages and Linguistics of Fudan University and a key member of the FudanNLP group. He is a member of ACL, a member of the Youth Working Committee of the Chinese Information Processing Society of China, and a member of the Language and Knowledge Computing Professional Committee of the Chinese Information Processing Society of China. He has published more than 30 papers in top international conferences and journals such as ACL, EMNLP, AAAI, IJCAI, and SIGIR. He has served as Editor-in-Chief of the NLPR Information Extraction Special Issue, as a PC member for SIGIR, AAAI, and IJCAI, and as a reviewer for TPAMI and ARR. He has received the Outstanding Doctoral Dissertation Award of the Chinese Information Processing Society of China, the Area Chair Favorites Award at COLING 2018, and the Outstanding Paper Award at NLPCC 2019, and was selected for the Young Talent Promotion Project of CAST.

Guoshun Nan is a tenure-track professor at the School of Cyber Science and Engineering, Beijing University of Posts and Telecommunications (BUPT). He is a key member of the National Engineering Research Center of Mobile Network Security and a member of the Wireless Technology Innovation Institute of BUPT. Before starting his academic career, he worked at Hewlett-Packard (China) as an engineer for more than four years. He is a member of ACL. He has broad interests in information extraction, model robustness, multimodal retrieval, cyber security, and next-generation wireless networks. He has published more than 10 papers in top-tier conferences such as ACL, CVPR, EMNLP, SIGIR, IJCAI, CIKM, and SIGCOMM. He has served as a reviewer for ACL, EMNLP, AAAI, IJCAI, Neurocomputing, and IEEE Transactions on Image Processing.