A searchable list of some of my publications is below. You can also access my publications from the following sites.
My ORCID is https://orcid.org/0000-0002-6236-2969.
Publications:
1. Karan Samel, Jun Ma, Zhengyang Wang, Tong Zhao, Irfan Essa
Knowledge Relevance BERT: Integrating Noisy Knowledge into Language Representation. Proceedings Article
In: AAAI workshop on Knowledge Augmented Methods for NLP (KnowledgeNLP-AAAI 2023), 2023.
@inproceedings{2023-Samel-KRBINKILR,
title = {Knowledge Relevance BERT: Integrating Noisy Knowledge into Language Representation.},
author = {Karan Samel and Jun Ma and Zhengyang Wang and Tong Zhao and Irfan Essa},
url = {https://knowledge-nlp.github.io/aaai2023/papers/005-KRBERT-oral.pdf},
year = {2023},
date = {2023-02-01},
urldate = {2023-02-01},
booktitle = {AAAI workshop on Knowledge Augmented Methods for NLP (KnowledgeNLP-AAAI 2023)},
abstract = {Integrating structured knowledge into language model representations increases recall of domain-specific information useful for downstream tasks. Matching between knowledge graph entities and text entity mentions can be easily performed when entity names are unique or entity-linking data exists. When extending this setting to new domains, newly mined knowledge contains ambiguous and incorrect information without explicit linking information. In such settings, we design a framework to robustly link relevant knowledge to input texts as an intermediate modeling step while performing end-to-end domain fine-tuning tasks. This is done by first computing the similarity of the existing task labels with candidate knowledge triplets to generate relevance labels. We use these labels to train a relevance model, which predicts the relevance of the inserted triplets to the original text. This relevance model is integrated within a language model, leading to our Knowledge Relevance BERT (KR-BERT) framework. We test KR-BERT for linking and ranking tasks on a real-world e-commerce dataset and a public entity linking task, where we show performance improvements over strong baselines.},
keywords = {AI, knowledge representation, NLP},
pubstate = {published},
tppubtype = {inproceedings}
}
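The abstract describes generating relevance labels by scoring candidate knowledge triplets against existing task labels, then training a relevance model on those labels. The short sketch below illustrates only that first labeling step; the bag-of-words cosine measure, the fixed 0.6 threshold, and names such as label_triplets are illustrative assumptions, not the paper's actual implementation.

# Minimal sketch of the relevance-labeling step described in the abstract.
# Assumptions (not from the paper): bag-of-words cosine similarity as the
# scoring function, a fixed 0.6 threshold, and the function names below.
from collections import Counter
from math import sqrt

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words token counts of two strings."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = sqrt(sum(v * v for v in ca.values())) * sqrt(sum(v * v for v in cb.values()))
    return dot / norm if norm else 0.0

def label_triplets(task_label, triplets, threshold=0.6):
    """Assign a binary relevance label to each (head, relation, tail)
    triplet by comparing its flattened text against the task label."""
    labeled = []
    for triplet in triplets:
        score = cosine_similarity(task_label, " ".join(triplet))
        labeled.append((triplet, int(score >= threshold)))
    return labeled

if __name__ == "__main__":
    candidates = [("usb cable", "compatible_with", "laptop"),
                  ("usb cable", "color", "red")]
    # Expected: the compatibility triplet scores above the threshold (label 1),
    # while the color triplet falls below it (label 0).
    print(label_triplets("usb cable for laptop", candidates))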
Other Publication Sites
Copyright/About
[Please see the Copyright Statement that may apply to the content listed here.]
This list of publications is produced by using the teachPress plugin for WordPress.