
Cardiff University at SemEval-2020 Task 6: Fine-tuning BERT for Domain-Specific Definition Classification

Jeawak, Shelan S; Espinosa-Anke, Luis; Schockaert, Steven



Authors

Shelan S. Jeawak

Luis Espinosa-Anke

Steven Schockaert



Abstract

We describe the system submitted to SemEval-2020 Task 6, Subtask 1. The aim of this subtask is to predict whether a given sentence contains a definition. Unsurprisingly, we found that strong results can be achieved by fine-tuning a pre-trained BERT language model. In this paper, we analyze the performance of this strategy. Among other findings, we show that results can be improved by using a two-step fine-tuning process, in which the BERT model is first fine-tuned on the full training set and then further specialized towards a target domain.
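The two-step fine-tuning process mentioned in the abstract can be illustrated in miniature. The sketch below substitutes a tiny logistic-regression classifier for BERT so the procedure is self-contained; the data, function names, and hyperparameters are all illustrative assumptions, not details from the paper.

```python
# Sketch of two-step fine-tuning: train on all data, then continue
# training on the target-domain subset only. A pure-stdlib logistic
# regression stands in for BERT; everything here is hypothetical.
import math


def train(weights, data, lr=0.5, epochs=200):
    """One fine-tuning phase: SGD on the logistic loss, starting
    from the given weights (so phase 2 continues from phase 1)."""
    w = list(weights)
    for _ in range(epochs):
        for x, y in data:
            z = sum(wi * xi for wi, xi in zip(w, x))
            p = 1 / (1 + math.exp(-z))     # sigmoid
            g = p - y                      # gradient of logistic loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w


def accuracy(w, data):
    correct = 0
    for x, y in data:
        z = sum(wi * xi for wi, xi in zip(w, x))
        correct += int((z > 0) == bool(y))
    return correct / len(data)


# Toy "definition vs. non-definition" examples: each item is a
# (feature vector with bias term, label) pair; label 1 = definition.
general = [([1.0, 0.2], 1), ([1.0, -0.9], 0),
           ([1.0, 0.8], 1), ([1.0, -0.3], 0)]
target_domain = [([1.0, 0.5], 1), ([1.0, -0.5], 0)]

w0 = [0.0, 0.0]
w_full = train(w0, general + target_domain)  # step 1: full training set
w_spec = train(w_full, target_domain)        # step 2: specialize to domain
```

The key point the sketch captures is that step 2 initializes from the weights produced by step 1 rather than from scratch, so the specialized model retains what was learned from the full training set.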

Citation

Jeawak, S. S., Espinosa-Anke, L., & Schockaert, S. (2020). Cardiff University at SemEval-2020 Task 6: Fine-tuning BERT for Domain-Specific Definition Classification. In Proceedings of the Fourteenth Workshop on Semantic Evaluation (pp. 361-366).

Conference Name The International Workshop on Semantic Evaluation
Conference Location Barcelona, Spain (online)
Start Date Dec 12, 2020
End Date Dec 13, 2020
Acceptance Date Jun 26, 2020
Online Publication Date Aug 17, 2020
Publication Date Dec 12, 2020
Deposit Date May 17, 2021
Publicly Available Date Mar 28, 2024
Pages 361-366
Book Title Proceedings of the Fourteenth Workshop on Semantic Evaluation
ISBN 9781952148316
Public URL https://uwe-repository.worktribe.com/output/7336976
