|Author(s):||Matos, B. C.; Santos, R. B.|
|Editor:||Cordeiro, J., Pereira, M. J., Rodrigues, N. F., and Pais, S.|
|Publication date/Defense date:||2022|
|Title:||Comparing different approaches for detecting hate speech in online Portuguese comments|
|Book title/volume:||OpenAccess Series in Informatics|
|Event title:||11th Symposium on Languages, Applications and Technologies (SLATE 2022)|
|DOI (Digital Object Identifier):||10.4230/OASIcs.SLATE.2022.10|
|Abstract:||Online Hate Speech (OHS) has been growing dramatically on social media, which has motivated researchers to develop a diversity of methods for its automated detection. However, the detection of OHS in Portuguese remains understudied. To fill this gap, we explored different models that have proved successful in the literature for this task. In particular, we explored transfer learning approaches based on existing BERT-like pre-trained models. The experiments were based on CO-HATE, a corpus of YouTube comments posted by the Portuguese online community and manually labeled by different annotators. Among other categories, the comments were labeled regarding the presence of hate speech and its type, specifically overt and covert hate speech. We assessed the impact of using annotations from different annotators on the performance of such models. In addition, we analyzed the impact of distinguishing overt from covert hate speech. The results show the importance of considering the annotator's profile in the development of hate speech detection models. Regarding the hate speech type, the results obtained do not allow us to conclude which type is easier to detect. Finally, we show that pre-processing does not seem to have a significant impact on performance for this specific task.|
|Access type:||Open Access|
|Appears in Collections:||IT-CRI - Comunicações a conferências internacionais|
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.