Hierarchical Multi-Task Learning for Fine-Grained and Coarse Text Classification
DOI: https://doi.org/10.71465/fias272

Keywords: Hierarchical Multi-Task Learning, Fine-Grained Classification, Coarse-Grained Classification, NLP, Text Classification, Deep Learning

Abstract
Text classification tasks often vary in granularity, with coarse labels capturing general topics and fine-grained labels capturing nuanced subcategories or sentiments. Models trained separately at each classification level fail to exploit the hierarchical relationships between them. In this paper, we propose a hierarchical multi-task learning (HMTL) framework that jointly models coarse and fine-grained text classification by aligning shared and task-specific layers in a hierarchical architecture. Our model exploits the semantic dependencies between classification levels, enabling better generalization and improved performance on both tasks. Evaluations on benchmark datasets demonstrate that HMTL outperforms single-task baselines and flat multi-task models, particularly in domains with rich label hierarchies. The proposed framework offers a scalable and effective approach for tasks that depend on contextual depth and label interdependence.
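To make the abstract's architectural idea concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: a shared encoder feeds a coarse classification head, and the fine-grained head is conditioned on both the shared representation and the coarse logits, with the two cross-entropy losses combined for joint training. All module names, dimensions, and the 0.5 loss weight are hypothetical choices for illustration.

```python
import torch
import torch.nn as nn

class HierarchicalMultiTaskClassifier(nn.Module):
    """Hypothetical sketch: shared encoder with a coarse head and a fine head
    that is conditioned on the coarse prediction."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, n_coarse, n_fine):
        super().__init__()
        # Shared layers: embedding + a bidirectional GRU encoder.
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.encoder = nn.GRU(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Task-specific coarse head on the pooled encoding.
        self.coarse_head = nn.Linear(2 * hidden_dim, n_coarse)
        # Fine head sees the pooled encoding plus the coarse logits,
        # so fine-grained predictions can exploit the coarse decision.
        self.fine_head = nn.Linear(2 * hidden_dim + n_coarse, n_fine)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)
        encoded, _ = self.encoder(emb)
        pooled = encoded.mean(dim=1)           # mean-pool over tokens
        coarse_logits = self.coarse_head(pooled)
        fine_input = torch.cat([pooled, coarse_logits], dim=-1)
        fine_logits = self.fine_head(fine_input)
        return coarse_logits, fine_logits


# Joint training step: weighted sum of coarse and fine cross-entropy losses.
model = HierarchicalMultiTaskClassifier(vocab_size=10000, embed_dim=128,
                                        hidden_dim=256, n_coarse=6, n_fine=50)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

tokens = torch.randint(1, 10000, (32, 40))     # dummy batch of token ids
coarse_y = torch.randint(0, 6, (32,))
fine_y = torch.randint(0, 50, (32,))

coarse_logits, fine_logits = model(tokens)
loss = 0.5 * criterion(coarse_logits, coarse_y) + criterion(fine_logits, fine_y)
loss.backward()
optimizer.step()
```

Conditioning the fine head on the coarse logits is one simple way to encode the label hierarchy; the paper's alignment of shared and task-specific layers may differ in detail.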