Graph Neural Anomaly Detection via Multi-Scale Temporal Subgraph Contrastive Learning
DOI: https://doi.org/10.71465/fias541

Keywords: Dynamic Graph Neural Networks, Anomaly Detection, Contrastive Learning, Multi-Scale Analysis

Abstract
The detection of anomalous patterns in dynamic graph structures is a pivotal challenge in modern data mining, with critical applications ranging from financial fraud detection to cybersecurity and social network analysis. While static graph neural networks have achieved remarkable success in identifying structural irregularities, they often fail to capture the temporal evolution of anomalies that manifest only over extended periods. Existing dynamic approaches attempt to bridge this gap but frequently suffer from a trade-off between local structural sensitivity and long-term temporal dependency modeling. This paper introduces a novel framework, Graph Neural Anomaly Detection via Multi-Scale Temporal Subgraph Contrastive Learning (MSTS-CL). Our approach leverages a multi-scale subgraph sampling strategy to capture structural features at varying granularities, integrated with a temporal attention mechanism that highlights significant historical snapshots. Furthermore, we propose a self-supervised contrastive learning objective designed to maximize the mutual information between local temporal embeddings and global context representations, thereby reducing reliance on scarce labeled anomaly data. Extensive experiments on three benchmark datasets demonstrate that MSTS-CL outperforms state-of-the-art baselines by a significant margin, offering robust detection capabilities even in the presence of noise and structural sparsity.
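The abstract does not specify the exact form of the contrastive objective. One common way to maximize a lower bound on the mutual information between local and global representations is an InfoNCE-style loss that treats each local subgraph embedding and its own global context vector as a positive pair and all other globals in the batch as negatives. The sketch below illustrates that reading; the function name, tensor shapes, and temperature value are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def local_global_infonce(local_emb: torch.Tensor,
                         global_emb: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss (a sketch, not the paper's exact objective):
    pulls each local subgraph embedding toward its own global context
    vector and pushes it away from the globals of other samples.
    Both inputs have shape (batch, dim)."""
    local_emb = F.normalize(local_emb, dim=-1)
    global_emb = F.normalize(global_emb, dim=-1)
    # Pairwise cosine similarities between every local and every global embedding.
    logits = local_emb @ global_emb.t() / temperature  # (batch, batch)
    # Diagonal entries are the positive (matching) local-global pairs.
    targets = torch.arange(local_emb.size(0), device=local_emb.device)
    return F.cross_entropy(logits, targets)

# Example: 32 subgraph embeddings contrasted against their pooled global summaries.
loss = local_global_infonce(torch.randn(32, 64), torch.randn(32, 64))
```

In this reading, minimizing the cross-entropy over the similarity matrix maximizes agreement between matching local and global views, which is the standard noise-contrastive surrogate for maximizing their mutual information without anomaly labels.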