Forecasting Cyber Threat Intelligence with Memory Augmented Transformer
DOI:
https://doi.org/10.20535/tacs.2664-29132025.3.346865

Abstract
Cyber threat intelligence data are volatile, irregular, and shaped by abrupt regime shifts, making accurate forecasting particularly challenging. Motivated by this, we explore the potential of a memory-augmented Transformer forecaster that integrates an evolving memory mechanism with confidence-regulated attention. These complementary design elements enable the model to balance adaptability with stability, remaining robust under noise and structural changes in the threat landscape. Building on and re-architecting the original ACWA-based approach, the resulting enhanced model, ChronoTensor, achieves parity with state-of-the-art forecasting methods while introducing transparent memory and attention pathways that improve the interpretability and explainability of its predictions.
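The abstract does not spell out how the memory bank or the confidence gating is implemented, so the following is only a minimal illustrative sketch of the general idea: attention computed over the input window plus a learnable memory bank, with a per-step confidence gate that damps the attention update. All names, dimensions, and the gating form are assumptions for illustration, not the paper's ChronoTensor/ACWA design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryAugmentedAttention(nn.Module):
    """Illustrative sketch: attention over input tokens plus a learnable
    memory bank, with a confidence gate scaling the residual update.
    (Hypothetical design, not the paper's actual architecture.)"""

    def __init__(self, d_model: int = 64, n_memory_slots: int = 16):
        super().__init__()
        # Learnable memory slots intended to persist across forecasting windows.
        self.memory = nn.Parameter(torch.randn(n_memory_slots, d_model) * 0.02)
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Confidence gate: maps each query to a scalar in (0, 1) that damps
        # the attention update when the model is less certain (assumed form).
        self.confidence = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) window of threat-intelligence features.
        batch = x.size(0)
        mem = self.memory.unsqueeze(0).expand(batch, -1, -1)
        kv = torch.cat([x, mem], dim=1)           # attend over tokens + memory
        q, k, v = self.q_proj(x), self.k_proj(kv), self.v_proj(kv)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        attn = F.softmax(scores, dim=-1)
        gate = self.confidence(x)                 # (batch, seq_len, 1)
        # Residual update scaled by confidence: low confidence keeps the input.
        return x + gate * (attn @ v)


if __name__ == "__main__":
    model = MemoryAugmentedAttention()
    window = torch.randn(2, 24, 64)               # e.g. 24 time steps of features
    print(model(window).shape)                    # torch.Size([2, 24, 64])
```

In such a setup, the attention weights over the memory slots and the per-step confidence values are directly inspectable, which is one plausible way the "transparent memory and attention pathways" mentioned above could support interpretability.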
License
Authors who publish with this journal agree to the following terms:
Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).