A Proximal-ADMM-Incorporated Nonnegative Latent-Factorization-of-Tensors Model for Representing Dynamic Cryptocurrency Transaction Network

Title:
A Proximal-ADMM-Incorporated Nonnegative Latent-Factorization-of-Tensors Model for Representing Dynamic Cryptocurrency Transaction Network
Journal Title:
IEEE Transactions on Systems, Man, and Cybernetics: Systems
Publication Date:
05 September 2025
Citation:
Liao, X., Wu, H., He, T., & Luo, X. (2025). A Proximal-ADMM-Incorporated Nonnegative Latent-Factorization-of-Tensors Model for Representing Dynamic Cryptocurrency Transaction Network. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 55(11), 8387–8401. https://doi.org/10.1109/tsmc.2025.3605054
Abstract:
Cryptocurrency services, as one of the most successful applications of blockchain technology, have recently garnered significant attention from the graph learning community. Their large-scale dynamic transaction records contain a variety of behavioral patterns and rich knowledge about accounts, making dynamic cryptocurrency transaction network embedding (DCTNE) a hot yet thorny research topic. As the number of trading accounts grows and time accumulates, transaction services are dispersed across many time slots, leaving the transaction data within any single slot very sparse; that is, the transaction service data are high-dimensional and incomplete (HDI). To efficiently mine high-value knowledge from HDI data, this article proposes a proximal-ADMM-incorporated nonnegative latent-factorization-of-tensors (PNL) model for DCTNE that adopts threefold ideas: 1) incorporating proximal terms into the alternating-direction-method-of-multipliers (ADMM)-based learning scheme to reduce oscillations, yielding high estimation accuracy and fast convergence; 2) implementing a parallel training process with hyperparameter self-adaptation for high computational efficiency; and 3) proving that the proximal-incorporated learning scheme converges to a Karush–Kuhn–Tucker (KKT) stationary point. Experimental results on eight real-world DCTNs show that PNL significantly outperforms several state-of-the-art (SOTA) models, demonstrating not only high efficiency and accuracy in performing DCTNE, but also strong potential to enhance the operational reliability and stability of cryptocurrency transaction systems.
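The paper itself is not reproduced on this page, but the core idea named in the abstract — adding a proximal term to an ADMM update so that nonnegativity-constrained subproblems converge with less oscillation — can be illustrated on the simplest such subproblem. The sketch below is NOT the authors' PNL algorithm; it applies proximal ADMM to a generic nonnegative least-squares problem (the kind of per-factor subproblem that arises in nonnegative latent factorization), with all parameter names and values chosen for illustration only.

```python
import numpy as np

def proximal_admm_nnls(A, b, rho=1.0, tau=0.5, iters=200):
    """Solve min ||Ax - b||^2 subject to x >= 0 via ADMM with the
    splitting x = z, z >= 0, and an extra proximal term
    (tau/2)*||x - x_prev||^2 in the x-subproblem to damp oscillations.
    (Illustrative only; not the PNL model from the paper.)"""
    n = A.shape[1]
    x = np.zeros(n)   # unconstrained primal variable
    z = np.zeros(n)   # nonnegative copy of x
    u = np.zeros(n)   # scaled dual variable
    AtA, Atb = A.T @ A, A.T @ b
    # System matrix is constant across iterations (factor once in practice).
    M = AtA + (rho + tau) * np.eye(n)
    for _ in range(iters):
        x_prev = x
        # x-update: argmin (1/2)||Ax-b||^2 + (rho/2)||x-z+u||^2
        #                  + (tau/2)||x-x_prev||^2
        x = np.linalg.solve(M, Atb + rho * (z - u) + tau * x_prev)
        z = np.maximum(0.0, x + u)  # projection enforces nonnegativity
        u = u + x - z               # dual ascent on the consensus gap
    return z
```

Setting `tau = 0` recovers plain ADMM; a positive `tau` anchors each iterate to its predecessor, which is the damping effect the abstract attributes to the proximal terms.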
License type:
Publisher Copyright
Funding Info:
There was no specific funding for this research.
Description:
© 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
ISSN:
2168-2216
2168-2232
Files uploaded:

pnl-r-0528-clean.pdf (1.28 MB, PDF)