| dc.contributor.author | Dehbia, AHMED ZAID | |
| dc.contributor.author | Djamaa, Badis | |
| dc.contributor.author | Benatia, Akrem | |
| dc.date.accessioned | 2025-03-18T12:17:04Z | |
| dc.date.available | 2025-03-18T12:17:04Z | |
| dc.date.issued | 2024-10-25 | |
| dc.identifier.uri | http://depot.umc.edu.dz/handle/123456789/14556 | |
| dc.description.abstract | Deep neural networks (DNNs) have grown increasingly large and complex, requiring effective optimization techniques to improve their efficiency and scalability. Sparsity has become a primary and widely adopted optimization approach, enabling significant reductions in the computational demands of DNNs while preserving model performance. In particular, structured N:M sparsity has emerged as a promising direction due to its alignment with modern hardware architectures, allowing efficient model compression and computation. | fr_FR |
| dc.publisher | Université Frères Mentouri - Constantine 1 | |
| dc.title | Optimizing Deep Neural Networks with N:M Structured Sparsity | fr_FR |
| dc.type | Article | fr_FR |
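
For readers unfamiliar with the N:M pattern mentioned in the abstract, the sketch below illustrates the basic idea: in every group of M consecutive weights, only the N largest-magnitude values are kept and the rest are zeroed. This is a minimal, self-contained NumPy illustration, not code from the thesis itself; the function name `nm_sparsify` and the 2:4 configuration are assumptions chosen for the example.

```python
import numpy as np

def nm_sparsify(weights: np.ndarray, n: int = 2, m: int = 4) -> np.ndarray:
    """Illustrative N:M masking: keep the n largest-magnitude values in each
    group of m consecutive weights along the flattened last axis."""
    flat = weights.reshape(-1, m)                       # group weights into blocks of m
    # indices of the (m - n) smallest-magnitude entries in each block
    drop = np.argsort(np.abs(flat), axis=1)[:, : m - n]
    mask = np.ones_like(flat)
    np.put_along_axis(mask, drop, 0.0, axis=1)          # zero the dropped positions
    return (flat * mask).reshape(weights.shape)

# Example: a 4x8 weight matrix pruned to 2:4 sparsity (2 nonzeros per 4 weights)
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))
w_sparse = nm_sparsify(w, n=2, m=4)
assert np.count_nonzero(w_sparse) == w.size // 2
```

Because the nonzero positions are constrained to a fixed pattern per block (e.g., 2 out of every 4), such matrices can be stored compactly and mapped onto hardware with native structured-sparsity support, which is the alignment with modern accelerators the abstract refers to.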