
Title:
A comparative analysis of GPUs, TPUs, DPUs, and QPUs for deep learning with python
Source:
Indonesian Journal of Electrical Engineering and Computer Science, Vol. 38, No. 2 (May 2025), pp. 1324-1330; ISSN 2502-4760, 2502-4752
Publisher Information:
Institute of Advanced Engineering and Science
Publication Year:
2025
Document Type:
Journal article
File Description:
application/pdf
Language:
English
DOI:
10.11591/ijeecs.v38.i2.pp1324-1330
Rights:
Copyright (c) 2025 Ayoub Allali, Zineb El falah, Jaafar Abouchabaka, Najat Rafalia ; http://creativecommons.org/licenses/by-nc-sa/4.0
Accession Number:
edsbas.69185582
Database:
BASE

Further Information

In the rapidly evolving field of deep learning, the computational demands of training sophisticated models have escalated, prompting a shift towards specialized hardware accelerators such as graphics processing units (GPUs), tensor processing units (TPUs), data processing units (DPUs), and quantum processing units (QPUs). This article provides a comprehensive analysis of these heterogeneous computing architectures, highlighting their unique characteristics, performance metrics, and suitability for various deep learning tasks. By leveraging Python, a predominant programming language in the data science domain, the integration and optimization techniques applicable to each hardware platform are explored, offering insights into their practical implications for deep learning research and application. The architectural differences that influence computational efficiency, parallelism, and energy consumption are examined, alongside the evolving ecosystem of software tools and libraries that support deep learning on these platforms. Through a series of benchmarks and case studies, this study aims to equip researchers and practitioners with the knowledge to make informed decisions when selecting hardware for their deep learning projects, ultimately contributing to the acceleration of model development and innovation in the field.
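
As an illustration of the kind of Python-level hardware integration the abstract describes, the following minimal sketch selects an available accelerator at runtime. It is not taken from the article: it assumes PyTorch is installed, and the TPU branch additionally assumes the optional torch_xla package; DPUs and QPUs are accessed through vendor-specific SDKs and are not covered here.

# Minimal sketch (not from the article): picking an accelerator in Python.
# Assumes PyTorch; the TPU path assumes the optional torch_xla package.
import torch

def select_device():
    """Return the best available torch device, preferring TPU, then GPU, then CPU."""
    try:
        # TPU support comes from the separate torch_xla package.
        import torch_xla.core.xla_model as xm
        return xm.xla_device()
    except ImportError:
        pass
    if torch.cuda.is_available():
        return torch.device("cuda")   # NVIDIA GPU via CUDA
    return torch.device("cpu")        # Fallback when no accelerator is present

device = select_device()
model = torch.nn.Linear(128, 10).to(device)   # toy model for demonstration
x = torch.randn(32, 128, device=device)
print(device, model(x).shape)

The same pattern (probe for the accelerator, then place model and data on the chosen device) underlies most framework-level portability across the platforms compared in the article.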