A Case Study of Face Recognition Workloads in Edge Data Centers

Theo Richardson

Abstract

Recent advances in artificial intelligence and machine learning have accelerated the development of specialized hardware platforms for AI workloads. However, many existing studies focus primarily on algorithmic performance and overlook the interactions between AI applications and the underlying system infrastructure. This work presents a comprehensive end-to-end analysis of AI-centric edge computing workloads through a case study of a face recognition system deployed in an edge data center. The application is implemented using widely adopted open-source frameworks and machine learning tools. Our evaluation reveals that, despite the high computational efficiency enabled by modern accelerators, significant performance overhead arises from data preprocessing, postprocessing, storage access, and network communication. As computational acceleration increases, system-level bottlenecks in I/O and bandwidth become increasingly prominent, leading to what we characterize as an "AI tax" on infrastructure resources. To address these challenges, we design a specialized edge data center architecture optimized for AI workloads. Experimental results demonstrate that the proposed system achieves improved resource utilization and reduces total cost of ownership by approximately 15% compared with conventional homogeneous deployments.
