Introduction
in Tutorial: Sparsity in ML: Understanding and Optimizing Sparsity in Neural Networks Running on Heterogeneous Systems

Opening remarks and overview of sparsity in ML

Wen-Mei Hwu · Jinjun Xiong


Abstract:

We will give an overview of sparsity problems in ML and summarize some of the latest work that addresses the challenges of handling sparsity from both the systems and algorithm perspectives. We will discuss sparsity both as a result of pruning dense models and as an inherent property of other ML models. We will then discuss the challenges of implementing sparse algorithms on heterogeneous systems. We will conclude this session with an overview of the runtime software libraries and tools we have developed recently, such as EMOGI, Pytorch-DGL, Tiled SpMM, BaM, and 2:4 sparsity, to address the compute and memory challenges of handling sparsity in heterogeneous computing systems. This session aims to give the audience a foundation for understanding the rest of the sessions.
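To make the 2:4 sparsity pattern mentioned above concrete: in every contiguous group of four weights, only the two largest-magnitude entries are kept, giving a fixed 50% sparsity that hardware such as NVIDIA's sparse tensor cores can exploit. The following NumPy/SciPy sketch is purely illustrative and is not the tutorial's implementation; the function name prune_2_4 and the toy matrix sizes are our own.

import numpy as np
from scipy.sparse import csr_matrix

def prune_2_4(weights: np.ndarray) -> np.ndarray:
    # 2:4 structured pruning: in each group of four weights along the
    # last axis, zero the two smallest-magnitude entries.
    assert weights.shape[-1] % 4 == 0, "last dim must be divisible by 4"
    groups = weights.reshape(-1, 4)
    # Indices of the two smallest-magnitude entries in each group.
    drop = np.argsort(np.abs(groups), axis=1)[:, :2]
    pruned = groups.copy()
    np.put_along_axis(pruned, drop, 0.0, axis=1)
    return pruned.reshape(weights.shape)

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8))
w_sparse = prune_2_4(w)
print("fraction of zeros:", np.mean(w_sparse == 0.0))  # exactly 0.5

# The pruned matrix can then feed a sparse-dense matrix multiply (SpMM),
# the kernel that libraries like Tiled SpMM aim to make fast on GPUs.
x = rng.standard_normal((8, 3))
y = csr_matrix(w_sparse) @ x  # sparse weights times dense activations
print(y.shape)  # (4, 3)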

