BEGIN:VCALENDAR
VERSION:2.0
PRODID:Linklings LLC
BEGIN:VTIMEZONE
TZID:America/Denver
X-LIC-LOCATION:America/Denver
BEGIN:DAYLIGHT
TZOFFSETFROM:-0700
TZOFFSETTO:-0600
TZNAME:MDT
DTSTART:19700308T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=2SU
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0600
TZOFFSETTO:-0700
TZNAME:MST
DTSTART:19701101T020000
RRULE:FREQ=YEARLY;BYMONTH=11;BYDAY=1SU
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240116T191658Z
LOCATION:301-302-303
DTSTART;TZID=America/Denver:20231114T160000
DTEND;TZID=America/Denver:20231114T163000
UID:submissions.supercomputing.org_SC23_sess167_pap166@linklings.com
SUMMARY:TANGO: Re-Thinking Quantization for Graph Neural Network Training 
 on GPUs
DESCRIPTION:Paper\n\nShiyang Chen (Rutgers University); Da Zheng (Amazon);
  Caiwen Ding (University of Connecticut); Chengying Huan (Institute of Sof
 tware, Chinese Academy of Sciences); Yuede Ji (University of North Texas);
  and Hang Liu (Rutgers University)\n\nGraph Neural Networks (GNNs) are rap
 idly gaining popularity since they achieve state-of-the-art performance on
  various critical graph-related tasks. While quantization is a primary app
 roach to accelerating GNN computation, quantized training faces significan
 t challenges. We observe that current quantized GNN training systems ofte
 n ex
 perience longer training time than their full-precision counterparts for t
 wo reasons: (i) addressing the accuracy challenge incurs substantial over
 head, and (ii) the optimization opportunity exposed by quantization is no
 t well leveraged. This paper introduces Tango, which re-thinks quantizati
 on cha
 llenges and opportunities for graph neural network training on GPUs with t
 he following contributions: First, we introduce lightweight rules to meet
  the accuracy requirement for quantized GNN training. Second, we design an
 d implement quantization-aware primitives and inter-primitive optimizati
 ons to accelerate GNN training. Third, we integrate Tango with the mainstr
 eam Deep Graph Library (DGL) system and demonstrate that Tango outperforms
  the state-of-the-art across all the evaluated GNN models and datasets.\n\
 nTag: Artificial Intelligence/Machine Learning\n\nRegistration Category: T
 ech Program Reg Pass\n\nSession Chair: Israt Nisa (Amazon Web Services AI 
 Research and Education)
END:VEVENT
END:VCALENDAR
