Jiangnan Yu

Ph.D. Student, HKUST ECE

Computer Architecture · ML Systems · Hardware-Software Co-Design

I am a second-year Ph.D. student at The Hong Kong University of Science and Technology (HKUST), advised by Prof. Yuan Xie. My research interests lie at the intersection of computer architecture and machine learning, with a focus on efficient hardware acceleration for sparse computation, in-memory computing, and large language models.

Before joining HKUST, I received my M.S. and B.S. in Microelectronics from Fudan University.

  • Focus: Architecture for sparse and LLM workloads
  • Method: Hardware–software co-design
  • Advisor: Prof. Yuan Xie

Research Focus

My work explores efficient AI computing systems through architecture innovation and accelerator design.

Selected Publications

*: Equal contribution

  1. Scalable Sparse Transformer Accelerator with In-Memory Butterfly Zero Skipper and Local Attention Reusable Engine for Irregular-Pruned NN

    Jiangnan Yu, et al.

    IEEE JETCAS, 2026 (accepted)

  2. DSCIM: Digital Stochastic Computing in Memory Featuring Accurate OR Accumulation for Edge AI Models

    Jiangnan Yu*, et al.

    DATE, 2026 (accepted)

  3. McPAL: Scaling Unstructured Sparse Inference with Multi-Chiplet HBM-PIM Architecture for LLMs

    Shiwei Liu, Jiangnan Yu, et al.

    DAC, 2025 (accepted)

  4. DIRC-RAG: Accelerating Edge RAG with Robust High-Density and High-Loading-Bandwidth Digital In-ReRAM Computation

    Jiangnan Yu*, et al.

    ISLPED, 2025

  5. FullSparse: A Sparse-Aware GEMM Accelerator with Online Sparsity Prediction

    Jiangnan Yu, Yang Fan, et al.

    CF, 2024

  6. TPNoC: An Efficient Topology Reconfigurable NoC Generator

    Jiangnan Yu, et al.

    GLSVLSI, 2023

  7. NNASIM: An Efficient Event-Driven Simulator for DNN Accelerators with Accurate Timing and Area Models

    X. Yi, Jiangnan Yu, et al.

    ISCAS, 2022

Misc

Hobbies: Running and hiking.