SpAtten: Efficient Sparse Attention Architecture with Cascade Token and Head Pruning
HPCA (2021)
Keywords: Natural Language Processing, Attention, Domain-Specific Accelerator, Algorithm-Architecture Co-design, Pruning, Quantization