ULSeq-TA: Ultra-Long Sequence Attention Fusion Transformer Accelerator Supporting Grouped Sparse Softmax and Dual-Path Sparse LayerNorm

IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS (2024)

Keywords
Transformers, Task analysis, System-on-chip, Decoding, Sparse matrices, Hardware, Transformer cores, Long sequence, software-hardware co-design, sparse LayerNorm, sparse Softmax, transformer accelerator