STT-BNN: A Novel STT-MRAM In-Memory Computing Macro for Binary Neural Networks

Thi Nhan Pham, Quang Kien Trinh, Ik Joon Chang, Massimo Alioto

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)

Abstract

This paper presents a novel architecture for in-memory computation of binary neural network (BNN) workloads based on STT-MRAM arrays. In the proposed architecture, BNN inputs are fed through the bitlines, and a BNN vector multiplication is performed by a single sensing of the merged source-line (SL) voltage of a row. The design allows unrestricted accumulation across rows for full utilization of the array and BNN model scalability, and overcomes the sensing-circuit challenges posed by the low tunneling magnetoresistance ratio (TMR) of regular STT-MRAM. Circuit techniques are introduced in the periphery to make the energy-speed-area-robustness tradeoff more favorable. In particular, time-based sensing (TBS) and boosting are introduced to enhance the accuracy of the BNN computations. System simulations show 80.01% (98.42%) accuracy on the CIFAR-10 (MNIST) dataset under the effect of local and global process variations, corresponding to an 8.59% (0.38%) accuracy loss compared to the original BNN software implementation, while achieving an energy efficiency of 311 TOPS/W.
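For readers less familiar with BNN in-memory computing, the sketch below illustrates the functional operation the macro implements: a binary vector multiplication, which on {-1, +1} operands reduces to an XNOR-and-popcount, with the popcount accumulated in analog on a shared line and resolved by a single thresholded sense. This is a minimal Python model of the computation only; the function names, the unit-contribution analog model, and the threshold parameter are illustrative assumptions, not the authors' circuit.

    import numpy as np

    def bnn_dot_xnor_popcount(inputs, weights):
        """Reference BNN vector multiply for inputs/weights in {-1, +1}.
        Equivalent to XNOR-popcount on the {0, 1} bit encoding."""
        assert inputs.shape == weights.shape
        in_bits = (inputs > 0).astype(int)
        w_bits = (weights > 0).astype(int)
        matches = np.sum(in_bits == w_bits)   # popcount of the XNOR result
        n = inputs.size
        return 2 * matches - n                # signed dot product in [-n, n]

    def row_sense_binarize(inputs, weights, threshold=0.0):
        """Hypothetical behavioral model of the macro's row operation:
        each agreeing bit cell adds one unit to the merged source-line
        signal, and a single sense compares the accumulated value against
        a threshold to produce the binarized activation."""
        dot = bnn_dot_xnor_popcount(inputs, weights)
        return 1 if dot >= threshold else -1

    # Example: one 64-element binary input vector against one weight row.
    rng = np.random.default_rng(0)
    x = rng.choice([-1, 1], size=64)
    w = rng.choice([-1, 1], size=64)
    print(bnn_dot_xnor_popcount(x, w), row_sense_binarize(x, w))

In the physical array, the popcount is not computed digitally but emerges as a merged SL voltage, which is why sensing accuracy (and hence techniques such as TBS and boosting) directly determines BNN inference accuracy.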

Original language: English
Pages (from-to): 569-579
Number of pages: 11
Journal: IEEE Journal on Emerging and Selected Topics in Circuits and Systems
Volume: 12
Issue number: 2
DOIs
Publication status: Published - 1 Jun 2022

Bibliographical note

Publisher Copyright:
© 2011 IEEE.

Keywords

  • In-memory computing
  • STT-MRAM
  • binary neural networks
  • energy efficiency
