Article

Intrinsic variation effect in memristive neural network with weight quantization

Journal

NANOTECHNOLOGY
Volume 33, Issue 37

Publisher

IOP Publishing Ltd
DOI: 10.1088/1361-6528/ac7651

Keywords

memristor crossbar array; memristive neural network; neuromorphic system; weight quantization; off-chip training; intrinsic variation

Funding

  1. NRF - Korean government [2020M3H5A1081111, 2020M3F3A2A01081656, 2021R1C1C1014530]
  2. MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program [IITP-2021-0-02052]
  3. Brain Korea 21 Four Program
  4. National Research Foundation of Korea [2021R1C1C1014530, 2020M3F3A2A01081656] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Summary

In this study, the intrinsic variations in memristor devices and their impact on the neuromorphic system were analyzed. A memristor crossbar array was fabricated, and 3-bit multilevel conductance was implemented as weight quantization to minimize performance degradation in the neural network. The study verified tuning operations, endurance, and retention characteristics, and measured random telegraph noise (RTN) characteristics. A memristive neural network was constructed using off-chip training and evaluated for classification accuracy by applying intrinsic variations to the quantized weights. The results highlight the importance of considering intrinsic variations when transferring pre-trained weights to a memristive neural network through off-chip training.

Abstract

To analyze the effect of the intrinsic variations of the memristor device on the neuromorphic system, we fabricated a 32 × 32 Al2O3/TiOx-based memristor crossbar array and implemented 3-bit multilevel conductance as weight quantization by utilizing the switching characteristics to minimize the performance degradation of the neural network. The tuning operation for 8 weight levels was confirmed with a tolerance of ±4 μA (±40 μS). The endurance and retention characteristics were also verified, and the random telegraph noise (RTN) characteristics were measured according to the weight range to evaluate the internal stochastic variation effect. Subsequently, a memristive neural network was constructed by off-chip training with differential memristor pairs for the Modified National Institute of Standards and Technology (MNIST) handwritten digit dataset. The pre-trained weights were quantized, and the classification accuracy was evaluated by applying the intrinsic variations to each quantized weight. The intrinsic variations were applied using the measured weight inaccuracy given by the tuning tolerance, the RTN characteristics, and the fault device yield. We believe these results should be considered when pre-trained weights are transferred to a memristive neural network by off-chip training.
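To make the evaluation flow concrete, the sketch below shows one way the described procedure could be simulated: pre-trained weights are quantized onto 8 conductance levels using differential memristor pairs, and each programmed conductance is perturbed by a Gaussian term standing in for the ±40 μS tuning tolerance. This is a minimal illustration under stated assumptions, not the authors' code; the conductance window, the noise model, and the names quantize_to_pairs, apply_intrinsic_variation, and effective_weights are all hypothetical.

```python
import numpy as np

# --- Illustrative parameters (assumptions, not the paper's measured values) ---
G_MIN, G_MAX = 40e-6, 320e-6   # assumed programmable conductance window (S)
N_LEVELS = 8                    # 3-bit multilevel conductance (8 weight levels)
TOL_SIGMA = 40e-6 / 3           # treat the +/-40 uS tuning tolerance as ~3 sigma

LEVELS = np.linspace(G_MIN, G_MAX, N_LEVELS)

def quantize_to_pairs(weights):
    """Map signed weights onto differential memristor pairs (G+, G-).

    Positive weights are programmed on G+ and negative weights on G-;
    the unused device of each pair stays at the lowest level. Each target
    conductance is snapped to the nearest of the 8 programmable levels.
    """
    scale = (G_MAX - G_MIN) / np.max(np.abs(weights))
    g_pos = np.where(weights > 0, G_MIN + weights * scale, G_MIN)
    g_neg = np.where(weights < 0, G_MIN - weights * scale, G_MIN)
    snap = lambda g: LEVELS[np.argmin(np.abs(g[..., None] - LEVELS), axis=-1)]
    return snap(g_pos), snap(g_neg), scale

def apply_intrinsic_variation(g, rng):
    """Perturb programmed conductances with a Gaussian tuning/write error."""
    return np.clip(g + rng.normal(0.0, TOL_SIGMA, g.shape), G_MIN, G_MAX)

def effective_weights(g_pos, g_neg, scale):
    """Recover effective network weights from the differential pairs."""
    return (g_pos - g_neg) / scale

# Example: perturb a stand-in "pre-trained" layer and measure the weight error.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(784, 10))           # stand-in MNIST output layer
g_pos, g_neg, scale = quantize_to_pairs(w)
w_ideal = effective_weights(g_pos, g_neg, scale)   # quantized, no variation
w_noisy = effective_weights(apply_intrinsic_variation(g_pos, rng),
                            apply_intrinsic_variation(g_neg, rng), scale)
print("mean |weight error| from variation:", np.mean(np.abs(w_noisy - w_ideal)))
```

In the setting the abstract describes, classification accuracy would then be obtained by running MNIST inference with the perturbed weights in place of the ideal quantized ones; RTN and faulty devices could be modeled as additional perturbation terms inside apply_intrinsic_variation.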
