Abstract
The effect of the number of weight quantization levels on the performance of a
capsule neural network is studied. The network is evaluated on the MNIST digit
recognition task. It is shown that 16 levels of uniform quantization, or 8 levels
of exponential quantization, are sufficient to match the performance of the
network with continuous weights. The quantized weights of the capsule neural
network can be implemented with multilevel memristors.
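The two quantization schemes compared in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the uniform scheme snaps weights to evenly spaced levels over the weight range, and the exponential scheme is assumed here to snap weight magnitudes to powers of two (the paper does not specify its exact form in this excerpt).

```python
import numpy as np

def uniform_quantize(w, levels=16):
    """Snap weights to `levels` evenly spaced values over [min, max]."""
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((w - lo) / step) * step

def exponential_quantize(w, levels=8):
    """Snap weight magnitudes to powers of two below the max magnitude.

    This power-of-two form is an assumption for illustration; the sign
    of each weight is preserved and zero is kept as an extra level.
    """
    sign = np.sign(w)
    mag = np.abs(w)
    # Available magnitudes: max, max/2, max/4, ..., plus exact zero.
    exps = mag.max() * 2.0 ** (-np.arange(levels))
    candidates = np.concatenate(([0.0], exps))
    # Snap each magnitude to the nearest candidate level.
    idx = np.argmin(np.abs(mag[..., None] - candidates), axis=-1)
    return sign * candidates[idx]

w = np.linspace(-1.0, 1.0, 100)
print(len(np.unique(uniform_quantize(w, levels=16))))       # at most 16 values
print(len(np.unique(np.abs(exponential_quantize(w, 8)))))   # at most 9 magnitudes
```

With 16 uniform levels the weight range is covered with step (max-min)/15, while 8 exponential levels concentrate resolution near zero, which is why fewer of them can suffice.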
Pages 31-35