
Evaluating Effect of Number Representations on the Accuracy of Convolutional Neural Networks

Aghamohammadi Bonab, Yeganeh | 2022

  1. Type of Document: M.Sc. Thesis
  2. Language: Farsi
  3. Document No: 55120 (19)
  4. University: Sharif University of Technology
  5. Department: Computer Engineering
  6. Advisor(s): Bayat Sarmadi, Siavash
  7. Abstract: Convolutional neural networks (CNNs) are a class of neural networks widely used in machine vision and image processing. Their accuracy depends on several factors, such as network size and input size. Today, researchers improve the accuracy of neural networks by increasing their size; as a result, the networks' computational cost grows as well, and the larger a network is, the harder its hardware implementation becomes. One proposed solution to this issue is to change the number representation while preserving the network's accuracy. Floating-point arithmetic is costly to implement in hardware: compared to a fixed-point implementation, it consumes more power and resources and achieves lower throughput. Researchers have shown that fixed-point representation only slightly affects the accuracy of neural networks. In this research, we explore different methods for converting floating-point numbers to fixed-point and the effect of fixed-point representation on network accuracy. Since common machine-learning frameworks do not support fixed-point representation, we evaluate its effect on accuracy by implementing the inference phase of neural networks in fixed-point arithmetic. We then evaluate the overhead of different rounding methods on hardware accelerators. Results show that LeNet and AlexNet can preserve the model's accuracy with 4-bit and 8-bit fixed-point representations, respectively.
  8. Keywords: Convolutional Neural Network ; Randomized Rounding ; Field Programmable Gate Array (FPGA) ; Fixed-Point Representation ; Floating-Point Representation
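The conversion the abstract describes — quantizing floating-point values to a signed fixed-point format under different rounding modes, including the randomized (stochastic) rounding named in the keywords — can be sketched as follows. This is a minimal illustration, not the thesis's implementation; the function name, parameters, and the choice of saturating arithmetic are assumptions for the example.

```python
import math
import random

def to_fixed(x, total_bits, frac_bits, mode="nearest"):
    """Quantize float x to a signed fixed-point value with `total_bits`
    total width and `frac_bits` fractional bits, returning the real
    number that the fixed-point code represents.

    mode="nearest"    -> round to the nearest representable value
    mode="stochastic" -> randomized rounding: round up with probability
                         equal to the fractional remainder (unbiased in
                         expectation)
    """
    scale = 1 << frac_bits
    scaled = x * scale
    if mode == "nearest":
        q = int(round(scaled))
    else:  # stochastic / randomized rounding
        lo = math.floor(scaled)
        q = lo + (1 if random.random() < (scaled - lo) else 0)
    # Saturate to the signed range representable in total_bits.
    q = max(-(1 << (total_bits - 1)), min((1 << (total_bits - 1)) - 1, q))
    return q / scale

# Example: an 8-bit format with 4 fractional bits, as in the AlexNet result.
print(to_fixed(0.3, 8, 4))                  # nearest: 5/16 = 0.3125
print(to_fixed(100.0, 8, 4))                # saturates at 127/16 = 7.9375
print(to_fixed(0.3, 8, 4, "stochastic"))    # 0.25 or 0.3125, at random
```

Stochastic rounding matters for low-precision networks because, unlike round-to-nearest, its quantization error is zero-mean, so small updates and activations are not systematically rounded away.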
