Reduction of interconnections in neural network memories

Date
2013
Publisher
UMT.Lahore
Abstract
This research proposes a detailed methodology for reducing the interconnections of neural associative memories implemented in Very Large Scale Integration (VLSI). Various data-storage techniques have been employed over the past few decades, with successive improvements in memory size, speed, and capacity. Content Addressable Memory (CAM) is a special type of memory that is faster than conventional memories because data is retrieved on the basis of partial information rather than separate address bits. Using the Hopfield neurobiological model, a computationally fast and inexpensive associative memory model is proposed. The proposed model is implemented in VLSI with a reduced input vector length, which substantially reduces the number of neurons that must be integrated on a chip. Reducing the input vector length reduces the number of interconnections in the CAM; consequently, the order of multiplications required to form the trans-conductance matrix decreases, leading to a functionally compact and cost-efficient CAM design. The proposed methodology is implemented in MATLAB/Simulink. The simulation results verify that a CAM designed using the reduced-vector-length neural associative memory model is highly efficient with respect to time, space, and cost. Although the formulation described here is hardware-independent, it provides a complete analytical basis for compact, efficient, and inexpensive CAM design using simulation tools.
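The abstract's core mechanism, a Hopfield-style associative memory whose trans-conductance (weight) matrix is formed by outer products of stored patterns, can be sketched as follows. This is a minimal illustrative sketch, not the thesis's MATLAB/Simulink implementation: the pattern length n = 8 and the example patterns are assumptions chosen for demonstration, and the final print shows why shrinking the input vector length shrinks the n × n interconnection matrix quadratically.

```python
import numpy as np

def train(patterns):
    """Form the n x n weight ("trans-conductance") matrix by summing
    Hebbian outer products of the stored bipolar (+/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)  # one n x n outer product per pattern
    np.fill_diagonal(W, 0)   # Hopfield model: no self-connections
    return W

def recall(W, probe, steps=10):
    """Retrieve a stored pattern from a partial/corrupted probe by
    iterating a synchronous sign-threshold update."""
    s = probe.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1  # break ties consistently
    return s

# Two orthogonal example patterns (assumed for illustration).
patterns = np.array([[1, 1, 1, 1,  1,  1,  1,  1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
W = train(patterns)

# Corrupt one bit of the first pattern; recall recovers it.
probe = patterns[0].copy()
probe[0] = -probe[0]
out = recall(W, probe)
print(np.array_equal(out, patterns[0]))  # True

# Interconnections scale as n^2, so halving the input vector
# length cuts both the weight-matrix entries and the outer-product
# multiplications per stored pattern by a factor of four.
n_full, n_reduced = 8, 4
print(n_full**2, n_reduced**2)  # 64 16
```

The quadratic count printed at the end is the quantity the proposed methodology targets: fewer weight-matrix entries mean fewer VLSI interconnections and fewer multiplications when the trans-conductance matrix is formed.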