Design of Fully Analogue Artificial Neural Network with Learning Based on Backpropagation
A fully analogue implementation of training algorithms would speed up the training of artificial neural networks. A common choice for training feedforward networks is backpropagation with stochastic gradient descent; however, a circuit design enabling its fully analogue implementation remains an open problem. This paper proposes a fully analogue training circuit block concept based on backpropagation for neural networks without clock control. In the presented example, capacitors serve as the memory elements, and the XOR problem is used for concept-level system validation.
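For reference, the training scheme the paper realizes in analogue hardware corresponds, in the digital domain, to per-sample backpropagation with stochastic gradient descent. The following sketch is a hypothetical, conventional software counterpart (a small 2-4-1 feedforward network on XOR); the network size, learning rate, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights play the role that capacitor voltages play in the analogue
# circuit: continuously adjustable memory elements (sizes are illustrative).
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros(1)

lr = 0.5
for step in range(20000):
    i = rng.integers(4)          # stochastic: one random sample per update
    x, t = X[i], y[i]
    # forward pass
    h = sigmoid(x @ W1 + b1)
    o = sigmoid(h @ W2 + b2)
    # backward pass: chain rule through both sigmoid layers
    d_o = (o - t) * o * (1 - o)
    d_h = (d_o @ W2.T) * h * (1 - h)
    # gradient-descent updates
    W2 -= lr * np.outer(h, d_o)
    b2 -= lr * d_o
    W1 -= lr * np.outer(x, d_h)
    b1 -= lr * d_h

pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(pred.ravel())
```

In the analogue circuit these discrete update steps have no direct equivalent: without clock control, the weight adjustments happen continuously as currents charge and discharge the memory capacitors.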
Document typePeer reviewed
Document versionFinal PDF
Source: Radioengineering, 2021, vol. 30, no. 2, pp. 357-363. ISSN 1210-2512