This paper focuses on proving the four fundamental equations of backpropagation. It then shows how to combine this algorithm with stochastic gradient descent to implement a feedforward network that recognizes handwritten digits. Parts of the proofs follow Michael Nielsen's online book Neural Networks and Deep Learning; this paper supplies additional detail for those proofs, along with basic definitions of the gradient.
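For reference, the four equations in question are commonly stated as follows in Nielsen's notation (a sketch assuming that notation: $\delta^l$ is the error in layer $l$, $w^l$ and $b^l$ are the weights and biases, $z^l$ the weighted inputs, $a^l$ the activations, $\sigma$ the activation function, and $C$ the cost):

\begin{align}
\delta^L &= \nabla_a C \odot \sigma'(z^L) && \text{(BP1: error in the output layer } L\text{)}\\
\delta^l &= \big((w^{l+1})^T \delta^{l+1}\big) \odot \sigma'(z^l) && \text{(BP2: error propagated backward)}\\
\frac{\partial C}{\partial b^l_j} &= \delta^l_j && \text{(BP3: gradient with respect to biases)}\\
\frac{\partial C}{\partial w^l_{jk}} &= a^{l-1}_k\, \delta^l_j && \text{(BP4: gradient with respect to weights)}
\end{align}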