[b]Artificial Neural Networks[/b] (ANNs) are a specific tool for implementing ML. [br][br]Mathematically, ANNs are composite functions built from simple mathematical functions like linear ([math]y=mx+b[/math]) and exponential ([math]y=e^x[/math]) functions. Calculus is used to "train" these composite functions to perform various tasks.[br][br]In the applet below you can watch an ANN perform machine learning with the push of a button. Each time you press the button, backpropagation "learns" how to adjust the weights (the w's) so that [math]out_{o1}[/math] and [math]out_{o2}[/math] move closer to the targets [math]o_1[/math] and [math]o_2[/math]. Each press of the button logs one epoch (one learning cycle). Backpropagation is the name of the calculus-based procedure the network uses to solve this optimization problem.[br][br]TRY IT! If anything goes wrong, click the refresh button in the top right of the applet. We'll discuss what's going on as a group.[br]
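If you like code, here's a minimal Python sketch (not the applet's actual code; the m and b values are made up) of what "composite function" means here: a linear function fed into a sigmoid, which is itself built from the exponential function.[br]
[code]
import math

# A minimal sketch of a "composite function": a linear function
# y = m*x + b fed into a sigmoid, 1/(1 + e^(-x)), which is built
# from the exponential function. The m and b values here are
# arbitrary; in an ANN they are the weights that training adjusts.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def composite(x, m=0.5, b=0.1):
    return sigmoid(m * x + b)

print(composite(2.0))  # one input flowing through linear -> sigmoid
[/code]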
How many epochs does it take to get the calculated outputs to within 2 decimal places of the targets?
Reset the app with the button in the top right. If the learning rate is increased to 5, how many epochs does it take to get the calculated outputs to within 2 decimal places of the targets?
The inputs together with your targets are often called [b]training data[/b].[br][br]The [b]biases[/b] are optional fixed numbers that get added to the net values (the net's in the diagram). In this applet they are user-specified parameters rather than "learnable" ones, and they are set to 0 for simplicity.[br][br]The E's on the right are [b]errors (or loss)[/b]: a measurement of how close the computed outputs are to your targets. In this example we're using mean squared error as the error measure.[br][br]The [b]learning rate[/b] scales the impact of each epoch of backpropagation on the weights. Higher learning rates make bigger changes. Careful though! A learning rate that's too high might jump your ANN to a much worse error!
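Here's a tiny Python illustration of the error and learning-rate ideas above. It's a sketch, not the applet's code: the error form ([math]\frac{1}{2}(target-out)^2[/math]) and the weight and gradient numbers are assumptions.[br]
[code]
# A tiny illustration (not the applet's code) of error and learning rate.
# The error form and all numbers below are made-up assumptions.

def squared_error(target, out):
    # The E boxes: one common squared-error form, 1/2 * (target - out)^2
    return 0.5 * (target - out) ** 2

print(squared_error(0.2, 0.65596))  # error for out_o1 vs. target o1

weight, gradient = 0.4, 0.08        # pretend backpropagation found dE/dw = 0.08
for lr in (0.5, 5):
    print(lr, weight - lr * gradient)  # bigger learning rate -> bigger jump
# A step that's too big can overshoot the minimum and land at a worse error.
[/code]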
An ANN takes input data and uses it to compute outputs that [b]target[/b] numbers you specify. [br][br]An ANN usually starts its life doing a very bad job of computing outputs close to your targets, but ANNs use calculus to "[b]learn[/b]" how to get their calculated outputs closer to your specified targets. [br][br]You can see an ANN below. I know it looks insanely complicated, but I promise it's not too bad. :)[br][br]There are two inputs, [math]i_1=2[/math] and [math]i_2=3[/math]. Do you see them on the left? You can change them if you want, but don't just yet.[br][br]There are two targets, [math]o_1=0.2[/math] and [math]o_2=0.3[/math]. Do you see them on the right? You can also change these, but don't just yet.[br][br]The two outputs the ANN has computed are [math]out_{o1}=0.65596[/math] and [math]out_{o2}=0.68926[/math]. Do you see them in the gray boxes? You can't change these; they have been computed by the ANN.[br][br]You don't need to know too much about what's in between (all the w's and net's and [math]\frac{1}{1+e^{-x}}[/math]'s), but I'll talk about them a little bit. If anyone's curious you can read more [url=https://docs.paperspace.com/machine-learning/wiki/weights-and-biases]here[/url].[br][br]The cool thing about ANNs is that they use calculus to LEARN how to adjust the w's to get the computed outputs, [math]out_{o1}[/math] and [math]out_{o2}[/math], closer to your targets, [math]o_1[/math] and [math]o_2[/math]. This process is called [b]backpropagation[/b], and each iteration of backpropagation is called an [b]epoch[/b].
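If you'd like to see the whole idea in code, here is a minimal Python sketch of a few epochs of backpropagation. It is a simplification, not the applet's actual code: I've assumed a network with no hidden layer, made-up starting weights, and the [math]\frac{1}{2}(target-out)^2[/math] error form, but the calculus is the same idea.[br]
[code]
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and targets from the applet
i1, i2 = 2.0, 3.0
o1, o2 = 0.2, 0.3

# Made-up starting weights (the applet has more w's than this sketch)
w = [0.15, 0.20, 0.25, 0.30]
lr = 0.5  # learning rate

for epoch in range(3):  # each pass = one epoch (one button press)
    # Forward pass: weighted sums (the "net's"), then sigmoid
    net1 = w[0] * i1 + w[1] * i2
    net2 = w[2] * i1 + w[3] * i2
    out1, out2 = sigmoid(net1), sigmoid(net2)

    # Squared error for each output
    E1 = 0.5 * (o1 - out1) ** 2
    E2 = 0.5 * (o2 - out2) ** 2

    # Backpropagation: the chain rule gives dE/dw for every weight
    d1 = (out1 - o1) * out1 * (1 - out1)  # dE1/dnet1
    d2 = (out2 - o2) * out2 * (1 - out2)  # dE2/dnet2
    grads = [d1 * i1, d1 * i2, d2 * i1, d2 * i2]

    # Gradient descent step, scaled by the learning rate
    w = [wi - lr * g for wi, g in zip(w, grads)]

    print(f"epoch {epoch + 1}: out_o1={out1:.5f}, out_o2={out2:.5f}, E={E1 + E2:.5f}")
[/code]
Run it and you'll see the outputs creep toward the targets and the error shrink with every epoch, just like pressing the button in the applet.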