A bunch of numbers go in. The numbers get combined with other numbers called weights to make new numbers. This step repeats a few times, and each combination step is called a layer. The final layer gives the output numbers. All the weights start out random, so the first outputs aren't very good. Using math, the network compares its outputs to the right answers and nudges the weights a little better each time. After many rounds of this, the result is a trained neural network.
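If the words feel too abstract, here is a tiny sketch of the same idea. It is written in Python with numpy, which is just our own choice for illustration (the text above names no library or task): numbers go in, two layers of weights combine them, the weights start random, and a little math nudges them better on every pass.

import numpy as np

rng = np.random.default_rng(0)

# Toy task (our assumption): learn XOR. Four input pairs, four target numbers.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# All weights start out random.
W1 = rng.normal(size=(2, 4))   # layer 1: combines 2 input numbers into 4 new numbers
W2 = rng.normal(size=(4, 1))   # layer 2: combines those 4 numbers into 1 output number

def sigmoid(z):
    # Squashes any number into the range 0..1.
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Numbers go in, each layer combines them into new numbers.
    h = sigmoid(X @ W1)        # first layer
    out = sigmoid(h @ W2)      # final layer: the output numbers

    # "Using math the weights get improved gradually": compare the outputs
    # to the right answers and nudge each weight downhill a little.
    err = out - y
    grad_out = err * out * (1 - out)
    grad_W2 = h.T @ grad_out
    grad_h = grad_out @ W2.T * h * (1 - h)
    grad_W1 = X.T @ grad_h

    W2 -= 0.5 * grad_W2
    W1 -= 0.5 * grad_W1

print(out.round(2))  # after enough rounds these move toward [[0], [1], [1], [0]]

Every sentence of the paragraph shows up here: the numbers going in are X, the two weight matrices are the layers, they start random and are not very good at first, and the subtractions at the end are the math that improves them a little each time.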
Remember: a 5-year-old should get the concept.