DEV Community

Lini Abraham

AI Terms

Weights

A number that says how important an input is.

High weight = input is very important.
Low weight = input barely matters.
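A minimal Python sketch of this idea (the numbers are made up for illustration):

```python
# Each input is multiplied by its weight; a bigger weight means more influence.
inputs = [2.0, 3.0]
weights = [0.9, 0.01]  # the first input matters a lot, the second barely at all

weighted_sum = sum(x * w for x, w in zip(inputs, weights))
print(round(weighted_sum, 2))  # 1.83: dominated by the first input
```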

Biases

A number that shifts the output up or down, no matter what the inputs are.

A default setting.
Even if the input is zero, the neuron can still produce something.
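A minimal Python sketch (`neuron_output` is a hypothetical helper, not a library function):

```python
# The bias is added after the weighted sum, so it shifts the output
# even when every input is zero.
def neuron_output(inputs, weights, bias):
    return sum(x * w for x, w in zip(inputs, weights)) + bias

print(neuron_output([0.0, 0.0], [0.5, 0.5], bias=0.7))  # 0.7, despite zero inputs
```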

Activation functions

A function that decides if a neuron should “fire” or how strong its signal should be.

It's like a gatekeeper that only lets important signals through.
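ReLU is one common gatekeeper; a minimal Python sketch:

```python
# ReLU (rectified linear unit): passes positive signals, blocks negative ones.
def relu(x):
    return max(0.0, x)

print(relu(2.5))   # 2.5: the signal passes through
print(relu(-1.3))  # 0.0: the signal is blocked
```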

Feed-forward propagation

The process where the input goes through the network and creates an output.

Information moves forward only.
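A toy two-layer network in Python (all weights and biases are made-up values), showing information moving strictly forward:

```python
# Forward pass: input -> hidden layer -> output layer, never backwards.
def relu(x):
    return max(0.0, x)

def layer(inputs, weights, biases):
    # one output per neuron: weighted sum + bias, then activation
    return [relu(sum(x * w for x, w in zip(inputs, row)) + b)
            for row, b in zip(weights, biases)]

x = [1.0, 2.0]
hidden = layer(x, [[0.5, -0.5], [0.25, 0.25]], [0.0, 0.1])
output = layer(hidden, [[1.0, 1.0]], [0.0])
print([round(o, 2) for o in output])  # [0.85]
```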

Back propagation

The process where the network learns from mistakes by adjusting weights and biases.

It's similar to a correction loop: you make a mistake, figure out what went wrong, and adjust your thinking.
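A minimal correction loop for a single weight (a simplified stand-in for full backpropagation; the numbers are made up):

```python
# We want w * x to hit the target, so w should end up near 5.
x, target = 2.0, 10.0
w, lr = 0.0, 0.1

for _ in range(100):
    pred = w * x
    grad = 2 * (pred - target) * x  # slope of the squared error with respect to w
    w -= lr * grad                  # adjust the weight to reduce the error

print(round(w, 3))  # 5.0
```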

L1 and L2 Regularisation

Tricks to stop the model from memorizing too much (overfitting).

It's a discipline rule that keeps the model simple and focused.

L1: Can make the model ignore useless inputs completely.
L2: Smooths out the model’s focus to avoid extreme values.
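A minimal sketch of the two penalties (`lam`, the penalty strength, is a made-up value):

```python
weights = [3.0, -0.5, 0.0, 2.0]
lam = 0.1  # penalty strength (a setting you choose)

l1_penalty = lam * sum(abs(w) for w in weights)   # pushes weights toward exactly zero
l2_penalty = lam * sum(w ** 2 for w in weights)   # punishes extreme values hardest

print(round(l1_penalty, 3))  # 0.55
print(round(l2_penalty, 3))  # 1.325
```

The penalty is added to the loss, so the model pays a price for large weights.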

Gradients

A slope that tells the network how to change the weights and biases to get better.

The gradient shows the direction and size of the correction needed.
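A minimal sketch that estimates a gradient numerically (the loss function here is made up):

```python
def loss(w):
    return (w - 3.0) ** 2  # lowest at w = 3

def numeric_gradient(f, w, eps=1e-6):
    # slope of f at w, estimated from two nearby points
    return (f(w + eps) - f(w - eps)) / (2 * eps)

g = numeric_gradient(loss, 5.0)
print(round(g, 3))  # 4.0: positive slope, so decrease w to get better
```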

| Gradient Descent | Gradient Ascent |
| --- | --- |
| Moves in the direction that reduces the output (minimizes loss) | Moves in the direction that increases the output (maximizes the objective) |
| Goal: find the lowest point (minimum) | Goal: find the highest point (maximum) |
| Used for: minimizing error/loss | Used for: maximizing likelihood or rewards (e.g. in reinforcement learning) |
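A minimal sketch of both directions on made-up one-dimensional functions:

```python
# Descent: subtract the gradient to walk down a valley, (w - 3)^2.
w_min = 0.0
for _ in range(200):
    w_min -= 0.1 * (2 * (w_min - 3.0))

# Ascent: add the gradient to climb a hill, -(w - 3)^2.
w_max = 0.0
for _ in range(200):
    w_max += 0.1 * (-2 * (w_max - 3.0))

print(round(w_min, 3), round(w_max, 3))  # both settle at 3.0
```

Same update rule, opposite sign: one walks downhill, the other uphill.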

Cost/Loss Function

A Cost Function is a formula that tells you how wrong your model's predictions are.
It compares the predicted value with the actual value.

The goal of training an AI model is to reduce the cost so that predictions get closer to the true answers.
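Mean squared error is one common cost function; a minimal Python sketch:

```python
# Average of the squared differences between predictions and true values.
def mse(predictions, actuals):
    return sum((p - a) ** 2 for p, a in zip(predictions, actuals)) / len(actuals)

print(round(mse([2.5, 0.0, 2.0], [3.0, -0.5, 2.0]), 4))  # 0.1667
```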

Hyperparameters

The settings you choose before training the model.

For example, how fast to learn (learning rate), how many neurons to use, etc.
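A minimal sketch of what those settings might look like (names and values are illustrative, not from any particular library):

```python
# Chosen before training starts; the model never changes these itself.
hyperparameters = {
    "learning_rate": 0.01,  # how fast to learn
    "hidden_neurons": 64,   # how many neurons in the hidden layer
    "epochs": 10,           # how many passes over the training data
}
print(sorted(hyperparameters))
```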
