NOTES ABOUT MACHINE LEARNING AND ARTIFICIAL INTELLIGENCE (AI)
Notes:
Vectors and matrices are the basic data structures of machine learning.
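A minimal NumPy sketch of these building blocks (the values are illustrative, not from the notes):

```python
import numpy as np

# Vectors and matrices in NumPy: the core data structures of ML.
v = np.array([1.0, 2.0, 3.0])          # a vector (1-D array)
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])        # a 2x3 matrix

y = M @ v      # matrix-vector product, e.g. one linear layer: y = M v
dot = v @ v    # dot product of a vector with itself
print(y)       # [7. 5.]
print(dot)     # 14.0
```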
Transfer learning: reuse existing pre-trained models, adapting (fine-tuning) them to a new task instead of training from scratch.
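A hypothetical plain-NumPy sketch of the idea, under the assumption that a hidden layer was already trained elsewhere: the "pretrained" layer is frozen and reused as a feature extractor, and only a new output head is trained on the target task. All names and data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these weights were learned earlier on a large source task.
W_pretrained = rng.normal(size=(2, 8))

def features(X):
    # Frozen layer: the existing model is reused, never updated.
    return np.tanh(X @ W_pretrained)

# Small target task (XOR labels); only the head is trained on it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = features(X)        # fixed features from the reused model
w_head = np.zeros(8)   # new task-specific parameters
b_head = 0.0
lr = 1.0

for _ in range(5000):
    p = 1 / (1 + np.exp(-(H @ w_head + b_head)))  # forward through the head
    grad = (p - y) / len(y)                       # cross-entropy gradient
    w_head -= lr * H.T @ grad                     # only the head is updated
    b_head -= lr * grad.sum()

preds = (p > 0.5).astype(int)
```

The same split (frozen base, trainable head) is what frameworks automate when fine-tuning large models.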
Local setup (Pluralsight Applied Machine Learning course):
/Documents/PLURALSIGHT/datascience/Applied-Machine-Learning-main
source /Users/santosj/Documents/PLURALSIGHT/datascience/bin/activate
jupyter notebook /Users/santosj/Documents/PLURALSIGHT/datascience/Applied-Machine-Learning-main
Training a neural network involves four key steps:
1. Forward pass: input data passes through each layer of the network, applying weights, biases, and activation functions, to produce a prediction at the output layer.
2. Loss computation: the model's prediction is compared to the actual label using a loss function (e.g., cross-entropy for classification). This produces a scalar loss value representing the prediction error.
3. Backpropagation: gradients of the loss are calculated with respect to each weight and bias. The chain rule is applied layer by layer, from output back to input, determining how much each parameter contributed to the loss.
4. Parameter update: gradients are used to adjust weights and biases in the direction that reduces the loss. This is done via gradient descent or variants like Adam, guided by the learning rate.
These steps repeat over many data samples, improving the network’s performance through each iteration.
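The four steps above can be sketched end-to-end in plain NumPy. This toy example (a single sigmoid neuron learning the OR function; all data and names are illustrative) runs the forward pass, loss, backpropagation, and update in each iteration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])    # OR labels

w = rng.normal(size=2)   # weights (parameters)
b = 0.0                  # bias (parameter)
lr = 1.0                 # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(500):
    # 1. Forward pass: weights, bias, activation -> prediction
    p = sigmoid(X @ w + b)
    # 2. Loss: binary cross-entropy, a scalar measuring prediction error
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    # 3. Backpropagation: gradients of the loss w.r.t. each parameter
    grad_z = (p - y) / len(y)      # dLoss/dz via the chain rule
    grad_w = X.T @ grad_z
    grad_b = grad_z.sum()
    # 4. Update: gradient descent step scaled by the learning rate
    w -= lr * grad_w
    b -= lr * grad_b

preds = (p > 0.5).astype(int)
```

Repeating the loop over many samples is what drives the loss down, exactly as described above; a deeper network applies step 3 layer by layer instead of in one line.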
Current practical models (it is important to check they are supported by Ollama):
https://github.com/ollama/ollama/blob/main/docs/gpu.md
PRACTICAL NOTES ON MODELS:
tensors
weight and bias « when defining a model, weights and biases are called, generically, parameters.
embedding « embeddings capture the inherent properties and relationships of the original data in a condensed format and are often used in Machine Learning use cases. See Link « better classification.
mixture of experts (MoE) « a neural network design that activates only a few specialised sub-models (experts) per input, based on a gating mechanism. This allows models to scale to massive sizes efficiently, improving performance while reducing compute costs by avoiding the need to use the entire model every time.
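The gating idea behind mixture of experts can be sketched in a few lines of NumPy. This is a hypothetical toy layer (all sizes and weights are illustrative): a gate scores the experts for each input, and only the top-k experts are evaluated and combined.

```python
import numpy as np

rng = np.random.default_rng(2)
n_experts, d_in, d_out, top_k = 4, 3, 2, 2

W_gate = rng.normal(size=(d_in, n_experts))          # gating network
W_experts = rng.normal(size=(n_experts, d_in, d_out))  # one weight matrix per expert

def moe_forward(x):
    scores = x @ W_gate                   # gating scores, one per expert
    top = np.argsort(scores)[-top_k:]     # indices of the top-k experts
    gate = np.exp(scores[top])
    gate /= gate.sum()                    # softmax over the chosen experts only
    # Only the selected experts run, saving the compute of the others.
    return sum(g * (x @ W_experts[i]) for g, i in zip(gate, top))

x = rng.normal(size=d_in)
y = moe_forward(x)
print(y.shape)   # output combines just top_k of the n_experts sub-models
```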