C++ - optimizing gradient descent.

571 views

Carlo Wood

15 days ago

It's time to refactor stuff... I have the classes `Function`, `Sample`, `Scale`, `SampleMinimum`, `History`, `KineticEnergy`, and the following variables that exist outside the (epoch) loop: `Function L`, `double learning_rate`, `History history`, `Sample new_sample`, `Scale scale`, `KineticEnergy energy`, `int vdirection`, `int hdirection`, `std::list<SampleMinimum> extremes`, `best_minimum` and `last_extreme`.
We need a State object that holds all the internal state needed to find the best global minimum, perhaps `gradient_descent::State`. This class will then contain all of the above variables (see the sketch below).
You can follow the progress of the project as a whole here: github.com/CarloWood/cairowindow
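For illustration, a minimal sketch of what such a `gradient_descent::State` could look like, assuming the members keep the types listed above and that `best_minimum` and `last_extreme` are iterators into `extremes`; the empty structs are stand-ins for the real cairowindow classes, not their actual definitions:

```cpp
#include <list>

// Placeholder types: the real project (cairowindow) defines these classes;
// they are stubbed out here only so the sketch is self-contained.
struct Function {};
struct Sample {};
struct Scale {};
struct SampleMinimum {};
struct History {};
struct KineticEnergy {};

namespace gradient_descent {

// All internal state needed to find the best global minimum, gathered into
// a single object so the epoch loop only has to deal with one variable.
class State
{
 private:
  Function L_;                                        // The function being minimized.
  double learning_rate_;                              // Current learning rate.
  History history_;                                   // Samples taken so far.
  Sample new_sample_;                                 // Most recently taken sample.
  Scale scale_;                                       // Scale bookkeeping.
  KineticEnergy energy_;                              // Kinetic energy bookkeeping.
  int vdirection_;                                    // Vertical direction of the search.
  int hdirection_;                                    // Horizontal direction of the search.
  std::list<SampleMinimum> extremes_;                 // Extremes found so far.
  std::list<SampleMinimum>::iterator best_minimum_;   // Assumed: iterator into extremes_.
  std::list<SampleMinimum>::iterator last_extreme_;   // Assumed: iterator into extremes_.

 public:
  State(Function const& L, double learning_rate) :
    L_(L), learning_rate_(learning_rate),
    vdirection_(0), hdirection_(0),
    best_minimum_(extremes_.end()), last_extreme_(extremes_.end())
  {
  }
};

} // namespace gradient_descent
```

The epoch loop would then construct one `State` up front and call its member functions, instead of juggling the eleven separate variables listed above.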

Comments: 1
@filburtcioglu3729, 11 days ago
Great content bro keep going