network_stuff:machine_learning:supervised_learning (revised 2023/04/25 05:12 by jotasandoku; current 2023/11/02 14:38, external edit)
In Python, we use the methods ''fit'' and ''predict'':
  kmeans.fit(argument)
  kmeans.predict(argument)
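The ''fit''/''predict'' convention above can be sketched with a toy estimator. ''ToyKMeans'' is a hypothetical, from-scratch 1-D k-means written only to show the pattern; it is not scikit-learn's ''KMeans'':

```python
import random

class ToyKMeans:
    """Toy 1-D k-means illustrating the fit/predict convention.
    Hypothetical sketch, not scikit-learn's KMeans."""

    def __init__(self, k, n_iter=20, seed=0):
        self.k = k
        self.n_iter = n_iter
        self.seed = seed
        self.centers = []

    def fit(self, xs):
        # start from k distinct training points
        rng = random.Random(self.seed)
        self.centers = rng.sample(xs, self.k)
        for _ in range(self.n_iter):
            # assign each point to its nearest center
            groups = [[] for _ in range(self.k)]
            for x in xs:
                groups[self.predict(x)].append(x)
            # move each center to the mean of its group
            self.centers = [sum(g) / len(g) if g else c
                            for g, c in zip(groups, self.centers)]
        return self

    def predict(self, x):
        # index of the nearest center
        return min(range(self.k), key=lambda i: abs(x - self.centers[i]))

km = ToyKMeans(k=2).fit([1.0, 1.2, 0.8, 9.0, 9.5, 10.0])
```

After ''fit'', the two centers sit near each cluster mean, and ''predict'' returns a different cluster index for 0.9 than for 9.2.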
| + | \\ | ||
| + | REGRESSION\\ | ||
| + | For continuous tgt values | ||
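As a minimal sketch of regression on a continuous target, here is closed-form least squares for a single feature, in plain Python (''fit_line'' is an invented helper, not a library function):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with one feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# points lying exactly on y = 2x + 1
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

On noiseless data the recovered slope and intercept are exact (a = 2, b = 1).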
| + | |||
| + | ---- | ||
| + | |||
| + | DECISON TREES: | ||
| + | \\ | ||
  * k-nearest neighbors
    * ~ "
    * after doing it with all points, it creates a "decision boundary"
  * Decision trees
    * decision by a path to the leaves; the value at a leaf is a measure of center
    * we ask questions to narrow down areas (normally yes/no questions)
    * decision trees can surface relationships that were not evident to human understanding
  * Random forests: decision trees tend to overfit. A solution is ''bagging'': train many trees on random subsets of the data and aggregate their votes.
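The k-nearest-neighbors idea above can be sketched in a few lines of plain Python (''knn_predict'' is a hypothetical helper, not a library call): a new point takes the majority label of its k closest training points.

```python
from collections import Counter

def knn_predict(train, x, k=3):
    """Label x by majority vote of its k nearest training points.
    train is a list of ((features...), label) pairs."""
    def dist(p):
        # squared Euclidean distance to x
        return sum((a - b) ** 2 for a, b in zip(p, x))
    nearest = sorted(train, key=lambda item: dist(item[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

train = [((0, 0), "red"), ((1, 0), "red"), ((0, 1), "red"),
         ((5, 5), "blue"), ((6, 5), "blue"), ((5, 6), "blue")]
```

A query near (0.5, 0.5) is voted "red" by its three closest points; one near (5.5, 5.5) is voted "blue".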
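The random-forest remedy for overfitting can also be sketched as bagging: train several tiny trees (here, hypothetical one-split "stumps", invented for illustration) on bootstrap samples of the data and let them vote.

```python
import random
from collections import Counter

def fit_stump(data):
    """One-split decision tree: pick the (feature, threshold) pair whose
    split leaves the most points on the majority side of each branch."""
    best = None
    for f in range(len(data[0][0])):
        for x, _ in data:
            t = x[f]
            left = [lbl for p, lbl in data if p[f] <= t]
            right = [lbl for p, lbl in data if p[f] > t]
            # count points agreeing with the majority label on each side
            score = sum(Counter(side).most_common(1)[0][1]
                        for side in (left, right) if side)
            if best is None or score > best[0]:
                l_lbl = Counter(left).most_common(1)[0][0]
                r_lbl = Counter(right).most_common(1)[0][0] if right else l_lbl
                best = (score, f, t, l_lbl, r_lbl)
    _, f, t, l_lbl, r_lbl = best
    return lambda x: l_lbl if x[f] <= t else r_lbl

def fit_forest(data, n_trees=7, seed=0):
    """Bagging: each stump sees a bootstrap sample; prediction is a vote."""
    rng = random.Random(seed)
    stumps = [fit_stump([rng.choice(data) for _ in data])
              for _ in range(n_trees)]
    return lambda x: Counter(s(x) for s in stumps).most_common(1)[0][0]

data = [((0, 0), "a"), ((1, 1), "a"), ((0, 1), "a"),
        ((8, 8), "b"), ((9, 8), "b"), ((8, 9), "b")]
forest = fit_forest(data)
```

Each stump alone can only draw one axis-aligned cut; the vote across stumps trained on different resamples smooths out any single stump's mistakes, which is the overfitting remedy the bullet describes.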