Playground notes

Machine learning
old
Working on the playground. Incidentally, I found some really good stuff down there.
Author

Fujimiya Amane

Published

February 4, 2021

Modified

April 12, 2021

The playground exercise includes:

- modifying learning rates
- changing the training-to-test data ratio
- modifying batch size
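The playground itself is interactive, but the effect of these knobs can be sketched offline. Below is a minimal mini-batch SGD loop for logistic regression (my own toy code, not the playground's implementation), where the learning rate and batch size are the parameters being varied:

```python
import numpy as np

def train_logreg(X, y, lr, batch_size, epochs=50, seed=0):
    """Train logistic regression with mini-batch SGD; returns final weights."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(X)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            p = 1.0 / (1.0 + np.exp(-X[b] @ w))       # sigmoid predictions
            w -= lr * X[b].T @ (p - y[b]) / len(b)    # gradient step
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0).astype(int) == y).mean())

# Synthetic, roughly linearly separable data, split into train/test.
rng = np.random.default_rng(42)
X = rng.normal(size=(400, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_train, X_test = X[:300], X[300:]
y_train, y_test = y[:300], y[300:]

for lr in (0.01, 0.3):
    for bs in (10, 100):
        w = train_logreg(X_train, y_train, lr=lr, batch_size=bs)
        print(f"lr={lr}, batch={bs}: test acc = {accuracy(w, X_test, y_test):.2f}")
```

The learning rates, batch sizes, and train/test split here are arbitrary choices for illustration; the point is only that each knob can be varied independently and its effect measured on held-out data.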

What I encountered:

Overfitting caused by redundant features is observed:

(figure: the first behaviour)

With two simple features, this works well.

(figure: the first behaviour)

Adding more features to the picture, we can still see the change is negligible. But if we try to add \(x_{1}^{2}\):

(figure: the first behaviour)

As you can see, some features do not work well with this example; the more complex the model is, the worse it can sometimes be for prediction.
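The same failure mode can be sketched outside the playground with plain polynomial regression: a model with many redundant features chases the noise in a small training set. This is a hypothetical numpy example of that effect, not what the playground runs internally:

```python
import numpy as np

# Noisy samples of a simple underlying function y = x.
rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 1, size=15)
y_train = x_train + rng.normal(scale=0.3, size=15)

x_test = np.linspace(-1, 1, 100)
y_test = x_test  # noise-free target for evaluation

mses = {}
for degree in (1, 12):
    coeffs = np.polyfit(x_train, y_train, degree)  # least-squares fit
    pred = np.polyval(coeffs, x_test)
    mses[degree] = float(np.mean((pred - y_test) ** 2))
    print(f"degree {degree}: test MSE = {mses[degree]:.3f}")
```

The degree-1 model matches the underlying structure; the degree-12 model has the capacity to fit the training noise exactly, which typically shows up as a worse test error even though its training error is lower.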

One thing that I still do not understand is why, in an already complex model, a single extra feature can make things practically… worse. That is, everything works fine even when the model is complex, but adding a single feature can turn the whole thing into chaos, as you can see above.

Why is this the case? What explains this behavior?

And what indicates complexity? Is \(\sin(x_{1})\) simpler than \(x_{1}^{2}\)?