TuringsSolutions posted an update Oct 8
Neural Network Chaos Monkey: randomly shuts off parts of the neural network during training. The Chaos Monkey is heavily present at Epoch 1 and is gone by the final epoch. My hypothesis was that this would either increase the robustness of the model or make the outputs significantly worse. You can 100% reproduce my results; chaos wins again.

https://youtu.be/bWA9unotJ7k
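
For anyone who wants the gist without opening the notebook, here is a minimal PyTorch sketch of the idea as described above: a dropout rate that starts high at epoch 1 and anneals linearly to zero by the final epoch. The model, the `chaos_schedule` function, and the dummy data are hypothetical stand-ins for illustration, not the notebook's actual code.

```python
# Hypothetical sketch of the "Chaos Monkey" idea: dropout whose rate
# decays linearly from a high value at epoch 1 to zero by the last epoch.
import torch
import torch.nn as nn

class ChaosMLP(nn.Module):
    def __init__(self, in_dim=784, hidden=256, out_dim=10, p_start=0.9):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.drop = nn.Dropout(p=p_start)  # rate is overwritten each epoch
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.fc2(self.drop(torch.relu(self.fc1(x))))

def chaos_schedule(epoch, total_epochs, p_start=0.9):
    # Linearly anneal the dropout probability to 0 at the final epoch.
    return p_start * (1 - epoch / (total_epochs - 1))

model = ChaosMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
total_epochs = 10

for epoch in range(total_epochs):
    model.drop.p = chaos_schedule(epoch, total_epochs)  # "chaos" fades out
    x = torch.randn(64, 784)            # dummy batch stands in for real data
    y = torch.randint(0, 10, (64,))
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: dropout p = {model.drop.p:.2f}, loss = {loss.item():.3f}")
```

With this schedule the network trains under heavy noise early on and converges as a clean, fully connected network at the end, so its final-epoch behavior can be compared directly against a plain baseline.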

This already exists; it's called dropout.


@takeraparterer I am aware that dropout exists, or I probably could not have built this in the first place. This is dropout on steroids. I love random commenters!

On a more serious note, I feel that this would have the following effects:
- at the start, 100% dropout would make the model untrainable
- the model learns something during the middle of training, but overfits near the end


@takeraparterer Good thing I provide Colab notebooks with literal tests that you can reproduce yourself! You just need to know a lick of Python. Go get 'em, tiger!

Hence my idea of the SILLY versions... ;)