ppo-LunarLander-v2 / results.json
first model using just 1000K episodes
6852d52
{"mean_reward": 257.12212310000007, "std_reward": 13.856139863120385, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-06-26T23:12:17.530428"}