ppo-LunarLander-simple / results.json
mnaylor · Commit with first PPO model (ba9efd9)
{
  "mean_reward": 278.8952206574262,
  "std_reward": 14.699030551034376,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2022-12-20T15:53:57.573444"
}
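A file like this is typically produced by averaging the total reward over a fixed number of evaluation episodes. The sketch below shows one way such a record could be generated using only the Python standard library; the `episode_rewards` values are hypothetical placeholders (the per-episode returns behind this file are not recorded), and the use of the population standard deviation is an assumption about how `std_reward` was computed.

```python
import json
import statistics
from datetime import datetime

# Hypothetical per-episode returns from n_eval_episodes = 10 rollouts.
episode_rewards = [280.1, 265.3, 290.7, 275.0, 281.9,
                   260.4, 295.2, 270.8, 285.5, 278.6]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # Population std (divides by N), a common convention for eval summaries.
    "std_reward": statistics.pstdev(episode_rewards),
    # True when the policy picks the argmax action instead of sampling.
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now().isoformat(),
}

with open("results.json", "w") as f:
    json.dump(results, f)
```

For reference, LunarLander-v2 is conventionally considered solved at a mean reward of 200 over 100 consecutive episodes, so the recorded mean of ~278.9 with a std of ~14.7 over 10 deterministic episodes indicates a well-trained agent.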