ppo-LunarLander-v2 / results.json
lordsauron · first commit : lunar lander · 32c738c
{"mean_reward": 243.32665810000003, "std_reward": 18.633822724439735, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-06-24T19:52:27.694622"}