ppo-LunarLander-v2 / results.json
jdospina · First upload · 7bc5b76
{
  "mean_reward": 261.2057867,
  "std_reward": 26.541280681389626,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-12-13T18:32:08.120831"
}
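For context, a record like this is typically produced by rolling out the policy deterministically for `n_eval_episodes` episodes and aggregating the per-episode returns. The sketch below shows how the `mean_reward` and `std_reward` fields could be computed; the per-episode rewards here are hypothetical placeholders (the actual rewards came from evaluating the PPO policy on LunarLander-v2 and are not listed in this file).

```python
import json
import statistics
from datetime import datetime, timezone

# Hypothetical per-episode returns from 10 deterministic evaluation episodes.
# These values are illustrative only; the real ones are not stored in results.json.
episode_rewards = [250.0, 280.0, 240.0, 300.0, 255.0,
                   270.0, 230.0, 290.0, 260.0, 245.0]

results = {
    "mean_reward": statistics.mean(episode_rewards),
    # Population standard deviation (ddof=0), the convention numpy.std uses
    # by default when evaluation helpers aggregate episode returns.
    "std_reward": statistics.pstdev(episode_rewards),
    "is_deterministic": True,
    "n_eval_episodes": len(episode_rewards),
    "eval_datetime": datetime.now(timezone.utc).isoformat(),
}
print(json.dumps(results))
```

Note that `std_reward` is the standard deviation across episode returns, so with only 10 episodes a spread of ~26 around a mean of ~261 (as recorded above) is a fairly tight estimate for LunarLander-v2, whose solved threshold is commonly taken as 200.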