PPO-LunarLander-v2 / results.json
stinoco: Adding PPO model for solving LunarLander-v2 (commit 64c188b)
{
  "mean_reward": 265.01921235961925,
  "std_reward": 11.962475335301292,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-31T14:36:44.661972"
}
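A file in this shape is typically produced by evaluating the trained policy for `n_eval_episodes` episodes and aggregating the per-episode returns. The sketch below shows one way to build the same schema with only the standard library; the reward values are hypothetical, and only the output format mirrors the real `results.json` (it is not the actual evaluation script used for this repo).

```python
import json
import statistics
from datetime import datetime, timezone

def summarize_eval(episode_rewards, deterministic=True):
    """Aggregate per-episode returns into the results.json schema above."""
    return {
        "mean_reward": statistics.fmean(episode_rewards),
        # Population standard deviation (ddof=0), matching numpy.std's default.
        "std_reward": statistics.pstdev(episode_rewards),
        "is_deterministic": deterministic,
        "n_eval_episodes": len(episode_rewards),
        "eval_datetime": datetime.now(timezone.utc).isoformat(),
    }

# Hypothetical per-episode returns from 10 evaluation rollouts.
rewards = [260.0, 270.0, 255.0, 280.0, 265.0, 250.0, 275.0, 262.0, 268.0, 258.0]
print(json.dumps(summarize_eval(rewards), indent=2))
```

A mean of roughly 265 with a standard deviation near 12, as recorded here, comfortably clears the 200-point threshold usually taken as "solved" for LunarLander-v2.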