# ppo-CartPole-v1 / config.yml
!!python/object/apply:collections.OrderedDict
- - - batch_size
    - 256
  - - clip_range
    - lin_0.2
  - - ent_coef
    - 0.0
  - - gae_lambda
    - 0.8
  - - gamma
    - 0.98
  - - learning_rate
    - lin_0.001
  - - n_envs
    - 8
  - - n_epochs
    - 20
  - - n_steps
    - 32
  - - n_timesteps
    - 100000.0
  - - policy
    - MlpPolicy
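
For reference, below is a minimal sketch (not the original RL Baselines3 Zoo training script) of how these hyperparameters could be plugged into a Stable-Baselines3 PPO run on CartPole-v1. The lin_ prefix (lin_0.2, lin_0.001) is the Zoo's notation for a value that decays linearly to 0 over training; the linear_schedule helper below is an assumed stand-in for that behaviour, and the save path is illustrative.

from stable_baselines3 import PPO
from stable_baselines3.common.env_util import make_vec_env


def linear_schedule(initial_value: float):
    """Return a schedule that decays linearly from initial_value to 0."""
    def schedule(progress_remaining: float) -> float:
        # progress_remaining goes from 1 (start of training) to 0 (end)
        return progress_remaining * initial_value
    return schedule


# n_envs: 8 parallel copies of CartPole-v1
env = make_vec_env("CartPole-v1", n_envs=8)

model = PPO(
    "MlpPolicy",                            # policy: MlpPolicy
    env,
    n_steps=32,                             # n_steps: 32 per environment
    batch_size=256,                         # batch_size: 256
    n_epochs=20,                            # n_epochs: 20
    gamma=0.98,                             # gamma: 0.98
    gae_lambda=0.8,                         # gae_lambda: 0.8
    ent_coef=0.0,                           # ent_coef: 0.0
    clip_range=linear_schedule(0.2),        # clip_range: lin_0.2
    learning_rate=linear_schedule(0.001),   # learning_rate: lin_0.001
    verbose=1,
)

# n_timesteps: 100000.0
model.learn(total_timesteps=100_000)
model.save("ppo-CartPole-v1")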