PPO playing MountainCarContinuous-v0 from https://github.com/sgoodfriend/rl-algo-impls/tree/0511de345b17175b7cf1ea706c3e05981f11761c (commit c2802fd)
# Re-export the environment factory helpers so they can be imported from this module.
from rl_algo_impls.shared.vec_env.make_env import make_env, make_eval_env