Mriganka1999 committed
Commit 01b991b
1 Parent(s): dc3bec6

Initial commit

README.md ADDED
@@ -0,0 +1,37 @@
+ ---
+ library_name: stable-baselines3
+ tags:
+ - PandaPickAndPlaceDense-v3
+ - deep-reinforcement-learning
+ - reinforcement-learning
+ - stable-baselines3
+ model-index:
+ - name: SAC
+   results:
+   - task:
+       type: reinforcement-learning
+       name: reinforcement-learning
+     dataset:
+       name: PandaPickAndPlaceDense-v3
+       type: PandaPickAndPlaceDense-v3
+     metrics:
+     - type: mean_reward
+       value: -8.14 +/- 2.88
+       name: mean_reward
+       verified: false
+ ---
+
+ # **SAC** Agent playing **PandaPickAndPlaceDense-v3**
+ This is a trained model of a **SAC** agent playing **PandaPickAndPlaceDense-v3**
+ using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
+
+ ## Usage (with Stable-baselines3)
+ A minimal loading sketch using the `huggingface_sb3` helper; the `repo_id` below is an assumption inferred from this repository's owner and file names:
+
+ ```python
+ from huggingface_sb3 import load_from_hub
+ from stable_baselines3 import SAC
+
+ checkpoint = load_from_hub("Mriganka1999/sac-PandaPickAndPlaceDense-v3",  # repo_id assumed
+                            "sac-PandaPickAndPlaceDense-v3.zip")
+ model = SAC.load(checkpoint)
+ ```
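+
+ A sketch of running the loaded agent (assumes `gymnasium` and `panda_gym` are installed; training used a `VecNormalize` wrapper, saved as `vec_normalize.pkl`, so behaviour on the raw environment may differ slightly):
+
+ ```python
+ import gymnasium as gym
+ import panda_gym  # noqa: F401  (registers the Panda tasks; assumed installed)
+
+ env = gym.make("PandaPickAndPlaceDense-v3")
+ obs, _ = env.reset()
+ for _ in range(1000):
+     # `model` is the SAC agent loaded in the snippet above
+     action, _ = model.predict(obs, deterministic=True)
+     obs, reward, terminated, truncated, info = env.step(action)
+     if terminated or truncated:
+         obs, _ = env.reset()
+ env.close()
+ ```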
config.json ADDED
@@ -0,0 +1 @@
+ {"policy_class": {":type:": "<class 'abc.ABCMeta'>", ":serialized:": "gAWVNwAAAAAAAACMHnN0YWJsZV9iYXNlbGluZXMzLnNhYy5wb2xpY2llc5SMEE11bHRpSW5wdXRQb2xpY3mUk5Qu", "__module__": "stable_baselines3.sac.policies", "__doc__": "\n Policy class (with both actor and critic) for SAC.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param use_expln: Use ``expln()`` function instead of ``exp()`` when using gSDE to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param clip_mean: Clip the mean output when using gSDE to avoid numerical instability.\n :param features_extractor_class: Features extractor to use.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n :param n_critics: Number of critic networks to create.\n :param share_features_extractor: Whether to share or not the features extractor\n between the actor and the critic (this saves computation time)\n ", "__init__": "<function MultiInputPolicy.__init__ at 0x7caa9e1235b0>", "__abstractmethods__": "frozenset()", "_abc_impl": "<_abc._abc_data object at 0x7caa9e12b080>"}, "verbose": 1, "policy_kwargs": {"use_sde": false}, "num_timesteps": 1000000, "_total_timesteps": 1000000, "_num_timesteps_at_start": 0, "seed": null, "action_noise": null, "start_time": 1719030426776408209, "learning_rate": 0.0003, "tensorboard_log": null, "_last_obs": {":type:": "<class 'collections.OrderedDict'>", ":serialized:": "gAWViwIAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QoljAAAAAAAAAAJyoYP3fXmD83QQs9R6CNv3eNjj+2/Qo9Nl6Mv1gwhb7tJAs9tYWyP+7Rij+2/Qo9lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksESwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcoljAAAAAAAAAAP0saPsvTaL/d8om/VF3Xv4nywz/+2fg9FXS8vlEVeL/d8om/ptCcv0wD9z7d8om/lGgOSwRLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWMAEAAAAAAACyAZC/MDGFv1BFnT/1nrs+3ViHvgZM0D7HoZk/JyoYP3fXmD83QQs9FrMMvKNOdrvvCSe8n++vOwB/rTsqKII8KDcIvC/1PryVYbq6Hbb/PrG2mD9fEVq/MGEovZeKsD7fAHw8Ckvovkegjb93jY4/tv0KPSaOAbzisXC7KrVGvOm2uDswRm87uKGHPLPcXrlrL9u7x5sAu8jd+D3e2Mw/I/h9PgJpoj448ze9QexQPMifmT82Xoy/WDCFvu0kCz0fHBK8hKJLuxyDULyL9tQ76g9aO7ihhzz+3F65aS/bu6vK+rr4l2a/QIKlvtr3eT7GZbU/MM8XQI6FNEDKoZk/tYWyP+7Rij+2/Qo9JJcgvJ/ZhLvzTUO8h23vO+dTeju4oYc8I91euWwv27u2mgC7lGgOSwRLE4aUaBJ0lFKUdS4=", "achieved_goal": "[[ 0.5943932 1.1940755 0.03399774]\n [-1.1064538 1.1136922 0.03393336]\n [-1.0966251 -0.26013446 0.03397076]\n [ 1.3947054 1.0845315 0.03393336]]", "desired_goal": "[[ 0.15067767 -0.9094817 -1.0777241 ]\n [-1.6825356 1.5308391 0.12150954]\n [-0.36807314 -0.96907526 -1.0777241 ]\n [-1.2251174 0.48244703 -1.0777241 ]]", "observation": "[[-1.12505174e+00 -1.04056358e+00 1.22867775e+00 3.66447121e-01\n -2.64349848e-01 4.06830013e-01 1.20024955e+00 5.94393194e-01\n 1.19407547e+00 3.39977406e-02 -8.58761929e-03 -3.75834922e-03\n -1.01952394e-02 5.36914123e-03 
5.29468060e-03 1.58882923e-02\n -8.31393152e-03 -1.16551360e-02 -1.42197555e-03]\n [ 4.99436289e-01 1.19307530e+00 -8.51827562e-01 -4.11083102e-02\n 3.44807357e-01 1.53810671e-02 -4.53697503e-01 -1.10645378e+00\n 1.11369216e+00 3.39333639e-02 -7.90742598e-03 -3.67271202e-03\n -1.21281538e-02 5.63703896e-03 3.65103409e-03 1.65566057e-02\n -2.12537867e-04 -6.68900227e-03 -1.96241005e-03]\n [ 1.21516764e-01 1.60036826e+00 2.48016879e-01 3.17207396e-01\n -4.49096859e-02 1.27516398e-02 1.20018864e+00 -1.09662509e+00\n -2.60134459e-01 3.39707620e-02 -8.91783740e-03 -3.10722087e-03\n -1.27265714e-02 6.49911677e-03 3.32736457e-03 1.65566057e-02\n -2.12538958e-04 -6.68900134e-03 -1.91338861e-03]\n [-9.00756359e-01 -3.23259354e-01 2.44109541e-01 1.41716838e+00\n 2.37202072e+00 2.82065153e+00 1.20024991e+00 1.39470541e+00\n 1.08453155e+00 3.39333639e-02 -9.80165973e-03 -4.05426277e-03\n -1.19204400e-02 7.30675785e-03 3.81969824e-03 1.65566057e-02\n -2.12539497e-04 -6.68900274e-03 -1.96234649e-03]]"}, "_last_episode_starts": {":type:": "<class 'numpy.ndarray'>", ":serialized:": "gAWVdwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYEAAAAAAAAAAEBAQGUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSwSFlIwBQ5R0lFKULg=="}, "_last_original_obs": {":type:": "<class 'collections.OrderedDict'>", ":serialized:": "gAWViwIAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QoljAAAAAAAAAAb7lZPVo62j2TwaM8t7rIvexpyz3Qv6M8ufDGvd8Yx7zWwKM8Xon+PSUKxj3Qv6M8lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksESwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcoljAAAAAAAAAAavpfPCO2or0K16M8bZAUvokhCD7lg8w9YwIAvWxVrb0K16M8AAHYvbXIKj0K16M8lGgOSwRLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWMAEAAAAAAAC/a5a+esaUvnK71j6urWw86dOZvc5ACz4J16M9b7lZPVo62j2TwaM8ocaeN1wYqzaPG1o5OSXgt7+RLzgneKi3vHn2urhknLqNKA85Ini+PWnjzT7DLCc8UNWWvRxbAz7NnYg6cc+YPLe6yL3sacs90L+jPD/YjDhbOhs3vvW6uDexqbdZ6he2QD9QrIAYTS+UxVwuxY37uEzfRTudRwc/UfVnPrAweTt3n/G61m8OOdrVoz258Ma93xjHvNbAozw5mZa2a9YZOAfyPbkAGTE1VsA3t6CEjKznypKu+hCKL8/DyriqJ3a+xoSLvdwrZz5hXHc+OR1OP/nWdz8L16M9Xon+PSUKxj3Qv6M8ONiMuEw6G7c6/G+4LrKpN6/uFzZAF1isu0lTr3BeWq6Uffu4lGgOSwRLE4aUaBJ0lFKUdS4=", "achieved_goal": "[[ 0.05315536 0.10655661 0.01998976]\n [-0.09801238 0.09932312 0.01998892]\n [-0.09713883 -0.02430385 0.01998941]\n [ 0.12428544 0.09669904 0.01998892]]", "desired_goal": "[[ 0.01367054 -0.07944896 0.02 ]\n [-0.14508219 0.13294043 0.09986094]\n [-0.03125228 -0.08463559 0.02 ]\n [-0.10547066 0.04169532 0.02 ]]", "observation": "[[-2.93790787e-01 -2.90576756e-01 4.19398844e-01 1.44457053e-02\n -7.51112178e-02 1.35989398e-01 7.99999908e-02 5.31553589e-02\n 1.06556609e-01 1.99897643e-02 1.89275615e-05 5.09903293e-06\n 2.08003665e-04 -2.67202140e-05 4.18589880e-05 -2.00831109e-05\n -1.88045902e-03 -1.19318720e-03 1.36526491e-04]\n [ 9.30025727e-02 4.02125627e-01 1.02035431e-02 -7.36490488e-02\n 1.28277242e-01 1.04230049e-03 1.86536033e-02 -9.80123803e-02\n 9.93231237e-02 1.99889243e-02 6.71599919e-05 9.25230688e-06\n -8.91494419e-05 -2.02288920e-05 -2.26371617e-06 -2.95936886e-12\n 1.86533455e-10 5.01976932e-11 -1.19950193e-04]\n [ 3.01929098e-03 5.28436482e-01 2.26521745e-01 3.80234048e-03\n -1.84343650e-03 1.35838374e-04 7.99977332e-02 -9.71388295e-02\n -2.43038516e-02 1.99894123e-02 -4.48818582e-06 3.66777349e-05\n -1.81146068e-04 6.59740181e-07 -1.09524317e-05 -3.99376365e-12\n -6.67535402e-11 2.51141163e-10 -9.66858279e-05]\n [-2.40385681e-01 -6.81243390e-02 2.25753248e-01 
2.41563335e-01\n 8.05133402e-01 9.68123972e-01 8.00000057e-02 1.24285445e-01\n 9.66990367e-02 1.99889243e-02 -6.71599410e-05 -9.25229324e-06\n -5.72169447e-05 2.02293413e-05 2.26396855e-06 -3.07083525e-12\n -1.92165325e-10 -4.96513386e-11 -1.19920034e-04]]"}, "_episode_num": 20624, "use_sde": false, "sde_sample_freq": -1, "_current_progress_remaining": 0.0, "_stats_window_size": 100, "ep_info_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWV4AsAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHwC1N/hESdvuMAWyUSzKMAXSUR0DF+IDeXRgJdX2UKGgGR8Alhegte2NOaAdLMmgIR0DF+Hf8ZUDMdX2UKGgGR7+nZ00WM0gsaAdLAWgIR0DF+H221D0EdX2UKGgGR8AkZP+GXXyzaAdLMmgIR0DF+PA5WBBidX2UKGgGR8At8TEBKcuraAdLMmgIR0DF+ZxDeCTVdX2UKGgGR8AnXt3OfNA1aAdLMmgIR0DF+Z1WS2YwdX2UKGgGR8AdpEx7AtWdaAdLMmgIR0DF+Zp2St/4dX2UKGgGR8AlkGmk30f6aAdLMmgIR0DF+gj7XQMQdX2UKGgGR8AoyofjjrAyaAdLMmgIR0DF+q57LMcIdX2UKGgGR8AwSzVc2R7raAdLMmgIR0DF+q5xgiNbdX2UKGgGR8AiqJa7mMfjaAdLMmgIR0DF+qmwkgOjdX2UKGgGR8AklrO7g88taAdLMmgIR0DF+xmU2UB5dX2UKGgGR8Ap3/z8P4EfaAdLMmgIR0DF+8o9X9zfdX2UKGgGR8Asx34bjtG/aAdLMmgIR0DF+81rhzeXdX2UKGgGR8AbUKgIyCWeaAdLMmgIR0DF+8g2Kl54dX2UKGgGR8AH8UKzAvcraAdLMmgIR0DF/DVOmBOIdX2UKGgGR8ATR62OQyRCaAdLMmgIR0DF/QlByCFsdX2UKGgGR8AQqpiqhlDnaAdLMmgIR0DF/RXsRg7YdX2UKGgGR8AnVHeaa1CxaAdLMmgIR0DF/S4ukDZEdX2UKGgGR8ArTcu8K5TZaAdLMmgIR0DF/dKg00m/dX2UKGgGR8AXVAprk8zRaAdLMmgIR0DF/oo8U21ldX2UKGgGR8AliiosI3R5aAdLMmgIR0DF/ou9vjwQdX2UKGgGR8AnhCu2Zy+6aAdLMmgIR0DF/oW5jH4odX2UKGgGR8AjlwtJ4B3iaAdLMmgIR0DF/vYfKZDzdX2UKGgGR8AxUQqqfe1saAdLMmgIR0DF/6Bid8RddX2UKGgGR8ApFgWrOqvNaAdLMmgIR0DF/6EvCdjHdX2UKGgGR8AlGKJl8PWhaAdLMmgIR0DF/5wkJKJ3dX2UKGgGR8AX9pj+aScLaAdLMmgIR0DGAAn5P/JedX2UKGgGR8Ac2/mDDjzaaAdLMmgIR0DGALkEovzwdX2UKGgGR8AnkOhkAggYaAdLMmgIR0DGALqPsAvMdX2UKGgGR8AqIx33YcvNaAdLMmgIR0DGALcihWYGdX2UKGgGR8AbR/wy6+WXaAdLMmgIR0DGASixkd3jdX2UKGgGR8AdV32VVxS6aAdLMmgIR0DGAdewqy4XdX2UKGgGR8Ag8O3DvVmSaAdLMmgIR0DGAdfied08dX2UKGgGR8AcDn5i3G4raAdLMmgIR0DGAdZOBUaRdX2UKGgGR8AhTYU34sVdaAdLMmgIR0DGAkmOQyRCdX2UKGgGR8ANsySFGoaUaAdLMmgIR0DGAvtBMSK4dX2UKGgGR8ApjG+9Jz1caAdLMmgIR0DGAwXW4EwGdX2UKGgGR8AGe0Re1KGtaAdLMmgIR0DGAxSofjjrdX2UKGgGR8Ag27V8Ti84aAdLMmgIR0DGA66/EfkndX2UKGgGR8AaXTAnDziCaAdLMmgIR0DGBID5uZTidX2UKGgGR8AaM2AG0NSZaAdLMmgIR0DGBIG34Kx+dX2UKGgGR8Ak92mHgxagaAdLMmgIR0DGBH6+FlCkdX2UKGgGR8AQHQswtapxaAdLMmgIR0DGBPAWHk92dX2UKGgGR8AVEyad+XqraAdLMmgIR0DGBZ0bPyCndX2UKGgGR8AfagPEsJ6ZaAdLMmgIR0DGBZ2XZ5AydX2UKGgGR8AjQo6S1Vo6aAdLMmgIR0DGBZqLMs6JdX2UKGgGR8ApgLzf779AaAdLMmgIR0DGBgtCgK4QdX2UKGgGR8AwFqcmShalaAdLMmgIR0DGBr/49HMEdX2UKGgGR8AqN4UN8VpLaAdLMmgIR0DGBsJi1AqvdX2UKGgGR8AibSx7iQ1aaAdLMmgIR0DGBr9ITXardX2UKGgGR8AhYzzErGzbaAdLMmgIR0DGBzVcjZ+QdX2UKGgGR8AhPpJwsGxEaAdLMmgIR0DGB+NQVKwqdX2UKGgGR8APU6eXiR4haAdLMmgIR0DGB+Sx/ustdX2UKGgGR8AfnCaZx7zDaAdLMmgIR0DGB9/o/zJ7dX2UKGgGR8AkfnXd0q6OaAdLMmgIR0DGCFIRTS9edX2UKGgGR8AQHynUDuBuaAdLMmgIR0DGCP9uejEfdX2UKGgGR8Ak56fra/RFaAdLMmgIR0DGCQBC6YmcdX2UKGgGR8Aqd0ihWYF8aAdLMmgIR0DGCP0tyxRmdX2UKGgGR8Awi1E3Kji5aAdLMmgIR0DGCZSY7aIvdX2UKGgGR8AkuksSTQmeaAdLMmgIR0DGCn0ahpQDdX2UKGgGR8ArWFcIJJGwaAdLMmgIR0DGCoZ++dsjdX2UKGgGR7+m1hLGrCFcaAdLAWgIR0DGCo5LoOhCdX2UKGgGR8AXeQr+YMOPaAdLMmgIR0DGCpfBacI7dX2UKGgGR8AckeOn2qT9aAdLMmgIR0DGCwsNSZSfdX2UKGgGR8AnMqkuYhMbaAdLMmgIR0DGC7tMoMKDdX2UKGgGR8AjCW2w3YL9aAdLMmgIR0DGC8GndfsvdX2UKGgGR8AYYu01IiC8aAdLMmgIR0DGC7f0PH1fdX2UKGgGR8AoONXo1UEQaAdLMmgIR0DGDC9IkJKKdX2UKGgGR8AdY3HaN+9baAdLMmgIR0DGDONOsT37dX2UKGgGR8AptA57w8W9aAdLMmgIR0DGDOqmhufmdX2UKGgGR8AgHIEr5IpZaAdLMmgIR0DGDOVbX6IndX2UKGgGR8AjJ/JeVs1saAdLMmgIR0DGDV3f4yoGdX2UKGgGR8AhmNrCWNWEaAdLMmgIR0DGDhGZLIxQdX2UKGgGR8Ajvt3wCr93aAdLMmgIR0DGDhjngYP5dX2UKGgGR8AuC6zVtoBaaAdLMmg
IR0DGDg6RISUUdX2UKGgGR8AlMEX+ERJ3aAdLMmgIR0DGDoZaiblSdX2UKGgGR8AhSLc9GI9DaAdLMmgIR0DGDza11GLDdX2UKGgGR8Ar/FwT/Q0GaAdLMmgIR0DGDz1Y2bXpdX2UKGgGR8AuYZ6Uqx1QaAdLMmgIR0DGDzXx4IKMdX2UKGgGR8AjPpA2Q4jsaAdLMmgIR0DGD7G7pV0cdX2UKGgGR8AkXjlxOtW/aAdLMmgIR0DGEKd2C/XYdX2UKGgGR8AoDlcQiA2AaAdLMmgIR0DGELyUPhAGdX2UKGgGR8Adc6FM7EHdaAdLMmgIR0DGEMgT7EYPdX2UKGgGR8AdME9t/FzdaAdLMmgIR0DGEWHBzmwJdX2UKGgGR8AY7vnbItDlaAdLMmgIR0DGEhIC2c8UdX2UKGgGR8AhbesxO+IuaAdLMmgIR0DGEhmpEQXidX2UKGgGR8AjL2V3Ux20aAdLMmgIR0DGEg8yi22HdX2UKGgGR8Ai1IV/MGHIaAdLMmgIR0DGEn9ZA6dUdX2UKGgGR8Ad1jqfOD8MaAdLMmgIR0DGEzJ+QU5/dX2UKGgGR8Ag/BciW3SbaAdLMmgIR0DGEzjrC3w1dX2UKGgGR8AuqW0JF9a2aAdLMmgIR0DGEy9senyedX2UKGgGR8AqmxptaY/naAdLMmgIR0DGE6naJyhjdX2UKGgGR8AqYlPacqe9aAdLMmgIR0DGFF3dCVrzdX2UKGgGR8Apv9oexOclaAdLMmgIR0DGFGS+g13udX2UKGgGR8AHVJ4B3iaRaAdLMmgIR0DGFFxnBciXdX2UKGgGR8Ahz4vexfOVaAdLMmgIR0DGFNO1rqMWdX2UKGgGR8AhdPpIMBp6aAdLMmgIR0DGFXusNlRQdX2UKGgGR8AcHBHkLhJiaAdLMmgIR0DGFYD4agmJdX2UKGgGR8AlZjgAIY3vaAdLMmgIR0DGFXW+49X+dWUu"}, "ep_success_buffer": {":type:": "<class 'collections.deque'>", ":serialized:": "gAWVhgAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKImJiImJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiImJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYllLg=="}, "_n_updates": 249975, "buffer_size": 1000000, "batch_size": 256, "learning_starts": 100, "tau": 0.005, "gamma": 0.99, "gradient_steps": 1, "optimize_memory_usage": false, "replay_buffer_class": {":type:": "<class 'abc.ABCMeta'>", ":serialized:": "gAWVOQAAAAAAAACMIHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5idWZmZXJzlIwQRGljdFJlcGxheUJ1ZmZlcpSTlC4=", "__module__": "stable_baselines3.common.buffers", "__annotations__": "{'observation_space': <class 'gymnasium.spaces.dict.Dict'>, 'obs_shape': typing.Dict[str, typing.Tuple[int, ...]], 'observations': typing.Dict[str, numpy.ndarray], 'next_observations': typing.Dict[str, numpy.ndarray]}", "__doc__": "\n Dict Replay buffer used in off-policy algorithms like SAC/TD3.\n Extends the ReplayBuffer to use dictionary observations\n\n :param buffer_size: Max number of element in the buffer\n :param observation_space: Observation space\n :param action_space: Action space\n :param device: PyTorch device\n :param n_envs: Number of parallel environments\n :param optimize_memory_usage: Enable a memory efficient variant\n Disabled for now (see https://github.com/DLR-RM/stable-baselines3/pull/243#discussion_r531535702)\n :param handle_timeout_termination: Handle timeout termination (due to timelimit)\n separately and treat the task as infinite horizon task.\n https://github.com/DLR-RM/stable-baselines3/issues/284\n ", "__init__": "<function DictReplayBuffer.__init__ at 0x7caa9e23f490>", "add": "<function DictReplayBuffer.add at 0x7caa9e23f520>", "sample": "<function DictReplayBuffer.sample at 0x7caa9e23f5b0>", "_get_samples": "<function DictReplayBuffer._get_samples at 0x7caa9e23f640>", "__abstractmethods__": "frozenset()", "_abc_impl": "<_abc._abc_data object at 0x7caa9e243500>"}, "replay_buffer_kwargs": {}, "train_freq": {":type:": "<class 'stable_baselines3.common.type_aliases.TrainFreq'>", ":serialized:": "gAWVYQAAAAAAAACMJXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi50eXBlX2FsaWFzZXOUjAlUcmFpbkZyZXGUk5RLAWgAjBJUcmFpbkZyZXF1ZW5jeVVuaXSUk5SMBHN0ZXCUhZRSlIaUgZQu"}, "use_sde_at_warmup": false, "target_entropy": -4.0, "ent_coef": "auto", "target_update_interval": 1, "observation_space": {":type:": "<class 'gymnasium.spaces.dict.Dict'>", ":serialized:": 
"gAWVMgQAAAAAAACMFWd5bW5hc2l1bS5zcGFjZXMuZGljdJSMBERpY3SUk5QpgZR9lCiMBnNwYWNlc5SMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwUZ3ltbmFzaXVtLnNwYWNlcy5ib3iUjANCb3iUk5QpgZR9lCiMBWR0eXBllIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYowNYm91bmRlZF9iZWxvd5SMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYDAAAAAAAAAAEBAZRoE4wCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksDhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoHCiWAwAAAAAAAAABAQGUaCBLA4WUaCR0lFKUjAZfc2hhcGWUSwOFlIwDbG93lGgcKJYMAAAAAAAAAAAAIMEAACDBAAAgwZRoFksDhZRoJHSUUpSMBGhpZ2iUaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlIwIbG93X3JlcHKUjAUtMTAuMJSMCWhpZ2hfcmVwcpSMBDEwLjCUjApfbnBfcmFuZG9tlE51YowMZGVzaXJlZF9nb2FslGgNKYGUfZQoaBBoFmgZaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgnaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgsSwOFlGguaBwolgwAAAAAAAAAAAAgwQAAIMEAACDBlGgWSwOFlGgkdJRSlGgzaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YowLb2JzZXJ2YXRpb26UaA0pgZR9lChoEGgWaBloHCiWEwAAAAAAAAABAQEBAQEBAQEBAQEBAQEBAQEBlGggSxOFlGgkdJRSlGgnaBwolhMAAAAAAAAAAQEBAQEBAQEBAQEBAQEBAQEBAZRoIEsThZRoJHSUUpRoLEsThZRoLmgcKJZMAAAAAAAAAAAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMGUaBZLE4WUaCR0lFKUaDNoHCiWTAAAAAAAAAAAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBlGgWSxOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YnVoLE5oEE5oPE51Yi4=", "spaces": "OrderedDict([('achieved_goal', Box(-10.0, 10.0, (3,), float32)), ('desired_goal', Box(-10.0, 10.0, (3,), float32)), ('observation', Box(-10.0, 10.0, (19,), float32))])", "_shape": null, "dtype": null, "_np_random": null}, "action_space": {":type:": "<class 'gymnasium.spaces.box.Box'>", ":serialized:": "gAWVawIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWBAAAAAAAAAABAQEBlGgIjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSwSFlIwBQ5R0lFKUjA1ib3VuZGVkX2Fib3ZllGgRKJYEAAAAAAAAAAEBAQGUaBVLBIWUaBl0lFKUjAZfc2hhcGWUSwSFlIwDbG93lGgRKJYQAAAAAAAAAAAAgL8AAIC/AACAvwAAgL+UaAtLBIWUaBl0lFKUjARoaWdolGgRKJYQAAAAAAAAAAAAgD8AAIA/AACAPwAAgD+UaAtLBIWUaBl0lFKUjAhsb3dfcmVwcpSMBC0xLjCUjAloaWdoX3JlcHKUjAMxLjCUjApfbnBfcmFuZG9tlIwUbnVtcHkucmFuZG9tLl9waWNrbGWUjBBfX2dlbmVyYXRvcl9jdG9ylJOUjAVQQ0c2NJRoMowUX19iaXRfZ2VuZXJhdG9yX2N0b3KUk5SGlFKUfZQojA1iaXRfZ2VuZXJhdG9ylIwFUENHNjSUjAVzdGF0ZZR9lChoPYoQAwiL5FV7Qs4TDuqR03IAT4wDaW5jlIoRrW+5A4GHaDIrz4zJa9YWoAB1jApoYXNfdWludDMylEsAjAh1aW50ZWdlcpRLAHVidWIu", "dtype": "float32", "bounded_below": "[ True True True True]", "bounded_above": "[ True True True True]", "_shape": [4], "low": "[-1. -1. -1. -1.]", "high": "[1. 1. 1. 
1.]", "low_repr": "-1.0", "high_repr": "1.0", "_np_random": "Generator(PCG64)"}, "n_envs": 4, "lr_schedule": {":type:": "<class 'function'>", ":serialized:": "gAWVoAMAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLA0sTQwx0AIgAfACDAYMBUwCUToWUjAVmbG9hdJSFlIwScHJvZ3Jlc3NfcmVtYWluaW5nlIWUjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lIwIPGxhbWJkYT6US2FDAgwAlIwOdmFsdWVfc2NoZWR1bGWUhZQpdJRSlH2UKIwLX19wYWNrYWdlX1+UjBhzdGFibGVfYmFzZWxpbmVzMy5jb21tb26UjAhfX25hbWVfX5SMHnN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi51dGlsc5SMCF9fZmlsZV9flIxJL3Vzci9sb2NhbC9saWIvcHl0aG9uMy4xMC9kaXN0LXBhY2thZ2VzL3N0YWJsZV9iYXNlbGluZXMzL2NvbW1vbi91dGlscy5weZR1Tk5oAIwQX21ha2VfZW1wdHlfY2VsbJSTlClSlIWUdJRSlIwcY2xvdWRwaWNrbGUuY2xvdWRwaWNrbGVfZmFzdJSMEl9mdW5jdGlvbl9zZXRzdGF0ZZSTlGghfZR9lChoGGgPjAxfX3F1YWxuYW1lX1+UjCFnZXRfc2NoZWR1bGVfZm4uPGxvY2Fscz4uPGxhbWJkYT6UjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgZjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOUaAIoaAcoSwFLAEsASwFLAUsTQwSIAFMAlGgJKYwBX5SFlGgOjARmdW5jlEuFQwIEAZSMA3ZhbJSFlCl0lFKUaBVOTmgdKVKUhZR0lFKUaCRoPn2UfZQoaBhoNWgnjBljb25zdGFudF9mbi48bG9jYWxzPi5mdW5jlGgpfZRoK05oLE5oLWgZaC5OaC9oMUc/M6kqMFUyYYWUUpSFlIwXX2Nsb3VkcGlja2xlX3N1Ym1vZHVsZXOUXZSMC19fZ2xvYmFsc19flH2UdYaUhlIwhZRSlIWUaEZdlGhIfZR1hpSGUjAu"}, "batch_norm_stats": [], "batch_norm_stats_target": [], "system_info": {"OS": "Linux-6.1.85+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sun Apr 28 14:29:16 UTC 2024", "Python": "3.10.12", "Stable-Baselines3": "2.3.2", "PyTorch": "2.3.0+cu121", "GPU Enabled": "False", "Numpy": "1.25.2", "Cloudpickle": "2.2.1", "Gymnasium": "0.29.1", "OpenAI Gym": "0.25.2"}}
replay.mp4 ADDED
Binary file (699 kB).
 
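The replay above can be reproduced with Stable-Baselines3's video recorder. A sketch, assuming the checkpoint has been downloaded locally and `panda_gym` is installed (not necessarily how this particular file was generated):

```python
import gymnasium as gym
import panda_gym  # noqa: F401  (assumed installed)
from stable_baselines3 import SAC
from stable_baselines3.common.vec_env import DummyVecEnv, VecVideoRecorder

model = SAC.load("sac-PandaPickAndPlaceDense-v3.zip")
venv = DummyVecEnv([lambda: gym.make("PandaPickAndPlaceDense-v3", render_mode="rgb_array")])
# Record a single 500-step clip into ./videos/
venv = VecVideoRecorder(venv, "videos", record_video_trigger=lambda step: step == 0,
                        video_length=500, name_prefix="replay")
obs = venv.reset()
for _ in range(500):
    action, _ = model.predict(obs, deterministic=True)
    obs, rewards, dones, infos = venv.step(action)
venv.close()
```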
results.json ADDED
@@ -0,0 +1 @@
+ {"mean_reward": -8.143745030462743, "std_reward": 2.882199536987433, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-06-22T07:42:16.582007"}
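These numbers match what Stable-Baselines3's evaluation helper reports. A sketch of reproducing such a summary (environment construction is an assumption, and the `VecNormalize` statistics in `vec_normalize.pkl` are not applied here):

```python
import gymnasium as gym
import panda_gym  # noqa: F401  (assumed installed)
from stable_baselines3 import SAC
from stable_baselines3.common.evaluation import evaluate_policy

model = SAC.load("sac-PandaPickAndPlaceDense-v3.zip")
eval_env = gym.make("PandaPickAndPlaceDense-v3")

# 10 deterministic episodes, matching n_eval_episodes / is_deterministic above
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```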
sac-PandaPickAndPlaceDense-v3.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:54605630d26ee778c4d4615ddc1b19b8cad85dd2d70211d7453bb5305a66cc5a
+ size 3305104
sac-PandaPickAndPlaceDense-v3/_stable_baselines3_version ADDED
@@ -0,0 +1 @@
+ 2.3.2
sac-PandaPickAndPlaceDense-v3/actor.optimizer.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6a9b7d6cac79933ea3ad8247025fb97bc122100390cbad4a35e5b9f1d95b6909
+ size 602702
sac-PandaPickAndPlaceDense-v3/critic.optimizer.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1d47c3df2cf2486346330a7f5fe1bbd3010f6786be389395220d70ae121c2da1
+ size 1189290
sac-PandaPickAndPlaceDense-v3/data ADDED
@@ -0,0 +1,114 @@
+ {
+ "policy_class": {
+ ":type:": "<class 'abc.ABCMeta'>",
+ ":serialized:": "gAWVNwAAAAAAAACMHnN0YWJsZV9iYXNlbGluZXMzLnNhYy5wb2xpY2llc5SMEE11bHRpSW5wdXRQb2xpY3mUk5Qu",
+ "__module__": "stable_baselines3.sac.policies",
+ "__doc__": "\n Policy class (with both actor and critic) for SAC.\n\n :param observation_space: Observation space\n :param action_space: Action space\n :param lr_schedule: Learning rate schedule (could be constant)\n :param net_arch: The specification of the policy and value networks.\n :param activation_fn: Activation function\n :param use_sde: Whether to use State Dependent Exploration or not\n :param log_std_init: Initial value for the log standard deviation\n :param use_expln: Use ``expln()`` function instead of ``exp()`` when using gSDE to ensure\n a positive standard deviation (cf paper). It allows to keep variance\n above zero and prevent it from growing too fast. In practice, ``exp()`` is usually enough.\n :param clip_mean: Clip the mean output when using gSDE to avoid numerical instability.\n :param features_extractor_class: Features extractor to use.\n :param normalize_images: Whether to normalize images or not,\n dividing by 255.0 (True by default)\n :param optimizer_class: The optimizer to use,\n ``th.optim.Adam`` by default\n :param optimizer_kwargs: Additional keyword arguments,\n excluding the learning rate, to pass to the optimizer\n :param n_critics: Number of critic networks to create.\n :param share_features_extractor: Whether to share or not the features extractor\n between the actor and the critic (this saves computation time)\n ",
+ "__init__": "<function MultiInputPolicy.__init__ at 0x7caa9e1235b0>",
+ "__abstractmethods__": "frozenset()",
+ "_abc_impl": "<_abc._abc_data object at 0x7caa9e12b080>"
+ },
+ "verbose": 1,
+ "policy_kwargs": {
+ "use_sde": false
+ },
+ "num_timesteps": 1000000,
+ "_total_timesteps": 1000000,
+ "_num_timesteps_at_start": 0,
+ "seed": null,
+ "action_noise": null,
+ "start_time": 1719030426776408209,
+ "learning_rate": 0.0003,
+ "tensorboard_log": null,
+ "_last_obs": {
+ ":type:": "<class 'collections.OrderedDict'>",
+ ":serialized:": "gAWViwIAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QoljAAAAAAAAAAJyoYP3fXmD83QQs9R6CNv3eNjj+2/Qo9Nl6Mv1gwhb7tJAs9tYWyP+7Rij+2/Qo9lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksESwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcoljAAAAAAAAAAP0saPsvTaL/d8om/VF3Xv4nywz/+2fg9FXS8vlEVeL/d8om/ptCcv0wD9z7d8om/lGgOSwRLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWMAEAAAAAAACyAZC/MDGFv1BFnT/1nrs+3ViHvgZM0D7HoZk/JyoYP3fXmD83QQs9FrMMvKNOdrvvCSe8n++vOwB/rTsqKII8KDcIvC/1PryVYbq6Hbb/PrG2mD9fEVq/MGEovZeKsD7fAHw8Ckvovkegjb93jY4/tv0KPSaOAbzisXC7KrVGvOm2uDswRm87uKGHPLPcXrlrL9u7x5sAu8jd+D3e2Mw/I/h9PgJpoj448ze9QexQPMifmT82Xoy/WDCFvu0kCz0fHBK8hKJLuxyDULyL9tQ76g9aO7ihhzz+3F65aS/bu6vK+rr4l2a/QIKlvtr3eT7GZbU/MM8XQI6FNEDKoZk/tYWyP+7Rij+2/Qo9JJcgvJ/ZhLvzTUO8h23vO+dTeju4oYc8I91euWwv27u2mgC7lGgOSwRLE4aUaBJ0lFKUdS4=",
+ "achieved_goal": "[[ 0.5943932 1.1940755 0.03399774]\n [-1.1064538 1.1136922 0.03393336]\n [-1.0966251 -0.26013446 0.03397076]\n [ 1.3947054 1.0845315 0.03393336]]",
+ "desired_goal": "[[ 0.15067767 -0.9094817 -1.0777241 ]\n [-1.6825356 1.5308391 0.12150954]\n [-0.36807314 -0.96907526 -1.0777241 ]\n [-1.2251174 0.48244703 -1.0777241 ]]",
+ "observation": "[[-1.12505174e+00 -1.04056358e+00 1.22867775e+00 3.66447121e-01\n -2.64349848e-01 4.06830013e-01 1.20024955e+00 5.94393194e-01\n 1.19407547e+00 3.39977406e-02 -8.58761929e-03 -3.75834922e-03\n -1.01952394e-02 5.36914123e-03 5.29468060e-03 1.58882923e-02\n -8.31393152e-03 -1.16551360e-02 -1.42197555e-03]\n [ 4.99436289e-01 1.19307530e+00 -8.51827562e-01 -4.11083102e-02\n 3.44807357e-01 1.53810671e-02 -4.53697503e-01 -1.10645378e+00\n 1.11369216e+00 3.39333639e-02 -7.90742598e-03 -3.67271202e-03\n -1.21281538e-02 5.63703896e-03 3.65103409e-03 1.65566057e-02\n -2.12537867e-04 -6.68900227e-03 -1.96241005e-03]\n [ 1.21516764e-01 1.60036826e+00 2.48016879e-01 3.17207396e-01\n -4.49096859e-02 1.27516398e-02 1.20018864e+00 -1.09662509e+00\n -2.60134459e-01 3.39707620e-02 -8.91783740e-03 -3.10722087e-03\n -1.27265714e-02 6.49911677e-03 3.32736457e-03 1.65566057e-02\n -2.12538958e-04 -6.68900134e-03 -1.91338861e-03]\n [-9.00756359e-01 -3.23259354e-01 2.44109541e-01 1.41716838e+00\n 2.37202072e+00 2.82065153e+00 1.20024991e+00 1.39470541e+00\n 1.08453155e+00 3.39333639e-02 -9.80165973e-03 -4.05426277e-03\n -1.19204400e-02 7.30675785e-03 3.81969824e-03 1.65566057e-02\n -2.12539497e-04 -6.68900274e-03 -1.96234649e-03]]"
+ },
+ "_last_episode_starts": {
+ ":type:": "<class 'numpy.ndarray'>",
+ ":serialized:": "gAWVdwAAAAAAAACMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYEAAAAAAAAAAEBAQGUjAVudW1weZSMBWR0eXBllJOUjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSwSFlIwBQ5R0lFKULg=="
+ },
+ "_last_original_obs": {
+ ":type:": "<class 'collections.OrderedDict'>",
+ ":serialized:": "gAWViwIAAAAAAACMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwSbnVtcHkuY29yZS5udW1lcmljlIwLX2Zyb21idWZmZXKUk5QoljAAAAAAAAAAb7lZPVo62j2TwaM8t7rIvexpyz3Qv6M8ufDGvd8Yx7zWwKM8Xon+PSUKxj3Qv6M8lIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYksESwOGlIwBQ5R0lFKUjAxkZXNpcmVkX2dvYWyUaAcoljAAAAAAAAAAavpfPCO2or0K16M8bZAUvokhCD7lg8w9YwIAvWxVrb0K16M8AAHYvbXIKj0K16M8lGgOSwRLA4aUaBJ0lFKUjAtvYnNlcnZhdGlvbpRoByiWMAEAAAAAAAC/a5a+esaUvnK71j6urWw86dOZvc5ACz4J16M9b7lZPVo62j2TwaM8ocaeN1wYqzaPG1o5OSXgt7+RLzgneKi3vHn2urhknLqNKA85Ini+PWnjzT7DLCc8UNWWvRxbAz7NnYg6cc+YPLe6yL3sacs90L+jPD/YjDhbOhs3vvW6uDexqbdZ6he2QD9QrIAYTS+UxVwuxY37uEzfRTudRwc/UfVnPrAweTt3n/G61m8OOdrVoz258Ma93xjHvNbAozw5mZa2a9YZOAfyPbkAGTE1VsA3t6CEjKznypKu+hCKL8/DyriqJ3a+xoSLvdwrZz5hXHc+OR1OP/nWdz8L16M9Xon+PSUKxj3Qv6M8ONiMuEw6G7c6/G+4LrKpN6/uFzZAF1isu0lTr3BeWq6Uffu4lGgOSwRLE4aUaBJ0lFKUdS4=",
+ "achieved_goal": "[[ 0.05315536 0.10655661 0.01998976]\n [-0.09801238 0.09932312 0.01998892]\n [-0.09713883 -0.02430385 0.01998941]\n [ 0.12428544 0.09669904 0.01998892]]",
+ "desired_goal": "[[ 0.01367054 -0.07944896 0.02 ]\n [-0.14508219 0.13294043 0.09986094]\n [-0.03125228 -0.08463559 0.02 ]\n [-0.10547066 0.04169532 0.02 ]]",
+ "observation": "[[-2.93790787e-01 -2.90576756e-01 4.19398844e-01 1.44457053e-02\n -7.51112178e-02 1.35989398e-01 7.99999908e-02 5.31553589e-02\n 1.06556609e-01 1.99897643e-02 1.89275615e-05 5.09903293e-06\n 2.08003665e-04 -2.67202140e-05 4.18589880e-05 -2.00831109e-05\n -1.88045902e-03 -1.19318720e-03 1.36526491e-04]\n [ 9.30025727e-02 4.02125627e-01 1.02035431e-02 -7.36490488e-02\n 1.28277242e-01 1.04230049e-03 1.86536033e-02 -9.80123803e-02\n 9.93231237e-02 1.99889243e-02 6.71599919e-05 9.25230688e-06\n -8.91494419e-05 -2.02288920e-05 -2.26371617e-06 -2.95936886e-12\n 1.86533455e-10 5.01976932e-11 -1.19950193e-04]\n [ 3.01929098e-03 5.28436482e-01 2.26521745e-01 3.80234048e-03\n -1.84343650e-03 1.35838374e-04 7.99977332e-02 -9.71388295e-02\n -2.43038516e-02 1.99894123e-02 -4.48818582e-06 3.66777349e-05\n -1.81146068e-04 6.59740181e-07 -1.09524317e-05 -3.99376365e-12\n -6.67535402e-11 2.51141163e-10 -9.66858279e-05]\n [-2.40385681e-01 -6.81243390e-02 2.25753248e-01 2.41563335e-01\n 8.05133402e-01 9.68123972e-01 8.00000057e-02 1.24285445e-01\n 9.66990367e-02 1.99889243e-02 -6.71599410e-05 -9.25229324e-06\n -5.72169447e-05 2.02293413e-05 2.26396855e-06 -3.07083525e-12\n -1.92165325e-10 -4.96513386e-11 -1.19920034e-04]]"
+ },
+ "_episode_num": 20624,
+ "use_sde": false,
+ "sde_sample_freq": -1,
+ "_current_progress_remaining": 0.0,
+ "_stats_window_size": 100,
+ "ep_info_buffer": {
+ ":type:": "<class 'collections.deque'>",
+ ":serialized:": "gAWV4AsAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKH2UKIwBcpRHwC1N/hESdvuMAWyUSzKMAXSUR0DF+IDeXRgJdX2UKGgGR8Alhegte2NOaAdLMmgIR0DF+Hf8ZUDMdX2UKGgGR7+nZ00WM0gsaAdLAWgIR0DF+H221D0EdX2UKGgGR8AkZP+GXXyzaAdLMmgIR0DF+PA5WBBidX2UKGgGR8At8TEBKcuraAdLMmgIR0DF+ZxDeCTVdX2UKGgGR8AnXt3OfNA1aAdLMmgIR0DF+Z1WS2YwdX2UKGgGR8AdpEx7AtWdaAdLMmgIR0DF+Zp2St/4dX2UKGgGR8AlkGmk30f6aAdLMmgIR0DF+gj7XQMQdX2UKGgGR8AoyofjjrAyaAdLMmgIR0DF+q57LMcIdX2UKGgGR8AwSzVc2R7raAdLMmgIR0DF+q5xgiNbdX2UKGgGR8AiqJa7mMfjaAdLMmgIR0DF+qmwkgOjdX2UKGgGR8AklrO7g88taAdLMmgIR0DF+xmU2UB5dX2UKGgGR8Ap3/z8P4EfaAdLMmgIR0DF+8o9X9zfdX2UKGgGR8Asx34bjtG/aAdLMmgIR0DF+81rhzeXdX2UKGgGR8AbUKgIyCWeaAdLMmgIR0DF+8g2Kl54dX2UKGgGR8AH8UKzAvcraAdLMmgIR0DF/DVOmBOIdX2UKGgGR8ATR62OQyRCaAdLMmgIR0DF/QlByCFsdX2UKGgGR8AQqpiqhlDnaAdLMmgIR0DF/RXsRg7YdX2UKGgGR8AnVHeaa1CxaAdLMmgIR0DF/S4ukDZEdX2UKGgGR8ArTcu8K5TZaAdLMmgIR0DF/dKg00m/dX2UKGgGR8AXVAprk8zRaAdLMmgIR0DF/oo8U21ldX2UKGgGR8AliiosI3R5aAdLMmgIR0DF/ou9vjwQdX2UKGgGR8AnhCu2Zy+6aAdLMmgIR0DF/oW5jH4odX2UKGgGR8AjlwtJ4B3iaAdLMmgIR0DF/vYfKZDzdX2UKGgGR8AxUQqqfe1saAdLMmgIR0DF/6Bid8RddX2UKGgGR8ApFgWrOqvNaAdLMmgIR0DF/6EvCdjHdX2UKGgGR8AlGKJl8PWhaAdLMmgIR0DF/5wkJKJ3dX2UKGgGR8AX9pj+aScLaAdLMmgIR0DGAAn5P/JedX2UKGgGR8Ac2/mDDjzaaAdLMmgIR0DGALkEovzwdX2UKGgGR8AnkOhkAggYaAdLMmgIR0DGALqPsAvMdX2UKGgGR8AqIx33YcvNaAdLMmgIR0DGALcihWYGdX2UKGgGR8AbR/wy6+WXaAdLMmgIR0DGASixkd3jdX2UKGgGR8AdV32VVxS6aAdLMmgIR0DGAdewqy4XdX2UKGgGR8Ag8O3DvVmSaAdLMmgIR0DGAdfied08dX2UKGgGR8AcDn5i3G4raAdLMmgIR0DGAdZOBUaRdX2UKGgGR8AhTYU34sVdaAdLMmgIR0DGAkmOQyRCdX2UKGgGR8ANsySFGoaUaAdLMmgIR0DGAvtBMSK4dX2UKGgGR8ApjG+9Jz1caAdLMmgIR0DGAwXW4EwGdX2UKGgGR8AGe0Re1KGtaAdLMmgIR0DGAxSofjjrdX2UKGgGR8Ag27V8Ti84aAdLMmgIR0DGA66/EfkndX2UKGgGR8AaXTAnDziCaAdLMmgIR0DGBID5uZTidX2UKGgGR8AaM2AG0NSZaAdLMmgIR0DGBIG34Kx+dX2UKGgGR8Ak92mHgxagaAdLMmgIR0DGBH6+FlCkdX2UKGgGR8AQHQswtapxaAdLMmgIR0DGBPAWHk92dX2UKGgGR8AVEyad+XqraAdLMmgIR0DGBZ0bPyCndX2UKGgGR8AfagPEsJ6ZaAdLMmgIR0DGBZ2XZ5AydX2UKGgGR8AjQo6S1Vo6aAdLMmgIR0DGBZqLMs6JdX2UKGgGR8ApgLzf779AaAdLMmgIR0DGBgtCgK4QdX2UKGgGR8AwFqcmShalaAdLMmgIR0DGBr/49HMEdX2UKGgGR8AqN4UN8VpLaAdLMmgIR0DGBsJi1AqvdX2UKGgGR8AibSx7iQ1aaAdLMmgIR0DGBr9ITXardX2UKGgGR8AhYzzErGzbaAdLMmgIR0DGBzVcjZ+QdX2UKGgGR8AhPpJwsGxEaAdLMmgIR0DGB+NQVKwqdX2UKGgGR8APU6eXiR4haAdLMmgIR0DGB+Sx/ustdX2UKGgGR8AfnCaZx7zDaAdLMmgIR0DGB9/o/zJ7dX2UKGgGR8AkfnXd0q6OaAdLMmgIR0DGCFIRTS9edX2UKGgGR8AQHynUDuBuaAdLMmgIR0DGCP9uejEfdX2UKGgGR8Ak56fra/RFaAdLMmgIR0DGCQBC6YmcdX2UKGgGR8Aqd0ihWYF8aAdLMmgIR0DGCP0tyxRmdX2UKGgGR8Awi1E3Kji5aAdLMmgIR0DGCZSY7aIvdX2UKGgGR8AkuksSTQmeaAdLMmgIR0DGCn0ahpQDdX2UKGgGR8ArWFcIJJGwaAdLMmgIR0DGCoZ++dsjdX2UKGgGR7+m1hLGrCFcaAdLAWgIR0DGCo5LoOhCdX2UKGgGR8AXeQr+YMOPaAdLMmgIR0DGCpfBacI7dX2UKGgGR8AckeOn2qT9aAdLMmgIR0DGCwsNSZSfdX2UKGgGR8AnMqkuYhMbaAdLMmgIR0DGC7tMoMKDdX2UKGgGR8AjCW2w3YL9aAdLMmgIR0DGC8GndfsvdX2UKGgGR8AYYu01IiC8aAdLMmgIR0DGC7f0PH1fdX2UKGgGR8AoONXo1UEQaAdLMmgIR0DGDC9IkJKKdX2UKGgGR8AdY3HaN+9baAdLMmgIR0DGDONOsT37dX2UKGgGR8AptA57w8W9aAdLMmgIR0DGDOqmhufmdX2UKGgGR8AgHIEr5IpZaAdLMmgIR0DGDOVbX6IndX2UKGgGR8AjJ/JeVs1saAdLMmgIR0DGDV3f4yoGdX2UKGgGR8AhmNrCWNWEaAdLMmgIR0DGDhGZLIxQdX2UKGgGR8Ajvt3wCr93aAdLMmgIR0DGDhjngYP5dX2UKGgGR8AuC6zVtoBaaAdLMmgIR0DGDg6RISUUdX2UKGgGR8AlMEX+ERJ3aAdLMmgIR0DGDoZaiblSdX2UKGgGR8AhSLc9GI9DaAdLMmgIR0DGDza11GLDdX2UKGgGR8Ar/FwT/Q0GaAdLMmgIR0DGDz1Y2bXpdX2UKGgGR8AuYZ6Uqx1QaAdLMmgIR0DGDzXx4IKMdX2UKGgGR8AjPpA2Q4jsaAdLMmgIR0DGD7G7pV0cdX2UKGgGR8AkXjlxOtW/aAdLMmgIR0DGEKd2C/XYdX2UKGgGR8AoDlcQiA2AaAdLMmgIR0DGELyUPhAGdX2UKGgGR8Adc6FM7EHdaAdLMmgIR0DGEMgT7EYPdX2UKGgGR8AdME9t/FzdaAdLMmgIR0DGEWHBzmwJdX2UKGgGR8AY7vnbItDlaAdLMmgIR0DGEhIC2c8UdX2UKGgGR8AhbesxO+IuaAdLMmgIR0DG
EhmpEQXidX2UKGgGR8AjL2V3Ux20aAdLMmgIR0DGEg8yi22HdX2UKGgGR8Ai1IV/MGHIaAdLMmgIR0DGEn9ZA6dUdX2UKGgGR8Ad1jqfOD8MaAdLMmgIR0DGEzJ+QU5/dX2UKGgGR8Ag/BciW3SbaAdLMmgIR0DGEzjrC3w1dX2UKGgGR8AuqW0JF9a2aAdLMmgIR0DGEy9senyedX2UKGgGR8AqmxptaY/naAdLMmgIR0DGE6naJyhjdX2UKGgGR8AqYlPacqe9aAdLMmgIR0DGFF3dCVrzdX2UKGgGR8Apv9oexOclaAdLMmgIR0DGFGS+g13udX2UKGgGR8AHVJ4B3iaRaAdLMmgIR0DGFFxnBciXdX2UKGgGR8Ahz4vexfOVaAdLMmgIR0DGFNO1rqMWdX2UKGgGR8AhdPpIMBp6aAdLMmgIR0DGFXusNlRQdX2UKGgGR8AcHBHkLhJiaAdLMmgIR0DGFYD4agmJdX2UKGgGR8AlZjgAIY3vaAdLMmgIR0DGFXW+49X+dWUu"
+ },
+ "ep_success_buffer": {
+ ":type:": "<class 'collections.deque'>",
+ ":serialized:": "gAWVhgAAAAAAAACMC2NvbGxlY3Rpb25zlIwFZGVxdWWUk5QpS2SGlFKUKImJiImJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiImJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYmJiYllLg=="
+ },
+ "_n_updates": 249975,
+ "buffer_size": 1000000,
+ "batch_size": 256,
+ "learning_starts": 100,
+ "tau": 0.005,
+ "gamma": 0.99,
+ "gradient_steps": 1,
+ "optimize_memory_usage": false,
+ "replay_buffer_class": {
+ ":type:": "<class 'abc.ABCMeta'>",
+ ":serialized:": "gAWVOQAAAAAAAACMIHN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi5idWZmZXJzlIwQRGljdFJlcGxheUJ1ZmZlcpSTlC4=",
+ "__module__": "stable_baselines3.common.buffers",
+ "__annotations__": "{'observation_space': <class 'gymnasium.spaces.dict.Dict'>, 'obs_shape': typing.Dict[str, typing.Tuple[int, ...]], 'observations': typing.Dict[str, numpy.ndarray], 'next_observations': typing.Dict[str, numpy.ndarray]}",
+ "__doc__": "\n Dict Replay buffer used in off-policy algorithms like SAC/TD3.\n Extends the ReplayBuffer to use dictionary observations\n\n :param buffer_size: Max number of element in the buffer\n :param observation_space: Observation space\n :param action_space: Action space\n :param device: PyTorch device\n :param n_envs: Number of parallel environments\n :param optimize_memory_usage: Enable a memory efficient variant\n Disabled for now (see https://github.com/DLR-RM/stable-baselines3/pull/243#discussion_r531535702)\n :param handle_timeout_termination: Handle timeout termination (due to timelimit)\n separately and treat the task as infinite horizon task.\n https://github.com/DLR-RM/stable-baselines3/issues/284\n ",
+ "__init__": "<function DictReplayBuffer.__init__ at 0x7caa9e23f490>",
+ "add": "<function DictReplayBuffer.add at 0x7caa9e23f520>",
+ "sample": "<function DictReplayBuffer.sample at 0x7caa9e23f5b0>",
+ "_get_samples": "<function DictReplayBuffer._get_samples at 0x7caa9e23f640>",
+ "__abstractmethods__": "frozenset()",
+ "_abc_impl": "<_abc._abc_data object at 0x7caa9e243500>"
+ },
+ "replay_buffer_kwargs": {},
+ "train_freq": {
+ ":type:": "<class 'stable_baselines3.common.type_aliases.TrainFreq'>",
+ ":serialized:": "gAWVYQAAAAAAAACMJXN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi50eXBlX2FsaWFzZXOUjAlUcmFpbkZyZXGUk5RLAWgAjBJUcmFpbkZyZXF1ZW5jeVVuaXSUk5SMBHN0ZXCUhZRSlIaUgZQu"
+ },
+ "use_sde_at_warmup": false,
+ "target_entropy": -4.0,
+ "ent_coef": "auto",
+ "target_update_interval": 1,
+ "observation_space": {
+ ":type:": "<class 'gymnasium.spaces.dict.Dict'>",
+ ":serialized:": "gAWVMgQAAAAAAACMFWd5bW5hc2l1bS5zcGFjZXMuZGljdJSMBERpY3SUk5QpgZR9lCiMBnNwYWNlc5SMC2NvbGxlY3Rpb25zlIwLT3JkZXJlZERpY3SUk5QpUpQojA1hY2hpZXZlZF9nb2FslIwUZ3ltbmFzaXVtLnNwYWNlcy5ib3iUjANCb3iUk5QpgZR9lCiMBWR0eXBllIwFbnVtcHmUjAVkdHlwZZSTlIwCZjSUiYiHlFKUKEsDjAE8lE5OTkr/////Sv////9LAHSUYowNYm91bmRlZF9iZWxvd5SMEm51bXB5LmNvcmUubnVtZXJpY5SMC19mcm9tYnVmZmVylJOUKJYDAAAAAAAAAAEBAZRoE4wCYjGUiYiHlFKUKEsDjAF8lE5OTkr/////Sv////9LAHSUYksDhZSMAUOUdJRSlIwNYm91bmRlZF9hYm92ZZRoHCiWAwAAAAAAAAABAQGUaCBLA4WUaCR0lFKUjAZfc2hhcGWUSwOFlIwDbG93lGgcKJYMAAAAAAAAAAAAIMEAACDBAAAgwZRoFksDhZRoJHSUUpSMBGhpZ2iUaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlIwIbG93X3JlcHKUjAUtMTAuMJSMCWhpZ2hfcmVwcpSMBDEwLjCUjApfbnBfcmFuZG9tlE51YowMZGVzaXJlZF9nb2FslGgNKYGUfZQoaBBoFmgZaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgnaBwolgMAAAAAAAAAAQEBlGggSwOFlGgkdJRSlGgsSwOFlGguaBwolgwAAAAAAAAAAAAgwQAAIMEAACDBlGgWSwOFlGgkdJRSlGgzaBwolgwAAAAAAAAAAAAgQQAAIEEAACBBlGgWSwOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YowLb2JzZXJ2YXRpb26UaA0pgZR9lChoEGgWaBloHCiWEwAAAAAAAAABAQEBAQEBAQEBAQEBAQEBAQEBlGggSxOFlGgkdJRSlGgnaBwolhMAAAAAAAAAAQEBAQEBAQEBAQEBAQEBAQEBAZRoIEsThZRoJHSUUpRoLEsThZRoLmgcKJZMAAAAAAAAAAAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMEAACDBAAAgwQAAIMGUaBZLE4WUaCR0lFKUaDNoHCiWTAAAAAAAAAAAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBAAAgQQAAIEEAACBBlGgWSxOFlGgkdJRSlGg4jAUtMTAuMJRoOowEMTAuMJRoPE51YnVoLE5oEE5oPE51Yi4=",
+ "spaces": "OrderedDict([('achieved_goal', Box(-10.0, 10.0, (3,), float32)), ('desired_goal', Box(-10.0, 10.0, (3,), float32)), ('observation', Box(-10.0, 10.0, (19,), float32))])",
+ "_shape": null,
+ "dtype": null,
+ "_np_random": null
+ },
+ "action_space": {
+ ":type:": "<class 'gymnasium.spaces.box.Box'>",
+ ":serialized:": "gAWVawIAAAAAAACMFGd5bW5hc2l1bS5zcGFjZXMuYm94lIwDQm94lJOUKYGUfZQojAVkdHlwZZSMBW51bXB5lIwFZHR5cGWUk5SMAmY0lImIh5RSlChLA4wBPJROTk5K/////0r/////SwB0lGKMDWJvdW5kZWRfYmVsb3eUjBJudW1weS5jb3JlLm51bWVyaWOUjAtfZnJvbWJ1ZmZlcpSTlCiWBAAAAAAAAAABAQEBlGgIjAJiMZSJiIeUUpQoSwOMAXyUTk5OSv////9K/////0sAdJRiSwSFlIwBQ5R0lFKUjA1ib3VuZGVkX2Fib3ZllGgRKJYEAAAAAAAAAAEBAQGUaBVLBIWUaBl0lFKUjAZfc2hhcGWUSwSFlIwDbG93lGgRKJYQAAAAAAAAAAAAgL8AAIC/AACAvwAAgL+UaAtLBIWUaBl0lFKUjARoaWdolGgRKJYQAAAAAAAAAAAAgD8AAIA/AACAPwAAgD+UaAtLBIWUaBl0lFKUjAhsb3dfcmVwcpSMBC0xLjCUjAloaWdoX3JlcHKUjAMxLjCUjApfbnBfcmFuZG9tlIwUbnVtcHkucmFuZG9tLl9waWNrbGWUjBBfX2dlbmVyYXRvcl9jdG9ylJOUjAVQQ0c2NJRoMowUX19iaXRfZ2VuZXJhdG9yX2N0b3KUk5SGlFKUfZQojA1iaXRfZ2VuZXJhdG9ylIwFUENHNjSUjAVzdGF0ZZR9lChoPYoQAwiL5FV7Qs4TDuqR03IAT4wDaW5jlIoRrW+5A4GHaDIrz4zJa9YWoAB1jApoYXNfdWludDMylEsAjAh1aW50ZWdlcpRLAHVidWIu",
+ "dtype": "float32",
+ "bounded_below": "[ True True True True]",
+ "bounded_above": "[ True True True True]",
+ "_shape": [
+ 4
+ ],
+ "low": "[-1. -1. -1. -1.]",
+ "high": "[1. 1. 1. 1.]",
+ "low_repr": "-1.0",
+ "high_repr": "1.0",
+ "_np_random": "Generator(PCG64)"
+ },
+ "n_envs": 4,
+ "lr_schedule": {
+ ":type:": "<class 'function'>",
+ ":serialized:": "gAWVoAMAAAAAAACMF2Nsb3VkcGlja2xlLmNsb3VkcGlja2xllIwOX21ha2VfZnVuY3Rpb26Uk5QoaACMDV9idWlsdGluX3R5cGWUk5SMCENvZGVUeXBllIWUUpQoSwFLAEsASwFLA0sTQwx0AIgAfACDAYMBUwCUToWUjAVmbG9hdJSFlIwScHJvZ3Jlc3NfcmVtYWluaW5nlIWUjEkvdXNyL2xvY2FsL2xpYi9weXRob24zLjEwL2Rpc3QtcGFja2FnZXMvc3RhYmxlX2Jhc2VsaW5lczMvY29tbW9uL3V0aWxzLnB5lIwIPGxhbWJkYT6US2FDAgwAlIwOdmFsdWVfc2NoZWR1bGWUhZQpdJRSlH2UKIwLX19wYWNrYWdlX1+UjBhzdGFibGVfYmFzZWxpbmVzMy5jb21tb26UjAhfX25hbWVfX5SMHnN0YWJsZV9iYXNlbGluZXMzLmNvbW1vbi51dGlsc5SMCF9fZmlsZV9flIxJL3Vzci9sb2NhbC9saWIvcHl0aG9uMy4xMC9kaXN0LXBhY2thZ2VzL3N0YWJsZV9iYXNlbGluZXMzL2NvbW1vbi91dGlscy5weZR1Tk5oAIwQX21ha2VfZW1wdHlfY2VsbJSTlClSlIWUdJRSlIwcY2xvdWRwaWNrbGUuY2xvdWRwaWNrbGVfZmFzdJSMEl9mdW5jdGlvbl9zZXRzdGF0ZZSTlGghfZR9lChoGGgPjAxfX3F1YWxuYW1lX1+UjCFnZXRfc2NoZWR1bGVfZm4uPGxvY2Fscz4uPGxhbWJkYT6UjA9fX2Fubm90YXRpb25zX1+UfZSMDl9fa3dkZWZhdWx0c19flE6MDF9fZGVmYXVsdHNfX5ROjApfX21vZHVsZV9flGgZjAdfX2RvY19flE6MC19fY2xvc3VyZV9flGgAjApfbWFrZV9jZWxslJOUaAIoaAcoSwFLAEsASwFLAUsTQwSIAFMAlGgJKYwBX5SFlGgOjARmdW5jlEuFQwIEAZSMA3ZhbJSFlCl0lFKUaBVOTmgdKVKUhZR0lFKUaCRoPn2UfZQoaBhoNWgnjBljb25zdGFudF9mbi48bG9jYWxzPi5mdW5jlGgpfZRoK05oLE5oLWgZaC5OaC9oMUc/M6kqMFUyYYWUUpSFlIwXX2Nsb3VkcGlja2xlX3N1Ym1vZHVsZXOUXZSMC19fZ2xvYmFsc19flH2UdYaUhlIwhZRSlIWUaEZdlGhIfZR1hpSGUjAu"
+ },
+ "batch_norm_stats": [],
+ "batch_norm_stats_target": []
+ }
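The hyperparameters recorded in `data` map directly onto an SAC constructor call. A sketch of an equivalent training setup (the environment factory is an assumption, and the `VecNormalize` wrapper saved as `vec_normalize.pkl` is omitted):

```python
import panda_gym  # noqa: F401  (registers the Panda tasks; assumed installed)
from stable_baselines3 import SAC
from stable_baselines3.common.env_util import make_vec_env

# Mirrors the stored settings: MultiInputPolicy, n_envs=4, lr 3e-4, 1M-step replay buffer, etc.
env = make_vec_env("PandaPickAndPlaceDense-v3", n_envs=4)
model = SAC(
    "MultiInputPolicy",
    env,
    learning_rate=0.0003,
    buffer_size=1_000_000,
    learning_starts=100,
    batch_size=256,
    tau=0.005,
    gamma=0.99,
    train_freq=1,
    gradient_steps=1,
    ent_coef="auto",
    target_update_interval=1,
    use_sde=False,
    verbose=1,
)
model.learn(total_timesteps=1_000_000)
model.save("sac-PandaPickAndPlaceDense-v3")
```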
sac-PandaPickAndPlaceDense-v3/ent_coef_optimizer.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:32223b1ae9959b21199c920d6b28000c72b1b77b2833d1c2147597e22e98518e
+ size 1940
sac-PandaPickAndPlaceDense-v3/policy.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:99ca60f9666d92c12096caec9f890890fdc43bb5523733d78299e63d91747000
+ size 1489334
sac-PandaPickAndPlaceDense-v3/pytorch_variables.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b11d3a32f5d3b44d9fee84d29d6abe511dc26c0690a49a0d889498b5beab1b29
+ size 1180
sac-PandaPickAndPlaceDense-v3/system_info.txt ADDED
@@ -0,0 +1,9 @@
+ - OS: Linux-6.1.85+-x86_64-with-glibc2.35 # 1 SMP PREEMPT_DYNAMIC Sun Apr 28 14:29:16 UTC 2024
+ - Python: 3.10.12
+ - Stable-Baselines3: 2.3.2
+ - PyTorch: 2.3.0+cu121
+ - GPU Enabled: False
+ - Numpy: 1.25.2
+ - Cloudpickle: 2.2.1
+ - Gymnasium: 0.29.1
+ - OpenAI Gym: 0.25.2
vec_normalize.pkl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:db7835d72afd5917d9a703cc181b2e4ffcd38f3d1bfcfe2f49dd2d1ba9e676cb
+ size 3248
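The normalization statistics stored here can be re-applied when evaluating the agent. A sketch, assuming the file sits next to the downloaded checkpoint:

```python
import gymnasium as gym
import panda_gym  # noqa: F401  (assumed installed)
from stable_baselines3 import SAC
from stable_baselines3.common.vec_env import DummyVecEnv, VecNormalize

venv = DummyVecEnv([lambda: gym.make("PandaPickAndPlaceDense-v3")])
# Restore the observation/reward running statistics saved during training
venv = VecNormalize.load("vec_normalize.pkl", venv)
venv.training = False      # freeze the statistics at evaluation time
venv.norm_reward = False   # report raw, unnormalized rewards

model = SAC.load("sac-PandaPickAndPlaceDense-v3.zip", env=venv)
obs = venv.reset()
action, _ = model.predict(obs, deterministic=True)
```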