{"language": "Python", "id": 11, "repo_owner": "pymc-devs", "repo_name": "pymc", "head_branch": "ruff_linter", "workflow_name": "pre-commit", "workflow_filename": "pre-commit.yml", "workflow_path": ".github/workflows/pre-commit.yml", "contributor": "juanitorduz", "sha_fail": "9981ca154ba03a88deaa96d16b119de6183017e5", "sha_success": "c50bdf8c2e84c61953b892b8b80ea724bf1746b4", "workflow": "name: pre-commit\n\non:\n  pull_request:\n  push:\n    branches: [main]\n\njobs:\n  pre-commit:\n    runs-on: ubuntu-latest\n    env:\n      SKIP: no-commit-to-branch\n    steps:\n    - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11\n    - uses: actions/setup-python@v5\n      with:\n        python-version: \"3.9\"  # Run pre-commit on oldest supported Python version\n    - uses: pre-commit/[email protected]\n  mypy:\n    runs-on: ubuntu-latest\n    defaults:\n      run:\n        shell: bash -l {0}\n    steps:\n      - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11\n      - name: Cache conda\n        uses: actions/cache@v3\n        env:\n          # Increase this value to reset cache if environment-test.yml has not changed\n          CACHE_NUMBER: 0\n        with:\n          path: ~/conda_pkgs_dir\n          key: ${{ runner.os }}-py39-conda-${{ env.CACHE_NUMBER }}-${{\n            hashFiles('conda-envs/environment-test.yml') }}\n      - name: Cache multiple paths\n        uses: actions/cache@v3\n        env:\n          # Increase this value to reset cache if requirements.txt has not changed\n          CACHE_NUMBER: 0\n        with:\n          path: |\n            ~/.cache/pip\n            $RUNNER_TOOL_CACHE/Python/*\n            ~\\AppData\\Local\\pip\\Cache\n          key: ${{ runner.os }}-build-${{ matrix.python-version }}-${{ env.CACHE_NUMBER }}-${{\n            hashFiles('requirements.txt') }}\n      - uses: conda-incubator/setup-miniconda@v2\n        with:\n          miniforge-variant: Mambaforge\n          miniforge-version: latest\n          mamba-version: \"*\"\n          activate-environment: pymc-test\n          channel-priority: strict\n          environment-file: conda-envs/environment-test.yml\n          python-version: \"3.9\"  # Run pre-commit on oldest supported Python version\n          use-mamba: true\n          use-only-tar-bz2: false # IMPORTANT: This may break caching of conda packages! 
See https://github.com/conda-incubator/setup-miniconda/issues/267\n      - name: Install-pymc and mypy dependencies\n        run: |\n          conda activate pymc-test\n          pip install -e .\n          pip install --pre -U polyagamma\n          python --version\n      - name: Run mypy\n        run: |\n          conda activate pymc-test\n          python ./scripts/run_mypy.py --verbose\n", "logs": [{"step_name": "pre-commit/4_Run [email protected]", "log": "##[group]Run pre-commit/[email protected]\nwith:\n  extra_args: --all-files\nenv:\n  SKIP: no-commit-to-branch\n  pythonLocation: /opt/hostedtoolcache/Python/3.9.18/x64\n  PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib\n##[endgroup]\n##[group]Run python -m pip install pre-commit\n\u001b[36;1mpython -m pip install pre-commit\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\nenv:\n  SKIP: no-commit-to-branch\n  pythonLocation: /opt/hostedtoolcache/Python/3.9.18/x64\n  PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib\n##[endgroup]\nCollecting pre-commit\n  Downloading pre_commit-3.6.0-py2.py3-none-any.whl.metadata (1.3 kB)\nCollecting cfgv>=2.0.0 (from pre-commit)\n  Downloading cfgv-3.4.0-py2.py3-none-any.whl.metadata (8.5 kB)\nCollecting identify>=1.0.0 (from pre-commit)\n  Downloading identify-2.5.33-py2.py3-none-any.whl.metadata (4.4 kB)\nCollecting nodeenv>=0.11.1 (from pre-commit)\n  Downloading nodeenv-1.8.0-py2.py3-none-any.whl.metadata (21 kB)\nCollecting pyyaml>=5.1 (from pre-commit)\n  Downloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)\nCollecting virtualenv>=20.10.0 (from pre-commit)\n  Downloading virtualenv-20.25.0-py3-none-any.whl.metadata (4.5 kB)\nRequirement already satisfied: setuptools in /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/site-packages (from nodeenv>=0.11.1->pre-commit) (58.1.0)\nCollecting distlib<1,>=0.3.7 (from virtualenv>=20.10.0->pre-commit)\n  Downloading distlib-0.3.8-py2.py3-none-any.whl.metadata (5.1 kB)\nCollecting filelock<4,>=3.12.2 (from virtualenv>=20.10.0->pre-commit)\n  Downloading filelock-3.13.1-py3-none-any.whl.metadata (2.8 kB)\nCollecting platformdirs<5,>=3.9.1 (from virtualenv>=20.10.0->pre-commit)\n  Downloading platformdirs-4.1.0-py3-none-any.whl.metadata (11 kB)\nDownloading pre_commit-3.6.0-py2.py3-none-any.whl (204 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 204.0/204.0 kB 14.2 MB/s eta 0:00:00\nDownloading cfgv-3.4.0-py2.py3-none-any.whl (7.2 kB)\nDownloading identify-2.5.33-py2.py3-none-any.whl (98 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 98.9/98.9 kB 29.1 MB/s eta 
0:00:00\nDownloading nodeenv-1.8.0-py2.py3-none-any.whl (22 kB)\nDownloading PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (738 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 738.9/738.9 kB 67.7 MB/s eta 0:00:00\nDownloading virtualenv-20.25.0-py3-none-any.whl (3.8 MB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 3.8/3.8 MB 93.7 MB/s eta 0:00:00\nDownloading distlib-0.3.8-py2.py3-none-any.whl (468 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 468.9/468.9 kB 77.3 MB/s eta 0:00:00\nDownloading filelock-3.13.1-py3-none-any.whl (11 kB)\nDownloading platformdirs-4.1.0-py3-none-any.whl (17 kB)\nInstalling collected packages: distlib, pyyaml, platformdirs, nodeenv, identify, filelock, cfgv, virtualenv, pre-commit\nSuccessfully installed cfgv-3.4.0 distlib-0.3.8 filelock-3.13.1 identify-2.5.33 nodeenv-1.8.0 platformdirs-4.1.0 pre-commit-3.6.0 pyyaml-6.0.1 virtualenv-20.25.0\n\n[notice] A new release of pip is available: 23.0.1 -> 23.3.2\n[notice] To update, run: pip install --upgrade pip\n##[group]Run python -m pip freeze --local\n\u001b[36;1mpython -m pip freeze --local\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\nenv:\n  SKIP: no-commit-to-branch\n  pythonLocation: /opt/hostedtoolcache/Python/3.9.18/x64\n  PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib\n##[endgroup]\ncfgv==3.4.0\ndistlib==0.3.8\nfilelock==3.13.1\nidentify==2.5.33\nnodeenv==1.8.0\nplatformdirs==4.1.0\npre-commit==3.6.0\nPyYAML==6.0.1\nvirtualenv==20.25.0\n##[group]Run actions/cache@v3\nwith:\n  path: ~/.cache/pre-commit\n  key: pre-commit-3|/opt/hostedtoolcache/Python/3.9.18/x64|8521b1deaccc96b7bf978113f6a9d2819d4547b59963b2c02ead71c093c77e29\n  enableCrossOsArchive: false\n  fail-on-cache-miss: false\n  lookup-only: false\nenv:\n  SKIP: no-commit-to-branch\n  pythonLocation: /opt/hostedtoolcache/Python/3.9.18/x64\n  PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib\n##[endgroup]\nCache not found for input keys: pre-commit-3|/opt/hostedtoolcache/Python/3.9.18/x64|8521b1deaccc96b7bf978113f6a9d2819d4547b59963b2c02ead71c093c77e29\n##[group]Run pre-commit run --show-diff-on-failure --color=always --all-files\n\u001b[36;1mpre-commit run --show-diff-on-failure --color=always --all-files\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\nenv:\n  SKIP: no-commit-to-branch\n  pythonLocation: /opt/hostedtoolcache/Python/3.9.18/x64\n  PKG_CONFIG_PATH: 
/opt/hostedtoolcache/Python/3.9.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.9.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.9.18/x64/lib\n##[endgroup]\n[INFO]\u001b[m Initializing environment for https://github.com/pre-commit/pre-commit-hooks.\n[INFO]\u001b[m Initializing environment for https://github.com/lucianopaz/head_of_apache.\n[INFO]\u001b[m Initializing environment for https://github.com/asottile/pyupgrade.\n[INFO]\u001b[m Initializing environment for https://github.com/astral-sh/ruff-pre-commit.\n[INFO]\u001b[m Initializing environment for https://github.com/PyCQA/pydocstyle.\n[INFO]\u001b[m Initializing environment for https://github.com/MarcoGorelli/madforhooks.\n[INFO]\u001b[m Initializing environment for local:pandas,pyyaml.\n[INFO]\u001b[m Initializing environment for local:pyyaml.\n[INFO]\u001b[m Installing environment for https://github.com/pre-commit/pre-commit-hooks.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for https://github.com/lucianopaz/head_of_apache.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for https://github.com/asottile/pyupgrade.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for https://github.com/astral-sh/ruff-pre-commit.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for https://github.com/PyCQA/pydocstyle.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for https://github.com/MarcoGorelli/madforhooks.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for local.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\n[INFO]\u001b[m Installing environment for local.\n[INFO]\u001b[m Once installed this environment will be reused.\n[INFO]\u001b[m This may take a few minutes...\ncheck for merge conflicts............................................................\u001b[42mPassed\u001b[m\ncheck toml...........................................................................\u001b[42mPassed\u001b[m\ncheck yaml...........................................................................\u001b[42mPassed\u001b[m\ndebug statements (python)............................................................\u001b[42mPassed\u001b[m\nfix end of files.....................................................................\u001b[42mPassed\u001b[m\ndon't commit to branch..............................................................\u001b[43;30mSkipped\u001b[m\nfix requirements.txt.................................................................\u001b[42mPassed\u001b[m\ntrim trailing whitespace.............................................................\u001b[42mPassed\u001b[m\nApply Apache 2.0 
License.............................................................\u001b[42mPassed\u001b[m\npyupgrade............................................................................\u001b[42mPassed\u001b[m\nruff.................................................................................\u001b[42mPassed\u001b[m\nruff-format..........................................................................\u001b[41mFailed\u001b[m\n\u001b[2m- hook id: ruff-format\u001b[m\n\u001b[2m- files were modified by this hook\u001b[m\n\n11 files reformatted, 197 files left unchanged\n\npydocstyle...........................................................................\u001b[42mPassed\u001b[m\nDisallow print statements............................................................\u001b[42mPassed\u001b[m\nCheck no tests are ignored...........................................................\u001b[42mPassed\u001b[m\nGenerate pip dependency from conda...................................................\u001b[42mPassed\u001b[m\nNo relative imports..................................................................\u001b[42mPassed\u001b[m\nCheck no links that should be cross-references are in the docs.......................\u001b[42mPassed\u001b[m\npre-commit hook(s) made changes.\nIf you are seeing this message in CI, reproduce locally with: `pre-commit run --all-files`.\nTo run `pre-commit` as part of git workflow, use `pre-commit install`.\nAll changes made by hooks:\n\u001b[1mdiff --git a/pymc/distributions/discrete.py b/pymc/distributions/discrete.py\u001b[m\n\u001b[1mindex f95b437..8771193 100644\u001b[m\n\u001b[1m--- a/pymc/distributions/discrete.py\u001b[m\n\u001b[1m+++ b/pymc/distributions/discrete.py\u001b[m\n\u001b[36m@@ -112,6 +112,7 @@\u001b[m \u001b[mclass Binomial(Discrete):\u001b[m\n     logit_p : tensor_like of float\u001b[m\n         Alternative log odds for the probability of success.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = binomial\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -334,6 +335,7 @@\u001b[m \u001b[mclass Bernoulli(Discrete):\u001b[m\n     logit_p : tensor_like of float\u001b[m\n         Alternative log odds for the probability of success.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = bernoulli\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -450,6 +452,7 @@\u001b[m \u001b[mclass DiscreteWeibull(Discrete):\u001b[m\n         Shape parameter (beta > 0).\u001b[m\n \u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = discrete_weibull\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -539,6 +542,7 @@\u001b[m \u001b[mclass Poisson(Discrete):\u001b[m\n     The Poisson distribution can be derived as a limiting case of the\u001b[m\n     binomial distribution.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = poisson\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -662,6 +666,7 @@\u001b[m \u001b[mclass NegativeBinomial(Discrete):\u001b[m\n     n : tensor_like of float\u001b[m\n         Alternative number of target success trials (n > 0)\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = nbinom\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -1108,6 +1113,7 @@\u001b[m \u001b[mclass Categorical(Discrete):\u001b[m\n     logit_p : float\u001b[m\n         Alternative log odds for the probability of success.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = categorical\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -1183,6 +1189,7 
@@\u001b[m \u001b[mclass _OrderedLogistic(Categorical):\u001b[m\n     Underlying class for ordered logistic distributions.\u001b[m\n     See docs for the OrderedLogistic wrapper class for more details on how to use it in models.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = categorical\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -1289,6 +1296,7 @@\u001b[m \u001b[mclass _OrderedProbit(Categorical):\u001b[m\n     Underlying class for ordered probit distributions.\u001b[m\n     See docs for the OrderedProbit wrapper class for more details on how to use it in models.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = categorical\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[1mdiff --git a/pymc/distributions/multivariate.py b/pymc/distributions/multivariate.py\u001b[m\n\u001b[1mindex 1e5a956..570c139 100644\u001b[m\n\u001b[1m--- a/pymc/distributions/multivariate.py\u001b[m\n\u001b[1m+++ b/pymc/distributions/multivariate.py\u001b[m\n\u001b[36m@@ -235,6 +235,7 @@\u001b[m \u001b[mclass MvNormal(Continuous):\u001b[m\n         vals_raw = pm.Normal('vals_raw', mu=0, sigma=1, shape=(5, 3))\u001b[m\n         vals = pm.Deterministic('vals', pt.dot(chol, vals_raw.T).T)\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = multivariate_normal\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -355,6 +356,7 @@\u001b[m \u001b[mclass MvStudentT(Continuous):\u001b[m\n     lower : bool, default=True\u001b[m\n         Whether the cholesky fatcor is given as a lower triangular matrix.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = mv_studentt\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -436,6 +438,7 @@\u001b[m \u001b[mclass Dirichlet(SimplexContinuous):\u001b[m\n         Concentration parameters (a > 0). The number of categories is given by the\u001b[m\n         length of the last axis.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = dirichlet\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -515,6 +518,7 @@\u001b[m \u001b[mclass Multinomial(Discrete):\u001b[m\n         categories is given by the length of the last axis. Elements are expected to sum\u001b[m\n         to 1 along the last axis.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = multinomial\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -662,6 +666,7 @@\u001b[m \u001b[mclass DirichletMultinomial(Discrete):\u001b[m\n         Dirichlet concentration parameters (a > 0). The number of categories is given by\u001b[m\n         the length of the last axis.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = dirichlet_multinomial\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -716,6 +721,7 @@\u001b[m \u001b[mclass _OrderedMultinomial(Multinomial):\u001b[m\n     Underlying class for ordered multinomial distributions.\u001b[m\n     See docs for the OrderedMultinomial wrapper class for more details on how to use it in models.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = multinomial\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -940,6 +946,7 @@\u001b[m \u001b[mclass Wishart(Continuous):\u001b[m\n     This distribution is unusable in a PyMC model. 
You should instead\u001b[m\n     use LKJCholeskyCov or LKJCorr.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = wishart\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -1763,6 +1770,7 @@\u001b[m \u001b[mclass MatrixNormal(Continuous):\u001b[m\n             vals = pm.MatrixNormal('vals', mu=mu, colchol=colchol, rowcov=rowcov,\u001b[m\n                                    observed=data)\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = matrixnormal\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -1977,6 +1985,7 @@\u001b[m \u001b[mclass KroneckerNormal(Continuous):\u001b[m\n     ----------\u001b[m\n     .. [1] Saatchi, Y. (2011). \"Scalable inference for structured Gaussian process models\"\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = kroneckernormal\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -2183,6 +2192,7 @@\u001b[m \u001b[mclass CAR(Continuous):\u001b[m\n         \"Generalized Hierarchical Multivariate CAR Models for Areal Data\"\u001b[m\n         Biometrics, Vol. 61, No. 4 (Dec., 2005), pp. 950-961\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = car\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -2400,9 +2410,7 @@\u001b[m \u001b[mclass ICAR(Continuous):\u001b[m\n         return pt.zeros(N)\u001b[m\n \u001b[m\n     def logp(value, W, node1, node2, N, sigma, zero_sum_stdev):\u001b[m\n\u001b[31m-        pairwise_difference = (-1 / (2 * sigma**2)) * pt.sum(\u001b[m\n\u001b[31m-            pt.square(value[node1] - value[node2])\u001b[m\n\u001b[31m-        )\u001b[m\n\u001b[32m+\u001b[m\u001b[32m        pairwise_difference = (-1 / (2 * sigma**2)) * pt.sum(pt.square(value[node1] - value[node2]))\u001b[m\n         zero_sum = (\u001b[m\n             -0.5 * pt.pow(pt.sum(value) / (zero_sum_stdev * N), 2)\u001b[m\n             - pt.log(pt.sqrt(2.0 * np.pi))\u001b[m\n\u001b[36m@@ -2498,6 +2506,7 @@\u001b[m \u001b[mclass StickBreakingWeights(SimplexContinuous):\u001b[m\n     .. [2] M\u00fcller, P., Quintana, F. A., Jara, A., & Hanson, T. (2015). Bayesian nonparametric data\u001b[m\n            analysis. 
New York: Springer.\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_op = stickbreakingweights\u001b[m\n \u001b[m\n     @classmethod\u001b[m\n\u001b[36m@@ -2641,6 +2650,7 @@\u001b[m \u001b[mclass ZeroSumNormal(Distribution):\u001b[m\n             # the zero sum axes will be the last two\u001b[m\n             v = pm.ZeroSumNormal(\"v\", shape=(3, 4, 5), n_zerosum_axes=2)\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     rv_type = ZeroSumNormalRV\u001b[m\n \u001b[m\n     def __new__(\u001b[m\n\u001b[1mdiff --git a/pymc/model/core.py b/pymc/model/core.py\u001b[m\n\u001b[1mindex c45f3f5..6ee6d49 100644\u001b[m\n\u001b[1m--- a/pymc/model/core.py\u001b[m\n\u001b[1m+++ b/pymc/model/core.py\u001b[m\n\u001b[36m@@ -138,9 +138,7 @@\u001b[m \u001b[mclass ContextMeta(type):\u001b[m\n \u001b[m\n     # FIXME: is there a more elegant way to automatically add methods to the class that\u001b[m\n     # are instance methods instead of class methods?\u001b[m\n\u001b[31m-    def __init__(\u001b[m\n\u001b[31m-        cls, name, bases, nmspc, context_class: Optional[Type] = None, **kwargs\u001b[m\n\u001b[31m-    ):  # pylint: disable=unused-argument\u001b[m\n\u001b[32m+\u001b[m\u001b[32m    def __init__(cls, name, bases, nmspc, context_class: Optional[Type] = None, **kwargs):  # pylint: disable=unused-argument\u001b[m\n         \"\"\"Add ``__enter__`` and ``__exit__`` methods to the new class automatically.\"\"\"\u001b[m\n         if context_class is not None:\u001b[m\n             cls._context_class = context_class\u001b[m\n\u001b[36m@@ -1740,7 +1738,7 @@\u001b[m \u001b[mclass Model(WithMemoization, metaclass=ContextMeta):\u001b[m\n             done = {}\u001b[m\n             used_ids = {}\u001b[m\n             for i, out in enumerate(rv_inputs.maker.fgraph.outputs):\u001b[m\n\u001b[31m-                print_(f\"{i}: \", end=\"\"),\u001b[m\n\u001b[32m+\u001b[m\u001b[32m                (print_(f\"{i}: \", end=\"\"),)\u001b[m\n                 # Don't print useless deepcopys\u001b[m\n                 if out.owner and isinstance(out.owner.op, DeepCopyOp):\u001b[m\n                     out = out.owner.inputs[0]\u001b[m\n\u001b[1mdiff --git a/pymc/ode/ode.py b/pymc/ode/ode.py\u001b[m\n\u001b[1mindex a5e3741..600f306 100644\u001b[m\n\u001b[1m--- a/pymc/ode/ode.py\u001b[m\n\u001b[1m+++ b/pymc/ode/ode.py\u001b[m\n\u001b[36m@@ -67,6 +67,7 @@\u001b[m \u001b[mclass DifferentialEquation(Op):\u001b[m\n         ode_model = DifferentialEquation(func=odefunc, times=times, n_states=1, n_theta=1, t0=0)\u001b[m\n \u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     _itypes = [\u001b[m\n         TensorType(floatX, (False,)),  # y0 as 1D floatX vector\u001b[m\n         TensorType(floatX, (False,)),  # theta as 1D floatX vector\u001b[m\n\u001b[1mdiff --git a/pymc/printing.py b/pymc/printing.py\u001b[m\n\u001b[1mindex ffc943a..9fe7d05 100644\u001b[m\n\u001b[1m--- a/pymc/printing.py\u001b[m\n\u001b[1m+++ b/pymc/printing.py\u001b[m\n\u001b[36m@@ -123,9 +123,7 @@\u001b[m \u001b[mdef str_for_model(model: Model, formatting: str = \"plain\", include_params: bool\u001b[m\n             \\begin{{array}}{{rcl}}\u001b[m\n             {}\u001b[m\n             \\end{{array}}\u001b[m\n\u001b[31m-            $$\"\"\".format(\u001b[m\n\u001b[31m-            \"\\\\\\\\\".join(var_reprs)\u001b[m\n\u001b[31m-        )\u001b[m\n\u001b[32m+\u001b[m\u001b[32m            $$\"\"\".format(\"\\\\\\\\\".join(var_reprs))\u001b[m\n     else:\u001b[m\n         # align vars on their ~\u001b[m\n         names = [s[: s.index(\"~\") - 1] for s in 
var_reprs]\u001b[m\n\u001b[1mdiff --git a/pymc/step_methods/metropolis.py b/pymc/step_methods/metropolis.py\u001b[m\n\u001b[1mindex 1adb462..e080cdd 100644\u001b[m\n\u001b[1m--- a/pymc/step_methods/metropolis.py\u001b[m\n\u001b[1m+++ b/pymc/step_methods/metropolis.py\u001b[m\n\u001b[36m@@ -134,7 +134,7 @@\u001b[m \u001b[mclass Metropolis(ArrayStepShared):\u001b[m\n         tune_interval=100,\u001b[m\n         model=None,\u001b[m\n         mode=None,\u001b[m\n\u001b[31m-        **kwargs\u001b[m\n\u001b[32m+\u001b[m\u001b[32m        **kwargs,\u001b[m\n     ):\u001b[m\n         \"\"\"Create an instance of a Metropolis stepper\u001b[m\n \u001b[m\n\u001b[36m@@ -771,7 +771,7 @@\u001b[m \u001b[mclass DEMetropolis(PopulationArrayStepShared):\u001b[m\n         tune_interval=100,\u001b[m\n         model=None,\u001b[m\n         mode=None,\u001b[m\n\u001b[31m-        **kwargs\u001b[m\n\u001b[32m+\u001b[m\u001b[32m        **kwargs,\u001b[m\n     ):\u001b[m\n         model = pm.modelcontext(model)\u001b[m\n         initial_values = model.initial_point()\u001b[m\n\u001b[36m@@ -915,7 +915,7 @@\u001b[m \u001b[mclass DEMetropolisZ(ArrayStepShared):\u001b[m\n         tune_drop_fraction: float = 0.9,\u001b[m\n         model=None,\u001b[m\n         mode=None,\u001b[m\n\u001b[31m-        **kwargs\u001b[m\n\u001b[32m+\u001b[m\u001b[32m        **kwargs,\u001b[m\n     ):\u001b[m\n         model = pm.modelcontext(model)\u001b[m\n         initial_values = model.initial_point()\u001b[m\n\u001b[1mdiff --git a/pymc/tuning/starting.py b/pymc/tuning/starting.py\u001b[m\n\u001b[1mindex 6a4d338..ad5f554 100644\u001b[m\n\u001b[1m--- a/pymc/tuning/starting.py\u001b[m\n\u001b[1m+++ b/pymc/tuning/starting.py\u001b[m\n\u001b[36m@@ -52,7 +52,7 @@\u001b[m \u001b[mdef find_MAP(\u001b[m\n     model=None,\u001b[m\n     *args,\u001b[m\n     seed: Optional[int] = None,\u001b[m\n\u001b[31m-    **kwargs\u001b[m\n\u001b[32m+\u001b[m\u001b[32m    **kwargs,\u001b[m\n ):\u001b[m\n     \"\"\"Finds the local maximum a posteriori point given a model.\u001b[m\n \u001b[m\n\u001b[1mdiff --git a/pymc/variational/approximations.py b/pymc/variational/approximations.py\u001b[m\n\u001b[1mindex 00df445..feb0a3a 100644\u001b[m\n\u001b[1m--- a/pymc/variational/approximations.py\u001b[m\n\u001b[1m+++ b/pymc/variational/approximations.py\u001b[m\n\u001b[36m@@ -46,6 +46,7 @@\u001b[m \u001b[mclass MeanFieldGroup(Group):\u001b[m\n     that latent space variables are uncorrelated that is the main drawback\u001b[m\n     of the method\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     __param_spec__ = dict(mu=(\"d\",), rho=(\"d\",))\u001b[m\n     short_name = \"mean_field\"\u001b[m\n     alias_names = frozenset([\"mf\"])\u001b[m\n\u001b[36m@@ -350,27 +351,21 @@\u001b[m \u001b[mclass SingleGroupApproximation(Approximation):\u001b[m\n class MeanField(SingleGroupApproximation):\u001b[m\n     __doc__ = \"\"\"**Single Group Mean Field Approximation**\u001b[m\n \u001b[m\n\u001b[31m-    \"\"\" + str(\u001b[m\n\u001b[31m-        MeanFieldGroup.__doc__\u001b[m\n\u001b[31m-    )\u001b[m\n\u001b[32m+\u001b[m\u001b[32m    \"\"\" + str(MeanFieldGroup.__doc__)\u001b[m\n     _group_class = MeanFieldGroup\u001b[m\n \u001b[m\n \u001b[m\n class FullRank(SingleGroupApproximation):\u001b[m\n     __doc__ = \"\"\"**Single Group Full Rank Approximation**\u001b[m\n \u001b[m\n\u001b[31m-    \"\"\" + str(\u001b[m\n\u001b[31m-        FullRankGroup.__doc__\u001b[m\n\u001b[31m-    )\u001b[m\n\u001b[32m+\u001b[m\u001b[32m    \"\"\" + str(FullRankGroup.__doc__)\u001b[m\n     _group_class 
= FullRankGroup\u001b[m\n \u001b[m\n \u001b[m\n class Empirical(SingleGroupApproximation):\u001b[m\n     __doc__ = \"\"\"**Single Group Full Rank Approximation**\u001b[m\n \u001b[m\n\u001b[31m-    \"\"\" + str(\u001b[m\n\u001b[31m-        EmpiricalGroup.__doc__\u001b[m\n\u001b[31m-    )\u001b[m\n\u001b[32m+\u001b[m\u001b[32m    \"\"\" + str(EmpiricalGroup.__doc__)\u001b[m\n     _group_class = EmpiricalGroup\u001b[m\n \u001b[m\n     def __init__(self, trace=None, size=None, **kwargs):\u001b[m\n\u001b[1mdiff --git a/pymc/variational/operators.py b/pymc/variational/operators.py\u001b[m\n\u001b[1mindex 1122a70..f6ef095 100644\u001b[m\n\u001b[1m--- a/pymc/variational/operators.py\u001b[m\n\u001b[1m+++ b/pymc/variational/operators.py\u001b[m\n\u001b[36m@@ -130,6 +130,7 @@\u001b[m \u001b[mclass KSD(Operator):\u001b[m\n         Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm\u001b[m\n         arXiv:1608.04471\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     has_test_function = True\u001b[m\n     returns_loss = False\u001b[m\n     require_logq = False\u001b[m\n\u001b[1mdiff --git a/pymc/variational/opvi.py b/pymc/variational/opvi.py\u001b[m\n\u001b[1mindex cf98c98..bd1874f 100644\u001b[m\n\u001b[1m--- a/pymc/variational/opvi.py\u001b[m\n\u001b[1m+++ b/pymc/variational/opvi.py\u001b[m\n\u001b[36m@@ -663,6 +663,7 @@\u001b[m \u001b[mclass Group(WithMemoization):\u001b[m\n     -   Kingma, D. P., & Welling, M. (2014).\u001b[m\n         `Auto-Encoding Variational Bayes. stat, 1050, 1. <https://arxiv.org/abs/1312.6114>`_\u001b[m\n     \"\"\"\u001b[m\n\u001b[32m+\u001b[m\n     # needs to be defined in init\u001b[m\n     shared_params = None\u001b[m\n     symbolic_initial = None\u001b[m\n\u001b[36m@@ -709,8 +710,9 @@\u001b[m \u001b[mclass Group(WithMemoization):\u001b[m\n     def group_for_short_name(cls, name):\u001b[m\n         if name.lower() not in cls.__name_registry:\u001b[m\n             raise KeyError(\u001b[m\n\u001b[31m-                \"No such group: {!r}, \"\u001b[m\n\u001b[31m-                \"only the following are supported\\n\\n{}\".format(name, cls.__name_registry)\u001b[m\n\u001b[32m+\u001b[m\u001b[32m                \"No such group: {!r}, \" \"only the following are supported\\n\\n{}\".format(\u001b[m\n\u001b[32m+\u001b[m\u001b[32m                    name, cls.__name_registry\u001b[m\n\u001b[32m+\u001b[m\u001b[32m                )\u001b[m\n             )\u001b[m\n         return cls.__name_registry[name.lower()]\u001b[m\n \u001b[m\n\u001b[1mdiff --git a/versioneer.py b/versioneer.py\u001b[m\n\u001b[1mindex a560e68..c2b9d28 100644\u001b[m\n\u001b[1m--- a/versioneer.py\u001b[m\n\u001b[1m+++ b/versioneer.py\u001b[m\n\u001b[36m@@ -432,9 +432,7 @@\u001b[m \u001b[mdef run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=\u001b[m\n     return stdout, process.returncode\u001b[m\n \u001b[m\n \u001b[m\n\u001b[31m-LONG_VERSION_PY[\u001b[m\n\u001b[31m-    \"git\"\u001b[m\n\u001b[31m-] = r'''\u001b[m\n\u001b[32m+\u001b[m\u001b[32mLONG_VERSION_PY[\"git\"] = r'''\u001b[m\n # This file helps to compute a version number in source trees obtained from\u001b[m\n # git-archive tarball (such as those provided by githubs download-from-tag\u001b[m\n # feature). 
Distribution tarballs (built by setup.py sdist) and build\u001b[m\n##[error]Process completed with exit code 1.\n"}], "diff": "diff --git a/pymc/distributions/discrete.py b/pymc/distributions/discrete.py\nindex f95b4374d..877119350 100644\n--- a/pymc/distributions/discrete.py\n+++ b/pymc/distributions/discrete.py\n@@ -112,6 +112,7 @@ class Binomial(Discrete):\n     logit_p : tensor_like of float\n         Alternative log odds for the probability of success.\n     \"\"\"\n+\n     rv_op = binomial\n \n     @classmethod\n@@ -334,6 +335,7 @@ class Bernoulli(Discrete):\n     logit_p : tensor_like of float\n         Alternative log odds for the probability of success.\n     \"\"\"\n+\n     rv_op = bernoulli\n \n     @classmethod\n@@ -450,6 +452,7 @@ class DiscreteWeibull(Discrete):\n         Shape parameter (beta > 0).\n \n     \"\"\"\n+\n     rv_op = discrete_weibull\n \n     @classmethod\n@@ -539,6 +542,7 @@ class Poisson(Discrete):\n     The Poisson distribution can be derived as a limiting case of the\n     binomial distribution.\n     \"\"\"\n+\n     rv_op = poisson\n \n     @classmethod\n@@ -662,6 +666,7 @@ class NegativeBinomial(Discrete):\n     n : tensor_like of float\n         Alternative number of target success trials (n > 0)\n     \"\"\"\n+\n     rv_op = nbinom\n \n     @classmethod\n@@ -1108,6 +1113,7 @@ class Categorical(Discrete):\n     logit_p : float\n         Alternative log odds for the probability of success.\n     \"\"\"\n+\n     rv_op = categorical\n \n     @classmethod\n@@ -1183,6 +1189,7 @@ class _OrderedLogistic(Categorical):\n     Underlying class for ordered logistic distributions.\n     See docs for the OrderedLogistic wrapper class for more details on how to use it in models.\n     \"\"\"\n+\n     rv_op = categorical\n \n     @classmethod\n@@ -1289,6 +1296,7 @@ class _OrderedProbit(Categorical):\n     Underlying class for ordered probit distributions.\n     See docs for the OrderedProbit wrapper class for more details on how to use it in models.\n     \"\"\"\n+\n     rv_op = categorical\n \n     @classmethod\ndiff --git a/pymc/distributions/multivariate.py b/pymc/distributions/multivariate.py\nindex 1e5a9567a..570c13988 100644\n--- a/pymc/distributions/multivariate.py\n+++ b/pymc/distributions/multivariate.py\n@@ -235,6 +235,7 @@ class MvNormal(Continuous):\n         vals_raw = pm.Normal('vals_raw', mu=0, sigma=1, shape=(5, 3))\n         vals = pm.Deterministic('vals', pt.dot(chol, vals_raw.T).T)\n     \"\"\"\n+\n     rv_op = multivariate_normal\n \n     @classmethod\n@@ -355,6 +356,7 @@ class MvStudentT(Continuous):\n     lower : bool, default=True\n         Whether the cholesky fatcor is given as a lower triangular matrix.\n     \"\"\"\n+\n     rv_op = mv_studentt\n \n     @classmethod\n@@ -436,6 +438,7 @@ class Dirichlet(SimplexContinuous):\n         Concentration parameters (a > 0). The number of categories is given by the\n         length of the last axis.\n     \"\"\"\n+\n     rv_op = dirichlet\n \n     @classmethod\n@@ -515,6 +518,7 @@ class Multinomial(Discrete):\n         categories is given by the length of the last axis. Elements are expected to sum\n         to 1 along the last axis.\n     \"\"\"\n+\n     rv_op = multinomial\n \n     @classmethod\n@@ -662,6 +666,7 @@ class DirichletMultinomial(Discrete):\n         Dirichlet concentration parameters (a > 0). 
The number of categories is given by\n         the length of the last axis.\n     \"\"\"\n+\n     rv_op = dirichlet_multinomial\n \n     @classmethod\n@@ -716,6 +721,7 @@ class _OrderedMultinomial(Multinomial):\n     Underlying class for ordered multinomial distributions.\n     See docs for the OrderedMultinomial wrapper class for more details on how to use it in models.\n     \"\"\"\n+\n     rv_op = multinomial\n \n     @classmethod\n@@ -940,6 +946,7 @@ class Wishart(Continuous):\n     This distribution is unusable in a PyMC model. You should instead\n     use LKJCholeskyCov or LKJCorr.\n     \"\"\"\n+\n     rv_op = wishart\n \n     @classmethod\n@@ -1763,6 +1770,7 @@ class MatrixNormal(Continuous):\n             vals = pm.MatrixNormal('vals', mu=mu, colchol=colchol, rowcov=rowcov,\n                                    observed=data)\n     \"\"\"\n+\n     rv_op = matrixnormal\n \n     @classmethod\n@@ -1977,6 +1985,7 @@ class KroneckerNormal(Continuous):\n     ----------\n     .. [1] Saatchi, Y. (2011). \"Scalable inference for structured Gaussian process models\"\n     \"\"\"\n+\n     rv_op = kroneckernormal\n \n     @classmethod\n@@ -2183,6 +2192,7 @@ class CAR(Continuous):\n         \"Generalized Hierarchical Multivariate CAR Models for Areal Data\"\n         Biometrics, Vol. 61, No. 4 (Dec., 2005), pp. 950-961\n     \"\"\"\n+\n     rv_op = car\n \n     @classmethod\n@@ -2400,9 +2410,7 @@ class ICAR(Continuous):\n         return pt.zeros(N)\n \n     def logp(value, W, node1, node2, N, sigma, zero_sum_stdev):\n-        pairwise_difference = (-1 / (2 * sigma**2)) * pt.sum(\n-            pt.square(value[node1] - value[node2])\n-        )\n+        pairwise_difference = (-1 / (2 * sigma**2)) * pt.sum(pt.square(value[node1] - value[node2]))\n         zero_sum = (\n             -0.5 * pt.pow(pt.sum(value) / (zero_sum_stdev * N), 2)\n             - pt.log(pt.sqrt(2.0 * np.pi))\n@@ -2498,6 +2506,7 @@ class StickBreakingWeights(SimplexContinuous):\n     .. [2] M\u00fcller, P., Quintana, F. A., Jara, A., & Hanson, T. (2015). Bayesian nonparametric data\n            analysis. 
New York: Springer.\n     \"\"\"\n+\n     rv_op = stickbreakingweights\n \n     @classmethod\n@@ -2641,6 +2650,7 @@ class ZeroSumNormal(Distribution):\n             # the zero sum axes will be the last two\n             v = pm.ZeroSumNormal(\"v\", shape=(3, 4, 5), n_zerosum_axes=2)\n     \"\"\"\n+\n     rv_type = ZeroSumNormalRV\n \n     def __new__(\ndiff --git a/pymc/model/core.py b/pymc/model/core.py\nindex c45f3f550..6ee6d491a 100644\n--- a/pymc/model/core.py\n+++ b/pymc/model/core.py\n@@ -138,9 +138,7 @@ class ContextMeta(type):\n \n     # FIXME: is there a more elegant way to automatically add methods to the class that\n     # are instance methods instead of class methods?\n-    def __init__(\n-        cls, name, bases, nmspc, context_class: Optional[Type] = None, **kwargs\n-    ):  # pylint: disable=unused-argument\n+    def __init__(cls, name, bases, nmspc, context_class: Optional[Type] = None, **kwargs):  # pylint: disable=unused-argument\n         \"\"\"Add ``__enter__`` and ``__exit__`` methods to the new class automatically.\"\"\"\n         if context_class is not None:\n             cls._context_class = context_class\n@@ -1740,7 +1738,7 @@ class Model(WithMemoization, metaclass=ContextMeta):\n             done = {}\n             used_ids = {}\n             for i, out in enumerate(rv_inputs.maker.fgraph.outputs):\n-                print_(f\"{i}: \", end=\"\"),\n+                (print_(f\"{i}: \", end=\"\"),)\n                 # Don't print useless deepcopys\n                 if out.owner and isinstance(out.owner.op, DeepCopyOp):\n                     out = out.owner.inputs[0]\ndiff --git a/pymc/ode/ode.py b/pymc/ode/ode.py\nindex a5e374130..600f30632 100644\n--- a/pymc/ode/ode.py\n+++ b/pymc/ode/ode.py\n@@ -67,6 +67,7 @@ class DifferentialEquation(Op):\n         ode_model = DifferentialEquation(func=odefunc, times=times, n_states=1, n_theta=1, t0=0)\n \n     \"\"\"\n+\n     _itypes = [\n         TensorType(floatX, (False,)),  # y0 as 1D floatX vector\n         TensorType(floatX, (False,)),  # theta as 1D floatX vector\ndiff --git a/pymc/printing.py b/pymc/printing.py\nindex ffc943aa1..9fe7d056c 100644\n--- a/pymc/printing.py\n+++ b/pymc/printing.py\n@@ -123,9 +123,7 @@ def str_for_model(model: Model, formatting: str = \"plain\", include_params: bool\n             \\begin{{array}}{{rcl}}\n             {}\n             \\end{{array}}\n-            $$\"\"\".format(\n-            \"\\\\\\\\\".join(var_reprs)\n-        )\n+            $$\"\"\".format(\"\\\\\\\\\".join(var_reprs))\n     else:\n         # align vars on their ~\n         names = [s[: s.index(\"~\") - 1] for s in var_reprs]\ndiff --git a/pymc/step_methods/metropolis.py b/pymc/step_methods/metropolis.py\nindex 1adb462d9..e080cdd09 100644\n--- a/pymc/step_methods/metropolis.py\n+++ b/pymc/step_methods/metropolis.py\n@@ -134,7 +134,7 @@ class Metropolis(ArrayStepShared):\n         tune_interval=100,\n         model=None,\n         mode=None,\n-        **kwargs\n+        **kwargs,\n     ):\n         \"\"\"Create an instance of a Metropolis stepper\n \n@@ -771,7 +771,7 @@ class DEMetropolis(PopulationArrayStepShared):\n         tune_interval=100,\n         model=None,\n         mode=None,\n-        **kwargs\n+        **kwargs,\n     ):\n         model = pm.modelcontext(model)\n         initial_values = model.initial_point()\n@@ -915,7 +915,7 @@ class DEMetropolisZ(ArrayStepShared):\n         tune_drop_fraction: float = 0.9,\n         model=None,\n         mode=None,\n-        **kwargs\n+        **kwargs,\n     ):\n         
model = pm.modelcontext(model)\n         initial_values = model.initial_point()\ndiff --git a/pymc/tuning/starting.py b/pymc/tuning/starting.py\nindex 6a4d33894..ad5f554ae 100644\n--- a/pymc/tuning/starting.py\n+++ b/pymc/tuning/starting.py\n@@ -52,7 +52,7 @@ def find_MAP(\n     model=None,\n     *args,\n     seed: Optional[int] = None,\n-    **kwargs\n+    **kwargs,\n ):\n     \"\"\"Finds the local maximum a posteriori point given a model.\n \ndiff --git a/pymc/variational/approximations.py b/pymc/variational/approximations.py\nindex 00df44599..feb0a3a92 100644\n--- a/pymc/variational/approximations.py\n+++ b/pymc/variational/approximations.py\n@@ -46,6 +46,7 @@ class MeanFieldGroup(Group):\n     that latent space variables are uncorrelated that is the main drawback\n     of the method\n     \"\"\"\n+\n     __param_spec__ = dict(mu=(\"d\",), rho=(\"d\",))\n     short_name = \"mean_field\"\n     alias_names = frozenset([\"mf\"])\n@@ -350,27 +351,21 @@ class SingleGroupApproximation(Approximation):\n class MeanField(SingleGroupApproximation):\n     __doc__ = \"\"\"**Single Group Mean Field Approximation**\n \n-    \"\"\" + str(\n-        MeanFieldGroup.__doc__\n-    )\n+    \"\"\" + str(MeanFieldGroup.__doc__)\n     _group_class = MeanFieldGroup\n \n \n class FullRank(SingleGroupApproximation):\n     __doc__ = \"\"\"**Single Group Full Rank Approximation**\n \n-    \"\"\" + str(\n-        FullRankGroup.__doc__\n-    )\n+    \"\"\" + str(FullRankGroup.__doc__)\n     _group_class = FullRankGroup\n \n \n class Empirical(SingleGroupApproximation):\n     __doc__ = \"\"\"**Single Group Full Rank Approximation**\n \n-    \"\"\" + str(\n-        EmpiricalGroup.__doc__\n-    )\n+    \"\"\" + str(EmpiricalGroup.__doc__)\n     _group_class = EmpiricalGroup\n \n     def __init__(self, trace=None, size=None, **kwargs):\ndiff --git a/pymc/variational/operators.py b/pymc/variational/operators.py\nindex 1122a704b..f6ef09572 100644\n--- a/pymc/variational/operators.py\n+++ b/pymc/variational/operators.py\n@@ -130,6 +130,7 @@ class KSD(Operator):\n         Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm\n         arXiv:1608.04471\n     \"\"\"\n+\n     has_test_function = True\n     returns_loss = False\n     require_logq = False\ndiff --git a/pymc/variational/opvi.py b/pymc/variational/opvi.py\nindex cf98c985a..bd1874ffe 100644\n--- a/pymc/variational/opvi.py\n+++ b/pymc/variational/opvi.py\n@@ -663,6 +663,7 @@ class Group(WithMemoization):\n     -   Kingma, D. P., & Welling, M. (2014).\n         `Auto-Encoding Variational Bayes. stat, 1050, 1. 
<https://arxiv.org/abs/1312.6114>`_\n     \"\"\"\n+\n     # needs to be defined in init\n     shared_params = None\n     symbolic_initial = None\n@@ -709,8 +710,9 @@ class Group(WithMemoization):\n     def group_for_short_name(cls, name):\n         if name.lower() not in cls.__name_registry:\n             raise KeyError(\n-                \"No such group: {!r}, \"\n-                \"only the following are supported\\n\\n{}\".format(name, cls.__name_registry)\n+                \"No such group: {!r}, \" \"only the following are supported\\n\\n{}\".format(\n+                    name, cls.__name_registry\n+                )\n             )\n         return cls.__name_registry[name.lower()]\n \ndiff --git a/versioneer.py b/versioneer.py\nindex a560e685f..c2b9d28bc 100644\n--- a/versioneer.py\n+++ b/versioneer.py\n@@ -432,9 +432,7 @@ def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False, env=\n     return stdout, process.returncode\n \n \n-LONG_VERSION_PY[\n-    \"git\"\n-] = r'''\n+LONG_VERSION_PY[\"git\"] = r'''\n # This file helps to compute a version number in source trees obtained from\n # git-archive tarball (such as those provided by githubs download-from-tag\n # feature). Distribution tarballs (built by setup.py sdist) and build\n", "difficulty": 3, "changed_files": ["pymc/distributions/discrete.py", "pymc/distributions/multivariate.py", "pymc/model/core.py", "pymc/ode/ode.py", "pymc/printing.py", "pymc/step_methods/metropolis.py", "pymc/tuning/starting.py", "pymc/variational/approximations.py", "pymc/variational/operators.py", "pymc/variational/opvi.py", "versioneer.py"], "commit_link": "https://github.com/pymc-devs/pymc/tree/9981ca154ba03a88deaa96d16b119de6183017e5"}