{"language": "Python", "id": 17, "repo_owner": "scrapy", "repo_name": "scrapy", "head_branch": "master", "workflow_name": "Checks", "workflow_filename": "checks.yml", "workflow_path": ".github/workflows/checks.yml", "contributor": "Chenwei-Niu", "sha_fail": "cc2ad923f94d2f5485c20f594db56e2540024ae0", "sha_success": "39ee8d1ee2decce995c4dafc5da5a6fb55f478c9", "workflow": "name: Checks\non: [push, pull_request]\n\nconcurrency:\n  group: ${{github.workflow}}-${{ github.ref }}\n  cancel-in-progress: true\n\njobs:\n  checks:\n    runs-on: ubuntu-latest\n    strategy:\n      fail-fast: false\n      matrix:\n        include:\n        - python-version: \"3.12\"\n          env:\n            TOXENV: pylint\n        - python-version: 3.8\n          env:\n            TOXENV: typing\n        - python-version: \"3.11\"  # Keep in sync with .readthedocs.yml\n          env:\n            TOXENV: docs\n        - python-version: \"3.12\"\n          env:\n            TOXENV: twinecheck\n\n    steps:\n    - uses: actions/checkout@v4\n\n    - name: Set up Python ${{ matrix.python-version }}\n      uses: actions/setup-python@v4\n      with:\n        python-version: ${{ matrix.python-version }}\n\n    - name: Run check\n      env: ${{ matrix.env }}\n      run: |\n        pip install -U tox\n        tox\n\n  pre-commit:\n    runs-on: ubuntu-latest\n    steps:\n    - uses: actions/checkout@v4\n    - uses: pre-commit/[email protected]\n", "logs": [{"step_name": "checks (3.8, typing)/4_Run check.txt", "log": "##[group]Run pip install -U tox\n\u001b[36;1mpip install -U tox\u001b[0m\n\u001b[36;1mtox\u001b[0m\nshell: /usr/bin/bash -e {0}\nenv:\n  pythonLocation: /opt/hostedtoolcache/Python/3.8.18/x64\n  PKG_CONFIG_PATH: /opt/hostedtoolcache/Python/3.8.18/x64/lib/pkgconfig\n  Python_ROOT_DIR: /opt/hostedtoolcache/Python/3.8.18/x64\n  Python2_ROOT_DIR: /opt/hostedtoolcache/Python/3.8.18/x64\n  Python3_ROOT_DIR: /opt/hostedtoolcache/Python/3.8.18/x64\n  LD_LIBRARY_PATH: /opt/hostedtoolcache/Python/3.8.18/x64/lib\n  TOXENV: typing\n##[endgroup]\nCollecting tox\n  Obtaining dependency information for tox from https://files.pythonhosted.org/packages/f5/f9/963052e8b825645c54262dce7b7c88691505e3b9ee10a3e3667711eaaf21/tox-4.11.3-py3-none-any.whl.metadata\n  Downloading tox-4.11.3-py3-none-any.whl.metadata (5.0 kB)\nCollecting cachetools>=5.3.1 (from tox)\n  Obtaining dependency information for cachetools>=5.3.1 from https://files.pythonhosted.org/packages/a9/c9/c8a7710f2cedcb1db9224fdd4d8307c9e48cbddc46c18b515fefc0f1abbe/cachetools-5.3.1-py3-none-any.whl.metadata\n  Downloading cachetools-5.3.1-py3-none-any.whl.metadata (5.2 kB)\nCollecting chardet>=5.2 (from tox)\n  Obtaining dependency information for chardet>=5.2 from https://files.pythonhosted.org/packages/38/6f/f5fbc992a329ee4e0f288c1fe0e2ad9485ed064cac731ed2fe47dcc38cbf/chardet-5.2.0-py3-none-any.whl.metadata\n  Downloading chardet-5.2.0-py3-none-any.whl.metadata (3.4 kB)\nCollecting colorama>=0.4.6 (from tox)\n  Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)\nCollecting filelock>=3.12.3 (from tox)\n  Obtaining dependency information for filelock>=3.12.3 from https://files.pythonhosted.org/packages/5e/5d/97afbafd9d584ff1b45fcb354a479a3609bd97f912f8f1f6c563cb1fae21/filelock-3.12.4-py3-none-any.whl.metadata\n  Downloading filelock-3.12.4-py3-none-any.whl.metadata (2.8 kB)\nCollecting packaging>=23.1 (from tox)\n  Obtaining dependency information for packaging>=23.1 from 
https://files.pythonhosted.org/packages/ec/1a/610693ac4ee14fcdf2d9bf3c493370e4f2ef7ae2e19217d7a237ff42367d/packaging-23.2-py3-none-any.whl.metadata\n  Downloading packaging-23.2-py3-none-any.whl.metadata (3.2 kB)\nCollecting platformdirs>=3.10 (from tox)\n  Obtaining dependency information for platformdirs>=3.10 from https://files.pythonhosted.org/packages/56/29/3ec311dc18804409ecf0d2b09caa976f3ae6215559306b5b530004e11156/platformdirs-3.11.0-py3-none-any.whl.metadata\n  Downloading platformdirs-3.11.0-py3-none-any.whl.metadata (11 kB)\nCollecting pluggy>=1.3 (from tox)\n  Obtaining dependency information for pluggy>=1.3 from https://files.pythonhosted.org/packages/05/b8/42ed91898d4784546c5f06c60506400548db3f7a4b3fb441cba4e5c17952/pluggy-1.3.0-py3-none-any.whl.metadata\n  Downloading pluggy-1.3.0-py3-none-any.whl.metadata (4.3 kB)\nCollecting pyproject-api>=1.6.1 (from tox)\n  Obtaining dependency information for pyproject-api>=1.6.1 from https://files.pythonhosted.org/packages/cf/b4/39eea50542e50e93876ebc09c4349a9c9eee9f6b9c9d30f88c7dc5433db8/pyproject_api-1.6.1-py3-none-any.whl.metadata\n  Downloading pyproject_api-1.6.1-py3-none-any.whl.metadata (2.8 kB)\nCollecting tomli>=2.0.1 (from tox)\n  Downloading tomli-2.0.1-py3-none-any.whl (12 kB)\nCollecting virtualenv>=20.24.3 (from tox)\n  Obtaining dependency information for virtualenv>=20.24.3 from https://files.pythonhosted.org/packages/4e/8b/f0d3a468c0186c603217a6656ea4f49259630e8ed99558501d92f6ff7dc3/virtualenv-20.24.5-py3-none-any.whl.metadata\n  Downloading virtualenv-20.24.5-py3-none-any.whl.metadata (4.5 kB)\nCollecting distlib<1,>=0.3.7 (from virtualenv>=20.24.3->tox)\n  Obtaining dependency information for distlib<1,>=0.3.7 from https://files.pythonhosted.org/packages/43/a0/9ba967fdbd55293bacfc1507f58e316f740a3b231fc00e3d86dc39bc185a/distlib-0.3.7-py2.py3-none-any.whl.metadata\n  Downloading distlib-0.3.7-py2.py3-none-any.whl.metadata (5.1 kB)\nDownloading tox-4.11.3-py3-none-any.whl (153 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 153.8/153.8 kB 19.6 MB/s eta 0:00:00\nDownloading cachetools-5.3.1-py3-none-any.whl (9.3 kB)\nDownloading chardet-5.2.0-py3-none-any.whl (199 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 199.4/199.4 kB 54.3 MB/s eta 0:00:00\nDownloading filelock-3.12.4-py3-none-any.whl (11 kB)\nDownloading packaging-23.2-py3-none-any.whl (53 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 53.0/53.0 kB 18.9 MB/s eta 0:00:00\nDownloading platformdirs-3.11.0-py3-none-any.whl (17 kB)\nDownloading pluggy-1.3.0-py3-none-any.whl (18 kB)\nDownloading pyproject_api-1.6.1-py3-none-any.whl (12 kB)\nDownloading virtualenv-20.24.5-py3-none-any.whl (3.7 MB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 3.7/3.7 MB 101.6 MB/s eta 
0:00:00\nDownloading distlib-0.3.7-py2.py3-none-any.whl (468 kB)\n   \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 468.9/468.9 kB 83.7 MB/s eta 0:00:00\nInstalling collected packages: distlib, tomli, pluggy, platformdirs, packaging, filelock, colorama, chardet, cachetools, virtualenv, pyproject-api, tox\nSuccessfully installed cachetools-5.3.1 chardet-5.2.0 colorama-0.4.6 distlib-0.3.7 filelock-3.12.4 packaging-23.2 platformdirs-3.11.0 pluggy-1.3.0 pyproject-api-1.6.1 tomli-2.0.1 tox-4.11.3 virtualenv-20.24.5\n\n[notice] A new release of pip is available: 23.0.1 -> 23.3\n[notice] To update, run: pip install --upgrade pip\ntyping: install_deps> python -I -m pip install -ctests/upper-constraints.txt mypy==1.5.1 types-attrs==19.1.0 types-lxml==2023.3.28 types-Pillow==10.0.0.3 types-Pygments==2.16.0.0 types-pyOpenSSL==23.2.0.2 types-setuptools==68.2.0.0 typing-extensions==4.7.1 'w3lib>=2.1.2'\n.pkg: install_requires> python -I -m pip install 'setuptools>=40.8.0' wheel\n.pkg: _optional_hooks> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\n.pkg: get_requires_for_build_sdist> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\n.pkg: get_requires_for_build_wheel> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\n.pkg: install_requires_for_build_wheel> python -I -m pip install wheel\n.pkg: freeze> python -m pip freeze --all\n.pkg: pip==23.2.1,setuptools==68.2.0,wheel==0.41.2\n.pkg: prepare_metadata_for_build_wheel> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\n.pkg: build_sdist> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\ntyping: install_package_deps> python -I -m pip install -ctests/upper-constraints.txt 'PyDispatcher>=2.0.5; platform_python_implementation == \"CPython\"' 'PyPyDispatcher>=2.1.0; platform_python_implementation == \"PyPy\"' 'Twisted>=18.9.0' 'cryptography>=36.0.0' 'cssselect>=0.9.1' 'itemadapter>=0.1.0' 'itemloaders>=1.0.1' 'lxml>=4.4.1' packaging 'parsel>=1.5.0' 'protego>=0.1.15' 'pyOpenSSL>=21.0.0' 'queuelib>=1.4.2' 'service-identity>=18.1.0' setuptools tldextract 'w3lib>=1.17.0' 'zope.interface>=5.1.0'\ntyping: install_package> python -I -m pip install -ctests/upper-constraints.txt --force-reinstall --no-deps /home/runner/work/scrapy/scrapy/.tox/.tmp/package/1/Scrapy-2.11.0.tar.gz\ntyping: freeze> python -m pip freeze --all\ntyping: attrs==23.1.0,Automat==22.10.0,certifi==2023.7.22,cffi==1.16.0,charset-normalizer==3.3.0,constantly==15.1.0,cryptography==41.0.4,cssselect==1.2.0,filelock==3.12.4,hyperlink==21.0.0,idna==3.4,incremental==22.10.0,itemadapter==0.8.0,itemloaders==1.1.0,jmespath==1.0.1,lxml==4.9.3,mypy==1.5.1,mypy-extensions==1.0.0,packaging==23.2,parsel==1.8.1,pip==23.3,Protego==0.3.0,pyasn1==0.5.0,pyasn1-modules==0.3.0,pycparser==2.21,PyDispatcher==2.0.7,pyOpenSSL==23.2.0,queuelib==1.6.2,requests==2.31.0,requests-file==1.5.1,Scrapy @ 
file:///home/runner/work/scrapy/scrapy/.tox/.tmp/package/1/Scrapy-2.11.0.tar.gz#sha256=4d445cc983c6a263ec00575a9fe7de7dc03f7e00eedb8629eb85077c420d42e7,service-identity==23.1.0,setuptools==68.2.2,six==1.16.0,tldextract==5.0.1,tomli==2.0.1,Twisted==23.8.0,types-attrs==19.1.0,types-beautifulsoup4==4.12.0.6,types-docutils==0.20.0.3,types-html5lib==1.1.11.15,types-lxml==2023.3.28,types-Pillow==10.0.0.3,types-Pygments==2.16.0.0,types-pyOpenSSL==23.2.0.2,types-setuptools==68.2.0.0,typing_extensions==4.7.1,urllib3==2.0.7,w3lib==2.1.2,wheel==0.41.2,zope.interface==6.1\ntyping: commands[0]> mypy scrapy tests\nscrapy/utils/log.py:244: error: Incompatible return value type (got \"Tuple[Any, Optional[Any], Any]\", expected \"Tuple[int, str, Dict[Any, Any]]\")  [return-value]\nFound 1 error in 1 file (checked 338 source files)\ntyping: exit 1 (10.57 seconds) /home/runner/work/scrapy/scrapy> mypy scrapy tests pid=1884\n.pkg: _exit> python /opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__\n  typing: FAIL code 1 (31.11=setup[20.53]+cmd[10.57] seconds)\n  evaluation failed :( (31.27 seconds)\n##[error]Process completed with exit code 1.\n"}, {"step_name": "pre-commit/3_Run [email protected]", "log": "##[group]Run pre-commit/[email protected]\nwith:\n  extra_args: --all-files\n##[endgroup]\n##[group]Run python -m pip install pre-commit\n\u001b[36;1mpython -m pip install pre-commit\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\n##[endgroup]\nDefaulting to user installation because normal site-packages is not writeable\nCollecting pre-commit\n  Downloading pre_commit-3.5.0-py2.py3-none-any.whl (203 kB)\n     \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 203.7/203.7 KB 3.7 MB/s eta 0:00:00\nCollecting identify>=1.0.0\n  Downloading identify-2.5.30-py2.py3-none-any.whl (98 kB)\n     \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 98.9/98.9 KB 20.1 MB/s eta 0:00:00\nRequirement already satisfied: pyyaml>=5.1 in /usr/lib/python3/dist-packages (from pre-commit) (5.4.1)\nCollecting cfgv>=2.0.0\n  Downloading cfgv-3.4.0-py2.py3-none-any.whl (7.2 kB)\nCollecting virtualenv>=20.10.0\n  Downloading virtualenv-20.24.5-py3-none-any.whl (3.7 MB)\n     \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 3.7/3.7 MB 69.0 MB/s eta 0:00:00\nCollecting nodeenv>=0.11.1\n  Downloading nodeenv-1.8.0-py2.py3-none-any.whl (22 kB)\nRequirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from nodeenv>=0.11.1->pre-commit) (59.6.0)\nCollecting filelock<4,>=3.12.2\n  Downloading filelock-3.12.4-py3-none-any.whl (11 kB)\nCollecting distlib<1,>=0.3.7\n  Downloading distlib-0.3.7-py2.py3-none-any.whl (468 kB)\n     \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 468.9/468.9 KB 
57.0 MB/s eta 0:00:00\nCollecting platformdirs<4,>=3.9.1\n  Downloading platformdirs-3.11.0-py3-none-any.whl (17 kB)\nInstalling collected packages: distlib, platformdirs, nodeenv, identify, filelock, cfgv, virtualenv, pre-commit\nSuccessfully installed cfgv-3.4.0 distlib-0.3.7 filelock-3.12.4 identify-2.5.30 nodeenv-1.8.0 platformdirs-3.11.0 pre-commit-3.5.0 virtualenv-20.24.5\n##[group]Run python -m pip freeze --local\n\u001b[36;1mpython -m pip freeze --local\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\n##[endgroup]\nargcomplete==3.1.2\ncfgv==3.4.0\ndistlib==0.3.7\nfilelock==3.12.4\nidentify==2.5.30\nnodeenv==1.8.0\npackaging==23.2\npipx==1.2.0\nplatformdirs==3.11.0\npre-commit==3.5.0\nuserpath==1.9.1\nvirtualenv==20.24.5\n##[group]Run actions/cache@v3\nwith:\n  path: ~/.cache/pre-commit\n  key: pre-commit-3||7a8fe885594aed9a90fd5938b4bb49b65732538a44c08aad3d6ea69d9d0cf64c\n  enableCrossOsArchive: false\n  fail-on-cache-miss: false\n  lookup-only: false\n##[endgroup]\nCache Size: ~33 MB (34938818 B)\n[command]/usr/bin/tar -xf /home/runner/work/_temp/2e73504e-c13e-4da4-97ff-45c557d92a27/cache.tzst -P -C /home/runner/work/scrapy/scrapy --use-compress-program unzstd\nReceived 34938818 of 34938818 (100.0%), 33.3 MBs/sec\nCache restored successfully\nCache restored from key: pre-commit-3||7a8fe885594aed9a90fd5938b4bb49b65732538a44c08aad3d6ea69d9d0cf64c\n##[group]Run pre-commit run --show-diff-on-failure --color=always --all-files\n\u001b[36;1mpre-commit run --show-diff-on-failure --color=always --all-files\u001b[0m\nshell: /usr/bin/bash --noprofile --norc -e -o pipefail {0}\n##[endgroup]\nbandit...................................................................\u001b[42mPassed\u001b[m\nflake8...................................................................\u001b[41mFailed\u001b[m\n\u001b[2m- hook id: flake8\u001b[m\n\u001b[2m- exit code: 1\u001b[m\n\n\u001b[1mscrapy/utils/response.py\u001b[m\u001b[36m:\u001b[m17\u001b[36m:\u001b[m1\u001b[36m:\u001b[m \u001b[1m\u001b[31mF401\u001b[m 'scrapy.utils.decorators.deprecated' imported but unused\n\u001b[1mtests/test_downloadermiddleware_stats.py\u001b[m\u001b[36m:\u001b[m12\u001b[36m:\u001b[m1\u001b[36m:\u001b[m \u001b[1m\u001b[31mE302\u001b[m expected 2 blank lines, found 1\n\u001b[1mtests/test_utils_response.py\u001b[m\u001b[36m:\u001b[m2\u001b[36m:\u001b[m1\u001b[36m:\u001b[m \u001b[1m\u001b[31mF401\u001b[m 'warnings' imported but unused\n\u001b[1mtests/test_utils_response.py\u001b[m\u001b[36m:\u001b[m6\u001b[36m:\u001b[m1\u001b[36m:\u001b[m \u001b[1m\u001b[31mF401\u001b[m 'scrapy.exceptions.ScrapyDeprecationWarning' imported but unused\n\nblack....................................................................\u001b[41mFailed\u001b[m\n\u001b[2m- hook id: black\u001b[m\n\u001b[2m- files were modified by this hook\u001b[m\n\n\u001b[1mreformatted tests/test_downloadermiddleware_stats.py\u001b[0m\n\n\u001b[1mAll done! 
\u2728 \ud83c\udf70 \u2728\u001b[0m\n\u001b[34m\u001b[1m1 file \u001b[0m\u001b[1mreformatted\u001b[0m, \u001b[34m345 files \u001b[0mleft unchanged.\n\nisort....................................................................\u001b[41mFailed\u001b[m\n\u001b[2m- hook id: isort\u001b[m\n\u001b[2m- files were modified by this hook\u001b[m\n\nFixing /home/runner/work/scrapy/scrapy/tests/test_downloadermiddleware_stats.py\n\nblacken-docs.............................................................\u001b[42mPassed\u001b[m\npre-commit hook(s) made changes.\nIf you are seeing this message in CI, reproduce locally with: `pre-commit run --all-files`.\nTo run `pre-commit` as part of git workflow, use `pre-commit install`.\nAll changes made by hooks:\n\u001b[1mdiff --git a/tests/test_downloadermiddleware_stats.py b/tests/test_downloadermiddleware_stats.py\u001b[m\n\u001b[1mindex 766b965..eda5a0a 100644\u001b[m\n\u001b[1m--- a/tests/test_downloadermiddleware_stats.py\u001b[m\n\u001b[1m+++ b/tests/test_downloadermiddleware_stats.py\u001b[m\n\u001b[36m@@ -6,8 +6,9 @@\u001b[m \u001b[mfrom scrapy.downloadermiddlewares.stats import DownloaderStats\u001b[m\n from scrapy.exceptions import ScrapyDeprecationWarning\u001b[m\n from scrapy.http import Request, Response\u001b[m\n from scrapy.spiders import Spider\u001b[m\n\u001b[31m-from scrapy.utils.test import get_crawler\u001b[m\n from scrapy.utils.python import to_bytes\u001b[m\n\u001b[32m+\u001b[m\u001b[32mfrom scrapy.utils.test import get_crawler\u001b[m\n\u001b[32m+\u001b[m\n \u001b[m\n class MyException(Exception):\u001b[m\n     pass\u001b[m\n##[error]Process completed with exit code 1.\n"}], "diff": "diff --git a/docs/topics/downloader-middleware.rst b/docs/topics/downloader-middleware.rst\nindex a8e5b23bf..1abbc4968 100644\n--- a/docs/topics/downloader-middleware.rst\n+++ b/docs/topics/downloader-middleware.rst\n@@ -1039,8 +1039,8 @@ RobotsTxtMiddleware\n \n     * :ref:`Protego <protego-parser>` (default)\n     * :ref:`RobotFileParser <python-robotfileparser>`\n-    * :ref:`Reppy <reppy-parser>`\n     * :ref:`Robotexclusionrulesparser <rerp-parser>`\n+    * :ref:`Reppy <reppy-parser>` (deprecated)\n \n     You can change the robots.txt_ parser with the :setting:`ROBOTSTXT_PARSER`\n     setting. Or you can also :ref:`implement support for a new parser <support-for-new-robots-parser>`.\n@@ -1133,6 +1133,7 @@ In order to use this parser:\n \n     .. 
warning:: `Upstream issue #122\n         <https://github.com/seomoz/reppy/issues/122>`_ prevents reppy usage in Python 3.9+.\n+        Because of this the Reppy parser is deprecated.\n \n * Set :setting:`ROBOTSTXT_PARSER` setting to\n   ``scrapy.robotstxt.ReppyRobotParser``\ndiff --git a/scrapy/robotstxt.py b/scrapy/robotstxt.py\nindex 604b5e314..5c5ac4e41 100644\n--- a/scrapy/robotstxt.py\n+++ b/scrapy/robotstxt.py\n@@ -1,7 +1,9 @@\n import logging\n import sys\n from abc import ABCMeta, abstractmethod\n+from warnings import warn\n \n+from scrapy.exceptions import ScrapyDeprecationWarning\n from scrapy.utils.python import to_unicode\n \n logger = logging.getLogger(__name__)\n@@ -79,6 +81,7 @@ class PythonRobotParser(RobotParser):\n \n class ReppyRobotParser(RobotParser):\n     def __init__(self, robotstxt_body, spider):\n+        warn(\"ReppyRobotParser is deprecated.\", ScrapyDeprecationWarning, stacklevel=2)\n         from reppy.robots import Robots\n \n         self.spider = spider\ndiff --git a/scrapy/utils/log.py b/scrapy/utils/log.py\nindex 276c62a87..fdea46a3d 100644\n--- a/scrapy/utils/log.py\n+++ b/scrapy/utils/log.py\n@@ -235,8 +235,15 @@ def logformatter_adapter(logkws: dict) -> Tuple[int, str, dict]:\n     if not {\"level\", \"msg\", \"args\"} <= set(logkws):\n         warnings.warn(\"Missing keys in LogFormatter method\", ScrapyDeprecationWarning)\n \n+    if \"format\" in logkws:\n+        warnings.warn(\n+            \"`format` key in LogFormatter methods has been \"\n+            \"deprecated, use `msg` instead\",\n+            ScrapyDeprecationWarning,\n+        )\n+\n     level = logkws.get(\"level\", logging.INFO)\n-    message = logkws.get(\"msg\")\n+    message = logkws.get(\"format\", logkws.get(\"msg\"))\n     # NOTE: This also handles 'args' being an empty dict, that case doesn't\n     # play well in logger.log calls\n     args = logkws if not logkws.get(\"args\") else logkws[\"args\"]\ndiff --git a/scrapy/utils/response.py b/scrapy/utils/response.py\nindex be51b0bee..c540d6278 100644\n--- a/scrapy/utils/response.py\n+++ b/scrapy/utils/response.py\n@@ -55,6 +55,25 @@ def response_status_message(status: Union[bytes, float, int, str]) -> str:\n     return f\"{status_int} {to_unicode(message)}\"\n \n \n+@deprecated\n+def response_httprepr(response: Response) -> bytes:\n+    \"\"\"Return raw HTTP representation (as bytes) of the given response. 
This\n+    is provided only for reference, since it's not the exact stream of bytes\n+    that was received (that's not exposed by Twisted).\n+    \"\"\"\n+    values = [\n+        b\"HTTP/1.1 \",\n+        to_bytes(str(response.status)),\n+        b\" \",\n+        to_bytes(http.RESPONSES.get(response.status, b\"\")),\n+        b\"\\r\\n\",\n+    ]\n+    if response.headers:\n+        values.extend([response.headers.to_string(), b\"\\r\\n\"])\n+    values.extend([b\"\\r\\n\", response.body])\n+    return b\"\".join(values)\n+\n+\n def open_in_browser(\n     response: Union[\n         \"scrapy.http.response.html.HtmlResponse\",\ndiff --git a/tests/test_downloadermiddleware_stats.py b/tests/test_downloadermiddleware_stats.py\nindex 766b96521..39dfe9ab5 100644\n--- a/tests/test_downloadermiddleware_stats.py\n+++ b/tests/test_downloadermiddleware_stats.py\n@@ -6,8 +6,9 @@ from scrapy.downloadermiddlewares.stats import DownloaderStats\n from scrapy.exceptions import ScrapyDeprecationWarning\n from scrapy.http import Request, Response\n from scrapy.spiders import Spider\n+from scrapy.utils.response import response_httprepr\n from scrapy.utils.test import get_crawler\n-from scrapy.utils.python import to_bytes\n+\n \n class MyException(Exception):\n     pass\n@@ -55,7 +56,7 @@ class TestDownloaderStats(TestCase):\n             self.mw.process_response(self.req, test_response, self.spider)\n             with warnings.catch_warnings():\n                 warnings.simplefilter(\"ignore\", ScrapyDeprecationWarning)\n-                resp_size = to_bytes(test_response)\n+                resp_size = len(response_httprepr(test_response))\n             self.assertStatsEqual(\"downloader/response_bytes\", resp_size)\n \n     def test_process_exception(self):\ndiff --git a/tests/test_utils_response.py b/tests/test_utils_response.py\nindex d45358d9a..80e15a60f 100644\n--- a/tests/test_utils_response.py\n+++ b/tests/test_utils_response.py\n@@ -10,6 +10,7 @@ from scrapy.utils.response import (\n     get_base_url,\n     get_meta_refresh,\n     open_in_browser,\n+    response_httprepr,\n     response_status_message,\n )\n \n@@ -19,6 +20,35 @@ __doctests__ = [\"scrapy.utils.response\"]\n class ResponseUtilsTest(unittest.TestCase):\n     dummy_response = TextResponse(url=\"http://example.org/\", body=b\"dummy_response\")\n \n+    def test_response_httprepr(self):\n+        with warnings.catch_warnings():\n+            warnings.simplefilter(\"ignore\", ScrapyDeprecationWarning)\n+\n+            r1 = Response(\"http://www.example.com\")\n+            self.assertEqual(response_httprepr(r1), b\"HTTP/1.1 200 OK\\r\\n\\r\\n\")\n+\n+            r1 = Response(\n+                \"http://www.example.com\",\n+                status=404,\n+                headers={\"Content-type\": \"text/html\"},\n+                body=b\"Some body\",\n+            )\n+            self.assertEqual(\n+                response_httprepr(r1),\n+                b\"HTTP/1.1 404 Not Found\\r\\nContent-Type: text/html\\r\\n\\r\\nSome body\",\n+            )\n+\n+            r1 = Response(\n+                \"http://www.example.com\",\n+                status=6666,\n+                headers={\"Content-type\": \"text/html\"},\n+                body=b\"Some body\",\n+            )\n+            self.assertEqual(\n+                response_httprepr(r1),\n+                b\"HTTP/1.1 6666 \\r\\nContent-Type: text/html\\r\\n\\r\\nSome body\",\n+            )\n+\n     def test_open_in_browser(self):\n         url = 
\"http:///www.example.com/some/page.html\"\n         body = b\"<html> <head> <title>test page</title> </head> <body>test body</body> </html>\"\n", "difficulty": "1"}