From cba40bf1a4a8db67e4c96ec82cfc0d6699ace8f4 Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Tue, 1 Apr 2025 16:37:13 -1000 Subject: [PATCH 01/37] Increment version to 3.11.17.dev0 (#10681) --- aiohttp/__init__.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/aiohttp/__init__.py b/aiohttp/__init__.py index 93b06c7367a..8a3d34a4f87 100644 --- a/aiohttp/__init__.py +++ b/aiohttp/__init__.py @@ -1,4 +1,4 @@ -__version__ = "3.11.16" +__version__ = "3.11.17.dev0" from typing import TYPE_CHECKING, Tuple From 8541a7ca342a2316e60e3fe3e12a07c1e73d33d1 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Tue, 1 Apr 2025 17:16:07 -1000 Subject: [PATCH 02/37] Bump multidict from 6.2.0 to 6.3.1 (#10667) Bumps [multidict](https://github.com/aio-libs/multidict) from 6.2.0 to 6.3.0.
Release notes

Sourced from multidict's releases.

6.3.0

Bug fixes

  • Set operations for KeysView and ItemsView of case-insensitive multidicts and their proxies are processed in a case-insensitive manner (see the sketch after this list).

    Related issues and pull requests on GitHub: #965.

  • Rewrote :class:`multidict.CIMultiDict` and its proxy to always return :class:`multidict.istr` keys. istr is derived from :class:`str`, thus the change is backward compatible.

    The performance boost is about 15% for some operations in the C Extension; the pure Python implementation sees a visible (15%-230%) speedup as well.

    Related issues and pull requests on GitHub: #1097.

  • Fixed a crash when extending a multidict from a multidict proxy if C Extensions were used.

    Related issues and pull requests on GitHub: #1100.
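
A quick, illustrative sketch of the first two fixes above (not part of the upstream release notes; assumes multidict >= 6.3 is installed):

```python
from multidict import CIMultiDict, istr

headers = CIMultiDict({"Content-Type": "text/plain"})

# CIMultiDict now always hands back istr keys; istr subclasses str,
# so str-based callers keep working unchanged.
key = next(iter(headers))
assert isinstance(key, istr) and isinstance(key, str)

# Set operations on the key views are case-insensitive too, so this
# intersection is non-empty despite the different casing.
assert headers.keys() & {"content-type"}
```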

Features

  • Implemented a custom parser for the METH_FASTCALL | METH_KEYWORDS protocol -- by :user:`asvetlov`.

    The patch re-enables the fast call protocol in the :py:mod:`multidict` C Extension.

    Speedup is about 25%-30% for the library benchmarks for Python 3.12+.

    Related issues and pull requests on GitHub: #1070.

  • The C-extension no longer pre-allocates a Python exception object in lookup-related methods of :py:class:`~multidict.MultiDict` when the passed-in key is not found but a default value is provided.

    Namely, this affects :py:meth:`MultiDict.getone() <multidict.MultiDict.getone>`, :py:meth:`MultiDict.getall() <multidict.MultiDict.getall>`, :py:meth:`MultiDict.get() <multidict.MultiDict.get>`, :py:meth:`MultiDict.pop() <multidict.MultiDict.pop>`, :py:meth:`MultiDict.popone() <multidict.MultiDict.popone>`, and :py:meth:`MultiDict.popall() <multidict.MultiDict.popall>`.
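
The exception-preallocation change is internal, but the affected default-lookup paths are easy to exercise; a minimal sketch (not from the release notes, observable behavior is unchanged):

```python
from multidict import MultiDict

md = MultiDict([("a", "1")])

# Misses that carry a default no longer build a throwaway KeyError
# inside the C extension before returning the default.
assert md.getone("missing", None) is None
assert md.getall("missing", []) == []
assert md.pop("missing", "fallback") == "fallback"
assert md.get("missing", "fallback") == "fallback"
```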

... (truncated)

Changelog

Sourced from multidict's changelog.

6.3.0

(2025-03-31)

Bug fixes

  • Set operations for KeysView and ItemsView of case-insensitive multidicts and their proxies are processed in a case-insensitive manner.

    Related issues and pull requests on GitHub: :issue:`965`.

  • Rewrote :class:`multidict.CIMultiDict` and its proxy to always return :class:`multidict.istr` keys. istr is derived from :class:`str`, thus the change is backward compatible.

    The performance boost is about 15% for some operations in the C Extension; the pure Python implementation sees a visible (15%-230%) speedup as well.

    Related issues and pull requests on GitHub: :issue:`1097`.

  • Fixed a crash when extending a multidict from a multidict proxy if C Extensions were used.

    Related issues and pull requests on GitHub: :issue:`1100`.

Features

  • Implemented a custom parser for the METH_FASTCALL | METH_KEYWORDS protocol -- by :user:`asvetlov`.

    The patch re-enables the fast call protocol in the :py:mod:`multidict` C Extension.

    Speedup is about 25%-30% for the library benchmarks for Python 3.12+.

    Related issues and pull requests on GitHub: :issue:`1070`.

  • The C-extension no longer pre-allocates a Python exception object in lookup-related methods of :py:class:`~multidict.MultiDict` when the passed-in key is not found but a default value is provided.

    Namely, this affects :py:meth:`MultiDict.getone() <multidict.MultiDict.getone>`, :py:meth:`MultiDict.getall() <multidict.MultiDict.getall>`, :py:meth:`MultiDict.get()

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=multidict&package-manager=pip&previous-version=6.2.0&new-version=6.3.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 6c8b21f5aa8..ffee818cdbb 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -26,7 +26,7 @@ gunicorn==23.0.0 # via -r requirements/base.in idna==3.4 # via yarl -multidict==6.2.0 +multidict==6.3.1 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 142c09092b0..f24d2f8919a 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -111,7 +111,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.2.0 +multidict==6.3.1 # via # -r requirements/multidict.in # -r requirements/runtime-deps.in diff --git a/requirements/cython.txt b/requirements/cython.txt index fc290ab6688..5f2bbcb7c1f 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -6,7 +6,7 @@ # cython==3.0.12 # via -r requirements/cython.in -multidict==6.2.0 +multidict==6.3.1 # via -r requirements/multidict.in typing-extensions==4.12.2 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index b9e52c24751..93d89ba3daf 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -109,7 +109,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.2.0 +multidict==6.3.1 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/multidict.txt b/requirements/multidict.txt index be4d86595fc..4ee354b5aa0 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -4,7 +4,7 @@ # # pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in # -multidict==6.2.0 +multidict==6.3.1 # via -r requirements/multidict.in typing-extensions==4.12.2 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 0575278acab..7504cfc629b 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -24,7 +24,7 @@ frozenlist==1.5.0 # aiosignal idna==3.4 # via yarl -multidict==6.2.0 +multidict==6.3.1 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index b2ea7bfff70..d0a87da4001 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -57,7 +57,7 @@ markdown-it-py==3.0.0 # via rich mdurl==0.1.2 # via markdown-it-py -multidict==6.2.0 +multidict==6.3.1 # via # -r requirements/runtime-deps.in # yarl From a0fcf320fa8335c13f3e7d3db41f7662f362482d Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Wed, 2 Apr 2025 11:02:45 +0000 Subject: [PATCH 03/37] Bump pytest-cov from 6.0.0 to 6.1.0 (#10687) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [pytest-cov](https://github.com/pytest-dev/pytest-cov) from 6.0.0 to 6.1.0.
Changelog

Sourced from pytest-cov's changelog.

6.1.0 (2025-04-01)

  • Change terminal output to use full width lines for the coverage header. Contributed by Tsvika Shapira in [#678](https://github.com/pytest-dev/pytest-cov/pull/678).
  • Removed unnecessary CovFailUnderWarning. Fixes [#675](https://github.com/pytest-dev/pytest-cov/issues/675).
  • Fixed the term report not using the precision specified via --cov-precision.
Commits
  • 10f8cde Bump version: 6.0.0 → 6.1.0
  • 10b14af Update changelog.
  • aa57aed Refactor a bit the internals to be a bit less boilerplatey and have more clar...
  • e760099 Make sure the CLI precision is used when creating report. Fixes #674.
  • 44540e1 Remove unnecessary CovFailUnderWarning. Closes #675.
  • 204af14 Update changelog.
  • 089e7bb Upgrade ruff.
  • ab2cd26 Add py 3.13 to test grid and update some deps.
  • 2de0c6c add reference to code source
  • 362a359 move section between functions
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest-cov&package-manager=pip&previous-version=6.0.0&new-version=6.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/test.txt | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index f24d2f8919a..fd1bb4a78a8 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -178,7 +178,7 @@ pytest-codspeed==3.2.0 # via # -r requirements/lint.in # -r requirements/test.in -pytest-cov==6.0.0 +pytest-cov==6.1.0 # via -r requirements/test.in pytest-mock==3.14.0 # via diff --git a/requirements/dev.txt b/requirements/dev.txt index 93d89ba3daf..e028b4545cc 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -173,7 +173,7 @@ pytest-codspeed==3.2.0 # via # -r requirements/lint.in # -r requirements/test.in -pytest-cov==6.0.0 +pytest-cov==6.1.0 # via -r requirements/test.in pytest-mock==3.14.0 # via diff --git a/requirements/test.txt b/requirements/test.txt index d0a87da4001..21af44f7b75 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -96,7 +96,7 @@ pytest==8.3.5 # pytest-xdist pytest-codspeed==3.2.0 # via -r requirements/test.in -pytest-cov==6.0.0 +pytest-cov==6.1.0 # via -r requirements/test.in pytest-mock==3.14.0 # via -r requirements/test.in From 39b1afa9bed968dad488017ec1f1d1e43d056ffd Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 4 Apr 2025 11:15:23 +0000 Subject: [PATCH 04/37] Bump typing-extensions from 4.12.2 to 4.13.1 (#10693) Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.12.2 to 4.13.1.
Release notes

Sourced from typing-extensions's releases.

4.13.1

This is a bugfix release fixing two edge cases that appear on old bugfix releases of CPython.

Bugfixes:

  • Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.
  • Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.

4.13.0

New features:

  • Add typing_extensions.TypeForm from PEP 747. Patch by Jelle Zijlstra.
  • Add typing_extensions.get_annotations, a backport of inspect.get_annotations that adds features specified by PEP 649. Patches by Jelle Zijlstra and Alex Waygood.
  • Backport evaluate_forward_ref from CPython PR #119891 to evaluate ForwardRefs. Patch by Daraan, backporting a CPython PR by Jelle Zijlstra.
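
As an illustration of the backported helper, a minimal sketch (the keyword arguments follow the draft CPython API as exposed by typing_extensions 4.13 and are an assumption, not taken from these notes):

```python
from typing import ForwardRef, List

from typing_extensions import evaluate_forward_ref

ref = ForwardRef("List[int]")

# Resolve the forward reference against an explicit namespace.
resolved = evaluate_forward_ref(ref, globals={"List": List})
assert resolved == List[int]
```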

Bugfixes and changed features:

  • Update PEP 728 implementation to a newer version of the PEP. Patch by Jelle Zijlstra.
  • Copy the coroutine status of functions and methods wrapped with @typing_extensions.deprecated. Patch by Sebastian Rittau.
  • Fix bug where TypeAliasType instances could be subscripted even where they were not generic. Patch by Daraan.
  • Fix bug where a subscripted TypeAliasType instance did not have all attributes of the original TypeAliasType instance on older Python versions. Patch by Daraan and Alex Waygood.
  • Fix bug where subscripted TypeAliasType instances (and some other subscripted objects) had wrong parameters if they were directly subscripted with an Unpack object. Patch by Daraan.
  • Backport to Python 3.10 the ability to substitute ... in generic Callable aliases that have a Concatenate special form as their argument. Patch by Daraan.
  • Extended the Concatenate backport for Python 3.8-3.10 to now accept Ellipsis as an argument. Patch by Daraan.
  • Fix backport of get_type_hints to reflect Python 3.11+ behavior which does not add Union[..., NoneType] to annotations that have a None default value anymore. This fixes wrapping of Annotated in an unwanted Optional in such cases. Patch by Daraan. (See the sketch below.)
  • Fix error in subscription of Unpack aliases causing nested Unpacks to not be resolved correctly. Patch by Daraan.
  • Backport CPython PR #124795: fix TypeAliasType not raising an error on non-tuple inputs for type_params. Patch by Daraan.
  • Fix that lists and ... could not be used for parameter expressions for TypeAliasType

... (truncated)
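
The get_type_hints change above is visible from user code; a hedged sketch (exact output depends on the installed typing-extensions version):

```python
from typing_extensions import Annotated, get_type_hints


def greet(name: Annotated[str, "display name"] = None):
    ...


hints = get_type_hints(greet, include_extras=True)

# With the fixed backport the hint stays Annotated[str, 'display name'];
# it is no longer wrapped into Optional[...] just because the default is None.
print(hints["name"])
```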

Changelog

Sourced from typing-extensions's changelog.

Release 4.13.1 (April 3, 2025)

Bugfixes:

  • Fix regression in 4.13.0 on Python 3.10.2 causing a TypeError when using Concatenate. Patch by Daraan.
  • Fix TypeError when using evaluate_forward_ref on Python 3.10.1-2 and 3.9.8-10. Patch by Daraan.

Release 4.13.0 (March 25, 2025)

No user-facing changes since 4.13.0rc1.

Release 4.13.0rc1 (March 18, 2025)

New features:

  • Add typing_extensions.TypeForm from PEP 747. Patch by Jelle Zijlstra.
  • Add typing_extensions.get_annotations, a backport of inspect.get_annotations that adds features specified by PEP 649. Patches by Jelle Zijlstra and Alex Waygood.
  • Backport evaluate_forward_ref from CPython PR #119891 to evaluate ForwardRefs. Patch by Daraan, backporting a CPython PR by Jelle Zijlstra.

Bugfixes and changed features:

  • Update PEP 728 implementation to a newer version of the PEP. Patch by Jelle Zijlstra.
  • Copy the coroutine status of functions and methods wrapped with @typing_extensions.deprecated. Patch by Sebastian Rittau.
  • Fix bug where TypeAliasType instances could be subscripted even where they were not generic. Patch by Daraan.
  • Fix bug where a subscripted TypeAliasType instance did not have all attributes of the original TypeAliasType instance on older Python versions. Patch by Daraan and Alex Waygood.
  • Fix bug where subscripted TypeAliasType instances (and some other subscripted objects) had wrong parameters if they were directly subscripted with an Unpack object. Patch by Daraan.
  • Backport to Python 3.10 the ability to substitute ... in generic Callable aliases that have a Concatenate special form as their argument. Patch by Daraan.
  • Extended the Concatenate backport for Python 3.8-3.10 to now accept Ellipsis as an argument. Patch by Daraan.
  • Fix backport of get_type_hints to reflect Python 3.11+ behavior which does not add Union[..., NoneType] to annotations that have a None default value anymore. This fixes wrapping of Annotated in an unwanted Optional in such cases. Patch by Daraan.
  • Fix error in subscription of Unpack aliases causing nested Unpacks to not be resolved correctly. Patch by Daraan.

... (truncated)

Commits
  • 45a8847 Prepare release 4.13.1 (#573)
  • f264e58 Move CI to "ubuntu-latest" (round 2) (#570)
  • 5ce0e69 Fix TypeError with evaluate_forward_ref on some 3.10 and 3.9 versions (#558)
  • 304f5cb Add SQLAlchemy to third-party daily tests (#561)
  • ebe2b94 Fix duplicated keywords for typing._ConcatenateGenericAlias in 3.10.2 (#557)
  • 9f93d6f Add intersphinx links for 3.13 typing features (#550)
  • c893401 Prepare release 4.13.0 (#555)
  • 6239d86 Use latest Python docs as intersphinx base rather than 3.12 docs (#549)
  • 671a337 Fix 'Test and lint' workflow running on forks (#551)
  • e77e8e2 Disable pyanalyze tests for now (#554)
  • Additional commits viewable in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=typing-extensions&package-manager=pip&previous-version=4.12.2&new-version=4.13.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/lint.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 8 files changed, 8 insertions(+), 8 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index ffee818cdbb..efbdf3d5436 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -40,7 +40,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via multidict uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via -r requirements/base.in diff --git a/requirements/constraints.txt b/requirements/constraints.txt index fd1bb4a78a8..70f4833e38f 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -258,7 +258,7 @@ trustme==1.2.1 ; platform_machine != "i686" # via # -r requirements/lint.in # -r requirements/test.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via # multidict # mypy diff --git a/requirements/cython.txt b/requirements/cython.txt index 5f2bbcb7c1f..31272de30f3 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -8,5 +8,5 @@ cython==3.0.12 # via -r requirements/cython.in multidict==6.3.1 # via -r requirements/multidict.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index e028b4545cc..f9a3c08c6bd 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -249,7 +249,7 @@ trustme==1.2.1 ; platform_machine != "i686" # via # -r requirements/lint.in # -r requirements/test.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via # multidict # mypy diff --git a/requirements/lint.txt b/requirements/lint.txt index c400e12cea0..fb4680014e1 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -95,7 +95,7 @@ tomli==2.2.1 # slotscheck trustme==1.2.1 # via -r requirements/lint.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via # mypy # pydantic diff --git a/requirements/multidict.txt b/requirements/multidict.txt index 4ee354b5aa0..405d04c027b 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -6,5 +6,5 @@ # multidict==6.3.1 # via -r requirements/multidict.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 7504cfc629b..9dbcc807684 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -36,7 +36,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via multidict yarl==1.18.3 # via -r requirements/runtime-deps.in diff --git a/requirements/test.txt b/requirements/test.txt index 21af44f7b75..b3a8368d3cf 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -123,7 +123,7 @@ tomli==2.2.1 # pytest trustme==1.2.1 ; platform_machine != "i686" # via -r requirements/test.in -typing-extensions==4.12.2 +typing-extensions==4.13.1 # via # multidict # mypy From afacd1ba8ca4ee97ca12259efdfe7b4bfefc5e6b Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 4 Apr 2025 11:49:42 +0000 Subject: [PATCH 05/37] Bump multidict from 6.3.1 to 6.3.2 (#10695) Bumps 
[multidict](https://github.com/aio-libs/multidict) from 6.3.1 to 6.3.2.
Release notes

Sourced from multidict's releases.

6.3.2

Bug fixes

  • Resolved a memory leak by ensuring proper reference count decrementation -- by :user:`asvetlov` and :user:`bdraco`.

    Related issues and pull requests on GitHub: #1121.


Changelog

Sourced from multidict's changelog.

6.3.2

(2025-04-03)

Bug fixes

  • Resolved a memory leak by ensuring proper reference count decrementation -- by :user:`asvetlov` and :user:`bdraco`.

    Related issues and pull requests on GitHub: :issue:`1121`.


Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=multidict&package-manager=pip&previous-version=6.3.1&new-version=6.3.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index efbdf3d5436..3fe85928b13 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -26,7 +26,7 @@ gunicorn==23.0.0 # via -r requirements/base.in idna==3.4 # via yarl -multidict==6.3.1 +multidict==6.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 70f4833e38f..f96d0bf236e 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -111,7 +111,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.3.1 +multidict==6.3.2 # via # -r requirements/multidict.in # -r requirements/runtime-deps.in diff --git a/requirements/cython.txt b/requirements/cython.txt index 31272de30f3..7a3b4737f54 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -6,7 +6,7 @@ # cython==3.0.12 # via -r requirements/cython.in -multidict==6.3.1 +multidict==6.3.2 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index f9a3c08c6bd..0888d764aa0 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -109,7 +109,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.3.1 +multidict==6.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/multidict.txt b/requirements/multidict.txt index 405d04c027b..3237ab6c359 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -4,7 +4,7 @@ # # pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in # -multidict==6.3.1 +multidict==6.3.2 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 9dbcc807684..fddb7252229 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -24,7 +24,7 @@ frozenlist==1.5.0 # aiosignal idna==3.4 # via yarl -multidict==6.3.1 +multidict==6.3.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index b3a8368d3cf..28159c29844 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -57,7 +57,7 @@ markdown-it-py==3.0.0 # via rich mdurl==0.1.2 # via markdown-it-py -multidict==6.3.1 +multidict==6.3.2 # via # -r requirements/runtime-deps.in # yarl From 89f17288da5aa614ae55a4a4c4544fc499a5cc0f Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 4 Apr 2025 11:55:20 +0000 Subject: [PATCH 06/37] Bump pydantic from 2.11.1 to 2.11.2 (#10694) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [pydantic](https://github.com/pydantic/pydantic) from 2.11.1 to 2.11.2.
Release notes

Sourced from pydantic's releases.

v2.11.2 2025-04-03

What's Changed

Fixes

Full Changelog: https://github.com/pydantic/pydantic/compare/v2.11.1...v2.11.2

Changelog

Sourced from pydantic's changelog.

v2.11.2 (2025-04-03)

GitHub release

What's Changed

Fixes

  • Bump pydantic-core to v2.33.1 by @Viicos in #11678
  • Make sure __pydantic_private__ exists before setting private attributes by @Viicos in #11666
  • Do not override FieldInfo._complete when using field from parent class by @Viicos in #11668
  • Provide the available definitions when applying discriminated unions by @Viicos in #11670
  • Do not expand root type in the mypy plugin for variables by @Viicos in #11676
  • Mention the attribute name in model fields deprecation message by @Viicos in #11674
  • Properly validate parameterized mappings by @Viicos in #11658
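
For context on the last item, a minimal sketch of a model with a parameterized mapping (the model and field names are made up for illustration):

```python
from typing import Mapping

from pydantic import BaseModel


class Config(BaseModel):
    limits: Mapping[str, int]


# Keys and values of the parameterized mapping are validated individually;
# the string "100" is coerced to int in the default (lax) mode.
print(Config(limits={"requests": 100}))
print(Config.model_validate({"limits": {"requests": "100"}}))
```
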
Commits
  • bd1f8cf Prepare release v2.11.2 (#11684)
  • f70f291 Add integration documentation for llms.txt (#11677)
  • 34095c7 Properly validate parameterized mappings (#11658)
  • dfa6c67 Mention the attribute name in model fields deprecation message (#11674)
  • cbf4202 Do not expand root type in the mypy plugin for variables (#11676)
  • 8b0825a Provide the available definitions when applying discriminated unions (#11670)
  • 86c5703 Do not override FieldInfo._complete when using field from parent class (#11...
  • da84149 Make sure __pydantic_private__ exists before setting private attributes (#1...
  • 0cfe853 Bump pydantic-core to v2.33.1 (#11678)
  • See full diff in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pydantic&package-manager=pip&previous-version=2.11.1&new-version=2.11.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 4 ++-- requirements/dev.txt | 4 ++-- requirements/lint.txt | 4 ++-- requirements/test.txt | 4 ++-- 4 files changed, 8 insertions(+), 8 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index f96d0bf236e..df736e5c481 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -148,9 +148,9 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.1 +pydantic==2.11.2 # via python-on-whales -pydantic-core==2.33.0 +pydantic-core==2.33.1 # via pydantic pyenchant==3.2.2 # via sphinxcontrib-spelling diff --git a/requirements/dev.txt b/requirements/dev.txt index 0888d764aa0..45ca59e2fa0 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -145,9 +145,9 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.1 +pydantic==2.11.2 # via python-on-whales -pydantic-core==2.33.0 +pydantic-core==2.33.1 # via pydantic pygments==2.19.1 # via diff --git a/requirements/lint.txt b/requirements/lint.txt index fb4680014e1..fd67ca91910 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -61,9 +61,9 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.1 +pydantic==2.11.2 # via python-on-whales -pydantic-core==2.33.0 +pydantic-core==2.33.1 # via pydantic pygments==2.19.1 # via rich diff --git a/requirements/test.txt b/requirements/test.txt index 28159c29844..c67ee7a1f7f 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -81,9 +81,9 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.1 +pydantic==2.11.2 # via python-on-whales -pydantic-core==2.33.0 +pydantic-core==2.33.1 # via pydantic pygments==2.19.1 # via rich From eb1abe4982ef9088efbb06ab6343796f44560754 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Mon, 7 Apr 2025 03:19:52 +0000 Subject: [PATCH 07/37] [PR #10698/25693469 backport][3.12] Bump yarl to 1.19.0 (#10699) --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 3fe85928b13..a2e94325437 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -44,5 +44,5 @@ typing-extensions==4.13.1 # via multidict uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via -r requirements/base.in -yarl==1.18.3 +yarl==1.19.0 # via -r requirements/runtime-deps.in diff --git a/requirements/constraints.txt b/requirements/constraints.txt index df736e5c481..a18b548d9ea 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -285,7 +285,7 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.45.1 # via pip-tools -yarl==1.18.3 +yarl==1.19.0 # via -r requirements/runtime-deps.in # The following packages are considered to be unsafe in a requirements file: diff --git a/requirements/dev.txt b/requirements/dev.txt index 45ca59e2fa0..f8df5ff8903 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -276,7 +276,7 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.45.1 # via pip-tools -yarl==1.18.3 +yarl==1.19.0 # via -r requirements/runtime-deps.in # The following packages are considered to be unsafe in a requirements file: diff --git 
a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index fddb7252229..69190e8f6b8 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -38,5 +38,5 @@ pycparser==2.22 # via cffi typing-extensions==4.13.1 # via multidict -yarl==1.18.3 +yarl==1.19.0 # via -r requirements/runtime-deps.in diff --git a/requirements/test.txt b/requirements/test.txt index c67ee7a1f7f..e4bcbbae36b 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -138,5 +138,5 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho # via -r requirements/base.in wait-for-it==2.3.0 # via -r requirements/test.in -yarl==1.18.3 +yarl==1.19.0 # via -r requirements/runtime-deps.in From a2569f9366ea62016f90733fbc7a5e1cddbb8cce Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Mon, 7 Apr 2025 11:30:43 +0000 Subject: [PATCH 08/37] Bump pytest-cov from 6.1.0 to 6.1.1 (#10703) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [pytest-cov](https://github.com/pytest-dev/pytest-cov) from 6.1.0 to 6.1.1.
Changelog

Sourced from pytest-cov's changelog.

6.1.1 (2025-04-05)

  • Fixed breakage that occurs when --cov-context and the no_cover marker are used together.
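
A hedged sketch of the combination that used to break (the package name in the command is a placeholder):

```python
import pytest


def test_measured():
    assert 1 + 1 == 2


@pytest.mark.no_cover  # pytest-cov skips coverage measurement for this test
def test_unmeasured():
    assert True

# The previously failing combination, now fixed:
#   pytest --cov=yourpkg --cov-context=test
```
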
Commits
  • 9463242 Bump version: 6.1.0 → 6.1.1
  • 7f2781b Update changelog.
  • a59548f Allow the context plugin to check if the controller is running or not. Fixes ...
  • See full diff in compare view

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pytest-cov&package-manager=pip&previous-version=6.1.0&new-version=6.1.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/test.txt | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index a18b548d9ea..ae114cd4abd 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -178,7 +178,7 @@ pytest-codspeed==3.2.0 # via # -r requirements/lint.in # -r requirements/test.in -pytest-cov==6.1.0 +pytest-cov==6.1.1 # via -r requirements/test.in pytest-mock==3.14.0 # via diff --git a/requirements/dev.txt b/requirements/dev.txt index f8df5ff8903..265a6cb1f98 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -173,7 +173,7 @@ pytest-codspeed==3.2.0 # via # -r requirements/lint.in # -r requirements/test.in -pytest-cov==6.1.0 +pytest-cov==6.1.1 # via -r requirements/test.in pytest-mock==3.14.0 # via diff --git a/requirements/test.txt b/requirements/test.txt index e4bcbbae36b..4114bcc2b7c 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -96,7 +96,7 @@ pytest==8.3.5 # pytest-xdist pytest-codspeed==3.2.0 # via -r requirements/test.in -pytest-cov==6.1.0 +pytest-cov==6.1.1 # via -r requirements/test.in pytest-mock==3.14.0 # via -r requirements/test.in From da9e8e8bbf59388dbbc5b4c9b4f5f933bb4fe540 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Wed, 9 Apr 2025 11:28:13 +0000 Subject: [PATCH 09/37] Bump pydantic from 2.11.2 to 2.11.3 (#10710) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [pydantic](https://github.com/pydantic/pydantic) from 2.11.2 to 2.11.3.
Release notes

Sourced from pydantic's releases.

v2.11.3 2025-04-08

What's Changed

Packaging

Fixes

Full Changelog: https://github.com/pydantic/pydantic/compare/v2.11.2...v2.11.3

Changelog

Sourced from pydantic's changelog.

v2.11.3 (2025-04-08)

GitHub release

What's Changed

Packaging

Fixes

  • Preserve field description when rebuilding model fields by @Viicos in #11698
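
A small sketch of the fixed behavior (the model is illustrative, not taken from the changelog):

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(description="Display name")


# Rebuilding the model should keep field metadata such as the description.
User.model_rebuild(force=True)
assert User.model_fields["name"].description == "Display name"
```
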
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pydantic&package-manager=pip&previous-version=2.11.2&new-version=2.11.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/lint.txt | 2 +- requirements/test.txt | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index ae114cd4abd..ea91f5c1b3d 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -148,7 +148,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.2 +pydantic==2.11.3 # via python-on-whales pydantic-core==2.33.1 # via pydantic diff --git a/requirements/dev.txt b/requirements/dev.txt index 265a6cb1f98..7278bf42682 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -145,7 +145,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.2 +pydantic==2.11.3 # via python-on-whales pydantic-core==2.33.1 # via pydantic diff --git a/requirements/lint.txt b/requirements/lint.txt index fd67ca91910..303d2756904 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -61,7 +61,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.2 +pydantic==2.11.3 # via python-on-whales pydantic-core==2.33.1 # via pydantic diff --git a/requirements/test.txt b/requirements/test.txt index 4114bcc2b7c..8eb141f314c 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -81,7 +81,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -pydantic==2.11.2 +pydantic==2.11.3 # via python-on-whales pydantic-core==2.33.1 # via pydantic From 38373d6be8087001b4f6682b0568d2a03e149d6b Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 10 Apr 2025 01:00:11 +0000 Subject: [PATCH 10/37] Bump multidict from 6.3.2 to 6.4.2 (#10712) Bumps [multidict](https://github.com/aio-libs/multidict) from 6.3.2 to 6.4.2.
Release notes

Sourced from multidict's releases.

6.4.2

Bug fixes

  • Fixed a segmentation fault when creating subclassed :py:class:`~multidict.MultiDict` objects on Python < 3.11 -- by :user:`bdraco`.

    The problem first appeared in 6.4.0.

    Related issues and pull requests on GitHub: #1141.


6.4.1

No change release of 6.4.0 since the attestations failed to upload to GitHub


6.4.0

Bug fixes

  • Fixed a memory leak when creating new :class:`~multidict.istr` objects -- by :user:`bdraco`.

    The leak was introduced in 6.3.0.

    Related issues and pull requests on GitHub: #1133.

  • Fixed reference counting when calling :py:meth:`multidict.MultiDict.update` -- by :user:`bdraco`.

    The leak was introduced in 4.4.0.

    Related issues and pull requests on GitHub: #1135.

Features

  • Switched the C Extension to use heap types and the module state.

    Related issues and pull requests on GitHub: #1125.

  • Started building armv7l wheels -- by :user:`bdraco`.

    Related issues and pull requests on GitHub:

... (truncated)

Changelog

Sourced from multidict's changelog.

6.4.2

(2025-04-09)

Bug fixes

  • Fixed a segmentation fault when creating subclassed :py:class:`~multidict.MultiDict` objects on Python < 3.11 -- by :user:`bdraco`.

    The problem first appeared in 6.4.0.

    Related issues and pull requests on GitHub: :issue:`1141`.


6.4.1

(2025-04-09)

No significant changes.


6.4.0

(2025-04-09)

Bug fixes

  • Fixed a memory leak when creating new :class:`~multidict.istr` objects -- by :user:`bdraco`.

    The leak was introduced in 6.3.0.

    Related issues and pull requests on GitHub: :issue:`1133`.

  • Fixed reference counting when calling :py:meth:`multidict.MultiDict.update` -- by :user:`bdraco`.

... (truncated)

Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=multidict&package-manager=pip&previous-version=6.3.2&new-version=6.4.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index a2e94325437..1c553cd3875 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -26,7 +26,7 @@ gunicorn==23.0.0 # via -r requirements/base.in idna==3.4 # via yarl -multidict==6.3.2 +multidict==6.4.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index ea91f5c1b3d..acfa6facf51 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -111,7 +111,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.3.2 +multidict==6.4.2 # via # -r requirements/multidict.in # -r requirements/runtime-deps.in diff --git a/requirements/cython.txt b/requirements/cython.txt index 7a3b4737f54..e472d1de6dc 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -6,7 +6,7 @@ # cython==3.0.12 # via -r requirements/cython.in -multidict==6.3.2 +multidict==6.4.2 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index 7278bf42682..ca4ad7751a1 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -109,7 +109,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.3.2 +multidict==6.4.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/multidict.txt b/requirements/multidict.txt index 3237ab6c359..70a4468156f 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -4,7 +4,7 @@ # # pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in # -multidict==6.3.2 +multidict==6.4.2 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 69190e8f6b8..227515f3f7a 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -24,7 +24,7 @@ frozenlist==1.5.0 # aiosignal idna==3.4 # via yarl -multidict==6.3.2 +multidict==6.4.2 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index 8eb141f314c..13cb2904eb1 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -57,7 +57,7 @@ markdown-it-py==3.0.0 # via rich mdurl==0.1.2 # via markdown-it-py -multidict==6.3.2 +multidict==6.4.2 # via # -r requirements/runtime-deps.in # yarl From 87ada6b878c2f05acf85d9940ac4c1d7c91c3d88 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 10 Apr 2025 23:26:25 +0000 Subject: [PATCH 11/37] Bump urllib3 from 2.3.0 to 2.4.0 (#10715) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.3.0 to 2.4.0.
Release notes

Sourced from urllib3's releases.

2.4.0

🚀 urllib3 is fundraising for HTTP/2 support

urllib3 is raising ~$40,000 USD to release HTTP/2 support and ensure long-term sustainable maintenance of the project after a sharp decline in financial support. If your company or organization uses Python and would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and thousands of other projects please consider contributing financially to ensure HTTP/2 support is developed sustainably and maintained for the long-haul.

Thank you for your support.

Features

  • Applied PEP 639 by specifying the license fields in pyproject.toml. (#3522)
  • Updated exceptions to save and restore more properties during the pickle/serialization process. (#3567)
  • Added verify_flags option to create_urllib3_context with a default of VERIFY_X509_PARTIAL_CHAIN and VERIFY_X509_STRICT for Python 3.13+. (#3571)
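
A short sketch of the new option (assuming the keyword named in the notes; the flag constants come from the standard ssl module):

```python
import ssl

from urllib3.util import create_urllib3_context

# Opt in to the stricter verification flags explicitly, regardless of
# the running Python version's defaults.
ctx = create_urllib3_context(
    verify_flags=ssl.VERIFY_X509_PARTIAL_CHAIN | ssl.VERIFY_X509_STRICT,
)
print(ctx.verify_flags)
```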

Bugfixes

  • Fixed a bug with partial reads of streaming data in Emscripten. (#3555)

Misc

  • Switched to uv for installing development dependencies. (#3550)
  • Removed the multiple.intoto.jsonl asset from GitHub releases. Attestation of release files since v2.3.0 can be found on PyPI. (#3566)
Changelog

Sourced from urllib3's changelog.

2.4.0 (2025-04-10)

Features

  • Applied PEP 639 by specifying the license fields in pyproject.toml. ([#3522](https://github.com/urllib3/urllib3/issues/3522))
  • Updated exceptions to save and restore more properties during the pickle/serialization process. ([#3567](https://github.com/urllib3/urllib3/issues/3567))
  • Added verify_flags option to create_urllib3_context with a default of VERIFY_X509_PARTIAL_CHAIN and VERIFY_X509_STRICT for Python 3.13+. ([#3571](https://github.com/urllib3/urllib3/issues/3571))

Bugfixes

  • Fixed a bug with partial reads of streaming data in Emscripten. ([#3555](https://github.com/urllib3/urllib3/issues/3555))

Misc

  • Switched to uv for installing development dependencies. ([#3550](https://github.com/urllib3/urllib3/issues/3550))
  • Removed the multiple.intoto.jsonl asset from GitHub releases. Attestation of release files since v2.3.0 can be found on PyPI. ([#3566](https://github.com/urllib3/urllib3/issues/3566))
Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=2.3.0&new-version=2.4.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/doc-spelling.txt | 2 +- requirements/doc.txt | 2 +- 4 files changed, 4 insertions(+), 4 deletions(-) diff --git a/requirements/constraints.txt b/requirements/constraints.txt index acfa6facf51..7f0243cc1d0 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -271,7 +271,7 @@ typing-inspection==0.4.0 # via pydantic uritemplate==4.1.1 # via gidgethub -urllib3==2.3.0 +urllib3==2.4.0 # via requests uvloop==0.21.0 ; platform_system != "Windows" # via diff --git a/requirements/dev.txt b/requirements/dev.txt index ca4ad7751a1..d501d2dfcbd 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -262,7 +262,7 @@ typing-inspection==0.4.0 # via pydantic uritemplate==4.1.1 # via gidgethub -urllib3==2.3.0 +urllib3==2.4.0 # via requests uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via diff --git a/requirements/doc-spelling.txt b/requirements/doc-spelling.txt index dfd7f09765d..fe5d7e5708d 100644 --- a/requirements/doc-spelling.txt +++ b/requirements/doc-spelling.txt @@ -72,7 +72,7 @@ towncrier==23.11.0 # via # -r requirements/doc.in # sphinxcontrib-towncrier -urllib3==2.3.0 +urllib3==2.4.0 # via requests # The following packages are considered to be unsafe in a requirements file: diff --git a/requirements/doc.txt b/requirements/doc.txt index 15356c89a9e..086c945725e 100644 --- a/requirements/doc.txt +++ b/requirements/doc.txt @@ -65,7 +65,7 @@ towncrier==23.11.0 # via # -r requirements/doc.in # sphinxcontrib-towncrier -urllib3==2.3.0 +urllib3==2.4.0 # via requests # The following packages are considered to be unsafe in a requirements file: From abb3e87f3430f3d4e95a33f62f79882fb065325b Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 10 Apr 2025 23:51:56 +0000 Subject: [PATCH 12/37] Bump multidict from 6.4.2 to 6.4.3 (#10716) Bumps [multidict](https://github.com/aio-libs/multidict) from 6.4.2 to 6.4.3.
Release notes

Sourced from multidict's releases.

6.4.3

Bug fixes

  • Fixed building the library in debug mode.

    Related issues and pull requests on GitHub: #1144.

  • Fixed the custom PyType_GetModuleByDef() when a non-heap type object was passed.

    Related issues and pull requests on GitHub: #1147.

Packaging updates and notes for downstreams

  • Added the ability to build in debug mode by setting :envvar:MULTIDICT_DEBUG_BUILD in the environment -- by :user:bdraco.

    Related issues and pull requests on GitHub: #1145.
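A hedged sketch of opting into the new debug build from Python, assuming MULTIDICT_DEBUG_BUILD merely needs to be set in the environment while the C extension compiles from source:

    import os
    import subprocess
    import sys

    # Force a source build so the flag is seen at compile time; a binary
    # wheel would skip compilation entirely.
    env = dict(os.environ, MULTIDICT_DEBUG_BUILD="1")
    subprocess.run(
        [sys.executable, "-m", "pip", "install",
         "--no-binary", "multidict", "multidict==6.4.3"],
        env=env,
        check=True,
    )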


Changelog

Sourced from multidict's changelog.

6.4.3

(2025-04-10)

Bug fixes

  • Fixed building the library in debug mode.

    Related issues and pull requests on GitHub: :issue:1144.

  • Fixed the custom PyType_GetModuleByDef() when a non-heap type object was passed.

    Related issues and pull requests on GitHub: :issue:1147.

Packaging updates and notes for downstreams

  • Added the ability to build in debug mode by setting :envvar:MULTIDICT_DEBUG_BUILD in the environment -- by :user:bdraco.

    Related issues and pull requests on GitHub: :issue:1145.


Commits

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=multidict&package-manager=pip&previous-version=6.4.2&new-version=6.4.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 1c553cd3875..08beaa66522 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -26,7 +26,7 @@ gunicorn==23.0.0 # via -r requirements/base.in idna==3.4 # via yarl -multidict==6.4.2 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 7f0243cc1d0..e8a2d85b2bb 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -111,7 +111,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.4.2 +multidict==6.4.3 # via # -r requirements/multidict.in # -r requirements/runtime-deps.in diff --git a/requirements/cython.txt b/requirements/cython.txt index e472d1de6dc..d5661f8fff3 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -6,7 +6,7 @@ # cython==3.0.12 # via -r requirements/cython.in -multidict==6.4.2 +multidict==6.4.3 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index d501d2dfcbd..5aa7fd7c174 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -109,7 +109,7 @@ markupsafe==3.0.2 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.4.2 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/multidict.txt b/requirements/multidict.txt index 70a4468156f..64a6ea16b87 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -4,7 +4,7 @@ # # pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in # -multidict==6.4.2 +multidict==6.4.3 # via -r requirements/multidict.in typing-extensions==4.13.1 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 227515f3f7a..3fcc493e191 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -24,7 +24,7 @@ frozenlist==1.5.0 # aiosignal idna==3.4 # via yarl -multidict==6.4.2 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index 13cb2904eb1..3b16120500c 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -57,7 +57,7 @@ markdown-it-py==3.0.0 # via rich mdurl==0.1.2 # via markdown-it-py -multidict==6.4.2 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl From 5268479e62179542539d968222081bef1569956c Mon Sep 17 00:00:00 2001 From: Tim Menninger Date: Mon, 14 Apr 2025 14:06:54 -0700 Subject: [PATCH 13/37] [PR #10700/ceeca6a backport][3.12] Add support for switching the zlib implementation (#10723) --- CHANGES/9798.feature.rst | 5 + aiohttp/__init__.py | 2 + aiohttp/_websocket/reader_py.py | 18 +- aiohttp/_websocket/writer.py | 9 +- aiohttp/abc.py | 3 +- aiohttp/compression_utils.py | 147 +++++++++++-- aiohttp/http_writer.py | 3 +- aiohttp/multipart.py | 3 +- aiohttp/web_response.py | 5 +- docs/client_reference.rst | 24 +++ docs/conf.py | 2 + docs/spelling_wordlist.txt | 1 + docs/web_reference.rst | 7 +- requirements/dev.txt | 4 + requirements/lint.in | 2 
+ requirements/lint.txt | 4 + requirements/test.in | 2 + requirements/test.txt | 4 + tests/conftest.py | 16 ++ tests/test_client_functional.py | 5 + tests/test_client_request.py | 7 +- tests/test_client_ws_functional.py | 2 + tests/test_compression_utils.py | 18 +- tests/test_http_writer.py | 291 +++++++++++++++++++++++++- tests/test_multipart.py | 5 +- tests/test_web_functional.py | 40 ++-- tests/test_web_response.py | 32 ++- tests/test_web_sendfile_functional.py | 6 +- tests/test_websocket_parser.py | 15 +- tests/test_websocket_writer.py | 1 + 30 files changed, 606 insertions(+), 77 deletions(-) create mode 100644 CHANGES/9798.feature.rst diff --git a/CHANGES/9798.feature.rst b/CHANGES/9798.feature.rst new file mode 100644 index 00000000000..c1584b04491 --- /dev/null +++ b/CHANGES/9798.feature.rst @@ -0,0 +1,5 @@ +Allow user setting zlib compression backend -- by :user:`TimMenninger` + +This change allows the user to call :func:`aiohttp.set_zlib_backend()` with the +zlib compression module of their choice. Default behavior continues to use +the builtin ``zlib`` library. diff --git a/aiohttp/__init__.py b/aiohttp/__init__.py index 66645143fc9..6321e713ed4 100644 --- a/aiohttp/__init__.py +++ b/aiohttp/__init__.py @@ -47,6 +47,7 @@ WSServerHandshakeError, request, ) +from .compression_utils import set_zlib_backend from .connector import ( AddrInfoType as AddrInfoType, SocketFactoryType as SocketFactoryType, @@ -183,6 +184,7 @@ "BasicAuth", "ChainMapProxy", "ETag", + "set_zlib_backend", # http "HttpVersion", "HttpVersion10", diff --git a/aiohttp/_websocket/reader_py.py b/aiohttp/_websocket/reader_py.py index 92ad47a52f0..19579bd39a8 100644 --- a/aiohttp/_websocket/reader_py.py +++ b/aiohttp/_websocket/reader_py.py @@ -238,15 +238,23 @@ def _feed_data(self, data: bytes) -> None: self._decompressobj = ZLibDecompressor( suppress_deflate_header=True ) + # XXX: It's possible that the zlib backend (isal is known to + # do this, maybe others too?) will return max_length bytes, + # but internally buffer more data such that the payload is + # >max_length, so we return one extra byte and if we're able + # to do that, then the message is too big. 
payload_merged = self._decompressobj.decompress_sync( - assembled_payload + WS_DEFLATE_TRAILING, self._max_msg_size + assembled_payload + WS_DEFLATE_TRAILING, + ( + self._max_msg_size + 1 + if self._max_msg_size + else self._max_msg_size + ), ) - if self._decompressobj.unconsumed_tail: - left = len(self._decompressobj.unconsumed_tail) + if self._max_msg_size and len(payload_merged) > self._max_msg_size: raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, - f"Decompressed message size {self._max_msg_size + left}" - f" exceeds limit {self._max_msg_size}", + f"Decompressed message exceeds size limit {self._max_msg_size}", ) elif type(assembled_payload) is bytes: payload_merged = assembled_payload diff --git a/aiohttp/_websocket/writer.py b/aiohttp/_websocket/writer.py index fc2cf32b934..19163f9afdf 100644 --- a/aiohttp/_websocket/writer.py +++ b/aiohttp/_websocket/writer.py @@ -2,13 +2,12 @@ import asyncio import random -import zlib from functools import partial from typing import Any, Final, Optional, Union from ..base_protocol import BaseProtocol from ..client_exceptions import ClientConnectionResetError -from ..compression_utils import ZLibCompressor +from ..compression_utils import ZLibBackend, ZLibCompressor from .helpers import ( MASK_LEN, MSG_SIZE, @@ -95,7 +94,9 @@ async def send_frame( message = ( await compressobj.compress(message) + compressobj.flush( - zlib.Z_FULL_FLUSH if self.notakeover else zlib.Z_SYNC_FLUSH + ZLibBackend.Z_FULL_FLUSH + if self.notakeover + else ZLibBackend.Z_SYNC_FLUSH ) ).removesuffix(WS_DEFLATE_TRAILING) # Its critical that we do not return control to the event @@ -160,7 +161,7 @@ async def send_frame( def _make_compress_obj(self, compress: int) -> ZLibCompressor: return ZLibCompressor( - level=zlib.Z_BEST_SPEED, + level=ZLibBackend.Z_BEST_SPEED, wbits=-compress, max_sync_chunk_size=WEBSOCKET_MAX_SYNC_CHUNK_SIZE, ) diff --git a/aiohttp/abc.py b/aiohttp/abc.py index 5794a9108b0..3c4f8c61b00 100644 --- a/aiohttp/abc.py +++ b/aiohttp/abc.py @@ -1,7 +1,6 @@ import asyncio import logging import socket -import zlib from abc import ABC, abstractmethod from collections.abc import Sized from http.cookies import BaseCookie, Morsel @@ -219,7 +218,7 @@ async def drain(self) -> None: @abstractmethod def enable_compression( - self, encoding: str = "deflate", strategy: int = zlib.Z_DEFAULT_STRATEGY + self, encoding: str = "deflate", strategy: Optional[int] = None ) -> None: """Enable HTTP body compression""" diff --git a/aiohttp/compression_utils.py b/aiohttp/compression_utils.py index ebe8857f487..f08c3d9cdff 100644 --- a/aiohttp/compression_utils.py +++ b/aiohttp/compression_utils.py @@ -1,7 +1,15 @@ import asyncio +import sys import zlib from concurrent.futures import Executor -from typing import Optional, cast +from typing import Any, Final, Optional, Protocol, TypedDict, cast + +if sys.version_info >= (3, 12): + from collections.abc import Buffer +else: + from typing import Union + + Buffer = Union[bytes, bytearray, "memoryview[int]", "memoryview[bytes]"] try: try: @@ -16,14 +24,113 @@ MAX_SYNC_CHUNK_SIZE = 1024 +class ZLibCompressObjProtocol(Protocol): + def compress(self, data: Buffer) -> bytes: ... + def flush(self, mode: int = ..., /) -> bytes: ... + + +class ZLibDecompressObjProtocol(Protocol): + def decompress(self, data: Buffer, max_length: int = ...) -> bytes: ... + def flush(self, length: int = ..., /) -> bytes: ... + + @property + def eof(self) -> bool: ... 
+ + +class ZLibBackendProtocol(Protocol): + MAX_WBITS: int + Z_FULL_FLUSH: int + Z_SYNC_FLUSH: int + Z_BEST_SPEED: int + Z_FINISH: int + + def compressobj( + self, + level: int = ..., + method: int = ..., + wbits: int = ..., + memLevel: int = ..., + strategy: int = ..., + zdict: Optional[Buffer] = ..., + ) -> ZLibCompressObjProtocol: ... + def decompressobj( + self, wbits: int = ..., zdict: Buffer = ... + ) -> ZLibDecompressObjProtocol: ... + + def compress( + self, data: Buffer, /, level: int = ..., wbits: int = ... + ) -> bytes: ... + def decompress( + self, data: Buffer, /, wbits: int = ..., bufsize: int = ... + ) -> bytes: ... + + +class CompressObjArgs(TypedDict, total=False): + wbits: int + strategy: int + level: int + + +class ZLibBackendWrapper: + def __init__(self, _zlib_backend: ZLibBackendProtocol): + self._zlib_backend: ZLibBackendProtocol = _zlib_backend + + @property + def name(self) -> str: + return getattr(self._zlib_backend, "__name__", "undefined") + + @property + def MAX_WBITS(self) -> int: + return self._zlib_backend.MAX_WBITS + + @property + def Z_FULL_FLUSH(self) -> int: + return self._zlib_backend.Z_FULL_FLUSH + + @property + def Z_SYNC_FLUSH(self) -> int: + return self._zlib_backend.Z_SYNC_FLUSH + + @property + def Z_BEST_SPEED(self) -> int: + return self._zlib_backend.Z_BEST_SPEED + + @property + def Z_FINISH(self) -> int: + return self._zlib_backend.Z_FINISH + + def compressobj(self, *args: Any, **kwargs: Any) -> ZLibCompressObjProtocol: + return self._zlib_backend.compressobj(*args, **kwargs) + + def decompressobj(self, *args: Any, **kwargs: Any) -> ZLibDecompressObjProtocol: + return self._zlib_backend.decompressobj(*args, **kwargs) + + def compress(self, data: Buffer, *args: Any, **kwargs: Any) -> bytes: + return self._zlib_backend.compress(data, *args, **kwargs) + + def decompress(self, data: Buffer, *args: Any, **kwargs: Any) -> bytes: + return self._zlib_backend.decompress(data, *args, **kwargs) + + # Everything not explicitly listed in the Protocol we just pass through + def __getattr__(self, attrname: str) -> Any: + return getattr(self._zlib_backend, attrname) + + +ZLibBackend: ZLibBackendWrapper = ZLibBackendWrapper(zlib) + + +def set_zlib_backend(new_zlib_backend: ZLibBackendProtocol) -> None: + ZLibBackend._zlib_backend = new_zlib_backend + + def encoding_to_mode( encoding: Optional[str] = None, suppress_deflate_header: bool = False, ) -> int: if encoding == "gzip": - return 16 + zlib.MAX_WBITS + return 16 + ZLibBackend.MAX_WBITS - return -zlib.MAX_WBITS if suppress_deflate_header else zlib.MAX_WBITS + return -ZLibBackend.MAX_WBITS if suppress_deflate_header else ZLibBackend.MAX_WBITS class ZlibBaseHandler: @@ -45,7 +152,7 @@ def __init__( suppress_deflate_header: bool = False, level: Optional[int] = None, wbits: Optional[int] = None, - strategy: int = zlib.Z_DEFAULT_STRATEGY, + strategy: Optional[int] = None, executor: Optional[Executor] = None, max_sync_chunk_size: Optional[int] = MAX_SYNC_CHUNK_SIZE, ): @@ -58,12 +165,15 @@ def __init__( executor=executor, max_sync_chunk_size=max_sync_chunk_size, ) - if level is None: - self._compressor = zlib.compressobj(wbits=self._mode, strategy=strategy) - else: - self._compressor = zlib.compressobj( - wbits=self._mode, strategy=strategy, level=level - ) + self._zlib_backend: Final = ZLibBackendWrapper(ZLibBackend._zlib_backend) + + kwargs: CompressObjArgs = {} + kwargs["wbits"] = self._mode + if strategy is not None: + kwargs["strategy"] = strategy + if level is not None: + kwargs["level"] = level + 
self._compressor = self._zlib_backend.compressobj(**kwargs) self._compress_lock = asyncio.Lock() def compress_sync(self, data: bytes) -> bytes: @@ -92,8 +202,10 @@ async def compress(self, data: bytes) -> bytes: ) return self.compress_sync(data) - def flush(self, mode: int = zlib.Z_FINISH) -> bytes: - return self._compressor.flush(mode) + def flush(self, mode: Optional[int] = None) -> bytes: + return self._compressor.flush( + mode if mode is not None else self._zlib_backend.Z_FINISH + ) class ZLibDecompressor(ZlibBaseHandler): @@ -109,7 +221,8 @@ def __init__( executor=executor, max_sync_chunk_size=max_sync_chunk_size, ) - self._decompressor = zlib.decompressobj(wbits=self._mode) + self._zlib_backend: Final = ZLibBackendWrapper(ZLibBackend._zlib_backend) + self._decompressor = self._zlib_backend.decompressobj(wbits=self._mode) def decompress_sync(self, data: bytes, max_length: int = 0) -> bytes: return self._decompressor.decompress(data, max_length) @@ -141,14 +254,6 @@ def flush(self, length: int = 0) -> bytes: def eof(self) -> bool: return self._decompressor.eof - @property - def unconsumed_tail(self) -> bytes: - return self._decompressor.unconsumed_tail - - @property - def unused_data(self) -> bytes: - return self._decompressor.unused_data - class BrotliDecompressor: # Supports both 'brotlipy' and 'Brotli' packages diff --git a/aiohttp/http_writer.py b/aiohttp/http_writer.py index e031a97708d..3e05628238d 100644 --- a/aiohttp/http_writer.py +++ b/aiohttp/http_writer.py @@ -2,7 +2,6 @@ import asyncio import sys -import zlib from typing import ( # noqa Any, Awaitable, @@ -80,7 +79,7 @@ def enable_chunking(self) -> None: self.chunked = True def enable_compression( - self, encoding: str = "deflate", strategy: int = zlib.Z_DEFAULT_STRATEGY + self, encoding: str = "deflate", strategy: Optional[int] = None ) -> None: self._compress = ZLibCompressor(encoding=encoding, strategy=strategy) diff --git a/aiohttp/multipart.py b/aiohttp/multipart.py index bd4d8ae1ddf..459cc321a1d 100644 --- a/aiohttp/multipart.py +++ b/aiohttp/multipart.py @@ -5,7 +5,6 @@ import sys import uuid import warnings -import zlib from collections import deque from types import TracebackType from typing import ( @@ -1028,7 +1027,7 @@ def enable_encoding(self, encoding: str) -> None: self._encoding = "quoted-printable" def enable_compression( - self, encoding: str = "deflate", strategy: int = zlib.Z_DEFAULT_STRATEGY + self, encoding: str = "deflate", strategy: Optional[int] = None ) -> None: self._compress = ZLibCompressor( encoding=encoding, diff --git a/aiohttp/web_response.py b/aiohttp/web_response.py index 151fbea3473..8a940ef43bf 100644 --- a/aiohttp/web_response.py +++ b/aiohttp/web_response.py @@ -6,7 +6,6 @@ import math import time import warnings -import zlib from concurrent.futures import Executor from http import HTTPStatus from http.cookies import SimpleCookie @@ -82,7 +81,7 @@ class StreamResponse(BaseClass, HeadersMixin): _keep_alive: Optional[bool] = None _chunked: bool = False _compression: bool = False - _compression_strategy: int = zlib.Z_DEFAULT_STRATEGY + _compression_strategy: Optional[int] = None _compression_force: Optional[ContentCoding] = None _req: Optional["BaseRequest"] = None _payload_writer: Optional[AbstractStreamWriter] = None @@ -192,7 +191,7 @@ def enable_chunked_encoding(self, chunk_size: Optional[int] = None) -> None: def enable_compression( self, force: Optional[Union[bool, ContentCoding]] = None, - strategy: int = zlib.Z_DEFAULT_STRATEGY, + strategy: Optional[int] = None, ) -> None: 
"""Enables response compression encoding.""" # Backwards compatibility for when force was a bool <0.17. diff --git a/docs/client_reference.rst b/docs/client_reference.rst index a99db06764b..8d01d726e1c 100644 --- a/docs/client_reference.rst +++ b/docs/client_reference.rst @@ -2163,6 +2163,30 @@ Utilities .. versionadded:: 3.0 +.. function:: set_zlib_backend(lib) + + Sets the compression backend for zlib-based operations. + + This function allows you to override the default zlib backend + used internally by passing a module that implements the standard + compression interface. + + The module should implement at minimum the exact interface offered by the + latest version of zlib. + + :param types.ModuleType lib: A module that implements the zlib-compatible compression API. + + Example usage:: + + import zlib_ng.zlib_ng as zng + import aiohttp + + aiohttp.set_zlib_backend(zng) + + .. note:: aiohttp has been tested internally with :mod:`zlib`, :mod:`zlib_ng.zlib_ng`, and :mod:`isal.isal_zlib`. + + .. versionadded:: 3.12 + FormData ^^^^^^^^ diff --git a/docs/conf.py b/docs/conf.py index 595f02efb89..84dadfc8442 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -85,6 +85,8 @@ "aiohttpdemos": ("https://aiohttp-demos.readthedocs.io/en/latest/", None), "aiojobs": ("https://aiojobs.readthedocs.io/en/stable/", None), "aiohappyeyeballs": ("https://aiohappyeyeballs.readthedocs.io/en/latest/", None), + "isal": ("https://python-isal.readthedocs.io/en/stable/", None), + "zlib_ng": ("https://python-zlib-ng.readthedocs.io/en/stable/", None), } # Add any paths that contain templates here, relative to this directory. diff --git a/docs/spelling_wordlist.txt b/docs/spelling_wordlist.txt index 59ea99c40bb..f2321adb708 100644 --- a/docs/spelling_wordlist.txt +++ b/docs/spelling_wordlist.txt @@ -375,3 +375,4 @@ wss www xxx yarl +zlib diff --git a/docs/web_reference.rst b/docs/web_reference.rst index 62edd4c24aa..f2954b06b51 100644 --- a/docs/web_reference.rst +++ b/docs/web_reference.rst @@ -669,7 +669,7 @@ and :ref:`aiohttp-web-signals` handlers:: .. seealso:: :meth:`enable_compression` - .. method:: enable_compression(force=None, strategy=zlib.Z_DEFAULT_STRATEGY) + .. method:: enable_compression(force=None, strategy=None) Enable compression. @@ -680,7 +680,10 @@ and :ref:`aiohttp-web-signals` handlers:: :class:`ContentCoding`. *strategy* accepts a :mod:`zlib` compression strategy. - See :func:`zlib.compressobj` for possible values. + See :func:`zlib.compressobj` for possible values, or refer to the + docs for the zlib of your using, should you use :func:`aiohttp.set_zlib_backend` + to change zlib backend. If ``None``, the default value adopted by + your zlib backend will be used where applicable. .. 
seealso:: :attr:`compression` diff --git a/requirements/dev.txt b/requirements/dev.txt index 5aa7fd7c174..90d5c88acb5 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -99,6 +99,8 @@ incremental==24.7.2 # via towncrier iniconfig==2.1.0 # via pytest +isal==1.7.2 + # via -r requirements/test.in jinja2==3.1.6 # via # sphinx @@ -278,6 +280,8 @@ wheel==0.45.1 # via pip-tools yarl==1.19.0 # via -r requirements/runtime-deps.in +zlib_ng==0.5.1 + # via -r requirements/test.in # The following packages are considered to be unsafe in a requirements file: pip==25.0.1 diff --git a/requirements/lint.in b/requirements/lint.in index 4759dadc6a9..fe996d00176 100644 --- a/requirements/lint.in +++ b/requirements/lint.in @@ -1,6 +1,7 @@ aiodns blockbuster freezegun +isal mypy; implementation_name == "cpython" pre-commit pytest @@ -11,3 +12,4 @@ slotscheck trustme uvloop; platform_system != "Windows" valkey +zlib_ng diff --git a/requirements/lint.txt b/requirements/lint.txt index 303d2756904..b53cccca9f6 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -39,6 +39,8 @@ idna==3.7 # via trustme iniconfig==2.1.0 # via pytest +isal==1.7.2 + # via -r requirements/lint.in markdown-it-py==3.0.0 # via rich mdurl==0.1.2 @@ -111,3 +113,5 @@ valkey==6.1.0 # via -r requirements/lint.in virtualenv==20.30.0 # via pre-commit +zlib-ng==0.5.1 + # via -r requirements/lint.in diff --git a/requirements/test.in b/requirements/test.in index c6547bee5e5..91b5e115952 100644 --- a/requirements/test.in +++ b/requirements/test.in @@ -3,6 +3,7 @@ blockbuster coverage freezegun +isal mypy; implementation_name == "cpython" proxy.py >= 2.4.4rc5 pytest @@ -15,3 +16,4 @@ re-assert setuptools-git trustme; platform_machine != "i686" # no 32-bit wheels wait-for-it +zlib_ng diff --git a/requirements/test.txt b/requirements/test.txt index 3b16120500c..4953cdbd09a 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -53,6 +53,8 @@ idna==3.4 # yarl iniconfig==2.1.0 # via pytest +isal==1.7.2 + # via -r requirements/test.in markdown-it-py==3.0.0 # via rich mdurl==0.1.2 @@ -140,3 +142,5 @@ wait-for-it==2.3.0 # via -r requirements/test.in yarl==1.19.0 # via -r requirements/runtime-deps.in +zlib_ng==0.5.1 + # via -r requirements/test.in diff --git a/tests/conftest.py b/tests/conftest.py index 5bca52fe67c..be763400f45 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -4,6 +4,7 @@ import socket import ssl import sys +import zlib from hashlib import md5, sha1, sha256 from pathlib import Path from tempfile import TemporaryDirectory @@ -11,10 +12,13 @@ from unittest import mock from uuid import uuid4 +import isal.isal_zlib import pytest +import zlib_ng.zlib_ng from blockbuster import blockbuster_ctx from aiohttp.client_proto import ResponseHandler +from aiohttp.compression_utils import ZLibBackend, ZLibBackendProtocol, set_zlib_backend from aiohttp.http import WS_KEY from aiohttp.test_utils import get_unused_port_socket, loop_context @@ -295,3 +299,15 @@ def unused_port_socket() -> Generator[socket.socket, None, None]: yield s finally: s.close() + + +@pytest.fixture(params=[zlib, zlib_ng.zlib_ng, isal.isal_zlib]) +def parametrize_zlib_backend( + request: pytest.FixtureRequest, +) -> Generator[None, None, None]: + original_backend: ZLibBackendProtocol = ZLibBackend._zlib_backend + set_zlib_backend(request.param) + + yield + + set_zlib_backend(original_backend) diff --git a/tests/test_client_functional.py b/tests/test_client_functional.py index 9ffe5f523f3..0ea3ce1619a 100644 --- a/tests/test_client_functional.py 
+++ b/tests/test_client_functional.py @@ -2040,6 +2040,7 @@ async def expect_handler(request): assert expect_called +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_encoding_deflate(aiohttp_client) -> None: async def handler(request): resp = web.Response(text="text") @@ -2058,6 +2059,7 @@ async def handler(request): resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_encoding_deflate_nochunk(aiohttp_client) -> None: async def handler(request): resp = web.Response(text="text") @@ -2075,6 +2077,7 @@ async def handler(request): resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_encoding_gzip(aiohttp_client) -> None: async def handler(request): resp = web.Response(text="text") @@ -2093,6 +2096,7 @@ async def handler(request): resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_encoding_gzip_write_by_chunks(aiohttp_client) -> None: async def handler(request): resp = web.StreamResponse() @@ -2113,6 +2117,7 @@ async def handler(request): resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_encoding_gzip_nochunk(aiohttp_client) -> None: async def handler(request): resp = web.Response(text="text") diff --git a/tests/test_client_request.py b/tests/test_client_request.py index f86ff5d7587..6454b42c89b 100644 --- a/tests/test_client_request.py +++ b/tests/test_client_request.py @@ -4,7 +4,6 @@ import pathlib import sys import urllib.parse -import zlib from http.cookies import BaseCookie, Morsel, SimpleCookie from typing import Any, Callable, Dict, Iterable, Optional from unittest import mock @@ -23,6 +22,7 @@ _gen_default_accept_encoding, _merge_ssl_params, ) +from aiohttp.compression_utils import ZLibBackend from aiohttp.http import HttpVersion10, HttpVersion11 from aiohttp.test_utils import make_mocked_coro @@ -800,6 +800,7 @@ async def test_bytes_data(loop, conn) -> None: resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_content_encoding(loop, conn) -> None: req = ClientRequest( "post", URL("http://python.org/"), data="foo", compress="deflate", loop=loop @@ -826,6 +827,7 @@ async def test_content_encoding_dont_set_headers_if_no_body(loop, conn) -> None: resp.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_content_encoding_header(loop, conn) -> None: req = ClientRequest( "post", @@ -925,8 +927,9 @@ async def test_file_upload_not_chunked(loop) -> None: await req.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_precompressed_data_stays_intact(loop) -> None: - data = zlib.compress(b"foobar") + data = ZLibBackend.compress(b"foobar") req = ClientRequest( "post", URL("http://python.org/"), diff --git a/tests/test_client_ws_functional.py b/tests/test_client_ws_functional.py index 0ca57ab3ab2..7b6bd032244 100644 --- a/tests/test_client_ws_functional.py +++ b/tests/test_client_ws_functional.py @@ -953,6 +953,7 @@ async def delayed_send_frame( assert cancelled is True +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_send_recv_compress(aiohttp_client: AiohttpClient) -> None: async def handler(request: web.Request) -> web.WebSocketResponse: ws = web.WebSocketResponse() @@ -978,6 +979,7 @@ async def handler(request: web.Request) -> web.WebSocketResponse: assert resp.get_extra_info("socket") is None +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_send_recv_compress_wbits(aiohttp_client) -> None: async def handler(request): ws = 
web.WebSocketResponse() diff --git a/tests/test_compression_utils.py b/tests/test_compression_utils.py index 047a4ff7cf0..fdaf91b36a0 100644 --- a/tests/test_compression_utils.py +++ b/tests/test_compression_utils.py @@ -1,22 +1,34 @@ """Tests for compression utils.""" -from aiohttp.compression_utils import ZLibCompressor, ZLibDecompressor +import pytest +from aiohttp.compression_utils import ZLibBackend, ZLibCompressor, ZLibDecompressor + +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_compression_round_trip_in_executor() -> None: """Ensure that compression and decompression work correctly in the executor.""" - compressor = ZLibCompressor(max_sync_chunk_size=1) + compressor = ZLibCompressor( + strategy=ZLibBackend.Z_DEFAULT_STRATEGY, max_sync_chunk_size=1 + ) + assert type(compressor._compressor) is type(ZLibBackend.compressobj()) decompressor = ZLibDecompressor(max_sync_chunk_size=1) + assert type(decompressor._decompressor) is type(ZLibBackend.decompressobj()) data = b"Hi" * 100 compressed_data = await compressor.compress(data) + compressor.flush() decompressed_data = await decompressor.decompress(compressed_data) assert data == decompressed_data +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_compression_round_trip_in_event_loop() -> None: """Ensure that compression and decompression work correctly in the event loop.""" - compressor = ZLibCompressor(max_sync_chunk_size=10000) + compressor = ZLibCompressor( + strategy=ZLibBackend.Z_DEFAULT_STRATEGY, max_sync_chunk_size=10000 + ) + assert type(compressor._compressor) is type(ZLibBackend.compressobj()) decompressor = ZLibDecompressor(max_sync_chunk_size=10000) + assert type(decompressor._decompressor) is type(ZLibBackend.decompressobj()) data = b"Hi" * 100 compressed_data = await compressor.compress(data) + compressor.flush() decompressed_data = await decompressor.decompress(compressed_data) diff --git a/tests/test_http_writer.py b/tests/test_http_writer.py index 420816b3137..7f813692571 100644 --- a/tests/test_http_writer.py +++ b/tests/test_http_writer.py @@ -2,7 +2,7 @@ import array import asyncio import zlib -from typing import Generator, Iterable +from typing import Generator, Iterable, Union from unittest import mock import pytest @@ -10,6 +10,7 @@ from aiohttp import ClientConnectionResetError, hdrs, http from aiohttp.base_protocol import BaseProtocol +from aiohttp.compression_utils import ZLibBackend from aiohttp.http_writer import _serialize_headers from aiohttp.test_utils import make_mocked_coro @@ -61,6 +62,26 @@ def protocol(loop, transport): return protocol +def decompress(data: bytes) -> bytes: + d = ZLibBackend.decompressobj() + return d.decompress(data) + + +def decode_chunked(chunked: Union[bytes, bytearray]) -> bytes: + i = 0 + out = b"" + while i < len(chunked): + j = chunked.find(b"\r\n", i) + assert j != -1, "Malformed chunk" + size = int(chunked[i:j], 16) + if size == 0: + break + i = j + 2 + out += chunked[i : i + size] + i += size + 2 # skip \r\n after the chunk + return out + + def test_payloadwriter_properties(transport, protocol, loop) -> None: writer = http.StreamWriter(protocol, loop) assert writer.protocol == protocol @@ -112,6 +133,7 @@ async def test_write_payload_length(protocol, transport, loop) -> None: @pytest.mark.usefixtures("disable_writelines") +@pytest.mark.internal # Used for performance benchmarking async def test_write_large_payload_deflate_compression_data_in_eof( protocol: BaseProtocol, transport: asyncio.Transport, @@ -137,7 +159,42 @@ async def 
test_write_large_payload_deflate_compression_data_in_eof( assert zlib.decompress(content) == (b"data" * 4096) + payload +@pytest.mark.usefixtures("disable_writelines") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_large_payload_deflate_compression_data_in_eof_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + + await msg.write(b"data" * 4096) + # Behavior depends on zlib backend, isal compress() returns b'' initially + # and the entire compressed bytes at flush() for this data + backend_to_write_called = { + "isal.isal_zlib": False, + "zlib": True, + "zlib_ng.zlib_ng": True, + } + assert transport.write.called == backend_to_write_called[ZLibBackend.name] # type: ignore[attr-defined] + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + transport.write.reset_mock() # type: ignore[attr-defined] + + # This payload compresses to 20447 bytes + payload = b"".join( + [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)] + ) + await msg.write_eof(payload) + chunks.extend([c[1][0] for c in list(transport.write.mock_calls)]) # type: ignore[attr-defined] + + assert all(chunks) + content = b"".join(chunks) + assert ZLibBackend.decompress(content) == (b"data" * 4096) + payload + + @pytest.mark.usefixtures("enable_writelines") +@pytest.mark.internal # Used for performance benchmarking async def test_write_large_payload_deflate_compression_data_in_eof_writelines( protocol: BaseProtocol, transport: asyncio.Transport, @@ -164,6 +221,43 @@ async def test_write_large_payload_deflate_compression_data_in_eof_writelines( assert zlib.decompress(content) == (b"data" * 4096) + payload +@pytest.mark.usefixtures("enable_writelines") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_large_payload_deflate_compression_data_in_eof_writelines_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + + await msg.write(b"data" * 4096) + # Behavior depends on zlib backend, isal compress() returns b'' initially + # and the entire compressed bytes at flush() for this data + backend_to_write_called = { + "isal.isal_zlib": False, + "zlib": True, + "zlib_ng.zlib_ng": True, + } + assert transport.write.called == backend_to_write_called[ZLibBackend.name] # type: ignore[attr-defined] + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + transport.write.reset_mock() # type: ignore[attr-defined] + assert not transport.writelines.called # type: ignore[attr-defined] + + # This payload compresses to 20447 bytes + payload = b"".join( + [bytes((*range(0, i), *range(i, 0, -1))) for i in range(255) for _ in range(64)] + ) + await msg.write_eof(payload) + assert transport.writelines.called != transport.write.called # type: ignore[attr-defined] + if transport.writelines.called: # type: ignore[attr-defined] + chunks.extend(transport.writelines.mock_calls[0][1][0]) # type: ignore[attr-defined] + else: # transport.write.called: # type: ignore[attr-defined] + chunks.extend([c[1][0] for c in list(transport.write.mock_calls)]) # type: ignore[attr-defined] + content = b"".join(chunks) + assert ZLibBackend.decompress(content) == (b"data" * 4096) + payload + + async def test_write_payload_chunked_filter( protocol: 
BaseProtocol, transport: asyncio.Transport, @@ -200,6 +294,7 @@ async def test_write_payload_chunked_filter_multiple_chunks( ) +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression(protocol, transport, loop) -> None: COMPRESSED = b"x\x9cKI,I\x04\x00\x04\x00\x01\x9b" write = transport.write = mock.Mock() @@ -214,6 +309,24 @@ async def test_write_payload_deflate_compression(protocol, transport, loop) -> N assert COMPRESSED == content.split(b"\r\n\r\n", 1)[-1] +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + await msg.write(b"data") + await msg.write_eof() + + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + assert all(chunks) + content = b"".join(chunks) + assert b"data" == decompress(content) + + +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression_chunked( protocol: BaseProtocol, transport: asyncio.Transport, @@ -232,8 +345,27 @@ async def test_write_payload_deflate_compression_chunked( assert content == expected +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_chunked_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + await msg.write(b"data") + await msg.write_eof() + + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + assert all(chunks) + content = b"".join(chunks) + assert b"data" == decompress(decode_chunked(content)) + + @pytest.mark.usefixtures("enable_writelines") @pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression_chunked_writelines( protocol: BaseProtocol, transport: asyncio.Transport, @@ -252,6 +384,27 @@ async def test_write_payload_deflate_compression_chunked_writelines( assert content == expected +@pytest.mark.usefixtures("enable_writelines") +@pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_chunked_writelines_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + await msg.write(b"data") + await msg.write_eof() + + chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)] # type: ignore[attr-defined] + assert all(chunks) + content = b"".join(chunks) + assert b"data" == decompress(decode_chunked(content)) + + +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_and_chunked( buf: bytearray, protocol: BaseProtocol, @@ -270,6 +423,25 @@ async def test_write_payload_deflate_and_chunked( assert thing == buf +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_and_chunked_all_zlib( + buf: bytearray, + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) 
+ msg.enable_compression("deflate") + msg.enable_chunking() + + await msg.write(b"da") + await msg.write(b"ta") + await msg.write_eof() + + assert b"data" == decompress(decode_chunked(buf)) + + +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression_chunked_data_in_eof( protocol: BaseProtocol, transport: asyncio.Transport, @@ -288,8 +460,27 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof( assert content == expected +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_chunked_data_in_eof_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + await msg.write(b"data") + await msg.write_eof(b"end") + + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + assert all(chunks) + content = b"".join(chunks) + assert b"dataend" == decompress(decode_chunked(content)) + + @pytest.mark.usefixtures("enable_writelines") @pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines( protocol: BaseProtocol, transport: asyncio.Transport, @@ -308,6 +499,27 @@ async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines( assert content == expected +@pytest.mark.usefixtures("enable_writelines") +@pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_chunked_data_in_eof_writelines_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + await msg.write(b"data") + await msg.write_eof(b"end") + + chunks = [b"".join(c[1][0]) for c in list(transport.writelines.mock_calls)] # type: ignore[attr-defined] + assert all(chunks) + content = b"".join(chunks) + assert b"dataend" == decompress(decode_chunked(content)) + + +@pytest.mark.internal # Used for performance benchmarking async def test_write_large_payload_deflate_compression_chunked_data_in_eof( protocol: BaseProtocol, transport: asyncio.Transport, @@ -334,8 +546,36 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof( assert zlib.decompress(content) == (b"data" * 4096) + payload +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_large_payload_deflate_compression_chunked_data_in_eof_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + + await msg.write(b"data" * 4096) + # This payload compresses to 1111 bytes + payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)]) + await msg.write_eof(payload) + + compressed = [] + chunks = [c[1][0] for c in list(transport.write.mock_calls)] # type: ignore[attr-defined] + chunked_body = b"".join(chunks) + split_body = chunked_body.split(b"\r\n") + while split_body: + if split_body.pop(0): + compressed.append(split_body.pop(0)) + + content = b"".join(compressed) + assert ZLibBackend.decompress(content) == (b"data" * 4096) + payload + + 
@pytest.mark.usefixtures("enable_writelines") @pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.internal # Used for performance benchmarking async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines( protocol: BaseProtocol, transport: asyncio.Transport, @@ -362,6 +602,36 @@ async def test_write_large_payload_deflate_compression_chunked_data_in_eof_write assert zlib.decompress(content) == (b"data" * 4096) + payload +@pytest.mark.usefixtures("enable_writelines") +@pytest.mark.usefixtures("force_writelines_small_payloads") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_large_payload_deflate_compression_chunked_data_in_eof_writelines_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + + await msg.write(b"data" * 4096) + # This payload compresses to 1111 bytes + payload = b"".join([bytes((*range(0, i), *range(i, 0, -1))) for i in range(255)]) + await msg.write_eof(payload) + assert not transport.write.called # type: ignore[attr-defined] + + chunks = [] + for write_lines_call in transport.writelines.mock_calls: # type: ignore[attr-defined] + chunked_payload = list(write_lines_call[1][0])[1:] + chunked_payload.pop() + chunks.extend(chunked_payload) + + assert all(chunks) + content = b"".join(chunks) + assert ZLibBackend.decompress(content) == (b"data" * 4096) + payload + + +@pytest.mark.internal # Used for performance benchmarking async def test_write_payload_deflate_compression_chunked_connection_lost( protocol: BaseProtocol, transport: asyncio.Transport, @@ -380,6 +650,25 @@ async def test_write_payload_deflate_compression_chunked_connection_lost( await msg.write_eof(b"end") +@pytest.mark.usefixtures("parametrize_zlib_backend") +async def test_write_payload_deflate_compression_chunked_connection_lost_all_zlib( + protocol: BaseProtocol, + transport: asyncio.Transport, + loop: asyncio.AbstractEventLoop, +) -> None: + msg = http.StreamWriter(protocol, loop) + msg.enable_compression("deflate") + msg.enable_chunking() + await msg.write(b"data") + with ( + pytest.raises( + ClientConnectionResetError, match="Cannot write to closing transport" + ), + mock.patch.object(transport, "is_closing", return_value=True), + ): + await msg.write_eof(b"end") + + async def test_write_payload_bytes_memoryview( buf: bytearray, protocol: BaseProtocol, diff --git a/tests/test_multipart.py b/tests/test_multipart.py index 8576998962e..b0ca92fde9e 100644 --- a/tests/test_multipart.py +++ b/tests/test_multipart.py @@ -3,13 +3,13 @@ import json import pathlib import sys -import zlib from unittest import mock import pytest import aiohttp from aiohttp import payload +from aiohttp.compression_utils import ZLibBackend from aiohttp.hdrs import ( CONTENT_DISPOSITION, CONTENT_ENCODING, @@ -1190,6 +1190,7 @@ async def test_writer_write_no_parts(buf, stream, writer) -> None: assert b"--:--\r\n" == bytes(buf) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_writer_serialize_with_content_encoding_gzip(buf, stream, writer): writer.append("Time to Relax!", {CONTENT_ENCODING: "gzip"}) await writer.write(stream) @@ -1200,7 +1201,7 @@ async def test_writer_serialize_with_content_encoding_gzip(buf, stream, writer): b"Content-Encoding: gzip" == headers ) - decompressor = zlib.decompressobj(wbits=16 + zlib.MAX_WBITS) + decompressor = ZLibBackend.decompressobj(wbits=16 + 
ZLibBackend.MAX_WBITS) data = decompressor.decompress(message.split(b"\r\n")[0]) data += decompressor.flush() assert b"Time to Relax!" == data diff --git a/tests/test_web_functional.py b/tests/test_web_functional.py index 47189f7460b..9cc05a08426 100644 --- a/tests/test_web_functional.py +++ b/tests/test_web_functional.py @@ -4,8 +4,7 @@ import pathlib import socket import sys -import zlib -from typing import Any, NoReturn, Optional +from typing import Any, Dict, Generator, NoReturn, Optional, Tuple from unittest import mock import pytest @@ -22,6 +21,7 @@ multipart, web, ) +from aiohttp.compression_utils import ZLibBackend, ZLibCompressObjProtocol from aiohttp.hdrs import CONTENT_LENGTH, CONTENT_TYPE, TRANSFER_ENCODING from aiohttp.pytest_plugin import AiohttpClient from aiohttp.test_utils import make_mocked_coro @@ -1134,19 +1134,30 @@ async def handler(request): await resp.release() -@pytest.mark.parametrize( - "compressor,encoding", - [ - (zlib.compressobj(wbits=16 + zlib.MAX_WBITS), "gzip"), - (zlib.compressobj(wbits=zlib.MAX_WBITS), "deflate"), - # Actually, wrong compression format, but - # should be supported for some legacy cases. - (zlib.compressobj(wbits=-zlib.MAX_WBITS), "deflate"), - ], -) +@pytest.fixture(params=["gzip", "deflate", "deflate-raw"]) +def compressor_case( + request: pytest.FixtureRequest, + parametrize_zlib_backend: None, +) -> Generator[Tuple[ZLibCompressObjProtocol, str], None, None]: + encoding: str = request.param + max_wbits: int = ZLibBackend.MAX_WBITS + + encoding_to_wbits: Dict[str, int] = { + "deflate": max_wbits, + "deflate-raw": -max_wbits, + "gzip": 16 + max_wbits, + } + + compressor = ZLibBackend.compressobj(wbits=encoding_to_wbits[encoding]) + yield (compressor, "deflate" if encoding.startswith("deflate") else encoding) + + async def test_response_with_precompressed_body( - aiohttp_client, compressor, encoding + aiohttp_client: AiohttpClient, + compressor_case: Tuple[ZLibCompressObjProtocol, str], ) -> None: + compressor, encoding = compressor_case + async def handler(request): headers = {"Content-Encoding": encoding} data = compressor.compress(b"mydata") + compressor.flush() @@ -2189,6 +2200,7 @@ async def handler(request): @pytest.mark.parametrize( "auto_decompress,len_of", [(True, "uncompressed"), (False, "compressed")] ) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_auto_decompress( aiohttp_client, auto_decompress, @@ -2203,7 +2215,7 @@ async def handler(request): client = await aiohttp_client(app) uncompressed = b"dataaaaaaaaaaaaaaaaaaaaaaaaa" - compressor = zlib.compressobj(wbits=16 + zlib.MAX_WBITS) + compressor = ZLibBackend.compressobj(wbits=16 + ZLibBackend.MAX_WBITS) compressed = compressor.compress(uncompressed) + compressor.flush() assert len(compressed) != len(uncompressed) headers = {"content-encoding": "gzip"} diff --git a/tests/test_web_response.py b/tests/test_web_response.py index 54176ea661b..b7758f46baa 100644 --- a/tests/test_web_response.py +++ b/tests/test_web_response.py @@ -4,7 +4,6 @@ import io import json import sys -import zlib from concurrent.futures import ThreadPoolExecutor from typing import AsyncIterator, Optional from unittest import mock @@ -417,6 +416,7 @@ async def test_chunked_encoding_forbidden_for_http_10() -> None: assert Matches("Using chunked encoding is forbidden for HTTP/1.0") == str(ctx.value) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_compression_no_accept() -> None: req = make_request("GET", "/") resp = StreamResponse() @@ -458,6 +458,7 @@ async def 
test_force_compression_false_backwards_compat() -> None: assert not msg.enable_compression.called +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_compression_default_coding() -> None: req = make_request( "GET", "/", headers=CIMultiDict({hdrs.ACCEPT_ENCODING: "gzip, deflate"}) @@ -471,11 +472,12 @@ async def test_compression_default_coding() -> None: msg = await resp.prepare(req) - msg.enable_compression.assert_called_with("deflate", zlib.Z_DEFAULT_STRATEGY) + msg.enable_compression.assert_called_with("deflate", None) assert "deflate" == resp.headers.get(hdrs.CONTENT_ENCODING) assert msg.filter is not None +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_deflate() -> None: req = make_request( "GET", "/", headers=CIMultiDict({hdrs.ACCEPT_ENCODING: "gzip, deflate"}) @@ -486,10 +488,12 @@ async def test_force_compression_deflate() -> None: assert resp.compression msg = await resp.prepare(req) - msg.enable_compression.assert_called_with("deflate", zlib.Z_DEFAULT_STRATEGY) + assert msg is not None + msg.enable_compression.assert_called_with("deflate", None) assert "deflate" == resp.headers.get(hdrs.CONTENT_ENCODING) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_deflate_large_payload() -> None: """Make sure a warning is thrown for large payloads compressed in the event loop.""" req = make_request( @@ -509,6 +513,7 @@ async def test_force_compression_deflate_large_payload() -> None: assert "deflate" == resp.headers.get(hdrs.CONTENT_ENCODING) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_no_accept_deflate() -> None: req = make_request("GET", "/") resp = StreamResponse() @@ -517,10 +522,12 @@ async def test_force_compression_no_accept_deflate() -> None: assert resp.compression msg = await resp.prepare(req) - msg.enable_compression.assert_called_with("deflate", zlib.Z_DEFAULT_STRATEGY) + assert msg is not None + msg.enable_compression.assert_called_with("deflate", None) assert "deflate" == resp.headers.get(hdrs.CONTENT_ENCODING) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_gzip() -> None: req = make_request( "GET", "/", headers=CIMultiDict({hdrs.ACCEPT_ENCODING: "gzip, deflate"}) @@ -531,10 +538,12 @@ async def test_force_compression_gzip() -> None: assert resp.compression msg = await resp.prepare(req) - msg.enable_compression.assert_called_with("gzip", zlib.Z_DEFAULT_STRATEGY) + assert msg is not None + msg.enable_compression.assert_called_with("gzip", None) assert "gzip" == resp.headers.get(hdrs.CONTENT_ENCODING) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_no_accept_gzip() -> None: req = make_request("GET", "/") resp = StreamResponse() @@ -543,10 +552,12 @@ async def test_force_compression_no_accept_gzip() -> None: assert resp.compression msg = await resp.prepare(req) - msg.enable_compression.assert_called_with("gzip", zlib.Z_DEFAULT_STRATEGY) + assert msg is not None + msg.enable_compression.assert_called_with("gzip", None) assert "gzip" == resp.headers.get(hdrs.CONTENT_ENCODING) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_change_content_threaded_compression_enabled() -> None: req = make_request("GET", "/") body_thread_size = 1024 @@ -558,6 +569,7 @@ async def test_change_content_threaded_compression_enabled() -> None: assert gzip.decompress(resp._compressed_body) == body +@pytest.mark.usefixtures("parametrize_zlib_backend") async def 
test_change_content_threaded_compression_enabled_explicit() -> None: req = make_request("GET", "/") body_thread_size = 1024 @@ -572,6 +584,7 @@ async def test_change_content_threaded_compression_enabled_explicit() -> None: assert gzip.decompress(resp._compressed_body) == body +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_change_content_length_if_compression_enabled() -> None: req = make_request("GET", "/") resp = Response(body=b"answer") @@ -581,6 +594,7 @@ async def test_change_content_length_if_compression_enabled() -> None: assert resp.content_length is not None and resp.content_length != len(b"answer") +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_set_content_length_if_compression_enabled() -> None: writer = mock.Mock() @@ -600,6 +614,7 @@ async def write_headers(status_line, headers): assert resp.content_length == 26 +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_remove_content_length_if_compression_enabled_http11() -> None: writer = mock.Mock() @@ -616,6 +631,7 @@ async def write_headers(status_line, headers): assert resp.content_length is None +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_remove_content_length_if_compression_enabled_http10() -> None: writer = mock.Mock() @@ -632,6 +648,7 @@ async def write_headers(status_line, headers): assert resp.content_length is None +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_identity() -> None: writer = mock.Mock() @@ -648,6 +665,7 @@ async def write_headers(status_line, headers): assert resp.content_length == 123 +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_force_compression_identity_response() -> None: writer = mock.Mock() @@ -663,6 +681,7 @@ async def write_headers(status_line, headers): assert resp.content_length == 6 +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_rm_content_length_if_compression_http11() -> None: writer = mock.Mock() @@ -680,6 +699,7 @@ async def write_headers(status_line, headers): assert resp.content_length is None +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_rm_content_length_if_compression_http10() -> None: writer = mock.Mock() diff --git a/tests/test_web_sendfile_functional.py b/tests/test_web_sendfile_functional.py index 256cf4d243a..fc4db06a307 100644 --- a/tests/test_web_sendfile_functional.py +++ b/tests/test_web_sendfile_functional.py @@ -3,7 +3,6 @@ import gzip import pathlib import socket -import zlib from typing import Any, Iterable, Optional from unittest import mock @@ -11,6 +10,7 @@ import aiohttp from aiohttp import web +from aiohttp.compression_utils import ZLibBackend try: import brotlicffi as brotli @@ -300,6 +300,7 @@ async def handler(request): [("gzip, deflate", "gzip"), ("gzip, deflate, br", "br")], ) @pytest.mark.parametrize("forced_compression", [None, web.ContentCoding.gzip]) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_static_file_with_encoding_and_enable_compression( hello_txt: pathlib.Path, aiohttp_client: Any, @@ -1047,6 +1048,7 @@ async def test_static_file_if_range_invalid_date( await client.close() +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_static_file_compression(aiohttp_client, sender) -> None: filepath = pathlib.Path(__file__).parent / "data.unknown_mime_type" @@ -1061,7 +1063,7 @@ async def handler(request): resp = await client.get("/") assert resp.status == 200 - zcomp = zlib.compressobj(wbits=zlib.MAX_WBITS) + zcomp = 
ZLibBackend.compressobj(wbits=ZLibBackend.MAX_WBITS) expected_body = zcomp.compress(b"file content\n") + zcomp.flush() assert expected_body == await resp.read() assert "application/octet-stream" == resp.headers["Content-Type"] diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 7f8b98d4566..2cac4cf6b87 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -2,7 +2,6 @@ import pickle import random import struct -import zlib from typing import Union from unittest import mock @@ -20,6 +19,7 @@ from aiohttp._websocket.models import WS_DEFLATE_TRAILING from aiohttp._websocket.reader import WebSocketDataQueue from aiohttp.base_protocol import BaseProtocol +from aiohttp.compression_utils import ZLibBackend from aiohttp.http import WebSocketError, WSCloseCode, WSMessage, WSMsgType from aiohttp.http_websocket import WebSocketReader @@ -29,13 +29,15 @@ class PatchableWebSocketReader(WebSocketReader): def build_frame( - message, opcode, use_mask=False, noheader=False, is_fin=True, compress=False + message, opcode, use_mask=False, noheader=False, is_fin=True, ZLibBackend=None ): # Send a frame over the websocket with message as its payload. - if compress: - compressobj = zlib.compressobj(wbits=-9) + compress = False + if ZLibBackend: + compress = True + compressobj = ZLibBackend.compressobj(wbits=-9) message = compressobj.compress(message) - message = message + compressobj.flush(zlib.Z_SYNC_FLUSH) + message = message + compressobj.flush(ZLibBackend.Z_SYNC_FLUSH) if message.endswith(WS_DEFLATE_TRAILING): message = message[:-4] msg_length = len(message) @@ -545,6 +547,7 @@ def test_parse_compress_error_frame(parser) -> None: assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_parse_no_compress_frame_single( loop: asyncio.AbstractEventLoop, out: WebSocketDataQueue ) -> None: @@ -574,7 +577,7 @@ def test_msg_too_large_not_fin(out) -> None: def test_compressed_msg_too_large(out) -> None: parser = WebSocketReader(out, 256, compress=True) - data = build_frame(b"aaa" * 256, WSMsgType.TEXT, compress=True) + data = build_frame(b"aaa" * 256, WSMsgType.TEXT, ZLibBackend=ZLibBackend) with pytest.raises(WebSocketError) as ctx: parser._feed_data(data) assert ctx.value.code == WSCloseCode.MESSAGE_TOO_BIG diff --git a/tests/test_websocket_writer.py b/tests/test_websocket_writer.py index 77eaa2f74b8..b39e411f90d 100644 --- a/tests/test_websocket_writer.py +++ b/tests/test_websocket_writer.py @@ -118,6 +118,7 @@ async def test_send_compress_text_per_message(protocol, transport) -> None: (32, lambda count: 64 + count if count % 2 else count), ), ) +@pytest.mark.usefixtures("parametrize_zlib_backend") async def test_concurrent_messages( protocol: Any, transport: Any, From 798648a408961a995bb8ee009495e4dccd6b181d Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Mon, 14 Apr 2025 11:09:22 -1000 Subject: [PATCH 14/37] =?UTF-8?q?[PR=20#10706/db6faf75=20backport][3.12]?= =?UTF-8?q?=20docs/client=5Freference.rst,=20attribute=20name=20from=20tra?= =?UTF-8?q?ce=5Fconfig=20to=20trace=5F=E2=80=A6=20(#10708)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: alegtk <115006266+alegtk@users.noreply.github.com> --- docs/client_reference.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/client_reference.rst b/docs/client_reference.rst index 8d01d726e1c..aa664b24ff4 100644 --- 
a/docs/client_reference.rst +++ b/docs/client_reference.rst @@ -368,7 +368,7 @@ The client session supports the context manager protocol for self closing. .. versionadded:: 3.7 - .. attribute:: trace_config + .. attribute:: trace_configs A list of :class:`TraceConfig` instances used for client tracing. ``None`` (default) is used for request tracing From 98add82d7b9eddd88b8ff60e3783413750db9274 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Mon, 14 Apr 2025 11:09:34 -1000 Subject: [PATCH 15/37] =?UTF-8?q?[PR=20#10706/db6faf75=20backport][3.11]?= =?UTF-8?q?=20docs/client=5Freference.rst,=20attribute=20name=20from=20tra?= =?UTF-8?q?ce=5Fconfig=20to=20trace=5F=E2=80=A6=20(#10707)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-authored-by: alegtk <115006266+alegtk@users.noreply.github.com> --- docs/client_reference.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/client_reference.rst b/docs/client_reference.rst index 26537161971..130ba6cc336 100644 --- a/docs/client_reference.rst +++ b/docs/client_reference.rst @@ -364,7 +364,7 @@ The client session supports the context manager protocol for self closing. .. versionadded:: 3.7 - .. attribute:: trace_config + .. attribute:: trace_configs A list of :class:`TraceConfig` instances used for client tracing. ``None`` (default) is used for request tracing From ce76d11151fa7453e3a8a456631b7525cdd5cee6 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Mon, 14 Apr 2025 23:19:16 +0000 Subject: [PATCH 16/37] [PR #10721/d912123c backport][3.12] Update zlib benchmarks for multiple zlib backends (#10724) Co-authored-by: J. Nick Koston --- tests/test_benchmarks_client.py | 2 ++ tests/test_benchmarks_http_websocket.py | 2 ++ 2 files changed, 4 insertions(+) diff --git a/tests/test_benchmarks_client.py b/tests/test_benchmarks_client.py index aa3536be820..ef2a4d88c92 100644 --- a/tests/test_benchmarks_client.py +++ b/tests/test_benchmarks_client.py @@ -2,6 +2,7 @@ import asyncio +import pytest from pytest_codspeed import BenchmarkFixture from aiohttp import hdrs, web @@ -178,6 +179,7 @@ def _run() -> None: loop.run_until_complete(run_client_benchmark()) +@pytest.mark.usefixtures("parametrize_zlib_backend") def test_get_request_with_251308_compressed_chunked_payload( loop: asyncio.AbstractEventLoop, aiohttp_client: AiohttpClient, diff --git a/tests/test_benchmarks_http_websocket.py b/tests/test_benchmarks_http_websocket.py index 7ff04199d24..8e6a8bb7bb9 100644 --- a/tests/test_benchmarks_http_websocket.py +++ b/tests/test_benchmarks_http_websocket.py @@ -3,6 +3,7 @@ import asyncio from typing import Union +import pytest from pytest_codspeed import BenchmarkFixture from aiohttp._websocket.helpers import MSG_SIZE, PACK_LEN3 @@ -117,6 +118,7 @@ def _run() -> None: loop.run_until_complete(_send_one_hundred_websocket_text_messages()) +@pytest.mark.usefixtures("parametrize_zlib_backend") def test_send_one_hundred_websocket_compressed_messages( loop: asyncio.AbstractEventLoop, benchmark: BenchmarkFixture ) -> None: From 538938d0b1c220350c10d435e4e4a20e97a2fed4 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 17 Apr 2025 10:50:30 +0000 Subject: [PATCH 17/37] Bump yarl from 1.19.0 to 1.20.0 (#10727) Bumps [yarl](https://github.com/aio-libs/yarl) from 1.19.0 to 1.20.0.
Release notes

Sourced from yarl's releases.

1.20.0

Features

  • Implemented support for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: #1456.

Packaging updates and notes for downstreams

  • Started building wheels for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: #1456.


Changelog

Sourced from yarl's changelog.

1.20.0

(2025-04-16)

Features

  • Implemented support for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou. (A short runtime-detection sketch follows these notes.)

    Related issues and pull requests on GitHub: :issue:1456.

Packaging updates and notes for downstreams

  • Started building wheels for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: :issue:1456.


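The yarl notes above concern the free-threaded (no-GIL) build of CPython 3.13. As a hedged aside, here is a minimal sketch of how downstream code can detect such a build at runtime; the helper name `is_free_threaded` is ours, not yarl's:

    import sys
    import sysconfig

    def is_free_threaded() -> bool:
        """Return True on a free-threaded (no-GIL) CPython build."""
        # Py_GIL_DISABLED is 1 on free-threaded builds, 0 or None elsewhere.
        return bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

    if is_free_threaded() and hasattr(sys, "_is_gil_enabled"):
        # Even on a free-threaded build the GIL can be re-enabled at
        # runtime, e.g. when an incompatible C extension is imported.
        print("GIL currently enabled:", sys._is_gil_enabled())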
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 10 +++++++++- requirements/dev.txt | 12 ++++++++---- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 4 ++-- 5 files changed, 21 insertions(+), 9 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 08beaa66522..7f8e2a2a20a 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -44,5 +44,5 @@ typing-extensions==4.13.1 # via multidict uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via -r requirements/base.in -yarl==1.19.0 +yarl==1.20.0 # via -r requirements/runtime-deps.in diff --git a/requirements/constraints.txt b/requirements/constraints.txt index e8a2d85b2bb..79097789543 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -101,6 +101,10 @@ incremental==24.7.2 # via towncrier iniconfig==2.1.0 # via pytest +isal==1.7.2 + # via + # -r requirements/lint.in + # -r requirements/test.in jinja2==3.1.6 # via # sphinx @@ -285,8 +289,12 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.45.1 # via pip-tools -yarl==1.19.0 +yarl==1.20.0 # via -r requirements/runtime-deps.in +zlib-ng==0.5.1 + # via + # -r requirements/lint.in + # -r requirements/test.in # The following packages are considered to be unsafe in a requirements file: pip==25.0.1 diff --git a/requirements/dev.txt b/requirements/dev.txt index 90d5c88acb5..2f6ce6afcb6 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -100,7 +100,9 @@ incremental==24.7.2 iniconfig==2.1.0 # via pytest isal==1.7.2 - # via -r requirements/test.in + # via + # -r requirements/lint.in + # -r requirements/test.in jinja2==3.1.6 # via # sphinx @@ -278,10 +280,12 @@ wait-for-it==2.3.0 # via -r requirements/test.in wheel==0.45.1 # via pip-tools -yarl==1.19.0 +yarl==1.20.0 # via -r requirements/runtime-deps.in -zlib_ng==0.5.1 - # via -r requirements/test.in +zlib-ng==0.5.1 + # via + # -r requirements/lint.in + # -r requirements/test.in # The following packages are considered to be unsafe in a requirements file: pip==25.0.1 diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 3fcc493e191..1d68b4cdc19 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -38,5 +38,5 @@ pycparser==2.22 # via cffi typing-extensions==4.13.1 # via multidict -yarl==1.19.0 +yarl==1.20.0 # via -r requirements/runtime-deps.in diff --git a/requirements/test.txt b/requirements/test.txt index 4953cdbd09a..7196f9bb4db 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -140,7 +140,7 @@ uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpytho # via -r requirements/base.in wait-for-it==2.3.0 # via -r requirements/test.in -yarl==1.19.0 +yarl==1.20.0 # via -r requirements/runtime-deps.in -zlib_ng==0.5.1 +zlib-ng==0.5.1 # via -r requirements/test.in From 98a44cccebb2ba841b6dbc4d957fcf5ef33d662f Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 17 Apr 2025 11:08:14 +0000 Subject: [PATCH 18/37] Bump typing-extensions from 4.13.1 to 4.13.2 (#10718) Bumps [typing-extensions](https://github.com/python/typing_extensions) from 4.13.1 to 4.13.2.
Release notes

Sourced from typing-extensions's releases.

4.13.2

  • Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu.
  • Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.
Changelog

Sourced from typing-extensions's changelog.

Release 4.13.2 (April 10, 2025)

  • Fix TypeError when taking the union of typing_extensions.TypeAliasType and a typing.TypeAliasType on Python 3.12 and 3.13. Patch by Joren Hammudoglu. (A minimal reproduction follows these notes.)
  • Backport from CPython PR #132160 to avoid having user arguments shadowed in generated __new__ by @typing_extensions.deprecated. Patch by Victorien Plot.
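A minimal reproduction of the fix described above, assuming Python 3.12+ with typing-extensions 4.13.2 installed (alias names are illustrative):

    import typing
    import typing_extensions

    # One alias from each module; before typing-extensions 4.13.2 this
    # union raised TypeError on CPython 3.12 and 3.13.
    ExtAlias = typing_extensions.TypeAliasType("ExtAlias", int)
    StdAlias = typing.TypeAliasType("StdAlias", str)

    Combined = ExtAlias | StdAlias  # now builds a normal Union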
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/lint.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 8 files changed, 8 insertions(+), 8 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 7f8e2a2a20a..5c59f913f4e 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -40,7 +40,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via multidict uvloop==0.21.0 ; platform_system != "Windows" and implementation_name == "cpython" # via -r requirements/base.in diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 79097789543..21021de9034 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -262,7 +262,7 @@ trustme==1.2.1 ; platform_machine != "i686" # via # -r requirements/lint.in # -r requirements/test.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via # multidict # mypy diff --git a/requirements/cython.txt b/requirements/cython.txt index d5661f8fff3..8686651881b 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -8,5 +8,5 @@ cython==3.0.12 # via -r requirements/cython.in multidict==6.4.3 # via -r requirements/multidict.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index 2f6ce6afcb6..a1b87b493ae 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -253,7 +253,7 @@ trustme==1.2.1 ; platform_machine != "i686" # via # -r requirements/lint.in # -r requirements/test.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via # multidict # mypy diff --git a/requirements/lint.txt b/requirements/lint.txt index b53cccca9f6..4e9689f1d5e 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -97,7 +97,7 @@ tomli==2.2.1 # slotscheck trustme==1.2.1 # via -r requirements/lint.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via # mypy # pydantic diff --git a/requirements/multidict.txt b/requirements/multidict.txt index 64a6ea16b87..41435a67142 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -6,5 +6,5 @@ # multidict==6.4.3 # via -r requirements/multidict.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index 1d68b4cdc19..f8fab0f177a 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -36,7 +36,7 @@ pycares==4.5.0 # via aiodns pycparser==2.22 # via cffi -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via multidict yarl==1.20.0 # via -r requirements/runtime-deps.in diff --git a/requirements/test.txt b/requirements/test.txt index 7196f9bb4db..be63bafac53 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -125,7 +125,7 @@ tomli==2.2.1 # pytest trustme==1.2.1 ; platform_machine != "i686" # via -r requirements/test.in -typing-extensions==4.13.1 +typing-extensions==4.13.2 # via # multidict # mypy From d612e83e6a2607522d3f342f852dbc8d80ae58bd Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Thu, 17 Apr 2025 11:26:04 +0000 Subject: [PATCH 19/37] Bump pycares from 4.5.0 to 4.6.0 (#10702) Bumps 
[pycares](https://github.com/saghul/pycares) from 4.5.0 to 4.6.0.
Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/lint.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 6 files changed, 6 insertions(+), 6 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index 5c59f913f4e..e7dfdd67a62 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -36,7 +36,7 @@ propcache==0.3.1 # via # -r requirements/runtime-deps.in # yarl -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 21021de9034..4332afac2e6 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -148,7 +148,7 @@ propcache==0.3.1 # yarl proxy-py==2.4.10 # via -r requirements/test.in -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi diff --git a/requirements/dev.txt b/requirements/dev.txt index a1b87b493ae..ba62db63f1d 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -145,7 +145,7 @@ propcache==0.3.1 # yarl proxy-py==2.4.10 # via -r requirements/test.in -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi diff --git a/requirements/lint.txt b/requirements/lint.txt index 4e9689f1d5e..ab419411f50 100644 --- a/requirements/lint.txt +++ b/requirements/lint.txt @@ -59,7 +59,7 @@ pluggy==1.5.0 # via pytest pre-commit==4.2.0 # via -r requirements/lint.in -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index f8fab0f177a..da7a66e9a38 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -32,7 +32,7 @@ propcache==0.3.1 # via # -r requirements/runtime-deps.in # yarl -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi diff --git a/requirements/test.txt b/requirements/test.txt index be63bafac53..ea0360d111d 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -79,7 +79,7 @@ propcache==0.3.1 # yarl proxy-py==2.4.10 # via -r requirements/test.in -pycares==4.5.0 +pycares==4.6.0 # via aiodns pycparser==2.22 # via cffi From e55937532f764b85a4e3df94ed62068101bdd177 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Thu, 17 Apr 2025 23:07:21 +0000 Subject: [PATCH 20/37] [PR #10730/0b9d3571 backport][3.12] Add benchmarks for large binary WebSocket message roundtrips (#10732) Co-authored-by: J. 
Nick Koston --- tests/test_benchmarks_client_ws.py | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/tests/test_benchmarks_client_ws.py b/tests/test_benchmarks_client_ws.py index 6d4cf309cad..c244d33f6bd 100644 --- a/tests/test_benchmarks_client_ws.py +++ b/tests/test_benchmarks_client_ws.py @@ -2,6 +2,7 @@ import asyncio +import pytest from pytest_codspeed import BenchmarkFixture from aiohttp import web @@ -40,19 +41,22 @@ def _run() -> None: loop.run_until_complete(run_websocket_benchmark()) +@pytest.mark.parametrize("msg_size", [6, MSG_SIZE * 4], ids=["small", "large"]) def test_one_thousand_round_trip_websocket_binary_messages( loop: asyncio.AbstractEventLoop, aiohttp_client: AiohttpClient, benchmark: BenchmarkFixture, + msg_size: int, ) -> None: """Benchmark round trip of 1000 WebSocket binary messages.""" message_count = 1000 + raw_message = b"x" * msg_size async def handler(request: web.Request) -> web.WebSocketResponse: ws = web.WebSocketResponse() await ws.prepare(request) for _ in range(message_count): - await ws.send_bytes(b"answer") + await ws.send_bytes(raw_message) await ws.close() return ws From 377a1f89551b29c85b795a57b9e8891c519f4507 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Thu, 17 Apr 2025 23:13:46 +0000 Subject: [PATCH 21/37] [PR #10730/0b9d3571 backport][3.11] Add benchmarks for large binary WebSocket message roundtrips (#10731) Co-authored-by: J. Nick Koston --- tests/test_benchmarks_client_ws.py | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/tests/test_benchmarks_client_ws.py b/tests/test_benchmarks_client_ws.py index 6d4cf309cad..c244d33f6bd 100644 --- a/tests/test_benchmarks_client_ws.py +++ b/tests/test_benchmarks_client_ws.py @@ -2,6 +2,7 @@ import asyncio +import pytest from pytest_codspeed import BenchmarkFixture from aiohttp import web @@ -40,19 +41,22 @@ def _run() -> None: loop.run_until_complete(run_websocket_benchmark()) +@pytest.mark.parametrize("msg_size", [6, MSG_SIZE * 4], ids=["small", "large"]) def test_one_thousand_round_trip_websocket_binary_messages( loop: asyncio.AbstractEventLoop, aiohttp_client: AiohttpClient, benchmark: BenchmarkFixture, + msg_size: int, ) -> None: """Benchmark round trip of 1000 WebSocket binary messages.""" message_count = 1000 + raw_message = b"x" * msg_size async def handler(request: web.Request) -> web.WebSocketResponse: ws = web.WebSocketResponse() await ws.prepare(request) for _ in range(message_count): - await ws.send_bytes(b"answer") + await ws.send_bytes(raw_message) await ws.close() return ws From 07c437218022fd5130591c44e4fcc04478948aee Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Fri, 18 Apr 2025 11:03:21 +0000 Subject: [PATCH 22/37] [PR #10714/75bbc03e backport][3.11] Only fetch SSLContext and peername once per connection (#10734) Co-authored-by: J. Nick Koston --- CHANGES/10714.misc.rst | 1 + aiohttp/test_utils.py | 4 ++++ aiohttp/web_protocol.py | 25 +++++++++++++++++++++++++ aiohttp/web_request.py | 6 ++---- 4 files changed, 32 insertions(+), 4 deletions(-) create mode 100644 CHANGES/10714.misc.rst diff --git a/CHANGES/10714.misc.rst b/CHANGES/10714.misc.rst new file mode 100644 index 00000000000..a36a80872f5 --- /dev/null +++ b/CHANGES/10714.misc.rst @@ -0,0 +1 @@ +Improved web server performance when connection can be reused -- by :user:`bdraco`. 
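The backport below caches the transport's SSL context and peer name on the protocol with propcache's under_cached_property, which stores computed values in the instance's _cache dict. A minimal standalone sketch of that pattern (the Conn class and its values are invented for illustration):

    from typing import Any

    from propcache import under_cached_property

    class Conn:
        def __init__(self) -> None:
            # under_cached_property stores results here, keyed by property
            # name, so repeated access skips the getter entirely.
            self._cache: dict[str, Any] = {}

        @under_cached_property
        def peername(self) -> str:
            print("computed once")
            return "127.0.0.1:8080"

    conn = Conn()
    assert conn.peername == conn.peername  # getter runs only on first access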
diff --git a/aiohttp/test_utils.py b/aiohttp/test_utils.py index be6e9b3353e..87c31427867 100644 --- a/aiohttp/test_utils.py +++ b/aiohttp/test_utils.py @@ -730,6 +730,10 @@ def make_mocked_request( if protocol is sentinel: protocol = mock.Mock() protocol.transport = transport + type(protocol).peername = mock.PropertyMock( + return_value=transport.get_extra_info("peername") + ) + type(protocol).ssl_context = mock.PropertyMock(return_value=sslcontext) if writer is sentinel: writer = mock.Mock() diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py index 1dba9606ea0..a7d50195828 100644 --- a/aiohttp/web_protocol.py +++ b/aiohttp/web_protocol.py @@ -24,6 +24,7 @@ import attr import yarl +from propcache import under_cached_property from .abc import AbstractAccessLogger, AbstractStreamWriter from .base_protocol import BaseProtocol @@ -47,6 +48,8 @@ __all__ = ("RequestHandler", "RequestPayloadError", "PayloadAccessError") if TYPE_CHECKING: + import ssl + from .web_server import Server @@ -167,6 +170,7 @@ class RequestHandler(BaseProtocol): "_current_request", "_timeout_ceil_threshold", "_request_in_progress", + "_cache", ) def __init__( @@ -246,6 +250,7 @@ def __init__( self._close = False self._force_close = False self._request_in_progress = False + self._cache: dict[str, Any] = {} def __repr__(self) -> str: return "<{} {}>".format( @@ -253,6 +258,26 @@ def __repr__(self) -> str: "connected" if self.transport is not None else "disconnected", ) + @under_cached_property + def ssl_context(self) -> Optional["ssl.SSLContext"]: + """Return SSLContext if available.""" + return ( + None + if self.transport is None + else self.transport.get_extra_info("sslcontext") + ) + + @under_cached_property + def peername( + self, + ) -> Optional[Union[str, Tuple[str, int, int, int], Tuple[str, int]]]: + """Return peername if available.""" + return ( + None + if self.transport is None + else self.transport.get_extra_info("peername") + ) + @property def keepalive_timeout(self) -> float: return self._keepalive_timeout diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py index f11d49020a0..6bf5a9dea74 100644 --- a/aiohttp/web_request.py +++ b/aiohttp/web_request.py @@ -198,10 +198,8 @@ def __init__( self._client_max_size = client_max_size self._loop = loop - transport = protocol.transport - assert transport is not None - self._transport_sslcontext = transport.get_extra_info("sslcontext") - self._transport_peername = transport.get_extra_info("peername") + self._transport_sslcontext = protocol.ssl_context + self._transport_peername = protocol.peername if remote is not None: self._cache["remote"] = remote From cc4b8c5e671ca0fd07cbcd3313e44cc45d22076c Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Fri, 18 Apr 2025 11:08:56 +0000 Subject: [PATCH 23/37] [PR #10714/75bbc03e backport][3.12] Only fetch SSLContext and peername once per connection (#10735) Co-authored-by: J. Nick Koston --- CHANGES/10714.misc.rst | 1 + aiohttp/test_utils.py | 4 ++++ aiohttp/web_protocol.py | 25 +++++++++++++++++++++++++ aiohttp/web_request.py | 6 ++---- 4 files changed, 32 insertions(+), 4 deletions(-) create mode 100644 CHANGES/10714.misc.rst diff --git a/CHANGES/10714.misc.rst b/CHANGES/10714.misc.rst new file mode 100644 index 00000000000..a36a80872f5 --- /dev/null +++ b/CHANGES/10714.misc.rst @@ -0,0 +1 @@ +Improved web server performance when connection can be reused -- by :user:`bdraco`. 
diff --git a/aiohttp/test_utils.py b/aiohttp/test_utils.py index be6e9b3353e..87c31427867 100644 --- a/aiohttp/test_utils.py +++ b/aiohttp/test_utils.py @@ -730,6 +730,10 @@ def make_mocked_request( if protocol is sentinel: protocol = mock.Mock() protocol.transport = transport + type(protocol).peername = mock.PropertyMock( + return_value=transport.get_extra_info("peername") + ) + type(protocol).ssl_context = mock.PropertyMock(return_value=sslcontext) if writer is sentinel: writer = mock.Mock() diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py index 1dba9606ea0..a7d50195828 100644 --- a/aiohttp/web_protocol.py +++ b/aiohttp/web_protocol.py @@ -24,6 +24,7 @@ import attr import yarl +from propcache import under_cached_property from .abc import AbstractAccessLogger, AbstractStreamWriter from .base_protocol import BaseProtocol @@ -47,6 +48,8 @@ __all__ = ("RequestHandler", "RequestPayloadError", "PayloadAccessError") if TYPE_CHECKING: + import ssl + from .web_server import Server @@ -167,6 +170,7 @@ class RequestHandler(BaseProtocol): "_current_request", "_timeout_ceil_threshold", "_request_in_progress", + "_cache", ) def __init__( @@ -246,6 +250,7 @@ def __init__( self._close = False self._force_close = False self._request_in_progress = False + self._cache: dict[str, Any] = {} def __repr__(self) -> str: return "<{} {}>".format( @@ -253,6 +258,26 @@ def __repr__(self) -> str: "connected" if self.transport is not None else "disconnected", ) + @under_cached_property + def ssl_context(self) -> Optional["ssl.SSLContext"]: + """Return SSLContext if available.""" + return ( + None + if self.transport is None + else self.transport.get_extra_info("sslcontext") + ) + + @under_cached_property + def peername( + self, + ) -> Optional[Union[str, Tuple[str, int, int, int], Tuple[str, int]]]: + """Return peername if available.""" + return ( + None + if self.transport is None + else self.transport.get_extra_info("peername") + ) + @property def keepalive_timeout(self) -> float: return self._keepalive_timeout diff --git a/aiohttp/web_request.py b/aiohttp/web_request.py index f11d49020a0..6bf5a9dea74 100644 --- a/aiohttp/web_request.py +++ b/aiohttp/web_request.py @@ -198,10 +198,8 @@ def __init__( self._client_max_size = client_max_size self._loop = loop - transport = protocol.transport - assert transport is not None - self._transport_sslcontext = transport.get_extra_info("sslcontext") - self._transport_peername = transport.get_extra_info("peername") + self._transport_sslcontext = protocol.ssl_context + self._transport_peername = protocol.peername if remote is not None: self._cache["remote"] = remote From eadcd28528b3ff3450ab0ea5a11b10d5a7e660bf Mon Sep 17 00:00:00 2001 From: "J. 
Nick Koston" Date: Fri, 18 Apr 2025 01:40:35 -1000 Subject: [PATCH 24/37] [3.11] Bump multidict to 6.4.3 (#10736) --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/cython.txt | 2 +- requirements/dev.txt | 2 +- requirements/multidict.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 7 files changed, 7 insertions(+), 7 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index d79bdab3893..f279c187ebc 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -26,7 +26,7 @@ gunicorn==23.0.0 # via -r requirements/base.in idna==3.4 # via yarl -multidict==6.1.0 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 041a3737ab0..16816dcd426 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -116,7 +116,7 @@ markupsafe==2.1.5 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.1.0 +multidict==6.4.3 # via # -r requirements/multidict.in # -r requirements/runtime-deps.in diff --git a/requirements/cython.txt b/requirements/cython.txt index f67cc903a0b..b2ff3e71d39 100644 --- a/requirements/cython.txt +++ b/requirements/cython.txt @@ -6,7 +6,7 @@ # cython==3.0.11 # via -r requirements/cython.in -multidict==6.1.0 +multidict==6.4.3 # via -r requirements/multidict.in typing-extensions==4.12.2 # via multidict diff --git a/requirements/dev.txt b/requirements/dev.txt index a99644dff81..6ab9baf6b59 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -110,7 +110,7 @@ markupsafe==2.1.5 # via jinja2 mdurl==0.1.2 # via markdown-it-py -multidict==6.1.0 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/multidict.txt b/requirements/multidict.txt index b8b44428920..a83b5029c3f 100644 --- a/requirements/multidict.txt +++ b/requirements/multidict.txt @@ -4,7 +4,7 @@ # # pip-compile --allow-unsafe --output-file=requirements/multidict.txt --resolver=backtracking --strip-extras requirements/multidict.in # -multidict==6.1.0 +multidict==6.4.3 # via -r requirements/multidict.in typing-extensions==4.12.2 # via multidict diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index cf7f0e396f6..6c9fcc5ccd0 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -24,7 +24,7 @@ frozenlist==1.5.0 # aiosignal idna==3.4 # via yarl -multidict==6.1.0 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl diff --git a/requirements/test.txt b/requirements/test.txt index cf81a7bf257..025940dcf50 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -62,7 +62,7 @@ markdown-it-py==3.0.0 # via rich mdurl==0.1.2 # via markdown-it-py -multidict==6.1.0 +multidict==6.4.3 # via # -r requirements/runtime-deps.in # yarl From eacbe957b5164b253d4f98db9498ef49dfe0aa09 Mon Sep 17 00:00:00 2001 From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com> Date: Fri, 18 Apr 2025 11:46:56 +0000 Subject: [PATCH 25/37] Bump frozenlist from 1.5.0 to 1.6.0 (#10737) Bumps [frozenlist](https://github.com/aio-libs/frozenlist) from 1.5.0 to 1.6.0.
Release notes

Sourced from frozenlist's releases.

1.6.0

Bug fixes

  • Stopped implicitly allowing the use of Cython pre-release versions when building the distribution package -- by :user:ajsanchezsanz and :user:markgreene74.

    Related commits on GitHub: :commit:41591f2.

Features

  • Implemented support for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: #618.

  • Started building armv7l wheels -- by :user:bdraco.

    Related issues and pull requests on GitHub: #642.

Packaging updates and notes for downstreams

  • Stopped implicitly allowing the use of Cython pre-release versions when building the distribution package -- by :user:ajsanchezsanz and :user:markgreene74.

    Related commits on GitHub: :commit:41591f2.

  • Started building wheels for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: #618.

  • The packaging metadata switched to including an SPDX license identifier introduced in :pep:639 -- by :user:cdce8p.

    Related issues and pull requests on GitHub: #639.

Contributor-facing changes

... (truncated)

Changelog

Sourced from frozenlist's changelog.

v1.6.0

(2025-04-17)

Bug fixes

  • Stopped implicitly allowing the use of Cython pre-release versions when building the distribution package -- by :user:ajsanchezsanz and :user:markgreene74.

    Related commits on GitHub: :commit:41591f2.

Features

  • Implemented support for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: :issue:618.

  • Started building armv7l wheels -- by :user:bdraco.

    Related issues and pull requests on GitHub: :issue:642.

Packaging updates and notes for downstreams

  • Stopped implicitly allowing the use of Cython pre-release versions when building the distribution package -- by :user:ajsanchezsanz and :user:markgreene74.

    Related commits on GitHub: :commit:41591f2.

  • Started building wheels for the free-threaded build of CPython 3.13 -- by :user:lysnikolaou.

    Related issues and pull requests on GitHub: :issue:618.

  • The packaging metadata switched to including an SPDX license identifier introduced in :pep:639 -- by :user:cdce8p. (See the inspection sketch after the commit list below.)

    Related issues and pull requests on GitHub: :issue:639.

... (truncated)

Commits
  • 9f4253c Fix towncrier head_line missing the leading v (#645)
  • 4c8207a Release 1.6.0 (#643)
  • 58aef99 Start building wheels on armv7l (#642)
  • d8e4a82 Use SPDX license expression (#639)
  • 57ce238 [pre-commit.ci] pre-commit autoupdate (#641)
  • f545c23 Implement support for the free-threaded build of CPython 3.13 (#618)
  • 4ee4583 Build(deps): Bump pypa/cibuildwheel from 2.23.1 to 2.23.2 (#640)
  • c28f32d Better organize lint and test dependencies (#636)
  • a611cc2 Build(deps): Bump pypa/cibuildwheel from 2.23.0 to 2.23.1 (#638)
  • bfa0cb1 Reduce number of coverage uploads needed (#637)
  • Additional commits viewable in compare view
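The PEP 639 item above surfaces as a License-Expression field in the package metadata. A hedged illustration of inspecting it, assuming an installed frozenlist (older wheels simply lack the field):

    from importlib.metadata import metadata

    md = metadata("frozenlist")
    # An SPDX expression such as "Apache-2.0" on wheels built with the
    # new metadata, or None when the package predates PEP 639.
    print(md.get("License-Expression"))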

Signed-off-by: dependabot[bot] Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> --- requirements/base.txt | 2 +- requirements/constraints.txt | 2 +- requirements/dev.txt | 2 +- requirements/runtime-deps.txt | 2 +- requirements/test.txt | 2 +- 5 files changed, 5 insertions(+), 5 deletions(-) diff --git a/requirements/base.txt b/requirements/base.txt index e7dfdd67a62..b4366c8fa26 100644 --- a/requirements/base.txt +++ b/requirements/base.txt @@ -18,7 +18,7 @@ brotli==1.1.0 ; platform_python_implementation == "CPython" # via -r requirements/runtime-deps.in cffi==1.17.1 # via pycares -frozenlist==1.5.0 +frozenlist==1.6.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/constraints.txt b/requirements/constraints.txt index 4332afac2e6..9cf1615af28 100644 --- a/requirements/constraints.txt +++ b/requirements/constraints.txt @@ -80,7 +80,7 @@ freezegun==1.5.1 # via # -r requirements/lint.in # -r requirements/test.in -frozenlist==1.5.0 +frozenlist==1.6.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/dev.txt b/requirements/dev.txt index ba62db63f1d..fb26879cabc 100644 --- a/requirements/dev.txt +++ b/requirements/dev.txt @@ -78,7 +78,7 @@ freezegun==1.5.1 # via # -r requirements/lint.in # -r requirements/test.in -frozenlist==1.5.0 +frozenlist==1.6.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/runtime-deps.txt b/requirements/runtime-deps.txt index da7a66e9a38..a1d1a47cf00 100644 --- a/requirements/runtime-deps.txt +++ b/requirements/runtime-deps.txt @@ -18,7 +18,7 @@ brotli==1.1.0 ; platform_python_implementation == "CPython" # via -r requirements/runtime-deps.in cffi==1.17.1 # via pycares -frozenlist==1.5.0 +frozenlist==1.6.0 # via # -r requirements/runtime-deps.in # aiosignal diff --git a/requirements/test.txt b/requirements/test.txt index ea0360d111d..ab2185d9ee7 100644 --- a/requirements/test.txt +++ b/requirements/test.txt @@ -41,7 +41,7 @@ forbiddenfruit==0.1.4 # via blockbuster freezegun==1.5.1 # via -r requirements/test.in -frozenlist==1.5.0 +frozenlist==1.6.0 # via # -r requirements/runtime-deps.in # aiosignal From 2c3b885dbd217518d030f2b0cc6343c2da77cf49 Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Fri, 18 Apr 2025 08:39:55 -1000 Subject: [PATCH 26/37] [PR #10713/8d74e26 backport][3.11] Avoid fetching loop time on each request unless logging is enabled (#10738) Co-authored-by: Sam Bull --- CHANGES/10713.misc.rst | 1 + aiohttp/web_protocol.py | 15 +++++++++++---- tests/test_web_app.py | 14 ++++++++++++++ tests/test_web_log.py | 26 ++++++++++++++++++++++++++ 4 files changed, 52 insertions(+), 4 deletions(-) create mode 100644 CHANGES/10713.misc.rst diff --git a/CHANGES/10713.misc.rst b/CHANGES/10713.misc.rst new file mode 100644 index 00000000000..a556d11e1e0 --- /dev/null +++ b/CHANGES/10713.misc.rst @@ -0,0 +1 @@ +Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:`bdraco`. 
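The change below samples loop.time() only when an access logger will actually consume it. A minimal sketch of the pattern with hypothetical names (not aiohttp's API):

    import asyncio
    from typing import Optional

    async def handle(logging_enabled: bool) -> None:
        loop = asyncio.get_running_loop()
        # Skip the clock call entirely when the result would be unused.
        start: Optional[float] = loop.time() if logging_enabled else None
        ...  # process the request here
        if start is not None:
            print(f"handled in {loop.time() - start:.6f}s")

    asyncio.run(handle(logging_enabled=False))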
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py index a7d50195828..e1923aac24b 100644 --- a/aiohttp/web_protocol.py +++ b/aiohttp/web_protocol.py @@ -170,6 +170,7 @@ class RequestHandler(BaseProtocol): "_current_request", "_timeout_ceil_threshold", "_request_in_progress", + "_logging_enabled", "_cache", ) @@ -244,8 +245,10 @@ def __init__( self.access_logger: Optional[AbstractAccessLogger] = access_log_class( access_log, access_log_format ) + self._logging_enabled = self.access_logger.enabled else: self.access_logger = None + self._logging_enabled = False self._close = False self._force_close = False @@ -463,9 +466,11 @@ def force_close(self) -> None: self.transport = None def log_access( - self, request: BaseRequest, response: StreamResponse, time: float + self, request: BaseRequest, response: StreamResponse, time: Optional[float] ) -> None: if self.access_logger is not None and self.access_logger.enabled: + if TYPE_CHECKING: + assert time is not None self.access_logger.log(request, response, self._loop.time() - time) def log_debug(self, *args: Any, **kw: Any) -> None: @@ -495,7 +500,7 @@ def _process_keepalive(self) -> None: async def _handle_request( self, request: BaseRequest, - start_time: float, + start_time: Optional[float], request_handler: Callable[[BaseRequest], Awaitable[StreamResponse]], ) -> Tuple[StreamResponse, bool]: self._request_in_progress = True @@ -563,7 +568,9 @@ async def start(self) -> None: message, payload = self._messages.popleft() - start = loop.time() + # time is only fetched if logging is enabled as otherwise + # its thrown away and never used. + start = loop.time() if self._logging_enabled else None manager.requests_count += 1 writer = StreamWriter(self, loop) @@ -671,7 +678,7 @@ async def start(self) -> None: self.transport.close() async def finish_response( - self, request: BaseRequest, resp: StreamResponse, start_time: float + self, request: BaseRequest, resp: StreamResponse, start_time: Optional[float] ) -> Tuple[StreamResponse, bool]: """Prepare the response and write_eof, then log access. 
diff --git a/tests/test_web_app.py b/tests/test_web_app.py index 6a86a3458a3..8c03a6041b2 100644 --- a/tests/test_web_app.py +++ b/tests/test_web_app.py @@ -144,6 +144,20 @@ def log(self, request, response, time): ) +async def test_app_make_handler_no_access_log_class(mocker) -> None: + srv = mocker.patch("aiohttp.web_app.Server") + app = web.Application(handler_args={"access_log": None}) + app._make_handler(access_log=None) + srv.assert_called_with( + app._handle, + request_factory=app._make_request, + loop=asyncio.get_event_loop(), + access_log=None, + debug=mock.ANY, + access_log_class=mock.ANY, + ) + + async def test_app_make_handler_raises_deprecation_warning() -> None: app = web.Application() diff --git a/tests/test_web_log.py b/tests/test_web_log.py index 0896c41c9e1..16c4b976daa 100644 --- a/tests/test_web_log.py +++ b/tests/test_web_log.py @@ -255,3 +255,29 @@ def enabled(self) -> bool: resp = await client.get("/") assert 200 == resp.status assert "This should not be logged" not in caplog.text + + +async def test_logger_set_to_none( + aiohttp_server: AiohttpServer, + aiohttp_client: AiohttpClient, + caplog: pytest.LogCaptureFixture, +) -> None: + """Test logger does nothing when access_log is set to None.""" + + async def handler(request: web.Request) -> web.Response: + return web.Response() + + class Logger(AbstractAccessLogger): + + def log( + self, request: web.BaseRequest, response: web.StreamResponse, time: float + ) -> None: + self.logger.critical("This should not be logged") # pragma: no cover + + app = web.Application() + app.router.add_get("/", handler) + server = await aiohttp_server(app, access_log=None, access_log_class=Logger) + client = await aiohttp_client(server) + resp = await client.get("/") + assert 200 == resp.status + assert "This should not be logged" not in caplog.text From 47511fc620376c8758237abf69486d6edaabeedb Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Fri, 18 Apr 2025 08:43:11 -1000 Subject: [PATCH 27/37] [PR #10713/8d74e26 backport][3.12] Avoid fetching loop time on each request unless logging is enabled (#10739) Co-authored-by: Sam Bull --- CHANGES/10713.misc.rst | 1 + aiohttp/web_protocol.py | 15 +++++++++++---- tests/test_web_app.py | 14 ++++++++++++++ tests/test_web_log.py | 26 ++++++++++++++++++++++++++ 4 files changed, 52 insertions(+), 4 deletions(-) create mode 100644 CHANGES/10713.misc.rst diff --git a/CHANGES/10713.misc.rst b/CHANGES/10713.misc.rst new file mode 100644 index 00000000000..a556d11e1e0 --- /dev/null +++ b/CHANGES/10713.misc.rst @@ -0,0 +1 @@ +Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:`bdraco`. 
diff --git a/aiohttp/web_protocol.py b/aiohttp/web_protocol.py index a7d50195828..e1923aac24b 100644 --- a/aiohttp/web_protocol.py +++ b/aiohttp/web_protocol.py @@ -170,6 +170,7 @@ class RequestHandler(BaseProtocol): "_current_request", "_timeout_ceil_threshold", "_request_in_progress", + "_logging_enabled", "_cache", ) @@ -244,8 +245,10 @@ def __init__( self.access_logger: Optional[AbstractAccessLogger] = access_log_class( access_log, access_log_format ) + self._logging_enabled = self.access_logger.enabled else: self.access_logger = None + self._logging_enabled = False self._close = False self._force_close = False @@ -463,9 +466,11 @@ def force_close(self) -> None: self.transport = None def log_access( - self, request: BaseRequest, response: StreamResponse, time: float + self, request: BaseRequest, response: StreamResponse, time: Optional[float] ) -> None: if self.access_logger is not None and self.access_logger.enabled: + if TYPE_CHECKING: + assert time is not None self.access_logger.log(request, response, self._loop.time() - time) def log_debug(self, *args: Any, **kw: Any) -> None: @@ -495,7 +500,7 @@ def _process_keepalive(self) -> None: async def _handle_request( self, request: BaseRequest, - start_time: float, + start_time: Optional[float], request_handler: Callable[[BaseRequest], Awaitable[StreamResponse]], ) -> Tuple[StreamResponse, bool]: self._request_in_progress = True @@ -563,7 +568,9 @@ async def start(self) -> None: message, payload = self._messages.popleft() - start = loop.time() + # time is only fetched if logging is enabled as otherwise + # its thrown away and never used. + start = loop.time() if self._logging_enabled else None manager.requests_count += 1 writer = StreamWriter(self, loop) @@ -671,7 +678,7 @@ async def start(self) -> None: self.transport.close() async def finish_response( - self, request: BaseRequest, resp: StreamResponse, start_time: float + self, request: BaseRequest, resp: StreamResponse, start_time: Optional[float] ) -> Tuple[StreamResponse, bool]: """Prepare the response and write_eof, then log access. 
diff --git a/tests/test_web_app.py b/tests/test_web_app.py index 6a86a3458a3..8c03a6041b2 100644 --- a/tests/test_web_app.py +++ b/tests/test_web_app.py @@ -144,6 +144,20 @@ def log(self, request, response, time): ) +async def test_app_make_handler_no_access_log_class(mocker) -> None: + srv = mocker.patch("aiohttp.web_app.Server") + app = web.Application(handler_args={"access_log": None}) + app._make_handler(access_log=None) + srv.assert_called_with( + app._handle, + request_factory=app._make_request, + loop=asyncio.get_event_loop(), + access_log=None, + debug=mock.ANY, + access_log_class=mock.ANY, + ) + + async def test_app_make_handler_raises_deprecation_warning() -> None: app = web.Application() diff --git a/tests/test_web_log.py b/tests/test_web_log.py index 0896c41c9e1..16c4b976daa 100644 --- a/tests/test_web_log.py +++ b/tests/test_web_log.py @@ -255,3 +255,29 @@ def enabled(self) -> bool: resp = await client.get("/") assert 200 == resp.status assert "This should not be logged" not in caplog.text + + +async def test_logger_set_to_none( + aiohttp_server: AiohttpServer, + aiohttp_client: AiohttpClient, + caplog: pytest.LogCaptureFixture, +) -> None: + """Test logger does nothing when access_log is set to None.""" + + async def handler(request: web.Request) -> web.Response: + return web.Response() + + class Logger(AbstractAccessLogger): + + def log( + self, request: web.BaseRequest, response: web.StreamResponse, time: float + ) -> None: + self.logger.critical("This should not be logged") # pragma: no cover + + app = web.Application() + app.router.add_get("/", handler) + server = await aiohttp_server(app, access_log=None, access_log_class=Logger) + client = await aiohttp_client(server) + resp = await client.get("/") + assert 200 == resp.status + assert "This should not be logged" not in caplog.text From 099cc0c9f8943d41586055a6825aec31bc70bbd3 Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Fri, 18 Apr 2025 21:16:35 -1000 Subject: [PATCH 28/37] [PR #10740/0d21d8d backport][3.11] Refactor WebSocket reader to avoid creating lists (#10746) --- CHANGES/10740.misc.rst | 1 + aiohttp/_websocket/reader_c.pxd | 36 ++-- aiohttp/_websocket/reader_py.py | 318 ++++++++++++++++---------------- tests/test_websocket_parser.py | 312 ++++++++++++++++--------------- 4 files changed, 330 insertions(+), 337 deletions(-) create mode 100644 CHANGES/10740.misc.rst diff --git a/CHANGES/10740.misc.rst b/CHANGES/10740.misc.rst new file mode 100644 index 00000000000..34ed19aebba --- /dev/null +++ b/CHANGES/10740.misc.rst @@ -0,0 +1 @@ +Improved performance of the WebSocket reader -- by :user:`bdraco`. 
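The refactor below replaces a parse_frame() that returned a list of frames per chunk with a _handle_frame() call made as each frame is decoded. A simplified sketch of the allocation difference (names and types here are illustrative, not aiohttp's):

    from typing import Callable, List, Tuple

    Frame = Tuple[bool, int, bytes]  # (fin, opcode, payload), simplified

    # Before: every feed of data allocated a short-lived list of frames.
    def parse_frames(buf: bytes) -> List[Frame]:
        frames: List[Frame] = []
        # ... decode zero or more frames from buf into `frames` ...
        return frames

    # After: each frame is dispatched as soon as it is decoded, so the
    # hot path allocates no intermediate container at all.
    def feed_data(buf: bytes, handle_frame: Callable[[bool, int, bytes], None]) -> None:
        # ... decode a frame, then call: handle_frame(fin, opcode, payload)
        pass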
diff --git a/aiohttp/_websocket/reader_c.pxd b/aiohttp/_websocket/reader_c.pxd index f156a7ff704..3efebeb81dc 100644 --- a/aiohttp/_websocket/reader_c.pxd +++ b/aiohttp/_websocket/reader_c.pxd @@ -8,12 +8,17 @@ cdef unsigned int READ_PAYLOAD_LENGTH cdef unsigned int READ_PAYLOAD_MASK cdef unsigned int READ_PAYLOAD -cdef unsigned int OP_CODE_CONTINUATION -cdef unsigned int OP_CODE_TEXT -cdef unsigned int OP_CODE_BINARY -cdef unsigned int OP_CODE_CLOSE -cdef unsigned int OP_CODE_PING -cdef unsigned int OP_CODE_PONG +cdef int OP_CODE_NOT_SET +cdef int OP_CODE_CONTINUATION +cdef int OP_CODE_TEXT +cdef int OP_CODE_BINARY +cdef int OP_CODE_CLOSE +cdef int OP_CODE_PING +cdef int OP_CODE_PONG + +cdef int COMPRESSED_NOT_SET +cdef int COMPRESSED_FALSE +cdef int COMPRESSED_TRUE cdef object UNPACK_LEN3 cdef object UNPACK_CLOSE_CODE @@ -60,9 +65,9 @@ cdef class WebSocketReader: cdef bytearray _partial cdef unsigned int _state - cdef object _opcode - cdef object _frame_fin - cdef object _frame_opcode + cdef int _opcode + cdef bint _frame_fin + cdef int _frame_opcode cdef object _frame_payload cdef unsigned long long _frame_payload_len @@ -71,7 +76,7 @@ cdef class WebSocketReader: cdef bytes _frame_mask cdef unsigned long long _payload_length cdef unsigned int _payload_length_flag - cdef object _compressed + cdef int _compressed cdef object _decompressobj cdef bint _compress @@ -82,22 +87,21 @@ cdef class WebSocketReader: fin=bint, has_partial=bint, payload_merged=bytes, - opcode="unsigned int", ) - cpdef void _feed_data(self, bytes data) + cpdef void _handle_frame(self, bint fin, int opcode, object payload, int compressed) except * @cython.locals( start_pos="unsigned int", - buf_len="unsigned int", + data_len="unsigned int", length="unsigned int", chunk_size="unsigned int", chunk_len="unsigned int", - buf_length="unsigned int", - buf_cstr="const unsigned char *", + data_length="unsigned int", + data_cstr="const unsigned char *", first_byte="unsigned char", second_byte="unsigned char", end_pos="unsigned int", has_mask=bint, fin=bint, ) - cpdef list parse_frame(self, bytes buf) + cpdef void _feed_data(self, bytes data) except * diff --git a/aiohttp/_websocket/reader_py.py b/aiohttp/_websocket/reader_py.py index 92ad47a52f0..5c5dbc3b0c4 100644 --- a/aiohttp/_websocket/reader_py.py +++ b/aiohttp/_websocket/reader_py.py @@ -3,7 +3,7 @@ import asyncio import builtins from collections import deque -from typing import Deque, Final, List, Optional, Set, Tuple, Union +from typing import Deque, Final, Optional, Set, Tuple, Union from ..base_protocol import BaseProtocol from ..compression_utils import ZLibDecompressor @@ -31,6 +31,7 @@ WS_MSG_TYPE_TEXT = WSMsgType.TEXT # WSMsgType values unpacked so they can by cythonized to ints +OP_CODE_NOT_SET = -1 OP_CODE_CONTINUATION = WSMsgType.CONTINUATION.value OP_CODE_TEXT = WSMsgType.TEXT.value OP_CODE_BINARY = WSMsgType.BINARY.value @@ -41,9 +42,13 @@ EMPTY_FRAME_ERROR = (True, b"") EMPTY_FRAME = (False, b"") +COMPRESSED_NOT_SET = -1 +COMPRESSED_FALSE = 0 +COMPRESSED_TRUE = 1 + TUPLE_NEW = tuple.__new__ -int_ = int # Prevent Cython from converting to PyInt +cython_int = int # Typed to int in Python, but cython with use a signed int in the pxd class WebSocketDataQueue: @@ -95,7 +100,7 @@ def feed_eof(self) -> None: self._release_waiter() self._exception = None # Break cyclic references - def feed_data(self, data: "WSMessage", size: "int_") -> None: + def feed_data(self, data: "WSMessage", size: "cython_int") -> None: self._size += size self._put_buffer((data, size)) 
self._release_waiter() @@ -136,9 +141,9 @@ def __init__( self._partial = bytearray() self._state = READ_HEADER - self._opcode: Optional[int] = None + self._opcode: int = OP_CODE_NOT_SET self._frame_fin = False - self._frame_opcode: Optional[int] = None + self._frame_opcode: int = OP_CODE_NOT_SET self._frame_payload: Union[bytes, bytearray] = b"" self._frame_payload_len = 0 @@ -147,7 +152,7 @@ def __init__( self._frame_mask: Optional[bytes] = None self._payload_length = 0 self._payload_length_flag = 0 - self._compressed: Optional[bool] = None + self._compressed: int = COMPRESSED_NOT_SET self._decompressobj: Optional[ZLibDecompressor] = None self._compress = compress @@ -175,165 +180,153 @@ def feed_data( return EMPTY_FRAME - def _feed_data(self, data: bytes) -> None: + def _handle_frame( + self, + fin: bool, + opcode: Union[int, cython_int], # Union intended: Cython pxd uses C int + payload: Union[bytes, bytearray], + compressed: Union[int, cython_int], # Union intended: Cython pxd uses C int + ) -> None: msg: WSMessage - for frame in self.parse_frame(data): - fin = frame[0] - opcode = frame[1] - payload = frame[2] - compressed = frame[3] - - is_continuation = opcode == OP_CODE_CONTINUATION - if opcode == OP_CODE_TEXT or opcode == OP_CODE_BINARY or is_continuation: - # load text/binary - if not fin: - # got partial frame payload - if not is_continuation: - self._opcode = opcode - self._partial += payload - if self._max_msg_size and len(self._partial) >= self._max_msg_size: - raise WebSocketError( - WSCloseCode.MESSAGE_TOO_BIG, - f"Message size {len(self._partial)} " - f"exceeds limit {self._max_msg_size}", - ) - continue - - has_partial = bool(self._partial) - if is_continuation: - if self._opcode is None: - raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, - "Continuation frame for non started message", - ) - opcode = self._opcode - self._opcode = None - # previous frame was non finished - # we should get continuation opcode - elif has_partial: - raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, - "The opcode in non-fin frame is expected " - f"to be zero, got {opcode!r}", - ) - - assembled_payload: Union[bytes, bytearray] - if has_partial: - assembled_payload = self._partial + payload - self._partial.clear() - else: - assembled_payload = payload - - if self._max_msg_size and len(assembled_payload) >= self._max_msg_size: + if opcode in {OP_CODE_TEXT, OP_CODE_BINARY, OP_CODE_CONTINUATION}: + # load text/binary + if not fin: + # got partial frame payload + if opcode != OP_CODE_CONTINUATION: + self._opcode = opcode + self._partial += payload + if self._max_msg_size and len(self._partial) >= self._max_msg_size: raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, - f"Message size {len(assembled_payload)} " + f"Message size {len(self._partial)} " f"exceeds limit {self._max_msg_size}", ) + return - # Decompress process must to be done after all packets - # received. 
-            if compressed:
-                if not self._decompressobj:
-                    self._decompressobj = ZLibDecompressor(
-                        suppress_deflate_header=True
-                    )
-                payload_merged = self._decompressobj.decompress_sync(
-                    assembled_payload + WS_DEFLATE_TRAILING, self._max_msg_size
-                )
-                if self._decompressobj.unconsumed_tail:
-                    left = len(self._decompressobj.unconsumed_tail)
-                    raise WebSocketError(
-                        WSCloseCode.MESSAGE_TOO_BIG,
-                        f"Decompressed message size {self._max_msg_size + left}"
-                        f" exceeds limit {self._max_msg_size}",
-                    )
-            elif type(assembled_payload) is bytes:
-                payload_merged = assembled_payload
-            else:
-                payload_merged = bytes(assembled_payload)
-
-            if opcode == OP_CODE_TEXT:
-                try:
-                    text = payload_merged.decode("utf-8")
-                except UnicodeDecodeError as exc:
-                    raise WebSocketError(
-                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                    ) from exc
-
-                # XXX: The Text and Binary messages here can be a performance
-                # bottleneck, so we use tuple.__new__ to improve performance.
-                # This is not type safe, but many tests should fail in
-                # test_client_ws_functional.py if this is wrong.
-                self.queue.feed_data(
-                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")),
-                    len(payload_merged),
-                )
-            else:
-                self.queue.feed_data(
-                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")),
-                    len(payload_merged),
-                )
-        elif opcode == OP_CODE_CLOSE:
-            if len(payload) >= 2:
-                close_code = UNPACK_CLOSE_CODE(payload[:2])[0]
-                if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES:
-                    raise WebSocketError(
-                        WSCloseCode.PROTOCOL_ERROR,
-                        f"Invalid close code: {close_code}",
-                    )
-                try:
-                    close_message = payload[2:].decode("utf-8")
-                except UnicodeDecodeError as exc:
-                    raise WebSocketError(
-                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                    ) from exc
-                msg = TUPLE_NEW(
-                    WSMessage, (WSMsgType.CLOSE, close_code, close_message)
-                )
-            elif payload:
+        has_partial = bool(self._partial)
+        if opcode == OP_CODE_CONTINUATION:
+            if self._opcode == OP_CODE_NOT_SET:
                 raise WebSocketError(
                     WSCloseCode.PROTOCOL_ERROR,
-                    f"Invalid close frame: {fin} {opcode} {payload!r}",
+                    "Continuation frame for non started message",
                 )
-            else:
-                msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, ""))
+            opcode = self._opcode
+            self._opcode = OP_CODE_NOT_SET
+            # previous frame was not finished,
+            # so we expect a continuation opcode
+        elif has_partial:
+            raise WebSocketError(
+                WSCloseCode.PROTOCOL_ERROR,
+                "The opcode in non-fin frame is expected "
+                f"to be zero, got {opcode!r}",
+            )
 
-            self.queue.feed_data(msg, 0)
-        elif opcode == OP_CODE_PING:
-            msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, ""))
-            self.queue.feed_data(msg, len(payload))
+        assembled_payload: Union[bytes, bytearray]
+        if has_partial:
+            assembled_payload = self._partial + payload
+            self._partial.clear()
+        else:
+            assembled_payload = payload
 
-        elif opcode == OP_CODE_PONG:
-            msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, ""))
-            self.queue.feed_data(msg, len(payload))
+        if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
+            raise WebSocketError(
+                WSCloseCode.MESSAGE_TOO_BIG,
+                f"Message size {len(assembled_payload)} "
+                f"exceeds limit {self._max_msg_size}",
+            )
+
+        # Decompression must be done after all packets
+        # are received.
+ if compressed: + if not self._decompressobj: + self._decompressobj = ZLibDecompressor(suppress_deflate_header=True) + payload_merged = self._decompressobj.decompress_sync( + assembled_payload + WS_DEFLATE_TRAILING, self._max_msg_size + ) + if self._decompressobj.unconsumed_tail: + left = len(self._decompressobj.unconsumed_tail) + raise WebSocketError( + WSCloseCode.MESSAGE_TOO_BIG, + f"Decompressed message size {self._max_msg_size + left}" + f" exceeds limit {self._max_msg_size}", + ) + elif type(assembled_payload) is bytes: + payload_merged = assembled_payload + else: + payload_merged = bytes(assembled_payload) + if opcode == OP_CODE_TEXT: + try: + text = payload_merged.decode("utf-8") + except UnicodeDecodeError as exc: + raise WebSocketError( + WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message" + ) from exc + + # XXX: The Text and Binary messages here can be a performance + # bottleneck, so we use tuple.__new__ to improve performance. + # This is not type safe, but many tests should fail in + # test_client_ws_functional.py if this is wrong. + self.queue.feed_data( + TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")), + len(payload_merged), + ) else: + self.queue.feed_data( + TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")), + len(payload_merged), + ) + elif opcode == OP_CODE_CLOSE: + if len(payload) >= 2: + close_code = UNPACK_CLOSE_CODE(payload[:2])[0] + if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES: + raise WebSocketError( + WSCloseCode.PROTOCOL_ERROR, + f"Invalid close code: {close_code}", + ) + try: + close_message = payload[2:].decode("utf-8") + except UnicodeDecodeError as exc: + raise WebSocketError( + WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message" + ) from exc + msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, close_code, close_message)) + elif payload: raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}" + WSCloseCode.PROTOCOL_ERROR, + f"Invalid close frame: {fin} {opcode} {payload!r}", ) + else: + msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, "")) + + self.queue.feed_data(msg, 0) + elif opcode == OP_CODE_PING: + msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, "")) + self.queue.feed_data(msg, len(payload)) + elif opcode == OP_CODE_PONG: + msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, "")) + self.queue.feed_data(msg, len(payload)) + else: + raise WebSocketError( + WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}" + ) - def parse_frame( - self, buf: bytes - ) -> List[Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]]]: + def _feed_data(self, data: bytes) -> None: """Return the next frame from the socket.""" - frames: List[ - Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]] - ] = [] if self._tail: - buf, self._tail = self._tail + buf, b"" + data, self._tail = self._tail + data, b"" start_pos: int = 0 - buf_length = len(buf) - buf_cstr = buf + data_length = len(data) + data_cstr = data while True: # read header if self._state == READ_HEADER: - if buf_length - start_pos < 2: + if data_length - start_pos < 2: break - first_byte = buf_cstr[start_pos] - second_byte = buf_cstr[start_pos + 1] + first_byte = data_cstr[start_pos] + second_byte = data_cstr[start_pos + 1] start_pos += 2 fin = (first_byte >> 7) & 1 @@ -378,8 +371,8 @@ def parse_frame( # Set compress status if last package is FIN # OR set compress status if this is first fragment # Raise error if not first fragment with rsv1 = 0x1 - if self._frame_fin or self._compressed is None: - 
self._compressed = True if rsv1 else False + if self._frame_fin or self._compressed == COMPRESSED_NOT_SET: + self._compressed = COMPRESSED_TRUE if rsv1 else COMPRESSED_FALSE elif rsv1: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, @@ -396,18 +389,17 @@ def parse_frame( if self._state == READ_PAYLOAD_LENGTH: length_flag = self._payload_length_flag if length_flag == 126: - if buf_length - start_pos < 2: + if data_length - start_pos < 2: break - first_byte = buf_cstr[start_pos] - second_byte = buf_cstr[start_pos + 1] + first_byte = data_cstr[start_pos] + second_byte = data_cstr[start_pos + 1] start_pos += 2 self._payload_length = first_byte << 8 | second_byte elif length_flag > 126: - if buf_length - start_pos < 8: + if data_length - start_pos < 8: break - data = buf_cstr[start_pos : start_pos + 8] + self._payload_length = UNPACK_LEN3(data, start_pos)[0] start_pos += 8 - self._payload_length = UNPACK_LEN3(data)[0] else: self._payload_length = length_flag @@ -415,16 +407,16 @@ def parse_frame( # read payload mask if self._state == READ_PAYLOAD_MASK: - if buf_length - start_pos < 4: + if data_length - start_pos < 4: break - self._frame_mask = buf_cstr[start_pos : start_pos + 4] + self._frame_mask = data_cstr[start_pos : start_pos + 4] start_pos += 4 self._state = READ_PAYLOAD if self._state == READ_PAYLOAD: - chunk_len = buf_length - start_pos + chunk_len = data_length - start_pos if self._payload_length >= chunk_len: - end_pos = buf_length + end_pos = data_length self._payload_length -= chunk_len else: end_pos = start_pos + self._payload_length @@ -433,10 +425,10 @@ def parse_frame( if self._frame_payload_len: if type(self._frame_payload) is not bytearray: self._frame_payload = bytearray(self._frame_payload) - self._frame_payload += buf_cstr[start_pos:end_pos] + self._frame_payload += data_cstr[start_pos:end_pos] else: # Fast path for the first frame - self._frame_payload = buf_cstr[start_pos:end_pos] + self._frame_payload = data_cstr[start_pos:end_pos] self._frame_payload_len += end_pos - start_pos start_pos = end_pos @@ -450,19 +442,17 @@ def parse_frame( self._frame_payload = bytearray(self._frame_payload) websocket_mask(self._frame_mask, self._frame_payload) - frames.append( - ( - self._frame_fin, - self._frame_opcode, - self._frame_payload, - self._compressed, - ) + self._handle_frame( + self._frame_fin, + self._frame_opcode, + self._frame_payload, + self._compressed, ) self._frame_payload = b"" self._frame_payload_len = 0 self._state = READ_HEADER # XXX: Cython needs slices to be bounded, so we can't omit the slice end here. - self._tail = buf_cstr[start_pos:buf_length] if start_pos < buf_length else b"" - - return frames + self._tail = ( + data_cstr[start_pos:data_length] if start_pos < data_length else b"" + ) diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 7f8b98d4566..8a65ac11d50 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -27,6 +27,25 @@ class PatchableWebSocketReader(WebSocketReader): """WebSocketReader subclass that allows for patching parse_frame.""" + def parse_frame( + self, data: bytes + ) -> list[tuple[bool, int, Union[bytes, bytearray], int]]: + # This method is overridden to allow for patching in tests. + frames: list[tuple[bool, int, Union[bytes, bytearray], int]] = [] + + def _handle_frame( + fin: bool, + opcode: int, + payload: Union[bytes, bytearray], + compressed: int, + ) -> None: + # This method is overridden to allow for patching in tests. 
+ frames.append((fin, opcode, payload, compressed)) + + with mock.patch.object(self, "_handle_frame", _handle_frame): + self._feed_data(data) + return frames + def build_frame( message, opcode, use_mask=False, noheader=False, is_fin=True, compress=False @@ -127,32 +146,32 @@ def test_feed_data_remembers_exception(parser: WebSocketReader) -> None: assert data == b"" -def test_parse_frame(parser) -> None: +def test_parse_frame(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 0b00000001)) res = parser.parse_frame(b"1") fin, opcode, payload, compress = res[0] - assert (0, 1, b"1", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length0(parser) -> None: +def test_parse_frame_length0(parser: PatchableWebSocketReader) -> None: fin, opcode, payload, compress = parser.parse_frame( struct.pack("!BB", 0b00000001, 0b00000000) )[0] - assert (0, 1, b"", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2(parser) -> None: +def test_parse_frame_length2(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) parser.parse_frame(struct.pack("!H", 4)) res = parser.parse_frame(b"1234") fin, opcode, payload, compress = res[0] - assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1234", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None: +def test_parse_frame_length2_multi_byte(parser: PatchableWebSocketReader) -> None: """Ensure a multi-byte length is parsed correctly.""" expected_payload = b"1" * 32768 parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) @@ -160,10 +179,12 @@ def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None: res = parser.parse_frame(b"1" * 32768) fin, opcode, payload, compress = res[0] - assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress) + assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) -> None: +def test_parse_frame_length2_multi_byte_multi_packet( + parser: PatchableWebSocketReader, +) -> None: """Ensure a multi-byte length with multiple packets is parsed correctly.""" expected_payload = b"1" * 32768 assert parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) == [] @@ -174,44 +195,53 @@ def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) -> res = parser.parse_frame(b"1" * 8192) fin, opcode, payload, compress = res[0] assert len(payload) == 32768 - assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress) + assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length4(parser: WebSocketReader) -> None: +def test_parse_frame_length4(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 127)) parser.parse_frame(struct.pack("!Q", 4)) fin, opcode, payload, compress = parser.parse_frame(b"1234")[0] - assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1234", 0) == (fin, opcode, payload, compress) -def test_parse_frame_mask(parser) -> None: +def test_parse_frame_mask(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 0b10000001)) 
parser.parse_frame(b"0001") fin, opcode, payload, compress = parser.parse_frame(b"1")[0] - assert (0, 1, b"\x01", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"\x01", 0) == (fin, opcode, payload, compress) -def test_parse_frame_header_reversed_bits(out, parser) -> None: +def test_parse_frame_header_reversed_bits( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b01100000, 0b00000000)) raise out.exception() -def test_parse_frame_header_control_frame(out, parser) -> None: +def test_parse_frame_header_control_frame( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b00001000, 0b00000000)) raise out.exception() -def _test_parse_frame_header_new_data_err(out, parser): +@pytest.mark.xfail() +def test_parse_frame_header_new_data_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b000000000, 0b00000000)) raise out.exception() -def test_parse_frame_header_payload_size(out, parser) -> None: +def test_parse_frame_header_payload_size( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b10001000, 0b01111110)) raise out.exception() @@ -226,54 +256,45 @@ def test_parse_frame_header_payload_size(out, parser) -> None: ) def test_ping_frame( out: WebSocketDataQueue, - parser: WebSocketReader, + parser: PatchableWebSocketReader, data: Union[bytes, bytearray, memoryview], ) -> None: - with mock.patch.object(parser, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.PING, b"data", False)] - - parser.feed_data(data) - res = out._buffer[0] - assert res == ((WSMsgType.PING, b"data", ""), 4) - + parser._handle_frame(True, WSMsgType.PING, b"data", 0) + res = out._buffer[0] + assert res == ((WSMsgType.PING, b"data", ""), 4) -def test_pong_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.PONG, b"data", False)] - parser.feed_data(b"") +def test_pong_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: + parser._handle_frame(True, WSMsgType.PONG, b"data", 0) res = out._buffer[0] assert res == ((WSMsgType.PONG, b"data", ""), 4) -def test_close_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"", False)] - - parser.feed_data(b"") +def test_close_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: + parser._handle_frame(True, WSMsgType.CLOSE, b"", 0) res = out._buffer[0] assert res == ((WSMsgType.CLOSE, 0, ""), 0) -def test_close_frame_info(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"0112345", False)] - - parser.feed_data(b"") +def test_close_frame_info( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(True, WSMsgType.CLOSE, b"0112345", 0) res = out._buffer[0] assert res == (WSMessage(WSMsgType.CLOSE, 12337, "12345"), 0) -def test_close_frame_invalid(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"1", False)] - parser.feed_data(b"") - - assert isinstance(out.exception(), WebSocketError) - assert out.exception().code == 
WSCloseCode.PROTOCOL_ERROR +def test_close_frame_invalid( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + with pytest.raises(WebSocketError) as ctx: + parser._handle_frame(True, WSMsgType.CLOSE, b"1", 0) + assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR -def test_close_frame_invalid_2(out, parser) -> None: +def test_close_frame_invalid_2( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data = build_close_frame(code=1) with pytest.raises(WebSocketError) as ctx: @@ -282,7 +303,7 @@ def test_close_frame_invalid_2(out, parser) -> None: assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR -def test_close_frame_unicode_err(parser) -> None: +def test_close_frame_unicode_err(parser: PatchableWebSocketReader) -> None: data = build_close_frame(code=1000, message=b"\xf4\x90\x80\x80") with pytest.raises(WebSocketError) as ctx: @@ -291,23 +312,21 @@ def test_close_frame_unicode_err(parser) -> None: assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_unknown_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CONTINUATION, b"", False)] - +def test_unknown_frame( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): - parser.feed_data(b"") - raise out.exception() + parser._handle_frame(True, WSMsgType.CONTINUATION, b"", 0) -def test_simple_text(out, parser) -> None: +def test_simple_text(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: data = build_frame(b"text", WSMsgType.TEXT) parser._feed_data(data) res = out._buffer[0] assert res == ((WSMsgType.TEXT, "text", ""), 4) -def test_simple_text_unicode_err(parser) -> None: +def test_simple_text_unicode_err(parser: PatchableWebSocketReader) -> None: data = build_frame(b"\xf4\x90\x80\x80", WSMsgType.TEXT) with pytest.raises(WebSocketError) as ctx: @@ -316,16 +335,18 @@ def test_simple_text_unicode_err(parser) -> None: assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_simple_binary(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.BINARY, b"binary", False)] - - parser.feed_data(b"") +def test_simple_binary( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + data = build_frame(b"binary", WSMsgType.BINARY) + parser._feed_data(data) res = out._buffer[0] assert res == ((WSMsgType.BINARY, b"binary", ""), 6) -def test_fragmentation_header(out, parser) -> None: +def test_fragmentation_header( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data = build_frame(b"a", WSMsgType.TEXT) parser._feed_data(data[:1]) parser._feed_data(data[1:]) @@ -334,7 +355,9 @@ def test_fragmentation_header(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "a", ""), 1) -def test_continuation(out, parser) -> None: +def test_continuation( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) @@ -345,14 +368,9 @@ def test_continuation(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_with_ping(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.PING, b"", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_ping( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: 
data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) @@ -368,90 +386,78 @@ def test_continuation_with_ping(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_err(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (1, WSMsgType.TEXT, b"line2", False), - ] - +def test_continuation_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError): - parser._feed_data(b"") - + parser._handle_frame(True, WSMsgType.TEXT, b"line2", 0) -def test_continuation_with_close(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, build_close_frame(1002, b"test", noheader=True), False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - parser.feed_data(b"") +def test_continuation_with_close( + out: WebSocketDataQueue, parser: WebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + parser._handle_frame( + False, + WSMsgType.CLOSE, + build_close_frame(1002, b"test", noheader=True), + False, + ) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) res = out._buffer[0] - assert res, (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0) + assert res == (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_with_close_unicode_err(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - ( - 0, +def test_continuation_with_close_unicode_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + with pytest.raises(WebSocketError) as ctx: + parser._handle_frame( + False, WSMsgType.CLOSE, build_close_frame(1000, b"\xf4\x90\x80\x80", noheader=True), - False, - ), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - - with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") - + 0, + ) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_continuation_with_close_bad_code(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_close_bad_code( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") + parser._handle_frame( + False, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), 0 + ) assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) -def test_continuation_with_close_bad_payload(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, b"1", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_close_bad_payload( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> 
None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") - - assert ctx.value.code, WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(False, WSMsgType.CLOSE, b"1", 0) + assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) -def test_continuation_with_close_empty(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, b"", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] +def test_continuation_with_close_empty( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + parser._handle_frame(False, WSMsgType.CLOSE, b"", 0) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) - parser.feed_data(b"") res = out._buffer[0] - assert res, (WSMessage(WSMsgType.CLOSE, 0, ""), 0) + assert res == (WSMessage(WSMsgType.CLOSE, 0, ""), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) @@ -506,7 +512,7 @@ def test_msgtype_aliases() -> None: assert aiohttp.WSMsgType.ERROR == aiohttp.WSMsgType.error -def test_parse_compress_frame_single(parser) -> None: +def test_parse_compress_frame_single(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001)) res = parser.parse_frame(b"1") fin, opcode, payload, compress = res[0] @@ -514,7 +520,7 @@ def test_parse_compress_frame_single(parser) -> None: assert (1, 1, b"1", True) == (fin, opcode, payload, not not compress) -def test_parse_compress_frame_multi(parser) -> None: +def test_parse_compress_frame_multi(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b01000001, 126)) parser.parse_frame(struct.pack("!H", 4)) res = parser.parse_frame(b"1234") @@ -534,7 +540,7 @@ def test_parse_compress_frame_multi(parser) -> None: assert (1, 1, b"1234", False) == (fin, opcode, payload, not not compress) -def test_parse_compress_error_frame(parser) -> None: +def test_parse_compress_error_frame(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b01000001, 0b00000001)) parser.parse_frame(b"1") @@ -545,10 +551,8 @@ def test_parse_compress_error_frame(parser) -> None: assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR -async def test_parse_no_compress_frame_single( - loop: asyncio.AbstractEventLoop, out: WebSocketDataQueue -) -> None: - parser_no_compress = WebSocketReader(out, 0, compress=False) +def test_parse_no_compress_frame_single(out: WebSocketDataQueue) -> None: + parser_no_compress = PatchableWebSocketReader(out, 0, compress=False) with pytest.raises(WebSocketError) as ctx: parser_no_compress.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001)) parser_no_compress.parse_frame(b"1") @@ -600,34 +604,28 @@ def test_pickle(self) -> None: def test_flow_control_binary( protocol: BaseProtocol, out_low_limit: WebSocketDataQueue, - parser_low_limit: WebSocketReader, + parser_low_limit: PatchableWebSocketReader, ) -> None: large_payload = b"b" * (1 + 16 * 2) - large_payload_len = len(large_payload) - with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.BINARY, large_payload, False)] - - parser_low_limit.feed_data(b"") - + large_payload_size = len(large_payload) + parser_low_limit._handle_frame(True, WSMsgType.BINARY, large_payload, 0) res = 
out_low_limit._buffer[0] - assert res == (WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_len) + assert res == (WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_size) assert protocol._reading_paused is True def test_flow_control_multi_byte_text( protocol: BaseProtocol, out_low_limit: WebSocketDataQueue, - parser_low_limit: WebSocketReader, + parser_low_limit: PatchableWebSocketReader, ) -> None: large_payload_text = "𒀁" * (1 + 16 * 2) large_payload = large_payload_text.encode("utf-8") - large_payload_len = len(large_payload) - - with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.TEXT, large_payload, False)] - - parser_low_limit.feed_data(b"") - + large_payload_size = len(large_payload) + parser_low_limit._handle_frame(True, WSMsgType.TEXT, large_payload, 0) res = out_low_limit._buffer[0] - assert res == (WSMessage(WSMsgType.TEXT, large_payload_text, ""), large_payload_len) + assert res == ( + WSMessage(WSMsgType.TEXT, large_payload_text, ""), + large_payload_size, + ) assert protocol._reading_paused is True From 2c8575c1b27c78505d670c9e0708e0b8ca544e4e Mon Sep 17 00:00:00 2001 From: "J. Nick Koston" Date: Fri, 18 Apr 2025 21:16:44 -1000 Subject: [PATCH 29/37] [PR #10740/0d21d8d backport][3.12] Refactor WebSocket reader to avoid creating lists (#10745) --- CHANGES/10740.misc.rst | 1 + aiohttp/_websocket/reader_c.pxd | 36 ++-- aiohttp/_websocket/reader_py.py | 334 ++++++++++++++++---------------- tests/test_websocket_parser.py | 308 ++++++++++++++--------------- 4 files changed, 337 insertions(+), 342 deletions(-) create mode 100644 CHANGES/10740.misc.rst diff --git a/CHANGES/10740.misc.rst b/CHANGES/10740.misc.rst new file mode 100644 index 00000000000..34ed19aebba --- /dev/null +++ b/CHANGES/10740.misc.rst @@ -0,0 +1 @@ +Improved performance of the WebSocket reader -- by :user:`bdraco`. 
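The 3.12 backport that follows applies the same refactor (its oversize check for decompressed messages differs slightly from 3.11's). One supporting detail in both: `_opcode` and `_compressed` switch from `Optional` values to plain `int`s with `-1` sentinels (`OP_CODE_NOT_SET`, `COMPRESSED_NOT_SET`), which lets the `.pxd` declare them as C `int`/`bint` instead of generic `object`s. A hedged sketch of why the sentinel pattern helps the Cython build (the names mirror the diff, but the classes below are illustrative, not the library's code):

```python
from typing import Optional

OP_CODE_NOT_SET = -1  # sentinel; real WebSocket opcodes are non-negative


class OpcodeStateBoxed:
    # Optional[int] can only map to `cdef object` in Cython, so every
    # `is None` check and comparison goes through Python object protocols.
    def __init__(self) -> None:
        self.opcode: Optional[int] = None


class OpcodeStateSentinel:
    # A plain int with a -1 sentinel can be declared `cdef int` in the
    # .pxd; `self.opcode == OP_CODE_NOT_SET` then compiles to a C compare.
    def __init__(self) -> None:
        self.opcode: int = OP_CODE_NOT_SET


state = OpcodeStateSentinel()
assert state.opcode == OP_CODE_NOT_SET  # C-level check under Cython
state.opcode = 0x1  # TEXT
```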
diff --git a/aiohttp/_websocket/reader_c.pxd b/aiohttp/_websocket/reader_c.pxd
index f156a7ff704..3efebeb81dc 100644
--- a/aiohttp/_websocket/reader_c.pxd
+++ b/aiohttp/_websocket/reader_c.pxd
@@ -8,12 +8,17 @@ cdef unsigned int READ_PAYLOAD_LENGTH
 cdef unsigned int READ_PAYLOAD_MASK
 cdef unsigned int READ_PAYLOAD
 
-cdef unsigned int OP_CODE_CONTINUATION
-cdef unsigned int OP_CODE_TEXT
-cdef unsigned int OP_CODE_BINARY
-cdef unsigned int OP_CODE_CLOSE
-cdef unsigned int OP_CODE_PING
-cdef unsigned int OP_CODE_PONG
+cdef int OP_CODE_NOT_SET
+cdef int OP_CODE_CONTINUATION
+cdef int OP_CODE_TEXT
+cdef int OP_CODE_BINARY
+cdef int OP_CODE_CLOSE
+cdef int OP_CODE_PING
+cdef int OP_CODE_PONG
+
+cdef int COMPRESSED_NOT_SET
+cdef int COMPRESSED_FALSE
+cdef int COMPRESSED_TRUE
 
 cdef object UNPACK_LEN3
 cdef object UNPACK_CLOSE_CODE
@@ -60,9 +65,9 @@ cdef class WebSocketReader:
 
     cdef bytearray _partial
     cdef unsigned int _state
-    cdef object _opcode
-    cdef object _frame_fin
-    cdef object _frame_opcode
+    cdef int _opcode
+    cdef bint _frame_fin
+    cdef int _frame_opcode
     cdef object _frame_payload
     cdef unsigned long long _frame_payload_len
 
@@ -71,7 +76,7 @@ cdef class WebSocketReader:
     cdef bytes _frame_mask
     cdef unsigned long long _payload_length
    cdef unsigned int _payload_length_flag
-    cdef object _compressed
+    cdef int _compressed
     cdef object _decompressobj
     cdef bint _compress
 
@@ -82,22 +87,21 @@ cdef class WebSocketReader:
         fin=bint,
         has_partial=bint,
         payload_merged=bytes,
-        opcode="unsigned int",
     )
-    cpdef void _feed_data(self, bytes data)
+    cpdef void _handle_frame(self, bint fin, int opcode, object payload, int compressed) except *
 
     @cython.locals(
         start_pos="unsigned int",
-        buf_len="unsigned int",
+        data_len="unsigned int",
         length="unsigned int",
         chunk_size="unsigned int",
         chunk_len="unsigned int",
-        buf_length="unsigned int",
-        buf_cstr="const unsigned char *",
+        data_length="unsigned int",
+        data_cstr="const unsigned char *",
         first_byte="unsigned char",
         second_byte="unsigned char",
         end_pos="unsigned int",
         has_mask=bint,
         fin=bint,
    )
-    cpdef list parse_frame(self, bytes buf)
+    cpdef void _feed_data(self, bytes data) except *
diff --git a/aiohttp/_websocket/reader_py.py b/aiohttp/_websocket/reader_py.py
index 19579bd39a8..a8a8eb7eb01 100644
--- a/aiohttp/_websocket/reader_py.py
+++ b/aiohttp/_websocket/reader_py.py
@@ -3,7 +3,7 @@
 import asyncio
 import builtins
 from collections import deque
-from typing import Deque, Final, List, Optional, Set, Tuple, Union
+from typing import Deque, Final, Optional, Set, Tuple, Union
 
 from ..base_protocol import BaseProtocol
 from ..compression_utils import ZLibDecompressor
@@ -31,6 +31,7 @@ WS_MSG_TYPE_TEXT = WSMsgType.TEXT
 
 # WSMsgType values unpacked so they can be cythonized to ints
+OP_CODE_NOT_SET = -1
 OP_CODE_CONTINUATION = WSMsgType.CONTINUATION.value
 OP_CODE_TEXT = WSMsgType.TEXT.value
 OP_CODE_BINARY = WSMsgType.BINARY.value
@@ -41,9 +42,13 @@
 EMPTY_FRAME_ERROR = (True, b"")
 EMPTY_FRAME = (False, b"")
 
+COMPRESSED_NOT_SET = -1
+COMPRESSED_FALSE = 0
+COMPRESSED_TRUE = 1
+
 TUPLE_NEW = tuple.__new__
 
-int_ = int  # Prevent Cython from converting to PyInt
+cython_int = int  # Typed to int in Python, but Cython will use a signed int in the pxd
 
 
 class WebSocketDataQueue:
@@ -95,7 +100,7 @@ def feed_eof(self) -> None:
         self._release_waiter()
         self._exception = None  # Break cyclic references
 
-    def feed_data(self, data: "WSMessage", size: "int_") -> None:
+    def feed_data(self, data: "WSMessage", size: "cython_int") -> None:
         self._size += size
         self._put_buffer((data, size))
self._release_waiter() @@ -136,9 +141,9 @@ def __init__( self._partial = bytearray() self._state = READ_HEADER - self._opcode: Optional[int] = None + self._opcode: int = OP_CODE_NOT_SET self._frame_fin = False - self._frame_opcode: Optional[int] = None + self._frame_opcode: int = OP_CODE_NOT_SET self._frame_payload: Union[bytes, bytearray] = b"" self._frame_payload_len = 0 @@ -147,7 +152,7 @@ def __init__( self._frame_mask: Optional[bytes] = None self._payload_length = 0 self._payload_length_flag = 0 - self._compressed: Optional[bool] = None + self._compressed: int = COMPRESSED_NOT_SET self._decompressobj: Optional[ZLibDecompressor] = None self._compress = compress @@ -175,173 +180,161 @@ def feed_data( return EMPTY_FRAME - def _feed_data(self, data: bytes) -> None: + def _handle_frame( + self, + fin: bool, + opcode: Union[int, cython_int], # Union intended: Cython pxd uses C int + payload: Union[bytes, bytearray], + compressed: Union[int, cython_int], # Union intended: Cython pxd uses C int + ) -> None: msg: WSMessage - for frame in self.parse_frame(data): - fin = frame[0] - opcode = frame[1] - payload = frame[2] - compressed = frame[3] - - is_continuation = opcode == OP_CODE_CONTINUATION - if opcode == OP_CODE_TEXT or opcode == OP_CODE_BINARY or is_continuation: - # load text/binary - if not fin: - # got partial frame payload - if not is_continuation: - self._opcode = opcode - self._partial += payload - if self._max_msg_size and len(self._partial) >= self._max_msg_size: - raise WebSocketError( - WSCloseCode.MESSAGE_TOO_BIG, - f"Message size {len(self._partial)} " - f"exceeds limit {self._max_msg_size}", - ) - continue - - has_partial = bool(self._partial) - if is_continuation: - if self._opcode is None: - raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, - "Continuation frame for non started message", - ) - opcode = self._opcode - self._opcode = None - # previous frame was non finished - # we should get continuation opcode - elif has_partial: - raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, - "The opcode in non-fin frame is expected " - f"to be zero, got {opcode!r}", - ) - - assembled_payload: Union[bytes, bytearray] - if has_partial: - assembled_payload = self._partial + payload - self._partial.clear() - else: - assembled_payload = payload - - if self._max_msg_size and len(assembled_payload) >= self._max_msg_size: + if opcode in {OP_CODE_TEXT, OP_CODE_BINARY, OP_CODE_CONTINUATION}: + # load text/binary + if not fin: + # got partial frame payload + if opcode != OP_CODE_CONTINUATION: + self._opcode = opcode + self._partial += payload + if self._max_msg_size and len(self._partial) >= self._max_msg_size: raise WebSocketError( WSCloseCode.MESSAGE_TOO_BIG, - f"Message size {len(assembled_payload)} " + f"Message size {len(self._partial)} " f"exceeds limit {self._max_msg_size}", ) + return - # Decompress process must to be done after all packets - # received. - if compressed: - if not self._decompressobj: - self._decompressobj = ZLibDecompressor( - suppress_deflate_header=True - ) - # XXX: It's possible that the zlib backend (isal is known to - # do this, maybe others too?) will return max_length bytes, - # but internally buffer more data such that the payload is - # >max_length, so we return one extra byte and if we're able - # to do that, then the message is too big. 
-                payload_merged = self._decompressobj.decompress_sync(
-                    assembled_payload + WS_DEFLATE_TRAILING,
-                    (
-                        self._max_msg_size + 1
-                        if self._max_msg_size
-                        else self._max_msg_size
-                    ),
-                )
-                if self._max_msg_size and len(payload_merged) > self._max_msg_size:
-                    raise WebSocketError(
-                        WSCloseCode.MESSAGE_TOO_BIG,
-                        f"Decompressed message exceeds size limit {self._max_msg_size}",
-                    )
-            elif type(assembled_payload) is bytes:
-                payload_merged = assembled_payload
-            else:
-                payload_merged = bytes(assembled_payload)
-
-            if opcode == OP_CODE_TEXT:
-                try:
-                    text = payload_merged.decode("utf-8")
-                except UnicodeDecodeError as exc:
-                    raise WebSocketError(
-                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                    ) from exc
-
-                # XXX: The Text and Binary messages here can be a performance
-                # bottleneck, so we use tuple.__new__ to improve performance.
-                # This is not type safe, but many tests should fail in
-                # test_client_ws_functional.py if this is wrong.
-                self.queue.feed_data(
-                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")),
-                    len(payload_merged),
-                )
-            else:
-                self.queue.feed_data(
-                    TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")),
-                    len(payload_merged),
-                )
-        elif opcode == OP_CODE_CLOSE:
-            if len(payload) >= 2:
-                close_code = UNPACK_CLOSE_CODE(payload[:2])[0]
-                if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES:
-                    raise WebSocketError(
-                        WSCloseCode.PROTOCOL_ERROR,
-                        f"Invalid close code: {close_code}",
-                    )
-                try:
-                    close_message = payload[2:].decode("utf-8")
-                except UnicodeDecodeError as exc:
-                    raise WebSocketError(
-                        WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message"
-                    ) from exc
-                msg = TUPLE_NEW(
-                    WSMessage, (WSMsgType.CLOSE, close_code, close_message)
-                )
-            elif payload:
+        has_partial = bool(self._partial)
+        if opcode == OP_CODE_CONTINUATION:
+            if self._opcode == OP_CODE_NOT_SET:
                 raise WebSocketError(
                     WSCloseCode.PROTOCOL_ERROR,
-                    f"Invalid close frame: {fin} {opcode} {payload!r}",
+                    "Continuation frame for non started message",
                 )
-            else:
-                msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, ""))
+            opcode = self._opcode
+            self._opcode = OP_CODE_NOT_SET
+            # previous frame was not finished,
+            # so we expect a continuation opcode
+        elif has_partial:
+            raise WebSocketError(
+                WSCloseCode.PROTOCOL_ERROR,
+                "The opcode in non-fin frame is expected "
+                f"to be zero, got {opcode!r}",
+            )
 
-            self.queue.feed_data(msg, 0)
-        elif opcode == OP_CODE_PING:
-            msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, ""))
-            self.queue.feed_data(msg, len(payload))
+        assembled_payload: Union[bytes, bytearray]
+        if has_partial:
+            assembled_payload = self._partial + payload
+            self._partial.clear()
+        else:
+            assembled_payload = payload
 
-        elif opcode == OP_CODE_PONG:
-            msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, ""))
-            self.queue.feed_data(msg, len(payload))
+        if self._max_msg_size and len(assembled_payload) >= self._max_msg_size:
+            raise WebSocketError(
+                WSCloseCode.MESSAGE_TOO_BIG,
+                f"Message size {len(assembled_payload)} "
+                f"exceeds limit {self._max_msg_size}",
+            )
+
+        # Decompression must be done after all packets
+        # are received.
+        if compressed:
+            if not self._decompressobj:
+                self._decompressobj = ZLibDecompressor(suppress_deflate_header=True)
+            # XXX: It's possible that the zlib backend (isal is known to
+            # do this, maybe others too?) will return max_length bytes,
+            # but internally buffer more data such that the payload is
+            # >max_length, so we return one extra byte and if we're able
+            # to do that, then the message is too big.
+ payload_merged = self._decompressobj.decompress_sync( + assembled_payload + WS_DEFLATE_TRAILING, + ( + self._max_msg_size + 1 + if self._max_msg_size + else self._max_msg_size + ), + ) + if self._max_msg_size and len(payload_merged) > self._max_msg_size: + raise WebSocketError( + WSCloseCode.MESSAGE_TOO_BIG, + f"Decompressed message exceeds size limit {self._max_msg_size}", + ) + elif type(assembled_payload) is bytes: + payload_merged = assembled_payload + else: + payload_merged = bytes(assembled_payload) + if opcode == OP_CODE_TEXT: + try: + text = payload_merged.decode("utf-8") + except UnicodeDecodeError as exc: + raise WebSocketError( + WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message" + ) from exc + + # XXX: The Text and Binary messages here can be a performance + # bottleneck, so we use tuple.__new__ to improve performance. + # This is not type safe, but many tests should fail in + # test_client_ws_functional.py if this is wrong. + self.queue.feed_data( + TUPLE_NEW(WSMessage, (WS_MSG_TYPE_TEXT, text, "")), + len(payload_merged), + ) else: + self.queue.feed_data( + TUPLE_NEW(WSMessage, (WS_MSG_TYPE_BINARY, payload_merged, "")), + len(payload_merged), + ) + elif opcode == OP_CODE_CLOSE: + if len(payload) >= 2: + close_code = UNPACK_CLOSE_CODE(payload[:2])[0] + if close_code < 3000 and close_code not in ALLOWED_CLOSE_CODES: + raise WebSocketError( + WSCloseCode.PROTOCOL_ERROR, + f"Invalid close code: {close_code}", + ) + try: + close_message = payload[2:].decode("utf-8") + except UnicodeDecodeError as exc: + raise WebSocketError( + WSCloseCode.INVALID_TEXT, "Invalid UTF-8 text message" + ) from exc + msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, close_code, close_message)) + elif payload: raise WebSocketError( - WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}" + WSCloseCode.PROTOCOL_ERROR, + f"Invalid close frame: {fin} {opcode} {payload!r}", ) + else: + msg = TUPLE_NEW(WSMessage, (WSMsgType.CLOSE, 0, "")) + + self.queue.feed_data(msg, 0) + elif opcode == OP_CODE_PING: + msg = TUPLE_NEW(WSMessage, (WSMsgType.PING, payload, "")) + self.queue.feed_data(msg, len(payload)) + elif opcode == OP_CODE_PONG: + msg = TUPLE_NEW(WSMessage, (WSMsgType.PONG, payload, "")) + self.queue.feed_data(msg, len(payload)) + else: + raise WebSocketError( + WSCloseCode.PROTOCOL_ERROR, f"Unexpected opcode={opcode!r}" + ) - def parse_frame( - self, buf: bytes - ) -> List[Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]]]: + def _feed_data(self, data: bytes) -> None: """Return the next frame from the socket.""" - frames: List[ - Tuple[bool, Optional[int], Union[bytes, bytearray], Optional[bool]] - ] = [] if self._tail: - buf, self._tail = self._tail + buf, b"" + data, self._tail = self._tail + data, b"" start_pos: int = 0 - buf_length = len(buf) - buf_cstr = buf + data_length = len(data) + data_cstr = data while True: # read header if self._state == READ_HEADER: - if buf_length - start_pos < 2: + if data_length - start_pos < 2: break - first_byte = buf_cstr[start_pos] - second_byte = buf_cstr[start_pos + 1] + first_byte = data_cstr[start_pos] + second_byte = data_cstr[start_pos + 1] start_pos += 2 fin = (first_byte >> 7) & 1 @@ -386,8 +379,8 @@ def parse_frame( # Set compress status if last package is FIN # OR set compress status if this is first fragment # Raise error if not first fragment with rsv1 = 0x1 - if self._frame_fin or self._compressed is None: - self._compressed = True if rsv1 else False + if self._frame_fin or self._compressed == COMPRESSED_NOT_SET: + 
self._compressed = COMPRESSED_TRUE if rsv1 else COMPRESSED_FALSE elif rsv1: raise WebSocketError( WSCloseCode.PROTOCOL_ERROR, @@ -404,18 +397,17 @@ def parse_frame( if self._state == READ_PAYLOAD_LENGTH: length_flag = self._payload_length_flag if length_flag == 126: - if buf_length - start_pos < 2: + if data_length - start_pos < 2: break - first_byte = buf_cstr[start_pos] - second_byte = buf_cstr[start_pos + 1] + first_byte = data_cstr[start_pos] + second_byte = data_cstr[start_pos + 1] start_pos += 2 self._payload_length = first_byte << 8 | second_byte elif length_flag > 126: - if buf_length - start_pos < 8: + if data_length - start_pos < 8: break - data = buf_cstr[start_pos : start_pos + 8] + self._payload_length = UNPACK_LEN3(data, start_pos)[0] start_pos += 8 - self._payload_length = UNPACK_LEN3(data)[0] else: self._payload_length = length_flag @@ -423,16 +415,16 @@ def parse_frame( # read payload mask if self._state == READ_PAYLOAD_MASK: - if buf_length - start_pos < 4: + if data_length - start_pos < 4: break - self._frame_mask = buf_cstr[start_pos : start_pos + 4] + self._frame_mask = data_cstr[start_pos : start_pos + 4] start_pos += 4 self._state = READ_PAYLOAD if self._state == READ_PAYLOAD: - chunk_len = buf_length - start_pos + chunk_len = data_length - start_pos if self._payload_length >= chunk_len: - end_pos = buf_length + end_pos = data_length self._payload_length -= chunk_len else: end_pos = start_pos + self._payload_length @@ -441,10 +433,10 @@ def parse_frame( if self._frame_payload_len: if type(self._frame_payload) is not bytearray: self._frame_payload = bytearray(self._frame_payload) - self._frame_payload += buf_cstr[start_pos:end_pos] + self._frame_payload += data_cstr[start_pos:end_pos] else: # Fast path for the first frame - self._frame_payload = buf_cstr[start_pos:end_pos] + self._frame_payload = data_cstr[start_pos:end_pos] self._frame_payload_len += end_pos - start_pos start_pos = end_pos @@ -458,19 +450,17 @@ def parse_frame( self._frame_payload = bytearray(self._frame_payload) websocket_mask(self._frame_mask, self._frame_payload) - frames.append( - ( - self._frame_fin, - self._frame_opcode, - self._frame_payload, - self._compressed, - ) + self._handle_frame( + self._frame_fin, + self._frame_opcode, + self._frame_payload, + self._compressed, ) self._frame_payload = b"" self._frame_payload_len = 0 self._state = READ_HEADER # XXX: Cython needs slices to be bounded, so we can't omit the slice end here. - self._tail = buf_cstr[start_pos:buf_length] if start_pos < buf_length else b"" - - return frames + self._tail = ( + data_cstr[start_pos:data_length] if start_pos < data_length else b"" + ) diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 2cac4cf6b87..04c83f19610 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -27,6 +27,25 @@ class PatchableWebSocketReader(WebSocketReader): """WebSocketReader subclass that allows for patching parse_frame.""" + def parse_frame( + self, data: bytes + ) -> list[tuple[bool, int, Union[bytes, bytearray], int]]: + # This method is overridden to allow for patching in tests. + frames: list[tuple[bool, int, Union[bytes, bytearray], int]] = [] + + def _handle_frame( + fin: bool, + opcode: int, + payload: Union[bytes, bytearray], + compressed: int, + ) -> None: + # This method is overridden to allow for patching in tests. 
+ frames.append((fin, opcode, payload, compressed)) + + with mock.patch.object(self, "_handle_frame", _handle_frame): + self._feed_data(data) + return frames + def build_frame( message, opcode, use_mask=False, noheader=False, is_fin=True, ZLibBackend=None @@ -129,32 +148,32 @@ def test_feed_data_remembers_exception(parser: WebSocketReader) -> None: assert data == b"" -def test_parse_frame(parser) -> None: +def test_parse_frame(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 0b00000001)) res = parser.parse_frame(b"1") fin, opcode, payload, compress = res[0] - assert (0, 1, b"1", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length0(parser) -> None: +def test_parse_frame_length0(parser: PatchableWebSocketReader) -> None: fin, opcode, payload, compress = parser.parse_frame( struct.pack("!BB", 0b00000001, 0b00000000) )[0] - assert (0, 1, b"", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2(parser) -> None: +def test_parse_frame_length2(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) parser.parse_frame(struct.pack("!H", 4)) res = parser.parse_frame(b"1234") fin, opcode, payload, compress = res[0] - assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1234", 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None: +def test_parse_frame_length2_multi_byte(parser: PatchableWebSocketReader) -> None: """Ensure a multi-byte length is parsed correctly.""" expected_payload = b"1" * 32768 parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) @@ -162,10 +181,12 @@ def test_parse_frame_length2_multi_byte(parser: WebSocketReader) -> None: res = parser.parse_frame(b"1" * 32768) fin, opcode, payload, compress = res[0] - assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress) + assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) -> None: +def test_parse_frame_length2_multi_byte_multi_packet( + parser: PatchableWebSocketReader, +) -> None: """Ensure a multi-byte length with multiple packets is parsed correctly.""" expected_payload = b"1" * 32768 assert parser.parse_frame(struct.pack("!BB", 0b00000001, 126)) == [] @@ -176,44 +197,53 @@ def test_parse_frame_length2_multi_byte_multi_packet(parser: WebSocketReader) -> res = parser.parse_frame(b"1" * 8192) fin, opcode, payload, compress = res[0] assert len(payload) == 32768 - assert (0, 1, expected_payload, False) == (fin, opcode, payload, not not compress) + assert (0, 1, expected_payload, 0) == (fin, opcode, payload, not not compress) -def test_parse_frame_length4(parser: WebSocketReader) -> None: +def test_parse_frame_length4(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 127)) parser.parse_frame(struct.pack("!Q", 4)) fin, opcode, payload, compress = parser.parse_frame(b"1234")[0] - assert (0, 1, b"1234", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"1234", 0) == (fin, opcode, payload, compress) -def test_parse_frame_mask(parser) -> None: +def test_parse_frame_mask(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b00000001, 0b10000001)) 
parser.parse_frame(b"0001") fin, opcode, payload, compress = parser.parse_frame(b"1")[0] - assert (0, 1, b"\x01", False) == (fin, opcode, payload, not not compress) + assert (0, 1, b"\x01", 0) == (fin, opcode, payload, compress) -def test_parse_frame_header_reversed_bits(out, parser) -> None: +def test_parse_frame_header_reversed_bits( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b01100000, 0b00000000)) raise out.exception() -def test_parse_frame_header_control_frame(out, parser) -> None: +def test_parse_frame_header_control_frame( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b00001000, 0b00000000)) raise out.exception() -def _test_parse_frame_header_new_data_err(out, parser): +@pytest.mark.xfail() +def test_parse_frame_header_new_data_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b000000000, 0b00000000)) raise out.exception() -def test_parse_frame_header_payload_size(out, parser) -> None: +def test_parse_frame_header_payload_size( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): parser.parse_frame(struct.pack("!BB", 0b10001000, 0b01111110)) raise out.exception() @@ -228,54 +258,45 @@ def test_parse_frame_header_payload_size(out, parser) -> None: ) def test_ping_frame( out: WebSocketDataQueue, - parser: WebSocketReader, + parser: PatchableWebSocketReader, data: Union[bytes, bytearray, memoryview], ) -> None: - with mock.patch.object(parser, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.PING, b"data", False)] - - parser.feed_data(data) - res = out._buffer[0] - assert res == ((WSMsgType.PING, b"data", ""), 4) - + parser._handle_frame(True, WSMsgType.PING, b"data", 0) + res = out._buffer[0] + assert res == ((WSMsgType.PING, b"data", ""), 4) -def test_pong_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.PONG, b"data", False)] - parser.feed_data(b"") +def test_pong_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: + parser._handle_frame(True, WSMsgType.PONG, b"data", 0) res = out._buffer[0] assert res == ((WSMsgType.PONG, b"data", ""), 4) -def test_close_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"", False)] - - parser.feed_data(b"") +def test_close_frame(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: + parser._handle_frame(True, WSMsgType.CLOSE, b"", 0) res = out._buffer[0] assert res == ((WSMsgType.CLOSE, 0, ""), 0) -def test_close_frame_info(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"0112345", False)] - - parser.feed_data(b"") +def test_close_frame_info( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(True, WSMsgType.CLOSE, b"0112345", 0) res = out._buffer[0] assert res == (WSMessage(WSMsgType.CLOSE, 12337, "12345"), 0) -def test_close_frame_invalid(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CLOSE, b"1", False)] - parser.feed_data(b"") - - assert isinstance(out.exception(), WebSocketError) - assert out.exception().code == 
WSCloseCode.PROTOCOL_ERROR +def test_close_frame_invalid( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + with pytest.raises(WebSocketError) as ctx: + parser._handle_frame(True, WSMsgType.CLOSE, b"1", 0) + assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR -def test_close_frame_invalid_2(out, parser) -> None: +def test_close_frame_invalid_2( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data = build_close_frame(code=1) with pytest.raises(WebSocketError) as ctx: @@ -284,7 +305,7 @@ def test_close_frame_invalid_2(out, parser) -> None: assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR -def test_close_frame_unicode_err(parser) -> None: +def test_close_frame_unicode_err(parser: PatchableWebSocketReader) -> None: data = build_close_frame(code=1000, message=b"\xf4\x90\x80\x80") with pytest.raises(WebSocketError) as ctx: @@ -293,23 +314,21 @@ def test_close_frame_unicode_err(parser) -> None: assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_unknown_frame(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.CONTINUATION, b"", False)] - +def test_unknown_frame( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: with pytest.raises(WebSocketError): - parser.feed_data(b"") - raise out.exception() + parser._handle_frame(True, WSMsgType.CONTINUATION, b"", 0) -def test_simple_text(out, parser) -> None: +def test_simple_text(out: WebSocketDataQueue, parser: PatchableWebSocketReader) -> None: data = build_frame(b"text", WSMsgType.TEXT) parser._feed_data(data) res = out._buffer[0] assert res == ((WSMsgType.TEXT, "text", ""), 4) -def test_simple_text_unicode_err(parser) -> None: +def test_simple_text_unicode_err(parser: PatchableWebSocketReader) -> None: data = build_frame(b"\xf4\x90\x80\x80", WSMsgType.TEXT) with pytest.raises(WebSocketError) as ctx: @@ -318,16 +337,18 @@ def test_simple_text_unicode_err(parser) -> None: assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_simple_binary(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [(1, WSMsgType.BINARY, b"binary", False)] - - parser.feed_data(b"") +def test_simple_binary( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + data = build_frame(b"binary", WSMsgType.BINARY) + parser._feed_data(data) res = out._buffer[0] assert res == ((WSMsgType.BINARY, b"binary", ""), 6) -def test_fragmentation_header(out, parser) -> None: +def test_fragmentation_header( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data = build_frame(b"a", WSMsgType.TEXT) parser._feed_data(data[:1]) parser._feed_data(data[1:]) @@ -336,7 +357,9 @@ def test_fragmentation_header(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "a", ""), 1) -def test_continuation(out, parser) -> None: +def test_continuation( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) @@ -347,14 +370,9 @@ def test_continuation(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_with_ping(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.PING, b"", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_ping( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: 
data1 = build_frame(b"line1", WSMsgType.TEXT, is_fin=False) parser._feed_data(data1) @@ -370,90 +388,78 @@ def test_continuation_with_ping(out, parser) -> None: assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_err(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (1, WSMsgType.TEXT, b"line2", False), - ] - +def test_continuation_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError): - parser._feed_data(b"") + parser._handle_frame(True, WSMsgType.TEXT, b"line2", 0) -def test_continuation_with_close(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, build_close_frame(1002, b"test", noheader=True), False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - - parser.feed_data(b"") +def test_continuation_with_close( + out: WebSocketDataQueue, parser: WebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + parser._handle_frame( + False, + WSMsgType.CLOSE, + build_close_frame(1002, b"test", noheader=True), + False, + ) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) res = out._buffer[0] - assert res, (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0) + assert res == (WSMessage(WSMsgType.CLOSE, 1002, "test"), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) -def test_continuation_with_close_unicode_err(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - ( - 0, +def test_continuation_with_close_unicode_err( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + with pytest.raises(WebSocketError) as ctx: + parser._handle_frame( + False, WSMsgType.CLOSE, build_close_frame(1000, b"\xf4\x90\x80\x80", noheader=True), - False, - ), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - - with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") - + 0, + ) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) assert ctx.value.code == WSCloseCode.INVALID_TEXT -def test_continuation_with_close_bad_code(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_close_bad_code( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") + parser._handle_frame( + False, WSMsgType.CLOSE, build_close_frame(1, b"test", noheader=True), 0 + ) assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) -def test_continuation_with_close_bad_payload(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, b"1", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] - +def test_continuation_with_close_bad_payload( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> 
None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) with pytest.raises(WebSocketError) as ctx: - parser._feed_data(b"") - - assert ctx.value.code, WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(False, WSMsgType.CLOSE, b"1", 0) + assert ctx.value.code == WSCloseCode.PROTOCOL_ERROR + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) -def test_continuation_with_close_empty(out, parser) -> None: - parser.parse_frame = mock.Mock() - parser.parse_frame.return_value = [ - (0, WSMsgType.TEXT, b"line1", False), - (0, WSMsgType.CLOSE, b"", False), - (1, WSMsgType.CONTINUATION, b"line2", False), - ] +def test_continuation_with_close_empty( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + parser._handle_frame(False, WSMsgType.TEXT, b"line1", 0) + parser._handle_frame(False, WSMsgType.CLOSE, b"", 0) + parser._handle_frame(True, WSMsgType.CONTINUATION, b"line2", 0) - parser.feed_data(b"") res = out._buffer[0] - assert res, (WSMessage(WSMsgType.CLOSE, 0, ""), 0) + assert res == (WSMessage(WSMsgType.CLOSE, 0, ""), 0) res = out._buffer[1] assert res == (WSMessage(WSMsgType.TEXT, "line1line2", ""), 10) @@ -508,7 +514,7 @@ def test_msgtype_aliases() -> None: assert aiohttp.WSMsgType.ERROR == aiohttp.WSMsgType.error -def test_parse_compress_frame_single(parser) -> None: +def test_parse_compress_frame_single(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001)) res = parser.parse_frame(b"1") fin, opcode, payload, compress = res[0] @@ -516,7 +522,7 @@ def test_parse_compress_frame_single(parser) -> None: assert (1, 1, b"1", True) == (fin, opcode, payload, not not compress) -def test_parse_compress_frame_multi(parser) -> None: +def test_parse_compress_frame_multi(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b01000001, 126)) parser.parse_frame(struct.pack("!H", 4)) res = parser.parse_frame(b"1234") @@ -536,7 +542,7 @@ def test_parse_compress_frame_multi(parser) -> None: assert (1, 1, b"1234", False) == (fin, opcode, payload, not not compress) -def test_parse_compress_error_frame(parser) -> None: +def test_parse_compress_error_frame(parser: PatchableWebSocketReader) -> None: parser.parse_frame(struct.pack("!BB", 0b01000001, 0b00000001)) parser.parse_frame(b"1") @@ -551,7 +557,7 @@ def test_parse_compress_error_frame(parser) -> None: async def test_parse_no_compress_frame_single( loop: asyncio.AbstractEventLoop, out: WebSocketDataQueue ) -> None: - parser_no_compress = WebSocketReader(out, 0, compress=False) + parser_no_compress = PatchableWebSocketReader(out, 0, compress=False) with pytest.raises(WebSocketError) as ctx: parser_no_compress.parse_frame(struct.pack("!BB", 0b11000001, 0b00000001)) parser_no_compress.parse_frame(b"1") @@ -603,34 +609,28 @@ def test_pickle(self) -> None: def test_flow_control_binary( protocol: BaseProtocol, out_low_limit: WebSocketDataQueue, - parser_low_limit: WebSocketReader, + parser_low_limit: PatchableWebSocketReader, ) -> None: large_payload = b"b" * (1 + 16 * 2) - large_payload_len = len(large_payload) - with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.BINARY, large_payload, False)] - - parser_low_limit.feed_data(b"") - + large_payload_size = len(large_payload) + parser_low_limit._handle_frame(True, WSMsgType.BINARY, large_payload, 0) res = out_low_limit._buffer[0] - assert res == (WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_len) + assert res == 
(WSMessage(WSMsgType.BINARY, large_payload, ""), large_payload_size) assert protocol._reading_paused is True def test_flow_control_multi_byte_text( protocol: BaseProtocol, out_low_limit: WebSocketDataQueue, - parser_low_limit: WebSocketReader, + parser_low_limit: PatchableWebSocketReader, ) -> None: large_payload_text = "𒀁" * (1 + 16 * 2) large_payload = large_payload_text.encode("utf-8") - large_payload_len = len(large_payload) - - with mock.patch.object(parser_low_limit, "parse_frame", autospec=True) as m: - m.return_value = [(1, WSMsgType.TEXT, large_payload, False)] - - parser_low_limit.feed_data(b"") - + large_payload_size = len(large_payload) + parser_low_limit._handle_frame(True, WSMsgType.TEXT, large_payload, 0) res = out_low_limit._buffer[0] - assert res == (WSMessage(WSMsgType.TEXT, large_payload_text, ""), large_payload_len) + assert res == ( + WSMessage(WSMsgType.TEXT, large_payload_text, ""), + large_payload_size, + ) assert protocol._reading_paused is True From 82648123d7c2f065cf6ee578212bb12e2a1cc5bb Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Sat, 19 Apr 2025 08:08:37 +0000 Subject: [PATCH 30/37] [PR #10744/23d3ee06 backport][3.12] Refactor WebSocket reader to avoid frequent realloc when frames are fragmented (#10748) Co-authored-by: J. Nick Koston --- CHANGES/10744.misc.rst | 1 + aiohttp/_websocket/reader_c.pxd | 25 ++++---- aiohttp/_websocket/reader_py.py | 103 ++++++++++++++++++-------------- 3 files changed, 72 insertions(+), 57 deletions(-) create mode 100644 CHANGES/10744.misc.rst diff --git a/CHANGES/10744.misc.rst b/CHANGES/10744.misc.rst new file mode 100644 index 00000000000..da0d379475d --- /dev/null +++ b/CHANGES/10744.misc.rst @@ -0,0 +1 @@ +Improved performance of the WebSocket reader with large messages -- by :user:`bdraco`. 
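The refactor below swaps the reader's grow-in-place payload buffer for a list
of fragments that is joined once per completed frame, so buffering a
fragmented frame never copies the bytes that were already received. A minimal
standalone sketch of that accumulation pattern follows; the class and method
names here are invented for illustration, and the real reader below also
tracks parser state across calls and handles masking:

    class FragmentBuffer:
        """Accumulate payload chunks; join them once when the frame completes."""

        def __init__(self) -> None:
            self._fragments: list[bytes] = []  # one entry per partial read

        def feed(self, chunk: bytes) -> None:
            # O(1) append: unlike `payload += chunk` on a bytes/bytearray,
            # nothing already buffered is copied or reallocated here.
            self._fragments.append(chunk)

        def take(self) -> bytes:
            # One allocation, sized exactly to the total payload length.
            payload = b"".join(self._fragments)
            self._fragments.clear()
            return payload

    buf = FragmentBuffer()
    for chunk in (b"par", b"tial", b" frame"):
        buf.feed(chunk)
    assert buf.take() == b"partial frame"

The diff below applies the same idea via `_payload_fragments`, deferring the
join (and, for masked frames, the bytearray conversion needed for in-place
unmasking) until the frame is complete.
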
diff --git a/aiohttp/_websocket/reader_c.pxd b/aiohttp/_websocket/reader_c.pxd index 3efebeb81dc..a7620d8e87f 100644 --- a/aiohttp/_websocket/reader_c.pxd +++ b/aiohttp/_websocket/reader_c.pxd @@ -68,14 +68,14 @@ cdef class WebSocketReader: cdef int _opcode cdef bint _frame_fin cdef int _frame_opcode - cdef object _frame_payload - cdef unsigned long long _frame_payload_len + cdef list _payload_fragments + cdef Py_ssize_t _frame_payload_len cdef bytes _tail cdef bint _has_mask cdef bytes _frame_mask - cdef unsigned long long _payload_length - cdef unsigned int _payload_length_flag + cdef Py_ssize_t _payload_bytes_to_read + cdef unsigned int _payload_len_flag cdef int _compressed cdef object _decompressobj cdef bint _compress @@ -91,17 +91,20 @@ cdef class WebSocketReader: cpdef void _handle_frame(self, bint fin, int opcode, object payload, int compressed) except * @cython.locals( - start_pos="unsigned int", - data_len="unsigned int", - length="unsigned int", - chunk_size="unsigned int", - chunk_len="unsigned int", - data_length="unsigned int", + start_pos=Py_ssize_t, + data_len=Py_ssize_t, + length=Py_ssize_t, + chunk_size=Py_ssize_t, + chunk_len=Py_ssize_t, + data_len=Py_ssize_t, data_cstr="const unsigned char *", first_byte="unsigned char", second_byte="unsigned char", - end_pos="unsigned int", + f_start_pos=Py_ssize_t, + f_end_pos=Py_ssize_t, has_mask=bint, fin=bint, + had_fragments=Py_ssize_t, + payload_bytearray=bytearray, ) cpdef void _feed_data(self, bytes data) except * diff --git a/aiohttp/_websocket/reader_py.py b/aiohttp/_websocket/reader_py.py index a8a8eb7eb01..8a775742df1 100644 --- a/aiohttp/_websocket/reader_py.py +++ b/aiohttp/_websocket/reader_py.py @@ -144,14 +144,14 @@ def __init__( self._opcode: int = OP_CODE_NOT_SET self._frame_fin = False self._frame_opcode: int = OP_CODE_NOT_SET - self._frame_payload: Union[bytes, bytearray] = b"" + self._payload_fragments: list[bytes] = [] self._frame_payload_len = 0 self._tail: bytes = b"" self._has_mask = False self._frame_mask: Optional[bytes] = None - self._payload_length = 0 - self._payload_length_flag = 0 + self._payload_bytes_to_read = 0 + self._payload_len_flag = 0 self._compressed: int = COMPRESSED_NOT_SET self._decompressobj: Optional[ZLibDecompressor] = None self._compress = compress @@ -325,13 +325,13 @@ def _feed_data(self, data: bytes) -> None: data, self._tail = self._tail + data, b"" start_pos: int = 0 - data_length = len(data) + data_len = len(data) data_cstr = data while True: # read header if self._state == READ_HEADER: - if data_length - start_pos < 2: + if data_len - start_pos < 2: break first_byte = data_cstr[start_pos] second_byte = data_cstr[start_pos + 1] @@ -390,77 +390,88 @@ def _feed_data(self, data: bytes) -> None: self._frame_fin = bool(fin) self._frame_opcode = opcode self._has_mask = bool(has_mask) - self._payload_length_flag = length + self._payload_len_flag = length self._state = READ_PAYLOAD_LENGTH # read payload length if self._state == READ_PAYLOAD_LENGTH: - length_flag = self._payload_length_flag - if length_flag == 126: - if data_length - start_pos < 2: + len_flag = self._payload_len_flag + if len_flag == 126: + if data_len - start_pos < 2: break first_byte = data_cstr[start_pos] second_byte = data_cstr[start_pos + 1] start_pos += 2 - self._payload_length = first_byte << 8 | second_byte - elif length_flag > 126: - if data_length - start_pos < 8: + self._payload_bytes_to_read = first_byte << 8 | second_byte + elif len_flag > 126: + if data_len - start_pos < 8: break - self._payload_length = 
UNPACK_LEN3(data, start_pos)[0]
+                    self._payload_bytes_to_read = UNPACK_LEN3(data, start_pos)[0]
                     start_pos += 8
                 else:
-                    self._payload_length = length_flag
+                    self._payload_bytes_to_read = len_flag
                 self._state = READ_PAYLOAD_MASK if self._has_mask else READ_PAYLOAD

             # read payload mask
             if self._state == READ_PAYLOAD_MASK:
-                if data_length - start_pos < 4:
+                if data_len - start_pos < 4:
                     break
                 self._frame_mask = data_cstr[start_pos : start_pos + 4]
                 start_pos += 4
                 self._state = READ_PAYLOAD

             if self._state == READ_PAYLOAD:
-                chunk_len = data_length - start_pos
-                if self._payload_length >= chunk_len:
-                    end_pos = data_length
-                    self._payload_length -= chunk_len
+                chunk_len = data_len - start_pos
+                if self._payload_bytes_to_read >= chunk_len:
+                    f_end_pos = data_len
+                    self._payload_bytes_to_read -= chunk_len
                 else:
-                    end_pos = start_pos + self._payload_length
-                    self._payload_length = 0
-
-                if self._frame_payload_len:
-                    if type(self._frame_payload) is not bytearray:
-                        self._frame_payload = bytearray(self._frame_payload)
-                    self._frame_payload += data_cstr[start_pos:end_pos]
-                else:
-                    # Fast path for the first frame
-                    self._frame_payload = data_cstr[start_pos:end_pos]
-
-                self._frame_payload_len += end_pos - start_pos
-                start_pos = end_pos
-
-                if self._payload_length != 0:
+                    f_end_pos = start_pos + self._payload_bytes_to_read
+                    self._payload_bytes_to_read = 0
+
+                had_fragments = self._frame_payload_len
+                self._frame_payload_len += f_end_pos - start_pos
+                f_start_pos = start_pos
+                start_pos = f_end_pos
+
+                if self._payload_bytes_to_read != 0:
+                    # If we don't have a complete frame, we need to save the
+                    # data for the next call to feed_data.
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
                     break

-                if self._has_mask:
+                payload: Union[bytes, bytearray]
+                if had_fragments:
+                    # We have to join the payload fragments to get the payload
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
+                    if self._has_mask:
+                        assert self._frame_mask is not None
+                        payload_bytearray = bytearray()
+                        payload_bytearray += b"".join(self._payload_fragments)
+                        websocket_mask(self._frame_mask, payload_bytearray)
+                        payload = payload_bytearray
+                    else:
+                        payload = b"".join(self._payload_fragments)
+                    self._payload_fragments.clear()
+                elif self._has_mask:
                     assert self._frame_mask is not None
-                    if type(self._frame_payload) is not bytearray:
-                        self._frame_payload = bytearray(self._frame_payload)
-                    websocket_mask(self._frame_mask, self._frame_payload)
+                    payload_bytearray = data_cstr[f_start_pos:f_end_pos]  # type: ignore[assignment]
+                    if type(payload_bytearray) is not bytearray:  # pragma: no branch
+                        # Cython will do the conversion for us
+                        # but we need to do it for Python and we
+                        # will always get here in Python
+                        payload_bytearray = bytearray(payload_bytearray)
+                    websocket_mask(self._frame_mask, payload_bytearray)
+                    payload = payload_bytearray
+                else:
+                    payload = data_cstr[f_start_pos:f_end_pos]

                 self._handle_frame(
-                    self._frame_fin,
-                    self._frame_opcode,
-                    self._frame_payload,
-                    self._compressed,
+                    self._frame_fin, self._frame_opcode, payload, self._compressed
                 )
-                self._frame_payload = b""
                 self._frame_payload_len = 0
                 self._state = READ_HEADER

         # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
- self._tail = ( - data_cstr[start_pos:data_length] if start_pos < data_length else b"" - ) + self._tail = data_cstr[start_pos:data_len] if start_pos < data_len else b"" From 1d00bd2c1cdb43298ac6eba0c8bbc316123a154e Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Sat, 19 Apr 2025 08:13:54 +0000 Subject: [PATCH 31/37] [PR #10744/23d3ee06 backport][3.11] Refactor WebSocket reader to avoid frequent realloc when frames are fragmented (#10747) Co-authored-by: J. Nick Koston --- CHANGES/10744.misc.rst | 1 + aiohttp/_websocket/reader_c.pxd | 25 ++++---- aiohttp/_websocket/reader_py.py | 103 ++++++++++++++++++-------------- 3 files changed, 72 insertions(+), 57 deletions(-) create mode 100644 CHANGES/10744.misc.rst diff --git a/CHANGES/10744.misc.rst b/CHANGES/10744.misc.rst new file mode 100644 index 00000000000..da0d379475d --- /dev/null +++ b/CHANGES/10744.misc.rst @@ -0,0 +1 @@ +Improved performance of the WebSocket reader with large messages -- by :user:`bdraco`. diff --git a/aiohttp/_websocket/reader_c.pxd b/aiohttp/_websocket/reader_c.pxd index 3efebeb81dc..a7620d8e87f 100644 --- a/aiohttp/_websocket/reader_c.pxd +++ b/aiohttp/_websocket/reader_c.pxd @@ -68,14 +68,14 @@ cdef class WebSocketReader: cdef int _opcode cdef bint _frame_fin cdef int _frame_opcode - cdef object _frame_payload - cdef unsigned long long _frame_payload_len + cdef list _payload_fragments + cdef Py_ssize_t _frame_payload_len cdef bytes _tail cdef bint _has_mask cdef bytes _frame_mask - cdef unsigned long long _payload_length - cdef unsigned int _payload_length_flag + cdef Py_ssize_t _payload_bytes_to_read + cdef unsigned int _payload_len_flag cdef int _compressed cdef object _decompressobj cdef bint _compress @@ -91,17 +91,20 @@ cdef class WebSocketReader: cpdef void _handle_frame(self, bint fin, int opcode, object payload, int compressed) except * @cython.locals( - start_pos="unsigned int", - data_len="unsigned int", - length="unsigned int", - chunk_size="unsigned int", - chunk_len="unsigned int", - data_length="unsigned int", + start_pos=Py_ssize_t, + data_len=Py_ssize_t, + length=Py_ssize_t, + chunk_size=Py_ssize_t, + chunk_len=Py_ssize_t, + data_len=Py_ssize_t, data_cstr="const unsigned char *", first_byte="unsigned char", second_byte="unsigned char", - end_pos="unsigned int", + f_start_pos=Py_ssize_t, + f_end_pos=Py_ssize_t, has_mask=bint, fin=bint, + had_fragments=Py_ssize_t, + payload_bytearray=bytearray, ) cpdef void _feed_data(self, bytes data) except * diff --git a/aiohttp/_websocket/reader_py.py b/aiohttp/_websocket/reader_py.py index 5c5dbc3b0c4..2c7ae5779e2 100644 --- a/aiohttp/_websocket/reader_py.py +++ b/aiohttp/_websocket/reader_py.py @@ -144,14 +144,14 @@ def __init__( self._opcode: int = OP_CODE_NOT_SET self._frame_fin = False self._frame_opcode: int = OP_CODE_NOT_SET - self._frame_payload: Union[bytes, bytearray] = b"" + self._payload_fragments: list[bytes] = [] self._frame_payload_len = 0 self._tail: bytes = b"" self._has_mask = False self._frame_mask: Optional[bytes] = None - self._payload_length = 0 - self._payload_length_flag = 0 + self._payload_bytes_to_read = 0 + self._payload_len_flag = 0 self._compressed: int = COMPRESSED_NOT_SET self._decompressobj: Optional[ZLibDecompressor] = None self._compress = compress @@ -317,13 +317,13 @@ def _feed_data(self, data: bytes) -> None: data, self._tail = self._tail + data, b"" start_pos: int = 0 - data_length = len(data) + data_len = len(data) data_cstr = data while True: # read header if 
self._state == READ_HEADER: - if data_length - start_pos < 2: + if data_len - start_pos < 2: break first_byte = data_cstr[start_pos] second_byte = data_cstr[start_pos + 1] @@ -382,77 +382,88 @@ def _feed_data(self, data: bytes) -> None: self._frame_fin = bool(fin) self._frame_opcode = opcode self._has_mask = bool(has_mask) - self._payload_length_flag = length + self._payload_len_flag = length self._state = READ_PAYLOAD_LENGTH # read payload length if self._state == READ_PAYLOAD_LENGTH: - length_flag = self._payload_length_flag - if length_flag == 126: - if data_length - start_pos < 2: + len_flag = self._payload_len_flag + if len_flag == 126: + if data_len - start_pos < 2: break first_byte = data_cstr[start_pos] second_byte = data_cstr[start_pos + 1] start_pos += 2 - self._payload_length = first_byte << 8 | second_byte - elif length_flag > 126: - if data_length - start_pos < 8: + self._payload_bytes_to_read = first_byte << 8 | second_byte + elif len_flag > 126: + if data_len - start_pos < 8: break - self._payload_length = UNPACK_LEN3(data, start_pos)[0] + self._payload_bytes_to_read = UNPACK_LEN3(data, start_pos)[0] start_pos += 8 else: - self._payload_length = length_flag + self._payload_bytes_to_read = len_flag self._state = READ_PAYLOAD_MASK if self._has_mask else READ_PAYLOAD # read payload mask if self._state == READ_PAYLOAD_MASK: - if data_length - start_pos < 4: + if data_len - start_pos < 4: break self._frame_mask = data_cstr[start_pos : start_pos + 4] start_pos += 4 self._state = READ_PAYLOAD if self._state == READ_PAYLOAD: - chunk_len = data_length - start_pos - if self._payload_length >= chunk_len: - end_pos = data_length - self._payload_length -= chunk_len + chunk_len = data_len - start_pos + if self._payload_bytes_to_read >= chunk_len: + f_end_pos = data_len + self._payload_bytes_to_read -= chunk_len else: - end_pos = start_pos + self._payload_length - self._payload_length = 0 - - if self._frame_payload_len: - if type(self._frame_payload) is not bytearray: - self._frame_payload = bytearray(self._frame_payload) - self._frame_payload += data_cstr[start_pos:end_pos] - else: - # Fast path for the first frame - self._frame_payload = data_cstr[start_pos:end_pos] - - self._frame_payload_len += end_pos - start_pos - start_pos = end_pos - - if self._payload_length != 0: + f_end_pos = start_pos + self._payload_bytes_to_read + self._payload_bytes_to_read = 0 + + had_fragments = self._frame_payload_len + self._frame_payload_len += f_end_pos - start_pos + f_start_pos = start_pos + start_pos = f_end_pos + + if self._payload_bytes_to_read != 0: + # If we don't have a complete frame, we need to save the + # data for the next call to feed_data. 
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
                     break

-                if self._has_mask:
+                payload: Union[bytes, bytearray]
+                if had_fragments:
+                    # We have to join the payload fragments to get the payload
+                    self._payload_fragments.append(data_cstr[f_start_pos:f_end_pos])
+                    if self._has_mask:
+                        assert self._frame_mask is not None
+                        payload_bytearray = bytearray()
+                        payload_bytearray += b"".join(self._payload_fragments)
+                        websocket_mask(self._frame_mask, payload_bytearray)
+                        payload = payload_bytearray
+                    else:
+                        payload = b"".join(self._payload_fragments)
+                    self._payload_fragments.clear()
+                elif self._has_mask:
                     assert self._frame_mask is not None
-                    if type(self._frame_payload) is not bytearray:
-                        self._frame_payload = bytearray(self._frame_payload)
-                    websocket_mask(self._frame_mask, self._frame_payload)
+                    payload_bytearray = data_cstr[f_start_pos:f_end_pos]  # type: ignore[assignment]
+                    if type(payload_bytearray) is not bytearray:  # pragma: no branch
+                        # Cython will do the conversion for us
+                        # but we need to do it for Python and we
+                        # will always get here in Python
+                        payload_bytearray = bytearray(payload_bytearray)
+                    websocket_mask(self._frame_mask, payload_bytearray)
+                    payload = payload_bytearray
+                else:
+                    payload = data_cstr[f_start_pos:f_end_pos]

                 self._handle_frame(
-                    self._frame_fin,
-                    self._frame_opcode,
-                    self._frame_payload,
-                    self._compressed,
+                    self._frame_fin, self._frame_opcode, payload, self._compressed
                 )
-                self._frame_payload = b""
                 self._frame_payload_len = 0
                 self._state = READ_HEADER

         # XXX: Cython needs slices to be bounded, so we can't omit the slice end here.
-        self._tail = (
-            data_cstr[start_pos:data_length] if start_pos < data_length else b""
-        )
+        self._tail = data_cstr[start_pos:data_len] if start_pos < data_len else b""

From 8b9888dea1a1306d23e9ad70cd87c851a37f1ed7 Mon Sep 17 00:00:00 2001
From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com>
Date: Sat, 19 Apr 2025 09:33:09 +0000
Subject: [PATCH 32/37] [PR #10749/d702fb30 backport][3.11] Add compressed
 binary WebSocket roundtrip benchmark (#10750)

Co-authored-by: J.
Nick Koston --- tests/test_benchmarks_client_ws.py | 64 ++++++++++++++++++++++++++++++ 1 file changed, 64 insertions(+) diff --git a/tests/test_benchmarks_client_ws.py b/tests/test_benchmarks_client_ws.py index c244d33f6bd..044c1c1eb6d 100644 --- a/tests/test_benchmarks_client_ws.py +++ b/tests/test_benchmarks_client_ws.py @@ -105,3 +105,67 @@ async def run_websocket_benchmark() -> None: @benchmark def _run() -> None: loop.run_until_complete(run_websocket_benchmark()) + + +def test_client_send_large_websocket_compressed_messages( + loop: asyncio.AbstractEventLoop, + aiohttp_client: AiohttpClient, + benchmark: BenchmarkFixture, +) -> None: + """Benchmark send of compressed WebSocket binary messages.""" + message_count = 10 + raw_message = b"x" * 2**19 # 512 KiB + + async def handler(request: web.Request) -> web.WebSocketResponse: + ws = web.WebSocketResponse() + await ws.prepare(request) + for _ in range(message_count): + await ws.receive() + await ws.close() + return ws + + app = web.Application() + app.router.add_route("GET", "/", handler) + + async def run_websocket_benchmark() -> None: + client = await aiohttp_client(app) + resp = await client.ws_connect("/", compress=15) + for _ in range(message_count): + await resp.send_bytes(raw_message) + await resp.close() + + @benchmark + def _run() -> None: + loop.run_until_complete(run_websocket_benchmark()) + + +def test_client_receive_large_websocket_compressed_messages( + loop: asyncio.AbstractEventLoop, + aiohttp_client: AiohttpClient, + benchmark: BenchmarkFixture, +) -> None: + """Benchmark receive of compressed WebSocket binary messages.""" + message_count = 10 + raw_message = b"x" * 2**19 # 512 KiB + + async def handler(request: web.Request) -> web.WebSocketResponse: + ws = web.WebSocketResponse() + await ws.prepare(request) + for _ in range(message_count): + await ws.send_bytes(raw_message) + await ws.close() + return ws + + app = web.Application() + app.router.add_route("GET", "/", handler) + + async def run_websocket_benchmark() -> None: + client = await aiohttp_client(app) + resp = await client.ws_connect("/", compress=15) + for _ in range(message_count): + await resp.receive() + await resp.close() + + @benchmark + def _run() -> None: + loop.run_until_complete(run_websocket_benchmark()) From 03d17e5ff8decabaddcf7839ea2873a076e47b88 Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Sat, 19 Apr 2025 09:37:49 +0000 Subject: [PATCH 33/37] [PR #10749/d702fb30 backport][3.12] Add compressed binary WebSocket roundtrip benchmark (#10751) Co-authored-by: J. 
Nick Koston --- tests/test_benchmarks_client_ws.py | 66 ++++++++++++++++++++++++++++++ 1 file changed, 66 insertions(+) diff --git a/tests/test_benchmarks_client_ws.py b/tests/test_benchmarks_client_ws.py index c244d33f6bd..0338b52fb9d 100644 --- a/tests/test_benchmarks_client_ws.py +++ b/tests/test_benchmarks_client_ws.py @@ -105,3 +105,69 @@ async def run_websocket_benchmark() -> None: @benchmark def _run() -> None: loop.run_until_complete(run_websocket_benchmark()) + + +@pytest.mark.usefixtures("parametrize_zlib_backend") +def test_client_send_large_websocket_compressed_messages( + loop: asyncio.AbstractEventLoop, + aiohttp_client: AiohttpClient, + benchmark: BenchmarkFixture, +) -> None: + """Benchmark send of compressed WebSocket binary messages.""" + message_count = 10 + raw_message = b"x" * 2**19 # 512 KiB + + async def handler(request: web.Request) -> web.WebSocketResponse: + ws = web.WebSocketResponse() + await ws.prepare(request) + for _ in range(message_count): + await ws.receive() + await ws.close() + return ws + + app = web.Application() + app.router.add_route("GET", "/", handler) + + async def run_websocket_benchmark() -> None: + client = await aiohttp_client(app) + resp = await client.ws_connect("/", compress=15) + for _ in range(message_count): + await resp.send_bytes(raw_message) + await resp.close() + + @benchmark + def _run() -> None: + loop.run_until_complete(run_websocket_benchmark()) + + +@pytest.mark.usefixtures("parametrize_zlib_backend") +def test_client_receive_large_websocket_compressed_messages( + loop: asyncio.AbstractEventLoop, + aiohttp_client: AiohttpClient, + benchmark: BenchmarkFixture, +) -> None: + """Benchmark receive of compressed WebSocket binary messages.""" + message_count = 10 + raw_message = b"x" * 2**19 # 512 KiB + + async def handler(request: web.Request) -> web.WebSocketResponse: + ws = web.WebSocketResponse() + await ws.prepare(request) + for _ in range(message_count): + await ws.send_bytes(raw_message) + await ws.close() + return ws + + app = web.Application() + app.router.add_route("GET", "/", handler) + + async def run_websocket_benchmark() -> None: + client = await aiohttp_client(app) + resp = await client.ws_connect("/", compress=15) + for _ in range(message_count): + await resp.receive() + await resp.close() + + @benchmark + def _run() -> None: + loop.run_until_complete(run_websocket_benchmark()) From 07590cd2c8899d9739116d298d89a9c18d2c4d99 Mon Sep 17 00:00:00 2001 From: "J. 
Nick Koston" Date: Sat, 19 Apr 2025 08:34:50 -1000 Subject: [PATCH 34/37] Add a test to the WebSocket parser for sending one byte at a time (#10752) --- tests/test_websocket_parser.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 41da6b4e16e..6199abae359 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -330,6 +330,17 @@ def test_simple_binary( assert res == WSMessageBinary(data=b"binary", size=6, extra="") +def test_one_byte_at_a_time( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + """Send one byte at a time to the parser.""" + data = build_frame(b"binary", WSMsgType.BINARY) + for i in range(len(data)): + parser._feed_data(data[i : i + 1]) + res = out._buffer[0] + assert res == WSMessageBinary(data=b"binary", size=6, extra="") + + def test_fragmentation_header( out: WebSocketDataQueue, parser: PatchableWebSocketReader ) -> None: From d8ad35fd2ebe8e1a7d4740bb830b4baa49a1f96b Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Sat, 19 Apr 2025 09:00:54 -1000 Subject: [PATCH 35/37] [PR #10752/07590cd2 backport][3.12] Add a test to the WebSocket parser for sending one byte at a time (#10755) Co-authored-by: J. Nick Koston --- tests/test_websocket_parser.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 04c83f19610..52c34454886 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -346,6 +346,17 @@ def test_simple_binary( assert res == ((WSMsgType.BINARY, b"binary", ""), 6) +def test_one_byte_at_a_time( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + """Send one byte at a time to the parser.""" + data = build_frame(b"binary", WSMsgType.BINARY) + for i in range(len(data)): + parser._feed_data(data[i : i + 1]) + res = out._buffer[0] + assert res == ((WSMsgType.BINARY, b"binary", ""), 6) + + def test_fragmentation_header( out: WebSocketDataQueue, parser: PatchableWebSocketReader ) -> None: From 0615314569fc1da56a03ebed8b5586acb6e4c4df Mon Sep 17 00:00:00 2001 From: "patchback[bot]" <45432694+patchback[bot]@users.noreply.github.com> Date: Sat, 19 Apr 2025 19:07:09 +0000 Subject: [PATCH 36/37] [PR #10752/07590cd2 backport][3.11] Add a test to the WebSocket parser for sending one byte at a time (#10754) Co-authored-by: J. Nick Koston --- tests/test_websocket_parser.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/tests/test_websocket_parser.py b/tests/test_websocket_parser.py index 8a65ac11d50..fc4888df5e5 100644 --- a/tests/test_websocket_parser.py +++ b/tests/test_websocket_parser.py @@ -344,6 +344,17 @@ def test_simple_binary( assert res == ((WSMsgType.BINARY, b"binary", ""), 6) +def test_one_byte_at_a_time( + out: WebSocketDataQueue, parser: PatchableWebSocketReader +) -> None: + """Send one byte at a time to the parser.""" + data = build_frame(b"binary", WSMsgType.BINARY) + for i in range(len(data)): + parser._feed_data(data[i : i + 1]) + res = out._buffer[0] + assert res == ((WSMsgType.BINARY, b"binary", ""), 6) + + def test_fragmentation_header( out: WebSocketDataQueue, parser: PatchableWebSocketReader ) -> None: From 11be7e2ff8775e0ee96c027475474cdef8ec3e37 Mon Sep 17 00:00:00 2001 From: "J. 
Nick Koston" Date: Sat, 19 Apr 2025 10:53:32 -1000 Subject: [PATCH 37/37] Release 3.11.17 (#10756) --- CHANGES.rst | 42 ++++++++++++++++++++++++++++++++++++++++++ CHANGES/10713.misc.rst | 1 - CHANGES/10714.misc.rst | 1 - CHANGES/10740.misc.rst | 1 - CHANGES/10744.misc.rst | 1 - aiohttp/__init__.py | 2 +- 6 files changed, 43 insertions(+), 5 deletions(-) delete mode 100644 CHANGES/10713.misc.rst delete mode 100644 CHANGES/10714.misc.rst delete mode 100644 CHANGES/10740.misc.rst delete mode 100644 CHANGES/10744.misc.rst diff --git a/CHANGES.rst b/CHANGES.rst index 00d728e775d..3b62b221e4a 100644 --- a/CHANGES.rst +++ b/CHANGES.rst @@ -10,6 +10,48 @@ .. towncrier release notes start +3.11.17 (2025-04-19) +==================== + +Miscellaneous internal changes +------------------------------ + +- Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:`bdraco`. + + + *Related issues and pull requests on GitHub:* + :issue:`10713`. + + + +- Improved web server performance when connection can be reused -- by :user:`bdraco`. + + + *Related issues and pull requests on GitHub:* + :issue:`10714`. + + + +- Improved performance of the WebSocket reader -- by :user:`bdraco`. + + + *Related issues and pull requests on GitHub:* + :issue:`10740`. + + + +- Improved performance of the WebSocket reader with large messages -- by :user:`bdraco`. + + + *Related issues and pull requests on GitHub:* + :issue:`10744`. + + + + +---- + + 3.11.16 (2025-04-01) ==================== diff --git a/CHANGES/10713.misc.rst b/CHANGES/10713.misc.rst deleted file mode 100644 index a556d11e1e0..00000000000 --- a/CHANGES/10713.misc.rst +++ /dev/null @@ -1 +0,0 @@ -Optimized web server performance when access logging is disabled by reducing time syscalls -- by :user:`bdraco`. diff --git a/CHANGES/10714.misc.rst b/CHANGES/10714.misc.rst deleted file mode 100644 index a36a80872f5..00000000000 --- a/CHANGES/10714.misc.rst +++ /dev/null @@ -1 +0,0 @@ -Improved web server performance when connection can be reused -- by :user:`bdraco`. diff --git a/CHANGES/10740.misc.rst b/CHANGES/10740.misc.rst deleted file mode 100644 index 34ed19aebba..00000000000 --- a/CHANGES/10740.misc.rst +++ /dev/null @@ -1 +0,0 @@ -Improved performance of the WebSocket reader -- by :user:`bdraco`. diff --git a/CHANGES/10744.misc.rst b/CHANGES/10744.misc.rst deleted file mode 100644 index da0d379475d..00000000000 --- a/CHANGES/10744.misc.rst +++ /dev/null @@ -1 +0,0 @@ -Improved performance of the WebSocket reader with large messages -- by :user:`bdraco`. diff --git a/aiohttp/__init__.py b/aiohttp/__init__.py index 8a3d34a4f87..ab1d5bdedcc 100644 --- a/aiohttp/__init__.py +++ b/aiohttp/__init__.py @@ -1,4 +1,4 @@ -__version__ = "3.11.17.dev0" +__version__ = "3.11.17" from typing import TYPE_CHECKING, Tuple
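
For reference when reading the parser tests above: the two bytes built with
`struct.pack("!BB", ...)` form the WebSocket frame header. A small decoder
sketch, illustrative only and not aiohttp's parser, showing how the bits used
throughout the tests break down:

    import struct

    def decode_header(first_byte: int, second_byte: int) -> dict:
        # First byte: FIN flag, three reserved bits (RSV1 doubles as the
        # permessage-deflate "compressed" bit), then a 4-bit opcode.
        # Second byte: MASK flag plus a 7-bit length field, where 126 and
        # 127 signal 2- and 8-byte extended lengths (RFC 6455 section 5.2).
        return {
            "fin": (first_byte >> 7) & 1,
            "rsv1": (first_byte >> 6) & 1,
            "opcode": first_byte & 0xF,
            "masked": (second_byte >> 7) & 1,
            "length_flag": second_byte & 0x7F,
        }

    # 0b00000001, 0b00000001: non-final TEXT frame, unmasked, 1-byte payload
    header = struct.pack("!BB", 0b00000001, 0b00000001)
    assert decode_header(header[0], header[1]) == {
        "fin": 0, "rsv1": 0, "opcode": 1, "masked": 0, "length_flag": 1,
    }

This is why the compression tests use 0b11000001 (FIN and RSV1 set, TEXT
opcode) and why the length-126 and length-127 tests follow the header with a
`!H` or `!Q` extended-length field.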
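
A note on the masked-frame path in the reader changes above: RFC 6455 requires
client-to-server frames to be XOR-masked with a 4-byte key, and unmasking
mutates the buffer, so the payload must be a bytearray (bytes are immutable).
The pure-Python sketch below shows that masking step for clarity; aiohttp's
actual `websocket_mask` is an optimized implementation, and the helper name
here is invented:

    def xor_mask_inplace(mask: bytes, payload: bytearray) -> None:
        """Apply the repeating 4-byte XOR mask in place (RFC 6455 section 5.3)."""
        assert len(mask) == 4
        for i in range(len(payload)):
            payload[i] ^= mask[i & 3]

    fragments = [b"hel", b"lo"]
    # bytearray().join(...) returns a new object rather than mutating the
    # receiver, so build the mutable buffer from the joined bytes instead:
    buf = bytearray(b"".join(fragments))
    xor_mask_inplace(b"\x01\x02\x03\x04", buf)
    xor_mask_inplace(b"\x01\x02\x03\x04", buf)  # XOR twice restores the input
    assert bytes(buf) == b"hello"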