Compare commits

...

29 Commits

Author SHA1 Message Date
Erik Johnston
c836cb988e Better event stream typing 2024-01-15 21:27:03 +00:00
dependabot[bot]
69637f8bac Bump service-identity from 23.1.0 to 24.1.0 (#16816) 2024-01-15 13:54:38 +00:00
dependabot[bot]
2ce91cf26f Bump typing-extensions from 4.8.0 to 4.9.0 (#16815) 2024-01-15 13:54:23 +00:00
dependabot[bot]
d895a64f19 Bump lxml from 4.9.3 to 5.1.0 (#16813) 2024-01-15 13:53:54 +00:00
dependabot[bot]
a83a270069 Bump immutabledict from 4.0.0 to 4.1.0 (#16812) 2024-01-15 13:53:33 +00:00
Erik Johnston
c6e0d845d3 Fix building of deps after bump of pillow version (#16817)
Broke in https://github.com/element-hq/synapse/pull/16802
2024-01-15 13:51:48 +00:00
Erik Johnston
4e4a0f79b9 Update license in Debian metadata (#16807) 2024-01-11 16:25:16 +00:00
Erik Johnston
c43f751013 Optimize query for fetching to-device messages in /sync (#16805)
The current query supports passing in a list of users, which generates a
query using `user_id = ANY(..)`. This generates a less efficient
query plan that is notably slower than a simple `user_id = ?` condition.

Note: The new function is mostly a copy and paste and then a
simplification of the existing function.
2024-01-11 13:37:57 +00:00
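
A minimal sketch of the query-shape difference this commit describes, in Python with assumed table and column names (not Synapse's actual schema or code):

```python
# Sketch only: `device_inbox`, `device_id` and `message_json` are assumed
# names; the point is the shape of the WHERE clause.
def build_to_device_query(user_ids: list[str]) -> tuple[str, list[str]]:
    if len(user_ids) == 1:
        # Fast path: a plain equality lets Postgres pick a simple,
        # efficient index lookup.
        sql = "SELECT device_id, message_json FROM device_inbox WHERE user_id = ?"
    else:
        # General path: ANY(..) accepts a list but produces a noticeably
        # slower query plan.
        sql = "SELECT device_id, message_json FROM device_inbox WHERE user_id = ANY(?)"
    return sql, user_ids
```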
Erik Johnston
b11f7b5122 Improve DB performance of calculating badge counts for push. (#16756)
The crux of the change is to try and make the queries simpler and pull
out fewer rows. Before, there were quite a few joins against subqueries,
which caused postgres to pull out more rows than necessary.

Instead, let's simplify the query and do some of the filtering out in
Python instead, letting Postgres do better optimizations now that it
doesn't have to deal with joins against subqueries.

Review note: this is a complete rewrite of the function, so not sure how
useful the diff is.

---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-01-11 11:52:13 +00:00
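
An illustrative sketch of the trade-off described above, with made-up rows rather than the real schema: pull candidate rows out with a simple query, then do the remaining filtering in Python:

```python
# Rows as a simplified, join-free query might return them:
# (room_id, notif_count). The data is invented for illustration.
rows = [
    ("!a:example.org", 3),
    ("!a:example.org", 0),
    ("!b:example.org", 2),
]

# Filtering that previously lived in joins against subqueries now
# happens in Python, where it is cheap.
badge_count = sum(count for _room_id, count in rows if count > 0)
print(badge_count)  # 5
```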
dependabot[bot]
79a88b5fc9 Bump pillow from 10.1.0 to 10.2.0 (#16802) 2024-01-11 11:25:23 +00:00
dependabot[bot]
791c282349 Bump actions/upload-artifact from 3 to 4 (#16796) 2024-01-11 10:16:49 +00:00
dependabot[bot]
f1e6b9717e Bump actions/download-artifact from 3 to 4 (#16795) 2024-01-11 10:16:01 +00:00
dependabot[bot]
b309a4ecdf Bump dawidd6/action-download-artifact from 2.28.0 to 3.0.0 (#16794) 2024-01-11 10:15:31 +00:00
Erik Johnston
a986f86c82 Correctly handle OIDC config with no client_secret set (#16806)
In previous versions of authlib using `client_secret_basic` without a
`client_secret` would result in an invalid auth header. Since authlib
1.3 it throws an exception.

The configuration may be accepted by very lax servers, so we don't
want to deny it outright. Instead, let's default the
`client_auth_method` to `none`, which does the right thing. If the
config specifies `client_auth_method` and no `client_secret` then that
is going to be bogus and we should reject it.
2024-01-10 17:16:49 +00:00
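
A minimal sketch of the rule this commit describes, with assumed function and key names (not the actual Synapse config code):

```python
# Hypothetical helper mirroring the described behaviour.
def resolve_client_auth_method(config: dict) -> str:
    secret = config.get("client_secret")
    method = config.get("client_auth_method")
    if method is None:
        # Nothing specified: default based on whether a secret exists.
        return "client_secret_basic" if secret else "none"
    if method != "none" and not secret:
        # An explicit method that needs a secret, with no secret: bogus.
        raise ValueError(f"client_auth_method={method!r} requires a client_secret")
    return method
```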
Erik Johnston
cbe8a80d10 Faster load recents for sync (#16783)
This hopefully reduces the amount of state we need to keep in memory.
2024-01-10 15:11:59 +00:00
dependabot[bot]
c9ac102668 Bump types-commonmark from 0.9.2.4 to 0.9.2.20240106 (#16797)
Bumps [types-commonmark](https://github.com/python/typeshed) from 0.9.2.4 to 0.9.2.20240106.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 15:06:35 +00:00
dependabot[bot]
b715799bb5 Bump pyo3 from 0.20.0 to 0.20.2 (#16791)
Bumps [pyo3](https://github.com/pyo3/pyo3) from 0.20.0 to 0.20.2.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 15:06:18 +00:00
Erik Johnston
0a96fa52a2 Pull less state out if we fail to backfill (#16788)
Sometimes we fail to fetch events during backfill due to missing state,
and we often end up querying the same bad events periodically (as people
backpaginate). In such cases it's likely that we will continue to fail to
get the state, so we should check for these known-bad events *before*
loading the state we have from the DB (otherwise that DB work and
memory is wasted).

---------

Co-authored-by: reivilibre <oliverw@matrix.org>
2024-01-10 14:42:13 +00:00
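
A sketch of the reordering, under assumed names: the cheap "have we failed on this event before?" check now runs before the expensive state load:

```python
# Assumed names throughout; this mirrors only the order of operations.
async def try_backfill_event(event_id: str, store, failed_event_ids: set[str]):
    if event_id in failed_event_ids:
        # Known-bad event: bail out before the state load below, which
        # would otherwise be wasted DB work and memory.
        return None
    # Only pay for pulling stored state once the cheap check has passed.
    return await store.get_state_for_event(event_id)
```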
dependabot[bot]
1485cfd0f2 Bump anyhow from 1.0.75 to 1.0.79 (#16789)
Bumps [anyhow](https://github.com/dtolnay/anyhow) from 1.0.75 to 1.0.79.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 14:38:57 +00:00
dependabot[bot]
5d2e606076 Bump sentry-sdk from 1.35.0 to 1.39.1 (#16799)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from 1.35.0 to 1.39.1.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 14:38:21 +00:00
dependabot[bot]
51096b62d9 Bump serde_json from 1.0.108 to 1.0.111 (#16792)
Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.108 to 1.0.111.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 14:37:53 +00:00
Erik Johnston
578c5c736e Reduce amount of state pulled out when querying federation hierarchy (#16785)
There are two changes here:

1. Only pull out the required state when handling the request.
2. Change the get-filtered-state return type to check that we're only
returning state that was requested

---------

Co-authored-by: reivilibre <oliverw@matrix.org>
2024-01-10 14:31:35 +00:00
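
A sketch of the second point, with hypothetical types: keying the result by exactly the requested state means callers cannot silently rely on state they didn't ask for:

```python
StateKey = tuple[str, str]  # (event type, state_key)

def get_filtered_state(
    state: dict[StateKey, str], requested: set[StateKey]
) -> dict[StateKey, str]:
    # Anything not requested is simply absent from the result, so
    # depending on it fails loudly rather than silently.
    return {key: state[key] for key in requested if key in state}

room_state = {("m.room.join_rules", ""): "$join_rules_event"}
print(get_filtered_state(room_state, {("m.room.join_rules", "")}))
```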
Erik Johnston
4c67f0391b Split up deleting devices into batches (#16766)
Otherwise for users with large numbers of devices this can cause a lot
of woe.
2024-01-10 13:55:16 +00:00
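
A generic batching sketch; the batch size and the storage method are assumptions, not Synapse's real values:

```python
from itertools import islice
from typing import Iterable, Iterator

def batched(items: Iterable[str], size: int = 100) -> Iterator[list[str]]:
    it = iter(items)
    while chunk := list(islice(it, size)):
        yield chunk

async def delete_devices_in_batches(store, user_id: str, device_ids: list[str]) -> None:
    # Many moderate queries instead of one enormous one.
    for chunk in batched(device_ids):
        await store.delete_devices(user_id, chunk)  # assumed storage method
```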
Erik Johnston
72e9b74bbf Fix auto-merge CI to correctly wait for linting. (#16781)
Otherwise if you hit the `Enable auto-merge` button and the linting
fails, the PR is still auto-merged.
2024-01-10 13:53:44 +00:00
Erik Johnston
8189942a1f Remove CI check for sign off (#16776)
Since we don't require one anymore.
2024-01-10 13:53:20 +00:00
Andrew Morgan
13e3740f70 Add a link to the Request log format page from Logging Sample Config (#16778) 2024-01-10 13:34:55 +00:00
dependabot[bot]
b96ce9229d Bump types-jsonschema from 4.20.0.0 to 4.20.0.20240105 (#16800)
Bumps [types-jsonschema](https://github.com/python/typeshed) from 4.20.0.0 to 4.20.0.20240105.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 13:27:45 +00:00
Erik Johnston
c3f2f0f063 Faster partial join to room with complex auth graph (#7)
Instead of persisting outliers in a bunch of batches, let's just do them
all at once.

This is fine because all `_auth_and_persist_outliers_inner` is doing is
checking the auth rules for each event, which requires the events to be
topologically sorted by the auth graph.
2024-01-10 12:29:42 +00:00
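
The precondition is easy to see with the standard library's topological sorter; a toy sketch, not the actual persistence code:

```python
from graphlib import TopologicalSorter

# Map each event to the auth events it depends on (toy data).
auth_graph = {
    "$create": [],
    "$member": ["$create"],
    "$message": ["$create", "$member"],
}

# static_order() yields dependencies first, so every event's auth events
# are checked before the event itself -- whether we persist in many small
# batches or a single big one.
print(list(TopologicalSorter(auth_graph).static_order()))
# ['$create', '$member', '$message']
```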
dependabot[bot]
a0f0fdf4d4 Bump authlib from 1.2.1 to 1.3.0 (#16801)
Bumps [authlib](https://github.com/lepture/authlib) from 1.2.1 to 1.3.0.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-10 11:30:21 +00:00
37 changed files with 700 additions and 537 deletions

View File

@@ -9,6 +9,5 @@
 - End with either a period (.) or an exclamation mark (!).
 - Start with a capital letter.
 - Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
-* [ ] Pull request includes a [sign off](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#sign-off)
 * [ ] [Code style](https://element-hq.github.io/synapse/latest/code_style.html) is correct
   (run the [linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

View File

@@ -14,7 +14,7 @@ jobs:
       # There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
       # (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
       - name: 📥 Download artifact
-        uses: dawidd6/action-download-artifact@268677152d06ba59fcec7a7f0b5d961b6ccd7e1e # v2.28.0
+        uses: dawidd6/action-download-artifact@e7466d1a7587ed14867642c2ca74b5bcc1e19a2d # v3.0.0
        with:
          workflow: docs-pr.yaml
          run_id: ${{ github.event.workflow_run.id }}

View File

@@ -39,7 +39,7 @@ jobs:
           cp book/welcome_and_overview.html book/index.html
       - name: Upload Artifact
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
           name: book
           path: book

View File

@@ -164,7 +164,7 @@ jobs:
         if: ${{ always() }}
         run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
       - name: Upload SyTest logs
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         if: ${{ always() }}
         with:
           name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})

View File

@@ -92,7 +92,7 @@ jobs:
           mv /tmp/.buildx-cache-new /tmp/.buildx-cache
       - name: Upload debs as artifacts
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
           name: debs
           path: debs/*
@@ -156,7 +156,7 @@ jobs:
           CARGO_NET_GIT_FETCH_WITH_CLI: true
           CIBW_ENVIRONMENT_PASS_LINUX: CARGO_NET_GIT_FETCH_WITH_CLI
-      - uses: actions/upload-artifact@v3
+      - uses: actions/upload-artifact@v4
         with:
           name: Wheel
           path: ./wheelhouse/*.whl
@@ -177,7 +177,7 @@ jobs:
       - name: Build sdist
         run: python -m build --sdist
-      - uses: actions/upload-artifact@v3
+      - uses: actions/upload-artifact@v4
         with:
           name: Sdist
           path: dist/*.tar.gz
@@ -194,7 +194,7 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Download all workflow run artifacts
-        uses: actions/download-artifact@v3
+        uses: actions/download-artifact@v4
       - name: Build a tarball for the debs
         run: tar -cvJf debs.tar.xz debs
       - name: Attach to release

View File

@@ -12,10 +12,6 @@ concurrency:
   cancel-in-progress: true
 jobs:
-  check-signoff:
-    if: "github.event_name == 'pull_request'"
-    uses: "matrix-org/backend-meta/.github/workflows/sign-off.yml@v2"
-
   # Job to detect what has changed so we don't run e.g. Rust checks on PRs that
   # don't modify Rust code.
   changes:
@@ -286,10 +282,26 @@ jobs:
       - check-schema-delta
       - check-lockfile
       - lint-clippy
+      - lint-clippy-nightly
       - lint-rustfmt
     runs-on: ubuntu-latest
     steps:
-      - run: "true"
+      - uses: matrix-org/done-action@v2
+        with:
+          needs: ${{ toJSON(needs) }}
+          # Various bits are skipped if there was no applicable changes.
+          skippable: |
+            check-sampleconfig
+            check-schema-delta
+            lint
+            lint-mypy
+            lint-newsfile
+            lint-pydantic
+            lint-clippy
+            lint-clippy-nightly
+            lint-rustfmt

   calculate-test-jobs:
     if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
@@ -496,7 +508,7 @@ jobs:
         if: ${{ always() }}
         run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
       - name: Upload SyTest logs
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         if: ${{ always() }}
         with:
           name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.job.*, ', ') }})
@@ -594,7 +606,7 @@ jobs:
           PGPASSWORD: postgres
           PGDATABASE: postgres
       - name: "Upload schema differences"
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         if: ${{ failure() && !cancelled() && steps.run_tester_script.outcome == 'failure' }}
         with:
           name: Schema dumps
@@ -699,6 +711,7 @@ jobs:
       - complement
       - cargo-test
       - cargo-bench
+      - linting-done
     runs-on: ubuntu-latest
     steps:
       - uses: matrix-org/done-action@v2
@@ -706,7 +719,7 @@ jobs:
           needs: ${{ toJSON(needs) }}
           # Various bits are skipped if there was no applicable changes.
-          # The newsfile and signoff lint may be skipped on non PR builds.
+          # The newsfile lint may be skipped on non PR builds.
           skippable: |
             trial
             trial-olddeps
@@ -714,7 +727,6 @@ jobs:
             portdb
             export-data
             complement
-            check-signoff
             lint-newsfile
             cargo-test
             cargo-bench

View File

@@ -136,7 +136,7 @@ jobs:
         if: ${{ always() }}
         run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
       - name: Upload SyTest logs
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         if: ${{ always() }}
         with:
           name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})

Cargo.lock generated
View File

@@ -13,9 +13,9 @@ dependencies = [
 [[package]]
 name = "anyhow"
-version = "1.0.75"
+version = "1.0.79"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a4668cab20f66d8d020e1fbc0ebe47217433c1b6c8f2040faf858554e394ace6"
+checksum = "080e9890a082662b09c1ad45f567faeeb47f22b5fb23895fbe1e651e718e25ca"

 [[package]]
 name = "arc-swap"
@@ -188,18 +188,18 @@ dependencies = [
 [[package]]
 name = "proc-macro2"
-version = "1.0.64"
+version = "1.0.76"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "78803b62cbf1f46fde80d7c0e803111524b9877184cfe7c3033659490ac7a7da"
+checksum = "95fc56cda0b5c3325f5fbbd7ff9fda9e02bb00bb3dac51252d2f1bfa1cb8cc8c"
 dependencies = [
  "unicode-ident",
 ]

 [[package]]
 name = "pyo3"
-version = "0.20.0"
+version = "0.20.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "04e8453b658fe480c3e70c8ed4e3d3ec33eb74988bd186561b0cc66b85c3bc4b"
+checksum = "9a89dc7a5850d0e983be1ec2a463a171d20990487c3cfcd68b5363f1ee3d6fe0"
 dependencies = [
  "anyhow",
  "cfg-if",
@@ -215,9 +215,9 @@ dependencies = [
 [[package]]
 name = "pyo3-build-config"
-version = "0.20.0"
+version = "0.20.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "a96fe70b176a89cff78f2fa7b3c930081e163d5379b4dcdf993e3ae29ca662e5"
+checksum = "07426f0d8fe5a601f26293f300afd1a7b1ed5e78b2a705870c5f30893c5163be"
 dependencies = [
  "once_cell",
  "target-lexicon",
@@ -225,9 +225,9 @@ dependencies = [
 [[package]]
 name = "pyo3-ffi"
-version = "0.20.0"
+version = "0.20.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "214929900fd25e6604661ed9cf349727c8920d47deff196c4e28165a6ef2a96b"
+checksum = "dbb7dec17e17766b46bca4f1a4215a85006b4c2ecde122076c562dd058da6cf1"
 dependencies = [
  "libc",
  "pyo3-build-config",
@@ -246,9 +246,9 @@ dependencies = [
 [[package]]
 name = "pyo3-macros"
-version = "0.20.0"
+version = "0.20.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "dac53072f717aa1bfa4db832b39de8c875b7c7af4f4a6fe93cdbf9264cf8383b"
+checksum = "05f738b4e40d50b5711957f142878cfa0f28e054aa0ebdfc3fd137a843f74ed3"
 dependencies = [
  "proc-macro2",
  "pyo3-macros-backend",
@@ -258,9 +258,9 @@ dependencies = [
 [[package]]
 name = "pyo3-macros-backend"
-version = "0.20.0"
+version = "0.20.2"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "7774b5a8282bd4f25f803b1f0d945120be959a36c72e08e7cd031c792fdfd424"
+checksum = "0fc910d4851847827daf9d6cdd4a823fbdaab5b8818325c5e97a86da79e8881f"
 dependencies = [
  "heck",
  "proc-macro2",
@@ -280,9 +280,9 @@ dependencies = [
 [[package]]
 name = "quote"
-version = "1.0.29"
+version = "1.0.35"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "573015e8ab27661678357f27dc26460738fd2b6c86e46f386fde94cb5d913105"
+checksum = "291ec9ab5efd934aaf503a6466c5d5251535d108ee747472c3977cc5acc868ef"
 dependencies = [
  "proc-macro2",
 ]
@@ -339,18 +339,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
 [[package]]
 name = "serde"
-version = "1.0.193"
+version = "1.0.195"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "25dd9975e68d0cb5aa1120c288333fc98731bd1dd12f561e468ea4728c042b89"
+checksum = "63261df402c67811e9ac6def069e4786148c4563f4b50fd4bf30aa370d626b02"
 dependencies = [
  "serde_derive",
 ]

 [[package]]
 name = "serde_derive"
-version = "1.0.193"
+version = "1.0.195"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "43576ca501357b9b071ac53cdc7da8ef0cbd9493d8df094cd821777ea6e894d3"
+checksum = "46fe8f8603d81ba86327b23a2e9cdf49e1255fb94a4c5f297f6ee0547178ea2c"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -359,9 +359,9 @@ dependencies = [
 [[package]]
 name = "serde_json"
-version = "1.0.108"
+version = "1.0.111"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3d1c7e3eac408d115102c4c24ad393e0821bb3a5df4d506a80f85f7a742a526b"
+checksum = "176e46fa42316f18edd598015a5166857fc835ec732f5215eac6b7bdbf0a84f4"
 dependencies = [
  "itoa",
  "ryu",
@@ -382,9 +382,9 @@ checksum = "6bdef32e8150c2a081110b42772ffe7d7c9032b606bc226c8260fd97e0976601"
 [[package]]
 name = "syn"
-version = "2.0.28"
+version = "2.0.48"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "04361975b3f5e348b2189d8dc55bc942f278b2d482a6a0365de5bdd62d351567"
+checksum = "0f3531638e407dfc0814761abb7c00a5b54992b849452a0646b7f65c9f770f3f"
 dependencies = [
  "proc-macro2",
  "quote",

changelog.d/16756.misc (new file)

@@ -0,0 +1 @@
Improve DB performance of calculating badge counts for push.

changelog.d/16766.misc (new file)

@@ -0,0 +1 @@
Split up deleting devices into batches.

changelog.d/16776.misc (new file)

@@ -0,0 +1 @@
Remove the CI check for sign-off, as we require a CLA signature instead.

changelog.d/16778.doc (new file)

@@ -0,0 +1 @@
Add a link to the "Request log format" explainer on the "Logging sample config" documentation page.

changelog.d/16781.misc (new file)

@@ -0,0 +1 @@
Ensure CI fails when linting fails, so that auto-merge does the correct thing.

changelog.d/16783.misc (new file)

@@ -0,0 +1 @@
Faster loading of recent events for sync, by reducing the amount of state pulled out.

changelog.d/16785.misc (new file)

@@ -0,0 +1 @@
Reduce the amount of state pulled out when querying the federation hierarchy.

changelog.d/16788.misc (new file)

@@ -0,0 +1 @@
Pull less state out of the DB when we retry fetching old events during backfill.

changelog.d/16805.misc (new file)

@@ -0,0 +1 @@
Optimize query for fetching to-device messages in `/sync`.

changelog.d/16806.misc (new file)

@@ -0,0 +1 @@
Reject OIDC config when `client_secret` isn't specified but the auth method requires one.

changelog.d/7.misc (new file)

@@ -0,0 +1 @@
Faster partial joins to rooms with a complex auth graph.

debian/changelog (vendored, 6 changed lines)

@@ -1,3 +1,9 @@
+matrix-synapse-py3 (1.99.0~rc1ubuntu1) UNRELEASED; urgency=medium
+
+  * Fix copyright file with new licensing
+
+ -- Synapse Packaging team <packages@matrix.org>  Thu, 11 Jan 2024 13:47:29 +0000
+
matrix-synapse-py3 (1.99.0~rc1) stable; urgency=medium

  * New Synapse release 1.99.0rc1.

debian/copyright (vendored, 4 changed lines)

@@ -6,6 +6,10 @@ Files: *
Copyright: 2014-2017, OpenMarket Ltd, 2017-2018 New Vector Ltd
License: Apache-2.0
+Files: *
+Copyright: 2023 New Vector Ltd
+License: AGPL-3.0-or-later
+
Files: synapse/config/saml2.py
Copyright: 2015, Ericsson
License: Apache-2.0

debian/rules (vendored, 7 changed lines)

@@ -40,9 +40,9 @@ override_dh_shlibdeps:
# to be self-contained, but they have interdependencies and
# dpkg-shlibdeps doesn't know how to resolve them.
#
-# As of Pillow 7.1.0, these libraries are in
-# site-packages/Pillow.libs. Previously, they were in
-# site-packages/PIL/.libs.
+# As of Pillow 7.1.0, these libraries are in site-packages/Pillow.libs.
+# Previously, they were in site-packages/PIL/.libs. As of Pillow 10.2.0
+# the package name is lowercased to site-packages/pillow.libs.
#
# (we also need to exclude psycopg2, of course, since we've already
# dealt with that.)
@@ -50,6 +50,7 @@ override_dh_shlibdeps:
dh_shlibdeps \
-X site-packages/PIL/.libs \
-X site-packages/Pillow.libs \
+-X site-packages/pillow.libs \
-X site-packages/psycopg2
override_dh_virtualenv:


@@ -11,6 +11,9 @@ Note that a default logging configuration (shown below) is created automatically
the homeserver config when following the [installation instructions](../../setup/installation.md).
It should be named `<SERVERNAME>.log.config` by default.
+Hint: If you're looking for a guide on what each of the fields in the "Processed request" log lines mean,
+see [Request log format](../administration/request_log.md).
```yaml
{{#include ../../sample_log_config.yaml}}
```

poetry.lock (generated, 354 changed lines)

@@ -64,17 +64,17 @@ tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pyte
[[package]]
name = "authlib"
version = "1.2.1"
version = "1.3.0"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = true
python-versions = "*"
python-versions = ">=3.8"
files = [
{file = "Authlib-1.2.1-py2.py3-none-any.whl", hash = "sha256:c88984ea00149a90e3537c964327da930779afa4564e354edfd98410bea01911"},
{file = "Authlib-1.2.1.tar.gz", hash = "sha256:421f7c6b468d907ca2d9afede256f068f87e34d23dd221c07d13d4c234726afb"},
{file = "Authlib-1.3.0-py2.py3-none-any.whl", hash = "sha256:9637e4de1fb498310a56900b3e2043a206b03cb11c05422014b0302cbc814be3"},
{file = "Authlib-1.3.0.tar.gz", hash = "sha256:959ea62a5b7b5123c5059758296122b57cd2585ae2ed1c0622c21b371ffdae06"},
]
[package.dependencies]
cryptography = ">=3.2"
cryptography = "*"
[[package]]
name = "automat"
@@ -832,13 +832,13 @@ files = [
[[package]]
name = "immutabledict"
version = "4.0.0"
version = "4.1.0"
description = "Immutable wrapper around dictionaries (a fork of frozendict)"
optional = false
python-versions = ">=3.8,<4.0"
files = [
{file = "immutabledict-4.0.0-py3-none-any.whl", hash = "sha256:7b28ffd8a0fbd7c6068ba8ba7a6aa0e50a158e9aae33b22d1dedd03f9aac33b6"},
{file = "immutabledict-4.0.0.tar.gz", hash = "sha256:fabf47437531e8bf65a3b5b47d501e65579323b2d1fe58f8ae01491c1fd29bf7"},
{file = "immutabledict-4.1.0-py3-none-any.whl", hash = "sha256:c176e99aa90aedb81716ad35218bb2055d049b549626db4523dbe011cf2f32ac"},
{file = "immutabledict-4.1.0.tar.gz", hash = "sha256:93d100ccd2cd09a1fd3f136b9328c6e59529ba341de8bb499437f6819159fe8a"},
]
[[package]]
@@ -1096,110 +1096,96 @@ pyasn1 = ">=0.4.6"
[[package]]
name = "lxml"
version = "4.9.3"
version = "5.1.0"
description = "Powerful and Pythonic XML processing library combining libxml2/libxslt with the ElementTree API."
optional = true
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, != 3.4.*"
python-versions = ">=3.6"
files = [
{file = "lxml-4.9.3-cp27-cp27m-macosx_11_0_x86_64.whl", hash = "sha256:b0a545b46b526d418eb91754565ba5b63b1c0b12f9bd2f808c852d9b4b2f9b5c"},
{file = "lxml-4.9.3-cp27-cp27m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:075b731ddd9e7f68ad24c635374211376aa05a281673ede86cbe1d1b3455279d"},
{file = "lxml-4.9.3-cp27-cp27m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:1e224d5755dba2f4a9498e150c43792392ac9b5380aa1b845f98a1618c94eeef"},
{file = "lxml-4.9.3-cp27-cp27m-win32.whl", hash = "sha256:2c74524e179f2ad6d2a4f7caf70e2d96639c0954c943ad601a9e146c76408ed7"},
{file = "lxml-4.9.3-cp27-cp27m-win_amd64.whl", hash = "sha256:4f1026bc732b6a7f96369f7bfe1a4f2290fb34dce00d8644bc3036fb351a4ca1"},
{file = "lxml-4.9.3-cp27-cp27mu-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c0781a98ff5e6586926293e59480b64ddd46282953203c76ae15dbbbf302e8bb"},
{file = "lxml-4.9.3-cp27-cp27mu-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:cef2502e7e8a96fe5ad686d60b49e1ab03e438bd9123987994528febd569868e"},
{file = "lxml-4.9.3-cp310-cp310-macosx_11_0_x86_64.whl", hash = "sha256:b86164d2cff4d3aaa1f04a14685cbc072efd0b4f99ca5708b2ad1b9b5988a991"},
{file = "lxml-4.9.3-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:42871176e7896d5d45138f6d28751053c711ed4d48d8e30b498da155af39aebd"},
{file = "lxml-4.9.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:ae8b9c6deb1e634ba4f1930eb67ef6e6bf6a44b6eb5ad605642b2d6d5ed9ce3c"},
{file = "lxml-4.9.3-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:411007c0d88188d9f621b11d252cce90c4a2d1a49db6c068e3c16422f306eab8"},
{file = "lxml-4.9.3-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:cd47b4a0d41d2afa3e58e5bf1f62069255aa2fd6ff5ee41604418ca925911d76"},
{file = "lxml-4.9.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0e2cb47860da1f7e9a5256254b74ae331687b9672dfa780eed355c4c9c3dbd23"},
{file = "lxml-4.9.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:1247694b26342a7bf47c02e513d32225ededd18045264d40758abeb3c838a51f"},
{file = "lxml-4.9.3-cp310-cp310-win32.whl", hash = "sha256:cdb650fc86227eba20de1a29d4b2c1bfe139dc75a0669270033cb2ea3d391b85"},
{file = "lxml-4.9.3-cp310-cp310-win_amd64.whl", hash = "sha256:97047f0d25cd4bcae81f9ec9dc290ca3e15927c192df17331b53bebe0e3ff96d"},
{file = "lxml-4.9.3-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:1f447ea5429b54f9582d4b955f5f1985f278ce5cf169f72eea8afd9502973dd5"},
{file = "lxml-4.9.3-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:57d6ba0ca2b0c462f339640d22882acc711de224d769edf29962b09f77129cbf"},
{file = "lxml-4.9.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:9767e79108424fb6c3edf8f81e6730666a50feb01a328f4a016464a5893f835a"},
{file = "lxml-4.9.3-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:71c52db65e4b56b8ddc5bb89fb2e66c558ed9d1a74a45ceb7dcb20c191c3df2f"},
{file = "lxml-4.9.3-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:d73d8ecf8ecf10a3bd007f2192725a34bd62898e8da27eb9d32a58084f93962b"},
{file = "lxml-4.9.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0a3d3487f07c1d7f150894c238299934a2a074ef590b583103a45002035be120"},
{file = "lxml-4.9.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e28c51fa0ce5674be9f560c6761c1b441631901993f76700b1b30ca6c8378d6"},
{file = "lxml-4.9.3-cp311-cp311-win32.whl", hash = "sha256:0bfd0767c5c1de2551a120673b72e5d4b628737cb05414f03c3277bf9bed3305"},
{file = "lxml-4.9.3-cp311-cp311-win_amd64.whl", hash = "sha256:25f32acefac14ef7bd53e4218fe93b804ef6f6b92ffdb4322bb6d49d94cad2bc"},
{file = "lxml-4.9.3-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:d3ff32724f98fbbbfa9f49d82852b159e9784d6094983d9a8b7f2ddaebb063d4"},
{file = "lxml-4.9.3-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:48d6ed886b343d11493129e019da91d4039826794a3e3027321c56d9e71505be"},
{file = "lxml-4.9.3-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:9a92d3faef50658dd2c5470af249985782bf754c4e18e15afb67d3ab06233f13"},
{file = "lxml-4.9.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b4e4bc18382088514ebde9328da057775055940a1f2e18f6ad2d78aa0f3ec5b9"},
{file = "lxml-4.9.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:fc9b106a1bf918db68619fdcd6d5ad4f972fdd19c01d19bdb6bf63f3589a9ec5"},
{file = "lxml-4.9.3-cp312-cp312-win_amd64.whl", hash = "sha256:d37017287a7adb6ab77e1c5bee9bcf9660f90ff445042b790402a654d2ad81d8"},
{file = "lxml-4.9.3-cp35-cp35m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:56dc1f1ebccc656d1b3ed288f11e27172a01503fc016bcabdcbc0978b19352b7"},
{file = "lxml-4.9.3-cp35-cp35m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:578695735c5a3f51569810dfebd05dd6f888147a34f0f98d4bb27e92b76e05c2"},
{file = "lxml-4.9.3-cp35-cp35m-win32.whl", hash = "sha256:704f61ba8c1283c71b16135caf697557f5ecf3e74d9e453233e4771d68a1f42d"},
{file = "lxml-4.9.3-cp35-cp35m-win_amd64.whl", hash = "sha256:c41bfca0bd3532d53d16fd34d20806d5c2b1ace22a2f2e4c0008570bf2c58833"},
{file = "lxml-4.9.3-cp36-cp36m-macosx_11_0_x86_64.whl", hash = "sha256:64f479d719dc9f4c813ad9bb6b28f8390360660b73b2e4beb4cb0ae7104f1c12"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:dd708cf4ee4408cf46a48b108fb9427bfa00b9b85812a9262b5c668af2533ea5"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5c31c7462abdf8f2ac0577d9f05279727e698f97ecbb02f17939ea99ae8daa98"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:e3cd95e10c2610c360154afdc2f1480aea394f4a4f1ea0a5eacce49640c9b190"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_28_x86_64.whl", hash = "sha256:4930be26af26ac545c3dffb662521d4e6268352866956672231887d18f0eaab2"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4aec80cde9197340bc353d2768e2a75f5f60bacda2bab72ab1dc499589b3878c"},
{file = "lxml-4.9.3-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:14e019fd83b831b2e61baed40cab76222139926b1fb5ed0e79225bc0cae14584"},
{file = "lxml-4.9.3-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:0c0850c8b02c298d3c7006b23e98249515ac57430e16a166873fc47a5d549287"},
{file = "lxml-4.9.3-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:aca086dc5f9ef98c512bac8efea4483eb84abbf926eaeedf7b91479feb092458"},
{file = "lxml-4.9.3-cp36-cp36m-win32.whl", hash = "sha256:50baa9c1c47efcaef189f31e3d00d697c6d4afda5c3cde0302d063492ff9b477"},
{file = "lxml-4.9.3-cp36-cp36m-win_amd64.whl", hash = "sha256:bef4e656f7d98aaa3486d2627e7d2df1157d7e88e7efd43a65aa5dd4714916cf"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:46f409a2d60f634fe550f7133ed30ad5321ae2e6630f13657fb9479506b00601"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:4c28a9144688aef80d6ea666c809b4b0e50010a2aca784c97f5e6bf143d9f129"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:141f1d1a9b663c679dc524af3ea1773e618907e96075262726c7612c02b149a4"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:53ace1c1fd5a74ef662f844a0413446c0629d151055340e9893da958a374f70d"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:17a753023436a18e27dd7769e798ce302963c236bc4114ceee5b25c18c52c693"},
{file = "lxml-4.9.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:7d298a1bd60c067ea75d9f684f5f3992c9d6766fadbc0bcedd39750bf344c2f4"},
{file = "lxml-4.9.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:081d32421db5df44c41b7f08a334a090a545c54ba977e47fd7cc2deece78809a"},
{file = "lxml-4.9.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:23eed6d7b1a3336ad92d8e39d4bfe09073c31bfe502f20ca5116b2a334f8ec02"},
{file = "lxml-4.9.3-cp37-cp37m-win32.whl", hash = "sha256:1509dd12b773c02acd154582088820893109f6ca27ef7291b003d0e81666109f"},
{file = "lxml-4.9.3-cp37-cp37m-win_amd64.whl", hash = "sha256:120fa9349a24c7043854c53cae8cec227e1f79195a7493e09e0c12e29f918e52"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:4d2d1edbca80b510443f51afd8496be95529db04a509bc8faee49c7b0fb6d2cc"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:8d7e43bd40f65f7d97ad8ef5c9b1778943d02f04febef12def25f7583d19baac"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:71d66ee82e7417828af6ecd7db817913cb0cf9d4e61aa0ac1fde0583d84358db"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:6fc3c450eaa0b56f815c7b62f2b7fba7266c4779adcf1cece9e6deb1de7305ce"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:65299ea57d82fb91c7f019300d24050c4ddeb7c5a190e076b5f48a2b43d19c42"},
{file = "lxml-4.9.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:eadfbbbfb41b44034a4c757fd5d70baccd43296fb894dba0295606a7cf3124aa"},
{file = "lxml-4.9.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:3e9bdd30efde2b9ccfa9cb5768ba04fe71b018a25ea093379c857c9dad262c40"},
{file = "lxml-4.9.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:fcdd00edfd0a3001e0181eab3e63bd5c74ad3e67152c84f93f13769a40e073a7"},
{file = "lxml-4.9.3-cp38-cp38-win32.whl", hash = "sha256:57aba1bbdf450b726d58b2aea5fe47c7875f5afb2c4a23784ed78f19a0462574"},
{file = "lxml-4.9.3-cp38-cp38-win_amd64.whl", hash = "sha256:92af161ecbdb2883c4593d5ed4815ea71b31fafd7fd05789b23100d081ecac96"},
{file = "lxml-4.9.3-cp39-cp39-macosx_11_0_x86_64.whl", hash = "sha256:9bb6ad405121241e99a86efff22d3ef469024ce22875a7ae045896ad23ba2340"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:8ed74706b26ad100433da4b9d807eae371efaa266ffc3e9191ea436087a9d6a7"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:fbf521479bcac1e25a663df882c46a641a9bff6b56dc8b0fafaebd2f66fb231b"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:303bf1edce6ced16bf67a18a1cf8339d0db79577eec5d9a6d4a80f0fb10aa2da"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:5515edd2a6d1a5a70bfcdee23b42ec33425e405c5b351478ab7dc9347228f96e"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:690dafd0b187ed38583a648076865d8c229661ed20e48f2335d68e2cf7dc829d"},
{file = "lxml-4.9.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:b6420a005548ad52154c8ceab4a1290ff78d757f9e5cbc68f8c77089acd3c432"},
{file = "lxml-4.9.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bb3bb49c7a6ad9d981d734ef7c7193bc349ac338776a0360cc671eaee89bcf69"},
{file = "lxml-4.9.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d27be7405547d1f958b60837dc4c1007da90b8b23f54ba1f8b728c78fdb19d50"},
{file = "lxml-4.9.3-cp39-cp39-win32.whl", hash = "sha256:8df133a2ea5e74eef5e8fc6f19b9e085f758768a16e9877a60aec455ed2609b2"},
{file = "lxml-4.9.3-cp39-cp39-win_amd64.whl", hash = "sha256:4dd9a263e845a72eacb60d12401e37c616438ea2e5442885f65082c276dfb2b2"},
{file = "lxml-4.9.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:6689a3d7fd13dc687e9102a27e98ef33730ac4fe37795d5036d18b4d527abd35"},
{file = "lxml-4.9.3-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:f6bdac493b949141b733c5345b6ba8f87a226029cbabc7e9e121a413e49441e0"},
{file = "lxml-4.9.3-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:05186a0f1346ae12553d66df1cfce6f251589fea3ad3da4f3ef4e34b2d58c6a3"},
{file = "lxml-4.9.3-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c2006f5c8d28dee289f7020f721354362fa304acbaaf9745751ac4006650254b"},
{file = "lxml-4.9.3-pp38-pypy38_pp73-macosx_11_0_x86_64.whl", hash = "sha256:5c245b783db29c4e4fbbbfc9c5a78be496c9fea25517f90606aa1f6b2b3d5f7b"},
{file = "lxml-4.9.3-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:4fb960a632a49f2f089d522f70496640fdf1218f1243889da3822e0a9f5f3ba7"},
{file = "lxml-4.9.3-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:50670615eaf97227d5dc60de2dc99fb134a7130d310d783314e7724bf163f75d"},
{file = "lxml-4.9.3-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:9719fe17307a9e814580af1f5c6e05ca593b12fb7e44fe62450a5384dbf61b4b"},
{file = "lxml-4.9.3-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:3331bece23c9ee066e0fb3f96c61322b9e0f54d775fccefff4c38ca488de283a"},
{file = "lxml-4.9.3-pp39-pypy39_pp73-macosx_11_0_x86_64.whl", hash = "sha256:ed667f49b11360951e201453fc3967344d0d0263aa415e1619e85ae7fd17b4e0"},
{file = "lxml-4.9.3-pp39-pypy39_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_24_i686.whl", hash = "sha256:8b77946fd508cbf0fccd8e400a7f71d4ac0e1595812e66025bac475a8e811694"},
{file = "lxml-4.9.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:e4da8ca0c0c0aea88fd46be8e44bd49716772358d648cce45fe387f7b92374a7"},
{file = "lxml-4.9.3-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:fe4bda6bd4340caa6e5cf95e73f8fea5c4bfc55763dd42f1b50a94c1b4a2fbd4"},
{file = "lxml-4.9.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:f3df3db1d336b9356dd3112eae5f5c2b8b377f3bc826848567f10bfddfee77e9"},
{file = "lxml-4.9.3.tar.gz", hash = "sha256:48628bd53a426c9eb9bc066a923acaa0878d1e86129fd5359aee99285f4eed9c"},
{file = "lxml-5.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:704f5572ff473a5f897745abebc6df40f22d4133c1e0a1f124e4f2bd3330ff7e"},
{file = "lxml-5.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:9d3c0f8567ffe7502d969c2c1b809892dc793b5d0665f602aad19895f8d508da"},
{file = "lxml-5.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5fcfbebdb0c5d8d18b84118842f31965d59ee3e66996ac842e21f957eb76138c"},
{file = "lxml-5.1.0-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2f37c6d7106a9d6f0708d4e164b707037b7380fcd0b04c5bd9cae1fb46a856fb"},
{file = "lxml-5.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2befa20a13f1a75c751f47e00929fb3433d67eb9923c2c0b364de449121f447c"},
{file = "lxml-5.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22b7ee4c35f374e2c20337a95502057964d7e35b996b1c667b5c65c567d2252a"},
{file = "lxml-5.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:bf8443781533b8d37b295016a4b53c1494fa9a03573c09ca5104550c138d5c05"},
{file = "lxml-5.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:82bddf0e72cb2af3cbba7cec1d2fd11fda0de6be8f4492223d4a268713ef2147"},
{file = "lxml-5.1.0-cp310-cp310-win32.whl", hash = "sha256:b66aa6357b265670bb574f050ffceefb98549c721cf28351b748be1ef9577d93"},
{file = "lxml-5.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:4946e7f59b7b6a9e27bef34422f645e9a368cb2be11bf1ef3cafc39a1f6ba68d"},
{file = "lxml-5.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:14deca1460b4b0f6b01f1ddc9557704e8b365f55c63070463f6c18619ebf964f"},
{file = "lxml-5.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ed8c3d2cd329bf779b7ed38db176738f3f8be637bb395ce9629fc76f78afe3d4"},
{file = "lxml-5.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:436a943c2900bb98123b06437cdd30580a61340fbdb7b28aaf345a459c19046a"},
{file = "lxml-5.1.0-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:acb6b2f96f60f70e7f34efe0c3ea34ca63f19ca63ce90019c6cbca6b676e81fa"},
{file = "lxml-5.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:af8920ce4a55ff41167ddbc20077f5698c2e710ad3353d32a07d3264f3a2021e"},
{file = "lxml-5.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7cfced4a069003d8913408e10ca8ed092c49a7f6cefee9bb74b6b3e860683b45"},
{file = "lxml-5.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9e5ac3437746189a9b4121db2a7b86056ac8786b12e88838696899328fc44bb2"},
{file = "lxml-5.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f4c9bda132ad108b387c33fabfea47866af87f4ea6ffb79418004f0521e63204"},
{file = "lxml-5.1.0-cp311-cp311-win32.whl", hash = "sha256:bc64d1b1dab08f679fb89c368f4c05693f58a9faf744c4d390d7ed1d8223869b"},
{file = "lxml-5.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:a5ab722ae5a873d8dcee1f5f45ddd93c34210aed44ff2dc643b5025981908cda"},
{file = "lxml-5.1.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:9aa543980ab1fbf1720969af1d99095a548ea42e00361e727c58a40832439114"},
{file = "lxml-5.1.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6f11b77ec0979f7e4dc5ae081325a2946f1fe424148d3945f943ceaede98adb8"},
{file = "lxml-5.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a36c506e5f8aeb40680491d39ed94670487ce6614b9d27cabe45d94cd5d63e1e"},
{file = "lxml-5.1.0-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f643ffd2669ffd4b5a3e9b41c909b72b2a1d5e4915da90a77e119b8d48ce867a"},
{file = "lxml-5.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16dd953fb719f0ffc5bc067428fc9e88f599e15723a85618c45847c96f11f431"},
{file = "lxml-5.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16018f7099245157564d7148165132c70adb272fb5a17c048ba70d9cc542a1a1"},
{file = "lxml-5.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:82cd34f1081ae4ea2ede3d52f71b7be313756e99b4b5f829f89b12da552d3aa3"},
{file = "lxml-5.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:19a1bc898ae9f06bccb7c3e1dfd73897ecbbd2c96afe9095a6026016e5ca97b8"},
{file = "lxml-5.1.0-cp312-cp312-win32.whl", hash = "sha256:13521a321a25c641b9ea127ef478b580b5ec82aa2e9fc076c86169d161798b01"},
{file = "lxml-5.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:1ad17c20e3666c035db502c78b86e58ff6b5991906e55bdbef94977700c72623"},
{file = "lxml-5.1.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:24ef5a4631c0b6cceaf2dbca21687e29725b7c4e171f33a8f8ce23c12558ded1"},
{file = "lxml-5.1.0-cp36-cp36m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8d2900b7f5318bc7ad8631d3d40190b95ef2aa8cc59473b73b294e4a55e9f30f"},
{file = "lxml-5.1.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:601f4a75797d7a770daed8b42b97cd1bb1ba18bd51a9382077a6a247a12aa38d"},
{file = "lxml-5.1.0-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b4b68c961b5cc402cbd99cca5eb2547e46ce77260eb705f4d117fd9c3f932b95"},
{file = "lxml-5.1.0-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:afd825e30f8d1f521713a5669b63657bcfe5980a916c95855060048b88e1adb7"},
{file = "lxml-5.1.0-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:262bc5f512a66b527d026518507e78c2f9c2bd9eb5c8aeeb9f0eb43fcb69dc67"},
{file = "lxml-5.1.0-cp36-cp36m-win32.whl", hash = "sha256:e856c1c7255c739434489ec9c8aa9cdf5179785d10ff20add308b5d673bed5cd"},
{file = "lxml-5.1.0-cp36-cp36m-win_amd64.whl", hash = "sha256:c7257171bb8d4432fe9d6fdde4d55fdbe663a63636a17f7f9aaba9bcb3153ad7"},
{file = "lxml-5.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:b9e240ae0ba96477682aa87899d94ddec1cc7926f9df29b1dd57b39e797d5ab5"},
{file = "lxml-5.1.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a96f02ba1bcd330807fc060ed91d1f7a20853da6dd449e5da4b09bfcc08fdcf5"},
{file = "lxml-5.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e3898ae2b58eeafedfe99e542a17859017d72d7f6a63de0f04f99c2cb125936"},
{file = "lxml-5.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61c5a7edbd7c695e54fca029ceb351fc45cd8860119a0f83e48be44e1c464862"},
{file = "lxml-5.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:3aeca824b38ca78d9ee2ab82bd9883083d0492d9d17df065ba3b94e88e4d7ee6"},
{file = "lxml-5.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8f52fe6859b9db71ee609b0c0a70fea5f1e71c3462ecf144ca800d3f434f0764"},
{file = "lxml-5.1.0-cp37-cp37m-win32.whl", hash = "sha256:d42e3a3fc18acc88b838efded0e6ec3edf3e328a58c68fbd36a7263a874906c8"},
{file = "lxml-5.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:eac68f96539b32fce2c9b47eb7c25bb2582bdaf1bbb360d25f564ee9e04c542b"},
{file = "lxml-5.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ae15347a88cf8af0949a9872b57a320d2605ae069bcdf047677318bc0bba45b1"},
{file = "lxml-5.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:c26aab6ea9c54d3bed716b8851c8bfc40cb249b8e9880e250d1eddde9f709bf5"},
{file = "lxml-5.1.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:342e95bddec3a698ac24378d61996b3ee5ba9acfeb253986002ac53c9a5f6f84"},
{file = "lxml-5.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:725e171e0b99a66ec8605ac77fa12239dbe061482ac854d25720e2294652eeaa"},
{file = "lxml-5.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3d184e0d5c918cff04cdde9dbdf9600e960161d773666958c9d7b565ccc60c45"},
{file = "lxml-5.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:98f3f020a2b736566c707c8e034945c02aa94e124c24f77ca097c446f81b01f1"},
{file = "lxml-5.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6d48fc57e7c1e3df57be5ae8614bab6d4e7b60f65c5457915c26892c41afc59e"},
{file = "lxml-5.1.0-cp38-cp38-win32.whl", hash = "sha256:7ec465e6549ed97e9f1e5ed51c657c9ede767bc1c11552f7f4d022c4df4a977a"},
{file = "lxml-5.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:b21b4031b53d25b0858d4e124f2f9131ffc1530431c6d1321805c90da78388d1"},
{file = "lxml-5.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:52427a7eadc98f9e62cb1368a5079ae826f94f05755d2d567d93ee1bc3ceb354"},
{file = "lxml-5.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6a2a2c724d97c1eb8cf966b16ca2915566a4904b9aad2ed9a09c748ffe14f969"},
{file = "lxml-5.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:843b9c835580d52828d8f69ea4302537337a21e6b4f1ec711a52241ba4a824f3"},
{file = "lxml-5.1.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9b99f564659cfa704a2dd82d0684207b1aadf7d02d33e54845f9fc78e06b7581"},
{file = "lxml-5.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4f8b0c78e7aac24979ef09b7f50da871c2de2def043d468c4b41f512d831e912"},
{file = "lxml-5.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9bcf86dfc8ff3e992fed847c077bd875d9e0ba2fa25d859c3a0f0f76f07f0c8d"},
{file = "lxml-5.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:49a9b4af45e8b925e1cd6f3b15bbba2c81e7dba6dce170c677c9cda547411e14"},
{file = "lxml-5.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:280f3edf15c2a967d923bcfb1f8f15337ad36f93525828b40a0f9d6c2ad24890"},
{file = "lxml-5.1.0-cp39-cp39-win32.whl", hash = "sha256:ed7326563024b6e91fef6b6c7a1a2ff0a71b97793ac33dbbcf38f6005e51ff6e"},
{file = "lxml-5.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:8d7b4beebb178e9183138f552238f7e6613162a42164233e2bda00cb3afac58f"},
{file = "lxml-5.1.0-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:9bd0ae7cc2b85320abd5e0abad5ccee5564ed5f0cc90245d2f9a8ef330a8deae"},
{file = "lxml-5.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d8c1d679df4361408b628f42b26a5d62bd3e9ba7f0c0e7969f925021554755aa"},
{file = "lxml-5.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:2ad3a8ce9e8a767131061a22cd28fdffa3cd2dc193f399ff7b81777f3520e372"},
{file = "lxml-5.1.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:304128394c9c22b6569eba2a6d98392b56fbdfbad58f83ea702530be80d0f9df"},
{file = "lxml-5.1.0-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d74fcaf87132ffc0447b3c685a9f862ffb5b43e70ea6beec2fb8057d5d2a1fea"},
{file = "lxml-5.1.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:8cf5877f7ed384dabfdcc37922c3191bf27e55b498fecece9fd5c2c7aaa34c33"},
{file = "lxml-5.1.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:877efb968c3d7eb2dad540b6cabf2f1d3c0fbf4b2d309a3c141f79c7e0061324"},
{file = "lxml-5.1.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f14a4fb1c1c402a22e6a341a24c1341b4a3def81b41cd354386dcb795f83897"},
{file = "lxml-5.1.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:25663d6e99659544ee8fe1b89b1a8c0aaa5e34b103fab124b17fa958c4a324a6"},
{file = "lxml-5.1.0-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:8b9f19df998761babaa7f09e6bc169294eefafd6149aaa272081cbddc7ba4ca3"},
{file = "lxml-5.1.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5e53d7e6a98b64fe54775d23a7c669763451340c3d44ad5e3a3b48a1efbdc96f"},
{file = "lxml-5.1.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:c3cd1fc1dc7c376c54440aeaaa0dcc803d2126732ff5c6b68ccd619f2e64be4f"},
{file = "lxml-5.1.0.tar.gz", hash = "sha256:3eea6ed6e6c918e468e693c41ef07f3c3acc310b70ddd9cc72d9ef84bc9564ca"},
]
[package.extras]
cssselect = ["cssselect (>=0.7)"]
html5 = ["html5lib"]
htmlsoup = ["BeautifulSoup4"]
source = ["Cython (>=0.29.35)"]
source = ["Cython (>=3.0.7)"]
[[package]]
name = "lxml-stubs"
@@ -1616,70 +1602,88 @@ files = [
[[package]]
name = "pillow"
version = "10.1.0"
version = "10.2.0"
description = "Python Imaging Library (Fork)"
optional = false
python-versions = ">=3.8"
files = [
{file = "Pillow-10.1.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1ab05f3db77e98f93964697c8efc49c7954b08dd61cff526b7f2531a22410106"},
{file = "Pillow-10.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6932a7652464746fcb484f7fc3618e6503d2066d853f68a4bd97193a3996e273"},
{file = "Pillow-10.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a5f63b5a68daedc54c7c3464508d8c12075e56dcfbd42f8c1bf40169061ae666"},
{file = "Pillow-10.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0949b55eb607898e28eaccb525ab104b2d86542a85c74baf3a6dc24002edec2"},
{file = "Pillow-10.1.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:ae88931f93214777c7a3aa0a8f92a683f83ecde27f65a45f95f22d289a69e593"},
{file = "Pillow-10.1.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:b0eb01ca85b2361b09480784a7931fc648ed8b7836f01fb9241141b968feb1db"},
{file = "Pillow-10.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d27b5997bdd2eb9fb199982bb7eb6164db0426904020dc38c10203187ae2ff2f"},
{file = "Pillow-10.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7df5608bc38bd37ef585ae9c38c9cd46d7c81498f086915b0f97255ea60c2818"},
{file = "Pillow-10.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:41f67248d92a5e0a2076d3517d8d4b1e41a97e2df10eb8f93106c89107f38b57"},
{file = "Pillow-10.1.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:1fb29c07478e6c06a46b867e43b0bcdb241b44cc52be9bc25ce5944eed4648e7"},
{file = "Pillow-10.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2cdc65a46e74514ce742c2013cd4a2d12e8553e3a2563c64879f7c7e4d28bce7"},
{file = "Pillow-10.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50d08cd0a2ecd2a8657bd3d82c71efd5a58edb04d9308185d66c3a5a5bed9610"},
{file = "Pillow-10.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:062a1610e3bc258bff2328ec43f34244fcec972ee0717200cb1425214fe5b839"},
{file = "Pillow-10.1.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:61f1a9d247317fa08a308daaa8ee7b3f760ab1809ca2da14ecc88ae4257d6172"},
{file = "Pillow-10.1.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:a646e48de237d860c36e0db37ecaecaa3619e6f3e9d5319e527ccbc8151df061"},
{file = "Pillow-10.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:47e5bf85b80abc03be7455c95b6d6e4896a62f6541c1f2ce77a7d2bb832af262"},
{file = "Pillow-10.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:a92386125e9ee90381c3369f57a2a50fa9e6aa8b1cf1d9c4b200d41a7dd8e992"},
{file = "Pillow-10.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:0f7c276c05a9767e877a0b4c5050c8bee6a6d960d7f0c11ebda6b99746068c2a"},
{file = "Pillow-10.1.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:a89b8312d51715b510a4fe9fc13686283f376cfd5abca8cd1c65e4c76e21081b"},
{file = "Pillow-10.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:00f438bb841382b15d7deb9a05cc946ee0f2c352653c7aa659e75e592f6fa17d"},
{file = "Pillow-10.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3d929a19f5469b3f4df33a3df2983db070ebb2088a1e145e18facbc28cae5b27"},
{file = "Pillow-10.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a92109192b360634a4489c0c756364c0c3a2992906752165ecb50544c251312"},
{file = "Pillow-10.1.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:0248f86b3ea061e67817c47ecbe82c23f9dd5d5226200eb9090b3873d3ca32de"},
{file = "Pillow-10.1.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:9882a7451c680c12f232a422730f986a1fcd808da0fd428f08b671237237d651"},
{file = "Pillow-10.1.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:1c3ac5423c8c1da5928aa12c6e258921956757d976405e9467c5f39d1d577a4b"},
{file = "Pillow-10.1.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:806abdd8249ba3953c33742506fe414880bad78ac25cc9a9b1c6ae97bedd573f"},
{file = "Pillow-10.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:eaed6977fa73408b7b8a24e8b14e59e1668cfc0f4c40193ea7ced8e210adf996"},
{file = "Pillow-10.1.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:fe1e26e1ffc38be097f0ba1d0d07fcade2bcfd1d023cda5b29935ae8052bd793"},
{file = "Pillow-10.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7a7e3daa202beb61821c06d2517428e8e7c1aab08943e92ec9e5755c2fc9ba5e"},
{file = "Pillow-10.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:24fadc71218ad2b8ffe437b54876c9382b4a29e030a05a9879f615091f42ffc2"},
{file = "Pillow-10.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa1d323703cfdac2036af05191b969b910d8f115cf53093125e4058f62012c9a"},
{file = "Pillow-10.1.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:912e3812a1dbbc834da2b32299b124b5ddcb664ed354916fd1ed6f193f0e2d01"},
{file = "Pillow-10.1.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:7dbaa3c7de82ef37e7708521be41db5565004258ca76945ad74a8e998c30af8d"},
{file = "Pillow-10.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9d7bc666bd8c5a4225e7ac71f2f9d12466ec555e89092728ea0f5c0c2422ea80"},
{file = "Pillow-10.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:baada14941c83079bf84c037e2d8b7506ce201e92e3d2fa0d1303507a8538212"},
{file = "Pillow-10.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:2ef6721c97894a7aa77723740a09547197533146fba8355e86d6d9a4a1056b14"},
{file = "Pillow-10.1.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0a026c188be3b443916179f5d04548092e253beb0c3e2ee0a4e2cdad72f66099"},
{file = "Pillow-10.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:04f6f6149f266a100374ca3cc368b67fb27c4af9f1cc8cb6306d849dcdf12616"},
{file = "Pillow-10.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb40c011447712d2e19cc261c82655f75f32cb724788df315ed992a4d65696bb"},
{file = "Pillow-10.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a8413794b4ad9719346cd9306118450b7b00d9a15846451549314a58ac42219"},
{file = "Pillow-10.1.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:c9aeea7b63edb7884b031a35305629a7593272b54f429a9869a4f63a1bf04c34"},
{file = "Pillow-10.1.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:b4005fee46ed9be0b8fb42be0c20e79411533d1fd58edabebc0dd24626882cfd"},
{file = "Pillow-10.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:4d0152565c6aa6ebbfb1e5d8624140a440f2b99bf7afaafbdbf6430426497f28"},
{file = "Pillow-10.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d921bc90b1defa55c9917ca6b6b71430e4286fc9e44c55ead78ca1a9f9eba5f2"},
{file = "Pillow-10.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:cfe96560c6ce2f4c07d6647af2d0f3c54cc33289894ebd88cfbb3bcd5391e256"},
{file = "Pillow-10.1.0-pp310-pypy310_pp73-macosx_10_10_x86_64.whl", hash = "sha256:937bdc5a7f5343d1c97dc98149a0be7eb9704e937fe3dc7140e229ae4fc572a7"},
{file = "Pillow-10.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b1c25762197144e211efb5f4e8ad656f36c8d214d390585d1d21281f46d556ba"},
{file = "Pillow-10.1.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:afc8eef765d948543a4775f00b7b8c079b3321d6b675dde0d02afa2ee23000b4"},
{file = "Pillow-10.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:883f216eac8712b83a63f41b76ddfb7b2afab1b74abbb413c5df6680f071a6b9"},
{file = "Pillow-10.1.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b920e4d028f6442bea9a75b7491c063f0b9a3972520731ed26c83e254302eb1e"},
{file = "Pillow-10.1.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1c41d960babf951e01a49c9746f92c5a7e0d939d1652d7ba30f6b3090f27e412"},
{file = "Pillow-10.1.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:1fafabe50a6977ac70dfe829b2d5735fd54e190ab55259ec8aea4aaea412fa0b"},
{file = "Pillow-10.1.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:3b834f4b16173e5b92ab6566f0473bfb09f939ba14b23b8da1f54fa63e4b623f"},
{file = "Pillow-10.1.0.tar.gz", hash = "sha256:e6bf8de6c36ed96c86ea3b6e1d5273c53f46ef518a062464cd7ef5dd2cf92e38"},
{file = "pillow-10.2.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:7823bdd049099efa16e4246bdf15e5a13dbb18a51b68fa06d6c1d4d8b99a796e"},
{file = "pillow-10.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:83b2021f2ade7d1ed556bc50a399127d7fb245e725aa0113ebd05cfe88aaf588"},
{file = "pillow-10.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6fad5ff2f13d69b7e74ce5b4ecd12cc0ec530fcee76356cac6742785ff71c452"},
{file = "pillow-10.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da2b52b37dad6d9ec64e653637a096905b258d2fc2b984c41ae7d08b938a67e4"},
{file = "pillow-10.2.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:47c0995fc4e7f79b5cfcab1fc437ff2890b770440f7696a3ba065ee0fd496563"},
{file = "pillow-10.2.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:322bdf3c9b556e9ffb18f93462e5f749d3444ce081290352c6070d014c93feb2"},
{file = "pillow-10.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:51f1a1bffc50e2e9492e87d8e09a17c5eea8409cda8d3f277eb6edc82813c17c"},
{file = "pillow-10.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:69ffdd6120a4737710a9eee73e1d2e37db89b620f702754b8f6e62594471dee0"},
{file = "pillow-10.2.0-cp310-cp310-win32.whl", hash = "sha256:c6dafac9e0f2b3c78df97e79af707cdc5ef8e88208d686a4847bab8266870023"},
{file = "pillow-10.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:aebb6044806f2e16ecc07b2a2637ee1ef67a11840a66752751714a0d924adf72"},
{file = "pillow-10.2.0-cp310-cp310-win_arm64.whl", hash = "sha256:7049e301399273a0136ff39b84c3678e314f2158f50f517bc50285fb5ec847ad"},
{file = "pillow-10.2.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:35bb52c37f256f662abdfa49d2dfa6ce5d93281d323a9af377a120e89a9eafb5"},
{file = "pillow-10.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9c23f307202661071d94b5e384e1e1dc7dfb972a28a2310e4ee16103e66ddb67"},
{file = "pillow-10.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:773efe0603db30c281521a7c0214cad7836c03b8ccff897beae9b47c0b657d61"},
{file = "pillow-10.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:11fa2e5984b949b0dd6d7a94d967743d87c577ff0b83392f17cb3990d0d2fd6e"},
{file = "pillow-10.2.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:716d30ed977be8b37d3ef185fecb9e5a1d62d110dfbdcd1e2a122ab46fddb03f"},
{file = "pillow-10.2.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:a086c2af425c5f62a65e12fbf385f7c9fcb8f107d0849dba5839461a129cf311"},
{file = "pillow-10.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:c8de2789052ed501dd829e9cae8d3dcce7acb4777ea4a479c14521c942d395b1"},
{file = "pillow-10.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:609448742444d9290fd687940ac0b57fb35e6fd92bdb65386e08e99af60bf757"},
{file = "pillow-10.2.0-cp311-cp311-win32.whl", hash = "sha256:823ef7a27cf86df6597fa0671066c1b596f69eba53efa3d1e1cb8b30f3533068"},
{file = "pillow-10.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:1da3b2703afd040cf65ec97efea81cfba59cdbed9c11d8efc5ab09df9509fc56"},
{file = "pillow-10.2.0-cp311-cp311-win_arm64.whl", hash = "sha256:edca80cbfb2b68d7b56930b84a0e45ae1694aeba0541f798e908a49d66b837f1"},
{file = "pillow-10.2.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:1b5e1b74d1bd1b78bc3477528919414874748dd363e6272efd5abf7654e68bef"},
{file = "pillow-10.2.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0eae2073305f451d8ecacb5474997c08569fb4eb4ac231ffa4ad7d342fdc25ac"},
{file = "pillow-10.2.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b7c2286c23cd350b80d2fc9d424fc797575fb16f854b831d16fd47ceec078f2c"},
{file = "pillow-10.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1e23412b5c41e58cec602f1135c57dfcf15482013ce6e5f093a86db69646a5aa"},
{file = "pillow-10.2.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:52a50aa3fb3acb9cf7213573ef55d31d6eca37f5709c69e6858fe3bc04a5c2a2"},
{file = "pillow-10.2.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:127cee571038f252a552760076407f9cff79761c3d436a12af6000cd182a9d04"},
{file = "pillow-10.2.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:8d12251f02d69d8310b046e82572ed486685c38f02176bd08baf216746eb947f"},
{file = "pillow-10.2.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:54f1852cd531aa981bc0965b7d609f5f6cc8ce8c41b1139f6ed6b3c54ab82bfb"},
{file = "pillow-10.2.0-cp312-cp312-win32.whl", hash = "sha256:257d8788df5ca62c980314053197f4d46eefedf4e6175bc9412f14412ec4ea2f"},
{file = "pillow-10.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:154e939c5f0053a383de4fd3d3da48d9427a7e985f58af8e94d0b3c9fcfcf4f9"},
{file = "pillow-10.2.0-cp312-cp312-win_arm64.whl", hash = "sha256:f379abd2f1e3dddb2b61bc67977a6b5a0a3f7485538bcc6f39ec76163891ee48"},
{file = "pillow-10.2.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:8373c6c251f7ef8bda6675dd6d2b3a0fcc31edf1201266b5cf608b62a37407f9"},
{file = "pillow-10.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:870ea1ada0899fd0b79643990809323b389d4d1d46c192f97342eeb6ee0b8483"},
{file = "pillow-10.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b4b6b1e20608493548b1f32bce8cca185bf0480983890403d3b8753e44077129"},
{file = "pillow-10.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3031709084b6e7852d00479fd1d310b07d0ba82765f973b543c8af5061cf990e"},
{file = "pillow-10.2.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:3ff074fc97dd4e80543a3e91f69d58889baf2002b6be64347ea8cf5533188213"},
{file = "pillow-10.2.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:cb4c38abeef13c61d6916f264d4845fab99d7b711be96c326b84df9e3e0ff62d"},
{file = "pillow-10.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:b1b3020d90c2d8e1dae29cf3ce54f8094f7938460fb5ce8bc5c01450b01fbaf6"},
{file = "pillow-10.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:170aeb00224ab3dc54230c797f8404507240dd868cf52066f66a41b33169bdbe"},
{file = "pillow-10.2.0-cp38-cp38-win32.whl", hash = "sha256:c4225f5220f46b2fde568c74fca27ae9771536c2e29d7c04f4fb62c83275ac4e"},
{file = "pillow-10.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:0689b5a8c5288bc0504d9fcee48f61a6a586b9b98514d7d29b840143d6734f39"},
{file = "pillow-10.2.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:b792a349405fbc0163190fde0dc7b3fef3c9268292586cf5645598b48e63dc67"},
{file = "pillow-10.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c570f24be1e468e3f0ce7ef56a89a60f0e05b30a3669a459e419c6eac2c35364"},
{file = "pillow-10.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8ecd059fdaf60c1963c58ceb8997b32e9dc1b911f5da5307aab614f1ce5c2fb"},
{file = "pillow-10.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c365fd1703040de1ec284b176d6af5abe21b427cb3a5ff68e0759e1e313a5e7e"},
{file = "pillow-10.2.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:70c61d4c475835a19b3a5aa42492409878bbca7438554a1f89d20d58a7c75c01"},
{file = "pillow-10.2.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:b6f491cdf80ae540738859d9766783e3b3c8e5bd37f5dfa0b76abdecc5081f13"},
{file = "pillow-10.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9d189550615b4948f45252d7f005e53c2040cea1af5b60d6f79491a6e147eef7"},
{file = "pillow-10.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:49d9ba1ed0ef3e061088cd1e7538a0759aab559e2e0a80a36f9fd9d8c0c21591"},
{file = "pillow-10.2.0-cp39-cp39-win32.whl", hash = "sha256:babf5acfede515f176833ed6028754cbcd0d206f7f614ea3447d67c33be12516"},
{file = "pillow-10.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:0304004f8067386b477d20a518b50f3fa658a28d44e4116970abfcd94fac34a8"},
{file = "pillow-10.2.0-cp39-cp39-win_arm64.whl", hash = "sha256:0fb3e7fc88a14eacd303e90481ad983fd5b69c761e9e6ef94c983f91025da869"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-macosx_10_10_x86_64.whl", hash = "sha256:322209c642aabdd6207517e9739c704dc9f9db943015535783239022002f054a"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3eedd52442c0a5ff4f887fab0c1c0bb164d8635b32c894bc1faf4c618dd89df2"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb28c753fd5eb3dd859b4ee95de66cc62af91bcff5db5f2571d32a520baf1f04"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:33870dc4653c5017bf4c8873e5488d8f8d5f8935e2f1fb9a2208c47cdd66efd2"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:3c31822339516fb3c82d03f30e22b1d038da87ef27b6a78c9549888f8ceda39a"},
{file = "pillow-10.2.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a2b56ba36e05f973d450582fb015594aaa78834fefe8dfb8fcd79b93e64ba4c6"},
{file = "pillow-10.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:d8e6aeb9201e655354b3ad049cb77d19813ad4ece0df1249d3c793de3774f8c7"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:2247178effb34a77c11c0e8ac355c7a741ceca0a732b27bf11e747bbc950722f"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:15587643b9e5eb26c48e49a7b33659790d28f190fc514a322d55da2fb5c2950e"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753cd8f2086b2b80180d9b3010dd4ed147efc167c90d3bf593fe2af21265e5a5"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:7c8f97e8e7a9009bcacbe3766a36175056c12f9a44e6e6f2d5caad06dcfbf03b"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d1b35bcd6c5543b9cb547dee3150c93008f8dd0f1fef78fc0cd2b141c5baf58a"},
{file = "pillow-10.2.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:fe4c15f6c9285dc54ce6553a3ce908ed37c8f3825b5a51a15c91442bb955b868"},
{file = "pillow-10.2.0.tar.gz", hash = "sha256:e87f0b2c78157e12d7686b27d63c070fd65d994e8ddae6f328e0dcf4a0cd007e"},
]
[package.extras]
docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-removed-in", "sphinxext-opengraph"]
fpx = ["olefile"]
mic = ["olefile"]
tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]
typing = ["typing-extensions"]
xmp = ["defusedxml"]
[[package]]
name = "pkginfo"
@@ -2475,13 +2479,13 @@ doc = ["Sphinx", "sphinx-rtd-theme"]
[[package]]
name = "sentry-sdk"
version = "1.35.0"
version = "1.39.1"
description = "Python client for Sentry (https://sentry.io)"
optional = true
python-versions = "*"
files = [
{file = "sentry-sdk-1.35.0.tar.gz", hash = "sha256:04e392db9a0d59bd49a51b9e3a92410ac5867556820465057c2ef89a38e953e9"},
{file = "sentry_sdk-1.35.0-py2.py3-none-any.whl", hash = "sha256:a7865952701e46d38b41315c16c075367675c48d049b90a4cc2e41991ebc7efa"},
{file = "sentry-sdk-1.39.1.tar.gz", hash = "sha256:320a55cdf9da9097a0bead239c35b7e61f53660ef9878861824fd6d9b2eaf3b5"},
{file = "sentry_sdk-1.39.1-py2.py3-none-any.whl", hash = "sha256:81b5b9ffdd1a374e9eb0c053b5d2012155db9cbe76393a8585677b753bd5fdc1"},
]
[package.dependencies]
@@ -2520,13 +2524,13 @@ tornado = ["tornado (>=5)"]
[[package]]
name = "service-identity"
version = "23.1.0"
version = "24.1.0"
description = "Service identity verification for pyOpenSSL & cryptography."
optional = false
python-versions = ">=3.8"
files = [
{file = "service_identity-23.1.0-py3-none-any.whl", hash = "sha256:87415a691d52fcad954a500cb81f424d0273f8e7e3ee7d766128f4575080f383"},
{file = "service_identity-23.1.0.tar.gz", hash = "sha256:ecb33cd96307755041e978ab14f8b14e13b40f1fbd525a4dc78f46d2b986431d"},
{file = "service_identity-24.1.0-py3-none-any.whl", hash = "sha256:a28caf8130c8a5c1c7a6f5293faaf239bbfb7751e4862436920ee6f2616f568a"},
{file = "service_identity-24.1.0.tar.gz", hash = "sha256:6829c9d62fb832c2e1c435629b0a8c476e1929881f28bee4d20bc24161009221"},
]
[package.dependencies]
@@ -2536,7 +2540,7 @@ pyasn1 = "*"
pyasn1-modules = "*"
[package.extras]
dev = ["pyopenssl", "service-identity[docs,idna,mypy,tests]"]
dev = ["pyopenssl", "service-identity[idna,mypy,tests]"]
docs = ["furo", "myst-parser", "pyopenssl", "sphinx", "sphinx-notfound-page"]
idna = ["idna"]
mypy = ["idna", "mypy", "types-pyopenssl"]
@@ -3037,24 +3041,24 @@ files = [
[[package]]
name = "types-commonmark"
version = "0.9.2.4"
version = "0.9.2.20240106"
description = "Typing stubs for commonmark"
optional = false
python-versions = "*"
python-versions = ">=3.8"
files = [
{file = "types-commonmark-0.9.2.4.tar.gz", hash = "sha256:2c6486f65735cf18215cca3e962b17787fa545be279306f79b801f64a5319959"},
{file = "types_commonmark-0.9.2.4-py3-none-any.whl", hash = "sha256:d5090fa685c3e3c0ec3a5973ff842000baef6d86f762d52209b3c5e9fbd0b555"},
{file = "types-commonmark-0.9.2.20240106.tar.gz", hash = "sha256:52a062b71766d6ab258fca2d8e19fb0853796e25ca9afa9d0f67a1e42c93479f"},
{file = "types_commonmark-0.9.2.20240106-py3-none-any.whl", hash = "sha256:606d9de1e3a96cab0b1c0b6cccf4df099116148d1d864d115fde2e27ad6877c3"},
]
[[package]]
name = "types-jsonschema"
version = "4.20.0.0"
version = "4.20.0.20240105"
description = "Typing stubs for jsonschema"
optional = false
python-versions = ">=3.8"
files = [
{file = "types-jsonschema-4.20.0.0.tar.gz", hash = "sha256:0de1032d243f1d3dba8b745ad84efe8c1af71665a9deb1827636ac535dcb79c1"},
{file = "types_jsonschema-4.20.0.0-py3-none-any.whl", hash = "sha256:e6d5df18aaca4412f0aae246a294761a92040e93d7bc840f002b7329a8b72d26"},
{file = "types-jsonschema-4.20.0.20240105.tar.gz", hash = "sha256:4a71af7e904498e7ad055149f6dc1eee04153b59a99ad7dd17aa3769c9bc5982"},
{file = "types_jsonschema-4.20.0.20240105-py3-none-any.whl", hash = "sha256:26706cd70a273e59e718074c4e756608a25ba61327a7f9a4493ebd11941e5ad4"},
]
[package.dependencies]
@@ -3156,13 +3160,13 @@ files = [
[[package]]
name = "typing-extensions"
version = "4.8.0"
version = "4.9.0"
description = "Backported and Experimental Type Hints for Python 3.8+"
optional = false
python-versions = ">=3.8"
files = [
{file = "typing_extensions-4.8.0-py3-none-any.whl", hash = "sha256:8f92fc8806f9a6b641eaa5318da32b44d401efaac0f6678c9bc448ba3605faa0"},
{file = "typing_extensions-4.8.0.tar.gz", hash = "sha256:df8e4339e9cb77357558cbdbceca33c303714cf861d1eef15e1070055ae8b7ef"},
{file = "typing_extensions-4.9.0-py3-none-any.whl", hash = "sha256:af72aea155e91adfc61c3ae9e0e342dbc0cba726d6cba4b6c72c1f34e47291cd"},
{file = "typing_extensions-4.9.0.tar.gz", hash = "sha256:23478f88c37f27d76ac8aee6c905017a143b0b1b886c3c9f66bc2fd94f9f5783"},
]
[[package]]


@@ -299,6 +299,19 @@ def _parse_oidc_config_dict(
config_path + ("client_secret",),
)
# If no client secret is specified then the auth method must be None
client_auth_method = oidc_config.get("client_auth_method")
if client_secret is None and client_secret_jwt_key is None:
if client_auth_method is None:
client_auth_method = "none"
elif client_auth_method != "none":
raise ConfigError(
"No 'client_secret' is set in OIDC config, and 'client_auth_method' is not set to 'none'"
)
if client_auth_method is None:
client_auth_method = "client_secret_basic"
return OidcProviderConfig(
idp_id=idp_id,
idp_name=oidc_config.get("idp_name", "OIDC"),
@@ -309,7 +322,7 @@ def _parse_oidc_config_dict(
client_id=oidc_config["client_id"],
client_secret=client_secret,
client_secret_jwt_key=client_secret_jwt_key,
client_auth_method=oidc_config.get("client_auth_method", "client_secret_basic"),
client_auth_method=client_auth_method,
pkce_method=oidc_config.get("pkce_method", "auto"),
scopes=oidc_config.get("scopes", ["openid"]),
authorization_endpoint=oidc_config.get("authorization_endpoint"),
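
The hunk above is the crux of the OIDC fix: with no secret material configured, client_auth_method now defaults to "none", and an explicit non-"none" method is rejected. A minimal, self-contained sketch of that rule (a standalone function, not Synapse's actual _parse_oidc_config_dict):

from typing import Optional

def resolve_client_auth_method(
    client_secret: Optional[str],
    client_secret_jwt_key: Optional[str],
    client_auth_method: Optional[str],
) -> str:
    # No secret material: default to "none"; any other explicit method
    # cannot authenticate and is a configuration error.
    if client_secret is None and client_secret_jwt_key is None:
        if client_auth_method is None:
            return "none"
        if client_auth_method != "none":
            raise ValueError(
                "No 'client_secret' is set in OIDC config, and "
                "'client_auth_method' is not set to 'none'"
            )
        return client_auth_method
    # Secret material present: keep the historical default.
    return client_auth_method or "client_secret_basic"

assert resolve_client_auth_method(None, None, None) == "none"
assert resolve_client_auth_method("s3cret", None, None) == "client_secret_basic"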


@@ -404,7 +404,7 @@ _DEFAULT_SERIALIZE_EVENT_CONFIG = SerializeEventConfig()
def serialize_event(
e: Union[JsonDict, EventBase],
e: EventBase,
time_now_ms: int,
*,
config: SerializeEventConfig = _DEFAULT_SERIALIZE_EVENT_CONFIG,
@@ -420,10 +420,6 @@ def serialize_event(
The serialized event dictionary.
"""
# FIXME(erikj): To handle the case of presence events and the like
if not isinstance(e, EventBase):
return e
time_now_ms = int(time_now_ms)
# Should this strip out None's?
@@ -531,7 +527,7 @@ class EventClientSerializer:
async def serialize_event(
self,
event: Union[JsonDict, EventBase],
event: EventBase,
time_now: int,
*,
config: SerializeEventConfig = _DEFAULT_SERIALIZE_EVENT_CONFIG,
@@ -549,10 +545,6 @@ class EventClientSerializer:
Returns:
The serialized event
"""
# To handle the case of presence events and the like
if not isinstance(event, EventBase):
return event
serialized_event = serialize_event(event, time_now, config=config)
new_unsigned = {}
@@ -656,7 +648,7 @@ class EventClientSerializer:
async def serialize_events(
self,
events: Iterable[Union[JsonDict, EventBase]],
events: Iterable[EventBase],
time_now: int,
*,
config: SerializeEventConfig = _DEFAULT_SERIALIZE_EVENT_CONFIG,
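
These serializer hunks narrow Union[JsonDict, EventBase] to plain EventBase and drop the runtime isinstance pass-through. A toy illustration of the pattern (illustrative names, not the real serializer): once callers guarantee event objects, the escape hatch for already-serialized dicts becomes dead code and the type checker enforces the contract instead.

from dataclasses import dataclass
from typing import Any, Dict

@dataclass
class ToyEvent:
    type: str
    content: Dict[str, Any]

def serialize(e: ToyEvent, time_now_ms: int) -> Dict[str, Any]:
    # No `if not isinstance(e, ToyEvent): return e` guard needed: the
    # signature promises an event, so raw dicts are now a type error.
    return {"type": e.type, "content": e.content, "ts": int(time_now_ms)}

print(serialize(ToyEvent("m.room.message", {"body": "hi"}), 0))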


@@ -20,7 +20,7 @@
import logging
import random
from typing import TYPE_CHECKING, Iterable, List, Optional
from typing import TYPE_CHECKING, Iterable, List, Optional, cast
from synapse.api.constants import EduTypes, EventTypes, Membership, PresenceState
from synapse.api.errors import AuthError, SynapseError
@@ -29,7 +29,7 @@ from synapse.events.utils import SerializeEventConfig
from synapse.handlers.presence import format_user_presence_state
from synapse.storage.databases.main.events_worker import EventRedactBehaviour
from synapse.streams.config import PaginationConfig
from synapse.types import JsonDict, Requester, UserID
from synapse.types import JsonDict, Requester, StreamKeyType, UserID
from synapse.visibility import filter_events_for_client
if TYPE_CHECKING:
@@ -93,49 +93,54 @@ class EventStreamHandler:
is_guest=requester.is_guest,
explicit_room_id=room_id,
)
events = stream_result.events
events_by_source = stream_result.events_by_source
time_now = self.clock.time_msec()
# When the user joins a new room, or another user joins a currently
# joined room, we need to send down presence for those users.
to_add: List[JsonDict] = []
for event in events:
if not isinstance(event, EventBase):
to_return: List[JsonDict] = []
for keyname, source_events in events_by_source.items():
if keyname != StreamKeyType.ROOM:
e = cast(List[JsonDict], source_events)
to_return.extend(e)
continue
if event.type == EventTypes.Member:
if event.membership != Membership.JOIN:
continue
# Send down presence.
if event.state_key == requester.user.to_string():
# Send down presence for everyone in the room.
users: Iterable[str] = await self.store.get_users_in_room(
event.room_id
events = cast(List[EventBase], source_events)
serialized_events = await self._event_serializer.serialize_events(
events,
time_now,
config=SerializeEventConfig(
as_client_event=as_client_event, requester=requester
),
)
to_return.extend(serialized_events)
for event in events:
if event.type == EventTypes.Member:
if event.membership != Membership.JOIN:
continue
# Send down presence.
if event.state_key == requester.user.to_string():
# Send down presence for everyone in the room.
users: Iterable[str] = await self.store.get_users_in_room(
event.room_id
)
else:
users = [event.state_key]
states = await presence_handler.get_states(users)
to_return.extend(
{
"type": EduTypes.PRESENCE,
"content": format_user_presence_state(state, time_now),
}
for state in states
)
else:
users = [event.state_key]
states = await presence_handler.get_states(users)
to_add.extend(
{
"type": EduTypes.PRESENCE,
"content": format_user_presence_state(state, time_now),
}
for state in states
)
events.extend(to_add)
chunks = await self._event_serializer.serialize_events(
events,
time_now,
config=SerializeEventConfig(
as_client_event=as_client_event, requester=requester
),
)
chunk = {
"chunk": chunks,
"chunk": to_return,
"start": await stream_result.start_token.to_string(self.store),
"end": await stream_result.end_token.to_string(self.store),
}
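
The rewrite above hinges on handling each stream source by kind: only the room source yields EventBase objects that need serializing; presence, typing and the like are already JSON-shaped and pass straight through. A rough sketch of that split, with stand-in names:

from typing import Any, Callable, Dict, List

ROOM_KEY = "room_key"  # stand-in for StreamKeyType.ROOM

def flatten_stream(
    events_by_source: Dict[str, List[Any]],
    serialize: Callable[[Any], Dict[str, Any]],
) -> List[Dict[str, Any]]:
    out: List[Dict[str, Any]] = []
    for keyname, source_events in events_by_source.items():
        if keyname != ROOM_KEY:
            out.extend(source_events)  # already JsonDicts
        else:
            out.extend(serialize(e) for e in source_events)
    return out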


@@ -94,7 +94,7 @@ from synapse.types import (
)
from synapse.types.state import StateFilter
from synapse.util.async_helpers import Linearizer, concurrently_execute
from synapse.util.iterutils import batch_iter, partition, sorted_topologically_batched
from synapse.util.iterutils import batch_iter, partition, sorted_topologically
from synapse.util.retryutils import NotRetryingDestination
from synapse.util.stringutils import shortstr
@@ -1141,16 +1141,8 @@ class FederationEventHandler:
partial_state_flags = await self._store.get_partial_state_events(seen)
partial_state = any(partial_state_flags.values())
# Get the state of the events we know about
ours = await self._state_storage_controller.get_state_groups_ids(
room_id, seen, await_full_state=False
)
# state_maps is a list of mappings from (type, state_key) to event_id
state_maps: List[StateMap[str]] = list(ours.values())
# we don't need this any more, let's delete it.
del ours
state_maps: List[StateMap[str]] = []
# Ask the remote server for the states we don't
# know about
@@ -1169,6 +1161,17 @@ class FederationEventHandler:
state_maps.append(remote_state_map)
# Get the state of the events we know about. We do this *after*
# trying to fetch missing state over federation as that might fail
# and then we can skip loading the local state.
ours = await self._state_storage_controller.get_state_groups_ids(
room_id, seen, await_full_state=False
)
state_maps.extend(ours.values())
# we don't need this any more, let's delete it.
del ours
room_version = await self._store.get_room_version_id(room_id)
state_map = await self._state_resolution_handler.resolve_events_with_store(
room_id,
@@ -1678,57 +1681,36 @@ class FederationEventHandler:
# We need to persist an event's auth events before the event.
auth_graph = {
ev: [event_map[e_id] for e_id in ev.auth_event_ids() if e_id in event_map]
ev.event_id: [e_id for e_id in ev.auth_event_ids() if e_id in event_map]
for ev in event_map.values()
}
for roots in sorted_topologically_batched(event_map.values(), auth_graph):
if not roots:
# if *none* of the remaining events are ready, that means
# we have a loop. This either means a bug in our logic, or that
# somebody has managed to create a loop (which requires finding a
# hash collision in room v2 and later).
logger.warning(
"Loop found in auth events while fetching missing state/auth "
"events: %s",
shortstr(event_map.keys()),
)
return
sorted_auth_event_ids = sorted_topologically(event_map.keys(), auth_graph)
sorted_auth_events = [event_map[e_id] for e_id in sorted_auth_event_ids]
logger.info(
"Persisting %i remaining outliers: %s",
len(sorted_auth_events),
shortstr(e.event_id for e in sorted_auth_events),
)
logger.info(
"Persisting %i of %i remaining outliers: %s",
len(roots),
len(event_map),
shortstr(e.event_id for e in roots),
)
await self._auth_and_persist_outliers_inner(room_id, roots)
async def _auth_and_persist_outliers_inner(
self, room_id: str, fetched_events: Collection[EventBase]
) -> None:
"""Helper for _auth_and_persist_outliers
Persists a batch of events where we have (theoretically) already persisted all
of their auth events.
Marks the events as outliers, auths them, persists them to the database, and,
where appropriate (eg, an invite), awakes the notifier.
Params:
origin: where the events came from
room_id: the room that the events are meant to be in (though this has
not yet been checked)
fetched_events: the events to persist
"""
# get all the auth events for all the events in this batch. By now, they should
# have been persisted.
auth_events = {
aid for event in fetched_events for aid in event.auth_event_ids()
auth_event_ids = {
aid for event in sorted_auth_events for aid in event.auth_event_ids()
}
persisted_events = await self._store.get_events(
auth_events,
allow_rejected=True,
)
auth_map = {
ev.event_id: ev
for ev in sorted_auth_events
if ev.event_id in auth_event_ids
}
missing_events = auth_event_ids.difference(auth_map)
if missing_events:
persisted_events = await self._store.get_events(
missing_events,
allow_rejected=True,
redact_behaviour=EventRedactBehaviour.as_is,
)
auth_map.update(persisted_events)
events_and_contexts_to_persist: List[Tuple[EventBase, EventContext]] = []
@@ -1736,7 +1718,7 @@ class FederationEventHandler:
with nested_logging_context(suffix=event.event_id):
auth = []
for auth_event_id in event.auth_event_ids():
ae = persisted_events.get(auth_event_id)
ae = auth_map.get(auth_event_id)
if not ae:
# the fact we can't find the auth event doesn't mean it doesn't
# exist, which means it is premature to reject `event`. Instead we
@@ -1755,7 +1737,9 @@ class FederationEventHandler:
context = EventContext.for_outlier(self._storage_controllers)
try:
validate_event_for_room_version(event)
await check_state_independent_auth_rules(self._store, event)
await check_state_independent_auth_rules(
self._store, event, batched_auth_events=auth_map
)
check_state_dependent_auth_rules(event, auth)
except AuthError as e:
logger.warning("Rejecting %r because %s", event, e)
@@ -1772,7 +1756,7 @@ class FederationEventHandler:
events_and_contexts_to_persist.append((event, context))
for event in fetched_events:
for event in sorted_auth_events:
await prep(event)
await self.persist_events_and_notify(
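
The property this code relies on is that sorted_topologically emits each event only after all of its auth events. A self-contained Kahn-style stand-in under that assumption (the real helper lives in synapse.util.iterutils and may differ in detail):

from collections import deque
from typing import Dict, Iterable, List

def sorted_topologically(nodes: Iterable[str], graph: Dict[str, List[str]]) -> List[str]:
    # `graph` maps each node to the nodes that must sort before it.
    nodes = list(nodes)
    indegree = {n: 0 for n in nodes}
    dependents: Dict[str, List[str]] = {n: [] for n in nodes}
    for node, deps in graph.items():
        for dep in deps:
            indegree[node] += 1
            dependents[dep].append(node)
    queue = deque(n for n in nodes if indegree[n] == 0)
    out: List[str] = []
    while queue:
        n = queue.popleft()
        out.append(n)
        for m in dependents[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                queue.append(m)
    return out

auth_graph = {"$a": [], "$b": ["$a"], "$c": ["$a", "$b"]}
print(sorted_topologically(auth_graph.keys(), auth_graph))  # ['$a', '$b', '$c']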


@@ -44,6 +44,7 @@ from synapse.api.ratelimiting import Ratelimiter
from synapse.config.ratelimiting import RatelimitSettings
from synapse.events import EventBase
from synapse.types import JsonDict, Requester, StrCollection
from synapse.types.state import StateFilter
from synapse.util.caches.response_cache import ResponseCache
if TYPE_CHECKING:
@@ -546,7 +547,16 @@ class RoomSummaryHandler:
Returns:
True if the room is accessible to the requesting user or server.
"""
state_ids = await self._storage_controllers.state.get_current_state_ids(room_id)
event_types = [
(EventTypes.JoinRules, ""),
(EventTypes.RoomHistoryVisibility, ""),
]
if requester:
event_types.append((EventTypes.Member, requester))
state_ids = await self._storage_controllers.state.get_current_state_ids(
room_id, state_filter=StateFilter.from_types(event_types)
)
# If there's no state for the room, it isn't known.
if not state_ids:
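
Rather than materialising the room's full current state, the check above now fetches exactly the two or three keys it reads. A toy stand-in for StateFilter.from_types showing the shape of that narrowing (the real class also tracks an include_others flag; the requester value here is hypothetical):

from typing import Dict, FrozenSet, List, Optional, Tuple

def from_types(
    types: List[Tuple[str, Optional[str]]]
) -> Dict[str, FrozenSet[Optional[str]]]:
    # Group the wanted state keys by event type.
    grouped: Dict[str, set] = {}
    for typ, state_key in types:
        grouped.setdefault(typ, set()).add(state_key)
    return {t: frozenset(ks) for t, ks in grouped.items()}

wanted: List[Tuple[str, Optional[str]]] = [
    ("m.room.join_rules", ""),
    ("m.room.history_visibility", ""),
    ("m.room.member", "@alice:example.org"),
]
print(from_types(wanted))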


@@ -583,10 +583,11 @@ class SyncHandler:
# `recents`, so partial state is only a problem when a membership
# event turns up in `recents` but has not made it into the current
# state.
current_state_ids_map = (
await self.store.get_partial_current_state_ids(room_id)
current_state_ids = (
await self.store.check_if_events_in_current_state(
{e.event_id for e in recents if e.is_state()}
)
)
current_state_ids = frozenset(current_state_ids_map.values())
recents = await filter_events_for_client(
self._storage_controllers,
@@ -667,10 +668,11 @@ class SyncHandler:
# `loaded_recents`, so partial state is only a problem when a
# membership event turns up in `loaded_recents` but has not made it
# into the current state.
current_state_ids_map = (
await self.store.get_partial_current_state_ids(room_id)
current_state_ids = (
await self.store.check_if_events_in_current_state(
{e.event_id for e in loaded_recents if e.is_state()}
)
)
current_state_ids = frozenset(current_state_ids_map.values())
loaded_recents = await filter_events_for_client(
self._storage_controllers,
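
Both sync hunks swap "fetch the whole current-state map and take its values" for a targeted membership test over just the state events in the timeline. A toy in-memory equivalent of check_if_events_in_current_state:

from typing import FrozenSet, Set

CURRENT_STATE_EVENT_IDS: Set[str] = {"$join", "$topic"}  # toy table

def check_if_events_in_current_state(event_ids: Set[str]) -> FrozenSet[str]:
    # The real version is a single SELECT against current_state_events;
    # a set intersection plays that role here.
    return frozenset(CURRENT_STATE_EVENT_IDS & event_ids)

print(check_if_events_in_current_state({"$join", "$stale_join"}))
# frozenset({'$join'})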


@@ -198,12 +198,12 @@ class _NotifierUserStream:
@attr.s(slots=True, frozen=True, auto_attribs=True)
class EventStreamResult:
events: List[Union[JsonDict, EventBase]]
events_by_source: Dict[StreamKeyType, List[Union[JsonDict, EventBase]]]
start_token: StreamToken
end_token: StreamToken
def __bool__(self) -> bool:
return bool(self.events)
return any(bool(e) for e in self.events_by_source.values())
@attr.s(slots=True, frozen=True, auto_attribs=True)
@@ -694,12 +694,12 @@ class Notifier:
before_token: StreamToken, after_token: StreamToken
) -> EventStreamResult:
if after_token == before_token:
return EventStreamResult([], from_token, from_token)
return EventStreamResult({}, from_token, from_token)
# The events fetched from each source are a JsonDict, EventBase, or
# UserPresenceState, but see below for UserPresenceState being
# converted to JsonDict.
events: List[Union[JsonDict, EventBase]] = []
events_by_source: Dict[StreamKeyType, List[Union[JsonDict, EventBase]]] = {}
end_token = from_token
for keyname, source in self.event_sources.sources.get_sources():
@@ -734,10 +734,12 @@ class Notifier:
for event in new_events
]
events.extend(new_events)
if new_events:
events_by_source.setdefault(keyname, []).extend(new_events)
end_token = end_token.copy_and_replace(keyname, new_key)
return EventStreamResult(events, from_token, end_token)
return EventStreamResult(events_by_source, from_token, end_token)
user_id_for_stream = user.to_string()
if is_peeking:
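
A minimal sketch of the reshaped result type: truthiness now means "at least one source produced events", which matches the old flat-list check (attrs is already used throughout this file):

from typing import Dict, List

import attr

@attr.s(slots=True, frozen=True, auto_attribs=True)
class ToyEventStreamResult:
    events_by_source: Dict[str, List[dict]]

    def __bool__(self) -> bool:
        return any(bool(e) for e in self.events_by_source.values())

assert not ToyEventStreamResult({})
assert not ToyEventStreamResult({"presence": []})
assert ToyEventStreamResult({"room": [{"type": "m.room.message"}]})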


@@ -28,17 +28,11 @@ from synapse.storage.databases.main import DataStore
async def get_badge_count(store: DataStore, user_id: str, group_by_room: bool) -> int:
invites = await store.get_invited_rooms_for_local_user(user_id)
joins = await store.get_rooms_for_user(user_id)
badge = len(invites)
room_to_count = await store.get_unread_counts_by_room_for_user(user_id)
for room_id, notify_count in room_to_count.items():
# room_to_count may include rooms which the user has left,
# ignore those.
if room_id not in joins:
continue
for _room_id, notify_count in room_to_count.items():
if notify_count == 0:
continue
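
With the membership filtering pushed into SQL, the Python side reduces to a straight sum. A toy version of the surviving loop (assuming group_by_room, since the hunk is cut off before the else branch):

from typing import Dict, List

def get_badge_count(invites: List[str], room_to_count: Dict[str, int]) -> int:
    badge = len(invites)
    for _room_id, notify_count in room_to_count.items():
        if notify_count == 0:
            continue
        badge += 1  # one badge per room with unread notifications
    return badge

assert get_badge_count(["!invite:hs"], {"!a:hs": 3, "!b:hs": 0}) == 2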


@@ -245,33 +245,74 @@ class DeviceInboxWorkerStore(SQLBaseStore):
* The last-processed stream ID. Subsequent calls of this function with the
same device should pass this value as 'from_stream_id'.
"""
(
user_id_device_id_to_messages,
last_processed_stream_id,
) = await self._get_device_messages(
user_ids=[user_id],
device_id=device_id,
from_stream_id=from_stream_id,
to_stream_id=to_stream_id,
limit=limit,
)
if not user_id_device_id_to_messages:
if not self._device_inbox_stream_cache.has_entity_changed(
user_id, from_stream_id
):
# There were no messages!
return [], to_stream_id
# Extract the messages, no need to return the user and device ID again
to_device_messages = user_id_device_id_to_messages.get((user_id, device_id), [])
def get_device_messages_txn(
txn: LoggingTransaction,
) -> Tuple[List[JsonDict], int]:
sql = """
SELECT stream_id, message_json FROM device_inbox
WHERE user_id = ? AND device_id = ?
AND ? < stream_id AND stream_id <= ?
ORDER BY stream_id ASC
LIMIT ?
"""
txn.execute(sql, (user_id, device_id, from_stream_id, to_stream_id, limit))
return to_device_messages, last_processed_stream_id
# Collect the messages intended for this device, tracking the last
# stream position processed.
last_processed_stream_pos = to_stream_id
to_device_messages: List[JsonDict] = []
rowcount = 0
for row in txn:
rowcount += 1
last_processed_stream_pos = row[0]
message_dict = db_to_json(row[1])
# Store the device details
to_device_messages.append(message_dict)
# start a new span for each message, so that we can tag each separately
with start_active_span("get_to_device_message"):
set_tag(SynapseTags.TO_DEVICE_TYPE, message_dict["type"])
set_tag(SynapseTags.TO_DEVICE_SENDER, message_dict["sender"])
set_tag(SynapseTags.TO_DEVICE_RECIPIENT, user_id)
set_tag(SynapseTags.TO_DEVICE_RECIPIENT_DEVICE, device_id)
set_tag(
SynapseTags.TO_DEVICE_MSGID,
message_dict["content"].get(EventContentFields.TO_DEVICE_MSGID),
)
if rowcount == limit:
# We ended up bumping up against the message limit. There may be more messages
# to retrieve. Return what we have, as well as the last stream position that
# was processed.
#
# The caller is expected to set this as the lower (exclusive) bound
# for the next query of this device.
return to_device_messages, last_processed_stream_pos
# The limit was not reached, thus we know that to_device_messages
# contains all to-device messages for the given device and stream id range.
#
# We return to_stream_id, which the caller should then provide as the lower
# (exclusive) bound on the next query of this device.
return to_device_messages, to_stream_id
return await self.db_pool.runInteraction(
"get_messages_for_device", get_device_messages_txn
)
async def _get_device_messages(
self,
user_ids: Collection[str],
from_stream_id: int,
to_stream_id: int,
device_id: Optional[str] = None,
limit: Optional[int] = None,
) -> Tuple[Dict[Tuple[str, str], List[JsonDict]], int]:
"""
Retrieve pending to-device messages for a collection of user devices.
@@ -291,11 +332,7 @@ class DeviceInboxWorkerStore(SQLBaseStore):
user_ids: The user IDs to filter device messages by.
from_stream_id: The lower boundary of stream id to filter with (exclusive).
to_stream_id: The upper boundary of stream id to filter with (inclusive).
device_id: A device ID to query to-device messages for. If not provided, to-device
messages from all device IDs for the given user IDs will be queried. May not be
provided if `user_ids` contains more than one entry.
limit: The maximum number of to-device messages to return. Can only be used when
passing a single user ID / device ID tuple.
Returns:
A tuple containing:
@@ -308,30 +345,7 @@ class DeviceInboxWorkerStore(SQLBaseStore):
logger.warning("No users provided upon querying for device IDs")
return {}, to_stream_id
# Prevent a query for one user's device also retrieving another user's device with
# the same device ID (device IDs are not unique across users).
if len(user_ids) > 1 and device_id is not None:
raise AssertionError(
"Programming error: 'device_id' cannot be supplied to "
"_get_device_messages when >1 user_id has been provided"
)
# A limit can only be applied when querying for a single user ID / device ID tuple.
# See the docstring of this function for more details.
if limit is not None and device_id is None:
raise AssertionError(
"Programming error: _get_device_messages was passed 'limit' "
"without a specific user_id/device_id"
)
user_ids_to_query: Set[str] = set()
device_ids_to_query: Set[str] = set()
# Note that a device ID could be an empty str
if device_id is not None:
# If a device ID was passed, use it to filter results.
# Otherwise, device IDs will be derived from the given collection of user IDs.
device_ids_to_query.add(device_id)
# Determine which users have devices with pending messages
for user_id in user_ids:
@@ -355,20 +369,20 @@ class DeviceInboxWorkerStore(SQLBaseStore):
# hidden devices should not receive to-device messages.
# Note that this is more efficient than just dropping `device_id` from the query,
# since device_inbox has an index on `(user_id, device_id, stream_id)`
if not device_ids_to_query:
user_device_dicts = cast(
List[Tuple[str]],
self.db_pool.simple_select_many_txn(
txn,
table="devices",
column="user_id",
iterable=user_ids_to_query,
keyvalues={"hidden": False},
retcols=("device_id",),
),
)
device_ids_to_query.update({row[0] for row in user_device_dicts})
user_device_dicts = cast(
List[Tuple[str]],
self.db_pool.simple_select_many_txn(
txn,
table="devices",
column="user_id",
iterable=user_ids_to_query,
keyvalues={"hidden": False},
retcols=("device_id",),
),
)
device_ids_to_query = {row[0] for row in user_device_dicts}
if not device_ids_to_query:
# We've ended up with no devices to query.
@@ -400,22 +414,15 @@ class DeviceInboxWorkerStore(SQLBaseStore):
to_stream_id,
)
# If a limit was provided, limit the data retrieved from the database
if limit is not None:
sql += "LIMIT ?"
sql_args += (limit,)
txn.execute(sql, sql_args)
# Create and fill a dictionary of (user ID, device ID) -> list of messages
# intended for each device.
last_processed_stream_pos = to_stream_id
recipient_device_to_messages: Dict[Tuple[str, str], List[JsonDict]] = {}
rowcount = 0
for row in txn:
rowcount += 1
last_processed_stream_pos = row[0]
recipient_user_id = row[1]
recipient_device_id = row[2]
message_dict = db_to_json(row[3])
@@ -436,18 +443,6 @@ class DeviceInboxWorkerStore(SQLBaseStore):
message_dict["content"].get(EventContentFields.TO_DEVICE_MSGID),
)
if limit is not None and rowcount == limit:
# We ended up bumping up against the message limit. There may be more messages
# to retrieve. Return what we have, as well as the last stream position that
# was processed.
#
# The caller is expected to set this as the lower (exclusive) bound
# for the next query of this device.
return recipient_device_to_messages, last_processed_stream_pos
# The limit was not reached, thus we know that recipient_device_to_messages
# contains all to-device messages for the given device and stream id range.
#
# We return to_stream_id, which the caller should then provide as the lower
# (exclusive) bound on the next query of this device.
return recipient_device_to_messages, to_stream_id
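
The new fast path exists because of the WHERE-clause shape: simple equality predicates let the database walk the (user_id, device_id, stream_id) index directly, whereas the generic multi-user helper builds list-membership clauses. A hypothetical dispatcher, purely to illustrate when each shape applies (the real code simply calls the dedicated transaction for the single-device /sync path):

SINGLE_DEVICE_SQL = """
    SELECT stream_id, message_json FROM device_inbox
    WHERE user_id = ? AND device_id = ?
    AND ? < stream_id AND stream_id <= ?
    ORDER BY stream_id ASC
    LIMIT ?
"""

MULTI_USER_SHAPE = "WHERE user_id = ANY(?)"  # the slower, generic shape

def pick_query(user_ids, device_id):
    if len(user_ids) == 1 and device_id is not None:
        return SINGLE_DEVICE_SQL
    return MULTI_USER_SHAPE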


@@ -1794,7 +1794,7 @@ class DeviceStore(DeviceWorkerStore, DeviceBackgroundUpdateStore):
device_ids: The IDs of the devices to delete
"""
def _delete_devices_txn(txn: LoggingTransaction) -> None:
def _delete_devices_txn(txn: LoggingTransaction, device_ids: List[str]) -> None:
self.db_pool.simple_delete_many_txn(
txn,
table="devices",
@@ -1811,7 +1811,11 @@ class DeviceStore(DeviceWorkerStore, DeviceBackgroundUpdateStore):
keyvalues={"user_id": user_id},
)
await self.db_pool.runInteraction("delete_devices", _delete_devices_txn)
for batch in batch_iter(device_ids, 100):
await self.db_pool.runInteraction(
"delete_devices", _delete_devices_txn, batch
)
for device_id in device_ids:
self.device_id_exists_cache.invalidate((user_id, device_id))
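
batch_iter keeps each DELETE transaction bounded by chunking the device IDs into groups of 100. A behaviour-equivalent stand-in (the real helper is synapse.util.iterutils.batch_iter):

from itertools import islice
from typing import Iterable, Iterator, Tuple, TypeVar

T = TypeVar("T")

def batch_iter(iterable: Iterable[T], size: int) -> Iterator[Tuple[T, ...]]:
    it = iter(iterable)
    # Call tuple(islice(...)) until it yields the empty-tuple sentinel.
    return iter(lambda: tuple(islice(it, size)), ())

device_ids = [f"DEV{i}" for i in range(250)]
print([len(b) for b in batch_iter(device_ids, 100)])  # [100, 100, 50]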


@@ -357,10 +357,6 @@ class EventPushActionsWorkerStore(ReceiptsWorkerStore, StreamWorkerStore, SQLBas
This function is intentionally not cached because it is called to calculate the
unread badge for push notifications and thus the result is expected to change.
Note that this function assumes the user is a member of the room. Because
summary rows are not removed when a user leaves a room, the caller must
filter out those results from the result.
Returns:
A map of room ID to notification counts for the given user.
"""
@@ -373,127 +369,170 @@ class EventPushActionsWorkerStore(ReceiptsWorkerStore, StreamWorkerStore, SQLBas
def _get_unread_counts_by_room_for_user_txn(
self, txn: LoggingTransaction, user_id: str
) -> Dict[str, int]:
receipt_types_clause, args = make_in_list_sql_clause(
# To get the badge count of all rooms we need to make three queries:
# 1. Fetch all counts from `event_push_summary`, discarding any stale
# rooms.
# 2. Fetch all notifications from `event_push_actions` that haven't
# been rotated yet.
# 3. Fetch all notifications from `event_push_actions` for the stale
# rooms.
#
# The "stale room" scenario generally happens when there is a new read
# receipt that hasn't yet been processed to update the
# `event_push_summary` table. When that happens we ignore the
# `event_push_summary` table for that room and calculate the count
# manually from `event_push_actions`.
# We need to only take into account read receipts of these types.
receipt_types_clause, receipt_types_args = make_in_list_sql_clause(
self.database_engine,
"receipt_type",
(ReceiptTypes.READ, ReceiptTypes.READ_PRIVATE),
)
args.extend([user_id, user_id])
receipts_cte = f"""
WITH all_receipts AS (
SELECT room_id, thread_id, MAX(event_stream_ordering) AS max_receipt_stream_ordering
FROM receipts_linearized
LEFT JOIN events USING (room_id, event_id)
WHERE
{receipt_types_clause}
AND user_id = ?
GROUP BY room_id, thread_id
)
"""
receipts_joins = """
LEFT JOIN (
SELECT room_id, thread_id,
max_receipt_stream_ordering AS threaded_receipt_stream_ordering
FROM all_receipts
WHERE thread_id IS NOT NULL
) AS threaded_receipts USING (room_id, thread_id)
LEFT JOIN (
SELECT room_id, thread_id,
max_receipt_stream_ordering AS unthreaded_receipt_stream_ordering
FROM all_receipts
WHERE thread_id IS NULL
) AS unthreaded_receipts USING (room_id)
"""
# First get summary counts by room / thread for the user. We use the max receipt
# stream ordering of both threaded & unthreaded receipts to compare against the
# summary table.
#
# PostgreSQL and SQLite differ in comparing scalar numerics.
if isinstance(self.database_engine, PostgresEngine):
# GREATEST ignores NULLs.
max_clause = """GREATEST(
threaded_receipt_stream_ordering,
unthreaded_receipt_stream_ordering
)"""
else:
# MAX returns NULL if any are NULL, so COALESCE to 0 first.
max_clause = """MAX(
COALESCE(threaded_receipt_stream_ordering, 0),
COALESCE(unthreaded_receipt_stream_ordering, 0)
)"""
# Step 1, fetch all counts from `event_push_summary` for the user. This
# is slightly convoluted as we also need to pull out the stream ordering
# of the most recent receipt of the user in the room (either a thread
# aware receipt or thread unaware receipt) in order to determine
# whether the row in `event_push_summary` is stale. Hence the outer
# GROUP BY and odd join condition against `receipts_linearized`.
sql = f"""
{receipts_cte}
SELECT eps.room_id, eps.thread_id, notif_count
FROM event_push_summary AS eps
{receipts_joins}
WHERE user_id = ?
AND notif_count != 0
AND (
(last_receipt_stream_ordering IS NULL AND stream_ordering > {max_clause})
OR last_receipt_stream_ordering = {max_clause}
SELECT room_id, notif_count, stream_ordering, thread_id, last_receipt_stream_ordering,
MAX(receipt_stream_ordering)
FROM (
SELECT e.room_id, notif_count, e.stream_ordering, e.thread_id, last_receipt_stream_ordering,
ev.stream_ordering AS receipt_stream_ordering
FROM event_push_summary AS e
INNER JOIN local_current_membership USING (user_id, room_id)
LEFT JOIN receipts_linearized AS r ON (
e.user_id = r.user_id
AND e.room_id = r.room_id
AND (e.thread_id = r.thread_id OR r.thread_id IS NULL)
AND {receipt_types_clause}
)
LEFT JOIN events AS ev ON (r.event_id = ev.event_id)
WHERE e.user_id = ? and notif_count > 0
) AS es
GROUP BY room_id, notif_count, stream_ordering, thread_id, last_receipt_stream_ordering
"""
txn.execute(sql, args)
seen_thread_ids = set()
room_to_count: Dict[str, int] = defaultdict(int)
for room_id, thread_id, notif_count in txn:
room_to_count[room_id] += notif_count
seen_thread_ids.add(thread_id)
# Now get any event push actions that haven't been rotated using the same OR
# join and filter by receipt and event push summary rotated up to stream ordering.
sql = f"""
{receipts_cte}
SELECT epa.room_id, epa.thread_id, COUNT(CASE WHEN epa.notif = 1 THEN 1 END) AS notif_count
FROM event_push_actions AS epa
{receipts_joins}
WHERE user_id = ?
AND epa.notif = 1
AND stream_ordering > (SELECT stream_ordering FROM event_push_summary_stream_ordering)
AND (threaded_receipt_stream_ordering IS NULL OR stream_ordering > threaded_receipt_stream_ordering)
AND (unthreaded_receipt_stream_ordering IS NULL OR stream_ordering > unthreaded_receipt_stream_ordering)
GROUP BY epa.room_id, epa.thread_id
"""
txn.execute(sql, args)
for room_id, thread_id, notif_count in txn:
# Note: only count push actions we have valid summaries for with up to date receipt.
if thread_id not in seen_thread_ids:
continue
room_to_count[room_id] += notif_count
thread_id_clause, thread_ids_args = make_in_list_sql_clause(
self.database_engine, "epa.thread_id", seen_thread_ids
txn.execute(
sql,
receipt_types_args
+ [
user_id,
],
)
room_to_count: Dict[str, int] = defaultdict(int)
stale_room_ids = set()
for row in txn:
room_id = row[0]
notif_count = row[1]
stream_ordering = row[2]
_thread_id = row[3]
last_receipt_stream_ordering = row[4]
receipt_stream_ordering = row[5]
if last_receipt_stream_ordering is None:
if receipt_stream_ordering is None:
room_to_count[room_id] += notif_count
elif stream_ordering > receipt_stream_ordering:
room_to_count[room_id] += notif_count
else:
# The latest read receipt from the user is after all the rows for
# this room in `event_push_summary`. We ignore them, and
# calculate the count from `event_push_actions` in step 3.
pass
elif last_receipt_stream_ordering == receipt_stream_ordering:
room_to_count[room_id] += notif_count
else:
# The row is stale if `last_receipt_stream_ordering` is set and
# *doesn't* match the latest receipt from the user.
stale_room_ids.add(room_id)
# Discard any stale rooms from `room_to_count`, as we will recalculate
# them in step 3.
for room_id in stale_room_ids:
room_to_count.pop(room_id, None)
# Step 2, basically the same query, except against `event_push_actions`
# and only fetching rows inserted since the last rotation.
rotated_upto_stream_ordering = self.db_pool.simple_select_one_onecol_txn(
txn,
table="event_push_summary_stream_ordering",
keyvalues={},
retcol="stream_ordering",
)
# Finally re-check event_push_actions for any rooms not in the summary, ignoring
# the rotated up-to position. This handles the case where a read receipt has arrived
# but not been rotated meaning the summary table is out of date, so we go back to
# the push actions table.
sql = f"""
{receipts_cte}
SELECT epa.room_id, COUNT(CASE WHEN epa.notif = 1 THEN 1 END) AS notif_count
FROM event_push_actions AS epa
{receipts_joins}
WHERE user_id = ?
AND NOT {thread_id_clause}
AND epa.notif = 1
AND (threaded_receipt_stream_ordering IS NULL OR stream_ordering > threaded_receipt_stream_ordering)
AND (unthreaded_receipt_stream_ordering IS NULL OR stream_ordering > unthreaded_receipt_stream_ordering)
GROUP BY epa.room_id
SELECT room_id, thread_id
FROM (
SELECT e.room_id, e.stream_ordering, e.thread_id,
ev.stream_ordering AS receipt_stream_ordering
FROM event_push_actions AS e
INNER JOIN local_current_membership USING (user_id, room_id)
LEFT JOIN receipts_linearized AS r ON (
e.user_id = r.user_id
AND e.room_id = r.room_id
AND (e.thread_id = r.thread_id OR r.thread_id IS NULL)
AND {receipt_types_clause}
)
LEFT JOIN events AS ev ON (r.event_id = ev.event_id)
WHERE e.user_id = ? and notif > 0
AND e.stream_ordering > ?
) AS es
GROUP BY room_id, stream_ordering, thread_id
HAVING stream_ordering > COALESCE(MAX(receipt_stream_ordering), 0)
"""
args.extend(thread_ids_args)
txn.execute(sql, args)
txn.execute(
sql,
receipt_types_args + [user_id, rotated_upto_stream_ordering],
)
for room_id, _thread_id in txn:
# Again, we ignore any stale rooms.
if room_id not in stale_room_ids:
# For event push actions it is one notification per row.
room_to_count[room_id] += 1
for room_id, notif_count in txn:
room_to_count[room_id] += notif_count
# Step 3, if we have stale rooms then we need to recalculate the counts
# from `event_push_actions`. Again, this is basically the same query as
# above except without a lower bound on stream ordering and only against
# a specific set of rooms.
if stale_room_ids:
room_id_clause, room_id_args = make_in_list_sql_clause(
self.database_engine,
"e.room_id",
stale_room_ids,
)
sql = f"""
SELECT room_id, thread_id
FROM (
SELECT e.room_id, e.stream_ordering, e.thread_id,
ev.stream_ordering AS receipt_stream_ordering
FROM event_push_actions AS e
INNER JOIN local_current_membership USING (user_id, room_id)
LEFT JOIN receipts_linearized AS r ON (
e.user_id = r.user_id
AND e.room_id = r.room_id
AND (e.thread_id = r.thread_id OR r.thread_id IS NULL)
AND {receipt_types_clause}
)
LEFT JOIN events AS ev ON (r.event_id = ev.event_id)
WHERE e.user_id = ? and notif > 0
AND {room_id_clause}
) AS es
GROUP BY room_id, stream_ordering, thread_id
HAVING stream_ordering > COALESCE(MAX(receipt_stream_ordering), 0)
"""
txn.execute(
sql,
receipt_types_args + [user_id] + room_id_args,
)
for room_id, _ in txn:
room_to_count[room_id] += 1
return room_to_count
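
The heart of step 1 is deciding, per summary row, whether the user's latest receipt supersedes it. A toy classification mirroring the branches above: rows whose last_receipt_stream_ordering is set but doesn't match the latest receipt mark the room stale, and stale rooms are dropped and recounted from event_push_actions in step 3.

from typing import Dict, Optional, Set, Tuple

Row = Tuple[str, int, int, Optional[int], Optional[int]]
# (room_id, notif_count, stream_ordering, last_receipt_so, receipt_so)

def classify(rows) -> Tuple[Dict[str, int], Set[str]]:
    counts: Dict[str, int] = {}
    stale: Set[str] = set()
    for room_id, notif, stream, last_receipt, receipt in rows:
        if last_receipt is None:
            if receipt is None or stream > receipt:
                counts[room_id] = counts.get(room_id, 0) + notif
            # else: the receipt postdates the summary rows; the count
            # is recomputed from event_push_actions instead.
        elif last_receipt == receipt:
            counts[room_id] = counts.get(room_id, 0) + notif
        else:
            stale.add(room_id)
    for room_id in stale:
        counts.pop(room_id, None)
    return counts, stale

rows: list = [("!a:hs", 2, 10, None, None), ("!b:hs", 1, 5, 4, 9)]
print(classify(rows))  # ({'!a:hs': 2}, {'!b:hs'})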


@@ -24,13 +24,17 @@ from typing import (
Any,
Collection,
Dict,
FrozenSet,
Iterable,
List,
Mapping,
Optional,
Set,
Tuple,
TypeVar,
Union,
cast,
overload,
)
import attr
@@ -52,7 +56,7 @@ from synapse.storage.database import (
)
from synapse.storage.databases.main.events_worker import EventsWorkerStore
from synapse.storage.databases.main.roommember import RoomMemberWorkerStore
from synapse.types import JsonDict, JsonMapping, StateMap
from synapse.types import JsonDict, JsonMapping, StateKey, StateMap, StrCollection
from synapse.types.state import StateFilter
from synapse.util.caches import intern_string
from synapse.util.caches.descriptors import cached, cachedList
@@ -64,6 +68,8 @@ if TYPE_CHECKING:
logger = logging.getLogger(__name__)
_T = TypeVar("_T")
MAX_STATE_DELTA_HOPS = 100
@@ -318,6 +324,20 @@ class StateGroupWorkerStore(EventsWorkerStore, SQLBaseStore):
"get_partial_current_state_ids", _get_current_state_ids_txn
)
async def check_if_events_in_current_state(
self, event_ids: StrCollection
) -> FrozenSet[str]:
"""Checks and returns which of the given events is part of the current state."""
rows = await self.db_pool.simple_select_many_batch(
table="current_state_events",
column="event_id",
iterable=event_ids,
retcols=("event_id",),
desc="check_if_events_in_current_state",
)
return frozenset(event_id for event_id, in rows)
# FIXME: how should this be cached?
@cancellable
async def get_partial_filtered_current_state_ids(
@@ -349,7 +369,8 @@ class StateGroupWorkerStore(EventsWorkerStore, SQLBaseStore):
def _get_filtered_current_state_ids_txn(
txn: LoggingTransaction,
) -> StateMap[str]:
results = {}
results = StateMapWrapper(state_filter=state_filter or StateFilter.all())
sql = """
SELECT type, state_key, event_id FROM current_state_events
WHERE room_id = ?
@@ -726,3 +747,41 @@ class StateStore(StateGroupWorkerStore, MainStateBackgroundUpdateStore):
hs: "HomeServer",
):
super().__init__(database, db_conn, hs)
@attr.s(auto_attribs=True, slots=True)
class StateMapWrapper(Dict[StateKey, str]):
"""A wrapper around a StateMap[str] to ensure that we only query for items
that were not filtered out.
This is to help prevent bugs where we filter out state but other bits of the
code expect the state to be there.
"""
state_filter: StateFilter
def __getitem__(self, key: StateKey) -> str:
if key not in self.state_filter:
raise Exception("State map was filtered and doesn't include: %s", key)
return super().__getitem__(key)
@overload
def get(self, key: Tuple[str, str]) -> Optional[str]:
...
@overload
def get(self, key: Tuple[str, str], default: Union[str, _T]) -> Union[str, _T]:
...
def get(
self, key: StateKey, default: Union[str, _T, None] = None
) -> Union[str, _T, None]:
if key not in self.state_filter:
raise Exception("State map was filtered and doesn't include: %s", key)
return super().get(key, default)
def __contains__(self, key: Any) -> bool:
if key not in self.state_filter:
raise Exception("State map was filtered and doesn't include: %s", key)
return super().__contains__(key)
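
Illustrative usage of the guard class above, with a toy filter that only admits ("m.room.member", *): reads outside the filter fail loudly instead of silently returning None for state that was never fetched.

class ToyFilter:
    def __contains__(self, key) -> bool:
        typ, _state_key = key
        return typ == "m.room.member"

class GuardedMap(dict):
    def __init__(self, data, state_filter):
        super().__init__(data)
        self.state_filter = state_filter

    def get(self, key, default=None):
        if key not in self.state_filter:
            raise Exception(f"State map was filtered and doesn't include: {key}")
        return super().get(key, default)

m = GuardedMap({("m.room.member", "@a:hs"): "$ev"}, ToyFilter())
print(m.get(("m.room.member", "@a:hs")))    # $ev
# m.get(("m.room.topic", ""))               # would raise: filtered out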


@@ -20,6 +20,7 @@
import logging
from typing import (
TYPE_CHECKING,
Any,
Callable,
Collection,
Dict,
@@ -584,6 +585,29 @@ class StateFilter:
# local users only
return False
def __contains__(self, key: Any) -> bool:
if not isinstance(key, tuple) or len(key) != 2:
raise TypeError(
f"'in StateFilter' requires (str, str) as left operand, not {type(key).__name__}"
)
typ, state_key = key
if not isinstance(typ, str) or not isinstance(state_key, str):
raise TypeError(
f"'in StateFilter' requires (str, str) as left operand, not ({type(typ).__name__}, {type(state_key).__name__})"
)
if typ in self.types:
state_keys = self.types[typ]
if state_keys is None or state_key in state_keys:
return True
elif self.include_others:
return True
return False
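
Restating the new membership test on a stripped-down filter object: a (type, state_key) pair is "in" the filter when its type is tracked with a matching key set (or a None wildcard), falling back to include_others for untracked types.

from typing import Dict, FrozenSet, Optional, Tuple

class MiniStateFilter:
    def __init__(
        self,
        types: Dict[str, Optional[FrozenSet[str]]],
        include_others: bool,
    ) -> None:
        self.types = types
        self.include_others = include_others

    def __contains__(self, key: Tuple[str, str]) -> bool:
        typ, state_key = key
        if typ in self.types:
            keys = self.types[typ]
            return keys is None or state_key in keys
        return self.include_others

f = MiniStateFilter({"m.room.member": frozenset({"@a:hs"})}, include_others=False)
assert ("m.room.member", "@a:hs") in f
assert ("m.room.member", "@b:hs") not in f
assert ("m.room.topic", "") not in f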
_ALL_STATE_FILTER = StateFilter(types=immutabledict(), include_others=True)
_ALL_NON_MEMBER_STATE_FILTER = StateFilter(