Limit size of user directory search queries (#18172)
If a user search has many words, we can end up creating very large queries that take a long time for the database to process. Generally, such searches don't return any results anyway (due to limits on user ID and display name length). We "fix" this by cheating and only searching for the first ten words.
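As a standalone sketch of the approach (not the Synapse code itself), capping the number of words before they become query terms could look like the following; parse_words, MAX_SEARCH_WORDS and limited_words are hypothetical names used for illustration only:

from typing import List

MAX_SEARCH_WORDS = 10  # assumed cap, mirroring the "first ten words" above


def parse_words(search_term: str) -> List[str]:
    # Hypothetical stand-in for the real tokeniser: split on whitespace.
    return [w for w in search_term.split() if w]


def limited_words(search_term: str) -> List[str]:
    # Keep only the first MAX_SEARCH_WORDS words so that a pasted
    # paragraph doesn't turn into a huge full-text query.
    words = []
    for index, word in enumerate(parse_words(search_term)):
        if index >= MAX_SEARCH_WORDS:
            break
        words.append(word)
    return words


print(limited_words("alice bob " * 20))  # only the first 10 words survive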
changelog.d/18172.misc (new file, +1 line)
@@ -0,0 +1 @@
+Reduce database load of user search when using large search terms.
@@ -1237,7 +1237,13 @@ def _parse_query_postgres(search_term: str) -> Tuple[str, str, str]:
     search_term = _filter_text_for_index(search_term)
 
     escaped_words = []
-    for word in _parse_words(search_term):
+    for index, word in enumerate(_parse_words(search_term)):
+        if index >= 10:
+            # We limit how many terms we include, as otherwise it can use
+            # excessive database time if people accidentally search for large
+            # strings.
+            break
+
         # Postgres tsvector and tsquery quoting rules:
         # words potentially containing punctuation should be quoted
         # and then existing quotes and backslashes should be doubled
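For illustration only, here is a rough sketch of the quoting rule those comments describe; escape_tsquery_word and build_tsquery are hypothetical helpers, not Synapse's actual functions. Each word has its backslashes and double quotes doubled, is wrapped in double quotes so punctuation is treated literally, and the escaped terms are joined into a tsquery string.

from typing import List


def escape_tsquery_word(word: str) -> str:
    # Double any backslashes and double quotes, then wrap the word in
    # quotes so punctuation inside it is treated literally by tsquery.
    quoted = word.replace("\\", "\\\\").replace('"', '""')
    return f'"{quoted}"'


def build_tsquery(words: List[str]) -> str:
    # Join the escaped terms with AND; this simplifies the prefix/exact
    # query variants the real code builds from the same escaped words.
    return " & ".join(escape_tsquery_word(w) for w in words)


print(build_tsquery(['o"brien', "back\\slash"]))
# prints: "o""brien" & "back\\slash"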