[Bug]: [Nightly] Query failed reporting delegator closed during wait tsafe #39379

Open
NicoYuan1986 opened this issue Jan 17, 2025 · 2 comments
Labels: kind/bug (Issues or changes related to a bug), triage/accepted (Indicates an issue or PR is ready to be actively worked on)
Milestone: 2.5.5

@NicoYuan1986
Contributor

Is there an existing issue for this?

  • I have searched the existing issues

Environment

- Milvus version: e752059
- Deployment mode (standalone or cluster): standalone
- MQ type (rocksmq, pulsar or kafka): rocksmq
- SDK version (e.g. pymilvus v2.0.0rc2):
- OS (Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:

Current Behavior

The query failed, reporting "delegator closed during wait tsafe".

2025-01-16T22:35:07Z {container="step-test"} [2025-01-16 21:00:55 - DEBUG - ci_test]: (api_request)  : [Collection.query] args: ["json_contains_all(json_field['listStr'], ['1846', '1847', '1848'])", None, None, 180], kwargs: {} (api_request.py:52)
2025-01-16T22:35:07Z {container="step-test"} [2025-01-16 21:01:03 - ERROR - ci_test]: Traceback (most recent call last):
2025-01-16T22:35:07Z {container="step-test"} File "/milvus/tests/python_client/utils/api_request.py", line 22, in inner_wrapper
2025-01-16T22:35:07Z {container="step-test"} res = func(*args, **_kwargs)
2025-01-16T22:35:07Z {container="step-test"} File "/milvus/tests/python_client/utils/api_request.py", line 53, in api_request
2025-01-16T22:35:07Z {container="step-test"} return func(*arg, **kwargs)
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/orm/collection.py", line 1074, in query
2025-01-16T22:35:07Z {container="step-test"} return conn.query(
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/decorators.py", line 141, in handler
2025-01-16T22:35:07Z {container="step-test"} raise e from e
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/decorators.py", line 137, in handler
2025-01-16T22:35:07Z {container="step-test"} return func(*args, **kwargs)
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/decorators.py", line 176, in handler
2025-01-16T22:35:07Z {container="step-test"} return func(self, *args, **kwargs)
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/decorators.py", line 116, in handler
2025-01-16T22:35:07Z {container="step-test"} raise e from e
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/decorators.py", line 86, in handler
2025-01-16T22:35:07Z {container="step-test"} return func(*args, **kwargs)
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/client/grpc_handler.py", line 1604, in query
2025-01-16T22:35:07Z {container="step-test"} check_status(response.status)
2025-01-16T22:35:07Z {container="step-test"} File "/usr/local/lib/python3.10/site-packages/pymilvus/client/utils.py", line 63, in check_status
2025-01-16T22:35:07Z {container="step-test"} raise MilvusException(status.code, status.reason, status.error_code)
2025-01-16T22:35:07Z {container="step-test"} pymilvus.exceptions.MilvusException: <MilvusException: (code=503, message=fail to Query on QueryNode 4: delegator closed during wait tsafe: channel not available[channel=by-dev-rootcoord-dml_4_455358390307386397v0])>
2025-01-16T22:35:07Z {container="step-test"} (api_request.py:35)
2025-01-16T22:35:07Z {container="step-test"} [2025-01-16 21:01:03 - ERROR - ci_test]: (api_response) : <MilvusException: (code=503, message=fail to Query on QueryNode 4: delegator closed during wait tsafe: channel not available[channel=by-dev-rootcoord-dml_4_455358390307386397v0])> (api_request.py:36)
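
For context, the failing call in the log above corresponds roughly to the pymilvus sketch below. The connection details, collection name, and schema are placeholders (the actual test lives under tests/python_client); only the filter expression and the 180 s timeout are taken from the log.

```python
# Hypothetical reproduction sketch of the failing call above.
# Connection details and collection name are placeholders; only the
# filter expression and the 180 s timeout come from the CI log.
from pymilvus import Collection, connections

connections.connect(host="localhost", port="19530")

collection = Collection("nightly_json_test")  # placeholder collection with a JSON field "json_field"
collection.load()

# Collection.query args seen in the traceback:
# expr, output_fields=None, partition_names=None, timeout=180
res = collection.query(
    expr="json_contains_all(json_field['listStr'], ['1846', '1847', '1848'])",
    timeout=180,
)
print(res)
```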

Expected Behavior

The query should succeed.

Steps To Reproduce

Milvus Log

  1. link: https://jenkins.milvus.io:18080/blue/organizations/jenkins/Milvus%20Nightly%20CI(new)/detail/2.5/50/pipeline/122/

Anything else?

No response

@NicoYuan1986 added the kind/bug and needs-triage labels on Jan 17, 2025
@NicoYuan1986 added this to the 2.5.5 milestone on Jan 17, 2025
@NicoYuan1986 added the triage/accepted label and removed the needs-triage label on Jan 17, 2025
@congqixia
Contributor

[2025/01/16 21:01:01.981 +00:00] [INFO] [balance/utils.go:138] ["Balance-Plans: new plans:{collectionID:455358390307386397, replicaID:455358391099065341, ChannelPlan:[collectionID: 455358390307386397, channel: by-dev-rootcoord-dml_4_455358390307386397v0, replicaID: 455358391099065341, from: 4, to: 6]\n}"]

[2025/01/16 21:01:02.616 +00:00] [WARN] [proxy/task_query.go:621] ["QueryNode query result error"] [traceID=a02daba66305793483728a1aec47824c] [collection=455358390307386397] [partitionIDs="[]"] [nodeID=4] [channel=by-dev-rootcoord-dml_4_455358390307386397v0] [errorCode=NoReplicaAvailable] [reason="delegator closed during wait tsafe: channel not available[channel=by-dev-rootcoord-dml_4_455358390307386397v0]"]
The channel was balanced and then the query failed. Could you please help investigate?
/assign @weiliu1031

@congqixia congqixia removed their assignment Jan 17, 2025
@weiliu1031
Contributor

This issue is a corner case that can occur, with some probability, in scenarios with high-frequency load/release collection operations. The triggering logic is as follows:

1. The delegator processes streaming data slowly, causing tSafe delays. As a result, queries may be blocked in the delegator while waiting for tSafe to advance.
2. During channel balancing, if a query is still waiting for tSafe on the old delegator, the release-channel operation does not wait for these queries to complete. Instead, it fails them, prompting the proxy to retry on the new delegator. The current retry mechanism allows up to two retries by default.
3. With frequent load/release collection operations, a channel may be balanced repeatedly, which can cause a query to fail on the released channel several times in a row. Once the number of failures exceeds the retry limit, the query ultimately fails.

Since this issue requires specific conditions to occur and is not common, it is not currently treated as a blocking issue. However, we plan to optimize the proxy's retry strategy to better handle similar corner cases in the future.
/cc @congqixia @NicoYuan1986
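
To make the interaction described above concrete, here is a small illustrative model (plain Python, not Milvus proxy code) of how a fixed retry budget can be exhausted when channel balancing repeatedly closes the delegator a query is waiting on. All names and numbers are assumptions except the default of two retries mentioned in the comment.

```python
# Illustrative model only -- not actual Milvus proxy code.
# It shows how a fixed retry budget can be exhausted when channel
# balancing repeatedly closes the delegator a query is waiting on.
MAX_RETRIES = 2  # default proxy retry limit mentioned in the comment above


def query_once(pending_balances: int) -> bool:
    """Return True if the delegator answers; False if it is closed by a
    balance/release while the query is still waiting for tSafe."""
    return pending_balances == 0


def query_with_retries(pending_balances: int) -> str:
    attempts = 0
    while True:
        if query_once(pending_balances):
            return "success"
        attempts += 1
        pending_balances -= 1  # the proxy retries on the newly assigned delegator
        if attempts > MAX_RETRIES:
            return ("fail to Query: delegator closed during wait tsafe: "
                    "channel not available")


# One or two balances during the query are absorbed by the retries.
print(query_with_retries(pending_balances=2))  # success
# Frequent load/release triggers a third balance: retries are exhausted.
print(query_with_retries(pending_balances=3))  # error surfaces to the client
```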
