
Investigate worker (Celery?) issues #2697

Open
lbarcziova opened this issue Jan 15, 2025 · 17 comments
Labels
  • area/general: Related to whole service, not a specific part/integration.
  • complexity/single-task: Regular task, should be done within days.
  • gain/high: This brings a lot of value to (not strictly a lot of) users.
  • impact/high: This issue impacts multiple/lot of users.
  • kind/bug: Something isn't working.

Comments

@lbarcziova
Member

lbarcziova commented Jan 15, 2025

After last week's redeployment (6th January), we have started hitting issues with job processing, causing tasks not to be processed for some time and delays:

  • we could see restarts of the worker pods caused by hitting CPU limits; we tried to mitigate this by increasing the CPU limits (Increase cpu limit to handle spikes better deployment#631). The limit was also increased for postgres (Increase cpu limits for postgres deployment#636), where metrics showed it also going above the limit
  • sometimes, tasks are not processed in workers at all, without any task blocking them
  • we could see in logs messages like:
    • Substantial drift from celery@packit-worker-long-running-0 may mean clocks are out of sync. Current drift is 1799 seconds. [orig: 2025-01-14 14:51:59.656603 recv: 2025-01-14 14:22:00.484181]
    • consumer: Connection to broker lost. Trying to re-establish the connection... followed by a restart
@lbarcziova added the kind/bug, complexity/single-task, impact/high, area/general, and gain/high labels Jan 15, 2025
@nforro
Member

nforro commented Jan 15, 2025

AFAICT it only affects short-running workers.

  • Substantial drift from celery@packit-worker-long-running-0 may mean clocks are out of sync. Current drift is 1799 seconds. [orig: 2025-01-14 14:51:59.656603 recv: 2025-01-14 14:22:00.484181]

This matches the period of time when no tasks were being processed on that particular worker. There was no such message on the other worker yesterday, but the "idle time" was very similar, and 1799 seconds is almost exactly 30 minutes. It could be just a coincidence, but it seems a bit suspicious to me (though I suppose it could be an interval at which Celery checks for non-responsive workers).
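
As a sanity check, the 1799-second figure can be recomputed directly from the two timestamps quoted in the drift warning (a quick standalone snippet, not Packit code):

```python
from datetime import datetime

# Timestamps copied from the drift warning quoted above.
orig = datetime.fromisoformat("2025-01-14 14:51:59.656603")
recv = datetime.fromisoformat("2025-01-14 14:22:00.484181")

# The reported drift is the gap between the sender's clock ("orig")
# and the receiver's clock ("recv"), truncated to whole seconds.
drift = int((orig - recv).total_seconds())
print(drift)  # 1799
```

So the message is internally consistent; the open question is whether the ~30-minute gap comes from real clock skew or from the event being delivered 30 minutes late.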

@lbarcziova lbarcziova moved this from new to refined in Packit Kanban Board Jan 15, 2025
@nforro
Member

nforro commented Jan 15, 2025

After the previous week redeployment (6th January)

I checked the images and there were no updates of python3-celery, python3-gevent, or python3-eventlet (is there anything else that could affect this?).

@majamassarini
Member

majamassarini commented Jan 20, 2025

I was releasing Packit and I had to tag both ogr and specfile into side tags.
I tagged them using the staging instance.

Ogr was quick:

1/20/25
11:01:03.432 AM	
{ [-]
   kubernetes: { [-]
     container_id: cri-o://3e9d68e5cabf75270f05afb43d9f7bff523c45690fbb083b9646ab0119b56132
     container_image: quay.io/packit/packit-worker@sha256:efdd08d40650696a907bbfa0ed2350c238d3a3b5e7d935406b32c24dc720698e
     container_image_id: quay.io/packit/packit-worker@sha256:efdd08d40650696a907bbfa0ed2350c238d3a3b5e7d935406b32c24dc720698e
     container_name: packit-worker
     labels: { [-]
       component: packit-worker-short-running
       controller-revision-hash: packit-worker-short-running-6d4b46669
       paas_redhat_com_appcode: PCKT-002
       statefulset_kubernetes_io_pod-name: packit-worker-short-running-0
     }
     namespace_name: packit--stg
     pod_id: 8e80213d-6467-4004-932b-195c23c7bd29
     pod_ip: 172.20.212.235
     pod_name: packit-worker-short-running-0
     pod_owner: StatefulSet/packit-worker-short-running
   }
   level: info
   log_type: application
   message: [2025-01-20 11:01:03,432: INFO/MainProcess] Task task.steve_jobs.process_message[3f191fbc-b618-40e1-99b1-b93955f38d1e] received
   openshift: { [-]
     cluster_id: d760ecc9-5098-480a-a88f-8913d8ab0ee7
     labels: { [+]
     }
     sequence: 1737370863921404400
   }
}
1/20/25
11:01:03.430 AM	
{ [-]
   kubernetes: { [-]
     container_id: cri-o://ef7f4ffe934db49e3176a1771af0b836b3904bd38f7406e23068db17f9609ed6
     container_image: quay.io/packit/packit-service-fedmsg@sha256:f349e711067c3d24e6dcfbbce783a4ea91af1971ef4e3d076ba36506459f31f7
     container_image_id: quay.io/packit/packit-service-fedmsg@sha256:f349e711067c3d24e6dcfbbce783a4ea91af1971ef4e3d076ba36506459f31f7
     container_name: packit-service-fedmsg
     labels: { [-]
       component: packit-service-fedmsg
       paas_redhat_com_appcode: PCKT-002
     }
     namespace_name: packit--stg
     pod_id: fe77a366-0323-4a96-a125-b1c6c3d1be66
     pod_ip: 172.20.20.37
     pod_name: packit-service-fedmsg-748f798f9c-j6svq
     pod_owner: ReplicaSet/packit-service-fedmsg-748f798f9c
   }
   level: debug
   log_type: application
   message: [2025-01-20 11:01:03,430 DEBUG packit_service_fedmsg.consumer] Task UUID=3f191fbc-b618-40e1-99b1-b93955f38d1e sent to Celery
   openshift: { [-]
     cluster_id: d760ecc9-5098-480a-a88f-8913d8ab0ee7
     labels: { [+]
     }
     sequence: 1737370863432772400
   }
}

Specfile took almost 30 minutes to react:

1/20/25
11:31:23.439 AM	
{ [-]
   kubernetes: { [-]
     container_id: cri-o://3e9d68e5cabf75270f05afb43d9f7bff523c45690fbb083b9646ab0119b56132
     container_image: quay.io/packit/packit-worker@sha256:efdd08d40650696a907bbfa0ed2350c238d3a3b5e7d935406b32c24dc720698e
     container_image_id: quay.io/packit/packit-worker@sha256:efdd08d40650696a907bbfa0ed2350c238d3a3b5e7d935406b32c24dc720698e
     container_name: packit-worker
     labels: { [-]
       component: packit-worker-short-running
       controller-revision-hash: packit-worker-short-running-6d4b46669
       paas_redhat_com_appcode: PCKT-002
       statefulset_kubernetes_io_pod-name: packit-worker-short-running-0
     }
     namespace_name: packit--stg
     pod_id: 8e80213d-6467-4004-932b-195c23c7bd29
     pod_ip: 172.20.212.235
     pod_name: packit-worker-short-running-0
     pod_owner: StatefulSet/packit-worker-short-running
   }
   level: info
   log_type: application
   message: [2025-01-20 11:31:23,439: INFO/MainProcess] Task task.steve_jobs.process_message[faf4f880-1437-470f-9121-9516c1162e43] received
   openshift: { [-]
     cluster_id: d760ecc9-5098-480a-a88f-8913d8ab0ee7
     labels: { [-]
       cluster_id: preprod-spoke-aws-us-east-1
       paas_cluster_appcode: itos-008
       paas_cluster_cloud_provider: aws
       paas_cluster_flavor: mpp
       paas_cluster_id: preprod-spoke-aws-us-east-1
       paas_cluster_name: spoke
       paas_cluster_region: us-east-1
       paas_cluster_service_phase: preprod
     }
     sequence: 1737372683842993200
   }
}
1/20/25
11:05:13.704 AM	
{ [-]
   kubernetes: { [-]
     container_id: cri-o://ef7f4ffe934db49e3176a1771af0b836b3904bd38f7406e23068db17f9609ed6
     container_image: quay.io/packit/packit-service-fedmsg@sha256:f349e711067c3d24e6dcfbbce783a4ea91af1971ef4e3d076ba36506459f31f7
     container_image_id: quay.io/packit/packit-service-fedmsg@sha256:f349e711067c3d24e6dcfbbce783a4ea91af1971ef4e3d076ba36506459f31f7
     container_name: packit-service-fedmsg
     labels: { [-]
       component: packit-service-fedmsg
       paas_redhat_com_appcode: PCKT-002
     }
     namespace_name: packit--stg
     pod_id: fe77a366-0323-4a96-a125-b1c6c3d1be66
     pod_ip: 172.20.20.37
     pod_name: packit-service-fedmsg-748f798f9c-j6svq
     pod_owner: ReplicaSet/packit-service-fedmsg-748f798f9c
   }
   level: debug
   log_type: application
   message: [2025-01-20 11:05:13,704 DEBUG packit_service_fedmsg.consumer] Task UUID=faf4f880-1437-470f-9121-9516c1162e43 sent to Celery
   openshift: { [-]
     cluster_id: d760ecc9-5098-480a-a88f-8913d8ab0ee7
     labels: { [-]
       cluster_id: preprod-spoke-aws-us-east-1
       paas_cluster_appcode: itos-008
       paas_cluster_cloud_provider: aws
       paas_cluster_flavor: mpp
       paas_cluster_id: preprod-spoke-aws-us-east-1
       paas_cluster_name: spoke
       paas_cluster_region: us-east-1
       paas_cluster_service_phase: preprod
     }
     sequence: 1737371114150410800
   }
}

@nforro
Member

nforro commented Jan 20, 2025

So we are hitting this on stage as well?

@majamassarini
Member

So we are hitting this on stage as well?

yes, exactly!

@majamassarini
Member

majamassarini commented Jan 20, 2025

These are the most suspicious messages I found in the logs for the short-running worker that didn't process any tasks for 30 minutes.

{ [-]
   kubernetes: { [+]
   }
   level: info
   log_type: application
   message: [2025-01-20 11:01:10,750: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-0
   openshift: { [+]
   }
}
"[2025-01-20 11:31:08,202: CRITICAL/MainProcess] Couldn't ack '47024a1a-4c14-41b4-912c-1fd82ea41028', reason:ConnectionError('Connection closed by server.')"
"  File ""/usr/lib/python3.13/site-packages/kombu/message.py"", line 126, in ack"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/virtual/base.py"", line 670, in basic_ack"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/redis.py"", line 380, in ack"
"  File ""/usr/lib/python3.13/site-packages/sentry_sdk/integrations/redis/_sync_common.py"", line 54, in sentry_patched_execute"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1530, in execute"
"  File ""/usr/lib/python3.13/site-packages/redis/retry.py"", line 65, in call_with_retry"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1532, in <lambda>"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1508, in _disconnect_raise_reset"
"  File ""/usr/lib/python3.13/site-packages/redis/retry.py"", line 62, in call_with_retry"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1531, in <lambda>"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1375, in _execute_transaction"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 1462, in parse_response"
"  File ""/usr/lib/python3.13/site-packages/redis/client.py"", line 584, in parse_response"
"  File ""/usr/lib/python3.13/site-packages/redis/connection.py"", line 592, in read_response"
"  File ""/usr/lib/python3.13/site-packages/redis/_parsers/resp2.py"", line 15, in read_response"
"  File ""/usr/lib/python3.13/site-packages/redis/_parsers/resp2.py"", line 25, in _read_response"
"  File ""/usr/lib/python3.13/site-packages/redis/_parsers/socket.py"", line 115, in readline"
"  File ""/usr/lib/python3.13/site-packages/redis/_parsers/socket.py"", line 68, in _read_from_socket"
"redis.exceptions.ConnectionError: Connection closed by server."
"[2025-01-20 11:31:08,258: WARNING/MainProcess] consumer: Connection to broker lost. Trying to re-establish the connection..."
"  File ""/usr/lib/python3.13/site-packages/celery/bootsteps.py"", line 116, in start"
"  File ""/usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py"", line 742, in start"
"  File ""/usr/lib/python3.13/site-packages/celery/worker/loops.py"", line 130, in synloop"
"  File ""/usr/lib/python3.13/site-packages/kombu/connection.py"", line 341, in drain_events"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/virtual/base.py"", line 997, in drain_events"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/redis.py"", line 584, in get"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/redis.py"", line 525, in _register_BRPOP"
"  File ""/usr/lib/python3.13/site-packages/kombu/transport/redis.py"", line 957, in _brpop_start"
"  File ""/usr/lib/python3.13/site-packages/redis/connection.py"", line 556, in send_command"
"  File ""/usr/lib/python3.13/site-packages/redis/connection.py"", line 529, in send_packed_command"
"  File ""/usr/lib/python3.13/site-packages/redis/connection.py"", line 521, in check_health"
"  File ""/usr/lib/python3.13/site-packages/redis/retry.py"", line 67, in call_with_retry"
"  File ""/usr/lib/python3.13/site-packages/redis/retry.py"", line 62, in call_with_retry"
"  File ""/usr/lib/python3.13/site-packages/redis/connection.py"", line 512, in _send_ping"
"redis.exceptions.ConnectionError: Bad response from PING health check"
"[2025-01-20 11:31:08,264: WARNING/MainProcess] Restoring 1 unacknowledged message(s)"
"[2025-01-20 11:31:08,280: INFO/MainProcess] Temporarily reducing the prefetch count to 1 to avoid over-fetching since 0 tasks are currently being processed."
"The prefetch count will be gradually restored to 0 as the tasks complete processing."
"[2025-01-20 11:31:08,280: DEBUG/MainProcess] | Consumer: Restarting event loop..."
"[2025-01-20 11:31:08,280: DEBUG/MainProcess] | Consumer: Restarting Heart..."
"[2025-01-20 11:31:08,282: DEBUG/MainProcess] | Consumer: Restarting Control..."
"[2025-01-20 11:31:08,282: DEBUG/MainProcess] Waiting for broadcast thread to shutdown..."
"[2025-01-20 11:31:08,293: DEBUG/MainProcess] | Consumer: Restarting Tasks..."
"[2025-01-20 11:31:08,293: DEBUG/MainProcess] Canceling task consumer..."
"[2025-01-20 11:31:08,293: DEBUG/MainProcess] | Consumer: Restarting Gossip..."
"[2025-01-20 11:31:08,293: DEBUG/MainProcess] | Consumer: Restarting Mingle..."
"[2025-01-20 11:31:08,294: DEBUG/MainProcess] | Consumer: Restarting Events..."
"[2025-01-20 11:31:08,294: DEBUG/MainProcess] | Consumer: Restarting Connection..."
"[2025-01-20 11:31:08,294: DEBUG/MainProcess] | Consumer: Starting Connection"
"[2025-01-20 11:31:08,307: INFO/MainProcess] Connected to redis://valkey:6379/0"
"[2025-01-20 11:31:08,307: DEBUG/MainProcess] ^-- substep ok"
"[2025-01-20 11:31:08,307: DEBUG/MainProcess] | Consumer: Starting Events"
"[2025-01-20 11:31:08,310: DEBUG/MainProcess] ^-- substep ok"
"[2025-01-20 11:31:08,310: DEBUG/MainProcess] | Consumer: Starting Mingle"
"[2025-01-20 11:31:08,310: INFO/MainProcess] mingle: searching for neighbors"
"[2025-01-20 11:32:02,715: INFO/MainProcess] task.steve_jobs.process_message[8c91f702-86d2-4496-bad4-964d18c4897a] Pagure PR comment event, topic: org.fedoraproject.prod.pagure.pull-request.comment.added"
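
One place worth experimenting with for the "Connection closed by server" / failed PING errors above is the Redis transport options that kombu exposes via Celery's `broker_transport_options`. A hypothetical sketch (the option names are real kombu Redis transport options, but the values are untested guesses, not our current deployment config):

```python
# Hypothetical mitigation sketch for dropped broker connections;
# would be passed to Celery as app.conf.broker_transport_options.
broker_transport_options = {
    # Ping idle connections this often (seconds), so a dead socket is
    # noticed quickly instead of stalling a worker for ~30 minutes.
    "health_check_interval": 25,
    # TCP keepalive, so long-idle BRPOP connections are not silently
    # dropped by intermediate network gear.
    "socket_keepalive": True,
    # Fail fast on a dead socket instead of blocking indefinitely.
    "socket_timeout": 120,
    "socket_connect_timeout": 30,
    # How long an unacked task stays invisible before redelivery.
    "visibility_timeout": 3600,
}
```

This would not fix the underlying connection drops, but it might shorten the window in which a worker sits idle on a dead connection.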

And this is the strange log for the long-running worker not responding to the heartbeat check.

"[2025-01-20 10:59:02,478: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Removing contents of the PV."
"[2025-01-20 10:59:02,478: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Running handler <packit_service.worker.handlers.distgit.SyncFromDownstream object at 0x7ff0bb29b7a0> for sync_from_downstream"
"[2025-01-20 10:59:02,479: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Instantiation of the repository cache at /repository-cache. New projects will not be added."
"[2025-01-20 10:59:02,479: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Attributes requested: repo_name, namespace, git_repo, full_name"
"[2025-01-20 10:59:02,479: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Attributes requested not to be calculated: "
"[2025-01-20 10:59:02,479: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Transitive dependencies: repo_name => set(), namespace => set(), git_repo => set(), full_name => set(), git_project => set(), git_url => set(), working_dir => set(), git_service => set()"
"[2025-01-20 10:59:02,480: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] To-calculate set: {'full_name', 'git_url', 'git_repo', 'git_service', 'repo_name', 'namespace'}"
"[2025-01-20 10:59:02,698: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Parsed remote url 'https://src.fedoraproject.org/rpms/packit.git' from the project PagureProject(namespace=""rpms"", repo=""packit"")."
"[2025-01-20 10:59:02,699: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] `working_dir` is set and `git_repo` is not: let's discover..."
"[2025-01-20 10:59:02,699: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Cloning repo https://src.fedoraproject.org/rpms/packit.git -> /tmp/sandcastle/local-project using repository cache at /repository-cache"
"[2025-01-20 10:59:02,699: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Repositories in the cache (0 project(s)):"
""
"[2025-01-20 10:59:02,700: INFO/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] git clone -v --tags -- https://src.fedoraproject.org/rpms/packit.git /tmp/sandcastle/local-project"
"[2025-01-20 10:59:02,700: DEBUG/ForkPoolWorker-1] TaskName.sync_from_downstream[39a489f0-701e-4fcf-92b3-9af344b7be5a] Popen(['git', 'clone', '-v', '--tags', '--', 'https://src.fedoraproject.org/rpms/packit.git', '/tmp/sandcastle/local-project'], cwd=/src, stdin=None, shell=False, universal_newlines=True)"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453006: ccselect module realm chose cache DIR::/home/packit/kerberos/tkt with client principal [email protected] for server principal HTTP/[email protected]"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453007: Getting credentials [email protected] -> HTTP/koji.fedoraproject.org@ using ccache DIR::/home/packit/kerberos/tkt"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453008: Retrieving [email protected] -> krb5_ccache_conf_data/start_realm@X-CACHECONF: from DIR::/home/packit/kerberos/tkt with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453009: Retrieving [email protected] -> HTTP/koji.fedoraproject.org@ from DIR::/home/packit/kerberos/tkt with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453010: Retrying [email protected] -> HTTP/[email protected] with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453011: Retrieving [email protected] -> krbtgt/[email protected] from DIR::/home/packit/kerberos/tkt with result: 0/Success"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453012: Starting with TGT for client realm: [email protected] -> krbtgt/[email protected]"
"[2025-01-20 11:32:42,453: INFO/ForkPoolWorker-1] [10814] 1737372762.453013: Requesting tickets for HTTP/[email protected], referrals on"
"[2025-01-20 11:32:42,454: INFO/ForkPoolWorker-1] [10814] 1737372762.453014: Generated subkey for TGS request: aes256-cts/E7EA"
"[2025-01-20 11:32:42,454: INFO/ForkPoolWorker-1] [10814] 1737372762.453015: etypes requested in TGS request: aes256-sha2, aes128-sha2, aes256-cts, aes128-cts, camellia256-cts, camellia128-cts"
"[2025-01-20 11:32:42,454: INFO/ForkPoolWorker-1] [10814] 1737372762.453017: Encoding request body and padata into FAST request"
"[2025-01-20 11:32:42,454: INFO/ForkPoolWorker-1] [10814] 1737372762.453018: Sending request (1848 bytes) to FEDORAPROJECT.ORG"
"[2025-01-20 11:32:42,454: INFO/ForkPoolWorker-1] [10814] 1737372762.453019: Resolving hostname id.fedoraproject.org"
"[2025-01-20 11:32:42,476: INFO/ForkPoolWorker-1] [10814] 1737372762.453020: TLS certificate name matched ""id.fedoraproject.org"""
"[2025-01-20 11:32:42,476: INFO/ForkPoolWorker-1] [10814] 1737372762.453021: Sending HTTPS request to https 38.145.60.21:443"
"[2025-01-20 11:32:42,653: INFO/ForkPoolWorker-1] [10814] 1737372762.453022: Received answer (1891 bytes) from https 38.145.60.21:443"
"[2025-01-20 11:32:42,653: INFO/ForkPoolWorker-1] [10814] 1737372762.453023: Terminating TCP connection to https 38.145.60.21:443"
"[2025-01-20 11:32:42,653: INFO/ForkPoolWorker-1] [10814] 1737372762.453024: Sending DNS URI query for _kerberos.FEDORAPROJECT.ORG."
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453025: URI answer: 10 1 ""krb5srv:m:kkdcp:https://id.fedoraproject.org/KdcProxy/"""
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453026: Response was from primary KDC"
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453027: Decoding FAST response"
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453028: FAST reply key: aes256-cts/61AD"
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453029: TGS reply is for [email protected] -> HTTP/[email protected] with session key aes256-cts/5E5B"
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453030: TGS request result: 0/Success"
"[2025-01-20 11:32:42,654: INFO/ForkPoolWorker-1] [10814] 1737372762.453031: Received creds for desired service HTTP/[email protected]"
"[2025-01-20 11:32:42,655: INFO/ForkPoolWorker-1] [10814] 1737372762.453032: Storing [email protected] -> HTTP/koji.fedoraproject.org@ in DIR::/home/packit/kerberos/tkt"
"[2025-01-20 11:32:42,655: INFO/ForkPoolWorker-1] [10814] 1737372762.453033: Creating authenticator for [email protected] -> HTTP/koji.fedoraproject.org@, seqnum 61862160, subkey aes256-cts/4F03, session key aes256-cts/5E5B"
"[2025-01-20 11:32:43,265: INFO/ForkPoolWorker-1] Using packit.spec"
"[2025-01-20 11:32:43,295: INFO/ForkPoolWorker-1] Building packit-1.0.0-1.fc42 for f42-build-side-103953"
"[2025-01-20 11:32:43,307: INFO/ForkPoolWorker-1] Created task: 128206838"
"[2025-01-20 11:32:43,307: INFO/ForkPoolWorker-1] Task info: https://koji.fedoraproject.org/koji/taskinfo?taskID=128206838"
"[2025-01-20 11:32:43,450: INFO/ForkPoolWorker-1] TaskName.downstream_koji_build[03695bb7-019d-4a18-a65d-cc23f6eb6b39] Cleaning up the mess."

Shouldn't sync_from_downstream be disabled?

@nforro
Member

nforro commented Jan 20, 2025

Shouldn't sync_from_downstream be disabled?

I thought we got rid of it in the code, but apparently not. It's still enabled in packit and ogr config though.

@lbarcziova lbarcziova self-assigned this Jan 20, 2025
@lbarcziova lbarcziova moved this from refined to in-progress in Packit Kanban Board Jan 20, 2025
@lbarcziova
Member Author

lbarcziova commented Jan 22, 2025

I checked the images and there were no updates of python3-celery, python3-gevent nor eventlet (is there anything else that could affect this?).

I double-checked the other packages that could potentially affect this as well (resulting in celery, gevent, eventlet, greenlet, billiard, kombu, redis, amqp) and haven't found any differences in versions.
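
For the record, this comparison can be scripted inside each image instead of eyeballed; a small sketch using `importlib.metadata` (the names below are the PyPI ones, not the `python3-*` RPM names):

```python
from importlib.metadata import PackageNotFoundError, version

# PyPI names of the packages checked above.
SUSPECTS = ["celery", "gevent", "eventlet", "greenlet",
            "billiard", "kombu", "redis", "amqp"]

def collect_versions(packages):
    """Return {package: version string or None} for the current environment."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None
    return found

if __name__ == "__main__":
    # Run in both the old and the new worker image and diff the output.
    for pkg, ver in sorted(collect_versions(SUSPECTS).items()):
        print(f"{pkg}: {ver or 'not installed'}")
```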

@lbarcziova
Member Author

After discussion with @majamassarini I added an alert that could help detect this: https://gitlab.cee.redhat.com/osci/monitoring/-/merge_requests/557

@majamassarini
Member

majamassarini commented Feb 6, 2025

While working on #2693 I spotted this for short-running pod 2, which I think is related to this card.

And something similar for short-running pod 0 (see below).

In both cases the alert rule was not triggered (I think because we detect the issue only when it happens for all the workers at the same time; in these cases it happened for just a single worker).
In both cases the memory consumption was fine (about half of their limit).
In both cases everything seems to happen after several missed heartbeats:

  • worker 2
[2025-02-04 17:11:14,977: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-1
[2025-02-04 17:11:14,977: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-0
  • worker 0
[2025-02-05 12:13:38,558: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-0
[2025-02-05 12:13:38,558: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-1
[2025-02-05 12:13:38,559: INFO/MainProcess] missed heartbeat from celery@packit-worker-short-running-2
[2025-02-05 12:13:38,559: INFO/MainProcess] missed heartbeat from celery@packit-worker-short-running-1

Worker 2 was restarted by OpenShift, while on worker 0 only Celery restarted.


short running worker 0 log:

[2025-02-05 12:13:31,115: DEBUG/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Running handler <packit_service.worker.handlers.copr.CoprBuildStartHandler object at 0x7f3c1c562e60> for copr_build
[2025-02-05 12:13:31,278: DEBUG/MainProcess] [233] 1738757611.277979: Received answer (1615 bytes) from https 38.145.60.21:443
[2025-02-05 12:13:31,278: DEBUG/MainProcess] [233] 1738757611.277980: Terminating TCP connection to https 38.145.60.21:443
[2025-02-05 12:13:31,279: DEBUG/MainProcess] [233] 1738757611.277981: Sending DNS URI query for _kerberos.FEDORAPROJECT.ORG.
[2025-02-05 12:13:31,279: DEBUG/MainProcess] [233] 1738757611.277982: URI answer: 10 1 "krb5srv:m:kkdcp:https://id.fedoraproject.org/KdcProxy/"
[2025-02-05 12:13:31,279: DEBUG/MainProcess] [233] 1738757611.277983: Response was from primary KDC
[2025-02-05 12:13:31,279: DEBUG/MainProcess] [233] 1738757611.277984: Processing preauth types: PA-ETYPE-INFO2 (19)
[2025-02-05 12:13:31,279: DEBUG/MainProcess] [233] 1738757611.277985: Selected etype info: etype aes256-cts, salt "Cn90 Y,B :A0D)5/", params ""
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277986: Produced preauth for next request: (empty)
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277987: AS key determined by preauth: aes256-cts/FD25
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277988: Decrypted AS reply; session key is: aes256-cts/F5D3
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277989: FAST negotiation: available
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277990: Resolving unique ccache of type MEMORY
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277991: Initializing MEMORY:ff62KZR with default princ [email protected]
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277992: Storing config in MEMORY:ff62KZR for krbtgt/[email protected]: fast_avail: yes
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277993: Storing [email protected] -> krb5_ccache_conf_data/fast_avail/krbtgt\/FEDORAPROJECT.ORG\@FEDORAPROJECT.ORG@X-CACHECONF: in MEMORY:ff62KZR
[2025-02-05 12:13:31,280: DEBUG/MainProcess] [233] 1738757611.277994: Storing config in MEMORY:ff62KZR for krbtgt/[email protected]: pa_type: 2
[2025-02-05 12:13:31,281: DEBUG/MainProcess] [233] 1738757611.277995: Storing [email protected] -> krb5_ccache_conf_data/pa_type/krbtgt\/FEDORAPROJECT.ORG\@FEDORAPROJECT.ORG@X-CACHECONF: in MEMORY:ff62KZR
[2025-02-05 12:13:31,281: DEBUG/MainProcess] [233] 1738757611.277996: Storing [email protected] -> krbtgt/[email protected] in MEMORY:ff62KZR
[2025-02-05 12:13:31,281: DEBUG/MainProcess] [233] 1738757611.277997: Moving ccache MEMORY:ff62KZR to DIR::/home/packit/kerberos/tkt
[2025-02-05 12:13:31,281: DEBUG/MainProcess] [233] 1738757611.277998: Destroying ccache MEMORY:ff62KZR
[2025-02-05 12:13:31,282: DEBUG/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Command: osh-cli version-diff-build --srpm=/tmp/tmp4ed6egnz/automotive-image-builder-0.4.0-1.dev121+0447732.src.rpm --base-srpm=/tmp/tmp4ed6egnz/automotive-image-builder-0.4.0-1.dev118+8ac39d8.src.rpm --config=fedora-rawhide-x86_64 --nowait --json --comment=Submitted via Packit Service for https://dashboard.packit.dev/jobs/copr/2191106
[2025-02-05 12:13:31,294: ERROR/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Failed to run the handler: ('Connection broken: IncompleteRead(219 bytes read, 502 more expected)', IncompleteRead(219 bytes read, 502 more expected))
[2025-02-05 12:13:31,294: ERROR/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Failed to run the handler: Could not perform the request due to a network error
[2025-02-05 12:13:31,298: INFO/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Cleaning up the mess.
[2025-02-05 12:13:31,298: DEBUG/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Removing contents of the PV.
[2025-02-05 12:13:31,298: INFO/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Cleaning up the mess.
[2025-02-05 12:13:31,298: DEBUG/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Removing contents of the PV.
[2025-02-05 12:13:31,311: WARNING/MainProcess] Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLError(1, '[SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2638)'))': /api/1767823/envelope/
[2025-02-05 12:13:31,313: WARNING/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLError(1, '[SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2638)'))': /repos/rpm-software-management/dnf5/check-runs
[2025-02-05 12:13:31,317: INFO/MainProcess] Task task.run_copr_build_start_handler[cae78083-afb5-4aab-a1b9-2815c5610481] received
[2025-02-05 12:13:31,364: INFO/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Task TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] retry: Retry in 3s: ChunkedEncodingError(ProtocolError('Connection broken: IncompleteRead(219 bytes read, 502 more expected)', IncompleteRead(219 bytes read, 502 more expected)))
[2025-02-05 12:13:31,367: INFO/MainProcess] TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] Task TaskName.copr_build_start[cae78083-afb5-4aab-a1b9-2815c5610481] retry: Retry in 0s: OgrNetworkError('Could not perform the request due to a network error')
[2025-02-05 12:13:31,369: WARNING/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))': /repos/rpm-software-management/dnf5/check-runs
[2025-02-05 12:13:31,578: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,578: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Copr chroots: ['epel-10-ppc64le', 'fedora-41-aarch64', 'openeuler-24.03-x86_64', 'openeuler-24.03-aarch64', 'amazonlinux-2023-x86_64', 'rhel-8-s390x', 'rhel-8-x86_64', 'rhel-9-x86_64', 'openeuler-22.03-aarch64', 'openeuler-22.03-x86_64', 'fedora-40-aarch64', 'fedora-40-i386', 'openeuler-20.03-aarch64', 'openeuler-20.03-x86_64', 'rhel-9-aarch64', 'opensuse-leap-15.5-x86_64', 'mageia-9-x86_64', 'mageia-9-i586', 'fedora-40-ppc64le', 'fedora-40-s390x', 'mageia-9-aarch64', 'fedora-40-x86_64', 'centos-stream-10-x86_64', 'centos-stream-10-aarch64', 'centos-stream-10-s390x', 'mageia-cauldron-x86_64', 'epel-10-x86_64', 'epel-10-aarch64', 'fedora-rawhide-x86_64', 'opensuse-tumbleweed-x86_64', 'opensuse-tumbleweed-i586', 'mageia-cauldron-aarch64', 'centos-stream-8-ppc64le', 'centos-stream+epel-next-8-x86_64', 'rhel-7-x86_64', 'centos-stream-10-ppc64le', 'epel-7-x86_64', 'custom-1-i386', 'custom-1-x86_64', 'mageia-cauldron-i686', 'epel-10-s390x', 'fedora-41-i386', 'fedora-41-ppc64le', 'fedora-41-s390x', 'fedora-41-x86_64', 'fedora-42-aarch64', 'fedora-42-x86_64', 'fedora-42-i386', 'fedora-42-ppc64le', 'fedora-42-s390x', 'fedora-rawhide-i386', 'epel-8-x86_64', 'epel-8-aarch64', 'centos-stream+epel-next-9-aarch64', 'centos-stream+epel-next-9-s390x', 'centos-stream+epel-next-9-x86_64', 'centos-stream+epel-next-9-ppc64le', 'amazonlinux-2023-aarch64', 'rhel-9-s390x', 'opensuse-leap-15.5-aarch64', 'centos-stream-8-aarch64', 'centos-stream-8-x86_64', 'centos-stream+epel-next-8-aarch64', 'centos-stream+epel-next-8-ppc64le', 'fedora-rawhide-aarch64', 'rhel-8-aarch64', 'fedora-eln-ppc64le', 'centos-stream-9-ppc64le', 'fedora-rawhide-ppc64le', 'epel-8-ppc64le', 'epel-9-x86_64', 'epel-9-aarch64', 'epel-8-s390x', 'epel-9-s390x', 'epel-9-ppc64le', 'opensuse-tumbleweed-aarch64', 'fedora-rawhide-s390x', 'fedora-eln-s390x', 'fedora-eln-aarch64', 'fedora-eln-x86_64', 'mageia-8-aarch64', 
'centos-stream-9-x86_64', 'centos-stream-9-aarch64', 'centos-stream-9-s390x', 'opensuse-tumbleweed-ppc64le', 'mageia-8-x86_64', 'mageia-8-i586', 'epel-7-ppc64le'] 
[2025-02-05 12:13:31,578: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,579: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,579: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:31,579: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,579: DEBUG/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Setting Github status check 'in_progress' for check 'testing-farm:fedora-41-x86_64:createrepo_c-tests': RPM build is in progress...
[2025-02-05 12:13:31,691: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,691: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:31,692: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,692: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,692: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:31,692: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,692: DEBUG/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Setting Github status check 'in_progress' for check 'testing-farm:fedora-41-x86_64:createrepo_c-tests': RPM build is in progress...
[2025-02-05 12:13:31,892: DEBUG/MainProcess] [234] 1738757611.892905: ccselect module realm chose cache DIR::/home/packit/kerberos/tkt with client principal [email protected] for server principal HTTP/[email protected]
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892906: Getting credentials [email protected] -> HTTP/openscanhub.fedoraproject.org@ using ccache DIR::/home/packit/kerberos/tkt
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892907: Retrieving [email protected] -> krb5_ccache_conf_data/start_realm@X-CACHECONF: from DIR::/home/packit/kerberos/tkt with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892908: Retrieving [email protected] -> HTTP/openscanhub.fedoraproject.org@ from DIR::/home/packit/kerberos/tkt with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892909: Retrying [email protected] -> HTTP/[email protected] with result: -1765328243/Matching credential not found (filename: /home/packit/kerberos/tkt)
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892910: Retrieving [email protected] -> krbtgt/[email protected] from DIR::/home/packit/kerberos/tkt with result: 0/Success
[2025-02-05 12:13:31,893: DEBUG/MainProcess] [234] 1738757611.892911: Starting with TGT for client realm: [email protected] -> krbtgt/[email protected]
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892912: Requesting tickets for HTTP/[email protected], referrals on
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892913: Generated subkey for TGS request: aes256-cts/666E
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892914: etypes requested in TGS request: aes256-sha2, aes128-sha2, aes256-cts, aes128-cts, camellia256-cts, camellia128-cts
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892916: Encoding request body and padata into FAST request
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892917: Sending request (1834 bytes) to FEDORAPROJECT.ORG
[2025-02-05 12:13:31,894: DEBUG/MainProcess] [234] 1738757611.892918: Resolving hostname id.fedoraproject.org
[2025-02-05 12:13:31,917: DEBUG/MainProcess] [234] 1738757611.892919: TLS certificate name matched "id.fedoraproject.org"
[2025-02-05 12:13:31,918: DEBUG/MainProcess] [234] 1738757611.892920: Sending HTTPS request to https 38.145.60.21:443
[2025-02-05 12:13:31,993: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,994: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:31,994: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,994: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:31,994: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:31,994: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:31,994: DEBUG/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Setting Github status check 'in_progress' for check 'testing-farm:fedora-42-x86_64:createrepo_c-tests': RPM build is in progress...
[2025-02-05 12:13:32,083: DEBUG/MainProcess] [234] 1738757612.082937: Received answer (1869 bytes) from https 38.145.60.21:443
[2025-02-05 12:13:32,083: DEBUG/MainProcess] [234] 1738757612.082938: Terminating TCP connection to https 38.145.60.21:443
[2025-02-05 12:13:32,083: DEBUG/MainProcess] [234] 1738757612.082939: Sending DNS URI query for _kerberos.FEDORAPROJECT.ORG.
[2025-02-05 12:13:32,084: DEBUG/MainProcess] [234] 1738757612.082940: URI answer: 10 1 "krb5srv:m:kkdcp:https://id.fedoraproject.org/KdcProxy/"
[2025-02-05 12:13:32,084: DEBUG/MainProcess] [234] 1738757612.082941: Response was from primary KDC
[2025-02-05 12:13:32,084: DEBUG/MainProcess] [234] 1738757612.082942: Decoding FAST response
[2025-02-05 12:13:32,084: DEBUG/MainProcess] [234] 1738757612.082943: FAST reply key: aes256-cts/D30E
[2025-02-05 12:13:32,084: DEBUG/MainProcess] [234] 1738757612.082944: TGS reply is for [email protected] -> HTTP/[email protected] with session key aes256-cts/04D7
[2025-02-05 12:13:32,085: DEBUG/MainProcess] [234] 1738757612.082945: TGS request result: 0/Success
[2025-02-05 12:13:32,085: DEBUG/MainProcess] [234] 1738757612.082946: Received creds for desired service HTTP/[email protected]
[2025-02-05 12:13:32,085: DEBUG/MainProcess] [234] 1738757612.082947: Storing [email protected] -> HTTP/openscanhub.fedoraproject.org@ in DIR::/home/packit/kerberos/tkt
[2025-02-05 12:13:32,085: DEBUG/MainProcess] [234] 1738757612.082948: Creating authenticator for [email protected] -> HTTP/openscanhub.fedoraproject.org@, seqnum 110231511, subkey aes256-cts/F17F, session key aes256-cts/04D7
[2025-02-05 12:13:32,236: DEBUG/MainProcess] [234] 1738757612.082950: Read AP-REP, time 1738757612.82949, subkey aes256-sha2/65D5, seqnum 49588984
[2025-02-05 12:13:32,755: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:32,755: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:32,756: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:32,756: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:32,756: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:32,756: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:32,756: DEBUG/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Setting Github status check 'in_progress' for check 'testing-farm:fedora-41-x86_64:dnf5daemon-tests': RPM build is in progress...
[2025-02-05 12:13:32,918: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:32,918: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:32,918: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:32,918: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:32,919: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:32,919: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:32,919: DEBUG/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Setting Github status check 'in_progress' for check 'testing-farm:fedora-41-x86_64:dnf5daemon-tests': RPM build is in progress...
[2025-02-05 12:13:33,231: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:33,231: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:33,231: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:33,231: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Build targets: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'} 
[2025-02-05 12:13:33,232: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Copr chroots: [same ~90-chroot list as above]
[2025-02-05 12:13:33,232: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Result set: {'fedora-40-x86_64', 'fedora-41-x86_64', 'fedora-rawhide-x86_64', 'fedora-42-x86_64'}
[2025-02-05 12:13:33,232: DEBUG/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Setting Github status check 'in_progress' for check 'testing-farm:fedora-42-x86_64:dnf5daemon-tests': RPM build is in progress...
[2025-02-05 12:13:33,359: DEBUG/MainProcess] {"id": 35761, "url": "http://openscanhub.fedoraproject.org/task/35761/"}
[2025-02-05 12:13:33,393: INFO/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Scan submitted successfully.
[2025-02-05 12:13:33,394: DEBUG/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Parsed dict from output: {'id': 35761, 'url': 'http://openscanhub.fedoraproject.org/task/35761/'} 
[2025-02-05 12:13:33,401: INFO/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Cleaning up the mess.
[2025-02-05 12:13:33,401: DEBUG/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Removing contents of the PV.
[2025-02-05 12:13:33,402: DEBUG/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Job finished!
[2025-02-05 12:13:33,402: INFO/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Pushing the metrics to pushgateway.
[2025-02-05 12:13:33,429: INFO/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Task TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] succeeded in 6.4112632339965785s: {'job': {'copr_build-20250205121327025939': {'success': True, 'details': {...}}}, 'event': {'commit_sha': '58dec537fff2264ec44eea0857932382d0cae583', 'base_repo_name': 'automotive-image-builder', 'base_repo_namespace': 'CentOS/automotive/src', 'topic': 'org.fedoraproject.prod.copr.build.end', 'identifier': '198', 'created_at': 1738757606, 'project_url': 'https://gitlab.com/CentOS/automotive/src/automotive-image-builder', '_pr_id': 198, 'actor': None, 'git_ref': '58dec537fff2264ec44eea0857932382d0cae583', 'build_id': 8609012, 'chroot': 'fedora-rawhide-x86_64', 'status': 1, 'owner': 'packit', 'project_name': 'gitlab.com-CentOS-automotive-src-automotive-image-builder-198', 'pkg': 'automotive-image-builder', 'timestamp': 1738757606.896176, 'event_type': 'copr.build.End', 'event_id': 126281, 'task_accepted_time': None}}
[2025-02-05 12:13:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:13:34,006: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Build targets: {'fedora-rawhide-x86_64'} 
[2025-02-05 12:13:34,006: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Copr chroots: [same chroot list as above, elided]
[2025-02-05 12:13:34,006: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Result set: {'fedora-rawhide-x86_64'}
[2025-02-05 12:13:34,012: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Cleaning up the mess.
[2025-02-05 12:13:34,012: DEBUG/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Removing contents of the PV.
[2025-02-05 12:13:34,013: DEBUG/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Job finished!
[2025-02-05 12:13:34,013: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Pushing the metrics to pushgateway.
[2025-02-05 12:13:34,039: INFO/MainProcess] TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] Task TaskName.copr_build_start[e40c4103-3f5b-4c66-9b41-1a1f0acc82cf] succeeded in 5.587283611006569s: {'job': {'copr_build-20250205121328475410': {'success': True, 'details': {...}}}, 'event': {'commit_sha': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'base_repo_name': 'dnf5', 'base_repo_namespace': 'rpm-software-management', 'topic': 'org.fedoraproject.prod.copr.build.start', 'identifier': '1592', 'created_at': 1738757604, 'project_url': 'https://github.com/rpm-software-management/dnf5', '_pr_id': 1592, 'actor': None, 'git_ref': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'build_id': 8609007, 'chroot': 'fedora-41-x86_64', 'status': 3, 'owner': 'packit', 'project_name': 'rpm-software-management-dnf5-1592', 'pkg': 'dnf5', 'timestamp': 1738757604.510865, 'event_type': 'copr.build.Start', 'event_id': 105590, 'task_accepted_time': None}}
[2025-02-05 12:13:34,242: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Build targets: {'fedora-rawhide-x86_64'} 
[2025-02-05 12:13:34,242: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Copr chroots: [same chroot list as above, elided]
[2025-02-05 12:13:34,242: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Result set: {'fedora-rawhide-x86_64'}
[2025-02-05 12:13:34,247: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Cleaning up the mess.
[2025-02-05 12:13:34,248: DEBUG/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Removing contents of the PV.
[2025-02-05 12:13:34,248: DEBUG/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Job finished!
[2025-02-05 12:13:34,248: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Pushing the metrics to pushgateway.
[2025-02-05 12:13:34,277: INFO/MainProcess] TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] Task TaskName.copr_build_start[d41d3073-48b4-4141-a613-ce3326c95e17] succeeded in 5.6973279579979135s: {'job': {'copr_build-20250205121328605943': {'success': True, 'details': {...}}}, 'event': {'commit_sha': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'base_repo_name': 'dnf5', 'base_repo_namespace': 'rpm-software-management', 'topic': 'org.fedoraproject.prod.copr.build.start', 'identifier': '1592', 'created_at': 1738757604, 'project_url': 'https://github.com/rpm-software-management/dnf5', '_pr_id': 1592, 'actor': None, 'git_ref': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'build_id': 8609007, 'chroot': 'fedora-41-x86_64', 'status': 3, 'owner': 'packit', 'project_name': 'rpm-software-management-dnf5-1592', 'pkg': 'dnf5', 'timestamp': 1738757604.510865, 'event_type': 'copr.build.Start', 'event_id': 105590, 'task_accepted_time': None}}
[2025-02-05 12:13:34,453: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Build targets: {'fedora-rawhide-x86_64'} 
[2025-02-05 12:13:34,453: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Copr chroots: [same chroot list as above, elided]
[2025-02-05 12:13:34,454: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Result set: {'fedora-rawhide-x86_64'}
[2025-02-05 12:13:34,459: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Cleaning up the mess.
[2025-02-05 12:13:34,459: DEBUG/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Removing contents of the PV.
[2025-02-05 12:13:34,460: DEBUG/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Job finished!
[2025-02-05 12:13:34,460: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Pushing the metrics to pushgateway.
[2025-02-05 12:13:34,487: INFO/MainProcess] TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] Task TaskName.copr_build_start[e4a885e1-857f-4892-9af4-08c379eb2d51] succeeded in 6.253109352008323s: {'job': {'copr_build-20250205121328251221': {'success': True, 'details': {...}}}, 'event': {'commit_sha': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'base_repo_name': 'dnf5', 'base_repo_namespace': 'rpm-software-management', 'topic': 'org.fedoraproject.prod.copr.build.start', 'identifier': '1592', 'created_at': 1738757604, 'project_url': 'https://github.com/rpm-software-management/dnf5', '_pr_id': 1592, 'actor': None, 'git_ref': '859f5d3b7a8d8787fbf9d91a922197297085d983', 'build_id': 8609007, 'chroot': 'fedora-42-x86_64', 'status': 3, 'owner': 'packit', 'project_name': 'rpm-software-management-dnf5-1592', 'pkg': 'dnf5', 'timestamp': 1738757604.361237, 'event_type': 'copr.build.Start', 'event_id': 105590, 'task_accepted_time': None}}
[2025-02-05 12:13:38,558: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-0
[2025-02-05 12:13:38,558: INFO/MainProcess] missed heartbeat from celery@packit-worker-long-running-1
[2025-02-05 12:13:38,559: INFO/MainProcess] missed heartbeat from celery@packit-worker-short-running-2
[2025-02-05 12:13:38,559: INFO/MainProcess] missed heartbeat from celery@packit-worker-short-running-1
[2025-02-05 12:13:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:13:43,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:13:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:13:53,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:13:58,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:23,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:28,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:14:58,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:18,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:28,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:43,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:48,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:53,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:15:58,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:08,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:28,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:38,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:43,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:48,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:16:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:28,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:17:58,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:04,874: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '4427043c-af63-31ea-b54a-8d3ab7fd4fd5'} ticket:3dbcc4b3-4dc2-403b-b0ff-6126aed65b70]
[2025-02-05 12:18:07,541: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '4439d6a2-1652-3b76-af93-3f3e45dc1ac6'} ticket:a07ed9a2-86d6-4b95-96f5-210d68989d7c]
[2025-02-05 12:18:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:11,150: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'c8894002-ddd6-3e26-8aff-930725a1af1d'} ticket:47d8ccbf-0a06-41fd-ae07-2a187c181f34]
[2025-02-05 12:18:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:18,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:28,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:38,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:18:58,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:14,434: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'ed9edce8-830d-3e16-8c8d-cd971bbafb57'} ticket:00ed98fb-f802-4cd8-ac15-8b32ac36e8f8]
[2025-02-05 12:19:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:28,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:48,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:19:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:08,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:23,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:28,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:33,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:20:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:28,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:33,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:21:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:08,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:28,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:38,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:43,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:48,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:53,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:22:59,502: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '641e13c8-ff8a-3105-97fd-22e5537ef425'} ticket:a7cc20d7-160a-4e3b-ad79-59326e744d03]
[2025-02-05 12:23:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:08,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:23,703: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:28,701: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:33,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:38,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:43,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:48,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:53,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:23:58,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 48 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:28:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:28:04,871: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'c479a24d-b2ea-3811-bb13-8c4e014e50bf'} ticket:27967944-75cc-48b2-991d-e82ad5158920]
[2025-02-05 12:28:07,797: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'd8931460-9f04-33ed-bb9a-ed0d17c69622'} ticket:5165302c-14c3-4ed7-8de8-8f63a2457643]
[2025-02-05 12:28:08,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:28:11,178: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '14e444ed-8fd6-327b-8206-96ac6152b2c2'} ticket:6c51a059-b145-48f3-b2dd-27827623ab79]
[2025-02-05 12:28:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 11 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:29:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:29:14,280: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '81b0b174-58ea-3b29-a1e3-d3054446deea'} ticket:724466ea-0340-4ea3-ab98-d568ef2c60a5]
[2025-02-05 12:29:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 43 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:32:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:32:59,463: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '5174a5dd-0c61-31df-b07f-876887e48c35'} ticket:92915f53-2903-4eac-b6e2-b8fb81586b92]
[2025-02-05 12:33:03,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 59 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:38:03,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:38:05,299: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '8c27cce9-96ae-3356-b711-e342159b8e01'} ticket:f94d57cb-08dd-44d3-a3e2-a6bc800c482c]
[2025-02-05 12:38:07,755: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'fff64c42-12c8-3a69-a536-87a7e7d23b93'} ticket:2f908576-011a-4dea-9386-f2cb82f5528e]
[2025-02-05 12:38:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:38:11,385: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '1e420a0e-bcf5-3c62-8f8c-a70ff5c53a80'} ticket:610bd62a-2ba8-4828-9b2a-89872a77d3f9]
[2025-02-05 12:38:13,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 11 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:39:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:39:14,060: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '1150457b-d961-3ec9-9b4b-ea7baaec9a17'} ticket:7339c434-7d01-46eb-bd2c-5c4693158600]
[2025-02-05 12:39:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[... 43 identical "pidbox received method enable_events()" lines, one every 5 seconds, omitted ...]
[2025-02-05 12:42:58,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:42:59,564: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'd5bf9141-edaf-3736-a8cf-0071f9f0921c'} ticket:2906937d-0753-416e-b331-7292b4f799d7]
[2025-02-05 12:43:03,700: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:08,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:13,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:18,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:23,699: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:28,698: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2025-02-05 12:43:32,454: WARNING/MainProcess] consumer: Connection to broker lost. Trying to re-establish the connection...
Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py", line 340, in start
    blueprint.start(self)
    ~~~~~~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/site-packages/celery/bootsteps.py", line 116, in start
    step.start(parent)
    ~~~~~~~~~~^^^^^^^^
  File "/usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py", line 742, in start
    c.loop(*c.loop_args())
    ~~~~~~^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/celery/worker/loops.py", line 130, in synloop
    connection.drain_events(timeout=2.0)
    ~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/kombu/connection.py", line 341, in drain_events
    return self.transport.drain_events(self.connection, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/kombu/transport/virtual/base.py", line 997, in drain_events
    get(self._deliver, timeout=timeout)
    ~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/kombu/transport/redis.py", line 591, in get
    ret = self.handle_event(fileno, event)
  File "/usr/lib/python3.13/site-packages/kombu/transport/redis.py", line 573, in handle_event
    return self.on_readable(fileno), self
           ~~~~~~~~~~~~~~~~^^^^^^^^
  File "/usr/lib/python3.13/site-packages/kombu/transport/redis.py", line 569, in on_readable
    chan.handlers[type]()
    ~~~~~~~~~~~~~~~~~~~^^
  File "/usr/lib/python3.13/site-packages/kombu/transport/redis.py", line 962, in _brpop_read
    dest__item = self.client.parse_response(self.client.connection,
                                            'BRPOP',
                                            **options)
  File "/usr/lib/python3.13/site-packages/redis/client.py", line 584, in parse_response
    response = connection.read_response()
  File "/usr/lib/python3.13/site-packages/redis/connection.py", line 592, in read_response
    response = self._parser.read_response(disable_decoding=disable_decoding)
  File "/usr/lib/python3.13/site-packages/redis/_parsers/resp2.py", line 15, in read_response
    result = self._read_response(disable_decoding=disable_decoding)
  File "/usr/lib/python3.13/site-packages/redis/_parsers/resp2.py", line 25, in _read_response
    raw = self._buffer.readline()
  File "/usr/lib/python3.13/site-packages/redis/_parsers/socket.py", line 115, in readline
    self._read_from_socket()
    ~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/lib/python3.13/site-packages/redis/_parsers/socket.py", line 68, in _read_from_socket
    raise ConnectionError(SERVER_CLOSED_CONNECTION_ERROR)
redis.exceptions.ConnectionError: Connection closed by server.
[2025-02-05 12:43:32,467: WARNING/MainProcess] Restoring 5 unacknowledged message(s)
[2025-02-05 12:43:32,484: WARNING/MainProcess] /usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py:391: CPendingDeprecationWarning: 
In Celery 5.1 we introduced an optional breaking change which
on connection loss cancels all currently executed tasks with late acknowledgement enabled.
These tasks cannot be acknowledged as the connection is gone, and the tasks are automatically redelivered
back to the queue. You can enable this behavior using the worker_cancel_long_running_tasks_on_connection_loss
setting. In Celery 5.1 it is set to False by default. The setting will be set to True by default in Celery 6.0.

  warnings.warn(CANCEL_TASKS_BY_DEFAULT, CPendingDeprecationWarning)

[2025-02-05 12:43:32,485: INFO/MainProcess] Temporarily reducing the prefetch count to 1 to avoid over-fetching since 0 tasks are currently being processed.
The prefetch count will be gradually restored to 0 as the tasks complete processing.
[2025-02-05 12:43:32,485: DEBUG/MainProcess] | Consumer: Restarting event loop...
[2025-02-05 12:43:32,485: DEBUG/MainProcess] | Consumer: Restarting Heart...
[2025-02-05 12:43:32,486: DEBUG/MainProcess] | Consumer: Restarting Gossip...
[2025-02-05 12:43:32,487: DEBUG/MainProcess] | Consumer: Restarting Control...
[2025-02-05 12:43:32,487: DEBUG/MainProcess] Waiting for broadcast thread to shutdown...
[2025-02-05 12:43:32,708: DEBUG/MainProcess] | Consumer: Restarting Tasks...
[2025-02-05 12:43:32,708: DEBUG/MainProcess] Canceling task consumer...
[2025-02-05 12:43:32,709: DEBUG/MainProcess] | Consumer: Restarting Mingle...
[2025-02-05 12:43:32,709: DEBUG/MainProcess] | Consumer: Restarting Events...
[2025-02-05 12:43:32,709: DEBUG/MainProcess] | Consumer: Restarting Connection...
[2025-02-05 12:43:32,709: DEBUG/MainProcess] | Consumer: Starting Connection
[2025-02-05 12:43:32,710: WARNING/MainProcess] /usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2025-02-05 12:43:32,713: INFO/MainProcess] Connected to redis://valkey:6379/0
[2025-02-05 12:43:32,713: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:32,714: DEBUG/MainProcess] | Consumer: Starting Events
[2025-02-05 12:43:32,714: WARNING/MainProcess] /usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2025-02-05 12:43:32,718: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:32,718: DEBUG/MainProcess] | Consumer: Starting Mingle
[2025-02-05 12:43:32,718: INFO/MainProcess] mingle: searching for neighbors
[2025-02-05 12:43:33,732: INFO/MainProcess] mingle: sync with 4 nodes
[2025-02-05 12:43:33,733: DEBUG/MainProcess] mingle: processing reply from celery@packit-worker-long-running-1
[2025-02-05 12:43:33,733: DEBUG/MainProcess] mingle: processing reply from celery@packit-worker-short-running-2
[2025-02-05 12:43:33,733: DEBUG/MainProcess] mingle: processing reply from celery@packit-worker-short-running-1
[2025-02-05 12:43:33,733: DEBUG/MainProcess] mingle: processing reply from celery@packit-worker-long-running-0
[2025-02-05 12:43:33,733: INFO/MainProcess] mingle: sync complete
[2025-02-05 12:43:33,733: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:33,734: DEBUG/MainProcess] | Consumer: Starting Tasks
[2025-02-05 12:43:33,750: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:33,750: DEBUG/MainProcess] | Consumer: Starting Control
[2025-02-05 12:43:33,750: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:33,750: DEBUG/MainProcess] | Consumer: Starting Gossip
[2025-02-05 12:43:33,751: WARNING/MainProcess] /usr/lib/python3.13/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
whether broker connection retries are made during startup in Celery 6.0 and above.
If you wish to retain the existing behavior for retrying connections on startup,
you should set broker_connection_retry_on_startup to True.
  warnings.warn(

[2025-02-05 12:43:33,755: INFO/MainProcess] pidbox: Connected to redis://valkey:6379/0.
[2025-02-05 12:43:33,759: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:33,759: DEBUG/MainProcess] | Consumer: Starting Heart
[2025-02-05 12:43:33,763: DEBUG/MainProcess] ^-- substep ok
[2025-02-05 12:43:33,763: DEBUG/MainProcess] | Consumer: Starting event loop
[2025-02-05 12:43:33,764: DEBUG/MainProcess] basic.qos: prefetch_count->1
[2025-02-05 12:43:33,773: INFO/MainProcess] Task task.run_copr_build_end_handler[72577c95-8218-4ad7-84a7-994b75e30911] received
[2025-02-05 12:43:33,776: DEBUG/MainProcess] TaskPool: Apply <function fast_trace_task at 0x7f3c23ac0400> (args:('task.run_copr_build_end_handler', '72577c95-8218-4ad7-84a7-994b75e30911', {'lang': 'py', 'task': 'task.run_copr_build_end_handler', 'id': '72577c95-8218-4ad7-84a7-994b75e30911', 'shadow': None, 'eta': None, 'expires': None, 'group': '0c5dc4b7-abfd-4d35-94ac-40bb8e53bcc9', 'group_index': 0, 'retries': 0, 'timelimit': [None, None], 'root_id': '814e3502-0d77-44de-9290-3c62ef49b67b', 'parent_id': '814e3502-0d77-44de-9290-3c62ef49b67b', 'argsrepr': '()', 'kwargsrepr': "{'package_config': {'jobs': [{...}, {...}, {...}], 'packages': {'automotive-image-builder': {...}}}, 'job_config': {'job': 'copr_build', 'trigger': 'pull_request', 'skip_build': False, 'manual_trigger': False, 'labels': None, 'packages': {'automotive-image-builder': {...}}, 'package': None, 'sidetag_group': None, 'dependents': None, 'dependencies': None}, 'event': {'commit_sha': '58dec537fff2264ec44eea0857932382d0cae583', 'base_repo_name': 'automotive-image-builder', 'base_repo_namespace': 'CentOS/automotive/src', 'topic':... kwargs:{})
[2025-02-05 12:43:33,782: DEBUG/MainProcess] TaskName.copr_build_end[72577c95-8218-4ad7-84a7-994b75e30911] Repo config after pre-loading:
{

@lbarcziova
Member Author

thanks @majamassarini for the logs! I will have a better look into the missed heartbeat issue.

I think because we are detecting it just when this happens for all the workers at the same time; in this case it happened just for a single worker

makes sense, will try to adjust the alerting to account for this

@lbarcziova
Member Author

lbarcziova commented Feb 14, 2025

I added the alert for single short-running workers: https://gitlab.cee.redhat.com/osci/monitoring/-/merge_requests/561. This seems to work and detected one occurrence of a worker not receiving events a while ago (the affected worker was short-running-1).


During that, I ran a few diagnostic commands, which suggest that the worker itself was healthy:

sh-5.2$ celery -A packit_service.worker.tasks inspect ping -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
        pong

1 node online.

sh-5.2$ celery -A packit_service.worker.tasks inspect active -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
    - empty -

sh-5.2$ celery -A packit_service.worker.tasks inspect reserved -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
    - empty -

1 node online.
sh-5.2$ celery -A packit_service.worker.tasks inspect scheduled -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
    - empty -

1 node online.
sh-5.2$ celery -A packit_service.worker.tasks inspect stats -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
    {
        "broker": {
            "alternates": [],
            "connect_timeout": null,
            "failover_strategy": "round-robin",
            "heartbeat": 120.0,
            "hostname": "valkey",
            "insist": false,
            "login_method": null,
            "port": 6379,
            "ssl": false,
            "transport": "redis",
            "transport_options": {},
            "uri_prefix": null,
            "userid": null,
            "virtual_host": "0"
        },
        "clock": "5298253",
        "pid": 1,
        "pool": {
            "implementation": "celery.concurrency.gevent:TaskPool",
            "max-concurrency": 16
        },
        "prefetch_count": 16,
        "rusage": {
            "idrss": 0,
            "inblock": 5736,
            "isrss": 0,
            "ixrss": 0,
            "majflt": 4,
            "maxrss": 405820,
            "minflt": 2931799,
            "msgrcv": 0,
            "msgsnd": 0,
            "nivcsw": 313662,
            "nsignals": 0,
            "nswap": 0,
            "nvcsw": 3938541,
            "oublock": 1639720,
            "stime": 215.717188,
            "utime": 10510.710481
        },
        "total": {
            "task.koji_build_tag": 2,
            "task.openscanhub_task_finished": 34,
            "task.openscanhub_task_started": 40,
            "task.run_copr_build_end_handler": 1363,
            "task.run_copr_build_start_handler": 1335,
            "task.run_downstream_koji_build_report_handler": 48,
            "task.run_koji_build_report_handler": 10,
            "task.run_testing_farm_handler": 775,
            "task.run_testing_farm_results_handler": 999,
            "task.steve_jobs.process_message": 5116
        },
        "uptime": 84709
    }


sh-5.2$ celery -A packit_service.worker.tasks inspect active_queues -d celery@packit-worker-short-running-1
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-1: OK
    * {'name': 'short-running', 'exchange': {'name': 'short-running', 'type': 'direct', 'arguments': None, 'durable': True, 'passive': False, 'auto_delete': False, 'delivery_mode': None, 'no_declare': False}, 'routing_key': 'short-running', 'queue_arguments': None, 'binding_arguments': None, 'consumer_arguments': None, 'durable': True, 'exclusive': False, 'auto_delete': False, 'no_ack': False, 'alias': None, 'bindings': [], 'no_declare': None, 'expires': None, 'message_ttl': None, 'max_length': None, 'max_length_bytes': None, 'max_priority': None}

For comparison, here are outputs of some of these commands run on a worker that was processing events normally:

sh-5.2$ celery -A packit_service.worker.tasks inspect stats -d celery@packit-worker-short-running-2
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-2: OK
    {
        "broker": {
            "alternates": [],
            "connect_timeout": null,
            "failover_strategy": "round-robin",
            "heartbeat": 120.0,
            "hostname": "valkey",
            "insist": false,
            "login_method": null,
            "port": 6379,
            "ssl": false,
            "transport": "redis",
            "transport_options": {},
            "uri_prefix": null,
            "userid": null,
            "virtual_host": "0"
        },
        "clock": "5304669",
        "pid": 1,
        "pool": {
            "implementation": "celery.concurrency.gevent:TaskPool",
            "max-concurrency": 16
        },
        "prefetch_count": 16,
        "rusage": {
            "idrss": 0,
            "inblock": 9768,
            "isrss": 0,
            "ixrss": 0,
            "majflt": 62,
            "maxrss": 878396,
            "minflt": 2343935,
            "msgrcv": 0,
            "msgsnd": 0,
            "nivcsw": 375961,
            "nsignals": 0,
            "nswap": 0,
            "nvcsw": 4117959,
            "oublock": 1395488,
            "stime": 189.064118,
            "utime": 10527.809781
        },
        "total": {
            "task.openscanhub_task_finished": 37,
            "task.openscanhub_task_started": 32,
            "task.run_copr_build_end_handler": 1391,
            "task.run_copr_build_start_handler": 1371,
            "task.run_downstream_koji_build_report_handler": 54,
            "task.run_koji_build_report_handler": 12,
            "task.run_testing_farm_handler": 777,
            "task.run_testing_farm_results_handler": 1047,
            "task.run_vm_image_build_handler": 1,
            "task.steve_jobs.process_message": 5103
        },
        "uptime": 84969
    }

sh-5.2$ celery -A packit_service.worker.tasks inspect active_queues -d celery@packit-worker-short-running-2
Ignored keyword arguments: {'type': 'pagure'}
->  celery@packit-worker-short-running-2: OK
    * {'name': 'short-running', 'exchange': {'name': 'short-running', 'type': 'direct', 'arguments': None, 'durable': True, 'passive': False, 'auto_delete': False, 'delivery_mode': None, 'no_declare': False}, 'routing_key': 'short-running', 'queue_arguments': None, 'binding_arguments': None, 'consumer_arguments': None, 'durable': True, 'exclusive': False, 'auto_delete': False, 'no_ack': False, 'alias': None, 'bindings': [], 'no_declare': None, 'expires': None, 'message_ttl': None, 'max_length': None, 'max_length_bytes': None, 'max_priority': None}

1 node online.



A few minutes later, the same idle state occurred for short-running-2, and what's strange is that the graph indicates tasks were being constantly executed during this "idle" time:

Image



As for the 30-minute restart time, I think that comes from the valkey timeout setting here. Until we have something more, we could try lowering this timeout to help with earlier recovery. Additionally, I found the --max-tasks-per-child setting in the Celery documentation, which we could also experiment with as a workaround for potential memory leaks, related to #2522.
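For experimenting, both knobs could be expressed in the Celery configuration — a sketch with guessed values, not our actual deployment settings (and note that `worker_max_tasks_per_child` only applies to the prefork pool, while the stats above show gevent):

```python
# Hypothetical celeryconfig.py-style sketch of the knobs discussed above;
# the concrete values are guesses for experimentation, not current settings.
broker_url = "redis://valkey:6379/0"

# A lower socket timeout could let a wedged consumer notice a dead broker
# connection well before the ~30 minutes we observe today.
broker_transport_options = {
    "socket_timeout": 120,      # seconds before a blocking read gives up
    "socket_keepalive": True,   # TCP keepalives to detect half-open links
}

# Recycle each pool process after N tasks as a leak workaround; this
# setting only affects the prefork pool, so it would require a pool change.
worker_max_tasks_per_child = 500
```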

@majamassarini
Member



A few minutes later, the same idle state occurred for short-running-2, and what's strange is that the graph indicates tasks were being constantly executed during this "idle" time:

I see that the graph indicates there was a gap in task execution, 30 minutes long, for short-running worker 1. I can't see anything for worker 2, so I am wondering if we should increase the time range in the alert rule?

Image

@mfocko
Member

mfocko commented Feb 14, 2025

Judging from the posted graphs, we don't get over 8 concurrently running tasks per worker, but we have

https://github.com/packit/deployment/blob/2cd9e846a6ab1c320f6a6c589e4bcab6b3cfb762/openshift/packit-worker.yml.j2#L132-L135

16 set… should we try lowering it to 8 and see if it helps in any way?
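If we try it, it would be a one-flag change on the worker invocation — a hypothetical sketch (the real command lives in the deployment repo; queue name taken from the active_queues output above):

```shell
# Hypothetical worker invocation with lowered concurrency.
exec celery --app=packit_service.worker.tasks worker \
    --pool=gevent \
    --concurrency=8 \
    --queues=short-running
```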

@majamassarini
Member

majamassarini commented Feb 14, 2025



As for the 30-minute restart time, I think that comes from the valkey timeout setting here. Until we have something more, we could try lowering this timeout to help with earlier recovery. Additionally, I found the --max-tasks-per-child setting in the Celery documentation, which we could also experiment with as a workaround for potential memory leaks, related to #2522.

About --max-tasks-per-child, I have read that it can somehow decrease performance, so I would first investigate more with the actual configuration.
I am not so sure that we have a memory leak (since the latest increase in the memory limit we have never hit the new limit, not even when our workers started misbehaving). I think the short-running workers simply have a high memory need because, on average, they execute many more tasks than the long-running ones.

I do think we have a Celery issue, but it is no longer related to memory or CPU limits.
I think we are somehow stressing Celery, but not with tasks.


There is a strange correlation between CPU usage / Copr metrics and Celery misbehaving. It seems that when we enqueue many Copr builds, the CPU usage increases and Celery starts losing heartbeats.

Image

Image


The other strange thing is that when we restart the pods, the enqueued Copr builds drop and the pods start behaving well again.
In the following graphs there are two big slopes, both caused by me restarting the pods (one on the 12th around 12 pm and one on the 13th around 12 pm).

Image

Image


One fun thing is that there is no correlation between queued Copr builds and started/ended Copr builds.
Could it be that we are putting the same task into the Celery queue multiple times? Or is pushing the metrics multiple times costing us a lot of CPU cycles?
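If duplicate metric pushes are the culprit, one way to rule them out is to guard the pushgateway call so that a single task run pushes at most once — an illustrative sketch, not the actual packit-service code (`push_fn` stands in for something like `prometheus_client.push_to_gateway`):

```python
# Illustrative sketch: ensure one task run pushes metrics at most once,
# no matter how many handlers try to report.
class OncePerTaskPusher:
    def __init__(self, push_fn):
        self._push_fn = push_fn  # the real gateway call, injected
        self._pushed = False

    def push(self, metrics):
        if self._pushed:
            return False  # drop duplicate pushes within the same task
        self._push_fn(metrics)
        self._pushed = True
        return True


calls = []
pusher = OncePerTaskPusher(calls.append)
pusher.push({"queued_copr_builds": 3})
pusher.push({"queued_copr_builds": 3})  # second attempt is ignored
assert len(calls) == 1
```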

Image

Image

Image


There seems to be one more correlation between processed events, CPU usage, and pods misbehaving, but honestly I cannot explain it and I can't find any evidence in the logs.
This also seems somehow related to the Copr logs, but I don't know how.

Image


I would probably:

  • try to fix this and see if it could be related.
  • improve logging to better understand what kind of events drop after a restart and sometimes increase so much (external ones from GitHub/GitLab? internal ones from fedmsg? something generated by us, I don't know how?)

@majamassarini
Member

Judging from the posted graphs, we don't get over 8 concurrently running tasks per worker, but we have

https://github.com/packit/deployment/blob/2cd9e846a6ab1c320f6a6c589e4bcab6b3cfb762/openshift/packit-worker.yml.j2#L132-L135

16 set… should we try lowering it to 8 and see if it helps in any way?

Not on this day, but I have seen all 16 used sometimes. I think that having 4 workers somehow helps in not needing so much concurrency.

lbarcziova added a commit to lbarcziova/packit-service that referenced this issue Feb 15, 2025
Previously, metrics could be pushed multiple times within a single process_message task.
This could potentially lead to increased CPU usage, redundant network requests, and metric inconsistencies.
Related to packit#2430 packit#2697
softwarefactory-project-zuul bot added a commit that referenced this issue Feb 17, 2025
Push metrics once per process_message task

Previously, metrics could be pushed multiple times within a single process_message task. This could potentially lead to increased CPU usage, redundant network requests, and metric inconsistencies. Related to #2430 #2697

Reviewed-by: Matej Focko
Reviewed-by: Maja Massarini
lbarcziova added a commit to lbarcziova/packit-service that referenced this issue Feb 17, 2025
Previously, `cls.pushgateway` was used in `process_message`, possibly causing race conditions
in multithreaded Celery tasks, leading to inaccurate metric reporting.
Related to packit#2430 packit#2697
softwarefactory-project-zuul bot added a commit that referenced this issue Feb 17, 2025
Initialise Pushgateway per instance of SteveJobs

Previously, cls.pushgateway was used in process_message, possibly causing race conditions in multithreaded Celery tasks, leading to inaccurate metric reporting. Related to #2430 #2697

Reviewed-by: Matej Focko
@lbarcziova
Member Author

lbarcziova commented Feb 17, 2025

I implemented #2722 and #2723, which seem to fix the metrics (#2430). As described in #2723, the very high values of queued Copr builds were caused by multiple tasks pushing metrics concurrently using the same SteveJobs instance in a multithreaded Celery environment. With concurrency set to 16, each thread could have pushed metrics multiple times, and when the load was higher (higher CPU), this could have caused e.g. network congestion and additional load (interacting with the pushgateway from multiple threads at the same time). I'm not yet sure whether this is the underlying issue behind the "idle workers", so I will continue to monitor the behavior now that the pushgateway code is fixed.
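The class-level vs. per-instance difference can be reduced to a small illustration (the class names here are made up, not the actual packit-service classes):

```python
# Why class-level state shared by all instances can mix up metrics across
# concurrently running tasks, while per-instance state stays isolated.
class SharedState:
    events = []  # class attribute: one list shared by every instance

    def process(self, event):
        self.events.append(event)  # mutates the shared class-level list


class PerInstanceState:
    def __init__(self):
        self.events = []  # fresh list for each instance (each task run)

    def process(self, event):
        self.events.append(event)


a, b = SharedState(), SharedState()
a.process("pr.opened")
b.process("copr.build.end")
# both "tasks" now see each other's events:
assert a.events == ["pr.opened", "copr.build.end"]

c, d = PerInstanceState(), PerInstanceState()
c.process("pr.opened")
d.process("copr.build.end")
assert c.events == ["pr.opened"]
assert d.events == ["copr.build.end"]
```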

Projects
Status: in-progress
Development

No branches or pull requests

4 participants