"Transport endpoint is not connected" after pod eviction #156

Open
idavydoff opened this issue Jan 27, 2025 · 0 comments

idavydoff commented Jan 27, 2025

After a pod with an attached csi-s3 PV is evicted for exceeding its ephemeral-storage limit, the driver on that node can no longer mount the PVC into the replacement pod, and a zombie geesefs process is left behind.

The only workaround is to restart the csi-s3 pod on the node.
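
As far as I can tell, the "transport endpoint is not connected" error in the node log below is ENOTCONN, which stat() returns for a FUSE mountpoint whose backing geesefs process has died. A minimal Go sketch (not the driver's code; the path is only illustrative) that detects this condition:

    package main

    import (
        "errors"
        "fmt"
        "os"
        "syscall"
    )

    func main() {
        // Illustrative stale staging path; substitute the real globalmount
        // path from the affected node.
        path := "/var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/<volume-hash>/globalmount"

        _, err := os.Stat(path)
        switch {
        case err == nil:
            fmt.Println("mountpoint looks healthy")
        case errors.Is(err, syscall.ENOTCONN):
            // The "transport endpoint is not connected" case from the log:
            // the FUSE mount is still registered, but geesefs is gone.
            fmt.Println("stale FUSE mount detected:", err)
        default:
            fmt.Println("stat failed for another reason:", err)
        }
    }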

  1. Eviction log
    I0127 16:14:50.379658 4710 event.go:307] "Event occurred" object="confluence/confluence-1-0" fieldPath="" kind="Pod" apiVersion="v1" type="Warning" reason="Evicted" message="Pod ephemeral local storage usage exceeds the total limit of containers 4608Mi. "

  2. csi-s3 pod log
    I0127 13:05:41.667912 1 utils.go:97] GRPC call: /csi.v1.Node/NodePublishVolume
    I0127 13:05:41.668024 1 nodeserver.go:121] target /var/lib/kubelet/pods/c18106dd-4875-4116-b166-ab911e4323a3/volumes/kubernetes.io~csi/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214/mount
    readonly false
    volumeId devops-pvc/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214
    attributes map[bucket:devops-pvc capacity:1073741824000 mounter:geesefs options:--dir-mode 0777 --file-mode 0666 --uid 2001 --gid 2001 --memory-limit 500 --entry-limit 200000 --subdomain --debug --debug_s3 --debug_fuse storage.kubernetes.io/csiProvisionerIdentity:1737980062853-8081-ru.yandex.s3.csi]
    mountflags []
    I0127 13:05:41.668072 1 nodeserver.go:126] Binding volume devops-pvc/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214 from /var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/b396c97efee5ceef2d8da2bfd526729ac5aeb3b9e02bc20e3a3f51f8f9d91bb1/globalmount to /var/lib/kubelet/pods/c18106dd-4875-4116-b166-ab911e4323a3/volumes/kubernetes.io~csi/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214/mount
    I0127 13:05:41.668838 1 nodeserver.go:132] s3: volume devops-pvc/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214 successfully mounted to /var/lib/kubelet/pods/c18106dd-4875-4116-b166-ab911e4323a3/volumes/kubernetes.io~csi/pvc-d4dee633-0cab-45c5-a86f-2d6c207c3214/mount
    I0127 13:06:01.630882 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:06:51.897093 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:07:03.095046 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:08:28.261525 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:08:37.169111 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:10:26.096836 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:10:35.058964 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:11:47.679974 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:11:47.993527 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:13:20.190600 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:13:23.281797 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:14:41.622626 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:12.761138 1 utils.go:97] GRPC call: /csi.v1.Node/NodeUnpublishVolume
    I0127 13:15:12.762224 1 nodeserver.go:152] s3: volume devops-pvc/pvc-454a177e-c8c6-4130-8dfb-08ecc207e280 has been unmounted.
    I0127 13:15:12.864900 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:12.874482 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:12.875944 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:12.876483 1 utils.go:97] GRPC call: /csi.v1.Node/NodePublishVolume
    E0127 13:15:12.876538 1 utils.go:101] GRPC error: rpc error: code = Internal desc = stat /var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/57753c4990a325f341d03af83dc39454e977dd796d2a097d2f462a935aa4736e/globalmount: transport endpoint is not connected
    I0127 13:15:13.496718 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:13.506717 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:13.507321 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:13.507800 1 utils.go:97] GRPC call: /csi.v1.Node/NodePublishVolume
    E0127 13:15:13.507854 1 utils.go:101] GRPC error: rpc error: code = Internal desc = stat /var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/57753c4990a325f341d03af83dc39454e977dd796d2a097d2f462a935aa4736e/globalmount: transport endpoint is not connected
    I0127 13:15:14.577846 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:14.585565 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:14.586259 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:14.586798 1 utils.go:97] GRPC call: /csi.v1.Node/NodePublishVolume
    E0127 13:15:14.586863 1 utils.go:101] GRPC error: rpc error: code = Internal desc = stat /var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/57753c4990a325f341d03af83dc39454e977dd796d2a097d2f462a935aa4736e/globalmount: transport endpoint is not connected
    I0127 13:15:16.689554 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:16.699563 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
    I0127 13:15:16.700848 1 utils.go:97] GRPC call: /csi.v1.Node/NodeGetCapabilities
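
A possible mitigation (just an assumption on my side, not how the driver currently behaves) would be to probe the staging path before publishing and lazily detach it when it reports ENOTCONN, so a fresh geesefs mount can be created instead of NodePublishVolume failing repeatedly. A hedged sketch with a hypothetical helper:

    package main

    import (
        "errors"
        "fmt"
        "os"
        "syscall"
    )

    // cleanupStaleFuseMount is a hypothetical helper: if stat() on the staging
    // path reports ENOTCONN (the geesefs process is gone), lazily detach the
    // mount so a subsequent stage/publish can remount it.
    func cleanupStaleFuseMount(stagingPath string) error {
        _, err := os.Stat(stagingPath)
        if err == nil {
            return nil // mount is healthy, nothing to do
        }
        if !errors.Is(err, syscall.ENOTCONN) {
            return err // missing path or some other failure, report as-is
        }
        // MNT_DETACH performs a lazy unmount, the same as `umount -l`.
        if uerr := syscall.Unmount(stagingPath, syscall.MNT_DETACH); uerr != nil {
            return fmt.Errorf("lazy unmount of %s: %w", stagingPath, uerr)
        }
        return nil
    }

    func main() {
        // Illustrative path; substitute the real globalmount path from the node.
        staging := "/var/lib/kubelet/plugins/kubernetes.io/csi/ru.yandex.s3.csi/<volume-hash>/globalmount"
        if err := cleanupStaleFuseMount(staging); err != nil {
            fmt.Println("cleanup failed:", err)
        }
    }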
