Apr 24 14:23:51.323913 ip-10-0-128-169 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 14:23:51.794505 ip-10-0-128-169 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:51.794505 ip-10-0-128-169 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 14:23:51.794505 ip-10-0-128-169 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:51.794505 ip-10-0-128-169 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 14:23:51.795021 ip-10-0-128-169 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 14:23:51.797119 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.797012 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 14:23:51.805279 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805251 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 14:23:51.805279 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805271 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 14:23:51.805279 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805276 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 14:23:51.805279 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805280 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 14:23:51.805279 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805284 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805289 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805293 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805297 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805301 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805305 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805309 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805312 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805316 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805320 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805324 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805328 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805332 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805336 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805340 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805344 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805348 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805351 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805364 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 14:23:51.805557 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805371 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
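
Note: the deprecation warnings above say these options belong in the file passed via --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump later in this log). Below is a minimal illustrative KubeletConfiguration sketch of where they would go, not the contents of this node's actual file; the endpoint, plugin dir, and system-reserved values are copied from this log's own FLAG output, while the evictionHard threshold is a hypothetical placeholder.

# Illustrative KubeletConfiguration sketch only (assumed values noted in comments).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # replaces --container-runtime-endpoint (socket path from the FLAG dump)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir (path from the FLAG dump)
systemReserved:                                               # replaces --system-reserved (values from the FLAG dump)
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
evictionHard:                                                 # eviction settings replace --minimum-container-ttl-duration per the warning
  memory.available: "100Mi"                                   # hypothetical placeholder threshold, not taken from this node
# --pod-infra-container-image is not shown: per the warning above it is simply
# being removed, with sandbox image information coming from the CRI runtime.
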
Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805377 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805382 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805386 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805390 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805395 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805399 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805404 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805408 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805412 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805417 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805421 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805425 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805429 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805433 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805438 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805443 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805448 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805452 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:51.806341 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805456 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805460 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805464 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805477 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:51.807079 
ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805482 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805485 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805489 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805494 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805498 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805502 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805508 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805512 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805516 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805520 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805524 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805529 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805533 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805537 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805541 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805545 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805550 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:51.807079 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805553 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805557 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805561 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805565 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805569 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805573 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:51.807814 
ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805577 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805582 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805587 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805591 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805596 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805619 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805625 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805629 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805633 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805649 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805655 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805658 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805665 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:23:51.807814 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805672 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805677 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805681 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.805686 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806332 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806341 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806346 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806351 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806355 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806359 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806364 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806368 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806372 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806376 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806380 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806385 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806390 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806394 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806398 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806402 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:51.808618 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806408 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806413 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:23:51.809217 ip-10-0-128-169 
kubenswrapper[2572]: W0424 14:23:51.806417 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806421 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806425 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806429 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806433 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806437 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806452 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806457 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806460 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806465 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806469 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806473 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806478 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806482 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806486 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806490 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806494 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806499 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:23:51.809217 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806505 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806509 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806513 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806517 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806521 2572 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806525 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806529 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806533 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806537 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806541 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806545 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806549 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806554 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806558 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806562 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806569 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806575 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806579 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806585 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:23:51.809779 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806591 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806595 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806627 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806633 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806637 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806643 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806646 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806651 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806655 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806659 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806663 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806667 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806672 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806676 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806680 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806684 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806688 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806692 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806696 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806701 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:51.810333 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806705 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806710 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806714 2572 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806718 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806722 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806727 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806731 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806735 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806739 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806743 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.806747 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807353 2572 flags.go:64] FLAG: --address="0.0.0.0" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807367 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807381 2572 flags.go:64] FLAG: --anonymous-auth="true" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807388 2572 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807404 2572 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807409 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807416 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807423 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807428 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807433 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 24 14:23:51.810905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807439 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807444 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807450 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807455 2572 flags.go:64] FLAG: --cgroup-root="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807460 2572 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807464 2572 
flags.go:64] FLAG: --client-ca-file="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807469 2572 flags.go:64] FLAG: --cloud-config="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807474 2572 flags.go:64] FLAG: --cloud-provider="external" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807479 2572 flags.go:64] FLAG: --cluster-dns="[]" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807490 2572 flags.go:64] FLAG: --cluster-domain="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807494 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807500 2572 flags.go:64] FLAG: --config-dir="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807504 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807510 2572 flags.go:64] FLAG: --container-log-max-files="5" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807517 2572 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807522 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807527 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807532 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807537 2572 flags.go:64] FLAG: --contention-profiling="false" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807542 2572 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807546 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807551 2572 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807556 2572 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807563 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807568 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 14:23:51.811524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807573 2572 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807577 2572 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807589 2572 flags.go:64] FLAG: --enable-server="true" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807595 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807623 2572 flags.go:64] FLAG: --event-burst="100" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807629 2572 flags.go:64] FLAG: --event-qps="50" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807634 2572 
flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807638 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807643 2572 flags.go:64] FLAG: --eviction-hard="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807650 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807655 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807660 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807665 2572 flags.go:64] FLAG: --eviction-soft="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807670 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807675 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807679 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807684 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807691 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807696 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807701 2572 flags.go:64] FLAG: --feature-gates="" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807707 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807712 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807717 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807722 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807727 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 24 14:23:51.812347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807732 2572 flags.go:64] FLAG: --help="false" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807737 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-128-169.ec2.internal" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807743 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807748 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807752 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807758 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:23:51.807764 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807769 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807773 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807778 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807791 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807797 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807802 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807807 2572 flags.go:64] FLAG: --kube-reserved="" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807811 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807816 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807822 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807826 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807831 2572 flags.go:64] FLAG: --lock-file="" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807836 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807841 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807847 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807855 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 14:23:51.812972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807860 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807866 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807871 2572 flags.go:64] FLAG: --logging-format="text" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807875 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807881 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807886 2572 flags.go:64] FLAG: --manifest-url="" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807891 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807898 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807903 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: 
I0424 14:23:51.807909 2572 flags.go:64] FLAG: --max-pods="110" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807914 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807919 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807924 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807929 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807934 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807939 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807944 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807955 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807962 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807967 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807980 2572 flags.go:64] FLAG: --pod-cidr="" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807985 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807993 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.807998 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 14:23:51.813669 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808003 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808008 2572 flags.go:64] FLAG: --port="10250" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808013 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808017 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04393041b90a9f296" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808023 2572 flags.go:64] FLAG: --qos-reserved="" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808027 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808032 2572 flags.go:64] FLAG: --register-node="true" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808037 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808041 2572 flags.go:64] FLAG: --register-with-taints="" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808047 2572 flags.go:64] FLAG: --registry-burst="10" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808055 2572 flags.go:64] FLAG: 
--registry-qps="5" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808060 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808065 2572 flags.go:64] FLAG: --reserved-memory="" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808071 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808076 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808081 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808085 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808090 2572 flags.go:64] FLAG: --runonce="false" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808095 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808100 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808105 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808109 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808114 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808119 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808123 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808128 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 14:23:51.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808137 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808142 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808147 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808159 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808164 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808169 2572 flags.go:64] FLAG: --system-cgroups="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808174 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808183 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808187 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808192 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808204 2572 flags.go:64] 
FLAG: --tls-min-version="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808210 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808214 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808219 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808224 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808229 2572 flags.go:64] FLAG: --v="2" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808238 2572 flags.go:64] FLAG: --version="false" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808244 2572 flags.go:64] FLAG: --vmodule="" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808250 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.808255 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808418 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808424 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808428 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808433 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:51.814956 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808438 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808442 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808446 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808450 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808454 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808458 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808463 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808467 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808473 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808477 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:23:51.815522 ip-10-0-128-169 
kubenswrapper[2572]: W0424 14:23:51.808481 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808485 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808497 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808502 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808506 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808510 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808514 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808519 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808525 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:23:51.815522 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808532 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808537 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808541 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808546 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808550 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808556 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808560 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808564 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808568 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808572 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808577 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808581 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808586 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808590 2572 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808594 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808598 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808619 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808623 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808627 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808631 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:51.816031 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808635 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808641 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808645 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808649 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808653 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808657 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808669 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808674 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808678 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808682 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808686 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808691 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808696 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808700 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808704 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808708 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:51.816526 
ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808713 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808717 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808721 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808725 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:51.816526 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808729 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808733 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808737 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808741 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808745 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808749 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808753 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808757 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808762 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808766 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808770 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808775 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808779 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808785 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808789 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808793 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808797 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808801 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808805 2572 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:51.817071 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808819 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:23:51.817542 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808825 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:51.817542 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808830 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:51.817542 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.808834 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:51.817542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.809697 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:23:51.817665 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.817586 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 14:23:51.817665 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.817617 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817674 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817679 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817683 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817686 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817689 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817692 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817695 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817698 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817701 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817704 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817706 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817709 2572 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817712 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817714 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817717 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817720 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817723 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817726 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817729 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:51.817722 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817732 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817734 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817737 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817740 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817742 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817745 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817748 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817750 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817753 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817755 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817758 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817760 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817763 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817765 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817768 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817771 2572 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817773 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817776 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817778 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817781 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:51.818208 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817783 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817786 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817788 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817791 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817793 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817796 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817798 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817802 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817806 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817809 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817812 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817814 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817817 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817819 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817822 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817825 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817827 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817830 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817832 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:51.818745 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817835 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817838 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817840 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817843 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817846 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817848 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817851 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817853 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817855 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817858 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817861 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:51.819236 ip-10-0-128-169 
kubenswrapper[2572]: W0424 14:23:51.817864 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817868 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817871 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817874 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817877 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817879 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817882 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817884 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 14:23:51.819236 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817887 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817889 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817892 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817894 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817897 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817899 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817902 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817904 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.817907 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.817912 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818008 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818012 2572 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818029 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818033 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818039 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 14:23:51.819717 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818042 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818045 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818048 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818051 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818053 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818056 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818059 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818061 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818064 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818067 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818070 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818072 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818075 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818077 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818079 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818082 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818084 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818087 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818089 2572 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818092 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 14:23:51.820098 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818094 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818097 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818099 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818102 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818104 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818107 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818110 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818112 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818115 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818118 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818122 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818125 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818128 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818130 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818133 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818135 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818137 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818141 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
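The "unrecognized feature gate" warnings above are repeated for each pass the wrapper makes over the OpenShift feature-gate set; only gate names the embedded kubelet binary does not know are warned about, and the gates the kubelet actually resolves are printed afterwards in the "feature gates: {map[...]}" lines. As a purely illustrative way to see which distinct gates are behind the noise (assuming this journal excerpt has been saved to a file; kubelet.log below is a hypothetical name), a minimal Python sketch:

```python
import re
from collections import Counter

# Collect the distinct gate names behind the repeated
# "unrecognized feature gate: <Name>" warnings in a saved journal excerpt.
pattern = re.compile(r"unrecognized feature gate: (\S+)")

with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical file name
    counts = Counter(m.group(1) for m in pattern.finditer(fh.read()))

for gate, n in counts.most_common():
    print(f"{n:3d}  {gate}")
```

Each gate should appear once per parsing pass, so equal counts across all names are expected rather than a sign of anything wrong.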
Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818145 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 14:23:51.820572 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818147 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818151 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818153 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818156 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818159 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818162 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818164 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818167 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818169 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818172 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818174 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818177 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818179 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818182 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818184 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818187 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818189 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818192 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818194 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818197 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 14:23:51.821045 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818199 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 14:23:51.821535 ip-10-0-128-169 
kubenswrapper[2572]: W0424 14:23:51.818202 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818204 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818207 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818209 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818212 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818216 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818218 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818221 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818224 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818226 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818228 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818231 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818234 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818236 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818238 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818241 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818243 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818246 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818248 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 14:23:51.821535 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818251 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 14:23:51.822057 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:51.818253 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 14:23:51.822057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.818258 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 14:23:51.822057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.819100 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 14:23:51.822057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.821801 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 14:23:51.822773 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.822762 2572 server.go:1019] "Starting client certificate rotation" Apr 24 14:23:51.822872 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.822856 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 14:23:51.822904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.822896 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 14:23:51.848095 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.848075 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 14:23:51.852031 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.852015 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 14:23:51.869009 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.868991 2572 log.go:25] "Validated CRI v1 runtime API" Apr 24 14:23:51.875375 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.875357 2572 log.go:25] "Validated CRI v1 image API" Apr 24 14:23:51.876723 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.876709 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 14:23:51.880249 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.880233 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 14:23:51.885761 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.885742 2572 fs.go:135] Filesystem UUIDs: map[358c1b57-3548-4df2-a6bf-7df08e00f117:/dev/nvme0n1p4 484b9a4a-ffc5-466d-9c4a-deb79fb7d425:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 24 14:23:51.885818 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.885760 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 14:23:51.891463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.891348 2572 manager.go:217] Machine: {Timestamp:2026-04-24 14:23:51.889359436 +0000 UTC m=+0.440201600 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099735 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 
AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20eb2b5c131716e2b9a30f1cdcde54 SystemUUID:ec20eb2b-5c13-1716-e2b9-a30f1cdcde54 BootID:7a702e4e-48aa-402c-a28a-fede66e414b8 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:78:6d:96:b2:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:78:6d:96:b2:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:2d:47:9f:06:f9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 14:23:51.891463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.891458 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 14:23:51.891576 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.891539 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 14:23:51.892539 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892515 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 14:23:51.892700 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892541 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-169.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 14:23:51.892749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892709 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 14:23:51.892749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892717 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 14:23:51.892749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892730 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:23:51.892749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.892748 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 14:23:51.893875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.893865 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:23:51.894014 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.894005 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 14:23:51.896562 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.896552 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 24 14:23:51.896597 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.896565 2572 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 24 14:23:51.896597 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.896583 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 14:23:51.896688 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.896616 2572 kubelet.go:397] "Adding apiserver pod source" Apr 24 14:23:51.896688 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.896633 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 14:23:51.897691 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.897680 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:51.897735 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.897698 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 14:23:51.900556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.900538 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 14:23:51.902767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.902750 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 14:23:51.904227 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904211 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 14:23:51.904271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904229 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 14:23:51.904271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904238 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 14:23:51.904271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904248 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 14:23:51.904271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904256 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904275 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904281 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904287 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904294 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904301 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904316 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 14:23:51.904378 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.904325 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 14:23:51.905110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.905100 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 14:23:51.905147 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.905110 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 24 14:23:51.908727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.908713 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 14:23:51.908783 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.908751 2572 server.go:1295] "Started kubelet" Apr 24 14:23:51.908938 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.908855 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 14:23:51.908994 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.908974 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 14:23:51.909161 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.908907 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 14:23:51.909492 ip-10-0-128-169 systemd[1]: Started Kubernetes Kubelet. Apr 24 14:23:51.910407 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.910348 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 24 14:23:51.913879 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.913855 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 14:23:51.914252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.914128 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-169.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 14:23:51.914859 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.914825 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-169.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 14:23:51.915438 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.914822 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 14:23:51.919372 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.918116 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-169.ec2.internal.18a9510a205af358 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-169.ec2.internal,UID:ip-10-0-128-169.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-169.ec2.internal,},FirstTimestamp:2026-04-24 14:23:51.908725592 +0000 UTC m=+0.459567755,LastTimestamp:2026-04-24 14:23:51.908725592 +0000 UTC m=+0.459567755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-169.ec2.internal,}" Apr 24 14:23:51.919949 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.919929 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 14:23:51.920543 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.920523 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 14:23:51.921279 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921261 2572 factory.go:55] Registering systemd factory Apr 24 14:23:51.921279 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921282 2572 factory.go:223] Registration of the systemd container factory successfully Apr 24 14:23:51.921423 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921377 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 14:23:51.921469 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921429 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 14:23:51.921469 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921448 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 14:23:51.921549 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.921523 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 14:23:51.921549 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921544 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 24 14:23:51.921694 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921555 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 24 14:23:51.921694 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.921544 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:51.921694 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921658 2572 factory.go:153] Registering CRI-O factory Apr 24 14:23:51.921694 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921668 2572 factory.go:223] Registration of the crio container factory successfully Apr 24 14:23:51.921875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921714 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 14:23:51.921875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921739 2572 factory.go:103] Registering Raw factory Apr 24 14:23:51.921875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.921753 2572 manager.go:1196] Started watching for new ooms in manager Apr 24 14:23:51.922274 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.922262 2572 manager.go:319] Starting recovery of all containers Apr 24 14:23:51.925750 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.925723 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 14:23:51.925923 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.925885 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-169.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 14:23:51.926201 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.926178 2572 csr.go:274] 
"Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8dbqh" Apr 24 14:23:51.932953 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.932838 2572 manager.go:324] Recovery completed Apr 24 14:23:51.933583 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.933567 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8dbqh" Apr 24 14:23:51.934301 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.934275 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 14:23:51.937354 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.937338 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:51.940663 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.940648 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:51.940723 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.940679 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:51.940723 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.940690 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:51.941072 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.941057 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 14:23:51.941072 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.941070 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 14:23:51.941179 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.941092 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 24 14:23:51.943092 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.943033 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-169.ec2.internal.18a9510a22424366 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-169.ec2.internal,UID:ip-10-0-128-169.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-169.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-169.ec2.internal,},FirstTimestamp:2026-04-24 14:23:51.940662118 +0000 UTC m=+0.491504282,LastTimestamp:2026-04-24 14:23:51.940662118 +0000 UTC m=+0.491504282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-169.ec2.internal,}" Apr 24 14:23:51.943595 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.943584 2572 policy_none.go:49] "None policy: Start" Apr 24 14:23:51.943643 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.943628 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 14:23:51.943643 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.943639 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 24 14:23:51.984152 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984137 2572 manager.go:341] "Starting Device Plugin manager" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.984164 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984178 2572 server.go:85] "Starting device plugin registration server" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984394 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984403 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984477 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984546 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:51.984555 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.985116 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 14:23:51.990325 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:51.985148 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.051587 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.051529 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 14:23:52.052709 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.052694 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 14:23:52.052763 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.052725 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 14:23:52.052763 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.052748 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 14:23:52.052763 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.052758 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 14:23:52.052886 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.052796 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 14:23:52.056391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.056369 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:52.084713 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.084698 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.085803 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.085783 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.085881 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.085813 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.085881 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.085827 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.085881 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.085848 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.094091 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.094078 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.094132 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.094097 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-169.ec2.internal\": node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.113063 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.113046 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.152863 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.152839 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal"] Apr 24 14:23:52.152928 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.152897 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.153898 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.153882 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.153965 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.153910 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.153965 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.153920 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.155299 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.155288 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.155625 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:23:52.155595 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.155665 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.155640 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.156346 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156331 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.156401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156347 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.156401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156360 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.156401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156369 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.156401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156374 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.156401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.156379 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.157542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.157527 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.157615 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.157562 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 14:23:52.158505 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.158479 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientMemory" Apr 24 14:23:52.158564 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.158508 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 14:23:52.158564 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.158517 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeHasSufficientPID" Apr 24 14:23:52.184143 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.184128 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-169.ec2.internal\" not found" node="ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.188408 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.188393 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-169.ec2.internal\" not found" node="ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.213111 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.213087 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.223467 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.223448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8a21c2b27ced13adbceceb16f3c2439-config\") pod \"kube-apiserver-proxy-ip-10-0-128-169.ec2.internal\" (UID: \"e8a21c2b27ced13adbceceb16f3c2439\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.223536 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.223470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.223536 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.223489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.313837 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.313749 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.324167 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/e8a21c2b27ced13adbceceb16f3c2439-config\") pod \"kube-apiserver-proxy-ip-10-0-128-169.ec2.internal\" (UID: \"e8a21c2b27ced13adbceceb16f3c2439\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.324277 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.324277 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.324277 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8a21c2b27ced13adbceceb16f3c2439-config\") pod \"kube-apiserver-proxy-ip-10-0-128-169.ec2.internal\" (UID: \"e8a21c2b27ced13adbceceb16f3c2439\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.324391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.324391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.324256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/00ce4c1e01abb0612bbf3542d5106471-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal\" (UID: \"00ce4c1e01abb0612bbf3542d5106471\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.414539 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.414500 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.486002 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.485980 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.490504 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.490487 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:52.515365 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.515340 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.615878 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.615817 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.716360 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.716337 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.816910 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.816882 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.822099 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.822088 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 14:23:52.822221 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.822207 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 14:23:52.917619 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:52.917544 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-169.ec2.internal\" not found" Apr 24 14:23:52.920142 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.920128 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 14:23:52.925108 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.925091 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:52.933329 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.933311 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 14:23:52.935423 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.935397 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 14:18:51 +0000 UTC" deadline="2028-01-31 19:47:41.010913152 +0000 UTC" Apr 24 14:23:52.935486 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.935423 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15533h23m48.075493035s" Apr 24 14:23:52.961873 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.961855 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-th9sw" Apr 24 14:23:52.971006 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:52.970991 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-th9sw" Apr 24 14:23:53.018655 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.018632 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 
14:23:53.020745 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.020728 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" Apr 24 14:23:53.031430 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.031406 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:23:53.032352 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.032335 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" Apr 24 14:23:53.045719 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.045701 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 14:23:53.074754 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:53.074726 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a21c2b27ced13adbceceb16f3c2439.slice/crio-5e3ce2715d629002de69024675df7bf998ab8feac6e363a8582e1665aa00c8ae WatchSource:0}: Error finding container 5e3ce2715d629002de69024675df7bf998ab8feac6e363a8582e1665aa00c8ae: Status 404 returned error can't find the container with id 5e3ce2715d629002de69024675df7bf998ab8feac6e363a8582e1665aa00c8ae Apr 24 14:23:53.075239 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:53.075213 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ce4c1e01abb0612bbf3542d5106471.slice/crio-7f6b1264a476606e8137c528f5d07529f60645973e6d50495a04295b204e817a WatchSource:0}: Error finding container 7f6b1264a476606e8137c528f5d07529f60645973e6d50495a04295b204e817a: Status 404 returned error can't find the container with id 7f6b1264a476606e8137c528f5d07529f60645973e6d50495a04295b204e817a Apr 24 14:23:53.079189 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.079172 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:23:53.504432 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.504406 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:53.699279 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.699109 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 14:23:53.898023 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.897947 2572 apiserver.go:52] "Watching apiserver" Apr 24 14:23:53.908936 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.908904 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 14:23:53.909726 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.909696 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-j9n5z","openshift-network-operator/iptables-alerter-7fk7l","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj","openshift-multus/network-metrics-daemon-7vzlj","openshift-network-diagnostics/network-check-target-md9sj","openshift-ovn-kubernetes/ovnkube-node-7w4g5","kube-system/konnectivity-agent-7wv4w","kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal","openshift-cluster-node-tuning-operator/tuned-dx4gw","openshift-image-registry/node-ca-gxbvj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal","openshift-multus/multus-additional-cni-plugins-gsthx"] Apr 24 14:23:53.912461 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.912439 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.913022 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.912554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:53.914084 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.914064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.915625 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915353 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:53.915625 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915396 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 14:23:53.915625 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915405 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.915625 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:53.915454 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:23:53.915839 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915743 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 14:23:53.915904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915882 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.915953 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.915926 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-88gmb\"" Apr 24 14:23:53.916095 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.916069 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.916320 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.916305 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 14:23:53.916503 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.916484 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vc9gg\"" Apr 24 14:23:53.917186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.917007 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.917186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.917137 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zx5w5\"" Apr 24 14:23:53.917475 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.917458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.918376 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.917766 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.918376 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.917824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 14:23:53.919019 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.918908 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:53.919019 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:53.918972 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:23:53.919158 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.919053 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.920294 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.920275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:53.921687 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.921667 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.922500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.922476 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dq7l2\"" Apr 24 14:23:53.922733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.922717 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 14:23:53.922927 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.922897 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 14:23:53.923056 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923035 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:53.923115 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923094 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.923193 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923171 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.923481 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923462 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 14:23:53.923585 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923493 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 14:23:53.923585 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923549 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ml6fg\"" Apr 24 14:23:53.923727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923672 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 14:23:53.923727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.923690 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 14:23:53.924110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.924062 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.924349 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.924331 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dqmt8\"" Apr 24 14:23:53.924457 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.924412 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:53.924655 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.924624 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.925467 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.925447 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 14:23:53.925550 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.925452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 14:23:53.926206 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.926189 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 14:23:53.926459 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.926442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-m8lld\"" Apr 24 14:23:53.927038 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.927020 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 14:23:53.927920 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.927785 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 14:23:53.928011 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.927925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qrfns\"" Apr 24 14:23:53.934403 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:53.934501 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-serviceca\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:53.934501 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934440 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.934501 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c69fa493-7572-43af-8e01-2691e09381c7-agent-certs\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 
14:23:53.934501 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-var-lib-kubelet\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da7813e0-cc46-4210-982d-86a1bd6de417-host-slash\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-systemd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-var-lib-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxf4\" (UniqueName: \"kubernetes.io/projected/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-kube-api-access-9cxf4\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-daemon-config\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.934728 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-kubelet\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934732 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-systemd-units\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934804 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-cnibin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-netns\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-log-socket\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-systemd\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934925 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-run\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.934966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-lib-modules\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-host\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935106 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-device-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-modprobe-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-conf\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-system-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: 
I0424 14:23:53.935258 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-bin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-multus-certs\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-netns\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-kubelet\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-sys-fs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysconfig\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935440 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da7813e0-cc46-4210-982d-86a1bd6de417-iptables-alerter-script\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovn-node-metrics-cert\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-host\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:53.935516 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-registration-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-socket-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-cni-binary-copy\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29l84\" (UniqueName: \"kubernetes.io/projected/1a1e1cea-5bb5-4d2e-83e5-817f18307569-kube-api-access-29l84\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-config\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935649 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-tuned\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-conf-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-bin\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbcl\" (UniqueName: \"kubernetes.io/projected/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-kube-api-access-4wbcl\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935800 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c69fa493-7572-43af-8e01-2691e09381c7-konnectivity-ca\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-os-release\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfb6w\" (UniqueName: \"kubernetes.io/projected/da7813e0-cc46-4210-982d-86a1bd6de417-kube-api-access-sfb6w\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-ovn\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-node-log\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-env-overrides\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-script-lib\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935939 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-kubernetes\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.936217 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.935984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-tmp\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-k8s-cni-cncf-io\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-etc-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp4zs\" (UniqueName: \"kubernetes.io/projected/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kube-api-access-lp4zs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-slash\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-netd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-hostroot\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-etc-kubernetes\") pod 
\"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t547s\" (UniqueName: \"kubernetes.io/projected/0cf9f051-7347-447e-934c-30eb0f79fd31-kube-api-access-t547s\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-sys\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-socket-dir-parent\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2drfd\" (UniqueName: \"kubernetes.io/projected/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-kube-api-access-2drfd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:53.936790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.936435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-multus\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:53.971542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.971493 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:52 +0000 UTC" deadline="2027-10-11 18:05:43.036427378 +0000 UTC" Apr 24 14:23:53.971542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:53.971513 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12843h41m49.064915964s" Apr 24 14:23:54.022127 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.022102 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 14:23:54.037620 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-system-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-bin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-multus-certs\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-system-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037688 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-netns\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.037733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-kubelet\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-sys-fs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-kubelet\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037775 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-netns\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysconfig\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-multus-certs\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da7813e0-cc46-4210-982d-86a1bd6de417-iptables-alerter-script\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysconfig\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-sys-fs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-bin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovn-node-metrics-cert\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-host\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037908 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-registration-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59g56\" (UniqueName: \"kubernetes.io/projected/4372c551-45c1-401f-b043-c8048ef49e81-kube-api-access-59g56\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.038025 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.037996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-socket-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038023 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-host\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-registration-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-cni-binary-copy\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29l84\" (UniqueName: \"kubernetes.io/projected/1a1e1cea-5bb5-4d2e-83e5-817f18307569-kube-api-access-29l84\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-config\") pod 
\"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-socket-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-tuned\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-conf-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-bin\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbcl\" (UniqueName: \"kubernetes.io/projected/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-kube-api-access-4wbcl\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038201 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c69fa493-7572-43af-8e01-2691e09381c7-konnectivity-ca\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-os-release\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfb6w\" (UniqueName: \"kubernetes.io/projected/da7813e0-cc46-4210-982d-86a1bd6de417-kube-api-access-sfb6w\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-ovn\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-node-log\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.038749 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-env-overrides\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da7813e0-cc46-4210-982d-86a1bd6de417-iptables-alerter-script\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-script-lib\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-kubernetes\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-bin\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-tmp\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-k8s-cni-cncf-io\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-node-log\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-etc-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038598 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp4zs\" (UniqueName: \"kubernetes.io/projected/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kube-api-access-lp4zs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-cni-binary-copy\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038647 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-slash\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-netd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-hostroot\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-etc-kubernetes\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t547s\" (UniqueName: \"kubernetes.io/projected/0cf9f051-7347-447e-934c-30eb0f79fd31-kube-api-access-t547s\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-sys\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-socket-dir-parent\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.039596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-conf-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-ovn\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2drfd\" (UniqueName: \"kubernetes.io/projected/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-kube-api-access-2drfd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-etc-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-run-k8s-cni-cncf-io\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-cnibin\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-os-release\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-os-release\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-cni-netd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039071 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-hostroot\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-kubernetes\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039112 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-etc-kubernetes\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-slash\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.038783 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-config\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039165 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-socket-dir-parent\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-sys\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039216 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovnkube-script-lib\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.040462 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-multus\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039392 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-serviceca\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-host-var-lib-cni-multus\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c69fa493-7572-43af-8e01-2691e09381c7-agent-certs\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c69fa493-7572-43af-8e01-2691e09381c7-konnectivity-ca\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-env-overrides\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-var-lib-kubelet\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:23:54.039649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-var-lib-kubelet\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.039664 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da7813e0-cc46-4210-982d-86a1bd6de417-host-slash\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-systemd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-var-lib-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.039752 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:54.539720361 +0000 UTC m=+3.090562528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:54.041107 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-var-lib-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da7813e0-cc46-4210-982d-86a1bd6de417-host-slash\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxf4\" (UniqueName: \"kubernetes.io/projected/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-kube-api-access-9cxf4\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-systemd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-serviceca\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-daemon-config\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-kubelet\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039892 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-systemd-units\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-cni-dir\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-kubelet\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039942 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-systemd-units\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-binary-copy\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.039996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-cnibin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-run-openvswitch\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:23:54.040020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-netns\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.041748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-log-socket\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-netns\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf9f051-7347-447e-934c-30eb0f79fd31-cnibin\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-run-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-log-socket\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-systemd\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 
24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-run\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-systemd\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-run\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-system-cni-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-lib-modules\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-host\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: 
\"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:54.042617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0cf9f051-7347-447e-934c-30eb0f79fd31-multus-daemon-config\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040396 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-host\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-device-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-lib-modules\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed88905d-70e9-490e-aacb-2e6c20b37f9a-device-dir\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-modprobe-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-conf\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-modprobe-d\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.040846 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-sysctl-conf\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.042176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-ovn-node-metrics-cert\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.042234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-etc-tuned\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.042348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c69fa493-7572-43af-8e01-2691e09381c7-agent-certs\") pod \"konnectivity-agent-7wv4w\" (UID: \"c69fa493-7572-43af-8e01-2691e09381c7\") " pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:54.043225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.042508 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-tmp\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.048769 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.048792 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.048806 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.048881 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:23:54.548862638 +0000 UTC m=+3.099704804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.049050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxf4\" (UniqueName: \"kubernetes.io/projected/fc559a32-7f1b-4b50-a3a8-4b6b184a2586-kube-api-access-9cxf4\") pod \"tuned-dx4gw\" (UID: \"fc559a32-7f1b-4b50-a3a8-4b6b184a2586\") " pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.049456 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.049397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t547s\" (UniqueName: \"kubernetes.io/projected/0cf9f051-7347-447e-934c-30eb0f79fd31-kube-api-access-t547s\") pod \"multus-j9n5z\" (UID: \"0cf9f051-7347-447e-934c-30eb0f79fd31\") " pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.050731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.050708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfb6w\" (UniqueName: \"kubernetes.io/projected/da7813e0-cc46-4210-982d-86a1bd6de417-kube-api-access-sfb6w\") pod \"iptables-alerter-7fk7l\" (UID: \"da7813e0-cc46-4210-982d-86a1bd6de417\") " pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.051155 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.051109 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbcl\" (UniqueName: \"kubernetes.io/projected/f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4-kube-api-access-4wbcl\") pod \"node-ca-gxbvj\" (UID: \"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4\") " pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.051779 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.051695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp4zs\" (UniqueName: \"kubernetes.io/projected/ed88905d-70e9-490e-aacb-2e6c20b37f9a-kube-api-access-lp4zs\") pod \"aws-ebs-csi-driver-node-c2kwj\" (UID: \"ed88905d-70e9-490e-aacb-2e6c20b37f9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.051977 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.051957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2drfd\" (UniqueName: \"kubernetes.io/projected/2b4a98b9-41f7-4893-a186-4cc7fb68fb05-kube-api-access-2drfd\") pod \"ovnkube-node-7w4g5\" (UID: \"2b4a98b9-41f7-4893-a186-4cc7fb68fb05\") " pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.052753 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.052731 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29l84\" (UniqueName: \"kubernetes.io/projected/1a1e1cea-5bb5-4d2e-83e5-817f18307569-kube-api-access-29l84\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:54.059532 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.057276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" 
event={"ID":"00ce4c1e01abb0612bbf3542d5106471","Type":"ContainerStarted","Data":"7f6b1264a476606e8137c528f5d07529f60645973e6d50495a04295b204e817a"} Apr 24 14:23:54.060594 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.060568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" event={"ID":"e8a21c2b27ced13adbceceb16f3c2439","Type":"ContainerStarted","Data":"5e3ce2715d629002de69024675df7bf998ab8feac6e363a8582e1665aa00c8ae"} Apr 24 14:23:54.141928 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.141897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.141948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-cnibin\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142002 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-cnibin\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-os-release\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142083 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-binary-copy\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-system-cni-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:23:54.142182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-os-release\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-system-cni-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59g56\" (UniqueName: \"kubernetes.io/projected/4372c551-45c1-401f-b043-c8048ef49e81-kube-api-access-59g56\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142643 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4372c551-45c1-401f-b043-c8048ef49e81-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142776 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.142851 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.142830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-binary-copy\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.143180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.143154 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4372c551-45c1-401f-b043-c8048ef49e81-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.151903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.151852 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-59g56\" (UniqueName: \"kubernetes.io/projected/4372c551-45c1-401f-b043-c8048ef49e81-kube-api-access-59g56\") pod \"multus-additional-cni-plugins-gsthx\" (UID: \"4372c551-45c1-401f-b043-c8048ef49e81\") " pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.226794 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.226766 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j9n5z" Apr 24 14:23:54.235931 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.235903 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7fk7l" Apr 24 14:23:54.244529 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.244508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" Apr 24 14:23:54.251425 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.251404 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:23:54.263031 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.263004 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:23:54.270682 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.270664 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" Apr 24 14:23:54.278221 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.278204 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gxbvj" Apr 24 14:23:54.284791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.284772 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gsthx" Apr 24 14:23:54.545857 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.545781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:54.546009 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.545943 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:54.546075 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.546013 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:55.54599323 +0000 UTC m=+4.096835396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:54.646569 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.646529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:54.646756 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.646718 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:54.646756 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.646745 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:54.646856 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.646759 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:54.646856 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:54.646822 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:23:55.646807368 +0000 UTC m=+4.197649521 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:54.771685 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.771555 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda7813e0_cc46_4210_982d_86a1bd6de417.slice/crio-191ffef73306d88ad0ea0b821449e62f70a7fc90337f3a1bc119dab6cee4e501 WatchSource:0}: Error finding container 191ffef73306d88ad0ea0b821449e62f70a7fc90337f3a1bc119dab6cee4e501: Status 404 returned error can't find the container with id 191ffef73306d88ad0ea0b821449e62f70a7fc90337f3a1bc119dab6cee4e501 Apr 24 14:23:54.773327 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.773298 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4372c551_45c1_401f_b043_c8048ef49e81.slice/crio-f0d0a64dc89c96f279fa5f429f880d3d31a411f336c78a9244c80e4f1926cf36 WatchSource:0}: Error finding container f0d0a64dc89c96f279fa5f429f880d3d31a411f336c78a9244c80e4f1926cf36: Status 404 returned error can't find the container with id f0d0a64dc89c96f279fa5f429f880d3d31a411f336c78a9244c80e4f1926cf36 Apr 24 14:23:54.774747 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.774587 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc69fa493_7572_43af_8e01_2691e09381c7.slice/crio-fd40866cf10cfa7c9135b23360cdf23810b220d8f44653b9d8901bf76ac91482 WatchSource:0}: Error finding container fd40866cf10cfa7c9135b23360cdf23810b220d8f44653b9d8901bf76ac91482: Status 404 returned error can't find the container with id fd40866cf10cfa7c9135b23360cdf23810b220d8f44653b9d8901bf76ac91482 Apr 24 14:23:54.776845 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.776824 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded88905d_70e9_490e_aacb_2e6c20b37f9a.slice/crio-1714c8ded05a71a513a19e5b81d59f01e8f1549a4fc6ad6f5e649e47eb1c86db WatchSource:0}: Error finding container 1714c8ded05a71a513a19e5b81d59f01e8f1549a4fc6ad6f5e649e47eb1c86db: Status 404 returned error can't find the container with id 1714c8ded05a71a513a19e5b81d59f01e8f1549a4fc6ad6f5e649e47eb1c86db Apr 24 14:23:54.777947 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.777912 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc559a32_7f1b_4b50_a3a8_4b6b184a2586.slice/crio-590715e6389113c27105497736116e3e9a7f643dee9cfd34481415cd1f947432 WatchSource:0}: Error finding container 590715e6389113c27105497736116e3e9a7f643dee9cfd34481415cd1f947432: Status 404 returned error can't find the container with id 590715e6389113c27105497736116e3e9a7f643dee9cfd34481415cd1f947432 Apr 24 14:23:54.778783 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.778756 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b9ac64_b92d_41f4_8a6d_92cb3608d7a4.slice/crio-9a9c7a0ba3a588db7293352558309b5e2fdd16cc4493bc4bcb64a032ec9652b7 WatchSource:0}: Error finding 
container 9a9c7a0ba3a588db7293352558309b5e2fdd16cc4493bc4bcb64a032ec9652b7: Status 404 returned error can't find the container with id 9a9c7a0ba3a588db7293352558309b5e2fdd16cc4493bc4bcb64a032ec9652b7 Apr 24 14:23:54.779235 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.779215 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4a98b9_41f7_4893_a186_4cc7fb68fb05.slice/crio-262e71bcfa1ba1497d675b3a80ae0ac7fb3e480a0a9d62dfa3a579e571f63a50 WatchSource:0}: Error finding container 262e71bcfa1ba1497d675b3a80ae0ac7fb3e480a0a9d62dfa3a579e571f63a50: Status 404 returned error can't find the container with id 262e71bcfa1ba1497d675b3a80ae0ac7fb3e480a0a9d62dfa3a579e571f63a50 Apr 24 14:23:54.780550 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:54.780501 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf9f051_7347_447e_934c_30eb0f79fd31.slice/crio-14dc825d0ec6af729497259dccd3584bc67d81e2a73e25a7ec876262e58d7bf7 WatchSource:0}: Error finding container 14dc825d0ec6af729497259dccd3584bc67d81e2a73e25a7ec876262e58d7bf7: Status 404 returned error can't find the container with id 14dc825d0ec6af729497259dccd3584bc67d81e2a73e25a7ec876262e58d7bf7 Apr 24 14:23:54.971793 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.971659 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 14:18:52 +0000 UTC" deadline="2027-12-01 03:34:23.924629111 +0000 UTC" Apr 24 14:23:54.971793 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:54.971791 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14053h10m28.952840557s" Apr 24 14:23:55.063034 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.062927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxbvj" event={"ID":"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4","Type":"ContainerStarted","Data":"9a9c7a0ba3a588db7293352558309b5e2fdd16cc4493bc4bcb64a032ec9652b7"} Apr 24 14:23:55.063869 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.063843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7wv4w" event={"ID":"c69fa493-7572-43af-8e01-2691e09381c7","Type":"ContainerStarted","Data":"fd40866cf10cfa7c9135b23360cdf23810b220d8f44653b9d8901bf76ac91482"} Apr 24 14:23:55.065270 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.065244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" event={"ID":"e8a21c2b27ced13adbceceb16f3c2439","Type":"ContainerStarted","Data":"2ba99a23e456cd0eca0ba7fc3920ca5468a59fa5792ded618ee78ceacd650e5b"} Apr 24 14:23:55.066214 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.066193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"262e71bcfa1ba1497d675b3a80ae0ac7fb3e480a0a9d62dfa3a579e571f63a50"} Apr 24 14:23:55.067054 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.067037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" event={"ID":"ed88905d-70e9-490e-aacb-2e6c20b37f9a","Type":"ContainerStarted","Data":"1714c8ded05a71a513a19e5b81d59f01e8f1549a4fc6ad6f5e649e47eb1c86db"} Apr 24 
14:23:55.068346 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.068313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerStarted","Data":"f0d0a64dc89c96f279fa5f429f880d3d31a411f336c78a9244c80e4f1926cf36"} Apr 24 14:23:55.069381 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.069358 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7fk7l" event={"ID":"da7813e0-cc46-4210-982d-86a1bd6de417","Type":"ContainerStarted","Data":"191ffef73306d88ad0ea0b821449e62f70a7fc90337f3a1bc119dab6cee4e501"} Apr 24 14:23:55.070364 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.070328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9n5z" event={"ID":"0cf9f051-7347-447e-934c-30eb0f79fd31","Type":"ContainerStarted","Data":"14dc825d0ec6af729497259dccd3584bc67d81e2a73e25a7ec876262e58d7bf7"} Apr 24 14:23:55.071148 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.071129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" event={"ID":"fc559a32-7f1b-4b50-a3a8-4b6b184a2586","Type":"ContainerStarted","Data":"590715e6389113c27105497736116e3e9a7f643dee9cfd34481415cd1f947432"} Apr 24 14:23:55.514480 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.513251 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-169.ec2.internal" podStartSLOduration=2.513231211 podStartE2EDuration="2.513231211s" podCreationTimestamp="2026-04-24 14:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:55.078112572 +0000 UTC m=+3.628954744" watchObservedRunningTime="2026-04-24 14:23:55.513231211 +0000 UTC m=+4.064073383" Apr 24 14:23:55.514480 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.514053 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qdkg2"] Apr 24 14:23:55.516637 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.516151 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.519810 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.519774 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 14:23:55.520013 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.519996 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 14:23:55.521713 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.521693 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xvfpf\"" Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.554244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-hosts-file\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.554298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.554328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qsl\" (UniqueName: \"kubernetes.io/projected/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-kube-api-access-26qsl\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.554392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-tmp-dir\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.554395 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:55.554488 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.554462 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:57.554448601 +0000 UTC m=+6.105290756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:55.654856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.654821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26qsl\" (UniqueName: \"kubernetes.io/projected/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-kube-api-access-26qsl\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.654983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.654899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:55.654983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.654939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-tmp-dir\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.654983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.654974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-hosts-file\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.655129 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.655060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-hosts-file\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.655491 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.655472 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:55.655579 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.655500 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:55.655579 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.655513 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:55.655579 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:55.655568 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. 
No retries permitted until 2026-04-24 14:23:57.655548739 +0000 UTC m=+6.206390895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:55.655903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.655872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-tmp-dir\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.671429 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.671403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qsl\" (UniqueName: \"kubernetes.io/projected/7ef2b28b-afd5-4cb7-a98e-824028a4bb08-kube-api-access-26qsl\") pod \"node-resolver-qdkg2\" (UID: \"7ef2b28b-afd5-4cb7-a98e-824028a4bb08\") " pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.831552 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:55.831143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qdkg2" Apr 24 14:23:55.857998 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:23:55.857831 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef2b28b_afd5_4cb7_a98e_824028a4bb08.slice/crio-f47b4d9d77534845d19786e960c6e59ab592e58acaa2ebaadbfef8c735e3c99f WatchSource:0}: Error finding container f47b4d9d77534845d19786e960c6e59ab592e58acaa2ebaadbfef8c735e3c99f: Status 404 returned error can't find the container with id f47b4d9d77534845d19786e960c6e59ab592e58acaa2ebaadbfef8c735e3c99f Apr 24 14:23:56.056152 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:56.056120 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:56.056564 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:56.056260 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:23:56.056880 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:56.056859 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:56.056973 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:56.056953 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:23:56.081086 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:56.080312 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdkg2" event={"ID":"7ef2b28b-afd5-4cb7-a98e-824028a4bb08","Type":"ContainerStarted","Data":"f47b4d9d77534845d19786e960c6e59ab592e58acaa2ebaadbfef8c735e3c99f"} Apr 24 14:23:56.087946 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:56.087921 2572 generic.go:358] "Generic (PLEG): container finished" podID="00ce4c1e01abb0612bbf3542d5106471" containerID="d42b4e26367318f4478fed8e4994860963671214a07f508155ba2bd96c4c78ac" exitCode=0 Apr 24 14:23:56.088853 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:56.088831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" event={"ID":"00ce4c1e01abb0612bbf3542d5106471","Type":"ContainerDied","Data":"d42b4e26367318f4478fed8e4994860963671214a07f508155ba2bd96c4c78ac"} Apr 24 14:23:57.097563 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.097522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" event={"ID":"00ce4c1e01abb0612bbf3542d5106471","Type":"ContainerStarted","Data":"2ae97032f29af12de3d75ec23b0e22f22ba3fe254b0818a156a55fea97a57b13"} Apr 24 14:23:57.569383 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.569350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:57.569575 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.569490 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:57.569575 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.569551 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:01.569530593 +0000 UTC m=+10.120372748 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:23:57.670754 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.670717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:57.670905 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.670872 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:23:57.670905 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.670890 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:23:57.670905 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.670902 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:57.671065 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.670954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:24:01.670936715 +0000 UTC m=+10.221778872 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:23:57.992470 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.990597 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-169.ec2.internal" podStartSLOduration=4.990576853 podStartE2EDuration="4.990576853s" podCreationTimestamp="2026-04-24 14:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:23:57.112116952 +0000 UTC m=+5.662959124" watchObservedRunningTime="2026-04-24 14:23:57.990576853 +0000 UTC m=+6.541419063" Apr 24 14:23:57.992470 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.991620 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rdrc9"] Apr 24 14:23:57.994147 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:57.993737 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:57.994147 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:57.993812 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:23:58.054458 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.053744 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:23:58.054458 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.053895 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:23:58.054458 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.054273 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:23:58.054458 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.054367 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:23:58.072982 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.072953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-dbus\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.073134 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.073013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-kubelet-config\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.073134 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.073048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.173785 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.173752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-dbus\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.174231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.173817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-kubelet-config\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.174231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.173853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.174231 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.174017 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:58.174231 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.174079 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:58.674060634 +0000 UTC m=+7.224902799 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:58.174446 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.174421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-dbus\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.174502 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.174487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9c7c423b-7cf4-4b11-b125-c5bcef103313-kubelet-config\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.678319 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:58.678276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:58.678502 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.678434 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:58.678570 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:58.678515 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:23:59.678495518 +0000 UTC m=+8.229337672 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:59.685557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:23:59.685515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:23:59.686022 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:59.685701 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:23:59.686022 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:23:59.685776 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:01.685755237 +0000 UTC m=+10.236597403 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:00.054496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:00.054465 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:00.054665 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:00.054614 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:00.054757 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:00.054735 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:00.054841 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:00.054740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:00.054927 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:00.054825 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:00.055048 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:00.055011 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:01.602439 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:01.602382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:01.602916 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.602573 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:01.602916 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.602724 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:09.602678883 +0000 UTC m=+18.153521049 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:01.703099 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:01.703061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:01.703126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703239 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703261 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703262 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703271 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703333 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:05.703315691 +0000 UTC m=+14.254157861 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:01.703348 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:01.703353 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:24:09.703345014 +0000 UTC m=+18.254187165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:02.054734 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:02.054250 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:02.054734 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:02.054357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:02.054734 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:02.054641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:02.055016 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:02.054758 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:02.055016 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:02.054817 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:02.055016 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:02.054890 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:04.053414 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:04.053379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:04.053859 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:04.053416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:04.053859 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:04.053533 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
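
Note on the repeated MountVolume.SetUp failures above: they all share the same cause string, object "<namespace>"/"<name>" not registered. Broadly, the kubelet serves Secret and ConfigMap contents for volume mounts from a manager that only tracks objects registered for pods it is actively syncing, and a lookup that arrives before the referencing pod's objects have been registered is rejected with this error instead of falling back to a direct API read. The sketch below is a simplified model of that idea, not the kubelet's actual implementation; the type and method names are invented for illustration.

    package main

    import (
        "fmt"
        "sync"
    )

    // objectKey identifies a Secret or ConfigMap by namespace and name.
    type objectKey struct{ namespace, name string }

    // objectCache hands out object data only for keys that some tracked pod
    // has registered; anything else is reported as "not registered".
    type objectCache struct {
        mu         sync.Mutex
        registered map[objectKey]map[string][]byte
    }

    func newObjectCache() *objectCache {
        return &objectCache{registered: map[objectKey]map[string][]byte{}}
    }

    // RegisterPod records every object the pod references so later volume
    // mounts are allowed to read it.
    func (c *objectCache) RegisterPod(refs []objectKey) {
        c.mu.Lock()
        defer c.mu.Unlock()
        for _, k := range refs {
            if _, ok := c.registered[k]; !ok {
                c.registered[k] = nil // data would be filled in by a watch
            }
        }
    }

    // Get mirrors the failure mode in the log: a lookup for an unregistered
    // object fails instead of querying the API server directly.
    func (c *objectCache) Get(k objectKey) (map[string][]byte, error) {
        c.mu.Lock()
        defer c.mu.Unlock()
        data, ok := c.registered[k]
        if !ok {
            return nil, fmt.Errorf("object %q/%q not registered", k.namespace, k.name)
        }
        return data, nil
    }

    func main() {
        cache := newObjectCache()
        key := objectKey{"kube-system", "original-pull-secret"}

        // Mount attempted before the pod's references are registered.
        if _, err := cache.Get(key); err != nil {
            fmt.Println("MountVolume.SetUp failed:", err)
        }

        // Once the pod is registered, the same lookup succeeds.
        cache.RegisterPod([]objectKey{key})
        if _, err := cache.Get(key); err == nil {
            fmt.Println("mount can proceed for", key.name)
        }
    }
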
pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:04.053859 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:04.053597 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:04.053859 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:04.053632 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:04.053859 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:04.053723 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:05.732077 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:05.732044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:05.732535 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:05.732213 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:05.732535 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:05.732280 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:13.732263501 +0000 UTC m=+22.283105656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:06.053238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:06.053205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:06.053238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:06.053231 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:06.053455 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:06.053207 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:06.053455 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:06.053316 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:06.053455 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:06.053394 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:06.053455 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:06.053447 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:08.053437 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:08.053406 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:08.053861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:08.053411 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:08.053861 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:08.053525 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:08.053861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:08.053418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:08.053861 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:08.053596 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:08.053861 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:08.053719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
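
Every "Error syncing pod, skipping" entry in this stretch is the same condition reported once per pod: the runtime's network is not ready because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/, which lines up with ovnkube-node still coming up and with the NodeReady event further down. Only pods that need the pod network are blocked; host-network pods keep progressing. A minimal, hypothetical check for what the runtime is waiting on might look like the following; the directory path is taken from the error message, everything else is an assumption for illustration.

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // hasCNIConfig reports whether dir contains at least one CNI network
    // configuration file (.conf, .conflist or .json), which is roughly what
    // the container runtime waits for before declaring NetworkReady=true.
    func hasCNIConfig(dir string) (bool, string) {
        entries, err := os.ReadDir(dir)
        if err != nil {
            return false, ""
        }
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                return true, filepath.Join(dir, e.Name())
            }
        }
        return false, ""
    }

    func main() {
        ok, path := hasCNIConfig("/etc/kubernetes/cni/net.d")
        if ok {
            fmt.Println("CNI configuration present:", path)
        } else {
            fmt.Println("no CNI configuration file yet; NetworkReady stays false")
        }
    }
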
pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:09.658846 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:09.658808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:09.659299 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.658969 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:09.659299 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.659039 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:25.659022787 +0000 UTC m=+34.209864942 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:09.759476 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:09.759438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:09.759624 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.759565 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:09.759624 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.759579 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:09.759624 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.759589 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:09.759729 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:09.759655 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:24:25.759638587 +0000 UTC m=+34.310480737 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:10.053437 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:10.053407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:10.053570 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:10.053407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:10.053570 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:10.053506 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:10.053670 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:10.053407 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:10.053670 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:10.053588 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:10.053746 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:10.053723 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:12.054299 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.053940 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:12.054838 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.054011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:12.054838 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:12.054399 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
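
Note the durationBeforeRetry values on the failed mount operations above: they double on consecutive failures of the same operation (4s then 8s for original-pull-secret, 8s then 16s for metrics-certs and kube-api-access-gtfsh), which is why the "No retries permitted until ..." timestamps spread out while the underlying objects stay unavailable. A rough sketch of that doubling follows; the initial value and cap here are assumptions for illustration, not the kubelet's actual constants.

    package main

    import (
        "fmt"
        "time"
    )

    // backoff doubles the wait after every failure, up to a cap, and resets
    // after a success -- the pattern behind the 4s/8s/16s retries in the log.
    type backoff struct {
        initial, max, next time.Duration
    }

    func newBackoff(initial, max time.Duration) *backoff {
        return &backoff{initial: initial, max: max, next: initial}
    }

    // Fail returns how long to wait before the next attempt.
    func (b *backoff) Fail() time.Duration {
        d := b.next
        b.next *= 2
        if b.next > b.max {
            b.next = b.max
        }
        return d
    }

    // Succeed resets the backoff for the next failure streak.
    func (b *backoff) Succeed() { b.next = b.initial }

    func main() {
        b := newBackoff(4*time.Second, 2*time.Minute) // assumed values
        for i := 0; i < 5; i++ {
            fmt.Printf("attempt %d failed, retry in %s\n", i+1, b.Fail())
        }
    }
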
pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:12.054838 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:12.054487 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:12.054838 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.054042 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:12.054838 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:12.054629 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:12.121833 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.121790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9n5z" event={"ID":"0cf9f051-7347-447e-934c-30eb0f79fd31","Type":"ContainerStarted","Data":"3c2befe83f34f5476763148d502515af51a645b14637ef24d76c094dc6535f88"} Apr 24 14:24:12.123397 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.123370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" event={"ID":"fc559a32-7f1b-4b50-a3a8-4b6b184a2586","Type":"ContainerStarted","Data":"5ad2329217b25d9fb459f8a04ecfcb3cef0d5bf7f023fa73841014f7d801700a"} Apr 24 14:24:12.125226 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.125195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gxbvj" event={"ID":"f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4","Type":"ContainerStarted","Data":"c1a7dac5114038952fff7324b147bbc927fda671cb4f0e1eface7c24e8654c2e"} Apr 24 14:24:12.126641 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.126597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7wv4w" event={"ID":"c69fa493-7572-43af-8e01-2691e09381c7","Type":"ContainerStarted","Data":"15b279ca489372ad2b8382abe2d7035292513a115e61b026b3ec06c71e29c19f"} Apr 24 14:24:12.127906 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.127884 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qdkg2" event={"ID":"7ef2b28b-afd5-4cb7-a98e-824028a4bb08","Type":"ContainerStarted","Data":"722924e40c0257e59fa7674ec47a4c03c8d03d565efa8817069be96b2e8130bf"} Apr 24 14:24:12.131248 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131230 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:24:12.131698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131626 2572 generic.go:358] "Generic (PLEG): container finished" podID="2b4a98b9-41f7-4893-a186-4cc7fb68fb05" containerID="5d77d3e5840e93a30f9d9c5e0e86fcea1af991841a2c522171a212c80aa2e1a0" exitCode=1 Apr 24 14:24:12.131784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131701 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"376ac5deb6239bc3f9adda7d5570dbcf921dbad2003c0f8dcc4587c800695388"} Apr 24 14:24:12.131784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"3689e898357c58f46ff51d4a40edca8e5ea9f59523e22a53a6f66b474da30275"} Apr 24 14:24:12.131784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerDied","Data":"5d77d3e5840e93a30f9d9c5e0e86fcea1af991841a2c522171a212c80aa2e1a0"} Apr 24 14:24:12.131784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.131758 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"4144e1c9711543fce17db4869cb5ae39fb03d038638f30d5eb60cd1e6ec00942"} Apr 24 14:24:12.133204 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.133173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" event={"ID":"ed88905d-70e9-490e-aacb-2e6c20b37f9a","Type":"ContainerStarted","Data":"e8f0b254e4eaffff01fb7c1ea7e5a1cf55b7edf898cf425153e0827487889825"} Apr 24 14:24:12.134645 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.134622 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="d01d2970196a1b7ecc8e05d6c6039b04bbe35b227041373d31f8c77ae6d381ef" exitCode=0 Apr 24 14:24:12.134756 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.134656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"d01d2970196a1b7ecc8e05d6c6039b04bbe35b227041373d31f8c77ae6d381ef"} Apr 24 14:24:12.141736 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.141662 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qdkg2" podStartSLOduration=1.556479622 podStartE2EDuration="17.141652067s" podCreationTimestamp="2026-04-24 14:23:55 +0000 UTC" firstStartedPulling="2026-04-24 14:23:55.861169779 +0000 UTC m=+4.412011934" lastFinishedPulling="2026-04-24 14:24:11.446342213 +0000 UTC m=+19.997184379" observedRunningTime="2026-04-24 14:24:12.141448439 +0000 UTC m=+20.692290612" watchObservedRunningTime="2026-04-24 14:24:12.141652067 +0000 UTC m=+20.692494237" Apr 24 14:24:12.153684 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.153642 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7wv4w" podStartSLOduration=3.471904082 podStartE2EDuration="20.153627492s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.776371843 +0000 UTC m=+3.327213993" lastFinishedPulling="2026-04-24 14:24:11.458095239 +0000 UTC m=+20.008937403" observedRunningTime="2026-04-24 14:24:12.152990366 +0000 UTC m=+20.703832537" watchObservedRunningTime="2026-04-24 14:24:12.153627492 +0000 UTC m=+20.704469664" Apr 24 14:24:12.166543 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:24:12.166276 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j9n5z" podStartSLOduration=3.466350409 podStartE2EDuration="20.166263365s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.782426487 +0000 UTC m=+3.333268636" lastFinishedPulling="2026-04-24 14:24:11.482339438 +0000 UTC m=+20.033181592" observedRunningTime="2026-04-24 14:24:12.165929066 +0000 UTC m=+20.716771253" watchObservedRunningTime="2026-04-24 14:24:12.166263365 +0000 UTC m=+20.717105537" Apr 24 14:24:12.178942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.178855 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dx4gw" podStartSLOduration=3.498616491 podStartE2EDuration="20.17884523s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.780124079 +0000 UTC m=+3.330966236" lastFinishedPulling="2026-04-24 14:24:11.460352817 +0000 UTC m=+20.011194975" observedRunningTime="2026-04-24 14:24:12.178485008 +0000 UTC m=+20.729327180" watchObservedRunningTime="2026-04-24 14:24:12.17884523 +0000 UTC m=+20.729687401" Apr 24 14:24:12.189983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.189947 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gxbvj" podStartSLOduration=11.311768408 podStartE2EDuration="20.189937846s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.781241084 +0000 UTC m=+3.332083241" lastFinishedPulling="2026-04-24 14:24:03.659410516 +0000 UTC m=+12.210252679" observedRunningTime="2026-04-24 14:24:12.189867312 +0000 UTC m=+20.740709482" watchObservedRunningTime="2026-04-24 14:24:12.189937846 +0000 UTC m=+20.740780017" Apr 24 14:24:12.842255 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.842235 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 14:24:12.995344 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.995148 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T14:24:12.842251821Z","UUID":"d6a65cf5-89ea-4ba7-a93b-92c16a6fe41c","Handler":null,"Name":"","Endpoint":""} Apr 24 14:24:12.998617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.998577 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 14:24:12.998761 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:12.998634 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 14:24:13.139131 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.139105 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:24:13.139526 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.139503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" 
event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"f7c5b65ad131fc4dbc5ce66316748af1cc9d27b62b4de7d683d31a2b0c154fac"} Apr 24 14:24:13.139568 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.139539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"8a4df347278babb82efa1f8f29f78ed5259ae2993443e6e01665b73b0f4f5fdf"} Apr 24 14:24:13.141274 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.141246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" event={"ID":"ed88905d-70e9-490e-aacb-2e6c20b37f9a","Type":"ContainerStarted","Data":"a9614a9149dafee2bf9c846ed59c02b9ded54d7fbca720406a136b1fa37b8c32"} Apr 24 14:24:13.142689 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.142660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7fk7l" event={"ID":"da7813e0-cc46-4210-982d-86a1bd6de417","Type":"ContainerStarted","Data":"a2575d1e8ee5028bc250685a4dc9f5b3104e683591356c6286532d16aa4e7769"} Apr 24 14:24:13.157982 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.157940 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7fk7l" podStartSLOduration=4.472552327 podStartE2EDuration="21.157928436s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.773077335 +0000 UTC m=+3.323919489" lastFinishedPulling="2026-04-24 14:24:11.458453448 +0000 UTC m=+20.009295598" observedRunningTime="2026-04-24 14:24:13.157402799 +0000 UTC m=+21.708244968" watchObservedRunningTime="2026-04-24 14:24:13.157928436 +0000 UTC m=+21.708770606" Apr 24 14:24:13.792573 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:13.792537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:13.792869 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:13.792683 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:13.792869 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:13.792748 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret podName:9c7c423b-7cf4-4b11-b125-c5bcef103313 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:29.792732519 +0000 UTC m=+38.343574688 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret") pod "global-pull-secret-syncer-rdrc9" (UID: "9c7c423b-7cf4-4b11-b125-c5bcef103313") : object "kube-system"/"original-pull-secret" not registered Apr 24 14:24:14.053410 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:14.053340 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:14.053541 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:14.053490 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:14.053541 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:14.053530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:14.053656 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:14.053590 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:14.053656 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:14.053459 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:14.053765 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:14.053714 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:14.146559 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:14.146530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" event={"ID":"ed88905d-70e9-490e-aacb-2e6c20b37f9a","Type":"ContainerStarted","Data":"3cda3fab0a943323fca037721044bd6d1f8cbf79a028cb7f6db005c6917c2995"} Apr 24 14:24:14.176234 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:14.175209 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2kwj" podStartSLOduration=2.984852487 podStartE2EDuration="22.17518994s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.779029089 +0000 UTC m=+3.329871240" lastFinishedPulling="2026-04-24 14:24:13.96936654 +0000 UTC m=+22.520208693" observedRunningTime="2026-04-24 14:24:14.174555356 +0000 UTC m=+22.725397512" watchObservedRunningTime="2026-04-24 14:24:14.17518994 +0000 UTC m=+22.726032111" Apr 24 14:24:15.151398 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:15.151180 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:24:15.151878 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:15.151721 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"f49eaa9d111bfb8c72d588ff5ea25f8d08fedbbc4e3eed0d23c647ef577b226e"} Apr 24 14:24:15.900919 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:15.900888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:24:15.901578 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:15.901553 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:24:16.053395 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:16.053316 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:16.053577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:16.053325 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:16.053577 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:16.053433 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:16.053577 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:16.053518 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:16.053577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:16.053564 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:16.053794 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:16.053646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:16.153659 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:16.153585 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:24:16.154020 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:16.154012 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7wv4w" Apr 24 14:24:17.158153 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.157974 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:24:17.158861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.158443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"1d967e10fdede94f0257543880333f339b443a92117f34df5244ba97dd7e3d50"} Apr 24 14:24:17.158861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.158834 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:17.159009 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.158993 2572 scope.go:117] "RemoveContainer" containerID="5d77d3e5840e93a30f9d9c5e0e86fcea1af991841a2c522171a212c80aa2e1a0" Apr 24 14:24:17.160241 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.160221 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="4ffaaea9aeda890e5f883082238dfa800013eb38180d807d73b82797f890f24b" exitCode=0 Apr 24 14:24:17.160335 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.160280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"4ffaaea9aeda890e5f883082238dfa800013eb38180d807d73b82797f890f24b"} Apr 24 14:24:17.173657 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:17.173640 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:18.053542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.053518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:18.053542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.053531 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:18.053684 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.053532 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:18.053684 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.053639 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
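
The "SyncLoop (probe)" entries just above show konnectivity-agent-7wv4w flipping its startup probe from unhealthy to started and its readiness probe from not ready to ready within a second of each other. The kubelet runs these probes periodically and only changes the published status once consecutive results cross the configured success/failure thresholds, which is why the transitions appear as discrete events. Below is a stripped-down model of that bookkeeping; the names and thresholds are invented for illustration and are not the kubelet's prober code.

    package main

    import "fmt"

    // worker tracks consecutive probe results and flips the reported status
    // only after successThreshold or failureThreshold consecutive outcomes.
    type worker struct {
        successThreshold, failureThreshold int
        consecutive                        int
        lastResult                         bool
        ready                              bool
    }

    func (w *worker) observe(success bool) (changed bool) {
        if success == w.lastResult {
            w.consecutive++
        } else {
            w.lastResult, w.consecutive = success, 1
        }
        switch {
        case success && !w.ready && w.consecutive >= w.successThreshold:
            w.ready, changed = true, true
        case !success && w.ready && w.consecutive >= w.failureThreshold:
            w.ready, changed = false, true
        }
        return changed
    }

    func main() {
        w := &worker{successThreshold: 1, failureThreshold: 3}
        for _, success := range []bool{false, false, true} {
            if w.observe(success) {
                fmt.Printf("readiness changed: ready=%v\n", w.ready)
            }
        }
    }
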
pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:18.053798 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.053767 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:18.053847 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.053831 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:18.165085 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.165062 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:24:18.165490 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.165461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" event={"ID":"2b4a98b9-41f7-4893-a186-4cc7fb68fb05","Type":"ContainerStarted","Data":"7d34213bda5e1708f58769d7eae964271732ef8f415c764297c6ffb0f4d961ea"} Apr 24 14:24:18.165614 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.165585 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 14:24:18.165850 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.165833 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:18.167655 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.167577 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="a414ad7d96aff85e3a05fff0f85b874060ead9718c1409fbf7865059fe1a9cb5" exitCode=0 Apr 24 14:24:18.167738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.167641 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"a414ad7d96aff85e3a05fff0f85b874060ead9718c1409fbf7865059fe1a9cb5"} Apr 24 14:24:18.179940 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.179921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:18.193295 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.193259 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" podStartSLOduration=9.466531523 podStartE2EDuration="26.193247383s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.782474091 +0000 UTC m=+3.333316254" lastFinishedPulling="2026-04-24 14:24:11.509189962 +0000 UTC m=+20.060032114" observedRunningTime="2026-04-24 14:24:18.193074803 +0000 UTC m=+26.743917028" watchObservedRunningTime="2026-04-24 14:24:18.193247383 +0000 UTC m=+26.744089554" Apr 24 14:24:18.452529 ip-10-0-128-169 kubenswrapper[2572]: 
I0424 14:24:18.452458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rdrc9"] Apr 24 14:24:18.452657 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.452596 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:18.452753 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.452724 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:18.455967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.455942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7vzlj"] Apr 24 14:24:18.456061 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.456049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:18.456154 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.456138 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:18.456685 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.456665 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-md9sj"] Apr 24 14:24:18.456773 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:18.456758 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:18.456867 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:18.456848 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
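
The pod_startup_latency_tracker entries (for example ovnkube-node-7w4g5 a few entries above, with podStartSLOduration of roughly 9.47s against a podStartE2EDuration of roughly 26.19s) separate image-pull time from the rest of startup: the end-to-end figure runs from pod creation to the first observed running state, while the SLO figure appears to exclude the window between firstStartedPulling and lastFinishedPulling. The logged numbers are consistent with that reading, as the small calculation below shows; the timestamps are copied from the log, and the interpretation is an inference from the values rather than a statement of the tracker's exact code.

    package main

    import (
        "fmt"
        "time"
    )

    // Timestamps copied from the ovnkube-node-7w4g5 startup entry above.
    const layout = "2006-01-02 15:04:05 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2026-04-24 14:23:52 +0000 UTC")
        firstPull := mustParse("2026-04-24 14:23:54.782474091 +0000 UTC")
        lastPull := mustParse("2026-04-24 14:24:11.509189962 +0000 UTC")
        running := mustParse("2026-04-24 14:24:18.193074803 +0000 UTC")

        e2e := running.Sub(created)        // matches podStartE2EDuration (~26.19s)
        pulling := lastPull.Sub(firstPull) // time spent pulling images (~16.73s)
        slo := e2e - pulling               // matches podStartSLOduration (~9.47s)

        fmt.Println("e2e:", e2e)
        fmt.Println("pulling:", pulling)
        fmt.Println("slo:", slo)
    }
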
pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:19.135766 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:19.135738 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:19.171104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:19.171073 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="6e6deeda2e8667f6850b033090a346e3718c731d113df28fcfb3e54ffae0ac00" exitCode=0 Apr 24 14:24:19.171459 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:19.171143 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"6e6deeda2e8667f6850b033090a346e3718c731d113df28fcfb3e54ffae0ac00"} Apr 24 14:24:20.053410 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:20.053365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:20.053619 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:20.053415 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:20.053619 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:20.053386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:20.053619 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:20.053492 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:20.053619 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:20.053573 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:20.053850 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:20.053671 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:22.054432 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:22.054214 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:22.054810 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:22.054266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:22.054810 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:22.054283 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:22.054810 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:22.054561 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:22.054810 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:22.054660 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:22.054810 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:22.054720 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:24.053968 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.053931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:24.054661 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.053949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:24.054661 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.054053 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rdrc9" podUID="9c7c423b-7cf4-4b11-b125-c5bcef103313" Apr 24 14:24:24.054661 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.054083 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:24.054661 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.054170 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:24:24.054661 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.054253 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-md9sj" podUID="7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa" Apr 24 14:24:24.705218 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.705189 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-169.ec2.internal" event="NodeReady" Apr 24 14:24:24.705399 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.705325 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 14:24:24.752388 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.752359 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:24:24.769810 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.769786 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp"] Apr 24 14:24:24.769972 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.769938 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.773029 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.773004 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 14:24:24.773156 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.773090 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 14:24:24.773156 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.773097 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2wpbm\"" Apr 24 14:24:24.773279 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.773266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 14:24:24.782526 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.782507 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz"] Apr 24 14:24:24.782690 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.782673 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.785398 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.785368 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 14:24:24.787504 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787485 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 14:24:24.787617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787567 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 14:24:24.787683 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 14:24:24.787683 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787678 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 14:24:24.787815 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787797 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 14:24:24.787912 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.787898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 14:24:24.788117 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.788103 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 14:24:24.797783 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.797765 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c"] Apr 24 14:24:24.797904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.797890 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.800858 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.800843 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d6z2q\"" Apr 24 14:24:24.801301 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.801286 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 14:24:24.813325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.813302 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w8bbw"] Apr 24 14:24:24.813453 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.813438 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.819769 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.819716 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 14:24:24.830203 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.830187 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp"] Apr 24 14:24:24.830284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.830208 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c"] Apr 24 14:24:24.830284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.830218 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dp5qc"] Apr 24 14:24:24.830349 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.830320 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:24.833356 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.833341 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 14:24:24.833428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.833412 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 14:24:24.834080 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.834066 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6p8rb\"" Apr 24 14:24:24.848132 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.848110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:24:24.848132 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.848129 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w8bbw"] Apr 24 14:24:24.848252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.848137 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz"] Apr 24 14:24:24.848252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.848145 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp5qc"] Apr 24 14:24:24.848252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.848221 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:24.852205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.852187 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 14:24:24.852295 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.852204 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 14:24:24.852434 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.852421 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 14:24:24.852644 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.852628 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v25p\"" Apr 24 14:24:24.877154 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877137 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64dce565-d49d-478c-869d-07eb80e65c86-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4kw\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-tmp\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.877256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjl9t\" (UniqueName: \"kubernetes.io/projected/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-kube-api-access-fjl9t\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.877353 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfff644e-5ef3-46bb-b72a-c092391dc054-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.877353 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877353 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877471 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877471 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877471 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8gt\" (UniqueName: \"kubernetes.io/projected/dfff644e-5ef3-46bb-b72a-c092391dc054-kube-api-access-gh8gt\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.877471 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877586 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877586 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877586 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877586 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877586 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-klusterlet-config\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.877738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jg9m\" (UniqueName: \"kubernetes.io/projected/64dce565-d49d-478c-869d-07eb80e65c86-kube-api-access-5jg9m\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.877738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877656 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.877738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.877686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978242 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfff644e-5ef3-46bb-b72a-c092391dc054-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.978242 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978242 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hccx5\" (UniqueName: \"kubernetes.io/projected/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-kube-api-access-hccx5\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc8n5\" (UniqueName: \"kubernetes.io/projected/933358e5-3e5d-42e6-8b21-2e75a3e94d59-kube-api-access-bc8n5\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8gt\" (UniqueName: \"kubernetes.io/projected/dfff644e-5ef3-46bb-b72a-c092391dc054-kube-api-access-gh8gt\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-tmp-dir\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-klusterlet-config\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jg9m\" (UniqueName: \"kubernetes.io/projected/64dce565-d49d-478c-869d-07eb80e65c86-kube-api-access-5jg9m\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978728 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64dce565-d49d-478c-869d-07eb80e65c86-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4kw\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-tmp\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl9t\" (UniqueName: \"kubernetes.io/projected/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-kube-api-access-fjl9t\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978818 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-config-volume\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:24.978942 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:24.979526 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.978983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.980226 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.980202 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:24.980226 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.980228 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:24.980385 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:24.980300 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:25.4802716 +0000 UTC m=+34.031113765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:24.980593 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.980569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-tmp\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.980908 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.980887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.980982 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.980937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.981380 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.981357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/64dce565-d49d-478c-869d-07eb80e65c86-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.983313 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 
14:24:24.983526 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.983712 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.983757 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dfff644e-5ef3-46bb-b72a-c092391dc054-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.983826 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-klusterlet-config\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.983948 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.983932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-ca\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.984146 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.984126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.984199 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.984182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/64dce565-d49d-478c-869d-07eb80e65c86-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.988100 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.988080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8gt\" (UniqueName: \"kubernetes.io/projected/dfff644e-5ef3-46bb-b72a-c092391dc054-kube-api-access-gh8gt\") pod \"managed-serviceaccount-addon-agent-5477577d75-jp7qz\" (UID: \"dfff644e-5ef3-46bb-b72a-c092391dc054\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:24.988187 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.988089 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:24.994064 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.994039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jg9m\" (UniqueName: \"kubernetes.io/projected/64dce565-d49d-478c-869d-07eb80e65c86-kube-api-access-5jg9m\") pod \"cluster-proxy-proxy-agent-649fdf996-8jfdp\" (UID: \"64dce565-d49d-478c-869d-07eb80e65c86\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:24.994176 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.994132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjl9t\" (UniqueName: \"kubernetes.io/projected/9a6957ac-fa8e-4d6f-b2b5-d90c674863fe-kube-api-access-fjl9t\") pod \"klusterlet-addon-workmgr-797579889f-rkk4c\" (UID: \"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:24.994248 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:24.994227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4kw\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:25.079594 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-config-volume\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hccx5\" (UniqueName: \"kubernetes.io/projected/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-kube-api-access-hccx5\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc8n5\" (UniqueName: \"kubernetes.io/projected/933358e5-3e5d-42e6-8b21-2e75a3e94d59-kube-api-access-bc8n5\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:24:25.079693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-tmp-dir\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.079722 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.079781 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:25.579761485 +0000 UTC m=+34.130603652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.079809 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.079851 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:25.579836896 +0000 UTC m=+34.130679076 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:25.080081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.079977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-tmp-dir\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.080367 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.080169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-config-volume\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.090176 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.090158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hccx5\" (UniqueName: \"kubernetes.io/projected/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-kube-api-access-hccx5\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.097330 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.097313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc8n5\" (UniqueName: \"kubernetes.io/projected/933358e5-3e5d-42e6-8b21-2e75a3e94d59-kube-api-access-bc8n5\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:25.099880 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.099859 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:24:25.107189 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.107168 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" Apr 24 14:24:25.127999 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.127971 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:25.263041 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.262734 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz"] Apr 24 14:24:25.269140 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:24:25.269108 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfff644e_5ef3_46bb_b72a_c092391dc054.slice/crio-eceeaf06c9a067a5b84e19a4db95cb7b37b869d92d798156cb2bba376546560a WatchSource:0}: Error finding container eceeaf06c9a067a5b84e19a4db95cb7b37b869d92d798156cb2bba376546560a: Status 404 returned error can't find the container with id eceeaf06c9a067a5b84e19a4db95cb7b37b869d92d798156cb2bba376546560a Apr 24 14:24:25.273058 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.273036 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp"] Apr 24 14:24:25.276006 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.275980 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c"] Apr 24 14:24:25.278576 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:24:25.278552 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64dce565_d49d_478c_869d_07eb80e65c86.slice/crio-35497e98cbb6aea5790bc9ca22a6878eeafbd2f1fe5194a1b023469be7de8522 WatchSource:0}: Error finding container 35497e98cbb6aea5790bc9ca22a6878eeafbd2f1fe5194a1b023469be7de8522: Status 404 returned error can't find the container with id 35497e98cbb6aea5790bc9ca22a6878eeafbd2f1fe5194a1b023469be7de8522 Apr 24 14:24:25.279103 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:24:25.279081 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6957ac_fa8e_4d6f_b2b5_d90c674863fe.slice/crio-f781db28647bcb10828203e1d718c1958347038d7979beb9d2c17166674a59f0 WatchSource:0}: Error finding container f781db28647bcb10828203e1d718c1958347038d7979beb9d2c17166674a59f0: Status 404 returned error can't find the container with id f781db28647bcb10828203e1d718c1958347038d7979beb9d2c17166674a59f0 Apr 24 14:24:25.484052 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.483979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:25.484168 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.484141 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:25.484168 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.484159 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:25.484229 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.484213 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.48419849 +0000 UTC m=+35.035040645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:25.584509 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.584483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:25.584661 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.584544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:25.584661 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.584643 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:25.584661 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.584650 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:25.584762 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.584697 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.584684611 +0000 UTC m=+35.135526761 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:25.584762 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.584710 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:26.584704385 +0000 UTC m=+35.135546534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:25.685641 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.685596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:25.685779 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.685727 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:25.685822 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.685783 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:57.685770232 +0000 UTC m=+66.236612395 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 14:24:25.786724 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:25.786690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:25.786896 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.786819 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 14:24:25.786896 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.786833 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 14:24:25.786896 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.786843 2572 projected.go:194] Error preparing data for projected volume kube-api-access-gtfsh for pod openshift-network-diagnostics/network-check-target-md9sj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:25.786896 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:25.786895 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh podName:7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa nodeName:}" failed. No retries permitted until 2026-04-24 14:24:57.786882617 +0000 UTC m=+66.337724766 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gtfsh" (UniqueName: "kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh") pod "network-check-target-md9sj" (UID: "7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 14:24:26.054222 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.053924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:26.054222 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.054135 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:26.054534 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.054019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:26.059180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.058802 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 14:24:26.059180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.058880 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:26.059180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.059046 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nctch\"" Apr 24 14:24:26.059180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.059084 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4q5p5\"" Apr 24 14:24:26.059467 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.059235 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:26.059528 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.059500 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:26.192633 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.192497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" event={"ID":"dfff644e-5ef3-46bb-b72a-c092391dc054","Type":"ContainerStarted","Data":"eceeaf06c9a067a5b84e19a4db95cb7b37b869d92d798156cb2bba376546560a"} Apr 24 14:24:26.199315 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.199280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerStarted","Data":"35497e98cbb6aea5790bc9ca22a6878eeafbd2f1fe5194a1b023469be7de8522"} Apr 24 14:24:26.205401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.205230 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="0401d15ce83da5fa240e60175349f90a393aa883cbc3c105ae0a65b9939223eb" exitCode=0 Apr 24 14:24:26.205401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.205292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"0401d15ce83da5fa240e60175349f90a393aa883cbc3c105ae0a65b9939223eb"} Apr 24 14:24:26.207680 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.207642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" event={"ID":"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe","Type":"ContainerStarted","Data":"f781db28647bcb10828203e1d718c1958347038d7979beb9d2c17166674a59f0"} Apr 24 14:24:26.499621 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.499300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:26.499621 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.499514 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:26.499621 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.499532 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:26.499621 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.499585 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:28.499566295 +0000 UTC m=+37.050408450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.601724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:26.601823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.601973 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.602037 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:28.602018886 +0000 UTC m=+37.152861047 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.602425 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:26.602508 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:26.602475 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:28.602458879 +0000 UTC m=+37.153301032 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:27.217737 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:27.217371 2572 generic.go:358] "Generic (PLEG): container finished" podID="4372c551-45c1-401f-b043-c8048ef49e81" containerID="d467b7e890c960598bd433618ea3df1324330bf47ef3fc17bccd1a9b94f5414d" exitCode=0 Apr 24 14:24:27.217737 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:27.217437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerDied","Data":"d467b7e890c960598bd433618ea3df1324330bf47ef3fc17bccd1a9b94f5414d"} Apr 24 14:24:28.519297 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:28.519250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:28.519665 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.519406 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:28.519665 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.519427 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:28.519665 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.519482 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:32.519467325 +0000 UTC m=+41.070309478 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:28.620168 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:28.620137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:28.620321 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:28.620214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:28.620321 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.620296 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:28.620436 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.620376 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:32.620357794 +0000 UTC m=+41.171199966 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:28.620436 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.620305 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:28.620529 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:28.620440 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:32.620426133 +0000 UTC m=+41.171268284 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:29.831787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:29.831566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:29.835997 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:29.835971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9c7c423b-7cf4-4b11-b125-c5bcef103313-original-pull-secret\") pod \"global-pull-secret-syncer-rdrc9\" (UID: \"9c7c423b-7cf4-4b11-b125-c5bcef103313\") " pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:29.981179 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:29.981137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rdrc9" Apr 24 14:24:31.039205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.039173 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rdrc9"] Apr 24 14:24:31.042532 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:24:31.042507 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7c423b_7cf4_4b11_b125_c5bcef103313.slice/crio-5bd6e1fabfd7cf4ba412fc03cb0ce903420a447bb19ca99a41d4d1a27eba1cff WatchSource:0}: Error finding container 5bd6e1fabfd7cf4ba412fc03cb0ce903420a447bb19ca99a41d4d1a27eba1cff: Status 404 returned error can't find the container with id 5bd6e1fabfd7cf4ba412fc03cb0ce903420a447bb19ca99a41d4d1a27eba1cff Apr 24 14:24:31.226780 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.226702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" event={"ID":"dfff644e-5ef3-46bb-b72a-c092391dc054","Type":"ContainerStarted","Data":"880bed2ddb1b0bb74a4381d4b71e580edb06096c89a95de40342b52ad148068f"} Apr 24 14:24:31.227998 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.227972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerStarted","Data":"fe5e717565e129d479563267caf3c7c84d442204b013699d48f7e7aca64c9812"} Apr 24 14:24:31.230617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.230587 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gsthx" event={"ID":"4372c551-45c1-401f-b043-c8048ef49e81","Type":"ContainerStarted","Data":"70ebc7622be04ee3e00e4dd99f3c3cc4153920aa1eb616983883a88dd01cfd3e"} Apr 24 14:24:31.231769 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.231748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" event={"ID":"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe","Type":"ContainerStarted","Data":"7805608992f738be0cf600cd9e521474b8b0c832636d9c9b090b95801775d3e1"} Apr 24 14:24:31.231977 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.231936 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:31.232790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.232771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rdrc9" event={"ID":"9c7c423b-7cf4-4b11-b125-c5bcef103313","Type":"ContainerStarted","Data":"5bd6e1fabfd7cf4ba412fc03cb0ce903420a447bb19ca99a41d4d1a27eba1cff"} Apr 24 14:24:31.233488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.233474 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:24:31.242780 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.242733 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" podStartSLOduration=28.609460749 podStartE2EDuration="34.242718423s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:24:25.271996592 +0000 UTC m=+33.822838755" lastFinishedPulling="2026-04-24 14:24:30.905254265 +0000 UTC m=+39.456096429" observedRunningTime="2026-04-24 14:24:31.242073145 +0000 UTC m=+39.792915327" watchObservedRunningTime="2026-04-24 14:24:31.242718423 +0000 UTC m=+39.793560600" Apr 24 14:24:31.263737 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.263700 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gsthx" podStartSLOduration=8.96446974 podStartE2EDuration="39.263691554s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:23:54.775957727 +0000 UTC m=+3.326799877" lastFinishedPulling="2026-04-24 14:24:25.075179538 +0000 UTC m=+33.626021691" observedRunningTime="2026-04-24 14:24:31.262067206 +0000 UTC m=+39.812909376" watchObservedRunningTime="2026-04-24 14:24:31.263691554 +0000 UTC m=+39.814533724" Apr 24 14:24:31.277537 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:31.277497 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" podStartSLOduration=28.647586461 podStartE2EDuration="34.277486558s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:24:25.280833331 +0000 UTC m=+33.831675483" lastFinishedPulling="2026-04-24 14:24:30.910733416 +0000 UTC m=+39.461575580" observedRunningTime="2026-04-24 14:24:31.2772478 +0000 UTC m=+39.828089968" watchObservedRunningTime="2026-04-24 14:24:31.277486558 +0000 UTC m=+39.828328727" Apr 24 14:24:32.554810 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:32.554770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:32.555153 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.554945 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:32.555153 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.554964 2572 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:32.555153 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.555034 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.555013431 +0000 UTC m=+49.105855597 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:32.656021 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:32.655987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:32.656164 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:32.656076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:32.656164 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.656133 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:32.656289 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.656168 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:32.656289 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.656200 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.656181889 +0000 UTC m=+49.207024060 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:32.656289 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:32.656218 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:40.65620951 +0000 UTC m=+49.207051659 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:35.242966 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:35.242895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerStarted","Data":"22bfe8f043556d19c18b0705441ce6519fd2c4861a8404ba641e4ad9f33d901f"} Apr 24 14:24:35.242966 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:35.242930 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerStarted","Data":"7abd1239927e4ae4451070bfabab74a5916b540588aec34e6b710b42ad850ca6"} Apr 24 14:24:36.247164 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:36.247126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rdrc9" event={"ID":"9c7c423b-7cf4-4b11-b125-c5bcef103313","Type":"ContainerStarted","Data":"cb15aff6ae5a3ca1d7eaa86f3218d27df0207de6a646f5f1017fcd11d6d8147b"} Apr 24 14:24:36.262224 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:36.262179 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" podStartSLOduration=29.567327894 podStartE2EDuration="39.262167105s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:24:25.280263828 +0000 UTC m=+33.831105978" lastFinishedPulling="2026-04-24 14:24:34.97510304 +0000 UTC m=+43.525945189" observedRunningTime="2026-04-24 14:24:35.261248977 +0000 UTC m=+43.812091147" watchObservedRunningTime="2026-04-24 14:24:36.262167105 +0000 UTC m=+44.813009274" Apr 24 14:24:36.262355 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:36.262310 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rdrc9" podStartSLOduration=34.925095933 podStartE2EDuration="39.262305906s" podCreationTimestamp="2026-04-24 14:23:57 +0000 UTC" firstStartedPulling="2026-04-24 14:24:31.044075169 +0000 UTC m=+39.594917322" lastFinishedPulling="2026-04-24 14:24:35.381285144 +0000 UTC m=+43.932127295" observedRunningTime="2026-04-24 14:24:36.261418301 +0000 UTC m=+44.812260472" watchObservedRunningTime="2026-04-24 14:24:36.262305906 +0000 UTC m=+44.813148055" Apr 24 14:24:40.615758 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:40.615722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:40.616204 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.615901 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:40.616204 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.615925 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not 
found Apr 24 14:24:40.616204 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.615995 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:56.615973684 +0000 UTC m=+65.166815846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:40.716456 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:40.716419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:40.716651 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:40.716493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:40.716651 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.716575 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:40.716651 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.716650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:56.716630748 +0000 UTC m=+65.267472911 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:40.716763 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.716574 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:40.716763 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:40.716692 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:24:56.716681334 +0000 UTC m=+65.267523488 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:50.183875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:50.183840 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7w4g5" Apr 24 14:24:56.638185 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:56.638140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:24:56.638565 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.638276 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:24:56.638565 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.638295 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:24:56.638565 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.638377 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:28.638360069 +0000 UTC m=+97.189202220 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:24:56.739515 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:56.739472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:24:56.739720 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:56.739538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:24:56.739720 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.739670 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:24:56.739813 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.739745 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:28.739729018 +0000 UTC m=+97.290571168 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:24:56.739813 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.739680 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:24:56.739813 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:56.739794 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:25:28.739783577 +0000 UTC m=+97.290625741 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:24:57.746620 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.746568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:24:57.749514 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.749494 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 14:24:57.757805 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:57.757790 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:24:57.757856 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:24:57.757846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:01.757830715 +0000 UTC m=+130.308672864 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : secret "metrics-daemon-secret" not found Apr 24 14:24:57.847327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.847282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:57.850170 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.850154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 14:24:57.860887 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.860867 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 14:24:57.871404 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.871382 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtfsh\" (UniqueName: \"kubernetes.io/projected/7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa-kube-api-access-gtfsh\") pod \"network-check-target-md9sj\" (UID: \"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa\") " pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:57.898974 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.898946 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4q5p5\"" Apr 24 14:24:57.907252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:57.907232 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:24:58.022806 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:58.022731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-md9sj"] Apr 24 14:24:58.025636 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:24:58.025573 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7433ec7f_d2d9_4c4e_a533_4c4c15fc9bfa.slice/crio-407e943fe023b12c1bd470e7da35f4ba2fc64a2f64a1e2b303b42414ce806164 WatchSource:0}: Error finding container 407e943fe023b12c1bd470e7da35f4ba2fc64a2f64a1e2b303b42414ce806164: Status 404 returned error can't find the container with id 407e943fe023b12c1bd470e7da35f4ba2fc64a2f64a1e2b303b42414ce806164 Apr 24 14:24:58.308714 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:24:58.308682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-md9sj" event={"ID":"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa","Type":"ContainerStarted","Data":"407e943fe023b12c1bd470e7da35f4ba2fc64a2f64a1e2b303b42414ce806164"} Apr 24 14:25:01.317116 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:01.317082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-md9sj" event={"ID":"7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa","Type":"ContainerStarted","Data":"b63cf2bbf9745fb929037929090115204caa07fe7f34d86506dd17ec73abb137"} Apr 24 14:25:01.317521 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:01.317203 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:25:01.333873 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:01.333831 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-md9sj" podStartSLOduration=66.423449766 podStartE2EDuration="1m9.333818249s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:24:58.027673637 +0000 UTC m=+66.578515801" lastFinishedPulling="2026-04-24 14:25:00.938042124 +0000 UTC m=+69.488884284" observedRunningTime="2026-04-24 14:25:01.333069676 +0000 UTC m=+69.883911847" watchObservedRunningTime="2026-04-24 14:25:01.333818249 +0000 UTC m=+69.884660421" Apr 24 14:25:28.674941 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:28.674901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:25:28.675311 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.675050 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 14:25:28.675311 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.675068 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69698d7d56-b8d2z: secret "image-registry-tls" not found Apr 24 14:25:28.675311 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.675122 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls 
podName:96ebb4e4-b101-4abb-a7b7-6babb1746165 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:32.675106021 +0000 UTC m=+161.225948170 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls") pod "image-registry-69698d7d56-b8d2z" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165") : secret "image-registry-tls" not found Apr 24 14:25:28.775967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:28.775936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:25:28.776099 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:28.775989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:25:28.776099 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.776085 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 14:25:28.776211 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.776146 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert podName:933358e5-3e5d-42e6-8b21-2e75a3e94d59 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:32.776131025 +0000 UTC m=+161.326973187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert") pod "ingress-canary-dp5qc" (UID: "933358e5-3e5d-42e6-8b21-2e75a3e94d59") : secret "canary-serving-cert" not found Apr 24 14:25:28.776211 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.776092 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 14:25:28.776277 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:25:28.776214 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls podName:7a535dae-e92e-4ab6-b2ab-5eb561bc8793 nodeName:}" failed. No retries permitted until 2026-04-24 14:26:32.776204057 +0000 UTC m=+161.327046208 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls") pod "dns-default-w8bbw" (UID: "7a535dae-e92e-4ab6-b2ab-5eb561bc8793") : secret "dns-default-metrics-tls" not found Apr 24 14:25:32.322459 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:25:32.322426 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-md9sj" Apr 24 14:26:01.827165 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:01.827117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:26:01.827545 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:01.827255 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 14:26:01.827545 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:01.827329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs podName:1a1e1cea-5bb5-4d2e-83e5-817f18307569 nodeName:}" failed. No retries permitted until 2026-04-24 14:28:03.827313313 +0000 UTC m=+252.378155466 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs") pod "network-metrics-daemon-7vzlj" (UID: "1a1e1cea-5bb5-4d2e-83e5-817f18307569") : secret "metrics-daemon-secret" not found Apr 24 14:26:24.124875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:24.124844 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qdkg2_7ef2b28b-afd5-4cb7-a98e-824028a4bb08/dns-node-resolver/0.log" Apr 24 14:26:24.724353 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:24.724329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxbvj_f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4/node-ca/0.log" Apr 24 14:26:27.781045 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:27.781002 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" Apr 24 14:26:27.838345 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:27.838299 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-w8bbw" podUID="7a535dae-e92e-4ab6-b2ab-5eb561bc8793" Apr 24 14:26:27.856688 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:27.856651 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dp5qc" podUID="933358e5-3e5d-42e6-8b21-2e75a3e94d59" Apr 24 14:26:28.513819 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:28.513790 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:26:28.514018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:28.513790 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:29.070338 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:26:29.070286 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7vzlj" podUID="1a1e1cea-5bb5-4d2e-83e5-817f18307569" Apr 24 14:26:31.232967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.232910 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" podUID="9a6957ac-fa8e-4d6f-b2b5-d90c674863fe" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused" Apr 24 14:26:31.523945 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.523867 2572 generic.go:358] "Generic (PLEG): container finished" podID="dfff644e-5ef3-46bb-b72a-c092391dc054" containerID="880bed2ddb1b0bb74a4381d4b71e580edb06096c89a95de40342b52ad148068f" exitCode=255 Apr 24 14:26:31.524098 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.523942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" event={"ID":"dfff644e-5ef3-46bb-b72a-c092391dc054","Type":"ContainerDied","Data":"880bed2ddb1b0bb74a4381d4b71e580edb06096c89a95de40342b52ad148068f"} Apr 24 14:26:31.524324 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.524304 2572 scope.go:117] "RemoveContainer" containerID="880bed2ddb1b0bb74a4381d4b71e580edb06096c89a95de40342b52ad148068f" Apr 24 14:26:31.525157 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.525138 2572 generic.go:358] "Generic (PLEG): container finished" podID="9a6957ac-fa8e-4d6f-b2b5-d90c674863fe" containerID="7805608992f738be0cf600cd9e521474b8b0c832636d9c9b090b95801775d3e1" exitCode=1 Apr 24 14:26:31.525208 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.525179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" event={"ID":"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe","Type":"ContainerDied","Data":"7805608992f738be0cf600cd9e521474b8b0c832636d9c9b090b95801775d3e1"} Apr 24 14:26:31.525438 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:31.525426 2572 scope.go:117] "RemoveContainer" containerID="7805608992f738be0cf600cd9e521474b8b0c832636d9c9b090b95801775d3e1" Apr 24 14:26:32.529920 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.529880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5477577d75-jp7qz" event={"ID":"dfff644e-5ef3-46bb-b72a-c092391dc054","Type":"ContainerStarted","Data":"4750e9f6369180f7e760e38b77df9e627e75a4deae3e4fd2c4b00dfbcf7bb5f4"} Apr 24 14:26:32.531361 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.531341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" event={"ID":"9a6957ac-fa8e-4d6f-b2b5-d90c674863fe","Type":"ContainerStarted","Data":"4ab9eaf726bd12bb3c7a03604210bbebc2c9eb76e789633dc7496532b300a453"} Apr 24 14:26:32.531619 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:26:32.531579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:26:32.532120 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.532105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-797579889f-rkk4c" Apr 24 14:26:32.751006 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.750962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:26:32.753473 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.753439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"image-registry-69698d7d56-b8d2z\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:26:32.851911 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.851819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:26:32.851911 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.851873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:32.854166 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.854138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7a535dae-e92e-4ab6-b2ab-5eb561bc8793-metrics-tls\") pod \"dns-default-w8bbw\" (UID: \"7a535dae-e92e-4ab6-b2ab-5eb561bc8793\") " pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:32.854382 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:32.854359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/933358e5-3e5d-42e6-8b21-2e75a3e94d59-cert\") pod \"ingress-canary-dp5qc\" (UID: \"933358e5-3e5d-42e6-8b21-2e75a3e94d59\") " pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:26:33.017525 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.017495 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6p8rb\"" Apr 24 14:26:33.017697 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.017552 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2wpbm\"" Apr 24 14:26:33.025856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.025839 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:33.025904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.025867 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:26:33.147396 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.147341 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w8bbw"] Apr 24 14:26:33.150741 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:26:33.150714 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a535dae_e92e_4ab6_b2ab_5eb561bc8793.slice/crio-418c3af96429493e36b3b90d480526b640f6e8f311a85ebeefe34c688e6f4e0e WatchSource:0}: Error finding container 418c3af96429493e36b3b90d480526b640f6e8f311a85ebeefe34c688e6f4e0e: Status 404 returned error can't find the container with id 418c3af96429493e36b3b90d480526b640f6e8f311a85ebeefe34c688e6f4e0e Apr 24 14:26:33.168076 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.168052 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:26:33.171163 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:26:33.171137 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ebb4e4_b101_4abb_a7b7_6babb1746165.slice/crio-a65d38eaaaed5bd2e4ad664c89945e7bf5b43655303b36ab1e1e54478b8e6b3d WatchSource:0}: Error finding container a65d38eaaaed5bd2e4ad664c89945e7bf5b43655303b36ab1e1e54478b8e6b3d: Status 404 returned error can't find the container with id a65d38eaaaed5bd2e4ad664c89945e7bf5b43655303b36ab1e1e54478b8e6b3d Apr 24 14:26:33.535391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.535356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" event={"ID":"96ebb4e4-b101-4abb-a7b7-6babb1746165","Type":"ContainerStarted","Data":"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de"} Apr 24 14:26:33.535391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.535394 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" event={"ID":"96ebb4e4-b101-4abb-a7b7-6babb1746165","Type":"ContainerStarted","Data":"a65d38eaaaed5bd2e4ad664c89945e7bf5b43655303b36ab1e1e54478b8e6b3d"} Apr 24 14:26:33.535883 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.535431 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:26:33.536437 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.536415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w8bbw" event={"ID":"7a535dae-e92e-4ab6-b2ab-5eb561bc8793","Type":"ContainerStarted","Data":"418c3af96429493e36b3b90d480526b640f6e8f311a85ebeefe34c688e6f4e0e"} Apr 24 14:26:33.554005 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:33.553967 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" podStartSLOduration=161.553952637 podStartE2EDuration="2m41.553952637s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:26:33.553534154 +0000 UTC m=+162.104376338" watchObservedRunningTime="2026-04-24 14:26:33.553952637 +0000 UTC m=+162.104794809" Apr 24 14:26:34.541312 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:34.541226 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w8bbw" event={"ID":"7a535dae-e92e-4ab6-b2ab-5eb561bc8793","Type":"ContainerStarted","Data":"f2cbbc57d4769ce0131d2ceecb627e8f9d323be5c41b8447674aac7d778bb39d"} Apr 24 14:26:35.545455 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:35.545416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w8bbw" event={"ID":"7a535dae-e92e-4ab6-b2ab-5eb561bc8793","Type":"ContainerStarted","Data":"c2983c155bc0ac2fcb79002c47e9a841eaa11b6ce284b28bf031f0873db8508c"} Apr 24 14:26:35.545858 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:35.545565 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:35.564286 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:35.564237 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w8bbw" podStartSLOduration=130.296006009 podStartE2EDuration="2m11.56422365s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:26:33.1524603 +0000 UTC m=+161.703302450" lastFinishedPulling="2026-04-24 14:26:34.420677941 +0000 UTC m=+162.971520091" observedRunningTime="2026-04-24 14:26:35.56227802 +0000 UTC m=+164.113120195" watchObservedRunningTime="2026-04-24 14:26:35.56422365 +0000 UTC m=+164.115065822" Apr 24 14:26:39.053851 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:39.053798 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:26:39.056585 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:39.056567 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9v25p\"" Apr 24 14:26:39.064726 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:39.064710 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp5qc" Apr 24 14:26:39.175327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:39.175294 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp5qc"] Apr 24 14:26:39.177902 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:26:39.177873 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933358e5_3e5d_42e6_8b21_2e75a3e94d59.slice/crio-33d3a6bd7200777d84db93596098c12231da744b63b0cda77f47a8ac6483a6e1 WatchSource:0}: Error finding container 33d3a6bd7200777d84db93596098c12231da744b63b0cda77f47a8ac6483a6e1: Status 404 returned error can't find the container with id 33d3a6bd7200777d84db93596098c12231da744b63b0cda77f47a8ac6483a6e1 Apr 24 14:26:39.557439 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:39.557403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp5qc" event={"ID":"933358e5-3e5d-42e6-8b21-2e75a3e94d59","Type":"ContainerStarted","Data":"33d3a6bd7200777d84db93596098c12231da744b63b0cda77f47a8ac6483a6e1"} Apr 24 14:26:41.053193 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:41.053159 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:26:41.563973 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:41.563938 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp5qc" event={"ID":"933358e5-3e5d-42e6-8b21-2e75a3e94d59","Type":"ContainerStarted","Data":"dadf822e775fe4aa475b93f8cad97fef76e3907c8a1174762a381be3ef4c1e5b"} Apr 24 14:26:41.579242 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:41.579200 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dp5qc" podStartSLOduration=136.07875836 podStartE2EDuration="2m17.579187804s" podCreationTimestamp="2026-04-24 14:24:24 +0000 UTC" firstStartedPulling="2026-04-24 14:26:39.179648914 +0000 UTC m=+167.730491068" lastFinishedPulling="2026-04-24 14:26:40.680078361 +0000 UTC m=+169.230920512" observedRunningTime="2026-04-24 14:26:41.578733177 +0000 UTC m=+170.129575350" watchObservedRunningTime="2026-04-24 14:26:41.579187804 +0000 UTC m=+170.130029995" Apr 24 14:26:45.550328 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:45.550297 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w8bbw" Apr 24 14:26:47.455636 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.455588 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v2k95"] Apr 24 14:26:47.458664 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.458648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.461336 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.461312 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 14:26:47.462767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.462746 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 14:26:47.462856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.462800 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 14:26:47.462909 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.462757 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 14:26:47.463066 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.463046 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zf9np\"" Apr 24 14:26:47.472917 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.472894 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v2k95"] Apr 24 14:26:47.563895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.563864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-crio-socket\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.564069 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.563901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.564069 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.563937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvr5\" (UniqueName: \"kubernetes.io/projected/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-api-access-grvr5\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.564069 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.564024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.564069 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.564061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-data-volume\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664398 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grvr5\" (UniqueName: \"kubernetes.io/projected/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-api-access-grvr5\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664592 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664592 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-data-volume\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664592 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-crio-socket\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664592 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664758 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-crio-socket\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.664871 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.664850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-data-volume\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.665094 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.665077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.666751 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.666734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.675049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.675025 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvr5\" (UniqueName: \"kubernetes.io/projected/e4ced4c6-7792-40bd-9f9e-a696d34a3c84-kube-api-access-grvr5\") pod \"insights-runtime-extractor-v2k95\" (UID: \"e4ced4c6-7792-40bd-9f9e-a696d34a3c84\") " pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.768160 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.768078 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v2k95" Apr 24 14:26:47.884048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:47.884017 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v2k95"] Apr 24 14:26:47.886955 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:26:47.886928 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ced4c6_7792_40bd_9f9e_a696d34a3c84.slice/crio-a781aa54383045241cbb2921a758b2b0be1a4fca321333f229fdbfb8991399de WatchSource:0}: Error finding container a781aa54383045241cbb2921a758b2b0be1a4fca321333f229fdbfb8991399de: Status 404 returned error can't find the container with id a781aa54383045241cbb2921a758b2b0be1a4fca321333f229fdbfb8991399de Apr 24 14:26:48.584796 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:48.584766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2k95" event={"ID":"e4ced4c6-7792-40bd-9f9e-a696d34a3c84","Type":"ContainerStarted","Data":"9d82361a3c6d6e0cab1853c21fd9c4a14bfcf829b36f9dbb96b5cc12a7c4fd28"} Apr 24 14:26:48.584796 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:48.584802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2k95" event={"ID":"e4ced4c6-7792-40bd-9f9e-a696d34a3c84","Type":"ContainerStarted","Data":"a781aa54383045241cbb2921a758b2b0be1a4fca321333f229fdbfb8991399de"} Apr 24 14:26:49.589361 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:49.589322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2k95" event={"ID":"e4ced4c6-7792-40bd-9f9e-a696d34a3c84","Type":"ContainerStarted","Data":"1f3f37992eca59835566ba17ae1006dac60271fe39b40f3da5de3c2938325a32"} Apr 24 14:26:50.594953 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:50.594916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v2k95" event={"ID":"e4ced4c6-7792-40bd-9f9e-a696d34a3c84","Type":"ContainerStarted","Data":"17aac114e22dfed42fe122632252277c587cc97a9325724c56592217a945c56b"} Apr 24 14:26:50.612848 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:50.612804 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v2k95" podStartSLOduration=1.569506743 podStartE2EDuration="3.612791258s" podCreationTimestamp="2026-04-24 14:26:47 +0000 UTC" firstStartedPulling="2026-04-24 14:26:47.940841069 +0000 UTC m=+176.491683219" lastFinishedPulling="2026-04-24 14:26:49.984125584 +0000 UTC m=+178.534967734" observedRunningTime="2026-04-24 14:26:50.612174564 +0000 UTC m=+179.163016735" watchObservedRunningTime="2026-04-24 14:26:50.612791258 +0000 UTC m=+179.163633430" Apr 24 14:26:53.029949 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:53.029913 2572 patch_prober.go:28] interesting pod/image-registry-69698d7d56-b8d2z container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 14:26:53.030375 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:53.029966 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" 
containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:26:54.545920 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:26:54.545886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:27:05.101162 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:05.101121 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" podUID="64dce565-d49d-478c-869d-07eb80e65c86" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:27:06.825869 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.825837 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-m7qfv"] Apr 24 14:27:06.830110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.830089 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.837750 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.837727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 14:27:06.838847 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.838821 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 14:27:06.843057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.839461 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 14:27:06.843057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.840060 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 14:27:06.843057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.840110 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 14:27:06.843057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.840285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 14:27:06.843057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.840492 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lpfj8\"" Apr 24 14:27:06.995666 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995630 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-accelerators-collector-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995666 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-sys\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:27:06.995686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-textfile\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-wtmp\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-root\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrz8h\" (UniqueName: \"kubernetes.io/projected/dcf31072-2346-4778-8912-4233fbb8dd01-kube-api-access-jrz8h\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.995856 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-metrics-client-ca\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:06.996045 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:06.995897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-tls\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.096983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.096906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-wtmp\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.096983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.096939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-root\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.096983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.096962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-root\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-wtmp\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrz8h\" (UniqueName: \"kubernetes.io/projected/dcf31072-2346-4778-8912-4233fbb8dd01-kube-api-access-jrz8h\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-metrics-client-ca\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-tls\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-accelerators-collector-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097497 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-sys\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097497 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:27:07.097347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-textfile\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097497 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097423 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcf31072-2346-4778-8912-4233fbb8dd01-sys\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-textfile\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.097854 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.097837 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-metrics-client-ca\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.099186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.099159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-accelerators-collector-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.099284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.099189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.099802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.099782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dcf31072-2346-4778-8912-4233fbb8dd01-node-exporter-tls\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.104713 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.104693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrz8h\" (UniqueName: \"kubernetes.io/projected/dcf31072-2346-4778-8912-4233fbb8dd01-kube-api-access-jrz8h\") pod \"node-exporter-m7qfv\" (UID: \"dcf31072-2346-4778-8912-4233fbb8dd01\") " pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.139106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.139080 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-m7qfv" Apr 24 14:27:07.148294 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:27:07.148269 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf31072_2346_4778_8912_4233fbb8dd01.slice/crio-f181da752ab880f8bf4e2fff5de6165680afb38842be2253ccfc9bb15a14c7d3 WatchSource:0}: Error finding container f181da752ab880f8bf4e2fff5de6165680afb38842be2253ccfc9bb15a14c7d3: Status 404 returned error can't find the container with id f181da752ab880f8bf4e2fff5de6165680afb38842be2253ccfc9bb15a14c7d3 Apr 24 14:27:07.637844 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:07.637806 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m7qfv" event={"ID":"dcf31072-2346-4778-8912-4233fbb8dd01","Type":"ContainerStarted","Data":"f181da752ab880f8bf4e2fff5de6165680afb38842be2253ccfc9bb15a14c7d3"} Apr 24 14:27:08.641394 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:08.641351 2572 generic.go:358] "Generic (PLEG): container finished" podID="dcf31072-2346-4778-8912-4233fbb8dd01" containerID="4304f5067386986a9e6497b42c7cfed3de1e582fa018b1064798cea4cfbec452" exitCode=0 Apr 24 14:27:08.641899 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:08.641441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m7qfv" event={"ID":"dcf31072-2346-4778-8912-4233fbb8dd01","Type":"ContainerDied","Data":"4304f5067386986a9e6497b42c7cfed3de1e582fa018b1064798cea4cfbec452"} Apr 24 14:27:09.645336 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:09.645300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m7qfv" event={"ID":"dcf31072-2346-4778-8912-4233fbb8dd01","Type":"ContainerStarted","Data":"2bee141444d7fce186cdcc11f9ae18e9f1d7cd6442c54a9346be536d3a83623c"} Apr 24 14:27:09.645336 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:09.645335 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m7qfv" event={"ID":"dcf31072-2346-4778-8912-4233fbb8dd01","Type":"ContainerStarted","Data":"e85136a389a732efbcb5670814faa176fbd5f3668899bc74e5c202f02de3a154"} Apr 24 14:27:09.667317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:09.667269 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-m7qfv" podStartSLOduration=2.851162669 podStartE2EDuration="3.667256837s" podCreationTimestamp="2026-04-24 14:27:06 +0000 UTC" firstStartedPulling="2026-04-24 14:27:07.150434034 +0000 UTC m=+195.701276188" lastFinishedPulling="2026-04-24 14:27:07.966528203 +0000 UTC m=+196.517370356" observedRunningTime="2026-04-24 14:27:09.665548605 +0000 UTC m=+198.216390776" watchObservedRunningTime="2026-04-24 14:27:09.667256837 +0000 UTC m=+198.218099065" Apr 24 14:27:09.873484 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:09.873448 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:27:15.101095 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:15.101051 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" podUID="64dce565-d49d-478c-869d-07eb80e65c86" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:27:25.101908 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.101853 2572 
prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" podUID="64dce565-d49d-478c-869d-07eb80e65c86" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 14:27:25.102455 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.101983 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" Apr 24 14:27:25.102644 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.102581 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"22bfe8f043556d19c18b0705441ce6519fd2c4861a8404ba641e4ad9f33d901f"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 14:27:25.102730 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.102670 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" podUID="64dce565-d49d-478c-869d-07eb80e65c86" containerName="service-proxy" containerID="cri-o://22bfe8f043556d19c18b0705441ce6519fd2c4861a8404ba641e4ad9f33d901f" gracePeriod=30 Apr 24 14:27:25.685948 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.685914 2572 generic.go:358] "Generic (PLEG): container finished" podID="64dce565-d49d-478c-869d-07eb80e65c86" containerID="22bfe8f043556d19c18b0705441ce6519fd2c4861a8404ba641e4ad9f33d901f" exitCode=2 Apr 24 14:27:25.686087 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.685981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerDied","Data":"22bfe8f043556d19c18b0705441ce6519fd2c4861a8404ba641e4ad9f33d901f"} Apr 24 14:27:25.686087 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:25.686016 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-649fdf996-8jfdp" event={"ID":"64dce565-d49d-478c-869d-07eb80e65c86","Type":"ContainerStarted","Data":"0563d4adc5c279b9e7129888d285621328f6b9d86aeb278b5a69918b916510c2"} Apr 24 14:27:34.891919 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:34.891860 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" containerName="registry" containerID="cri-o://0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de" gracePeriod=30 Apr 24 14:27:35.130003 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.129981 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:27:35.194557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194497 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn4kw\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194528 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194549 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194564 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194587 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194643 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194707 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.194791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.194733 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") pod \"96ebb4e4-b101-4abb-a7b7-6babb1746165\" (UID: \"96ebb4e4-b101-4abb-a7b7-6babb1746165\") " Apr 24 14:27:35.195097 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.195067 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:35.195175 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.195074 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:27:35.197058 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.197027 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:35.197265 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.197235 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw" (OuterVolumeSpecName: "kube-api-access-qn4kw") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "kube-api-access-qn4kw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:35.197265 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.197251 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:27:35.197397 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.197267 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:35.197473 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.197440 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:27:35.203832 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.203809 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "96ebb4e4-b101-4abb-a7b7-6babb1746165" (UID: "96ebb4e4-b101-4abb-a7b7-6babb1746165"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:27:35.295704 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295669 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-installation-pull-secrets\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295704 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295700 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-tls\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295704 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295710 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qn4kw\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-kube-api-access-qn4kw\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295719 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-trusted-ca\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295728 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ebb4e4-b101-4abb-a7b7-6babb1746165-bound-sa-token\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295737 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/96ebb4e4-b101-4abb-a7b7-6babb1746165-registry-certificates\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295746 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/96ebb4e4-b101-4abb-a7b7-6babb1746165-image-registry-private-configuration\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.295910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.295755 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/96ebb4e4-b101-4abb-a7b7-6babb1746165-ca-trust-extracted\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:27:35.710891 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.710858 2572 generic.go:358] "Generic (PLEG): container finished" podID="96ebb4e4-b101-4abb-a7b7-6babb1746165" containerID="0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de" exitCode=0 Apr 24 14:27:35.711070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.710919 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" Apr 24 14:27:35.711070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.710954 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" event={"ID":"96ebb4e4-b101-4abb-a7b7-6babb1746165","Type":"ContainerDied","Data":"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de"} Apr 24 14:27:35.711070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.711001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69698d7d56-b8d2z" event={"ID":"96ebb4e4-b101-4abb-a7b7-6babb1746165","Type":"ContainerDied","Data":"a65d38eaaaed5bd2e4ad664c89945e7bf5b43655303b36ab1e1e54478b8e6b3d"} Apr 24 14:27:35.711070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.711021 2572 scope.go:117] "RemoveContainer" containerID="0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de" Apr 24 14:27:35.718940 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.718853 2572 scope.go:117] "RemoveContainer" containerID="0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de" Apr 24 14:27:35.719206 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:27:35.719180 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de\": container with ID starting with 0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de not found: ID does not exist" containerID="0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de" Apr 24 14:27:35.719277 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.719218 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de"} err="failed to get container status \"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de\": rpc error: code = NotFound desc = could not find container \"0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de\": container with ID starting with 0697a434e8c23d5f6c48f3e37cb2d436a8267d5a19e8fbf86b926ae1913742de not found: ID does not exist" Apr 24 14:27:35.733163 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.733143 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:27:35.737148 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:35.737129 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69698d7d56-b8d2z"] Apr 24 14:27:36.057277 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:27:36.057245 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" path="/var/lib/kubelet/pods/96ebb4e4-b101-4abb-a7b7-6babb1746165/volumes" Apr 24 14:28:03.902013 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:03.901968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:28:03.904178 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:03.904161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1a1e1cea-5bb5-4d2e-83e5-817f18307569-metrics-certs\") pod \"network-metrics-daemon-7vzlj\" (UID: \"1a1e1cea-5bb5-4d2e-83e5-817f18307569\") " pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:28:04.157074 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:04.157008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nctch\"" Apr 24 14:28:04.165436 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:04.165416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vzlj" Apr 24 14:28:04.279115 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:04.279087 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7vzlj"] Apr 24 14:28:04.282361 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:28:04.282322 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a1e1cea_5bb5_4d2e_83e5_817f18307569.slice/crio-26a26fc3de8c2a0685b7b3a44685718e078ded305f7a4a676bb15a4d68a7b28d WatchSource:0}: Error finding container 26a26fc3de8c2a0685b7b3a44685718e078ded305f7a4a676bb15a4d68a7b28d: Status 404 returned error can't find the container with id 26a26fc3de8c2a0685b7b3a44685718e078ded305f7a4a676bb15a4d68a7b28d Apr 24 14:28:04.783852 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:04.783819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vzlj" event={"ID":"1a1e1cea-5bb5-4d2e-83e5-817f18307569","Type":"ContainerStarted","Data":"26a26fc3de8c2a0685b7b3a44685718e078ded305f7a4a676bb15a4d68a7b28d"} Apr 24 14:28:05.789934 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:05.789901 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vzlj" event={"ID":"1a1e1cea-5bb5-4d2e-83e5-817f18307569","Type":"ContainerStarted","Data":"e898bc684cb599be606a7c8011ce112a11cafa612dd5ae5560950f6e8c034634"} Apr 24 14:28:05.790377 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:05.789940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vzlj" event={"ID":"1a1e1cea-5bb5-4d2e-83e5-817f18307569","Type":"ContainerStarted","Data":"150a05c5eb580bff8967e2872cec41ffaab64f29fd99482ce9fd53e4c9dc07d6"} Apr 24 14:28:05.806910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:05.806868 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7vzlj" podStartSLOduration=252.747631014 podStartE2EDuration="4m13.806853896s" podCreationTimestamp="2026-04-24 14:23:52 +0000 UTC" firstStartedPulling="2026-04-24 14:28:04.284171028 +0000 UTC m=+252.835013179" lastFinishedPulling="2026-04-24 14:28:05.343393912 +0000 UTC m=+253.894236061" observedRunningTime="2026-04-24 14:28:05.804813587 +0000 UTC m=+254.355655760" watchObservedRunningTime="2026-04-24 14:28:05.806853896 +0000 UTC m=+254.357696121" Apr 24 14:28:51.932541 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:51.932517 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:28:51.933059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:51.932519 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:28:51.938321 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:28:51.938304 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 14:30:14.428362 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.428328 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc"] Apr 24 14:30:14.428830 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.428565 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" containerName="registry" Apr 24 14:30:14.428830 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.428575 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" containerName="registry" Apr 24 14:30:14.428830 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.428634 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="96ebb4e4-b101-4abb-a7b7-6babb1746165" containerName="registry" Apr 24 14:30:14.431292 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.431276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.434782 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.434760 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 14:30:14.435195 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.435172 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 14:30:14.435958 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.435938 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 14:30:14.436067 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.435940 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-9dtt9\"" Apr 24 14:30:14.436067 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.435945 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 14:30:14.439530 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.439512 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 14:30:14.443070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.443049 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc"] Apr 24 14:30:14.460341 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.460314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3bb123f9-8c10-4ae6-8884-532459ca9566-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.460426 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.460344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qthzv\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-kube-api-access-qthzv\") pod 
\"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.460426 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.460373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.561413 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.561382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3bb123f9-8c10-4ae6-8884-532459ca9566-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.561559 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.561427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qthzv\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-kube-api-access-qthzv\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.561559 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.561467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.561659 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:14.561643 2572 secret.go:281] references non-existent secret key: tls.crt Apr 24 14:30:14.561692 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:14.561661 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 14:30:14.561692 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:14.561682 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc: references non-existent secret key: tls.crt Apr 24 14:30:14.561756 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:14.561751 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates podName:3bb123f9-8c10-4ae6-8884-532459ca9566 nodeName:}" failed. No retries permitted until 2026-04-24 14:30:15.061730265 +0000 UTC m=+383.612572421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates") pod "keda-metrics-apiserver-7c9f485588-fhxwc" (UID: "3bb123f9-8c10-4ae6-8884-532459ca9566") : references non-existent secret key: tls.crt Apr 24 14:30:14.561809 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.561763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3bb123f9-8c10-4ae6-8884-532459ca9566-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:14.571463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:14.571436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qthzv\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-kube-api-access-qthzv\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:15.065104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:15.065046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:15.065300 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:15.065187 2572 secret.go:281] references non-existent secret key: tls.crt Apr 24 14:30:15.065300 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:15.065203 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 14:30:15.065300 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:15.065228 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc: references non-existent secret key: tls.crt Apr 24 14:30:15.065400 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:15.065310 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates podName:3bb123f9-8c10-4ae6-8884-532459ca9566 nodeName:}" failed. No retries permitted until 2026-04-24 14:30:16.065292376 +0000 UTC m=+384.616134537 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates") pod "keda-metrics-apiserver-7c9f485588-fhxwc" (UID: "3bb123f9-8c10-4ae6-8884-532459ca9566") : references non-existent secret key: tls.crt Apr 24 14:30:16.072991 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:16.072961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:16.073361 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:16.073052 2572 secret.go:281] references non-existent secret key: tls.crt Apr 24 14:30:16.073361 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:16.073064 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 14:30:16.073361 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:16.073081 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc: references non-existent secret key: tls.crt Apr 24 14:30:16.073361 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:16.073135 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates podName:3bb123f9-8c10-4ae6-8884-532459ca9566 nodeName:}" failed. No retries permitted until 2026-04-24 14:30:18.073121422 +0000 UTC m=+386.623963571 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates") pod "keda-metrics-apiserver-7c9f485588-fhxwc" (UID: "3bb123f9-8c10-4ae6-8884-532459ca9566") : references non-existent secret key: tls.crt Apr 24 14:30:18.086908 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:18.086882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:18.087256 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:18.087015 2572 secret.go:281] references non-existent secret key: tls.crt Apr 24 14:30:18.087256 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:18.087032 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 14:30:18.087256 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:18.087048 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc: references non-existent secret key: tls.crt Apr 24 14:30:18.087256 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:30:18.087096 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates podName:3bb123f9-8c10-4ae6-8884-532459ca9566 nodeName:}" failed. No retries permitted until 2026-04-24 14:30:22.087083173 +0000 UTC m=+390.637925322 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates") pod "keda-metrics-apiserver-7c9f485588-fhxwc" (UID: "3bb123f9-8c10-4ae6-8884-532459ca9566") : references non-existent secret key: tls.crt Apr 24 14:30:22.113880 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:22.113843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:22.116455 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:22.116421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3bb123f9-8c10-4ae6-8884-532459ca9566-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fhxwc\" (UID: \"3bb123f9-8c10-4ae6-8884-532459ca9566\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:22.241020 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:22.240974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:22.354260 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:22.354233 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc"] Apr 24 14:30:22.357162 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:30:22.357138 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb123f9_8c10_4ae6_8884_532459ca9566.slice/crio-98f284ae457cfdd2633c7cb603fa6faa2ac94bb47ef681f5ddedaa903a403fac WatchSource:0}: Error finding container 98f284ae457cfdd2633c7cb603fa6faa2ac94bb47ef681f5ddedaa903a403fac: Status 404 returned error can't find the container with id 98f284ae457cfdd2633c7cb603fa6faa2ac94bb47ef681f5ddedaa903a403fac Apr 24 14:30:22.358422 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:22.358406 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:30:23.121735 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:23.121702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" event={"ID":"3bb123f9-8c10-4ae6-8884-532459ca9566","Type":"ContainerStarted","Data":"98f284ae457cfdd2633c7cb603fa6faa2ac94bb47ef681f5ddedaa903a403fac"} Apr 24 14:30:26.131522 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:26.131485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" event={"ID":"3bb123f9-8c10-4ae6-8884-532459ca9566","Type":"ContainerStarted","Data":"3071afbe26dd13e6100eb9ffee415d677c4653ad87f28f625b444686c2d5732a"} Apr 24 14:30:26.131906 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:26.131636 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:30:26.148866 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:26.148817 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" podStartSLOduration=9.05860139 podStartE2EDuration="12.148805481s" podCreationTimestamp="2026-04-24 14:30:14 +0000 UTC" 
firstStartedPulling="2026-04-24 14:30:22.358526846 +0000 UTC m=+390.909368996" lastFinishedPulling="2026-04-24 14:30:25.448730923 +0000 UTC m=+393.999573087" observedRunningTime="2026-04-24 14:30:26.147315172 +0000 UTC m=+394.698157344" watchObservedRunningTime="2026-04-24 14:30:26.148805481 +0000 UTC m=+394.699647652" Apr 24 14:30:37.138654 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:30:37.138623 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fhxwc" Apr 24 14:31:29.767435 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.767402 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qhss8"] Apr 24 14:31:29.769511 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.769494 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.772397 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.772375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 24 14:31:29.772551 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.772430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-lz2mz\"" Apr 24 14:31:29.773643 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.773596 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 24 14:31:29.779777 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.779756 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qhss8"] Apr 24 14:31:29.857734 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.857693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.857910 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.857757 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmlc\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-kube-api-access-9kmlc\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.958089 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.958045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.958291 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.958108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmlc\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-kube-api-access-9kmlc\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " 
pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.967167 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.967143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:29.967272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:29.967205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmlc\" (UniqueName: \"kubernetes.io/projected/88f3db51-8b91-4f6e-a53b-d1b5def1ee89-kube-api-access-9kmlc\") pod \"cert-manager-cainjector-68b757865b-qhss8\" (UID: \"88f3db51-8b91-4f6e-a53b-d1b5def1ee89\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:30.077784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:30.077745 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" Apr 24 14:31:30.194381 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:30.194319 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qhss8"] Apr 24 14:31:30.197063 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:31:30.197036 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f3db51_8b91_4f6e_a53b_d1b5def1ee89.slice/crio-0377ac4943a7b208f7418600b17c68f7086c0e1f15bcc44d2c55071abb0674f3 WatchSource:0}: Error finding container 0377ac4943a7b208f7418600b17c68f7086c0e1f15bcc44d2c55071abb0674f3: Status 404 returned error can't find the container with id 0377ac4943a7b208f7418600b17c68f7086c0e1f15bcc44d2c55071abb0674f3 Apr 24 14:31:30.290215 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:30.290166 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" event={"ID":"88f3db51-8b91-4f6e-a53b-d1b5def1ee89","Type":"ContainerStarted","Data":"0377ac4943a7b208f7418600b17c68f7086c0e1f15bcc44d2c55071abb0674f3"} Apr 24 14:31:34.302195 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:34.302154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" event={"ID":"88f3db51-8b91-4f6e-a53b-d1b5def1ee89","Type":"ContainerStarted","Data":"267e86129a34b75096720d3edf78dbac2e0f29bda1305c0763cac5eb7e80c0a5"} Apr 24 14:31:34.322389 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:31:34.322332 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-qhss8" podStartSLOduration=1.989418724 podStartE2EDuration="5.322316389s" podCreationTimestamp="2026-04-24 14:31:29 +0000 UTC" firstStartedPulling="2026-04-24 14:31:30.198764989 +0000 UTC m=+458.749607143" lastFinishedPulling="2026-04-24 14:31:33.53166265 +0000 UTC m=+462.082504808" observedRunningTime="2026-04-24 14:31:34.321441056 +0000 UTC m=+462.872283219" watchObservedRunningTime="2026-04-24 14:31:34.322316389 +0000 UTC m=+462.873158561" Apr 24 14:32:14.640531 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.640497 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tsflf"] Apr 24 14:32:14.643367 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.643344 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.646135 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.646114 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 24 14:32:14.646219 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.646205 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-lqslb\"" Apr 24 14:32:14.646272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.646255 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 24 14:32:14.650924 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.650904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d535f1c-6001-43aa-8813-2902e62d4514-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.651034 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.650969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtnnm\" (UniqueName: \"kubernetes.io/projected/6d535f1c-6001-43aa-8813-2902e62d4514-kube-api-access-rtnnm\") pod \"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.658767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.658748 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tsflf"] Apr 24 14:32:14.751397 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.751365 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d535f1c-6001-43aa-8813-2902e62d4514-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.751538 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.751422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtnnm\" (UniqueName: \"kubernetes.io/projected/6d535f1c-6001-43aa-8813-2902e62d4514-kube-api-access-rtnnm\") pod \"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.753759 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.753741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6d535f1c-6001-43aa-8813-2902e62d4514-operator-config\") pod \"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.760702 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.760684 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtnnm\" (UniqueName: \"kubernetes.io/projected/6d535f1c-6001-43aa-8813-2902e62d4514-kube-api-access-rtnnm\") pod 
\"servicemesh-operator3-55f49c5f94-tsflf\" (UID: \"6d535f1c-6001-43aa-8813-2902e62d4514\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:14.952191 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:14.952105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:15.071265 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:15.071228 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-tsflf"] Apr 24 14:32:15.075386 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:32:15.075358 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d535f1c_6001_43aa_8813_2902e62d4514.slice/crio-4ab190443b7393f0ec10454baaf7431e07ae56bc2217b56ab11f90af28945f38 WatchSource:0}: Error finding container 4ab190443b7393f0ec10454baaf7431e07ae56bc2217b56ab11f90af28945f38: Status 404 returned error can't find the container with id 4ab190443b7393f0ec10454baaf7431e07ae56bc2217b56ab11f90af28945f38 Apr 24 14:32:15.408284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:15.408248 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" event={"ID":"6d535f1c-6001-43aa-8813-2902e62d4514","Type":"ContainerStarted","Data":"4ab190443b7393f0ec10454baaf7431e07ae56bc2217b56ab11f90af28945f38"} Apr 24 14:32:18.417913 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:18.417878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" event={"ID":"6d535f1c-6001-43aa-8813-2902e62d4514","Type":"ContainerStarted","Data":"27b9f416e14a3a4d66a32eb9e15b2718b46a9f282121b5034f835b963429eef7"} Apr 24 14:32:18.418361 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:18.417997 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:32:18.444770 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:18.444716 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" podStartSLOduration=1.9580486160000001 podStartE2EDuration="4.444700245s" podCreationTimestamp="2026-04-24 14:32:14 +0000 UTC" firstStartedPulling="2026-04-24 14:32:15.077698663 +0000 UTC m=+503.628540814" lastFinishedPulling="2026-04-24 14:32:17.564350279 +0000 UTC m=+506.115192443" observedRunningTime="2026-04-24 14:32:18.443877836 +0000 UTC m=+506.994720008" watchObservedRunningTime="2026-04-24 14:32:18.444700245 +0000 UTC m=+506.995542418" Apr 24 14:32:29.422577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:32:29.422543 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-tsflf" Apr 24 14:33:01.804713 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.804679 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:33:01.810991 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.810970 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.813917 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.813892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 24 14:33:01.814056 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.813925 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 24 14:33:01.814056 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.813925 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 14:33:01.814056 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.814007 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 14:33:01.814056 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.814031 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 24 14:33:01.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.814121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 24 14:33:01.814256 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.814233 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-xxlf7\"" Apr 24 14:33:01.817388 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.817358 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:33:01.875595 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875667 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875858 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.875858 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.875761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh86r\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977096 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh86r\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.977428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.977337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.978061 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.978036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.979714 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.979685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.979817 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.979747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.979879 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.979815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.979879 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.979840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.984922 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.984895 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:01.985005 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:01.984976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh86r\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r\") pod \"istiod-openshift-gateway-7cd77c7ffd-bklns\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:02.121297 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:02.121207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:02.248315 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:02.248271 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:33:02.251185 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:33:02.251153 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e461a34_0a1a_4c24_949e_51fada80d12f.slice/crio-6f0eeaec33285893749e93341b258dca266da911df2c06ebf41b5391b05b430e WatchSource:0}: Error finding container 6f0eeaec33285893749e93341b258dca266da911df2c06ebf41b5391b05b430e: Status 404 returned error can't find the container with id 6f0eeaec33285893749e93341b258dca266da911df2c06ebf41b5391b05b430e Apr 24 14:33:02.536126 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:02.536089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" event={"ID":"6e461a34-0a1a-4c24-949e-51fada80d12f","Type":"ContainerStarted","Data":"6f0eeaec33285893749e93341b258dca266da911df2c06ebf41b5391b05b430e"} Apr 24 14:33:05.516785 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:05.516746 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:33:05.517151 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:05.516812 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:33:06.548683 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:06.548639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" event={"ID":"6e461a34-0a1a-4c24-949e-51fada80d12f","Type":"ContainerStarted","Data":"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1"} Apr 24 14:33:06.549156 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:06.548803 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:06.550431 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:06.550406 2572 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-bklns container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 24 14:33:06.550515 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:06.550454 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerName="discovery" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 14:33:06.568735 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:06.568689 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" podStartSLOduration=2.307783384 podStartE2EDuration="5.568651872s" podCreationTimestamp="2026-04-24 14:33:01 +0000 UTC" firstStartedPulling="2026-04-24 14:33:02.255650127 +0000 UTC m=+550.806492295" lastFinishedPulling="2026-04-24 14:33:05.516518633 +0000 UTC m=+554.067360783" observedRunningTime="2026-04-24 14:33:06.567560293 +0000 UTC m=+555.118402465" watchObservedRunningTime="2026-04-24 14:33:06.568651872 +0000 UTC m=+555.119494046" Apr 24 14:33:07.552812 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:07.552785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:33:09.304812 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.304775 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9"] Apr 24 14:33:09.307338 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.307313 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.310022 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.309987 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-w9jp4\"" Apr 24 14:33:09.321247 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.321223 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9"] Apr 24 14:33:09.441955 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.441920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.441955 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.441957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c0110066-5174-4fda-9ac6-6754da8f8764-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442124 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.441976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442124 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442016 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x9wn7\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-kube-api-access-x9wn7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442124 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.442231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.442223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c0110066-5174-4fda-9ac6-6754da8f8764-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543232 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543232 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-certs\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543427 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c0110066-5174-4fda-9ac6-6754da8f8764-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543427 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543427 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c0110066-5174-4fda-9ac6-6754da8f8764-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543427 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543427 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9wn7\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-kube-api-access-x9wn7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543700 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543700 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543700 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543853 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543853 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.543950 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.543861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.544059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.544043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c0110066-5174-4fda-9ac6-6754da8f8764-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.545540 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.545520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c0110066-5174-4fda-9ac6-6754da8f8764-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.545833 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.545816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c0110066-5174-4fda-9ac6-6754da8f8764-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.552133 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.552113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-istio-token\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.552340 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.552318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9wn7\" (UniqueName: \"kubernetes.io/projected/c0110066-5174-4fda-9ac6-6754da8f8764-kube-api-access-x9wn7\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-hndv9\" (UID: \"c0110066-5174-4fda-9ac6-6754da8f8764\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.619533 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.619472 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:09.750066 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:09.750038 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9"] Apr 24 14:33:09.752494 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:33:09.752464 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0110066_5174_4fda_9ac6_6754da8f8764.slice/crio-bdf598e1599944a84d3ba9050cf1e55a63a610efb963dcfd54c919fe70f7b600 WatchSource:0}: Error finding container bdf598e1599944a84d3ba9050cf1e55a63a610efb963dcfd54c919fe70f7b600: Status 404 returned error can't find the container with id bdf598e1599944a84d3ba9050cf1e55a63a610efb963dcfd54c919fe70f7b600 Apr 24 14:33:10.561297 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:10.561253 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" event={"ID":"c0110066-5174-4fda-9ac6-6754da8f8764","Type":"ContainerStarted","Data":"bdf598e1599944a84d3ba9050cf1e55a63a610efb963dcfd54c919fe70f7b600"} Apr 24 14:33:12.565006 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:12.564974 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:33:12.565289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:12.565043 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:33:12.565289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:12.565072 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:33:13.571930 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:13.571889 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" event={"ID":"c0110066-5174-4fda-9ac6-6754da8f8764","Type":"ContainerStarted","Data":"c01820492267e55a8ebcceed54563b4622d7a097ad4ad036fc9ecff8eb0e7061"} Apr 24 14:33:13.592275 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:13.592225 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" podStartSLOduration=1.781936188 podStartE2EDuration="4.592210095s" 
podCreationTimestamp="2026-04-24 14:33:09 +0000 UTC" firstStartedPulling="2026-04-24 14:33:09.754487191 +0000 UTC m=+558.305329346" lastFinishedPulling="2026-04-24 14:33:12.564761081 +0000 UTC m=+561.115603253" observedRunningTime="2026-04-24 14:33:13.590682086 +0000 UTC m=+562.141524259" watchObservedRunningTime="2026-04-24 14:33:13.592210095 +0000 UTC m=+562.143052266" Apr 24 14:33:13.620167 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:13.620138 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:13.624490 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:13.624470 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:14.575399 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:14.575367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:14.576452 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:14.576431 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-hndv9" Apr 24 14:33:41.671298 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.671265 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz"] Apr 24 14:33:41.674359 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.674342 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:41.677202 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.677177 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 14:33:41.678418 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.678399 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-kp57n\"" Apr 24 14:33:41.678529 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.678430 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 14:33:41.680557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.680537 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkpgd\" (UniqueName: \"kubernetes.io/projected/cbc11936-8b41-4f5c-80e1-56df1e560782-kube-api-access-zkpgd\") pod \"limitador-operator-controller-manager-c7fb4c8d5-s65gz\" (UID: \"cbc11936-8b41-4f5c-80e1-56df1e560782\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:41.683358 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.683337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz"] Apr 24 14:33:41.781475 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.781431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkpgd\" (UniqueName: \"kubernetes.io/projected/cbc11936-8b41-4f5c-80e1-56df1e560782-kube-api-access-zkpgd\") pod \"limitador-operator-controller-manager-c7fb4c8d5-s65gz\" (UID: \"cbc11936-8b41-4f5c-80e1-56df1e560782\") " 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:41.796073 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.796046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkpgd\" (UniqueName: \"kubernetes.io/projected/cbc11936-8b41-4f5c-80e1-56df1e560782-kube-api-access-zkpgd\") pod \"limitador-operator-controller-manager-c7fb4c8d5-s65gz\" (UID: \"cbc11936-8b41-4f5c-80e1-56df1e560782\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:41.984486 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:41.984413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:42.106503 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:42.106479 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz"] Apr 24 14:33:42.108456 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:33:42.108429 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc11936_8b41_4f5c_80e1_56df1e560782.slice/crio-4731137c3d1a883fe38dbbf85e06cd4a4875ea6e2bfcc6e859cad49ac83930c7 WatchSource:0}: Error finding container 4731137c3d1a883fe38dbbf85e06cd4a4875ea6e2bfcc6e859cad49ac83930c7: Status 404 returned error can't find the container with id 4731137c3d1a883fe38dbbf85e06cd4a4875ea6e2bfcc6e859cad49ac83930c7 Apr 24 14:33:42.661959 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:42.661917 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" event={"ID":"cbc11936-8b41-4f5c-80e1-56df1e560782","Type":"ContainerStarted","Data":"4731137c3d1a883fe38dbbf85e06cd4a4875ea6e2bfcc6e859cad49ac83930c7"} Apr 24 14:33:45.103393 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.103361 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm"] Apr 24 14:33:45.106379 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.106359 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.108748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.108728 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-pd5vk\"" Apr 24 14:33:45.120163 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.120135 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm"] Apr 24 14:33:45.206401 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.206372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45lb\" (UniqueName: \"kubernetes.io/projected/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-kube-api-access-k45lb\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.206532 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.206420 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.307108 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.307073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k45lb\" (UniqueName: \"kubernetes.io/projected/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-kube-api-access-k45lb\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.307294 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.307119 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.307465 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.307449 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.319494 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.319462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45lb\" (UniqueName: \"kubernetes.io/projected/c09d1452-ea2f-4446-ba91-3ce10da5c7ee-kube-api-access-k45lb\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-f27mm\" (UID: \"c09d1452-ea2f-4446-ba91-3ce10da5c7ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.415435 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.415370 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:45.536444 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.536412 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm"] Apr 24 14:33:45.540307 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:33:45.540282 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc09d1452_ea2f_4446_ba91_3ce10da5c7ee.slice/crio-51278ef8c1a201130e6ba15bdbea1eab47627c85774e7fb8ccc5599f78d5afc1 WatchSource:0}: Error finding container 51278ef8c1a201130e6ba15bdbea1eab47627c85774e7fb8ccc5599f78d5afc1: Status 404 returned error can't find the container with id 51278ef8c1a201130e6ba15bdbea1eab47627c85774e7fb8ccc5599f78d5afc1 Apr 24 14:33:45.673542 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.673468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" event={"ID":"cbc11936-8b41-4f5c-80e1-56df1e560782","Type":"ContainerStarted","Data":"104211e34008f0b6377a5185b10dac0535a38607fff4f8abc9e44185b2c101e8"} Apr 24 14:33:45.673717 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.673590 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:33:45.674657 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.674637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" event={"ID":"c09d1452-ea2f-4446-ba91-3ce10da5c7ee","Type":"ContainerStarted","Data":"51278ef8c1a201130e6ba15bdbea1eab47627c85774e7fb8ccc5599f78d5afc1"} Apr 24 14:33:45.696459 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:45.696393 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" podStartSLOduration=2.047724702 podStartE2EDuration="4.696374975s" podCreationTimestamp="2026-04-24 14:33:41 +0000 UTC" firstStartedPulling="2026-04-24 14:33:42.110262641 +0000 UTC m=+590.661104790" lastFinishedPulling="2026-04-24 14:33:44.758912903 +0000 UTC m=+593.309755063" observedRunningTime="2026-04-24 14:33:45.695121678 +0000 UTC m=+594.245963888" watchObservedRunningTime="2026-04-24 14:33:45.696374975 +0000 UTC m=+594.247217150" Apr 24 14:33:52.333201 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:52.333174 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:33:52.333594 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:52.333422 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:33:52.700743 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:52.700668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" event={"ID":"c09d1452-ea2f-4446-ba91-3ce10da5c7ee","Type":"ContainerStarted","Data":"e514178090f70edfbe96bb0547debcb74536b83c0f826e246ef48b093685e8d8"} Apr 24 14:33:52.700869 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:52.700772 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:33:52.722018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:52.721982 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" podStartSLOduration=0.876018082 podStartE2EDuration="7.721971946s" podCreationTimestamp="2026-04-24 14:33:45 +0000 UTC" firstStartedPulling="2026-04-24 14:33:45.54338326 +0000 UTC m=+594.094225414" lastFinishedPulling="2026-04-24 14:33:52.389337115 +0000 UTC m=+600.940179278" observedRunningTime="2026-04-24 14:33:52.720431822 +0000 UTC m=+601.271273994" watchObservedRunningTime="2026-04-24 14:33:52.721971946 +0000 UTC m=+601.272814118" Apr 24 14:33:56.680715 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:33:56.680682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-s65gz" Apr 24 14:34:03.705645 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:03.705597 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-f27mm" Apr 24 14:34:37.765999 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.765964 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lx5zw"] Apr 24 14:34:37.769492 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.769474 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:37.772322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.772294 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 14:34:37.772322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.772316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7plhs\"" Apr 24 14:34:37.777022 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.776756 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lx5zw"] Apr 24 14:34:37.802719 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.802693 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lx5zw"] Apr 24 14:34:37.894544 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.894513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-config-file\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:37.894726 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.894558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zck9x\" (UniqueName: \"kubernetes.io/projected/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-kube-api-access-zck9x\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:37.995300 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.995268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config-file\" (UniqueName: \"kubernetes.io/configmap/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-config-file\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:37.995467 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.995316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zck9x\" (UniqueName: \"kubernetes.io/projected/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-kube-api-access-zck9x\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:37.995935 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:37.995909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-config-file\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:38.003282 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:38.003257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zck9x\" (UniqueName: \"kubernetes.io/projected/3950a3e3-5147-41d7-b415-4cdd6e7bfb02-kube-api-access-zck9x\") pod \"limitador-limitador-67566c68b4-lx5zw\" (UID: \"3950a3e3-5147-41d7-b415-4cdd6e7bfb02\") " pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:38.080449 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:38.080430 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:38.191598 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:38.191571 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-lx5zw"] Apr 24 14:34:38.193996 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:34:38.193971 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3950a3e3_5147_41d7_b415_4cdd6e7bfb02.slice/crio-5765d9ce381a9e3f274b151e18b521850dbecfe60a7cacd6e6d223a3debfd1fe WatchSource:0}: Error finding container 5765d9ce381a9e3f274b151e18b521850dbecfe60a7cacd6e6d223a3debfd1fe: Status 404 returned error can't find the container with id 5765d9ce381a9e3f274b151e18b521850dbecfe60a7cacd6e6d223a3debfd1fe Apr 24 14:34:38.846811 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:38.846770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" event={"ID":"3950a3e3-5147-41d7-b415-4cdd6e7bfb02","Type":"ContainerStarted","Data":"5765d9ce381a9e3f274b151e18b521850dbecfe60a7cacd6e6d223a3debfd1fe"} Apr 24 14:34:42.860531 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:42.860494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" event={"ID":"3950a3e3-5147-41d7-b415-4cdd6e7bfb02","Type":"ContainerStarted","Data":"0c9149079f34f8822481f8b2695b86011a35a1e708d254009caef9724bd81789"} Apr 24 14:34:42.860903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:42.860624 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:34:42.877222 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:42.877182 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" podStartSLOduration=2.021406855 podStartE2EDuration="5.87717082s" podCreationTimestamp="2026-04-24 14:34:37 +0000 UTC" firstStartedPulling="2026-04-24 14:34:38.195626148 +0000 UTC m=+646.746468298" lastFinishedPulling="2026-04-24 14:34:42.051390114 +0000 UTC m=+650.602232263" observedRunningTime="2026-04-24 14:34:42.876060705 +0000 UTC m=+651.426902876" watchObservedRunningTime="2026-04-24 14:34:42.87717082 +0000 UTC m=+651.428012991" Apr 24 14:34:53.864574 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:34:53.864543 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-lx5zw" Apr 24 14:35:17.472078 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.472043 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:35:17.472566 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.472288 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerName="discovery" containerID="cri-o://e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1" gracePeriod=30 Apr 24 14:35:17.714276 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.714252 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:35:17.887088 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887054 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh86r\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887088 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887112 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887134 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887238 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887331 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:35:17.887270 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887534 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887505 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 14:35:17.887587 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887550 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs\") pod \"6e461a34-0a1a-4c24-949e-51fada80d12f\" (UID: \"6e461a34-0a1a-4c24-949e-51fada80d12f\") " Apr 24 14:35:17.887875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.887835 2572 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-ca-configmap\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.889590 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.889559 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r" (OuterVolumeSpecName: "kube-api-access-rh86r") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "kube-api-access-rh86r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:35:17.889846 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.889821 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token" (OuterVolumeSpecName: "istio-token") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:35:17.889961 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.889822 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:35:17.889961 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.889841 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts" (OuterVolumeSpecName: "cacerts") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "cacerts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:35:17.890036 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.889975 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:35:17.890070 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.890045 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs" (OuterVolumeSpecName: "local-certs") pod "6e461a34-0a1a-4c24-949e-51fada80d12f" (UID: "6e461a34-0a1a-4c24-949e-51fada80d12f"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:35:17.975323 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.975286 2572 generic.go:358] "Generic (PLEG): container finished" podID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerID="e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1" exitCode=0 Apr 24 14:35:17.975493 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.975345 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" Apr 24 14:35:17.975493 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.975370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" event={"ID":"6e461a34-0a1a-4c24-949e-51fada80d12f","Type":"ContainerDied","Data":"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1"} Apr 24 14:35:17.975493 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.975415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns" event={"ID":"6e461a34-0a1a-4c24-949e-51fada80d12f","Type":"ContainerDied","Data":"6f0eeaec33285893749e93341b258dca266da911df2c06ebf41b5391b05b430e"} Apr 24 14:35:17.975493 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.975434 2572 scope.go:117] "RemoveContainer" containerID="e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1" Apr 24 14:35:17.984032 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.984014 2572 scope.go:117] "RemoveContainer" containerID="e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1" Apr 24 14:35:17.984286 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:35:17.984267 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1\": container with ID starting with e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1 not found: ID does not exist" containerID="e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1" Apr 24 14:35:17.984387 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.984293 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1"} err="failed to get container status \"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1\": rpc error: code = NotFound desc = could not find container \"e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1\": 
container with ID starting with e12869a87c35f45d925a47a776ebcee122263695b9baf275f47f48a36cb97bc1 not found: ID does not exist" Apr 24 14:35:17.988637 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988623 2572 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6e461a34-0a1a-4c24-949e-51fada80d12f-local-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.988698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988641 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rh86r\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-kube-api-access-rh86r\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.988698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988651 2572 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-cacerts\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.988698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988662 2572 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-csr-dns-cert\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.988698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988671 2572 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-kubeconfig\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.988698 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.988679 2572 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6e461a34-0a1a-4c24-949e-51fada80d12f-istio-token\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:17.998203 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:17.998161 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:35:18.002231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:18.002211 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-bklns"] Apr 24 14:35:18.057316 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:18.057292 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" path="/var/lib/kubelet/pods/6e461a34-0a1a-4c24-949e-51fada80d12f/volumes" Apr 24 14:35:21.474808 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.474760 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:35:21.475148 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.475066 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerName="discovery" Apr 24 14:35:21.475148 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.475077 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerName="discovery" Apr 24 14:35:21.475148 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.475121 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e461a34-0a1a-4c24-949e-51fada80d12f" containerName="discovery" Apr 24 14:35:21.478183 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:35:21.478163 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.480987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.480966 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 14:35:21.482322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.482293 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 14:35:21.482322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.482300 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-7lq8x\"" Apr 24 14:35:21.482459 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.482301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 14:35:21.489225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.489207 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:35:21.492347 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.492328 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:35:21.494557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.494540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.497284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.497266 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 24 14:35:21.497386 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.497303 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hxjpp\"" Apr 24 14:35:21.508264 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.508239 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:35:21.521303 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.521282 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-8tjzz"] Apr 24 14:35:21.531892 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.531868 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.534716 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.534697 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-jt5l4\"" Apr 24 14:35:21.534842 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.534727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 14:35:21.535658 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.535638 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8tjzz"] Apr 24 14:35:21.615890 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.615858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnflq\" (UniqueName: \"kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.616110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.615916 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5d9j\" (UniqueName: \"kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.616110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.615949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.616110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.615981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.717331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5d9j\" (UniqueName: \"kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.717331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smw2r\" (UniqueName: 
\"kubernetes.io/projected/4de2d5fb-a2ed-414b-8303-dbe27d214686-kube-api-access-smw2r\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnflq\" (UniqueName: \"kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.717427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4de2d5fb-a2ed-414b-8303-dbe27d214686-data\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:35:21.717518 2572 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 24 14:35:21.717588 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:35:21.717590 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert podName:ab2b553c-4475-44ca-90a5-54ebb929d21c nodeName:}" failed. No retries permitted until 2026-04-24 14:35:22.217568044 +0000 UTC m=+690.768410200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert") pod "kserve-controller-manager-b7dc77d59-2nvcp" (UID: "ab2b553c-4475-44ca-90a5-54ebb929d21c") : secret "kserve-webhook-server-cert" not found Apr 24 14:35:21.719953 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.719935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.726015 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.725965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5d9j\" (UniqueName: \"kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:21.731482 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.731457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnflq\" (UniqueName: \"kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq\") pod \"llmisvc-controller-manager-66876c8d5d-lbsfj\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.809676 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.809643 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:21.818557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.818519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smw2r\" (UniqueName: \"kubernetes.io/projected/4de2d5fb-a2ed-414b-8303-dbe27d214686-kube-api-access-smw2r\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.818679 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.818583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4de2d5fb-a2ed-414b-8303-dbe27d214686-data\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.818991 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.818970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4de2d5fb-a2ed-414b-8303-dbe27d214686-data\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.826838 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.826781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smw2r\" (UniqueName: \"kubernetes.io/projected/4de2d5fb-a2ed-414b-8303-dbe27d214686-kube-api-access-smw2r\") pod \"seaweedfs-86cc847c5c-8tjzz\" (UID: \"4de2d5fb-a2ed-414b-8303-dbe27d214686\") " pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.842243 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.842214 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:21.932916 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.932880 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:35:21.936064 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:35:21.936034 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod453facff_2554_4f6b_8d44_dcd25af01306.slice/crio-7b389433d0e7d44fd074d4edb0936ab0646a96ba30234e30033b58f78fc78693 WatchSource:0}: Error finding container 7b389433d0e7d44fd074d4edb0936ab0646a96ba30234e30033b58f78fc78693: Status 404 returned error can't find the container with id 7b389433d0e7d44fd074d4edb0936ab0646a96ba30234e30033b58f78fc78693 Apr 24 14:35:21.966783 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.966762 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-8tjzz"] Apr 24 14:35:21.968987 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:35:21.968960 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de2d5fb_a2ed_414b_8303_dbe27d214686.slice/crio-b79b3e712f9c4ccdd54dd6dfd206eda49c906d8237ce004b3da26365297148c5 WatchSource:0}: Error finding container b79b3e712f9c4ccdd54dd6dfd206eda49c906d8237ce004b3da26365297148c5: Status 404 returned error can't find the container with id b79b3e712f9c4ccdd54dd6dfd206eda49c906d8237ce004b3da26365297148c5 Apr 24 14:35:21.990862 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.990808 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8tjzz" event={"ID":"4de2d5fb-a2ed-414b-8303-dbe27d214686","Type":"ContainerStarted","Data":"b79b3e712f9c4ccdd54dd6dfd206eda49c906d8237ce004b3da26365297148c5"} Apr 24 14:35:21.991635 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:21.991617 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" event={"ID":"453facff-2554-4f6b-8d44-dcd25af01306","Type":"ContainerStarted","Data":"7b389433d0e7d44fd074d4edb0936ab0646a96ba30234e30033b58f78fc78693"} Apr 24 14:35:22.221561 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:22.221534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:22.223915 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:22.223888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") pod \"kserve-controller-manager-b7dc77d59-2nvcp\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:22.388200 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:22.388157 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:22.575262 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:22.575211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:35:22.600967 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:35:22.600892 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2b553c_4475_44ca_90a5_54ebb929d21c.slice/crio-cc7b15bbceef2ea03cea465708af391a0b063c9d0ca853d2bb28022525eb168f WatchSource:0}: Error finding container cc7b15bbceef2ea03cea465708af391a0b063c9d0ca853d2bb28022525eb168f: Status 404 returned error can't find the container with id cc7b15bbceef2ea03cea465708af391a0b063c9d0ca853d2bb28022525eb168f Apr 24 14:35:22.602738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:22.602702 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:35:23.002694 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:23.002625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" event={"ID":"ab2b553c-4475-44ca-90a5-54ebb929d21c","Type":"ContainerStarted","Data":"cc7b15bbceef2ea03cea465708af391a0b063c9d0ca853d2bb28022525eb168f"} Apr 24 14:35:27.018124 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.018039 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" event={"ID":"ab2b553c-4475-44ca-90a5-54ebb929d21c","Type":"ContainerStarted","Data":"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845"} Apr 24 14:35:27.018124 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.018110 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:27.019293 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.019271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-8tjzz" event={"ID":"4de2d5fb-a2ed-414b-8303-dbe27d214686","Type":"ContainerStarted","Data":"19b41cac80febfd9700dc0d38bebb6919c51258bf87d33db79a950df688c8ffa"} Apr 24 14:35:27.019395 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.019319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:27.020431 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.020404 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" event={"ID":"453facff-2554-4f6b-8d44-dcd25af01306","Type":"ContainerStarted","Data":"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2"} Apr 24 14:35:27.020529 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.020519 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:27.035023 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.034983 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" podStartSLOduration=2.285459132 podStartE2EDuration="6.034971558s" podCreationTimestamp="2026-04-24 14:35:21 +0000 UTC" firstStartedPulling="2026-04-24 14:35:22.602886443 +0000 UTC m=+691.153728599" lastFinishedPulling="2026-04-24 14:35:26.352398871 +0000 UTC m=+694.903241025" observedRunningTime="2026-04-24 14:35:27.033467621 +0000 
UTC m=+695.584309803" watchObservedRunningTime="2026-04-24 14:35:27.034971558 +0000 UTC m=+695.585813729" Apr 24 14:35:27.051146 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.051099 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" podStartSLOduration=1.601363805 podStartE2EDuration="6.051086403s" podCreationTimestamp="2026-04-24 14:35:21 +0000 UTC" firstStartedPulling="2026-04-24 14:35:21.937560428 +0000 UTC m=+690.488402578" lastFinishedPulling="2026-04-24 14:35:26.387283022 +0000 UTC m=+694.938125176" observedRunningTime="2026-04-24 14:35:27.049879928 +0000 UTC m=+695.600722099" watchObservedRunningTime="2026-04-24 14:35:27.051086403 +0000 UTC m=+695.601928619" Apr 24 14:35:27.065237 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:27.065202 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-8tjzz" podStartSLOduration=1.607586723 podStartE2EDuration="6.065190813s" podCreationTimestamp="2026-04-24 14:35:21 +0000 UTC" firstStartedPulling="2026-04-24 14:35:21.970117603 +0000 UTC m=+690.520959752" lastFinishedPulling="2026-04-24 14:35:26.427721684 +0000 UTC m=+694.978563842" observedRunningTime="2026-04-24 14:35:27.064186663 +0000 UTC m=+695.615028833" watchObservedRunningTime="2026-04-24 14:35:27.065190813 +0000 UTC m=+695.616032984" Apr 24 14:35:33.026814 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:33.026786 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-8tjzz" Apr 24 14:35:58.025731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:58.025701 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:35:58.028637 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:58.028597 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:59.207638 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.207591 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:35:59.208082 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.207813 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" podUID="ab2b553c-4475-44ca-90a5-54ebb929d21c" containerName="manager" containerID="cri-o://10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845" gracePeriod=10 Apr 24 14:35:59.232835 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.232808 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-6cvhv"] Apr 24 14:35:59.294195 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.294173 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-6cvhv"] Apr 24 14:35:59.294291 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.294214 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.397088 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.397055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrkmp\" (UniqueName: \"kubernetes.io/projected/5e08c887-a51c-4489-a68b-5644b9e3a4f6-kube-api-access-lrkmp\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.397213 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.397185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e08c887-a51c-4489-a68b-5644b9e3a4f6-cert\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.466885 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.466833 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:35:59.497932 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.497904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e08c887-a51c-4489-a68b-5644b9e3a4f6-cert\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.498059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.497957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrkmp\" (UniqueName: \"kubernetes.io/projected/5e08c887-a51c-4489-a68b-5644b9e3a4f6-kube-api-access-lrkmp\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.500497 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.500474 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e08c887-a51c-4489-a68b-5644b9e3a4f6-cert\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.506468 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.506440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrkmp\" (UniqueName: \"kubernetes.io/projected/5e08c887-a51c-4489-a68b-5644b9e3a4f6-kube-api-access-lrkmp\") pod \"kserve-controller-manager-b7dc77d59-6cvhv\" (UID: \"5e08c887-a51c-4489-a68b-5644b9e3a4f6\") " pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.598912 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.598879 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5d9j\" (UniqueName: \"kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j\") pod \"ab2b553c-4475-44ca-90a5-54ebb929d21c\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " Apr 24 14:35:59.599088 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.598980 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") pod 
\"ab2b553c-4475-44ca-90a5-54ebb929d21c\" (UID: \"ab2b553c-4475-44ca-90a5-54ebb929d21c\") " Apr 24 14:35:59.601061 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.601028 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert" (OuterVolumeSpecName: "cert") pod "ab2b553c-4475-44ca-90a5-54ebb929d21c" (UID: "ab2b553c-4475-44ca-90a5-54ebb929d21c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:35:59.601061 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.601045 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j" (OuterVolumeSpecName: "kube-api-access-z5d9j") pod "ab2b553c-4475-44ca-90a5-54ebb929d21c" (UID: "ab2b553c-4475-44ca-90a5-54ebb929d21c"). InnerVolumeSpecName "kube-api-access-z5d9j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:35:59.654369 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.654338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:35:59.700917 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.700822 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5d9j\" (UniqueName: \"kubernetes.io/projected/ab2b553c-4475-44ca-90a5-54ebb929d21c-kube-api-access-z5d9j\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:59.700917 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.700853 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab2b553c-4475-44ca-90a5-54ebb929d21c-cert\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:35:59.773211 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:35:59.773121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-6cvhv"] Apr 24 14:35:59.775783 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:35:59.775753 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e08c887_a51c_4489_a68b_5644b9e3a4f6.slice/crio-5132d30a34ba0c13fe0af1f6394de9e3415d39e58699c46c0d74ee7944c1fc06 WatchSource:0}: Error finding container 5132d30a34ba0c13fe0af1f6394de9e3415d39e58699c46c0d74ee7944c1fc06: Status 404 returned error can't find the container with id 5132d30a34ba0c13fe0af1f6394de9e3415d39e58699c46c0d74ee7944c1fc06 Apr 24 14:36:00.118983 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.118946 2572 generic.go:358] "Generic (PLEG): container finished" podID="ab2b553c-4475-44ca-90a5-54ebb929d21c" containerID="10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845" exitCode=0 Apr 24 14:36:00.119159 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.119016 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" Apr 24 14:36:00.119159 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.119036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" event={"ID":"ab2b553c-4475-44ca-90a5-54ebb929d21c","Type":"ContainerDied","Data":"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845"} Apr 24 14:36:00.119159 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.119081 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-2nvcp" event={"ID":"ab2b553c-4475-44ca-90a5-54ebb929d21c","Type":"ContainerDied","Data":"cc7b15bbceef2ea03cea465708af391a0b063c9d0ca853d2bb28022525eb168f"} Apr 24 14:36:00.119159 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.119101 2572 scope.go:117] "RemoveContainer" containerID="10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845" Apr 24 14:36:00.120188 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.120079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" event={"ID":"5e08c887-a51c-4489-a68b-5644b9e3a4f6","Type":"ContainerStarted","Data":"5132d30a34ba0c13fe0af1f6394de9e3415d39e58699c46c0d74ee7944c1fc06"} Apr 24 14:36:00.126773 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.126758 2572 scope.go:117] "RemoveContainer" containerID="10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845" Apr 24 14:36:00.127009 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:36:00.126993 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845\": container with ID starting with 10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845 not found: ID does not exist" containerID="10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845" Apr 24 14:36:00.127062 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.127017 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845"} err="failed to get container status \"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845\": rpc error: code = NotFound desc = could not find container \"10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845\": container with ID starting with 10eb984450469f8d3ed5ad039f45bcd47e592dc1d25a025e42d91fbc987f2845 not found: ID does not exist" Apr 24 14:36:00.135249 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.135227 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:36:00.138160 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:00.138139 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b7dc77d59-2nvcp"] Apr 24 14:36:01.124172 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:01.124141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" event={"ID":"5e08c887-a51c-4489-a68b-5644b9e3a4f6","Type":"ContainerStarted","Data":"b0797d422a6c0ded67c79f9d4e75a12f3e83abecaf166958f850f962b1dd8eec"} Apr 24 14:36:01.124599 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:01.124182 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 
14:36:01.142049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:01.142005 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" podStartSLOduration=1.6245682160000001 podStartE2EDuration="2.141992288s" podCreationTimestamp="2026-04-24 14:35:59 +0000 UTC" firstStartedPulling="2026-04-24 14:35:59.777079378 +0000 UTC m=+728.327921528" lastFinishedPulling="2026-04-24 14:36:00.294503447 +0000 UTC m=+728.845345600" observedRunningTime="2026-04-24 14:36:01.139940921 +0000 UTC m=+729.690783094" watchObservedRunningTime="2026-04-24 14:36:01.141992288 +0000 UTC m=+729.692834460" Apr 24 14:36:02.057355 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:02.057327 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2b553c-4475-44ca-90a5-54ebb929d21c" path="/var/lib/kubelet/pods/ab2b553c-4475-44ca-90a5-54ebb929d21c/volumes" Apr 24 14:36:32.132902 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:32.132871 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b7dc77d59-6cvhv" Apr 24 14:36:33.045492 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.045453 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-7cq72"] Apr 24 14:36:33.045902 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.045881 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab2b553c-4475-44ca-90a5-54ebb929d21c" containerName="manager" Apr 24 14:36:33.045902 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.045903 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2b553c-4475-44ca-90a5-54ebb929d21c" containerName="manager" Apr 24 14:36:33.046060 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.045985 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab2b553c-4475-44ca-90a5-54ebb929d21c" containerName="manager" Apr 24 14:36:33.049043 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.049022 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.051594 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.051572 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 14:36:33.051855 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.051829 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-592w5\"" Apr 24 14:36:33.057674 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.057651 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-7cq72"] Apr 24 14:36:33.060614 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.060580 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-62b7s"] Apr 24 14:36:33.063499 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.063484 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.065989 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.065966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-pvj2f\"" Apr 24 14:36:33.066086 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.066015 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 14:36:33.072528 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.072509 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-62b7s"] Apr 24 14:36:33.151828 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.151794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be7aa129-6149-4473-933b-b1541696bf80-tls-certs\") pod \"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.151828 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.151826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk248\" (UniqueName: \"kubernetes.io/projected/be7aa129-6149-4473-933b-b1541696bf80-kube-api-access-lk248\") pod \"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.253264 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.253214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774z6\" (UniqueName: \"kubernetes.io/projected/4836eeb4-d0f0-4314-b7f7-256e05157faf-kube-api-access-774z6\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.253414 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.253319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4836eeb4-d0f0-4314-b7f7-256e05157faf-cert\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.253414 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.253396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be7aa129-6149-4473-933b-b1541696bf80-tls-certs\") pod \"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.253521 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.253425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk248\" (UniqueName: \"kubernetes.io/projected/be7aa129-6149-4473-933b-b1541696bf80-kube-api-access-lk248\") pod \"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.255929 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.255906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/be7aa129-6149-4473-933b-b1541696bf80-tls-certs\") pod 
\"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.262590 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.262560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk248\" (UniqueName: \"kubernetes.io/projected/be7aa129-6149-4473-933b-b1541696bf80-kube-api-access-lk248\") pod \"model-serving-api-86f7b4b499-7cq72\" (UID: \"be7aa129-6149-4473-933b-b1541696bf80\") " pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.354270 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.354170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-774z6\" (UniqueName: \"kubernetes.io/projected/4836eeb4-d0f0-4314-b7f7-256e05157faf-kube-api-access-774z6\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.354270 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.354239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4836eeb4-d0f0-4314-b7f7-256e05157faf-cert\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.356827 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.356803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4836eeb4-d0f0-4314-b7f7-256e05157faf-cert\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.359453 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.359433 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:33.362482 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.362456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-774z6\" (UniqueName: \"kubernetes.io/projected/4836eeb4-d0f0-4314-b7f7-256e05157faf-kube-api-access-774z6\") pod \"odh-model-controller-696fc77849-62b7s\" (UID: \"4836eeb4-d0f0-4314-b7f7-256e05157faf\") " pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.374169 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.374147 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:33.487046 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.487014 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-7cq72"] Apr 24 14:36:33.489685 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:36:33.489650 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe7aa129_6149_4473_933b_b1541696bf80.slice/crio-d3ca11c90853f270e47eda44a948052efbe09ca812fd9f8241b5d899e76736ef WatchSource:0}: Error finding container d3ca11c90853f270e47eda44a948052efbe09ca812fd9f8241b5d899e76736ef: Status 404 returned error can't find the container with id d3ca11c90853f270e47eda44a948052efbe09ca812fd9f8241b5d899e76736ef Apr 24 14:36:33.511323 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:33.511292 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-62b7s"] Apr 24 14:36:33.513475 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:36:33.513447 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4836eeb4_d0f0_4314_b7f7_256e05157faf.slice/crio-6ab8c355e82c3ed8a3daeb9c8f0130bde033f4d9b3504c9d680271ca349bc1fc WatchSource:0}: Error finding container 6ab8c355e82c3ed8a3daeb9c8f0130bde033f4d9b3504c9d680271ca349bc1fc: Status 404 returned error can't find the container with id 6ab8c355e82c3ed8a3daeb9c8f0130bde033f4d9b3504c9d680271ca349bc1fc Apr 24 14:36:34.227353 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:34.227313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-62b7s" event={"ID":"4836eeb4-d0f0-4314-b7f7-256e05157faf","Type":"ContainerStarted","Data":"6ab8c355e82c3ed8a3daeb9c8f0130bde033f4d9b3504c9d680271ca349bc1fc"} Apr 24 14:36:34.229059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:34.229006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-7cq72" event={"ID":"be7aa129-6149-4473-933b-b1541696bf80","Type":"ContainerStarted","Data":"d3ca11c90853f270e47eda44a948052efbe09ca812fd9f8241b5d899e76736ef"} Apr 24 14:36:36.239379 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:36.239348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-7cq72" event={"ID":"be7aa129-6149-4473-933b-b1541696bf80","Type":"ContainerStarted","Data":"2983592c446210c7742b04b3df24f4308236b9f9d3f8f9d270bc806d2451a62b"} Apr 24 14:36:37.244766 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:37.244729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-62b7s" event={"ID":"4836eeb4-d0f0-4314-b7f7-256e05157faf","Type":"ContainerStarted","Data":"e881630c129e1160cb1173271cf0f8407b4a8c661371dab44dd200454bd46057"} Apr 24 14:36:37.245186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:37.244829 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:37.245186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:37.245046 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:37.260997 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:37.260941 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-62b7s" 
podStartSLOduration=1.598366572 podStartE2EDuration="4.260926234s" podCreationTimestamp="2026-04-24 14:36:33 +0000 UTC" firstStartedPulling="2026-04-24 14:36:33.514877358 +0000 UTC m=+762.065719508" lastFinishedPulling="2026-04-24 14:36:36.177437001 +0000 UTC m=+764.728279170" observedRunningTime="2026-04-24 14:36:37.259974935 +0000 UTC m=+765.810817107" watchObservedRunningTime="2026-04-24 14:36:37.260926234 +0000 UTC m=+765.811768407" Apr 24 14:36:37.278434 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:37.278390 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-7cq72" podStartSLOduration=1.640878414 podStartE2EDuration="4.278376994s" podCreationTimestamp="2026-04-24 14:36:33 +0000 UTC" firstStartedPulling="2026-04-24 14:36:33.491548435 +0000 UTC m=+762.042390585" lastFinishedPulling="2026-04-24 14:36:36.129047012 +0000 UTC m=+764.679889165" observedRunningTime="2026-04-24 14:36:37.277083297 +0000 UTC m=+765.827925482" watchObservedRunningTime="2026-04-24 14:36:37.278376994 +0000 UTC m=+765.829219166" Apr 24 14:36:48.250683 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:48.250644 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-62b7s" Apr 24 14:36:48.252684 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:48.252663 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-7cq72" Apr 24 14:36:48.998376 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:48.998339 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-z2w67"] Apr 24 14:36:49.002575 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.002559 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z2w67" Apr 24 14:36:49.007956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.007928 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z2w67"] Apr 24 14:36:49.081017 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.080979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htxzh\" (UniqueName: \"kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh\") pod \"s3-init-z2w67\" (UID: \"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2\") " pod="kserve/s3-init-z2w67" Apr 24 14:36:49.181859 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.181823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htxzh\" (UniqueName: \"kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh\") pod \"s3-init-z2w67\" (UID: \"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2\") " pod="kserve/s3-init-z2w67" Apr 24 14:36:49.191252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.191227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htxzh\" (UniqueName: \"kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh\") pod \"s3-init-z2w67\" (UID: \"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2\") " pod="kserve/s3-init-z2w67" Apr 24 14:36:49.312654 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.312597 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-z2w67" Apr 24 14:36:49.428790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:49.428765 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-z2w67"] Apr 24 14:36:49.431216 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:36:49.431183 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda13d4e1c_afbc_46bd_b1b9_4d74924c7eb2.slice/crio-39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c WatchSource:0}: Error finding container 39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c: Status 404 returned error can't find the container with id 39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c Apr 24 14:36:50.292230 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:50.292178 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z2w67" event={"ID":"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2","Type":"ContainerStarted","Data":"39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c"} Apr 24 14:36:54.307842 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:54.307801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z2w67" event={"ID":"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2","Type":"ContainerStarted","Data":"28d1e2b6ad14edd806907eb92afe0c5b28129780596e48b776c4733628782af4"} Apr 24 14:36:54.324937 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:54.324889 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-z2w67" podStartSLOduration=1.926393664 podStartE2EDuration="6.324873955s" podCreationTimestamp="2026-04-24 14:36:48 +0000 UTC" firstStartedPulling="2026-04-24 14:36:49.43300397 +0000 UTC m=+777.983846120" lastFinishedPulling="2026-04-24 14:36:53.831484255 +0000 UTC m=+782.382326411" observedRunningTime="2026-04-24 14:36:54.323291261 +0000 UTC m=+782.874133436" watchObservedRunningTime="2026-04-24 14:36:54.324873955 +0000 UTC m=+782.875716126" Apr 24 14:36:57.321395 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:57.321364 2572 generic.go:358] "Generic (PLEG): container finished" podID="a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2" containerID="28d1e2b6ad14edd806907eb92afe0c5b28129780596e48b776c4733628782af4" exitCode=0 Apr 24 14:36:57.321790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:57.321421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z2w67" event={"ID":"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2","Type":"ContainerDied","Data":"28d1e2b6ad14edd806907eb92afe0c5b28129780596e48b776c4733628782af4"} Apr 24 14:36:58.453753 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:58.453731 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-z2w67" Apr 24 14:36:58.561481 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:58.561446 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htxzh\" (UniqueName: \"kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh\") pod \"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2\" (UID: \"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2\") " Apr 24 14:36:58.563496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:58.563471 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh" (OuterVolumeSpecName: "kube-api-access-htxzh") pod "a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2" (UID: "a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2"). InnerVolumeSpecName "kube-api-access-htxzh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:36:58.662085 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:58.662000 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-htxzh\" (UniqueName: \"kubernetes.io/projected/a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2-kube-api-access-htxzh\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:36:59.328905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:59.328870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-z2w67" event={"ID":"a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2","Type":"ContainerDied","Data":"39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c"} Apr 24 14:36:59.328905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:59.328901 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f48406484c987b6aea673d949f85ebabad7a9b61f9acc7680bac4c6329c93c" Apr 24 14:36:59.329180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:36:59.328915 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-z2w67" Apr 24 14:37:12.915901 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.915813 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s"] Apr 24 14:37:12.916333 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.916293 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2" containerName="s3-init" Apr 24 14:37:12.916333 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.916316 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2" containerName="s3-init" Apr 24 14:37:12.916438 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.916420 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2" containerName="s3-init" Apr 24 14:37:12.919420 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.919396 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.922787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.922747 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 24 14:37:12.922787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.922782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-fsb9l\"" Apr 24 14:37:12.922993 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.922790 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 14:37:12.923103 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.923084 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 14:37:12.935812 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.935789 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s"] Apr 24 14:37:12.971423 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2cf912da-55a2-4be1-b630-60258fde33f3-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjppv\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-kube-api-access-kjppv\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971740 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2cf912da-55a2-4be1-b630-60258fde33f3-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971740 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:12.971740 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:12.971722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073086 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073086 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073085 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2cf912da-55a2-4be1-b630-60258fde33f3-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kjppv\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-kube-api-access-kjppv\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2cf912da-55a2-4be1-b630-60258fde33f3-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073325 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073706 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073706 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 
14:37:13.073904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.073991 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.073959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.074052 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.074018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2cf912da-55a2-4be1-b630-60258fde33f3-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.075617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.075581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2cf912da-55a2-4be1-b630-60258fde33f3-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.075704 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.075626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2cf912da-55a2-4be1-b630-60258fde33f3-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.082759 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.082730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.083049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.083024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjppv\" (UniqueName: \"kubernetes.io/projected/2cf912da-55a2-4be1-b630-60258fde33f3-kube-api-access-kjppv\") pod \"router-gateway-1-openshift-default-6c59fbf55c-74r8s\" (UID: \"2cf912da-55a2-4be1-b630-60258fde33f3\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.233084 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.232991 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:13.374487 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.374450 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s"] Apr 24 14:37:13.380316 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:37:13.380282 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cf912da_55a2_4be1_b630_60258fde33f3.slice/crio-45fa0bb3dac6f9a44abf712ea46671641881d173ca67c34315e2de27ff0708d0 WatchSource:0}: Error finding container 45fa0bb3dac6f9a44abf712ea46671641881d173ca67c34315e2de27ff0708d0: Status 404 returned error can't find the container with id 45fa0bb3dac6f9a44abf712ea46671641881d173ca67c34315e2de27ff0708d0 Apr 24 14:37:13.386181 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.386141 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:37:13.386278 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.386230 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:37:13.386319 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:13.386271 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 24 14:37:14.376324 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:14.376289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" event={"ID":"2cf912da-55a2-4be1-b630-60258fde33f3","Type":"ContainerStarted","Data":"be6663bd344677ebfde01a080aed8144252a77450ce945a51a29273032bebb66"} Apr 24 14:37:14.376324 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:14.376326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" event={"ID":"2cf912da-55a2-4be1-b630-60258fde33f3","Type":"ContainerStarted","Data":"45fa0bb3dac6f9a44abf712ea46671641881d173ca67c34315e2de27ff0708d0"} Apr 24 14:37:14.397941 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:14.397895 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" podStartSLOduration=2.397876775 podStartE2EDuration="2.397876775s" podCreationTimestamp="2026-04-24 14:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:37:14.396866985 +0000 UTC m=+802.947709193" watchObservedRunningTime="2026-04-24 14:37:14.397876775 +0000 UTC m=+802.948718999" Apr 24 14:37:15.233596 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:15.233559 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:15.238400 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:15.238377 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:15.379404 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:37:15.379376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:15.380229 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:15.380211 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-74r8s" Apr 24 14:37:20.077775 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.077740 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:37:20.081386 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.081365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.085330 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.085308 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 24 14:37:20.085442 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.085348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-92j8s\"" Apr 24 14:37:20.090439 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.090417 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:37:20.134317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.134479 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crwj\" (UniqueName: \"kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.134479 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.134479 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.134479 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.134635 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.134496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.235838 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.235803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.235853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9crwj\" (UniqueName: \"kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.235880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.235905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.235946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236018 ip-10-0-128-169 kubenswrapper[2572]: 
I0424 14:37:20.235977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236284 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.236249 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236338 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.236307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.236390 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.236341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.238293 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.238268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.238524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.238504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.246354 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.246326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crwj\" (UniqueName: \"kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.392451 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.392365 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:20.517123 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:20.517091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:37:20.520007 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:37:20.519984 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bf00cd9_85bf_48b1_94e7_d6061fb6315b.slice/crio-910eac6cc923bcaf319b0870ccc98a346c5bed5dbbf6988689e84253eaf8f0d0 WatchSource:0}: Error finding container 910eac6cc923bcaf319b0870ccc98a346c5bed5dbbf6988689e84253eaf8f0d0: Status 404 returned error can't find the container with id 910eac6cc923bcaf319b0870ccc98a346c5bed5dbbf6988689e84253eaf8f0d0 Apr 24 14:37:21.400180 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:21.400137 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerStarted","Data":"910eac6cc923bcaf319b0870ccc98a346c5bed5dbbf6988689e84253eaf8f0d0"} Apr 24 14:37:25.415036 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:25.415003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerStarted","Data":"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d"} Apr 24 14:37:29.429068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:29.429031 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerID="b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d" exitCode=0 Apr 24 14:37:29.429449 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:29.429106 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerDied","Data":"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d"} Apr 24 14:37:31.274482 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.274445 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:37:31.277867 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.277849 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.280740 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.280717 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 24 14:37:31.280870 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.280852 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-kjdnd\"" Apr 24 14:37:31.289324 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.289302 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:37:31.439501 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.439471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerStarted","Data":"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8"} Apr 24 14:37:31.447370 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.447530 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.447530 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.447530 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.447711 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.447711 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.447561 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjd4p\" (UniqueName: \"kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.459254 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.459206 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" podStartSLOduration=1.3065680419999999 podStartE2EDuration="11.459191902s" podCreationTimestamp="2026-04-24 14:37:20 +0000 UTC" firstStartedPulling="2026-04-24 14:37:20.521746034 +0000 UTC m=+809.072588183" lastFinishedPulling="2026-04-24 14:37:30.674369877 +0000 UTC m=+819.225212043" observedRunningTime="2026-04-24 14:37:31.457374473 +0000 UTC m=+820.008216656" watchObservedRunningTime="2026-04-24 14:37:31.459191902 +0000 UTC m=+820.010034074" Apr 24 14:37:31.548426 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.548629 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjd4p\" (UniqueName: \"kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.548629 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.548629 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.548629 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.548629 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.549326 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.548957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.549326 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.549181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.549326 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.549198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.549534 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.549405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.551170 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.551144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.556244 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.556219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjd4p\" (UniqueName: \"kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.588904 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.588880 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:37:31.714318 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:31.714294 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:37:31.715960 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:37:31.715934 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9e698c_4573_41f0_af28_cdc03d648778.slice/crio-a61ba2cc0996255a9d51c80aafd628a7ddaea5112b62be5e408faa9e8d6b80d0 WatchSource:0}: Error finding container a61ba2cc0996255a9d51c80aafd628a7ddaea5112b62be5e408faa9e8d6b80d0: Status 404 returned error can't find the container with id a61ba2cc0996255a9d51c80aafd628a7ddaea5112b62be5e408faa9e8d6b80d0 Apr 24 14:37:32.445215 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:32.445165 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerStarted","Data":"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f"} Apr 24 14:37:32.445215 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:32.445222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerStarted","Data":"a61ba2cc0996255a9d51c80aafd628a7ddaea5112b62be5e408faa9e8d6b80d0"} Apr 24 14:37:33.450481 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:33.450438 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9e698c-4573-41f0-af28-cdc03d648778" containerID="3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f" exitCode=0 Apr 24 14:37:33.450958 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:33.450528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerDied","Data":"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f"} Apr 24 14:37:35.461777 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:35.461690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerStarted","Data":"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db"} Apr 24 14:37:40.392702 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:40.392666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:40.393178 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:40.392758 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:40.408910 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:40.408886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:37:40.497769 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:37:40.497741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:38:04.571184 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:04.571144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerStarted","Data":"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1"} Apr 24 14:38:04.571591 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:04.571269 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:04.595862 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:04.595803 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" podStartSLOduration=3.15079197 podStartE2EDuration="33.595784448s" podCreationTimestamp="2026-04-24 14:37:31 +0000 UTC" firstStartedPulling="2026-04-24 14:37:33.451696105 +0000 UTC m=+822.002538255" lastFinishedPulling="2026-04-24 14:38:03.896688583 +0000 UTC m=+852.447530733" observedRunningTime="2026-04-24 14:38:04.592283513 +0000 UTC m=+853.143125686" watchObservedRunningTime="2026-04-24 14:38:04.595784448 +0000 UTC m=+853.146626624" Apr 24 14:38:05.577351 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:05.577319 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:11.589868 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:11.589820 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:11.589868 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:11.589877 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:11.591495 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:11.591470 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:11.596035 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:11.596014 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:38:52.355502 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:52.355415 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:38:52.356250 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:38:52.356234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:39:22.982921 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:39:22.982882 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:39:22.983537 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:22.983192 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="main" containerID="cri-o://ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db" gracePeriod=30 Apr 24 14:39:22.983537 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:22.983258 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="tokenizer" containerID="cri-o://f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1" gracePeriod=30 Apr 24 14:39:23.829900 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:23.829864 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9e698c-4573-41f0-af28-cdc03d648778" containerID="ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db" exitCode=0 Apr 24 14:39:23.830081 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:23.829943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerDied","Data":"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db"} Apr 24 14:39:24.129428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.129403 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:39:24.319704 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319664 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.319898 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319738 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.319898 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319784 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.319898 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319825 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.319898 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319878 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.320103 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.319908 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjd4p\" (UniqueName: \"kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p\") pod \"5c9e698c-4573-41f0-af28-cdc03d648778\" (UID: \"5c9e698c-4573-41f0-af28-cdc03d648778\") " Apr 24 14:39:24.320150 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320016 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:39:24.320150 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320118 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:39:24.320150 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320041 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:39:24.320285 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320196 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.320285 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320214 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.320285 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320231 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.320544 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.320518 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:39:24.322044 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.322025 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:39:24.322120 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.322097 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p" (OuterVolumeSpecName: "kube-api-access-pjd4p") pod "5c9e698c-4573-41f0-af28-cdc03d648778" (UID: "5c9e698c-4573-41f0-af28-cdc03d648778"). InnerVolumeSpecName "kube-api-access-pjd4p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:39:24.421496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.421430 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9e698c-4573-41f0-af28-cdc03d648778-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.421496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.421458 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9e698c-4573-41f0-af28-cdc03d648778-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.421496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.421470 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjd4p\" (UniqueName: \"kubernetes.io/projected/5c9e698c-4573-41f0-af28-cdc03d648778-kube-api-access-pjd4p\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:39:24.835095 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.835058 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c9e698c-4573-41f0-af28-cdc03d648778" containerID="f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1" exitCode=0 Apr 24 14:39:24.835272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.835134 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" Apr 24 14:39:24.835272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.835145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerDied","Data":"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1"} Apr 24 14:39:24.835272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.835181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s" event={"ID":"5c9e698c-4573-41f0-af28-cdc03d648778","Type":"ContainerDied","Data":"a61ba2cc0996255a9d51c80aafd628a7ddaea5112b62be5e408faa9e8d6b80d0"} Apr 24 14:39:24.835272 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.835199 2572 scope.go:117] "RemoveContainer" containerID="f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1" Apr 24 14:39:24.843334 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.843315 2572 scope.go:117] "RemoveContainer" containerID="ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db" Apr 24 14:39:24.851956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.851932 2572 scope.go:117] "RemoveContainer" containerID="3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f" Apr 24 14:39:24.857538 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.857515 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:39:24.860666 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.860643 2572 scope.go:117] "RemoveContainer" containerID="f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1" Apr 24 14:39:24.860946 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:39:24.860925 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1\": container with ID starting with f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1 not found: ID does not exist" containerID="f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1" Apr 24 14:39:24.861011 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.860955 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1"} err="failed to get container status \"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1\": rpc error: code = NotFound desc = could not find container \"f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1\": container with ID starting with f5b7ee808450e5db921eb814ef379cb1fa6bacaa9cd97933ca89a489303210b1 not found: ID does not exist" Apr 24 14:39:24.861011 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.860974 2572 scope.go:117] "RemoveContainer" containerID="ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db" Apr 24 14:39:24.861198 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:39:24.861182 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db\": container with ID starting with ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db not found: ID does not exist" containerID="ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db" Apr 24 14:39:24.861248 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.861204 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db"} err="failed to get container status \"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db\": rpc error: code = NotFound desc = could not find container \"ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db\": container with ID starting with ee0a1c08abba43c354e9b581b76b1e17b55cf03197fc587959df8daf2c9869db not found: ID does not exist" Apr 24 14:39:24.861248 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.861217 2572 scope.go:117] "RemoveContainer" containerID="3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f" Apr 24 14:39:24.861469 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:39:24.861453 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f\": container with ID starting with 3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f not found: ID does not exist" containerID="3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f" Apr 24 14:39:24.861515 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:24.861471 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f"} err="failed to get container status \"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f\": rpc error: code = NotFound desc = could not find container \"3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f\": container with ID starting with 3446602bde2c5261e017454859b338c99a94c8946e5050e0116f4a40ce1c5f6f not found: ID does not exist" Apr 24 14:39:24.861780 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:39:24.861763 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74dc59cd9s"] Apr 24 14:39:26.057394 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:26.057349 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" path="/var/lib/kubelet/pods/5c9e698c-4573-41f0-af28-cdc03d648778/volumes" Apr 24 14:39:28.835039 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835005 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835323 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="tokenizer" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835335 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="tokenizer" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835345 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="storage-initializer" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835350 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="storage-initializer" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835366 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="main" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835371 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="main" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835421 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="tokenizer" Apr 24 14:39:28.835500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.835432 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9e698c-4573-41f0-af28-cdc03d648778" containerName="main" Apr 24 14:39:28.840489 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.840469 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.843361 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.843337 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 14:39:28.843531 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.843511 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-x4cvm\"" Apr 24 14:39:28.850493 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.850470 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:39:28.959790 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.959987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.959987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf98l\" (UniqueName: \"kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.959987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.959987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959901 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:28.959987 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:28.959959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.060620 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.060802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.060802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.060802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.060802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf98l\" (UniqueName: \"kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.061086 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.060815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.061206 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.061148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.061206 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.061179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.061348 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.061327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.061396 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.061335 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.063189 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.063171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.068788 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.068766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf98l\" (UniqueName: \"kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.151458 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.151381 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:29.275126 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.275099 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:39:29.276999 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:39:29.276967 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda6d9043_24a0_4e30_9fa4_a1ecc58e8007.slice/crio-69d4f82ecb60ca2d6f59860b999b0d942a7ef381b71b32e4408b9825ef12af96 WatchSource:0}: Error finding container 69d4f82ecb60ca2d6f59860b999b0d942a7ef381b71b32e4408b9825ef12af96: Status 404 returned error can't find the container with id 69d4f82ecb60ca2d6f59860b999b0d942a7ef381b71b32e4408b9825ef12af96 Apr 24 14:39:29.855488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.855455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerStarted","Data":"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931"} Apr 24 14:39:29.855488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:29.855490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerStarted","Data":"69d4f82ecb60ca2d6f59860b999b0d942a7ef381b71b32e4408b9825ef12af96"} Apr 24 14:39:30.859453 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:30.859417 2572 generic.go:358] "Generic (PLEG): container finished" podID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerID="5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931" exitCode=0 Apr 24 14:39:30.859831 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:30.859504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerDied","Data":"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931"} Apr 24 14:39:31.864331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:31.864301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerStarted","Data":"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf"} Apr 24 14:39:31.864331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:31.864333 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerStarted","Data":"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5"} Apr 24 14:39:31.864795 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:31.864469 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:31.893024 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:31.887860 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" 
podStartSLOduration=3.887843792 podStartE2EDuration="3.887843792s" podCreationTimestamp="2026-04-24 14:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:39:31.885585581 +0000 UTC m=+940.436427753" watchObservedRunningTime="2026-04-24 14:39:31.887843792 +0000 UTC m=+940.438685968" Apr 24 14:39:39.151635 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:39.151579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:39.151635 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:39.151642 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:39.154255 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:39.154232 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:39.893267 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:39.893233 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:39:59.981586 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:59.981556 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:39:59.982181 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:39:59.981874 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="main" containerID="cri-o://6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8" gracePeriod=30 Apr 24 14:40:00.229751 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.229728 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:40:00.306639 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306584 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306775 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306649 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306775 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306686 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306775 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306710 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306775 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306740 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crwj\" (UniqueName: \"kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306986 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306799 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home\") pod \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\" (UID: \"4bf00cd9-85bf-48b1-94e7-d6061fb6315b\") " Apr 24 14:40:00.306986 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.306949 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache" (OuterVolumeSpecName: "model-cache") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:00.307084 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.307048 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.307173 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.307150 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home" (OuterVolumeSpecName: "home") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:00.308786 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.308760 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm" (OuterVolumeSpecName: "dshm") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:00.308881 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.308856 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:40:00.308922 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.308873 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj" (OuterVolumeSpecName: "kube-api-access-9crwj") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "kube-api-access-9crwj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:40:00.367745 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.367714 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4bf00cd9-85bf-48b1-94e7-d6061fb6315b" (UID: "4bf00cd9-85bf-48b1-94e7-d6061fb6315b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:00.407791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.407767 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.407791 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.407790 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.407930 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.407800 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.407930 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.407810 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9crwj\" (UniqueName: \"kubernetes.io/projected/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-kube-api-access-9crwj\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.407930 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.407820 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4bf00cd9-85bf-48b1-94e7-d6061fb6315b-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:00.896566 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.896536 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:40:00.968451 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.968412 2572 generic.go:358] "Generic (PLEG): container finished" podID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerID="6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8" exitCode=0 Apr 24 14:40:00.968658 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.968496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerDied","Data":"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8"} Apr 24 14:40:00.968658 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.968511 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" Apr 24 14:40:00.968658 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.968538 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw" event={"ID":"4bf00cd9-85bf-48b1-94e7-d6061fb6315b","Type":"ContainerDied","Data":"910eac6cc923bcaf319b0870ccc98a346c5bed5dbbf6988689e84253eaf8f0d0"} Apr 24 14:40:00.968658 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.968558 2572 scope.go:117] "RemoveContainer" containerID="6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8" Apr 24 14:40:00.977061 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.977046 2572 scope.go:117] "RemoveContainer" containerID="b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d" Apr 24 14:40:00.990577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.990552 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:40:00.991469 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.991383 2572 scope.go:117] "RemoveContainer" containerID="6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8" Apr 24 14:40:00.991749 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:40:00.991714 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8\": container with ID starting with 6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8 not found: ID does not exist" containerID="6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8" Apr 24 14:40:00.991861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.991751 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8"} err="failed to get container status \"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8\": rpc error: code = NotFound desc = could not find container \"6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8\": container with ID starting with 6ab4134eee2b375bc5e18473bd68556102e4cb026880d05786a2ad73e1e416f8 not found: ID does not exist" Apr 24 14:40:00.991861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.991810 2572 scope.go:117] "RemoveContainer" containerID="b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d" Apr 24 14:40:00.992105 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:40:00.992080 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d\": container with ID starting with b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d not found: ID does not exist" containerID="b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d" Apr 24 14:40:00.992234 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.992204 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d"} err="failed to get container status \"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d\": rpc error: code = NotFound desc = could not find container 
\"b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d\": container with ID starting with b176566a6b307c5254a0202fcff60ac5488dafbca0259e427847ee26f8cf649d not found: ID does not exist" Apr 24 14:40:00.994331 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:00.994311 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7946664fcfhhhfw"] Apr 24 14:40:02.058977 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:02.058942 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" path="/var/lib/kubelet/pods/4bf00cd9-85bf-48b1-94e7-d6061fb6315b/volumes" Apr 24 14:40:13.092467 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.092390 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:40:13.092960 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.092940 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="main" Apr 24 14:40:13.092960 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.092963 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="main" Apr 24 14:40:13.093092 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.092983 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="storage-initializer" Apr 24 14:40:13.093092 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.092993 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="storage-initializer" Apr 24 14:40:13.093092 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.093082 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4bf00cd9-85bf-48b1-94e7-d6061fb6315b" containerName="main" Apr 24 14:40:13.102795 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.102772 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.105802 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.105778 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 14:40:13.107829 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.107804 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:40:13.207905 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.207866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.208104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.207915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.208104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.208041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.208104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.208078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.208229 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.208158 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.208229 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.208179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq524\" (UniqueName: \"kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.308623 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.308787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.308787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.308787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.308787 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq524\" (UniqueName: \"kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.309064 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.309064 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.308980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.309064 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:40:13.309055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.309220 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.309146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.311057 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.311035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.311221 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.311203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.317019 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.316996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq524\" (UniqueName: \"kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.407498 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.407418 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:40:13.411863 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.411840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.414419 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.414400 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:40:13.414534 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.414458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-654s6\"" Apr 24 14:40:13.421163 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.421138 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:40:13.511198 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.511352 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.511352 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d9n4\" (UniqueName: \"kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.511352 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.511515 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.511515 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.511434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: 
\"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.541241 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.541217 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:40:13.542813 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:40:13.542787 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72563c0_f4e4_4a46_bab6_bae209cceaa5.slice/crio-3fa9750e68e68f51d8488397d9c2e2d3e9782df3ff09734267d732b3d9a4a3e8 WatchSource:0}: Error finding container 3fa9750e68e68f51d8488397d9c2e2d3e9782df3ff09734267d732b3d9a4a3e8: Status 404 returned error can't find the container with id 3fa9750e68e68f51d8488397d9c2e2d3e9782df3ff09734267d732b3d9a4a3e8 Apr 24 14:40:13.611955 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.611921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.611978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d9n4\" (UniqueName: \"kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612059 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612185 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612239 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612430 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612560 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.612649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.612594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.614230 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.614213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.620072 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.620050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d9n4\" (UniqueName: \"kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.737485 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.737413 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:13.866108 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:13.866075 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:40:13.868506 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:40:13.868482 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod288f4689_d5e7_4cbd_80f7_c06c86d14d26.slice/crio-57ace546c79f277b12f18ddf86ba01ebe1cb8668ecee267b70616264e3ac5a08 WatchSource:0}: Error finding container 57ace546c79f277b12f18ddf86ba01ebe1cb8668ecee267b70616264e3ac5a08: Status 404 returned error can't find the container with id 57ace546c79f277b12f18ddf86ba01ebe1cb8668ecee267b70616264e3ac5a08 Apr 24 14:40:14.019799 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:14.019695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerStarted","Data":"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269"} Apr 24 14:40:14.019799 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:14.019740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerStarted","Data":"57ace546c79f277b12f18ddf86ba01ebe1cb8668ecee267b70616264e3ac5a08"} Apr 24 14:40:14.021208 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:14.021173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerStarted","Data":"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed"} Apr 24 14:40:14.021348 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:14.021213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerStarted","Data":"3fa9750e68e68f51d8488397d9c2e2d3e9782df3ff09734267d732b3d9a4a3e8"} Apr 24 14:40:15.026653 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:15.026549 2572 generic.go:358] "Generic (PLEG): container finished" podID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerID="57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269" exitCode=0 Apr 24 14:40:15.026653 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:15.026640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerDied","Data":"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269"} Apr 24 14:40:16.039016 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:16.038973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerStarted","Data":"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4"} Apr 24 14:40:16.039016 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:16.039016 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerStarted","Data":"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6"} Apr 24 14:40:16.039486 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:16.039300 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:16.060766 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:16.060711 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" podStartSLOduration=3.060694675 podStartE2EDuration="3.060694675s" podCreationTimestamp="2026-04-24 14:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:40:16.059338101 +0000 UTC m=+984.610180271" watchObservedRunningTime="2026-04-24 14:40:16.060694675 +0000 UTC m=+984.611536846" Apr 24 14:40:23.738591 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:23.738556 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:23.739017 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:23.738599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:23.741421 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:23.741395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:24.070551 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:24.070523 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:45.073507 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:45.073479 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:40:51.820727 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:51.820692 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:40:51.821138 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:51.821094 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="main" containerID="cri-o://4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5" gracePeriod=30 Apr 24 14:40:51.821211 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:51.821151 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="tokenizer" containerID="cri-o://e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf" gracePeriod=30 Apr 24 14:40:52.163577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:52.163483 2572 generic.go:358] "Generic 
(PLEG): container finished" podID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerID="4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5" exitCode=0 Apr 24 14:40:52.163577 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:52.163553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerDied","Data":"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5"} Apr 24 14:40:53.074018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.073991 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:40:53.164780 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164692 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.164780 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164743 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.164780 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164763 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf98l\" (UniqueName: \"kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.165049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164818 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.165049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.165049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.164869 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds\") pod \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\" (UID: \"da6d9043-24a0-4e30-9fa4-a1ecc58e8007\") " Apr 24 14:40:53.165208 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.165169 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:53.165387 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.165335 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:53.165387 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.165366 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:53.165644 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.165623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:40:53.167048 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.167006 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l" (OuterVolumeSpecName: "kube-api-access-zf98l") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "kube-api-access-zf98l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:40:53.167165 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.167060 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "da6d9043-24a0-4e30-9fa4-a1ecc58e8007" (UID: "da6d9043-24a0-4e30-9fa4-a1ecc58e8007"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:40:53.168532 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.168510 2572 generic.go:358] "Generic (PLEG): container finished" podID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerID="e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf" exitCode=0 Apr 24 14:40:53.168649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.168554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerDied","Data":"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf"} Apr 24 14:40:53.168649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.168581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" event={"ID":"da6d9043-24a0-4e30-9fa4-a1ecc58e8007","Type":"ContainerDied","Data":"69d4f82ecb60ca2d6f59860b999b0d942a7ef381b71b32e4408b9825ef12af96"} Apr 24 14:40:53.168649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.168585 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9" Apr 24 14:40:53.168649 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.168596 2572 scope.go:117] "RemoveContainer" containerID="e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf" Apr 24 14:40:53.182793 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.182778 2572 scope.go:117] "RemoveContainer" containerID="4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5" Apr 24 14:40:53.189572 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.189551 2572 scope.go:117] "RemoveContainer" containerID="5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931" Apr 24 14:40:53.195357 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.195337 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:40:53.196825 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.196807 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-fc6d4779jjd9"] Apr 24 14:40:53.198225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.198209 2572 scope.go:117] "RemoveContainer" containerID="e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf" Apr 24 14:40:53.198464 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:40:53.198444 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf\": container with ID starting with e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf not found: ID does not exist" containerID="e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf" Apr 24 14:40:53.198527 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.198471 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf"} err="failed to get container status \"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf\": rpc error: code = NotFound desc = could not find container \"e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf\": container with ID starting with e29b431a5574f9e69653ba3921b3a7ba0df1067d7a2c08fa101dfdbf3bd1bfcf not found: ID does not exist" Apr 24 14:40:53.198527 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.198488 2572 scope.go:117] "RemoveContainer" containerID="4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5" Apr 24 14:40:53.198758 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:40:53.198741 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5\": container with ID starting with 4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5 not found: ID does not exist" containerID="4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5" Apr 24 14:40:53.198811 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.198762 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5"} err="failed to get container status \"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5\": rpc error: code = NotFound desc = could not find container 
\"4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5\": container with ID starting with 4a46c2aaf6a9caf41db4591e9cfb22cb4f3978c72a0c125fc7aa66ac3dcdf0e5 not found: ID does not exist" Apr 24 14:40:53.198811 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.198778 2572 scope.go:117] "RemoveContainer" containerID="5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931" Apr 24 14:40:53.199003 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:40:53.198985 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931\": container with ID starting with 5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931 not found: ID does not exist" containerID="5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931" Apr 24 14:40:53.199049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.199012 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931"} err="failed to get container status \"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931\": rpc error: code = NotFound desc = could not find container \"5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931\": container with ID starting with 5ec40fff5c5bbe97d35dc93fe94ed5b0166528b8dcb280109faf57e09fe12931 not found: ID does not exist" Apr 24 14:40:53.265677 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265649 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:53.265677 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265673 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:53.265677 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265683 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:53.265861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265691 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:53.265861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265701 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:53.265861 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:53.265711 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zf98l\" (UniqueName: \"kubernetes.io/projected/da6d9043-24a0-4e30-9fa4-a1ecc58e8007-kube-api-access-zf98l\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:40:54.058966 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:40:54.058928 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" path="/var/lib/kubelet/pods/da6d9043-24a0-4e30-9fa4-a1ecc58e8007/volumes" Apr 24 14:42:04.404385 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:04.404353 2572 generic.go:358] "Generic (PLEG): container finished" podID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerID="923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed" exitCode=0 Apr 24 14:42:04.404815 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:04.404426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerDied","Data":"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed"} Apr 24 14:42:04.405591 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:04.405576 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:42:49.570792 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:49.570757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerStarted","Data":"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187"} Apr 24 14:42:49.592633 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:49.592540 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podStartSLOduration=111.551708297 podStartE2EDuration="2m36.592518115s" podCreationTimestamp="2026-04-24 14:40:13 +0000 UTC" firstStartedPulling="2026-04-24 14:42:04.405715551 +0000 UTC m=+1092.956557705" lastFinishedPulling="2026-04-24 14:42:49.446525364 +0000 UTC m=+1137.997367523" observedRunningTime="2026-04-24 14:42:49.590521896 +0000 UTC m=+1138.141364094" watchObservedRunningTime="2026-04-24 14:42:49.592518115 +0000 UTC m=+1138.143360288" Apr 24 14:42:53.414798 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:53.414759 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:42:53.415233 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:53.414920 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:42:53.416496 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:42:53.416461 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:03.415741 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:03.415691 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:13.415488 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:13.415393 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:23.415619 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:23.415561 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:33.415439 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:33.415397 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:43.414995 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:43.414950 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:43:52.377126 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:52.377093 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:43:52.379231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:52.379196 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:43:53.415206 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:43:53.415166 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:44:03.415766 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:03.415722 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:44:13.415122 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:13.415072 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 24 14:44:23.424625 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:23.424576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 
14:44:23.435475 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:23.435454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:44:37.166588 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.166556 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:44:37.167290 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.166932 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" containerID="cri-o://e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187" gracePeriod=30 Apr 24 14:44:37.170782 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.170756 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:44:37.171156 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.171129 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="main" containerID="cri-o://2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6" gracePeriod=30 Apr 24 14:44:37.171281 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.171153 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="tokenizer" containerID="cri-o://4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4" gracePeriod=30 Apr 24 14:44:37.948001 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.947966 2572 generic.go:358] "Generic (PLEG): container finished" podID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerID="2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6" exitCode=0 Apr 24 14:44:37.948197 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:37.948050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerDied","Data":"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6"} Apr 24 14:44:38.420279 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.420254 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:44:38.457015 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.456990 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457177 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457032 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457177 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457060 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d9n4\" (UniqueName: \"kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457177 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457118 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457177 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457133 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457392 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457185 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location\") pod \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\" (UID: \"288f4689-d5e7-4cbd-80f7-c06c86d14d26\") " Apr 24 14:44:38.457454 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457424 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:44:38.457565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457324 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:44:38.457656 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457554 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:44:38.458018 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.457992 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:44:38.459283 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.459257 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:44:38.459370 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.459322 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4" (OuterVolumeSpecName: "kube-api-access-5d9n4") pod "288f4689-d5e7-4cbd-80f7-c06c86d14d26" (UID: "288f4689-d5e7-4cbd-80f7-c06c86d14d26"). InnerVolumeSpecName "kube-api-access-5d9n4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:44:38.558450 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558425 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.558450 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558448 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.558617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558458 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.558617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558469 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.558617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558478 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/288f4689-d5e7-4cbd-80f7-c06c86d14d26-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.558617 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.558487 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5d9n4\" (UniqueName: \"kubernetes.io/projected/288f4689-d5e7-4cbd-80f7-c06c86d14d26-kube-api-access-5d9n4\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:44:38.953327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.953241 2572 generic.go:358] "Generic (PLEG): container finished" podID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerID="4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4" exitCode=0 Apr 24 14:44:38.953327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.953290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerDied","Data":"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4"} Apr 24 14:44:38.953327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.953313 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" Apr 24 14:44:38.953327 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.953326 2572 scope.go:117] "RemoveContainer" containerID="4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4" Apr 24 14:44:38.953674 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.953316 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz" event={"ID":"288f4689-d5e7-4cbd-80f7-c06c86d14d26","Type":"ContainerDied","Data":"57ace546c79f277b12f18ddf86ba01ebe1cb8668ecee267b70616264e3ac5a08"} Apr 24 14:44:38.961992 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.961974 2572 scope.go:117] "RemoveContainer" containerID="2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6" Apr 24 14:44:38.969068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.969050 2572 scope.go:117] "RemoveContainer" containerID="57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269" Apr 24 14:44:38.975328 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.975305 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:44:38.975809 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.975795 2572 scope.go:117] "RemoveContainer" containerID="4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4" Apr 24 14:44:38.976033 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:44:38.976014 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4\": container with ID starting with 4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4 not found: ID does not exist" containerID="4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4" Apr 24 14:44:38.976105 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.976045 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4"} err="failed to get container status \"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4\": rpc error: code = NotFound desc = could not find container \"4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4\": container with ID starting with 4bbfb79dd06fa12abf3b63193e644ca542067eb7226237c8190ca0f227d0c0f4 not found: ID does not exist" Apr 24 14:44:38.976105 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.976070 2572 scope.go:117] "RemoveContainer" containerID="2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6" Apr 24 14:44:38.976395 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:44:38.976358 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6\": container with ID starting with 2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6 not found: ID does not exist" containerID="2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6" Apr 24 14:44:38.976498 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.976390 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6"} err="failed to get 
container status \"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6\": rpc error: code = NotFound desc = could not find container \"2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6\": container with ID starting with 2ef10949b674e1a91eb07689564639bbe5e1e9613f9846e777ff9eadf2a026e6 not found: ID does not exist" Apr 24 14:44:38.976498 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.976411 2572 scope.go:117] "RemoveContainer" containerID="57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269" Apr 24 14:44:38.976808 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:44:38.976775 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269\": container with ID starting with 57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269 not found: ID does not exist" containerID="57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269" Apr 24 14:44:38.976903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.976807 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269"} err="failed to get container status \"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269\": rpc error: code = NotFound desc = could not find container \"57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269\": container with ID starting with 57f8f8ce5738b6691b94cf0b825713485a212b26f185e2bfde966e3e68c83269 not found: ID does not exist" Apr 24 14:44:38.978449 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:38.978426 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schekbxkz"] Apr 24 14:44:40.058214 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:40.058178 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" path="/var/lib/kubelet/pods/288f4689-d5e7-4cbd-80f7-c06c86d14d26/volumes" Apr 24 14:44:44.063054 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063019 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:44:44.063543 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063525 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="tokenizer" Apr 24 14:44:44.063584 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063546 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="tokenizer" Apr 24 14:44:44.063584 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063566 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="storage-initializer" Apr 24 14:44:44.063584 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063576 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="storage-initializer" Apr 24 14:44:44.063705 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063585 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="main" Apr 24 14:44:44.063705 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063595 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="main" Apr 24 14:44:44.063782 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063769 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="storage-initializer" Apr 24 14:44:44.063818 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063784 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="storage-initializer" Apr 24 14:44:44.063818 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063806 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="main" Apr 24 14:44:44.063818 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063816 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="main" Apr 24 14:44:44.063903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063836 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="tokenizer" Apr 24 14:44:44.063903 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063845 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="tokenizer" Apr 24 14:44:44.063963 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063933 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="main" Apr 24 14:44:44.063963 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063947 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="main" Apr 24 14:44:44.063963 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063958 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="288f4689-d5e7-4cbd-80f7-c06c86d14d26" containerName="tokenizer" Apr 24 14:44:44.064049 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.063969 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="da6d9043-24a0-4e30-9fa4-a1ecc58e8007" containerName="tokenizer" Apr 24 14:44:44.067673 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.067649 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.070150 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.070099 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 14:44:44.078066 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.078043 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:44:44.203134 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.203294 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203140 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.203294 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.203294 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.203463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vr66\" (UniqueName: \"kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.203463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.203328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304017 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.303984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304017 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304245 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vr66\" (UniqueName: \"kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304245 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304245 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304245 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304646 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304646 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.304639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.304846 ip-10-0-128-169 kubenswrapper[2572]: I0424 
14:44:44.304682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.306287 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.306267 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.306557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.306539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.312317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.312289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vr66\" (UniqueName: \"kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66\") pod \"custom-route-timeout-test-kserve-6cbfb55898-jn9bs\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.381474 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.381401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:44.417343 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.416701 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:44:44.422500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.422472 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.425164 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.425138 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-d6jwd\"" Apr 24 14:44:44.436270 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.436222 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:44:44.506094 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.506240 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.506240 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.506240 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp4pl\" (UniqueName: \"kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.506342 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.506342 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.506278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.511262 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.511234 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:44:44.514362 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:44:44.514337 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686331d9_1c58_4e27_9ccc_935c2ebd5b26.slice/crio-a2ae0a8b781e0738d5873bff6b39ade3a41f10750642151d7eda253b63b6bd90 WatchSource:0}: Error finding container a2ae0a8b781e0738d5873bff6b39ade3a41f10750642151d7eda253b63b6bd90: Status 404 returned error can't find the container with id a2ae0a8b781e0738d5873bff6b39ade3a41f10750642151d7eda253b63b6bd90 Apr 24 14:44:44.607423 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607397 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607553 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607553 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp4pl\" (UniqueName: \"kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607553 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607553 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607796 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607875 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.607956 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.608023 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.608023 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.607978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.609826 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.609808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.615630 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.615596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp4pl\" (UniqueName: \"kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl\") pod \"custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.736937 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.736855 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:44.867391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.867365 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:44:44.869887 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:44:44.869857 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a8a4e3_b1b6_4087_ac98_0b472e7b57a8.slice/crio-b03ee372a07b33ec65ff40a3e0f31827f275ef5756bd044aa64ae602f83e637e WatchSource:0}: Error finding container b03ee372a07b33ec65ff40a3e0f31827f275ef5756bd044aa64ae602f83e637e: Status 404 returned error can't find the container with id b03ee372a07b33ec65ff40a3e0f31827f275ef5756bd044aa64ae602f83e637e Apr 24 14:44:44.980889 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.980849 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerStarted","Data":"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833"} Apr 24 14:44:44.980889 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.980895 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerStarted","Data":"b03ee372a07b33ec65ff40a3e0f31827f275ef5756bd044aa64ae602f83e637e"} Apr 24 14:44:44.982340 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.982314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerStarted","Data":"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6"} Apr 24 14:44:44.982340 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:44.982343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerStarted","Data":"a2ae0a8b781e0738d5873bff6b39ade3a41f10750642151d7eda253b63b6bd90"} Apr 24 14:44:45.987590 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:45.987497 2572 generic.go:358] "Generic (PLEG): container finished" podID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerID="7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833" exitCode=0 Apr 24 14:44:45.987962 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:45.987640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerDied","Data":"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833"} Apr 24 14:44:46.995750 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:46.995715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerStarted","Data":"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8"} Apr 24 14:44:46.995750 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:46.995755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerStarted","Data":"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c"} Apr 24 14:44:46.996306 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:46.995845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:47.021754 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:47.021689 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" podStartSLOduration=3.021673012 podStartE2EDuration="3.021673012s" podCreationTimestamp="2026-04-24 14:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:44:47.018886288 +0000 UTC m=+1255.569728484" watchObservedRunningTime="2026-04-24 14:44:47.021673012 +0000 UTC m=+1255.572515183" Apr 24 14:44:49.004655 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:49.004554 2572 generic.go:358] "Generic (PLEG): container finished" podID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerID="a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6" exitCode=0 Apr 24 14:44:49.004655 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:49.004612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerDied","Data":"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6"} Apr 24 14:44:50.009843 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:50.009812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerStarted","Data":"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9"} Apr 24 14:44:50.031227 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:50.031169 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podStartSLOduration=6.031148901 podStartE2EDuration="6.031148901s" podCreationTimestamp="2026-04-24 14:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:44:50.030127028 +0000 UTC m=+1258.580969236" watchObservedRunningTime="2026-04-24 14:44:50.031148901 +0000 UTC m=+1258.581991074" Apr 24 14:44:54.382179 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.382143 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:54.382179 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.382181 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:44:54.383765 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.383738 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:44:54.737442 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.737354 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:54.737442 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.737398 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:54.740173 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:54.740145 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:44:55.030252 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:44:55.030223 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:45:04.382158 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:04.382102 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:45:07.403504 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.403476 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz_f72563c0-f4e4-4a46-bab6-bae209cceaa5/main/0.log" Apr 24 14:45:07.403866 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.403848 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:45:07.513146 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513056 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq524\" (UniqueName: \"kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513146 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513101 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513393 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513218 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513393 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513237 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513393 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513393 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513295 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs\") pod \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\" (UID: \"f72563c0-f4e4-4a46-bab6-bae209cceaa5\") " Apr 24 14:45:07.513593 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513519 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache" (OuterVolumeSpecName: "model-cache") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:07.513726 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.513695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home" (OuterVolumeSpecName: "home") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:07.515420 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.515385 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524" (OuterVolumeSpecName: "kube-api-access-nq524") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "kube-api-access-nq524". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:45:07.515859 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.515837 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:45:07.515963 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.515855 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm" (OuterVolumeSpecName: "dshm") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:07.575183 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.575131 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f72563c0-f4e4-4a46-bab6-bae209cceaa5" (UID: "f72563c0-f4e4-4a46-bab6-bae209cceaa5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:45:07.614317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614284 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nq524\" (UniqueName: \"kubernetes.io/projected/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kube-api-access-nq524\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:07.614317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614317 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:07.614457 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614328 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:07.614457 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614337 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:07.614457 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614347 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f72563c0-f4e4-4a46-bab6-bae209cceaa5-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:07.614457 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:07.614354 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f72563c0-f4e4-4a46-bab6-bae209cceaa5-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:45:08.072336 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072309 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz_f72563c0-f4e4-4a46-bab6-bae209cceaa5/main/0.log" Apr 24 14:45:08.072679 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072655 2572 generic.go:358] "Generic (PLEG): container finished" podID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerID="e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187" exitCode=137 Apr 24 14:45:08.072774 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072716 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" Apr 24 14:45:08.072774 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072735 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerDied","Data":"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187"} Apr 24 14:45:08.072892 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072775 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz" event={"ID":"f72563c0-f4e4-4a46-bab6-bae209cceaa5","Type":"ContainerDied","Data":"3fa9750e68e68f51d8488397d9c2e2d3e9782df3ff09734267d732b3d9a4a3e8"} Apr 24 14:45:08.072892 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.072791 2572 scope.go:117] "RemoveContainer" containerID="e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187" Apr 24 14:45:08.082938 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.082916 2572 scope.go:117] "RemoveContainer" containerID="923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed" Apr 24 14:45:08.092293 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.092266 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:45:08.094768 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.094746 2572 scope.go:117] "RemoveContainer" containerID="e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187" Apr 24 14:45:08.095065 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:45:08.095045 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187\": container with ID starting with e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187 not found: ID does not exist" containerID="e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187" Apr 24 14:45:08.095144 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.095079 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187"} err="failed to get container status \"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187\": rpc error: code = NotFound desc = could not find container \"e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187\": container with ID starting with e49a168be200f4b9951d237fb285de79ee88b64900df185133a5d36b1e349187 not found: ID does not exist" Apr 24 14:45:08.095144 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.095104 2572 scope.go:117] "RemoveContainer" containerID="923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed" Apr 24 14:45:08.095375 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:45:08.095354 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed\": container with ID starting with 923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed not found: ID does not exist" containerID="923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed" Apr 24 14:45:08.095491 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.095379 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed"} err="failed to get container status \"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed\": rpc error: code = NotFound desc = could not find container \"923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed\": container with ID starting with 923794b55bd92be14450a7746a75a5c5c8890034bcc42c73acf83bbe18c452ed not found: ID does not exist" Apr 24 14:45:08.095558 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:08.095494 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-fbc44c7dd-k98sz"] Apr 24 14:45:10.060668 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:10.060630 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" path="/var/lib/kubelet/pods/f72563c0-f4e4-4a46-bab6-bae209cceaa5/volumes" Apr 24 14:45:14.382662 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:14.382583 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:45:16.034230 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:16.034199 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:45:24.382551 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:24.382508 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:45:34.382354 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:34.382312 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:45:44.382712 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:44.382660 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:45:54.382808 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:45:54.382756 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:46:04.382163 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:04.382114 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:46:14.382633 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:14.382504 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 14:46:24.392417 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:24.392387 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:46:24.400231 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:24.400209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:46:29.908823 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:29.908782 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:46:29.909307 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:29.909085 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="main" containerID="cri-o://c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c" gracePeriod=30 Apr 24 14:46:29.909307 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:29.909141 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="tokenizer" containerID="cri-o://4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8" gracePeriod=30 Apr 24 14:46:29.914422 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:29.914398 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:46:29.914760 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:29.914720 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" containerID="cri-o://541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9" gracePeriod=30 Apr 24 14:46:30.346733 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:30.346688 2572 generic.go:358] "Generic (PLEG): container finished" podID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerID="c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c" exitCode=0 Apr 24 14:46:30.346891 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:30.346758 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerDied","Data":"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c"} Apr 24 14:46:31.066449 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.066426 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:46:31.115428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115401 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115432 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115453 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp4pl\" (UniqueName: \"kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115480 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115521 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115556 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location\") pod \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\" (UID: \"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8\") " Apr 24 14:46:31.115854 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115675 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:46:31.115854 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115828 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:46:31.115967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115925 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:46:31.115967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115944 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.116067 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.115966 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.116355 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.116326 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:46:31.117575 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.117552 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:46:31.117743 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.117724 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl" (OuterVolumeSpecName: "kube-api-access-mp4pl") pod "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" (UID: "a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8"). InnerVolumeSpecName "kube-api-access-mp4pl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:46:31.217286 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.217226 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.217286 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.217251 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.217286 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.217262 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.217286 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.217271 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mp4pl\" (UniqueName: \"kubernetes.io/projected/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8-kube-api-access-mp4pl\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:46:31.351895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.351862 2572 generic.go:358] "Generic (PLEG): container finished" podID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerID="4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8" exitCode=0 Apr 24 14:46:31.352075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.351906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerDied","Data":"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8"} Apr 24 14:46:31.352075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.351949 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" event={"ID":"a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8","Type":"ContainerDied","Data":"b03ee372a07b33ec65ff40a3e0f31827f275ef5756bd044aa64ae602f83e637e"} Apr 24 14:46:31.352075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.351953 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns" Apr 24 14:46:31.352075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.351966 2572 scope.go:117] "RemoveContainer" containerID="4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8" Apr 24 14:46:31.360329 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.360310 2572 scope.go:117] "RemoveContainer" containerID="c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c" Apr 24 14:46:31.367096 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.367081 2572 scope.go:117] "RemoveContainer" containerID="7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833" Apr 24 14:46:31.374409 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.374393 2572 scope.go:117] "RemoveContainer" containerID="4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8" Apr 24 14:46:31.374693 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:46:31.374666 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8\": container with ID starting with 4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8 not found: ID does not exist" containerID="4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8" Apr 24 14:46:31.374784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.374702 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8"} err="failed to get container status \"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8\": rpc error: code = NotFound desc = could not find container \"4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8\": container with ID starting with 4e61f85f59f3474a1b5b5bc5cd3b60ed05c4b2e3f4523de20ab3d3faf5f337c8 not found: ID does not exist" Apr 24 14:46:31.374784 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.374721 2572 scope.go:117] "RemoveContainer" containerID="c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c" Apr 24 14:46:31.374961 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:46:31.374943 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c\": container with ID starting with c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c not found: ID does not exist" containerID="c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c" Apr 24 14:46:31.375000 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.374968 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c"} err="failed to get container status \"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c\": rpc error: code = NotFound desc = could not find container \"c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c\": container with ID starting with c80de19771c279bf9452a526ab24f9b0bac558cdec0dfda0d3375a0c344bed0c not found: ID does not exist" Apr 24 14:46:31.375000 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.374982 2572 scope.go:117] "RemoveContainer" containerID="7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833" Apr 24 14:46:31.375175 ip-10-0-128-169 kubenswrapper[2572]: 
I0424 14:46:31.375156 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:46:31.375225 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:46:31.375179 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833\": container with ID starting with 7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833 not found: ID does not exist" containerID="7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833" Apr 24 14:46:31.375225 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.375192 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833"} err="failed to get container status \"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833\": rpc error: code = NotFound desc = could not find container \"7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833\": container with ID starting with 7db9cce4bd10047232e46651110ebf4f2ca41c764fd0f06f2109a9f662c1e833 not found: ID does not exist" Apr 24 14:46:31.379345 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:31.379326 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-79f9c87cndzns"] Apr 24 14:46:32.059147 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:32.059115 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" path="/var/lib/kubelet/pods/a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8/volumes" Apr 24 14:46:40.990305 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990270 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 14:46:40.990786 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990758 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" Apr 24 14:46:40.990786 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990777 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990803 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="main" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990811 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="main" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990822 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="tokenizer" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990831 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="tokenizer" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990847 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="storage-initializer" Apr 24 14:46:40.990896 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:46:40.990858 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="storage-initializer" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990871 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="storage-initializer" Apr 24 14:46:40.990896 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990880 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="storage-initializer" Apr 24 14:46:40.991271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990961 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f72563c0-f4e4-4a46-bab6-bae209cceaa5" containerName="main" Apr 24 14:46:40.991271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990974 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="main" Apr 24 14:46:40.991271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.990987 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3a8a4e3-b1b6-4087-ac98-0b472e7b57a8" containerName="tokenizer" Apr 24 14:46:40.995734 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.995711 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:40.998350 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:40.998324 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 14:46:41.005544 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.005517 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 14:46:41.103834 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.103959 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103863 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.103959 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.103959 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.104076 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhzkx\" (UniqueName: \"kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.104076 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.103997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205127 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhzkx\" (UniqueName: \"kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205288 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205288 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205288 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205288 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205500 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205298 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205738 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205863 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.205863 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.205845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.207926 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.207902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.208106 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.208085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.213205 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.213181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhzkx\" (UniqueName: \"kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx\") pod \"router-with-refs-test-kserve-5cfbdd5558-h5cjz\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.306883 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.306855 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 14:46:41.371556 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.371503 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 14:46:41.376341 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.376319 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.379141 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.378939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-w5r87\"" Apr 24 14:46:41.382036 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.382002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 14:46:41.507685 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.507685 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69t5x\" (UniqueName: \"kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.507901 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.507901 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.507901 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.507901 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.507810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.608911 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.608839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609046 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.608907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609046 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.608934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609046 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.608954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609046 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.609029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609263 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.609067 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69t5x\" (UniqueName: \"kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609312 
ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.609265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.609824 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.609797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.610007 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.609859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.610147 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.610072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.616257 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.614118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.618893 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.618874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69t5x\" (UniqueName: \"kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x\") pod \"router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.642973 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.642944 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 14:46:41.646530 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:46:41.646505 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2339aac3_d1f6_4e44_b828_a14f725536eb.slice/crio-232edbcfdd1b5aa5c3c03aea3ebd625b02afcf7d7440547ad328c9e7c34c1ab1 WatchSource:0}: Error finding container 232edbcfdd1b5aa5c3c03aea3ebd625b02afcf7d7440547ad328c9e7c34c1ab1: Status 404 returned error can't find the container 
with id 232edbcfdd1b5aa5c3c03aea3ebd625b02afcf7d7440547ad328c9e7c34c1ab1 Apr 24 14:46:41.687524 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.687502 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:41.817726 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:41.817630 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 14:46:41.820206 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:46:41.820171 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533b40ef_1f72_4877_8937_a76e237fe998.slice/crio-89ed3ad33271ea7e5cb819f861955c12f190394b880c63dd2239487d656bffc8 WatchSource:0}: Error finding container 89ed3ad33271ea7e5cb819f861955c12f190394b880c63dd2239487d656bffc8: Status 404 returned error can't find the container with id 89ed3ad33271ea7e5cb819f861955c12f190394b880c63dd2239487d656bffc8 Apr 24 14:46:42.392505 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:42.392463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerStarted","Data":"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b"} Apr 24 14:46:42.392505 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:42.392507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerStarted","Data":"89ed3ad33271ea7e5cb819f861955c12f190394b880c63dd2239487d656bffc8"} Apr 24 14:46:42.394087 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:42.394056 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" event={"ID":"2339aac3-d1f6-4e44-b828-a14f725536eb","Type":"ContainerStarted","Data":"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b"} Apr 24 14:46:42.394222 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:42.394093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" event={"ID":"2339aac3-d1f6-4e44-b828-a14f725536eb","Type":"ContainerStarted","Data":"232edbcfdd1b5aa5c3c03aea3ebd625b02afcf7d7440547ad328c9e7c34c1ab1"} Apr 24 14:46:43.398671 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:43.398629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerDied","Data":"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b"} Apr 24 14:46:43.398671 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:43.398534 2572 generic.go:358] "Generic (PLEG): container finished" podID="533b40ef-1f72-4877-8937-a76e237fe998" containerID="585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b" exitCode=0 Apr 24 14:46:44.407953 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:44.407904 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" 
event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerStarted","Data":"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47"} Apr 24 14:46:44.408322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:44.407962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerStarted","Data":"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807"} Apr 24 14:46:44.408322 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:44.408008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:44.428641 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:44.428541 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" podStartSLOduration=3.4285272989999998 podStartE2EDuration="3.428527299s" podCreationTimestamp="2026-04-24 14:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:46:44.427133263 +0000 UTC m=+1372.977975472" watchObservedRunningTime="2026-04-24 14:46:44.428527299 +0000 UTC m=+1372.979369465" Apr 24 14:46:51.687921 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:51.687882 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:51.688362 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:51.687942 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:51.690752 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:51.690728 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:46:52.437557 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:46:52.437514 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:47:00.209925 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.209903 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6cbfb55898-jn9bs_686331d9-1c58-4e27-9ccc-935c2ebd5b26/main/0.log" Apr 24 14:47:00.210271 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.210254 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:47:00.375506 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375475 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.375682 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375515 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vr66\" (UniqueName: \"kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.375682 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.375682 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375568 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.375682 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375634 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.375869 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.375703 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache\") pod \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\" (UID: \"686331d9-1c58-4e27-9ccc-935c2ebd5b26\") " Apr 24 14:47:00.376052 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.376000 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache" (OuterVolumeSpecName: "model-cache") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:47:00.376159 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.376070 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home" (OuterVolumeSpecName: "home") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:47:00.377565 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.377543 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm" (OuterVolumeSpecName: "dshm") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:47:00.378007 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.377979 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:47:00.378110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.378067 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66" (OuterVolumeSpecName: "kube-api-access-7vr66") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). InnerVolumeSpecName "kube-api-access-7vr66". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:47:00.429874 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.429846 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "686331d9-1c58-4e27-9ccc-935c2ebd5b26" (UID: "686331d9-1c58-4e27-9ccc-935c2ebd5b26"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:47:00.463748 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.463695 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6cbfb55898-jn9bs_686331d9-1c58-4e27-9ccc-935c2ebd5b26/main/0.log" Apr 24 14:47:00.464023 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.464000 2572 generic.go:358] "Generic (PLEG): container finished" podID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerID="541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9" exitCode=137 Apr 24 14:47:00.464075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.464051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerDied","Data":"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9"} Apr 24 14:47:00.464075 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.464067 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" Apr 24 14:47:00.464143 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.464076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs" event={"ID":"686331d9-1c58-4e27-9ccc-935c2ebd5b26","Type":"ContainerDied","Data":"a2ae0a8b781e0738d5873bff6b39ade3a41f10750642151d7eda253b63b6bd90"} Apr 24 14:47:00.464143 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.464100 2572 scope.go:117] "RemoveContainer" containerID="541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9" Apr 24 14:47:00.476986 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.476969 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.477068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.476988 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vr66\" (UniqueName: \"kubernetes.io/projected/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kube-api-access-7vr66\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.477068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.476999 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.477068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.477008 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.477068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.477016 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/686331d9-1c58-4e27-9ccc-935c2ebd5b26-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.477068 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.477025 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/686331d9-1c58-4e27-9ccc-935c2ebd5b26-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:00.483305 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.483289 2572 scope.go:117] "RemoveContainer" containerID="a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6" Apr 24 14:47:00.496560 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.496540 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:47:00.499579 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.499556 2572 scope.go:117] "RemoveContainer" containerID="541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9" Apr 24 14:47:00.499882 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:47:00.499859 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9\": container with ID starting with 541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9 not found: ID does not exist" 
containerID="541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9" Apr 24 14:47:00.499945 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.499897 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9"} err="failed to get container status \"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9\": rpc error: code = NotFound desc = could not find container \"541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9\": container with ID starting with 541fe50b4cc1c013a3e75281473e9a2aa7351e461b4b180faa0b1f3b951ffab9 not found: ID does not exist" Apr 24 14:47:00.499945 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.499919 2572 scope.go:117] "RemoveContainer" containerID="a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6" Apr 24 14:47:00.500179 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:47:00.500158 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6\": container with ID starting with a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6 not found: ID does not exist" containerID="a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6" Apr 24 14:47:00.500238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.500182 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6cbfb55898-jn9bs"] Apr 24 14:47:00.500238 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:00.500187 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6"} err="failed to get container status \"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6\": rpc error: code = NotFound desc = could not find container \"a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6\": container with ID starting with a661da50c06f5af21346c8462a5a8e19b1298a0b11f22ba2b956d20a001bd7e6 not found: ID does not exist" Apr 24 14:47:02.061725 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:02.061690 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" path="/var/lib/kubelet/pods/686331d9-1c58-4e27-9ccc-935c2ebd5b26/volumes" Apr 24 14:47:13.440463 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:13.440433 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 14:47:37.591848 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.591815 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:47:37.592366 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.592046 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" podUID="453facff-2554-4f6b-8d44-dcd25af01306" containerName="manager" containerID="cri-o://8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2" gracePeriod=30 Apr 24 14:47:37.835547 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.835524 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:47:37.865428 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.865364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert\") pod \"453facff-2554-4f6b-8d44-dcd25af01306\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " Apr 24 14:47:37.865545 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.865497 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnflq\" (UniqueName: \"kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq\") pod \"453facff-2554-4f6b-8d44-dcd25af01306\" (UID: \"453facff-2554-4f6b-8d44-dcd25af01306\") " Apr 24 14:47:37.867564 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.867540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq" (OuterVolumeSpecName: "kube-api-access-pnflq") pod "453facff-2554-4f6b-8d44-dcd25af01306" (UID: "453facff-2554-4f6b-8d44-dcd25af01306"). InnerVolumeSpecName "kube-api-access-pnflq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:47:37.867846 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.867813 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert" (OuterVolumeSpecName: "cert") pod "453facff-2554-4f6b-8d44-dcd25af01306" (UID: "453facff-2554-4f6b-8d44-dcd25af01306"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:47:37.966982 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.966958 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnflq\" (UniqueName: \"kubernetes.io/projected/453facff-2554-4f6b-8d44-dcd25af01306-kube-api-access-pnflq\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:37.966982 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:37.966980 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/453facff-2554-4f6b-8d44-dcd25af01306-cert\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:47:38.590563 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.590529 2572 generic.go:358] "Generic (PLEG): container finished" podID="453facff-2554-4f6b-8d44-dcd25af01306" containerID="8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2" exitCode=0 Apr 24 14:47:38.590767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.590588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" event={"ID":"453facff-2554-4f6b-8d44-dcd25af01306","Type":"ContainerDied","Data":"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2"} Apr 24 14:47:38.590767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.590630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" event={"ID":"453facff-2554-4f6b-8d44-dcd25af01306","Type":"ContainerDied","Data":"7b389433d0e7d44fd074d4edb0936ab0646a96ba30234e30033b58f78fc78693"} Apr 24 14:47:38.590767 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.590647 2572 scope.go:117] "RemoveContainer" containerID="8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2" Apr 24 14:47:38.590767 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:47:38.590591 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-66876c8d5d-lbsfj" Apr 24 14:47:38.598979 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.598818 2572 scope.go:117] "RemoveContainer" containerID="8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2" Apr 24 14:47:38.599234 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:47:38.599049 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2\": container with ID starting with 8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2 not found: ID does not exist" containerID="8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2" Apr 24 14:47:38.599234 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.599073 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2"} err="failed to get container status \"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2\": rpc error: code = NotFound desc = could not find container \"8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2\": container with ID starting with 8f7e9ba1632068ef8f3ff65155686d7436c3cac4c26f901c9dbcddacc08046a2 not found: ID does not exist" Apr 24 14:47:38.607317 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.607294 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:47:38.611053 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:38.611034 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-66876c8d5d-lbsfj"] Apr 24 14:47:40.058906 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:47:40.058825 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453facff-2554-4f6b-8d44-dcd25af01306" path="/var/lib/kubelet/pods/453facff-2554-4f6b-8d44-dcd25af01306/volumes" Apr 24 14:48:52.401362 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:48:52.401330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:48:52.407377 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:48:52.407354 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:51:45.958435 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958396 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958726 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958738 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958750 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="453facff-2554-4f6b-8d44-dcd25af01306" containerName="manager" Apr 24 14:51:45.958849 ip-10-0-128-169 
kubenswrapper[2572]: I0424 14:51:45.958755 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="453facff-2554-4f6b-8d44-dcd25af01306" containerName="manager" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958764 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="storage-initializer" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958770 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="storage-initializer" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958829 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="453facff-2554-4f6b-8d44-dcd25af01306" containerName="manager" Apr 24 14:51:45.958849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.958837 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="686331d9-1c58-4e27-9ccc-935c2ebd5b26" containerName="main" Apr 24 14:51:45.961822 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.961802 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:45.965914 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.965867 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 14:51:45.966042 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.965872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-fbmng\"" Apr 24 14:51:45.971191 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:45.971170 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:51:46.087591 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.087591 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.087862 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxndc\" (UniqueName: \"kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.087862 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087765 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.087862 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.087991 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.087860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189104 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189063 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxndc\" (UniqueName: \"kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189289 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189509 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189301 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189647 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189855 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.189968 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.189861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.192232 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.192208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.192361 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.192294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.197045 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.197019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxndc\" (UniqueName: \"kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.273204 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.273120 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:51:46.400364 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.400333 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:51:46.401132 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:51:46.401098 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0c727d4_3cc7_415b_8611_2c818aca51e5.slice/crio-5a9778df41cabc422ec2670344842e698c465d2aea058684052e22087211410f WatchSource:0}: Error finding container 5a9778df41cabc422ec2670344842e698c465d2aea058684052e22087211410f: Status 404 returned error can't find the container with id 5a9778df41cabc422ec2670344842e698c465d2aea058684052e22087211410f Apr 24 14:51:46.403446 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:46.403425 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 14:51:47.401867 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:47.401832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerStarted","Data":"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b"} Apr 24 14:51:47.401867 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:51:47.401870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerStarted","Data":"5a9778df41cabc422ec2670344842e698c465d2aea058684052e22087211410f"} Apr 24 14:52:40.572831 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:52:40.572800 2572 generic.go:358] "Generic (PLEG): container finished" podID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerID="4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b" exitCode=0 Apr 24 14:52:40.573186 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:52:40.572870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerDied","Data":"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b"} Apr 24 14:52:41.578907 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:52:41.578871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerStarted","Data":"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63"} Apr 24 14:52:41.598395 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:52:41.598339 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=56.59832497 podStartE2EDuration="56.59832497s" podCreationTimestamp="2026-04-24 14:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:52:41.595947627 +0000 UTC m=+1730.146789812" watchObservedRunningTime="2026-04-24 14:52:41.59832497 +0000 UTC m=+1730.149167141" Apr 24 14:53:34.797292 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:53:34.797254 2572 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:53:34.797859 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:53:34.797545 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="main" containerID="cri-o://5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63" gracePeriod=30 Apr 24 14:53:52.423484 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:53:52.423449 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:53:52.429919 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:53:52.429899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:54:05.438110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.438087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_b0c727d4-3cc7-415b-8611-2c818aca51e5/main/0.log" Apr 24 14:54:05.438471 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.438460 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:54:05.551818 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.551787 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.551981 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.551843 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxndc\" (UniqueName: \"kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.551981 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.551911 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.551981 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.551952 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.552120 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.551989 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.552120 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.552020 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location\") pod \"b0c727d4-3cc7-415b-8611-2c818aca51e5\" (UID: \"b0c727d4-3cc7-415b-8611-2c818aca51e5\") " Apr 24 14:54:05.552222 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.552189 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache" (OuterVolumeSpecName: "model-cache") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:54:05.552298 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.552281 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.552383 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.552353 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home" (OuterVolumeSpecName: "home") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:54:05.554121 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.554093 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm" (OuterVolumeSpecName: "dshm") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:54:05.554121 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.554105 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc" (OuterVolumeSpecName: "kube-api-access-pxndc") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "kube-api-access-pxndc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 14:54:05.554240 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.554185 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 14:54:05.611967 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.611900 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0c727d4-3cc7-415b-8611-2c818aca51e5" (UID: "b0c727d4-3cc7-415b-8611-2c818aca51e5"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 14:54:05.652964 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.652935 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.652964 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.652960 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0c727d4-3cc7-415b-8611-2c818aca51e5-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.652964 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.652971 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxndc\" (UniqueName: \"kubernetes.io/projected/b0c727d4-3cc7-415b-8611-2c818aca51e5-kube-api-access-pxndc\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.653161 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.652981 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.653161 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.652991 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0c727d4-3cc7-415b-8611-2c818aca51e5-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 14:54:05.865505 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865419 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_b0c727d4-3cc7-415b-8611-2c818aca51e5/main/0.log" Apr 24 14:54:05.865803 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865776 2572 generic.go:358] "Generic (PLEG): container finished" podID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerID="5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63" exitCode=137 Apr 24 14:54:05.865895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerDied","Data":"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63"} Apr 24 14:54:05.865895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865855 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 24 14:54:05.865895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b0c727d4-3cc7-415b-8611-2c818aca51e5","Type":"ContainerDied","Data":"5a9778df41cabc422ec2670344842e698c465d2aea058684052e22087211410f"} Apr 24 14:54:05.865895 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.865880 2572 scope.go:117] "RemoveContainer" containerID="5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63" Apr 24 14:54:05.887131 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.886486 2572 scope.go:117] "RemoveContainer" containerID="4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b" Apr 24 14:54:05.888712 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.888692 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:54:05.893548 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.893524 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 24 14:54:05.950032 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.950016 2572 scope.go:117] "RemoveContainer" containerID="5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63" Apr 24 14:54:05.950298 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:54:05.950280 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63\": container with ID starting with 5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63 not found: ID does not exist" containerID="5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63" Apr 24 14:54:05.950342 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.950307 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63"} err="failed to get container status \"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63\": rpc error: code = NotFound desc = could not find container \"5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63\": container with ID starting with 5078d35849458df7584d0cc6fc09965b75b06e32e7adec72a4c0d064b4ab0c63 not found: ID does not exist" Apr 24 14:54:05.950342 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.950327 2572 scope.go:117] "RemoveContainer" containerID="4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b" Apr 24 14:54:05.950534 ip-10-0-128-169 kubenswrapper[2572]: E0424 14:54:05.950517 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b\": container with ID starting with 4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b not found: ID does not exist" containerID="4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b" Apr 24 14:54:05.950572 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:05.950538 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b"} err="failed to get 
container status \"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b\": rpc error: code = NotFound desc = could not find container \"4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b\": container with ID starting with 4a766962511c2e700dda0fa1482efdb0cb9741a24b58b25d3d93233e49cc6a5b not found: ID does not exist" Apr 24 14:54:06.062957 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:06.062930 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" path="/var/lib/kubelet/pods/b0c727d4-3cc7-415b-8611-2c818aca51e5/volumes" Apr 24 14:54:36.127242 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127213 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 14:54:36.127731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127519 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="main" Apr 24 14:54:36.127731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127530 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="main" Apr 24 14:54:36.127731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127554 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="storage-initializer" Apr 24 14:54:36.127731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127560 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="storage-initializer" Apr 24 14:54:36.127731 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.127626 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0c727d4-3cc7-415b-8611-2c818aca51e5" containerName="main" Apr 24 14:54:36.130540 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.130524 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.133285 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.133260 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 24 14:54:36.133405 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.133354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-qtgxc\"" Apr 24 14:54:36.141754 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.141733 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 14:54:36.190391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190364 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.190391 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.190536 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hbz\" (UniqueName: \"kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.190536 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.190536 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.190650 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.190550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.290959 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.290924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.290966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.290990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67hbz\" (UniqueName: \"kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291110 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291318 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291399 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291374 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291399 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291503 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.291503 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.291471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.293527 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.293505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.299685 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.299663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hbz\" (UniqueName: \"kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.441118 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.441033 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:36.571120 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.571097 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 14:54:36.573165 ip-10-0-128-169 kubenswrapper[2572]: W0424 14:54:36.573136 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2b3e5c_b931_48f3_a618_83c88ba06036.slice/crio-7242155216e47ee54dfe9e3b0c71129293ff94fa77d3950470a771eba29ddc5e WatchSource:0}: Error finding container 7242155216e47ee54dfe9e3b0c71129293ff94fa77d3950470a771eba29ddc5e: Status 404 returned error can't find the container with id 7242155216e47ee54dfe9e3b0c71129293ff94fa77d3950470a771eba29ddc5e Apr 24 14:54:36.976816 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.976780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerStarted","Data":"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437"} Apr 24 14:54:36.976816 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:36.976821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerStarted","Data":"7242155216e47ee54dfe9e3b0c71129293ff94fa77d3950470a771eba29ddc5e"} Apr 24 14:54:37.982139 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:37.982096 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerID="fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437" exitCode=0 Apr 24 14:54:37.982507 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:37.982165 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerDied","Data":"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437"} Apr 24 14:54:38.988037 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:38.987990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerStarted","Data":"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95"} Apr 24 14:54:38.988037 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:38.988037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerStarted","Data":"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577"} Apr 24 14:54:38.988441 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:38.988127 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:39.013340 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:39.013282 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" 
podStartSLOduration=3.013264123 podStartE2EDuration="3.013264123s" podCreationTimestamp="2026-04-24 14:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 14:54:39.011816015 +0000 UTC m=+1847.562658190" watchObservedRunningTime="2026-04-24 14:54:39.013264123 +0000 UTC m=+1847.564106291" Apr 24 14:54:46.441849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:46.441813 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:46.441849 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:46.441856 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:46.444527 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:46.444502 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:54:47.016213 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:54:47.016181 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:55:08.019997 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:55:08.019967 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 14:58:52.444781 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:58:52.444749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 14:58:52.453673 ip-10-0-128-169 kubenswrapper[2572]: I0424 14:58:52.453649 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 15:00:07.708679 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:07.708632 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 15:00:07.709687 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:07.709647 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="tokenizer" containerID="cri-o://83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95" gracePeriod=30 Apr 24 15:00:07.709687 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:07.709649 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="main" containerID="cri-o://1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577" gracePeriod=30 Apr 24 15:00:08.019180 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:00:08.019088 2572 logging.go:55] [core] [Channel #974 SubChannel #975]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.39:9003", ServerName: "10.132.0.39:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.39:9003: connect: connection refused" Apr 24 15:00:08.082596 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:08.082560 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerID="1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577" exitCode=0 Apr 24 15:00:08.082795 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:08.082644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerDied","Data":"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577"} Apr 24 15:00:08.971076 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:08.971052 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 15:00:09.019766 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.019679 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.39:9003\" within 1s: context deadline exceeded" Apr 24 15:00:09.019766 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:00:09.019740 2572 logging.go:55] [core] [Channel #974 SubChannel #975]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.39:9003", ServerName: "10.132.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.39:9003: operation was canceled" Apr 24 15:00:09.064756 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064725 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.064917 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064769 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hbz\" (UniqueName: \"kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.064917 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064803 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.064917 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064820 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.064917 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064849 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.065127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.064941 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs\") pod \"ac2b3e5c-b931-48f3-a618-83c88ba06036\" (UID: \"ac2b3e5c-b931-48f3-a618-83c88ba06036\") " Apr 24 15:00:09.065127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.065073 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:00:09.065127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.065087 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:00:09.065334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.065305 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:00:09.065649 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.065623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:00:09.066954 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.066927 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz" (OuterVolumeSpecName: "kube-api-access-67hbz") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "kube-api-access-67hbz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:00:09.067069 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.067051 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ac2b3e5c-b931-48f3-a618-83c88ba06036" (UID: "ac2b3e5c-b931-48f3-a618-83c88ba06036"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:00:09.094088 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.094054 2572 generic.go:358] "Generic (PLEG): container finished" podID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerID="83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95" exitCode=0 Apr 24 15:00:09.094213 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.094144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerDied","Data":"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95"} Apr 24 15:00:09.094213 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.094179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" event={"ID":"ac2b3e5c-b931-48f3-a618-83c88ba06036","Type":"ContainerDied","Data":"7242155216e47ee54dfe9e3b0c71129293ff94fa77d3950470a771eba29ddc5e"} Apr 24 15:00:09.094213 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.094202 2572 scope.go:117] "RemoveContainer" containerID="83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95" Apr 24 15:00:09.094382 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.094207 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn" Apr 24 15:00:09.104794 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.104771 2572 scope.go:117] "RemoveContainer" containerID="1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577" Apr 24 15:00:09.113092 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.113077 2572 scope.go:117] "RemoveContainer" containerID="fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437" Apr 24 15:00:09.121349 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.121334 2572 scope.go:117] "RemoveContainer" containerID="83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95" Apr 24 15:00:09.121698 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:00:09.121669 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95\": container with ID starting with 83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95 not found: ID does not exist" containerID="83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95" Apr 24 15:00:09.121965 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.121707 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95"} err="failed to get container status \"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95\": rpc error: code = NotFound desc = could not find container \"83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95\": container with ID starting with 83d69f8aacd7c86eeb0b207ca9410e01e459660b2f556d59367c2a176e279f95 not found: ID does not exist" Apr 24 15:00:09.121965 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.121733 2572 scope.go:117] "RemoveContainer" containerID="1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577" Apr 24 15:00:09.122115 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:00:09.122003 2572 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577\": container with ID starting with 1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577 not found: ID does not exist" containerID="1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577" Apr 24 15:00:09.122115 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.122025 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577"} err="failed to get container status \"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577\": rpc error: code = NotFound desc = could not find container \"1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577\": container with ID starting with 1244b04a796c3c2787edf6f3911af4a1eb3a3783800ca7f0c98a15de32e85577 not found: ID does not exist" Apr 24 15:00:09.122115 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.122042 2572 scope.go:117] "RemoveContainer" containerID="fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437" Apr 24 15:00:09.122340 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:00:09.122323 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437\": container with ID starting with fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437 not found: ID does not exist" containerID="fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437" Apr 24 15:00:09.122413 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.122344 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437"} err="failed to get container status \"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437\": rpc error: code = NotFound desc = could not find container \"fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437\": container with ID starting with fbc1ae3d6f4688d2e532863662877b9a19c68b07559992adb8d8180c67cdc437 not found: ID does not exist" Apr 24 15:00:09.122885 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.122857 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 15:00:09.126037 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.126017 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-647b5sprvn"] Apr 24 15:00:09.165715 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165689 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ac2b3e5c-b931-48f3-a618-83c88ba06036-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:09.165715 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165716 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:09.165884 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165727 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67hbz\" (UniqueName: 
\"kubernetes.io/projected/ac2b3e5c-b931-48f3-a618-83c88ba06036-kube-api-access-67hbz\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:09.165884 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165740 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:09.165884 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165756 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:09.165884 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:09.165769 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ac2b3e5c-b931-48f3-a618-83c88ba06036-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:00:10.059448 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:10.059414 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" path="/var/lib/kubelet/pods/ac2b3e5c-b931-48f3-a618-83c88ba06036/volumes" Apr 24 15:00:24.458277 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458242 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458544 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="storage-initializer" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458554 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="storage-initializer" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458563 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="main" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458568 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="main" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458585 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="tokenizer" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458591 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="tokenizer" Apr 24 15:00:24.458684 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458679 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="tokenizer" Apr 24 15:00:24.458913 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.458690 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac2b3e5c-b931-48f3-a618-83c88ba06036" containerName="main" Apr 24 15:00:24.461809 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.461783 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.464433 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.464416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-cgpxf\"" Apr 24 15:00:24.464538 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.464465 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 24 15:00:24.472245 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.472221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:00:24.589777 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.589745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.589955 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.589798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.589955 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.589859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.589955 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.589907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.590117 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.589965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tw49\" (UniqueName: \"kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.590117 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.590031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.690879 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.690879 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690907 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tw49\" (UniqueName: \"kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.690992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691418 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.691315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691418 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.691381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location\") pod 
\"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.691514 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.691415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.693276 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.693244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.693521 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.693504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.701726 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.701702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tw49\" (UniqueName: \"kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49\") pod \"router-with-refs-pd-test-kserve-76696559d6-c5f2b\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.718719 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.718662 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:00:24.722327 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.722308 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.724852 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.724824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-dlk72\"" Apr 24 15:00:24.734192 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.734172 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:00:24.771185 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.771158 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:24.792127 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.792256 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792150 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.792256 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792178 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.792379 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792260 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.792379 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.792486 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.792391 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfp9\" (UniqueName: \"kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.892817 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.892783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.892992 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.892831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfp9\" (UniqueName: \"kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.892992 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.892864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.892992 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.892897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.892992 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.892918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.893212 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.893039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.893341 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.893312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.893405 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.893346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.893456 ip-10-0-128-169 
kubenswrapper[2572]: I0424 15:00:24.893409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.893456 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.893446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.896260 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.896237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:24.898148 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.898130 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:00:24.900354 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:00:24.900333 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c6d7a9e_dcf4_4148_8bf9_4feb851e4cba.slice/crio-d4e520bc03fe5e291dce7eb9a15192c6d86da09306c7c37047c2b459a2a3b9cd WatchSource:0}: Error finding container d4e520bc03fe5e291dce7eb9a15192c6d86da09306c7c37047c2b459a2a3b9cd: Status 404 returned error can't find the container with id d4e520bc03fe5e291dce7eb9a15192c6d86da09306c7c37047c2b459a2a3b9cd Apr 24 15:00:24.902161 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.902143 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:00:24.902433 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:24.902409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfp9\" (UniqueName: \"kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:25.032224 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:25.032184 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:25.153555 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:25.153523 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:00:25.153670 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:25.153560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerStarted","Data":"d4e520bc03fe5e291dce7eb9a15192c6d86da09306c7c37047c2b459a2a3b9cd"} Apr 24 15:00:25.155356 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:00:25.155332 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfe1b8b_28ab_4a49_b0f6_9060ee1e1069.slice/crio-d4129fd38f79224f37c858651e422817a79637339b32b1fc64d4e9569ba245bf WatchSource:0}: Error finding container d4129fd38f79224f37c858651e422817a79637339b32b1fc64d4e9569ba245bf: Status 404 returned error can't find the container with id d4129fd38f79224f37c858651e422817a79637339b32b1fc64d4e9569ba245bf Apr 24 15:00:26.157950 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:26.157908 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerID="2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6" exitCode=0 Apr 24 15:00:26.158393 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:26.157963 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerDied","Data":"2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6"} Apr 24 15:00:26.158393 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:26.157986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerStarted","Data":"d4129fd38f79224f37c858651e422817a79637339b32b1fc64d4e9569ba245bf"} Apr 24 15:00:27.164134 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:27.164099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerStarted","Data":"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b"} Apr 24 15:00:27.164134 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:27.164139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerStarted","Data":"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660"} Apr 24 15:00:27.164525 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:27.164196 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:27.187571 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:27.187521 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" 
podStartSLOduration=3.18750838 podStartE2EDuration="3.18750838s" podCreationTimestamp="2026-04-24 15:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:00:27.184477077 +0000 UTC m=+2195.735319250" watchObservedRunningTime="2026-04-24 15:00:27.18750838 +0000 UTC m=+2195.738350552" Apr 24 15:00:35.032655 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:35.032613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:35.032655 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:35.032661 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:35.035242 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:35.035210 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:35.193279 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:35.193225 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:56.196303 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:56.196271 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:00:58.268859 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:58.268820 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerStarted","Data":"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa"} Apr 24 15:00:58.269310 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:58.268934 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:00:59.274586 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:00:59.274547 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerStarted","Data":"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad"} Apr 24 15:01:03.290016 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:03.289978 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerID="789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad" exitCode=0 Apr 24 15:01:03.290422 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:03.290038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerDied","Data":"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad"} Apr 24 15:01:04.298440 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:04.298406 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" 
event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerStarted","Data":"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810"} Apr 24 15:01:04.323450 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:04.323390 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podStartSLOduration=7.819169955 podStartE2EDuration="40.323370807s" podCreationTimestamp="2026-04-24 15:00:24 +0000 UTC" firstStartedPulling="2026-04-24 15:00:24.902261184 +0000 UTC m=+2193.453103334" lastFinishedPulling="2026-04-24 15:00:57.406462035 +0000 UTC m=+2225.957304186" observedRunningTime="2026-04-24 15:01:04.322843586 +0000 UTC m=+2232.873685758" watchObservedRunningTime="2026-04-24 15:01:04.323370807 +0000 UTC m=+2232.874212981" Apr 24 15:01:04.771969 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:04.771895 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:01:04.771969 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:04.771931 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:01:04.773046 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:04.773014 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:01:14.771839 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:14.771733 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:01:15.315721 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:15.315693 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:01:24.772345 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:24.772298 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:01:33.993728 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:33.993696 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 15:01:33.994304 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:33.994090 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="main" containerID="cri-o://48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807" gracePeriod=30 Apr 24 15:01:33.994304 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:33.994184 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="tokenizer" containerID="cri-o://f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47" gracePeriod=30 Apr 24 15:01:34.005249 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:34.005219 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 15:01:34.005640 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:34.005573 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" podUID="2339aac3-d1f6-4e44-b828-a14f725536eb" containerName="storage-initializer" containerID="cri-o://c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b" gracePeriod=30 Apr 24 15:01:34.407761 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:34.407720 2572 generic.go:358] "Generic (PLEG): container finished" podID="533b40ef-1f72-4877-8937-a76e237fe998" containerID="48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807" exitCode=0 Apr 24 15:01:34.407964 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:34.407791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerDied","Data":"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807"} Apr 24 15:01:34.771925 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:34.771815 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:01:35.238646 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.238596 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 15:01:35.403987 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.403886 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69t5x\" (UniqueName: \"kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.403987 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.403974 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.404227 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404072 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.404227 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404111 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.404227 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404148 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.404227 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404176 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs\") pod \"533b40ef-1f72-4877-8937-a76e237fe998\" (UID: \"533b40ef-1f72-4877-8937-a76e237fe998\") " Apr 24 15:01:35.404436 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404230 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:01:35.404505 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404423 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:01:35.404505 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404453 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:01:35.404505 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404470 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:35.404842 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.404817 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:01:35.406362 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.406330 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:01:35.406477 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.406396 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x" (OuterVolumeSpecName: "kube-api-access-69t5x") pod "533b40ef-1f72-4877-8937-a76e237fe998" (UID: "533b40ef-1f72-4877-8937-a76e237fe998"). InnerVolumeSpecName "kube-api-access-69t5x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:01:35.413498 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.413467 2572 generic.go:358] "Generic (PLEG): container finished" podID="533b40ef-1f72-4877-8937-a76e237fe998" containerID="f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47" exitCode=0 Apr 24 15:01:35.413639 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.413512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerDied","Data":"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47"} Apr 24 15:01:35.413639 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.413536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" event={"ID":"533b40ef-1f72-4877-8937-a76e237fe998","Type":"ContainerDied","Data":"89ed3ad33271ea7e5cb819f861955c12f190394b880c63dd2239487d656bffc8"} Apr 24 15:01:35.413639 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.413551 2572 scope.go:117] "RemoveContainer" containerID="f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47" Apr 24 15:01:35.413639 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.413551 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw" Apr 24 15:01:35.422457 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.422442 2572 scope.go:117] "RemoveContainer" containerID="48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807" Apr 24 15:01:35.431574 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.431557 2572 scope.go:117] "RemoveContainer" containerID="585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b" Apr 24 15:01:35.436404 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.436384 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 15:01:35.439931 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.439911 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-55ddf8c976-qb7gw"] Apr 24 15:01:35.440304 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.440276 2572 scope.go:117] "RemoveContainer" containerID="f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47" Apr 24 15:01:35.440563 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:01:35.440545 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47\": container with ID starting with f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47 not found: ID does not exist" containerID="f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47" Apr 24 15:01:35.440658 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.440571 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47"} err="failed to get container status \"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47\": rpc error: code = NotFound desc = could not find container \"f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47\": container 
with ID starting with f83fa5d73ea12f5e3b0f591791f4fee738fd2375db0585529e04ada48e059b47 not found: ID does not exist" Apr 24 15:01:35.440658 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.440589 2572 scope.go:117] "RemoveContainer" containerID="48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807" Apr 24 15:01:35.440872 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:01:35.440854 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807\": container with ID starting with 48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807 not found: ID does not exist" containerID="48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807" Apr 24 15:01:35.440915 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.440877 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807"} err="failed to get container status \"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807\": rpc error: code = NotFound desc = could not find container \"48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807\": container with ID starting with 48faed321e40ee959aa6bcdf51cc0efea711708e2adc577e1e713f813ce84807 not found: ID does not exist" Apr 24 15:01:35.440915 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.440894 2572 scope.go:117] "RemoveContainer" containerID="585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b" Apr 24 15:01:35.441122 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:01:35.441107 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b\": container with ID starting with 585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b not found: ID does not exist" containerID="585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b" Apr 24 15:01:35.441164 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.441126 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b"} err="failed to get container status \"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b\": rpc error: code = NotFound desc = could not find container \"585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b\": container with ID starting with 585ec4dffa7dd3a1bc182755b8e748e38dc71015f4d2da649c427200f9340b7b not found: ID does not exist" Apr 24 15:01:35.504938 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.504912 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69t5x\" (UniqueName: \"kubernetes.io/projected/533b40ef-1f72-4877-8937-a76e237fe998-kube-api-access-69t5x\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:35.504938 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.504937 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:35.505073 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.504948 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:35.505073 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.504957 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/533b40ef-1f72-4877-8937-a76e237fe998-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:35.505073 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:35.504966 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/533b40ef-1f72-4877-8937-a76e237fe998-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:01:36.058258 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:36.058224 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="533b40ef-1f72-4877-8937-a76e237fe998" path="/var/lib/kubelet/pods/533b40ef-1f72-4877-8937-a76e237fe998/volumes" Apr 24 15:01:44.772334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:44.772283 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:01:52.083325 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083289 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:01:52.083828 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083809 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="main" Apr 24 15:01:52.083874 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083832 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="main" Apr 24 15:01:52.083874 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083851 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="storage-initializer" Apr 24 15:01:52.083874 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083862 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="storage-initializer" Apr 24 15:01:52.083967 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083903 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="tokenizer" Apr 24 15:01:52.083967 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083913 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="tokenizer" Apr 24 15:01:52.084070 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.083993 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="main" Apr 24 15:01:52.084070 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.084006 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="533b40ef-1f72-4877-8937-a76e237fe998" containerName="tokenizer" Apr 24 15:01:52.087710 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.087691 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.090491 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.090464 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-kdv49\"" Apr 24 15:01:52.090653 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.090565 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 24 15:01:52.094901 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.094816 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:01:52.132918 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.132890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.133080 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.132932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qxgn\" (UniqueName: \"kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.133080 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.132955 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.133080 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.133013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.133080 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.133061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.133225 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.133087 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234088 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234287 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234108 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234287 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234287 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qxgn\" (UniqueName: \"kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234433 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234433 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234754 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.234877 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.234797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.235096 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.235075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.235338 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.235276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.237301 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.237275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.243946 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.243919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qxgn\" (UniqueName: \"kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.398701 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.398589 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:52.531266 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:52.531025 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:01:52.534072 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:01:52.534043 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71358538_e1ff_4f32_a496_4197ecf6146d.slice/crio-9c56820d4dc454e0153a0ff85f2d2f7d66ee7997f06e663562edf8031d20d97a WatchSource:0}: Error finding container 9c56820d4dc454e0153a0ff85f2d2f7d66ee7997f06e663562edf8031d20d97a: Status 404 returned error can't find the container with id 9c56820d4dc454e0153a0ff85f2d2f7d66ee7997f06e663562edf8031d20d97a Apr 24 15:01:53.476860 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:53.476829 2572 generic.go:358] "Generic (PLEG): container finished" podID="71358538-e1ff-4f32-a496-4197ecf6146d" containerID="ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0" exitCode=0 Apr 24 15:01:53.477266 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:53.476925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerDied","Data":"ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0"} Apr 24 15:01:53.477266 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:53.476968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerStarted","Data":"9c56820d4dc454e0153a0ff85f2d2f7d66ee7997f06e663562edf8031d20d97a"} Apr 24 15:01:54.483261 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:54.483225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerStarted","Data":"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb"} Apr 24 15:01:54.483261 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:54.483263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerStarted","Data":"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994"} Apr 24 15:01:54.483761 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:54.483360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:01:54.505504 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:54.505454 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" podStartSLOduration=2.505434885 podStartE2EDuration="2.505434885s" podCreationTimestamp="2026-04-24 15:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:01:54.503489783 +0000 UTC m=+2283.054331979" watchObservedRunningTime="2026-04-24 15:01:54.505434885 +0000 UTC 
m=+2283.056277060" Apr 24 15:01:54.772502 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:01:54.772393 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:02:02.398842 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:02.398802 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:02:02.398842 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:02.398843 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:02:02.401409 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:02.401383 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:02:02.515746 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:02.515713 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:02:04.186330 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.186308 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5cfbdd5558-h5cjz_2339aac3-d1f6-4e44-b828-a14f725536eb/storage-initializer/0.log" Apr 24 15:02:04.186740 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.186388 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 15:02:04.247116 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247083 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247308 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247161 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhzkx\" (UniqueName: \"kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247308 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247219 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247308 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247249 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247308 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247277 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247506 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247340 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home\") pod \"2339aac3-d1f6-4e44-b828-a14f725536eb\" (UID: \"2339aac3-d1f6-4e44-b828-a14f725536eb\") " Apr 24 15:02:04.247655 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247626 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache" (OuterVolumeSpecName: "model-cache") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:04.247737 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.247714 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home" (OuterVolumeSpecName: "home") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:04.249410 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.249387 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm" (OuterVolumeSpecName: "dshm") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:04.249712 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.249692 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:02:04.249790 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.249717 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx" (OuterVolumeSpecName: "kube-api-access-mhzkx") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). InnerVolumeSpecName "kube-api-access-mhzkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:02:04.311969 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.311932 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2339aac3-d1f6-4e44-b828-a14f725536eb" (UID: "2339aac3-d1f6-4e44-b828-a14f725536eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:04.348701 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348673 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.348701 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348697 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2339aac3-d1f6-4e44-b828-a14f725536eb-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.348701 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348706 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhzkx\" (UniqueName: \"kubernetes.io/projected/2339aac3-d1f6-4e44-b828-a14f725536eb-kube-api-access-mhzkx\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.348900 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348717 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.348900 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348729 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.348900 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.348740 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2339aac3-d1f6-4e44-b828-a14f725536eb-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:04.522077 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522044 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5cfbdd5558-h5cjz_2339aac3-d1f6-4e44-b828-a14f725536eb/storage-initializer/0.log" Apr 24 15:02:04.522243 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522092 2572 generic.go:358] "Generic (PLEG): container finished" podID="2339aac3-d1f6-4e44-b828-a14f725536eb" containerID="c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b" exitCode=137 Apr 24 15:02:04.522243 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522177 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" Apr 24 15:02:04.522366 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522177 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" event={"ID":"2339aac3-d1f6-4e44-b828-a14f725536eb","Type":"ContainerDied","Data":"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b"} Apr 24 15:02:04.522366 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz" event={"ID":"2339aac3-d1f6-4e44-b828-a14f725536eb","Type":"ContainerDied","Data":"232edbcfdd1b5aa5c3c03aea3ebd625b02afcf7d7440547ad328c9e7c34c1ab1"} Apr 24 15:02:04.522366 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.522304 2572 scope.go:117] "RemoveContainer" containerID="c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b" Apr 24 15:02:04.540372 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.540351 2572 scope.go:117] "RemoveContainer" containerID="c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b" Apr 24 15:02:04.540714 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:02:04.540689 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b\": container with ID starting with c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b not found: ID does not exist" containerID="c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b" Apr 24 15:02:04.540814 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.540721 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b"} err="failed to get container status \"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b\": rpc error: code = NotFound desc = could not find container \"c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b\": container with ID starting with c84f506dc456da979743bc38cbaf09ddb8a38680a8265a4ae1c63bc06fd2e13b not found: ID does not exist" Apr 24 15:02:04.562307 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.562226 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 15:02:04.566161 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.566134 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5cfbdd5558-h5cjz"] Apr 24 15:02:04.772019 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:04.771972 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:02:06.058404 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:06.058361 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2339aac3-d1f6-4e44-b828-a14f725536eb" path="/var/lib/kubelet/pods/2339aac3-d1f6-4e44-b828-a14f725536eb/volumes" Apr 24 15:02:14.771770 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:14.771716 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:02:23.519170 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:23.519133 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:02:24.772203 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:24.772162 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8001/health\": dial tcp 10.132.0.40:8001: connect: connection refused" Apr 24 15:02:34.781037 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:34.781005 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:02:34.792697 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:34.792671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:02:45.989248 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:45.989171 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:02:45.989741 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:45.989527 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" containerID="cri-o://2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" gracePeriod=30 Apr 24 15:02:45.993126 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:45.993099 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:02:45.993523 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:45.993490 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="main" containerID="cri-o://8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660" gracePeriod=30 Apr 24 15:02:45.993636 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:45.993493 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="tokenizer" 
containerID="cri-o://8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b" gracePeriod=30 Apr 24 15:02:46.195931 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:02:46.195899 2572 logging.go:55] [core] [Channel #1129 SubChannel #1130]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.41:9003", ServerName: "10.132.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.41:9003: connect: connection refused" Apr 24 15:02:46.688692 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:46.688654 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerID="8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660" exitCode=0 Apr 24 15:02:46.688865 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:46.688712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerDied","Data":"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660"} Apr 24 15:02:47.159886 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.159860 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:02:47.196423 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.196388 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.41:9003\" within 1s: context deadline exceeded" Apr 24 15:02:47.216180 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216117 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216180 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216152 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfp9\" (UniqueName: \"kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216343 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216184 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216343 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216215 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216343 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216251 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216343 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216278 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp\") pod \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\" (UID: \"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069\") " Apr 24 15:02:47.216546 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216512 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:47.216640 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216528 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:47.216736 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.216707 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:47.217183 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.217156 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:02:47.218087 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.218065 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9" (OuterVolumeSpecName: "kube-api-access-xwfp9") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "kube-api-access-xwfp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:02:47.218222 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.218205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" (UID: "7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:02:47.316912 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316890 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.316912 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316910 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.317053 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316921 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.317053 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316930 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.317053 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316939 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.317053 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.316947 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwfp9\" (UniqueName: \"kubernetes.io/projected/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069-kube-api-access-xwfp9\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:02:47.693478 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.693442 2572 generic.go:358] "Generic (PLEG): container finished" podID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerID="8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b" exitCode=0 Apr 24 15:02:47.693672 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.693527 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" Apr 24 15:02:47.693672 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.693522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerDied","Data":"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b"} Apr 24 15:02:47.693756 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.693679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp" event={"ID":"7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069","Type":"ContainerDied","Data":"d4129fd38f79224f37c858651e422817a79637339b32b1fc64d4e9569ba245bf"} Apr 24 15:02:47.693756 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.693696 2572 scope.go:117] "RemoveContainer" containerID="8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b" Apr 24 15:02:47.701962 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.701944 2572 scope.go:117] "RemoveContainer" containerID="8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660" Apr 24 15:02:47.709024 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.709006 2572 scope.go:117] "RemoveContainer" containerID="2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6" Apr 24 15:02:47.715148 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.715127 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:02:47.717348 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.717333 2572 scope.go:117] "RemoveContainer" containerID="8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b" Apr 24 15:02:47.717575 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:02:47.717553 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b\": container with ID starting with 8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b not found: ID does not exist" containerID="8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b" Apr 24 15:02:47.717758 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.717589 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b"} err="failed to get container status \"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b\": rpc error: code = NotFound desc = could not find container \"8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b\": container with ID starting with 8d6c2431dd7f3bd687f36d7cb26ff7b977afcc7e8ca6d2893515f13efa775e7b not found: ID does not exist" Apr 24 15:02:47.717758 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.717629 2572 scope.go:117] "RemoveContainer" containerID="8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660" Apr 24 15:02:47.718041 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:02:47.718012 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660\": container with ID starting with 8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660 
not found: ID does not exist" containerID="8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660" Apr 24 15:02:47.718119 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.718050 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660"} err="failed to get container status \"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660\": rpc error: code = NotFound desc = could not find container \"8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660\": container with ID starting with 8b3f283d0fb8037d2b6bbcdedf530d5efdadcb303b9f15802ecb77ee7ec49660 not found: ID does not exist" Apr 24 15:02:47.718119 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.718074 2572 scope.go:117] "RemoveContainer" containerID="2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6" Apr 24 15:02:47.718430 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:02:47.718405 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6\": container with ID starting with 2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6 not found: ID does not exist" containerID="2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6" Apr 24 15:02:47.718533 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.718452 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6"} err="failed to get container status \"2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6\": rpc error: code = NotFound desc = could not find container \"2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6\": container with ID starting with 2b34b323f53c5d619c2b39d7f7f438528c656395ad5495c9134f1cf74247d6a6 not found: ID does not exist" Apr 24 15:02:47.720346 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:47.720328 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5f9bf4f9csrhbp"] Apr 24 15:02:48.058056 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:02:48.058029 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" path="/var/lib/kubelet/pods/7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069/volumes" Apr 24 15:03:15.990243 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:15.990189 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="llm-d-routing-sidecar" containerID="cri-o://92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" gracePeriod=2 Apr 24 15:03:16.233830 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.233805 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-76696559d6-c5f2b_5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba/main/0.log" Apr 24 15:03:16.234418 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.234401 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:03:16.354535 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354508 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.354705 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354551 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.354705 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354578 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tw49\" (UniqueName: \"kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.354705 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354597 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.354705 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.354913 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354718 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm\") pod \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\" (UID: \"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba\") " Apr 24 15:03:16.355013 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.354980 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache" (OuterVolumeSpecName: "model-cache") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:03:16.355136 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.355112 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home" (OuterVolumeSpecName: "home") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:03:16.356655 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.356624 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49" (OuterVolumeSpecName: "kube-api-access-2tw49") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "kube-api-access-2tw49". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:03:16.356766 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.356708 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:03:16.357041 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.357013 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm" (OuterVolumeSpecName: "dshm") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:03:16.410701 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.410671 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" (UID: "5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:03:16.455915 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455890 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.455915 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455911 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.456095 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455921 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tw49\" (UniqueName: \"kubernetes.io/projected/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-kube-api-access-2tw49\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.456095 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455935 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-home\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.456095 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455948 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-model-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.456095 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.455956 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba-dshm\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:03:16.792159 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792133 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-76696559d6-c5f2b_5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba/main/0.log" Apr 24 15:03:16.792784 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792762 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerID="2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" exitCode=137 Apr 24 15:03:16.792784 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792782 2572 generic.go:358] "Generic (PLEG): container finished" podID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerID="92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" exitCode=0 Apr 24 15:03:16.792896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerDied","Data":"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810"} Apr 24 15:03:16.792896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerDied","Data":"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa"} Apr 24 15:03:16.792896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792835 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" Apr 24 15:03:16.792896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b" event={"ID":"5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba","Type":"ContainerDied","Data":"d4e520bc03fe5e291dce7eb9a15192c6d86da09306c7c37047c2b459a2a3b9cd"} Apr 24 15:03:16.792896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.792862 2572 scope.go:117] "RemoveContainer" containerID="2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" Apr 24 15:03:16.813364 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.813342 2572 scope.go:117] "RemoveContainer" containerID="789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad" Apr 24 15:03:16.816420 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.816400 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:03:16.820554 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.820534 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-76696559d6-c5f2b"] Apr 24 15:03:16.824188 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.824170 2572 scope.go:117] "RemoveContainer" containerID="92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" Apr 24 15:03:16.830800 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.830784 2572 scope.go:117] "RemoveContainer" containerID="2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" Apr 24 15:03:16.831056 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:03:16.831028 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810\": container with ID starting with 2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810 not found: ID does not exist" containerID="2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" Apr 24 15:03:16.831148 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831066 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810"} err="failed to get container status \"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810\": rpc error: code = NotFound desc = could not find container \"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810\": container with ID starting with 2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810 not found: ID does not exist" Apr 24 15:03:16.831148 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831088 2572 scope.go:117] "RemoveContainer" containerID="789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad" Apr 24 15:03:16.831317 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:03:16.831300 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad\": container with ID starting with 789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad not found: ID does not exist" containerID="789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad" Apr 24 15:03:16.831364 ip-10-0-128-169 kubenswrapper[2572]: I0424 
15:03:16.831326 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad"} err="failed to get container status \"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad\": rpc error: code = NotFound desc = could not find container \"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad\": container with ID starting with 789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad not found: ID does not exist" Apr 24 15:03:16.831364 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831343 2572 scope.go:117] "RemoveContainer" containerID="92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" Apr 24 15:03:16.831564 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:03:16.831546 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa\": container with ID starting with 92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa not found: ID does not exist" containerID="92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" Apr 24 15:03:16.831641 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831568 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa"} err="failed to get container status \"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa\": rpc error: code = NotFound desc = could not find container \"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa\": container with ID starting with 92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa not found: ID does not exist" Apr 24 15:03:16.831641 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831583 2572 scope.go:117] "RemoveContainer" containerID="2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810" Apr 24 15:03:16.831823 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831803 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810"} err="failed to get container status \"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810\": rpc error: code = NotFound desc = could not find container \"2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810\": container with ID starting with 2ebc09694d1a43586bb973879859101520f3b4583a3798735bae761f3e9ea810 not found: ID does not exist" Apr 24 15:03:16.831872 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.831825 2572 scope.go:117] "RemoveContainer" containerID="789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad" Apr 24 15:03:16.832049 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.832028 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad"} err="failed to get container status \"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad\": rpc error: code = NotFound desc = could not find container \"789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad\": container with ID starting with 789b311d907a652da11d1ed2654378ed47e50417f47b7be3121f482ee1ccdbad not found: ID does not exist" Apr 24 15:03:16.832100 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.832049 2572 
scope.go:117] "RemoveContainer" containerID="92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa" Apr 24 15:03:16.832250 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:16.832233 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa"} err="failed to get container status \"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa\": rpc error: code = NotFound desc = could not find container \"92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa\": container with ID starting with 92df9f3f7aff671617826e20cfa49b1f0267f526ee3e6dd7dec2efcc82f319aa not found: ID does not exist" Apr 24 15:03:18.057688 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:18.057655 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" path="/var/lib/kubelet/pods/5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba/volumes" Apr 24 15:03:52.467281 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:52.467255 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 15:03:52.475891 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:03:52.475868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 15:04:43.570516 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:43.570429 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:04:43.571123 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:43.570760 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="main" containerID="cri-o://7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994" gracePeriod=30 Apr 24 15:04:43.571123 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:43.570844 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="tokenizer" containerID="cri-o://126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb" gracePeriod=30 Apr 24 15:04:44.086442 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.086410 2572 generic.go:358] "Generic (PLEG): container finished" podID="71358538-e1ff-4f32-a496-4197ecf6146d" containerID="7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994" exitCode=0 Apr 24 15:04:44.086633 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.086488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerDied","Data":"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994"} Apr 24 15:04:44.498205 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498119 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvvtc/must-gather-h5dm5"] Apr 24 15:04:44.498590 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498569 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="storage-initializer" Apr 24 15:04:44.498659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498593 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="storage-initializer" Apr 24 15:04:44.498659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498623 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="tokenizer" Apr 24 15:04:44.498659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498632 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="tokenizer" Apr 24 15:04:44.498659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498643 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="storage-initializer" Apr 24 15:04:44.498659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498651 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="storage-initializer" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498667 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2339aac3-d1f6-4e44-b828-a14f725536eb" containerName="storage-initializer" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498676 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339aac3-d1f6-4e44-b828-a14f725536eb" containerName="storage-initializer" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498691 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="main" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498698 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="main" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498707 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498715 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498735 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="llm-d-routing-sidecar" Apr 24 15:04:44.498819 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498743 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="llm-d-routing-sidecar" Apr 24 15:04:44.499048 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498826 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2339aac3-d1f6-4e44-b828-a14f725536eb" containerName="storage-initializer" Apr 24 15:04:44.499048 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498837 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="tokenizer" Apr 24 15:04:44.499048 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498846 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="llm-d-routing-sidecar" Apr 24 15:04:44.499048 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498857 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c6d7a9e-dcf4-4148-8bf9-4feb851e4cba" containerName="main" Apr 24 15:04:44.499048 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.498867 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7cfe1b8b-28ab-4a49-b0f6-9060ee1e1069" containerName="main" Apr 24 15:04:44.502192 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.502167 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.504906 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.504880 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvvtc\"/\"kube-root-ca.crt\"" Apr 24 15:04:44.505033 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.504976 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvvtc\"/\"openshift-service-ca.crt\"" Apr 24 15:04:44.506045 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.506030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gvvtc\"/\"default-dockercfg-bx858\"" Apr 24 15:04:44.510659 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.510637 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvvtc/must-gather-h5dm5"] Apr 24 15:04:44.573402 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.573355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output\") pod \"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.573799 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.573458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mx44\" (UniqueName: \"kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44\") pod \"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.674274 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.674237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output\") pod \"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.674485 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.674306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mx44\" (UniqueName: \"kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44\") pod \"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.674656 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.674634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output\") pod 
\"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.682918 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.682891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mx44\" (UniqueName: \"kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44\") pod \"must-gather-h5dm5\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.812946 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.812912 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:04:44.818186 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.818166 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:04:44.876335 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876306 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876521 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876354 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876521 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876385 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876521 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876423 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876521 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876504 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qxgn\" (UniqueName: \"kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876803 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876526 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp\") pod \"71358538-e1ff-4f32-a496-4197ecf6146d\" (UID: \"71358538-e1ff-4f32-a496-4197ecf6146d\") " Apr 24 15:04:44.876803 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876630 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod 
"71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:04:44.876921 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876800 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-cache\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:44.876969 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.876944 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:04:44.877166 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.877130 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:04:44.877748 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.877721 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:04:44.878544 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.878517 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 15:04:44.878965 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.878940 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn" (OuterVolumeSpecName: "kube-api-access-2qxgn") pod "71358538-e1ff-4f32-a496-4197ecf6146d" (UID: "71358538-e1ff-4f32-a496-4197ecf6146d"). InnerVolumeSpecName "kube-api-access-2qxgn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:04:44.933296 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.933261 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvvtc/must-gather-h5dm5"] Apr 24 15:04:44.935182 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:04:44.935152 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027ba1cc_2379_404e_9f9f_256559e9844e.slice/crio-b03452a2baa559b2e54d38a8bcd7463c871631f84c1208b9b700292e305cdfb0 WatchSource:0}: Error finding container b03452a2baa559b2e54d38a8bcd7463c871631f84c1208b9b700292e305cdfb0: Status 404 returned error can't find the container with id b03452a2baa559b2e54d38a8bcd7463c871631f84c1208b9b700292e305cdfb0 Apr 24 15:04:44.977957 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.977932 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71358538-e1ff-4f32-a496-4197ecf6146d-tls-certs\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:44.977957 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.977954 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-uds\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:44.978114 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.977964 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-kserve-provision-location\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:44.978114 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.977975 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qxgn\" (UniqueName: \"kubernetes.io/projected/71358538-e1ff-4f32-a496-4197ecf6146d-kube-api-access-2qxgn\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:44.978114 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:44.977986 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/71358538-e1ff-4f32-a496-4197ecf6146d-tokenizer-tmp\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:04:45.091292 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.091201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" event={"ID":"027ba1cc-2379-404e-9f9f-256559e9844e","Type":"ContainerStarted","Data":"b03452a2baa559b2e54d38a8bcd7463c871631f84c1208b9b700292e305cdfb0"} Apr 24 15:04:45.092865 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.092837 2572 generic.go:358] "Generic (PLEG): container finished" podID="71358538-e1ff-4f32-a496-4197ecf6146d" containerID="126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb" exitCode=0 Apr 24 15:04:45.092997 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.092919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerDied","Data":"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb"} Apr 24 15:04:45.092997 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.092952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" event={"ID":"71358538-e1ff-4f32-a496-4197ecf6146d","Type":"ContainerDied","Data":"9c56820d4dc454e0153a0ff85f2d2f7d66ee7997f06e663562edf8031d20d97a"} Apr 24 15:04:45.092997 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.092967 2572 scope.go:117] "RemoveContainer" containerID="126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb" Apr 24 15:04:45.093169 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.092927 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn" Apr 24 15:04:45.101546 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.101522 2572 scope.go:117] "RemoveContainer" containerID="7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994" Apr 24 15:04:45.109002 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.108984 2572 scope.go:117] "RemoveContainer" containerID="ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0" Apr 24 15:04:45.115661 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.115633 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:04:45.116896 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.116880 2572 scope.go:117] "RemoveContainer" containerID="126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb" Apr 24 15:04:45.117130 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:04:45.117112 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb\": container with ID starting with 126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb not found: ID does not exist" containerID="126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb" Apr 24 15:04:45.117168 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.117139 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb"} err="failed to get container status \"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb\": rpc error: code = NotFound desc = could not find container \"126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb\": container with ID starting with 126618c0003db24b4fd9f8feb43dcf1b0d86c51a9db9808cb7a863adca0d1ddb not found: ID does not exist" Apr 24 15:04:45.117168 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.117159 2572 scope.go:117] "RemoveContainer" containerID="7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994" Apr 24 15:04:45.117376 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:04:45.117361 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994\": container with ID starting with 7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994 not found: ID does not exist" containerID="7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994" Apr 24 15:04:45.117422 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.117380 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994"} err="failed to 
get container status \"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994\": rpc error: code = NotFound desc = could not find container \"7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994\": container with ID starting with 7205dd35439459a708a570d77ace7daef20c454c17f02bd6f1586368f1f44994 not found: ID does not exist" Apr 24 15:04:45.117422 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.117393 2572 scope.go:117] "RemoveContainer" containerID="ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0" Apr 24 15:04:45.117612 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:04:45.117584 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0\": container with ID starting with ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0 not found: ID does not exist" containerID="ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0" Apr 24 15:04:45.117657 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.117626 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0"} err="failed to get container status \"ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0\": rpc error: code = NotFound desc = could not find container \"ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0\": container with ID starting with ebdfb405ef2b6308786f05e70cd4f338299182f18e42cc7263fdc7a31a2740b0 not found: ID does not exist" Apr 24 15:04:45.121471 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:45.121452 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schegn5bn"] Apr 24 15:04:46.059524 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:46.059490 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" path="/var/lib/kubelet/pods/71358538-e1ff-4f32-a496-4197ecf6146d/volumes" Apr 24 15:04:50.116699 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:50.116665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" event={"ID":"027ba1cc-2379-404e-9f9f-256559e9844e","Type":"ContainerStarted","Data":"663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad"} Apr 24 15:04:50.116699 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:50.116704 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" event={"ID":"027ba1cc-2379-404e-9f9f-256559e9844e","Type":"ContainerStarted","Data":"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb"} Apr 24 15:04:50.132478 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:50.132420 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" podStartSLOduration=1.82861369 podStartE2EDuration="6.132403175s" podCreationTimestamp="2026-04-24 15:04:44 +0000 UTC" firstStartedPulling="2026-04-24 15:04:44.936773411 +0000 UTC m=+2453.487615561" lastFinishedPulling="2026-04-24 15:04:49.240562886 +0000 UTC m=+2457.791405046" observedRunningTime="2026-04-24 15:04:50.132177014 +0000 UTC m=+2458.683019186" watchObservedRunningTime="2026-04-24 15:04:50.132403175 +0000 UTC m=+2458.683245348" Apr 24 15:04:58.246419 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:58.246385 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:04:59.279407 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:04:59.279376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:00.237454 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:00.237424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:01.167732 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:01.167702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:02.110560 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:02.110534 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:03.051327 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:03.051299 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:03.996559 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:03.996527 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:04.929059 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:04.929021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:05.846019 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:05.845988 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:06.795306 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:06.795276 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:07.731371 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:07.731342 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:08.669707 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:08.669682 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:09.618773 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:09.618744 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:10.629237 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:10.629206 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-74r8s_2cf912da-55a2-4be1-b630-60258fde33f3/istio-proxy/0.log" Apr 24 15:05:11.597429 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:11.597397 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-hndv9_c0110066-5174-4fda-9ac6-6754da8f8764/istio-proxy/0.log" Apr 24 15:05:12.364234 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:12.364207 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-hndv9_c0110066-5174-4fda-9ac6-6754da8f8764/istio-proxy/0.log" Apr 24 15:05:13.156477 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:13.156445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-f27mm_c09d1452-ea2f-4446-ba91-3ce10da5c7ee/manager/0.log" Apr 24 15:05:13.164008 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:13.163985 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lx5zw_3950a3e3-5147-41d7-b415-4cdd6e7bfb02/limitador/0.log" Apr 24 15:05:13.184797 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:13.184777 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-s65gz_cbc11936-8b41-4f5c-80e1-56df1e560782/manager/0.log" Apr 24 15:05:14.202880 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:14.202788 2572 generic.go:358] "Generic (PLEG): container finished" podID="027ba1cc-2379-404e-9f9f-256559e9844e" containerID="4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb" exitCode=0 Apr 24 15:05:14.202880 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:14.202850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" event={"ID":"027ba1cc-2379-404e-9f9f-256559e9844e","Type":"ContainerDied","Data":"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb"} Apr 24 15:05:14.203312 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:14.203164 2572 scope.go:117] "RemoveContainer" containerID="4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb" Apr 24 15:05:14.807641 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:14.807590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvvtc_must-gather-h5dm5_027ba1cc-2379-404e-9f9f-256559e9844e/gather/0.log" Apr 24 15:05:15.445629 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445585 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rpb9c/must-gather-vxmfj"] Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445917 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="storage-initializer" Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445929 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="storage-initializer" Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445950 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="main" Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445959 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="main" Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445968 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="tokenizer" Apr 24 15:05:15.445998 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.445976 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="tokenizer" Apr 24 15:05:15.446199 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.446027 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="main" Apr 24 15:05:15.446199 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.446037 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="71358538-e1ff-4f32-a496-4197ecf6146d" containerName="tokenizer" Apr 24 15:05:15.450293 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.450272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.453071 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.453051 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rpb9c\"/\"kube-root-ca.crt\"" Apr 24 15:05:15.454273 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.454245 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rpb9c\"/\"openshift-service-ca.crt\"" Apr 24 15:05:15.454273 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.454247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rpb9c\"/\"default-dockercfg-rtxl9\"" Apr 24 15:05:15.458359 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.458337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/must-gather-vxmfj"] Apr 24 15:05:15.543842 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.543806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7l9p\" (UniqueName: \"kubernetes.io/projected/92f45227-df1d-40a6-aa89-b45b40dba993-kube-api-access-r7l9p\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.543992 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.543850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92f45227-df1d-40a6-aa89-b45b40dba993-must-gather-output\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.644870 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.644823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7l9p\" (UniqueName: \"kubernetes.io/projected/92f45227-df1d-40a6-aa89-b45b40dba993-kube-api-access-r7l9p\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.644870 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.644875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/92f45227-df1d-40a6-aa89-b45b40dba993-must-gather-output\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.645207 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.645191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/92f45227-df1d-40a6-aa89-b45b40dba993-must-gather-output\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.653492 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.653470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7l9p\" (UniqueName: \"kubernetes.io/projected/92f45227-df1d-40a6-aa89-b45b40dba993-kube-api-access-r7l9p\") pod \"must-gather-vxmfj\" (UID: \"92f45227-df1d-40a6-aa89-b45b40dba993\") " pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.760103 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.760021 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" Apr 24 15:05:15.875097 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:15.875065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/must-gather-vxmfj"] Apr 24 15:05:15.878007 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:05:15.877964 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92f45227_df1d_40a6_aa89_b45b40dba993.slice/crio-21450d0e2b706e6d7cc60e37ad481218f084c4106e55800752d30e4e8e418d53 WatchSource:0}: Error finding container 21450d0e2b706e6d7cc60e37ad481218f084c4106e55800752d30e4e8e418d53: Status 404 returned error can't find the container with id 21450d0e2b706e6d7cc60e37ad481218f084c4106e55800752d30e4e8e418d53 Apr 24 15:05:16.210438 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:16.210399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" event={"ID":"92f45227-df1d-40a6-aa89-b45b40dba993","Type":"ContainerStarted","Data":"21450d0e2b706e6d7cc60e37ad481218f084c4106e55800752d30e4e8e418d53"} Apr 24 15:05:17.215860 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:17.215833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" event={"ID":"92f45227-df1d-40a6-aa89-b45b40dba993","Type":"ContainerStarted","Data":"8ab9353f02fef40a836f3cf1de35d3df305f029c6faa4bbb4d5f2d2dc406618c"} Apr 24 15:05:17.216206 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:17.215869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" event={"ID":"92f45227-df1d-40a6-aa89-b45b40dba993","Type":"ContainerStarted","Data":"3fdb0040fd0aaf6f793535e45554cd5083be979b87cb8cf7a26d12813b95634b"} Apr 24 15:05:17.233176 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:17.233132 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rpb9c/must-gather-vxmfj" podStartSLOduration=1.167447577 podStartE2EDuration="2.2331166s" podCreationTimestamp="2026-04-24 15:05:15 +0000 UTC" firstStartedPulling="2026-04-24 15:05:15.87983932 +0000 UTC m=+2484.430681473" lastFinishedPulling="2026-04-24 15:05:16.945508335 +0000 UTC m=+2485.496350496" observedRunningTime="2026-04-24 15:05:17.231414048 +0000 UTC 
m=+2485.782256221" watchObservedRunningTime="2026-04-24 15:05:17.2331166 +0000 UTC m=+2485.783958750" Apr 24 15:05:18.408150 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:18.408121 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rdrc9_9c7c423b-7cf4-4b11-b125-c5bcef103313/global-pull-secret-syncer/0.log" Apr 24 15:05:18.470352 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:18.470315 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7wv4w_c69fa493-7572-43af-8e01-2691e09381c7/konnectivity-agent/0.log" Apr 24 15:05:18.541791 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:18.541761 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-169.ec2.internal_e8a21c2b27ced13adbceceb16f3c2439/haproxy/0.log" Apr 24 15:05:20.285677 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.285641 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gvvtc/must-gather-h5dm5"] Apr 24 15:05:20.286275 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.285905 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="copy" containerID="cri-o://663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad" gracePeriod=2 Apr 24 15:05:20.288493 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.288460 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gvvtc/must-gather-h5dm5"] Apr 24 15:05:20.288630 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.288480 2572 status_manager.go:895] "Failed to get status for pod" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" err="pods \"must-gather-h5dm5\" is forbidden: User \"system:node:ip-10-0-128-169.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvvtc\": no relationship found between node 'ip-10-0-128-169.ec2.internal' and this object" Apr 24 15:05:20.647415 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.646555 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvvtc_must-gather-h5dm5_027ba1cc-2379-404e-9f9f-256559e9844e/copy/0.log" Apr 24 15:05:20.647415 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.647067 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:05:20.649876 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.649822 2572 status_manager.go:895] "Failed to get status for pod" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" err="pods \"must-gather-h5dm5\" is forbidden: User \"system:node:ip-10-0-128-169.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvvtc\": no relationship found between node 'ip-10-0-128-169.ec2.internal' and this object" Apr 24 15:05:20.795955 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.790110 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mx44\" (UniqueName: \"kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44\") pod \"027ba1cc-2379-404e-9f9f-256559e9844e\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " Apr 24 15:05:20.795955 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.790177 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output\") pod \"027ba1cc-2379-404e-9f9f-256559e9844e\" (UID: \"027ba1cc-2379-404e-9f9f-256559e9844e\") " Apr 24 15:05:20.800249 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.798406 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "027ba1cc-2379-404e-9f9f-256559e9844e" (UID: "027ba1cc-2379-404e-9f9f-256559e9844e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 15:05:20.800249 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.799071 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44" (OuterVolumeSpecName: "kube-api-access-4mx44") pod "027ba1cc-2379-404e-9f9f-256559e9844e" (UID: "027ba1cc-2379-404e-9f9f-256559e9844e"). InnerVolumeSpecName "kube-api-access-4mx44". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 15:05:20.891152 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.891036 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mx44\" (UniqueName: \"kubernetes.io/projected/027ba1cc-2379-404e-9f9f-256559e9844e-kube-api-access-4mx44\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:05:20.891152 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:20.891078 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/027ba1cc-2379-404e-9f9f-256559e9844e-must-gather-output\") on node \"ip-10-0-128-169.ec2.internal\" DevicePath \"\"" Apr 24 15:05:21.235709 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.235385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gvvtc_must-gather-h5dm5_027ba1cc-2379-404e-9f9f-256559e9844e/copy/0.log" Apr 24 15:05:21.235886 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.235812 2572 generic.go:358] "Generic (PLEG): container finished" podID="027ba1cc-2379-404e-9f9f-256559e9844e" containerID="663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad" exitCode=143 Apr 24 15:05:21.236049 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.235946 2572 scope.go:117] "RemoveContainer" containerID="663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad" Apr 24 15:05:21.236188 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.236062 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" Apr 24 15:05:21.239766 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.239729 2572 status_manager.go:895] "Failed to get status for pod" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" err="pods \"must-gather-h5dm5\" is forbidden: User \"system:node:ip-10-0-128-169.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvvtc\": no relationship found between node 'ip-10-0-128-169.ec2.internal' and this object" Apr 24 15:05:21.252681 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.252661 2572 scope.go:117] "RemoveContainer" containerID="4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb" Apr 24 15:05:21.253666 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.253634 2572 status_manager.go:895] "Failed to get status for pod" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" err="pods \"must-gather-h5dm5\" is forbidden: User \"system:node:ip-10-0-128-169.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvvtc\": no relationship found between node 'ip-10-0-128-169.ec2.internal' and this object" Apr 24 15:05:21.282794 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.282670 2572 scope.go:117] "RemoveContainer" containerID="663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad" Apr 24 15:05:21.282997 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:05:21.282970 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad\": container with ID starting with 663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad not found: ID does not exist" containerID="663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad" Apr 24 
15:05:21.283076 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.283010 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad"} err="failed to get container status \"663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad\": rpc error: code = NotFound desc = could not find container \"663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad\": container with ID starting with 663a5d4bc47eaf767f210c0b48bc2927870df6d444ce5e9c996db47b20afe1ad not found: ID does not exist" Apr 24 15:05:21.283076 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.283034 2572 scope.go:117] "RemoveContainer" containerID="4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb" Apr 24 15:05:21.283258 ip-10-0-128-169 kubenswrapper[2572]: E0424 15:05:21.283235 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb\": container with ID starting with 4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb not found: ID does not exist" containerID="4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb" Apr 24 15:05:21.283334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:21.283266 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb"} err="failed to get container status \"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb\": rpc error: code = NotFound desc = could not find container \"4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb\": container with ID starting with 4710c259df7fb243d5b2368abc3265c3fa3a72bea49c273f974fdb5178b5c3cb not found: ID does not exist" Apr 24 15:05:22.081668 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:22.081629 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" path="/var/lib/kubelet/pods/027ba1cc-2379-404e-9f9f-256559e9844e/volumes" Apr 24 15:05:22.083059 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:22.082999 2572 status_manager.go:895] "Failed to get status for pod" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" pod="openshift-must-gather-gvvtc/must-gather-h5dm5" err="pods \"must-gather-h5dm5\" is forbidden: User \"system:node:ip-10-0-128-169.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-gvvtc\": no relationship found between node 'ip-10-0-128-169.ec2.internal' and this object" Apr 24 15:05:22.166987 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:22.166951 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-f27mm_c09d1452-ea2f-4446-ba91-3ce10da5c7ee/manager/0.log" Apr 24 15:05:22.183792 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:22.183753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-lx5zw_3950a3e3-5147-41d7-b415-4cdd6e7bfb02/limitador/0.log" Apr 24 15:05:22.215940 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:22.215913 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-s65gz_cbc11936-8b41-4f5c-80e1-56df1e560782/manager/0.log" Apr 24 15:05:23.642959 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:23.642918 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m7qfv_dcf31072-2346-4778-8912-4233fbb8dd01/node-exporter/0.log" Apr 24 15:05:23.659243 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:23.659205 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m7qfv_dcf31072-2346-4778-8912-4233fbb8dd01/kube-rbac-proxy/0.log" Apr 24 15:05:23.675396 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:23.675362 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m7qfv_dcf31072-2346-4778-8912-4233fbb8dd01/init-textfile/0.log" Apr 24 15:05:27.733648 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.733591 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk"] Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734088 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="gather" Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734108 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="gather" Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734126 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="copy" Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734135 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="copy" Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734208 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="copy" Apr 24 15:05:27.734334 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.734223 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="027ba1cc-2379-404e-9f9f-256559e9844e" containerName="gather" Apr 24 15:05:27.740676 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.740653 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.744763 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.744734 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk"] Apr 24 15:05:27.861674 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.861643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg42g\" (UniqueName: \"kubernetes.io/projected/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-kube-api-access-xg42g\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.861859 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.861685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-podres\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.861859 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.861820 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-proc\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.861963 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.861857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-sys\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.861963 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.861915 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-lib-modules\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.962949 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.962902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-proc\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963146 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.962968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-sys\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963146 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-lib-modules\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963146 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg42g\" (UniqueName: \"kubernetes.io/projected/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-kube-api-access-xg42g\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963146 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-podres\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963392 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-podres\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963392 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-proc\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963505 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-sys\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.963505 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.963499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-lib-modules\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:27.973496 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:27.973468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg42g\" (UniqueName: \"kubernetes.io/projected/fd4a3a10-157b-4e24-8bef-ccb2075dd8c7-kube-api-access-xg42g\") pod \"perf-node-gather-daemonset-r7mgk\" (UID: \"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7\") " pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:28.024933 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.024841 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w8bbw_7a535dae-e92e-4ab6-b2ab-5eb561bc8793/dns/0.log" Apr 24 15:05:28.040396 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.040371 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-w8bbw_7a535dae-e92e-4ab6-b2ab-5eb561bc8793/kube-rbac-proxy/0.log" Apr 24 15:05:28.055034 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.055009 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:28.147166 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.147140 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qdkg2_7ef2b28b-afd5-4cb7-a98e-824028a4bb08/dns-node-resolver/0.log" Apr 24 15:05:28.184784 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.184111 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk"] Apr 24 15:05:28.187635 ip-10-0-128-169 kubenswrapper[2572]: W0424 15:05:28.187060 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfd4a3a10_157b_4e24_8bef_ccb2075dd8c7.slice/crio-c00437efbd41a414dbc3c32335c5ce6466fd6109e097aae0c462598c0d28b1f0 WatchSource:0}: Error finding container c00437efbd41a414dbc3c32335c5ce6466fd6109e097aae0c462598c0d28b1f0: Status 404 returned error can't find the container with id c00437efbd41a414dbc3c32335c5ce6466fd6109e097aae0c462598c0d28b1f0 Apr 24 15:05:28.191260 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.188981 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 15:05:28.270452 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.270416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" event={"ID":"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7","Type":"ContainerStarted","Data":"c00437efbd41a414dbc3c32335c5ce6466fd6109e097aae0c462598c0d28b1f0"} Apr 24 15:05:28.590363 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:28.590296 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gxbvj_f4b9ac64-b92d-41f4-8a6d-92cb3608d7a4/node-ca/0.log" Apr 24 15:05:29.276725 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:29.276690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" event={"ID":"fd4a3a10-157b-4e24-8bef-ccb2075dd8c7","Type":"ContainerStarted","Data":"1db378fc25052611f5856fc8840668d923890cfcbb4c2e35d5c891191a30e3de"} Apr 24 15:05:29.277106 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:29.276753 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:29.296668 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:29.296592 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" podStartSLOduration=2.2965737920000002 podStartE2EDuration="2.296573792s" podCreationTimestamp="2026-04-24 15:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 15:05:29.293974715 +0000 UTC m=+2497.844816886" watchObservedRunningTime="2026-04-24 15:05:29.296573792 +0000 UTC m=+2497.847415965" Apr 24 15:05:29.405914 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:29.405885 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-hndv9_c0110066-5174-4fda-9ac6-6754da8f8764/istio-proxy/0.log" Apr 24 15:05:29.850723 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:29.850698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dp5qc_933358e5-3e5d-42e6-8b21-2e75a3e94d59/serve-healthcheck-canary/0.log" Apr 24 15:05:30.339068 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:30.339042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2k95_e4ced4c6-7792-40bd-9f9e-a696d34a3c84/kube-rbac-proxy/0.log" Apr 24 15:05:30.353297 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:30.353264 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2k95_e4ced4c6-7792-40bd-9f9e-a696d34a3c84/exporter/0.log" Apr 24 15:05:30.367853 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:30.367830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v2k95_e4ced4c6-7792-40bd-9f9e-a696d34a3c84/extractor/0.log" Apr 24 15:05:33.390054 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:33.390026 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b7dc77d59-6cvhv_5e08c887-a51c-4489-a68b-5644b9e3a4f6/manager/0.log" Apr 24 15:05:33.464202 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:33.464176 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-7cq72_be7aa129-6149-4473-933b-b1541696bf80/server/0.log" Apr 24 15:05:33.674502 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:33.674432 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-62b7s_4836eeb4-d0f0-4314-b7f7-256e05157faf/manager/0.log" Apr 24 15:05:33.689944 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:33.689910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-z2w67_a13d4e1c-afbc-46bd-b1b9-4d74924c7eb2/s3-init/0.log" Apr 24 15:05:33.710747 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:33.710722 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-8tjzz_4de2d5fb-a2ed-414b-8303-dbe27d214686/seaweedfs/0.log" Apr 24 15:05:35.292995 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:35.292967 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rpb9c/perf-node-gather-daemonset-r7mgk" Apr 24 15:05:39.346744 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.346714 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/kube-multus-additional-cni-plugins/0.log" Apr 24 15:05:39.365169 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.365141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/egress-router-binary-copy/0.log" Apr 24 15:05:39.381356 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.381334 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/cni-plugins/0.log" Apr 24 15:05:39.396394 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.396370 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/bond-cni-plugin/0.log" Apr 24 15:05:39.411853 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.411829 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/routeoverride-cni/0.log" Apr 24 15:05:39.426205 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.426177 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/whereabouts-cni-bincopy/0.log" Apr 24 15:05:39.440532 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.440513 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gsthx_4372c551-45c1-401f-b043-c8048ef49e81/whereabouts-cni/0.log" Apr 24 15:05:39.743217 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.743147 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9n5z_0cf9f051-7347-447e-934c-30eb0f79fd31/kube-multus/0.log" Apr 24 15:05:39.806866 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.806821 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vzlj_1a1e1cea-5bb5-4d2e-83e5-817f18307569/network-metrics-daemon/0.log" Apr 24 15:05:39.819488 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:39.819458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vzlj_1a1e1cea-5bb5-4d2e-83e5-817f18307569/kube-rbac-proxy/0.log" Apr 24 15:05:40.923257 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.923144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-controller/0.log" Apr 24 15:05:40.936573 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.936526 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/0.log" Apr 24 15:05:40.950006 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.949984 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovn-acl-logging/1.log" Apr 24 15:05:40.965485 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.965440 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/kube-rbac-proxy-node/0.log" Apr 24 15:05:40.980588 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.980565 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 15:05:40.993452 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:40.993410 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/northd/0.log" Apr 24 15:05:41.008840 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:41.008820 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/nbdb/0.log" Apr 24 15:05:41.022924 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:41.022883 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/sbdb/0.log" Apr 24 15:05:41.134300 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:41.134273 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7w4g5_2b4a98b9-41f7-4893-a186-4cc7fb68fb05/ovnkube-controller/0.log" Apr 24 15:05:42.456272 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:42.456229 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-md9sj_7433ec7f-d2d9-4c4e-a533-4c4c15fc9bfa/network-check-target-container/0.log" Apr 24 15:05:43.340558 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:43.340518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7fk7l_da7813e0-cc46-4210-982d-86a1bd6de417/iptables-alerter/0.log" Apr 24 15:05:44.074756 ip-10-0-128-169 kubenswrapper[2572]: I0424 15:05:44.074726 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dx4gw_fc559a32-7f1b-4b50-a3a8-4b6b184a2586/tuned/0.log"