Apr 16 14:50:05.507611 ip-10-0-139-55 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 16 14:50:05.507624 ip-10-0-139-55 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 16 14:50:05.507631 ip-10-0-139-55 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 16 14:50:05.507892 ip-10-0-139-55 systemd[1]: Failed to start Kubernetes Kubelet. Apr 16 14:50:15.732669 ip-10-0-139-55 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 16 14:50:15.732689 ip-10-0-139-55 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot 35c1850a4af141839855d9899e960afa -- Apr 16 14:52:28.967286 ip-10-0-139-55 systemd[1]: Starting Kubernetes Kubelet... Apr 16 14:52:29.487591 ip-10-0-139-55 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 14:52:29.487591 ip-10-0-139-55 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 14:52:29.487591 ip-10-0-139-55 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 14:52:29.487591 ip-10-0-139-55 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 14:52:29.487591 ip-10-0-139-55 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 16 14:52:29.489249 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.489151 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 14:52:29.494076 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494053 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.494076 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494072 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.494076 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494078 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.494076 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494082 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494086 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494090 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494094 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494098 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494102 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494106 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494110 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494114 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494117 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494129 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494133 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494137 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494141 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494146 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494150 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494153 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494157 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.494331 
ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494161 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494165 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.494331 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494169 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494173 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494177 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494181 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494185 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494189 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494194 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494199 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494202 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494206 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494211 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494214 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494218 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494222 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494228 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494233 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494237 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494241 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494245 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494249 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.495145 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494254 2577 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494258 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494263 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494268 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494272 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494279 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494288 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494293 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494298 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494302 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494307 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494311 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494315 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494319 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494324 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494327 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494332 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494336 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494342 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.495960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494347 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494354 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494358 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494363 2577 feature_gate.go:328] unrecognized feature 
gate: SignatureStores Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494367 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494371 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494376 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494381 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494385 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494389 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494394 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494402 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494406 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494411 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494415 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494422 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494428 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494432 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494437 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.496686 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494442 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494446 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494451 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.494455 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495122 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495131 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495136 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495140 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495144 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495148 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495152 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495157 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495164 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495169 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495174 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495178 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495183 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495188 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495193 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.497214 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495198 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495203 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495207 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495212 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495216 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495220 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495225 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495231 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495236 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495241 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495246 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495251 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495255 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495259 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495263 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495267 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495272 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495276 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495281 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.497810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495285 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495289 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495293 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495297 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495301 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495306 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495310 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495315 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495320 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495326 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495330 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.498385 ip-10-0-139-55 
kubenswrapper[2577]: W0416 14:52:29.495334 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495338 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495343 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495347 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495353 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495357 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495362 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495366 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495370 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.498385 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495376 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495380 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495384 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495389 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495394 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495398 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495402 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495406 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495411 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495415 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495419 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495423 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495427 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495441 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495446 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495450 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495455 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495459 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495463 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495467 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.498881 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495471 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495476 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495480 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495485 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495489 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495493 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495497 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495502 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495506 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495510 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495514 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.495518 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497368 2577 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497430 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497575 2577 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497580 2577 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497586 2577 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 14:52:29.499392 ip-10-0-139-55 
kubenswrapper[2577]: I0416 14:52:29.497590 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497595 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497600 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497603 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 14:52:29.499392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497606 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497609 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497613 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497616 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497619 2577 flags.go:64] FLAG: --cgroup-root="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497622 2577 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497625 2577 flags.go:64] FLAG: --client-ca-file="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497628 2577 flags.go:64] FLAG: --cloud-config="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497631 2577 flags.go:64] FLAG: --cloud-provider="external" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497634 2577 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497639 2577 flags.go:64] FLAG: --cluster-domain="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497642 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497646 2577 flags.go:64] FLAG: --config-dir="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497649 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497652 2577 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497656 2577 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497660 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497663 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497666 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497670 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497673 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497677 2577 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497680 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497684 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497689 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 14:52:29.499905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497692 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497695 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497698 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497701 2577 flags.go:64] FLAG: --enable-server="true" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497704 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497710 2577 flags.go:64] FLAG: --event-burst="100" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497713 2577 flags.go:64] FLAG: --event-qps="50" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497716 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497719 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497722 2577 flags.go:64] FLAG: --eviction-hard="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497726 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497729 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497732 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497736 2577 flags.go:64] FLAG: --eviction-soft="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497738 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497741 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497744 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497747 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497751 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497754 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497757 2577 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497761 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497764 2577 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497768 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497771 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497774 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:29.500522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497777 2577 flags.go:64] FLAG: --help="false" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497780 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497784 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497787 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497790 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497794 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497798 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497801 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497804 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497807 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497810 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497813 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497817 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497820 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497823 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497826 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497829 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497832 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497835 2577 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497839 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497842 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497845 2577 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497851 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:29.501146 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497854 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497856 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497859 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497862 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497865 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497868 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497871 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497876 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497879 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497883 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497887 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497890 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497893 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497896 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497899 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497902 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497905 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497912 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497915 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497918 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497921 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497924 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497945 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497950 2577 flags.go:64] FLAG: 
--pod-max-pids="-1" Apr 16 14:52:29.501777 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497953 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497956 2577 flags.go:64] FLAG: --port="10250" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497960 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497962 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ea93e02c67747a4e" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497966 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497969 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497973 2577 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497975 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497978 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497982 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497985 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497988 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497991 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497995 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.497997 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498000 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498003 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498006 2577 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498009 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498012 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498019 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498022 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498025 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498028 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498031 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:29.502371 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498034 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:29.502371 ip-10-0-139-55 
kubenswrapper[2577]: I0416 14:52:29.498037 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498040 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498043 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498046 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498049 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498052 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498055 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498061 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498064 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498067 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498071 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498075 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498078 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498081 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498084 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498087 2577 flags.go:64] FLAG: --v="2" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498091 2577 flags.go:64] FLAG: --version="false" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498101 2577 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498105 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498108 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498207 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498212 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498215 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498219 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.503013 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498222 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498225 2577 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498230 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498232 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498235 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498238 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498241 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498244 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498246 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498249 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498252 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498254 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498257 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498260 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498262 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498265 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498268 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498270 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498273 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498276 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.503581 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498279 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498281 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498284 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498286 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: 
W0416 14:52:29.498289 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498292 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498294 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498297 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498299 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498302 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498304 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498307 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498311 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498314 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498318 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498322 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498325 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498328 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498331 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498333 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.504136 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498336 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498338 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498341 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498344 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498346 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498349 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498351 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498354 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498356 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498359 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498362 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498364 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498367 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498370 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498373 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498375 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498378 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498380 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498383 2577 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498386 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.504632 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498389 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498391 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498394 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498397 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498401 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498404 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498407 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498410 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498413 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498417 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
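Two other warning variants in this block are worth separating from the unrecognized-gate noise: feature_gate.go:349 flags a deprecated gate that is explicitly set (KMSv1=true) and feature_gate.go:351 flags a GA gate that no longer needs setting (ServiceAccountTokenNodeBinding=true). In both cases the value still takes effect; the warning only announces that the knob itself is scheduled for removal. An illustrative sketch of that lifecycle check (the stage table here is assumed for the example, not the kubelet's actual registry):

```go
// Illustrative sketch of the lifecycle checks behind feature_gate.go:349
// (deprecated) and feature_gate.go:351 (GA). The stage assignments below
// are taken from the warnings in this log.
package main

import "log"

type stage string

const (
	beta       stage = "BETA"
	ga         stage = "GA"
	deprecated stage = "DEPRECATED"
)

var lifecycle = map[string]stage{
	"KMSv1":                          deprecated, // matches the :349 warning
	"ServiceAccountTokenNodeBinding": ga,         // matches the :351 warning
	"NodeSwap":                       beta,
}

// set honors the value in every case; GA and deprecated gates only earn
// a warning because the setting itself will eventually be removed.
func set(name string, value bool) {
	switch lifecycle[name] {
	case deprecated:
		log.Printf("Setting deprecated feature gate %s=%t. It will be removed in a future release.", name, value)
	case ga:
		log.Printf("Setting GA feature gate %s=%t. It will be removed in a future release.", name, value)
	}
}

func main() {
	set("KMSv1", true)
	set("ServiceAccountTokenNodeBinding", true)
	set("NodeSwap", false) // beta gate: set silently
}
```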
Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498421 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498424 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498426 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498429 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498432 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498435 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498438 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498440 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498443 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.505172 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498445 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498448 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.498450 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.498456 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.505310 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.505328 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505376 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505381 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505384 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505387 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505390 2577 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505393 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505396 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505399 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505402 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505404 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.505634 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505407 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505410 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505412 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505416 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505418 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505421 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505424 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505427 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505430 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505433 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505435 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505438 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505441 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505444 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505447 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505449 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505452 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: 
W0416 14:52:29.505455 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505457 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505460 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:29.506048 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505462 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505467 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505470 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505473 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505475 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505478 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505481 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505484 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505486 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505489 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505492 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505495 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505497 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505500 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505503 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505506 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505509 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505511 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505514 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505517 2577 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 16 14:52:29.506529 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505519 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505522 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505525 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505530 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505533 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505536 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505540 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505543 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505546 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505548 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505551 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505554 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505556 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505559 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505562 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505566 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505569 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505571 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505574 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.507102 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505576 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505579 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505582 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.507558 
ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505584 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505587 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505592 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505595 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505597 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505601 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505605 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505608 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505611 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505614 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505617 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505620 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.507558 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505623 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.505628 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505725 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505730 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505733 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505736 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505739 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505742 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505745 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505747 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505750 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505753 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505756 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505759 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505761 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:29.507960 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505763 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505766 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505769 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505771 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505774 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505776 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505778 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505781 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505784 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505787 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505789 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505792 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505794 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505797 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 
14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505800 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505802 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505805 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505808 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505810 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505813 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:29.508333 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505816 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505819 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505821 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505824 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505826 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505829 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505831 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505834 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505837 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505839 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505842 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505845 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505847 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505851 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505853 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505855 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505858 2577 feature_gate.go:328] 
unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505861 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505863 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505866 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:29.508810 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505868 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505871 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505873 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505876 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505878 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505881 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505884 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505887 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505890 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505892 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505895 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505897 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505900 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505902 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505905 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505907 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505911 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
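The same enumeration repeats because the kubelet parses the gate list more than once during startup (for the command line and again while resolving configuration), and each pass ends with an identical "feature gates: {map[...]}" summary at feature_gate.go:384. That summary is the authoritative record of what is actually in effect, so when comparing two boots it can help to parse it back into a map. A small sketch using only the standard library:

```go
// Sketch: parse a "feature gates: {map[...]}" summary back into a map so
// the effective gates of two boots can be diffed. Standard library only.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func parseGates(summary string) (map[string]bool, error) {
	body := strings.TrimSuffix(strings.TrimPrefix(strings.TrimSpace(summary), "{map["), "]}")
	gates := make(map[string]bool)
	for _, kv := range strings.Fields(body) {
		name, raw, ok := strings.Cut(kv, ":")
		if !ok {
			return nil, fmt.Errorf("malformed entry %q", kv)
		}
		enabled, err := strconv.ParseBool(raw)
		if err != nil {
			return nil, fmt.Errorf("gate %s: %w", name, err)
		}
		gates[name] = enabled
	}
	return gates, nil
}

func main() {
	gates, err := parseGates("{map[ImageVolume:true KMSv1:true NodeSwap:false]}")
	if err != nil {
		panic(err)
	}
	fmt.Println(gates["KMSv1"], gates["NodeSwap"]) // true false
}
```

If two boots print different summaries, the difference came from the rendered kubelet configuration, not from the unrecognized-gate warnings.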
Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505914 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505917 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505920 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:29.509322 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505923 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505926 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505928 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505947 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505950 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505954 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505958 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505960 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505963 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505966 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505969 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505972 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:29.505975 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.505979 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.506747 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:29.509809 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.508961 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:29.510192 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.509958 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:29.510192 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.510060 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:29.510192 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.510101 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:29.541524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.541508 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:29.546380 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.546363 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:29.562247 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.562187 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:29.570340 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.570318 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:29.571345 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.571325 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:29.571592 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.571574 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:29.576723 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.576705 2577 fs.go:135] Filesystem UUIDs: map[4cd7c7b3-d797-4dca-a263-859be5f8b8cd:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 888937bc-cd1c-4b86-a965-744352b3aaff:/dev/nvme0n1p3]
Apr 16 14:52:29.576767 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.576724 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
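From here the startup sequence proper begins: client certificate rotation is enabled and the bootstrap credential is used to request a kubelet client certificate (server.go:962, bootstrap.go:101), the CRI v1 runtime and image APIs are validated against CRI-O, and the cgroup driver is taken from the runtime itself (cgroupDriver="systemd"), which rules out the classic kubelet/runtime cgroup-driver mismatch. The rotation logic the certificate_manager lines imply boils down to requesting a replacement well before expiry; a sketch of that deadline computation (the 80% threshold is an assumption for illustration, the real manager also applies jitter, and the path is the kubelet's usual client-cert symlink):

```go
// Sketch of the rotation-deadline idea behind the certificate_manager
// lines: ask for a replacement certificate well before expiry. The 80%
// threshold is assumed for illustration; the real manager adds jitter.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func rotateAfter(cert *x509.Certificate) time.Duration {
	lifetime := cert.NotAfter.Sub(cert.NotBefore)
	deadline := cert.NotBefore.Add(lifetime * 8 / 10) // ~80% of lifetime
	return time.Until(deadline)
}

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		fmt.Println(err) // expected when run off-node
		return
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		fmt.Println("no PEM block found")
		return
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("rotate in %s\n", rotateAfter(cert).Round(time.Minute))
}
```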
Apr 16 14:52:29.583286 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.583173 2577 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:29.581114108 +0000 UTC m=+0.471528993 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100748 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28497552d805f996651463c6c78585 SystemUUID:ec284975-52d8-05f9-9665-1463c6c78585 BootID:35c1850a-4af1-4183-9855-d9899e960afa Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:28:8f:b7:f9:1b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:28:8f:b7:f9:1b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4e:4e:3e:f0:ea:92 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:29.583286 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.583279 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:29.583440 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.583364 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:29.586953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.586910 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:29.587103 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.586951 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:29.587156 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.587112 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:29.587156 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.587120 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:29.587156 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.587133 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:29.588023 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.588011 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:29.589822 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.589811 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:29.590140 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.590129 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:52:29.593433 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.593421 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:52:29.594088 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.594077 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:52:29.594127 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.594101 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:52:29.594127 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.594111 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:52:29.594127 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.594121 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:52:29.595266 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.595249 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
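The nodeConfig dump above is the single most useful line here for capacity questions: it records SystemReserved (500m CPU, 1Gi memory, 1Gi ephemeral storage), the hard eviction thresholds (memory.available < 100Mi, nodefs < 10%, nodefs inodes < 5%, imagefs < 15%, imagefs inodes < 5%), PodPidsLimit=4096, and cgroup v2 with the systemd driver. Node allocatable follows directly from these numbers via the standard kubelet formula, allocatable = capacity - kubeReserved - systemReserved - hardEviction. A worked sketch using the MemoryCapacity reported in the cAdvisor Machine line above:

```go
// Worked sketch: node allocatable memory from the numbers in this log,
// following allocatable = capacity - kubeReserved - systemReserved -
// hardEviction (the standard kubelet node-allocatable formula).
package main

import "fmt"

const (
	gi = int64(1) << 30
	mi = int64(1) << 20
)

func main() {
	capacity := int64(32812175360) // MemoryCapacity from the cAdvisor Machine line
	systemReserved := 1 * gi       // "SystemReserved":{"memory":"1Gi",...}
	kubeReserved := int64(0)       // "KubeReserved":null
	hardEviction := 100 * mi       // memory.available < 100Mi threshold

	allocatable := capacity - systemReserved - kubeReserved - hardEviction
	fmt.Printf("allocatable memory: %d bytes (%.2f Gi)\n",
		allocatable, float64(allocatable)/float64(gi))
}
```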
Apr 16 14:52:29.595266 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.595270 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:29.599020 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.598993 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:52:29.601804 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.601790 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:52:29.603885 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603873 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603890 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603898 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603906 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603915 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603923 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603947 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 14:52:29.603953 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603954 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 14:52:29.604134 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603961 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 14:52:29.604134 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603967 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 14:52:29.604134 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603976 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 14:52:29.604134 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.603985 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 14:52:29.606729 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.606717 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 14:52:29.606766 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.606733 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 14:52:29.607763 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.607736 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 14:52:29.607809 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.607743 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 14:52:29.610505 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.610491 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 14:52:29.610544 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.610539 2577 server.go:1295] "Started kubelet"
Apr 16 14:52:29.610657 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.610626 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 14:52:29.610842 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.610615 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 14:52:29.610898 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.610863 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 14:52:29.611423 ip-10-0-139-55 systemd[1]: Started Kubernetes Kubelet.
Apr 16 14:52:29.612029 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.611905 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 14:52:29.613350 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.613338 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:52:29.616778 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.616758 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 14:52:29.617630 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.617611 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:52:29.617630 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.617620 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:29.617763 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.616810 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-55.ec2.internal.18a6ddf584e4104e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-55.ec2.internal,UID:ip-10-0-139-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-55.ec2.internal,},FirstTimestamp:2026-04-16 14:52:29.61050427 +0000 UTC m=+0.500919146,LastTimestamp:2026-04-16 14:52:29.61050427 +0000 UTC m=+0.500919146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-55.ec2.internal,}"
Apr 16 14:52:29.618231 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618211 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:52:29.618314 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618302 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:52:29.618361 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618316 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:52:29.618361 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.618349 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found"
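The errors in this stretch are the normal TLS-bootstrap chicken-and-egg: until the CSR created with the bootstrap credential is approved and issued, API requests fall back to system:anonymous and are denied by RBAC, and "node not found" repeats until registration completes. All of them resolve within the same second once csr-zqbhc is issued below. A triage sketch that classifies these anonymous denials as benign when a later line shows the CSR being issued (it reads a kubelet journal on stdin; the matching strings are taken from the entries above):

```go
// Triage sketch: classify "system:anonymous ... forbidden" errors as
// benign bootstrap noise when a later line shows the client CSR being
// issued. Feed it a kubelet journal on stdin, e.g.
//   journalctl -u kubelet | ./bootstrap-triage
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	var denied int
	issued := false
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		// klog prints the quotes escaped inside err="..." and unescaped
		// in plain-text messages, so match both spellings.
		if strings.Contains(line, `User \"system:anonymous\"`) ||
			strings.Contains(line, `User "system:anonymous"`) {
			denied++
		}
		if strings.Contains(line, "Certificate signing request is issued") {
			issued = true
		}
	}
	if issued {
		fmt.Printf("%d anonymous denials, benign: the CSR was issued later\n", denied)
	} else {
		fmt.Printf("%d anonymous denials and no issued CSR: check the CSR approver\n", denied)
	}
}
```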
Apr 16 14:52:29.618450 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618430 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:52:29.618450 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618439 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:52:29.618709 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618470 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:52:29.618709 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618481 2577 factory.go:55] Registering systemd factory
Apr 16 14:52:29.618709 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618491 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:52:29.618709 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.618692 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zqbhc"
Apr 16 14:52:29.623455 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.623425 2577 factory.go:153] Registering CRI-O factory
Apr 16 14:52:29.623568 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.623556 2577 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:52:29.623661 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.623645 2577 factory.go:103] Registering Raw factory
Apr 16 14:52:29.623661 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.623663 2577 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:52:29.624006 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.623994 2577 manager.go:319] Starting recovery of all containers
Apr 16 14:52:29.625487 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.625463 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zqbhc"
Apr 16 14:52:29.625778 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.625757 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 14:52:29.628954 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.628911 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 14:52:29.629095 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.629072 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 14:52:29.636051 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.635920 2577 manager.go:324] Recovery completed
Apr 16 14:52:29.639965 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.639951 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:29.642871 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.642852 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:29.642980 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.642887 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:29.642980 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.642898 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:29.644066 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.644047 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:52:29.644066 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.644066 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:52:29.644168 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.644083 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:29.646333 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.646318 2577 policy_none.go:49] "None policy: Start"
Apr 16 14:52:29.646389 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.646339 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:52:29.646389 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.646351 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:52:29.684607 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.684590 2577 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.684623 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.684634 2577 server.go:85] "Starting device plugin registration server"
Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.684875 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.684888 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
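The device-plugin lines above describe a registration server on a unix socket (/var/lib/kubelet/device-plugins/kubelet.sock, API version v1beta1); "checkpoint is not found" is expected on a node's first kubelet start, since kubelet_internal_checkpoint has not been written yet. The transport underneath is simply a unix domain socket that plugins dial to register, sketched below with a plain line protocol to stay self-contained, whereas the real registration API on kubelet.sock is gRPC:

```go
// Sketch of the transport behind "Creating device plugin registration
// server": a listener on a unix domain socket that plugins dial. The
// real kubelet API is gRPC (v1beta1); the line protocol here is only
// to keep the example runnable without extra dependencies.
package main

import (
	"bufio"
	"fmt"
	"net"
	"os"
)

func main() {
	sock := "/tmp/kubelet-demo.sock" // stand-in for /var/lib/kubelet/device-plugins/kubelet.sock
	os.Remove(sock)                  // stale socket files must be cleared before Listen
	l, err := net.Listen("unix", sock)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer l.Close()
	fmt.Println("listening on", sock)
	for {
		conn, err := l.Accept()
		if err != nil {
			return
		}
		go func(c net.Conn) {
			defer c.Close()
			if line, err := bufio.NewReader(c).ReadString('\n'); err == nil {
				fmt.Printf("plugin registered: %s", line)
			}
		}(conn)
	}
}
```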
14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.684986 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.685070 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.685081 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.685682 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:52:29.696818 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.685722 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:29.751149 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.751116 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:29.752516 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.752503 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:29.752584 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.752532 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:29.752584 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.752553 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 14:52:29.752584 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.752559 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:29.752720 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.752590 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:29.754898 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.754879 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:29.785920 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.785873 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.787752 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.787732 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.787850 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.787766 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.787850 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.787781 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.787850 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.787810 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.797775 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.797753 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.797853 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.797777 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node 
\"ip-10-0-139-55.ec2.internal\": node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:29.814217 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.814160 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:29.853398 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.853346 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal"] Apr 16 14:52:29.853504 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.853446 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.854383 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.854369 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.854443 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.854399 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.854443 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.854408 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.855609 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.855596 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.855743 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.855729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.855785 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.855758 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.856328 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856314 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.856403 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856341 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.856403 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856353 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.856403 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856356 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.856403 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856379 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.856403 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.856393 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.857822 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.857804 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.857900 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.857829 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:29.858498 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.858464 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:29.858498 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.858488 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:29.858498 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.858498 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:29.876489 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.876472 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-55.ec2.internal\" not found" node="ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.882506 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.882489 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-55.ec2.internal\" not found" node="ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.914706 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:29.914678 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:29.919641 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.919621 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.919702 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.919651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:29.919702 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:29.919668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8bc00e80c830605c41caa74a5fa346b2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-55.ec2.internal\" (UID: \"8bc00e80c830605c41caa74a5fa346b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.015161 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.015126 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.020499 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.020561 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.020561 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020521 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.020561 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8bc00e80c830605c41caa74a5fa346b2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-55.ec2.internal\" (UID: \"8bc00e80c830605c41caa74a5fa346b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.020666 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8bc00e80c830605c41caa74a5fa346b2-config\") pod \"kube-apiserver-proxy-ip-10-0-139-55.ec2.internal\" (UID: \"8bc00e80c830605c41caa74a5fa346b2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.020666 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.020604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f471d1950d411efb876ec3531c30b142-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal\" (UID: \"f471d1950d411efb876ec3531c30b142\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.115918 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.115843 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.182469 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.182432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.184848 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.184834 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.216948 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.216905 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.317472 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.317419 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.417982 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.417868 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.509444 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.509418 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:30.510042 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.509557 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:30.518588 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.518567 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.618653 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.618618 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:30.618653 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:30.618659 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-55.ec2.internal\" not found" Apr 16 14:52:30.623819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.623795 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:30.628043 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.628007 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:29 +0000 UTC" deadline="2028-01-17 20:39:35.348940634 +0000 UTC" Apr 16 14:52:30.628043 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.628043 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15389h47m4.720900459s" Apr 16 14:52:30.631104 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.631088 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:30.658347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.658321 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kqc9d" Apr 16 14:52:30.659905 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.659889 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:30.664084 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.664062 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kqc9d" Apr 16 14:52:30.706944 ip-10-0-139-55 
kubenswrapper[2577]: I0416 14:52:30.706864 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:30.718763 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.718736 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.722759 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:30.722733 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf471d1950d411efb876ec3531c30b142.slice/crio-b149a7f6a637045054524044059fe047a3a3943c7ceeb5367f335d983f274f21 WatchSource:0}: Error finding container b149a7f6a637045054524044059fe047a3a3943c7ceeb5367f335d983f274f21: Status 404 returned error can't find the container with id b149a7f6a637045054524044059fe047a3a3943c7ceeb5367f335d983f274f21 Apr 16 14:52:30.724435 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:30.724412 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc00e80c830605c41caa74a5fa346b2.slice/crio-7257a5dfd573e44bf8ad0fa7a400c1b8750133e5bc474ed28870afd6680fb42e WatchSource:0}: Error finding container 7257a5dfd573e44bf8ad0fa7a400c1b8750133e5bc474ed28870afd6680fb42e: Status 404 returned error can't find the container with id 7257a5dfd573e44bf8ad0fa7a400c1b8750133e5bc474ed28870afd6680fb42e Apr 16 14:52:30.727696 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.727682 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:30.732325 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.732307 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:30.733256 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.733240 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" Apr 16 14:52:30.741046 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.741032 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:30.756035 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.755990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" event={"ID":"8bc00e80c830605c41caa74a5fa346b2","Type":"ContainerStarted","Data":"7257a5dfd573e44bf8ad0fa7a400c1b8750133e5bc474ed28870afd6680fb42e"} Apr 16 14:52:30.756847 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:30.756821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" event={"ID":"f471d1950d411efb876ec3531c30b142","Type":"ContainerStarted","Data":"b149a7f6a637045054524044059fe047a3a3943c7ceeb5367f335d983f274f21"} Apr 16 14:52:31.566366 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.566328 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:31.595416 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.595379 2577 apiserver.go:52] "Watching apiserver" Apr 16 14:52:31.601572 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.601545 2577 
reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:31.603588 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.603563 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-7s8dl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal","openshift-multus/multus-additional-cni-plugins-gdnq8","openshift-network-diagnostics/network-check-target-tt7s8","openshift-network-operator/iptables-alerter-m9pmx","openshift-ovn-kubernetes/ovnkube-node-4rq4s","kube-system/global-pull-secret-syncer-m29v8","kube-system/konnectivity-agent-mbdgw","kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal","openshift-dns/node-resolver-nhw5h","openshift-image-registry/node-ca-dx2h6","openshift-multus/multus-zsj2z","openshift-multus/network-metrics-daemon-7rlnq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp"] Apr 16 14:52:31.607202 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.607169 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.608788 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.608524 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:31.608788 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.608608 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:31.609607 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.609583 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.611036 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.611017 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:31.611129 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.611101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.611184 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.611101 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:31.612306 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.612286 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:31.612463 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.612441 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:31.612567 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.612533 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.612622 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.612565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptp4g\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.614344 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.617042 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.617329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.618017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2h4df\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.618187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.618266 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.618589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.618525 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.619130 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.618755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6s4dj\"" Apr 16 14:52:31.619130 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.619059 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.619756 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.619501 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:31.619756 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.619743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.619904 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.619754 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.620183 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.620167 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:31.620544 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.620524 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621317 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621476 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621496 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621499 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:31.621642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t52mz\"" Apr 16 14:52:31.621996 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:31.621996 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.621848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tqtvc\"" Apr 16 14:52:31.625080 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.625061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.626721 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.626702 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.626816 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.626729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.627045 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627029 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:31.627112 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627066 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.627345 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6795f\"" Apr 16 14:52:31.627531 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysconfig\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.627639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-system-cni-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.627639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-cni-binary-copy\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.627639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:31.627819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627578 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r2ffp\"" Apr 16 14:52:31.627819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627742 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:31.627819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-sys\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.627819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" 
(UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.627819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89345827-ec40-475a-b830-4d6c1df489c5-agent-certs\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7987cab-cae1-4625-84c9-b135a5b3b6e7-tmp-dir\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-daemon-config\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.627968 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-conf\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlct2\" (UniqueName: \"kubernetes.io/projected/4974791a-9870-4f06-9c9e-14a97a8f16f7-kube-api-access-tlct2\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f1d6917-4b83-407b-a119-822474f666a8-serviceca\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.628069 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628075 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zggwx\"" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-tuned\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1d6917-4b83-407b-a119-822474f666a8-host\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74b2\" (UniqueName: \"kubernetes.io/projected/01ced172-b8c5-4042-97b6-a8813deb4542-kube-api-access-d74b2\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-ovn\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-script-lib\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-kubelet\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krp57\" (UniqueName: \"kubernetes.io/projected/ba49942b-2640-414e-a0c5-df74f0eda02d-kube-api-access-krp57\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628299 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-slash\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.628364 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-hostroot\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.628392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-node-log\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-socket-dir-parent\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-lib-modules\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628516 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-var-lib-kubelet\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628521 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-netns\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-multus\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-multus-certs\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-host\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxgsp\" (UniqueName: \"kubernetes.io/projected/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-kube-api-access-jxgsp\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" 
Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-var-lib-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-env-overrides\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovn-node-metrics-cert\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629014 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-bin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-run\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-tmp\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qgx8\" (UniqueName: \"kubernetes.io/projected/95faf3e2-9a40-4a30-bcf9-eb791cd16272-kube-api-access-2qgx8\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-systemd-units\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-netns\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-os-release\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-conf-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.628990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-bin\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-config\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4974791a-9870-4f06-9c9e-14a97a8f16f7-iptables-alerter-script\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-kubernetes\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-systemd\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6bw\" (UniqueName: \"kubernetes.io/projected/4f1d6917-4b83-407b-a119-822474f666a8-kube-api-access-2f6bw\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8"
Apr 16 14:52:31.629877 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-kubelet\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-systemd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a7987cab-cae1-4625-84c9-b135a5b3b6e7-hosts-file\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-cnibin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-cnibin\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-etc-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgghm\" (UniqueName: \"kubernetes.io/projected/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-kube-api-access-hgghm\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4974791a-9870-4f06-9c9e-14a97a8f16f7-host-slash\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89345827-ec40-475a-b830-4d6c1df489c5-konnectivity-ca\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " pod="kube-system/konnectivity-agent-mbdgw"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-netd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-os-release\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-k8s-cni-cncf-io\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85rfr\" (UniqueName: \"kubernetes.io/projected/a7987cab-cae1-4625-84c9-b135a5b3b6e7-kube-api-access-85rfr\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.630524 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-log-socket\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s"
Apr 16 14:52:31.631090 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-system-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.631090 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-etc-kubernetes\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z"
Apr 16 14:52:31.631090 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.629799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-modprobe-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl"
Apr 16 14:52:31.631090 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.630158 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 14:52:31.631090 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.630176 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6cbql\""
Apr 16 14:52:31.631293 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.631157 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 14:52:31.631638 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.631618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 14:52:31.665221 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.665184 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:30 +0000 UTC" deadline="2027-09-25 07:54:48.926156596 +0000 UTC"
Apr 16 14:52:31.665221 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.665220 2577
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12641h2m17.260940923s" Apr 16 14:52:31.720273 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.720237 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:31.730132 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-sys-fs\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.730303 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-system-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.730303 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-etc-kubernetes\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.730303 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-etc-kubernetes\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.730303 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-system-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.730303 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-modprobe-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysconfig\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-system-cni-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: 
I0416 14:52:31.730352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-cni-binary-copy\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysconfig\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-sys\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-modprobe-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-system-cni-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-sys\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89345827-ec40-475a-b830-4d6c1df489c5-agent-certs\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " 
pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mvt\" (UniqueName: \"kubernetes.io/projected/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kube-api-access-z4mvt\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7987cab-cae1-4625-84c9-b135a5b3b6e7-tmp-dir\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.730559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-daemon-config\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-conf\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlct2\" (UniqueName: \"kubernetes.io/projected/4974791a-9870-4f06-9c9e-14a97a8f16f7-kube-api-access-tlct2\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f1d6917-4b83-407b-a119-822474f666a8-serviceca\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-socket-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-tuned\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1d6917-4b83-407b-a119-822474f666a8-host\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d74b2\" (UniqueName: \"kubernetes.io/projected/01ced172-b8c5-4042-97b6-a8813deb4542-kube-api-access-d74b2\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730786 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-ovn\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730788 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-script-lib\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-dbus\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-kubelet\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krp57\" (UniqueName: \"kubernetes.io/projected/ba49942b-2640-414e-a0c5-df74f0eda02d-kube-api-access-krp57\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-slash\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-cni-binary-copy\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-hostroot\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.731216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.730993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-node-log\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-etc-selinux\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-socket-dir-parent\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731109 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-conf\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-lib-modules\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-lib-modules\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f1d6917-4b83-407b-a119-822474f666a8-host\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4f1d6917-4b83-407b-a119-822474f666a8-serviceca\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a7987cab-cae1-4625-84c9-b135a5b3b6e7-tmp-dir\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-slash\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-var-lib-kubelet\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-ovn\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-netns\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-hostroot\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.732135 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-netns\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-multus\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-node-log\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-var-lib-kubelet\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-multus-certs\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731627 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-host\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-socket-dir-parent\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxgsp\" (UniqueName: \"kubernetes.io/projected/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-kube-api-access-jxgsp\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-var-lib-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.731702 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-daemon-config\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-env-overrides\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-kubelet\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-multus-certs\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-openvswitch\") pod \"ovnkube-node-4rq4s\" 
(UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-sysctl-d\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733060 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.731925 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.231894012 +0000 UTC m=+3.122308904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovn-node-metrics-cert\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-host\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.731998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-bin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-multus\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-var-lib-cni-bin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " 
pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-run\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-tmp\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-var-lib-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qgx8\" (UniqueName: \"kubernetes.io/projected/95faf3e2-9a40-4a30-bcf9-eb791cd16272-kube-api-access-2qgx8\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-systemd-units\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-run\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732198 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-registration-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-netns\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-os-release\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-systemd-units\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.733892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-conf-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-script-lib\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-bin\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-bin\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-config\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-os-release\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/4974791a-9870-4f06-9c9e-14a97a8f16f7-iptables-alerter-script\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-kubernetes\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-conf-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-systemd\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-env-overrides\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-kubernetes\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-systemd\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732359 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-netns\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2f6bw\" (UniqueName: \"kubernetes.io/projected/4f1d6917-4b83-407b-a119-822474f666a8-kube-api-access-2f6bw\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.734618 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4974791a-9870-4f06-9c9e-14a97a8f16f7-iptables-alerter-script\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-kubelet\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.732985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-systemd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a7987cab-cae1-4625-84c9-b135a5b3b6e7-hosts-file\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-cnibin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-cnibin\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: 
I0416 14:52:31.733085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-etc-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgghm\" (UniqueName: \"kubernetes.io/projected/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-kube-api-access-hgghm\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-kubelet-config\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4974791a-9870-4f06-9c9e-14a97a8f16f7-host-slash\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89345827-ec40-475a-b830-4d6c1df489c5-konnectivity-ca\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-netd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-os-release\") pod 
\"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-k8s-cni-cncf-io\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.735305 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-binary-copy\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85rfr\" (UniqueName: \"kubernetes.io/projected/a7987cab-cae1-4625-84c9-b135a5b3b6e7-kube-api-access-85rfr\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovnkube-config\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-run-ovn-kubernetes\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-log-socket\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-device-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-run-systemd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a7987cab-cae1-4625-84c9-b135a5b3b6e7-hosts-file\") pod \"node-resolver-nhw5h\" (UID: \"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-cnibin\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-multus-cni-dir\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-cnibin\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4974791a-9870-4f06-9c9e-14a97a8f16f7-host-slash\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-log-socket\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-etc-openvswitch\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-kubelet\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-host-cni-netd\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733863 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01ced172-b8c5-4042-97b6-a8813deb4542-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.735955 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba49942b-2640-414e-a0c5-df74f0eda02d-host-run-k8s-cni-cncf-io\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.736616 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.733957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01ced172-b8c5-4042-97b6-a8813deb4542-os-release\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.736616 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.734242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/89345827-ec40-475a-b830-4d6c1df489c5-konnectivity-ca\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.736616 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.734697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-tmp\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.736616 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.734714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/95faf3e2-9a40-4a30-bcf9-eb791cd16272-etc-tuned\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.736616 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.736195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-ovn-node-metrics-cert\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.736786 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.736673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/89345827-ec40-475a-b830-4d6c1df489c5-agent-certs\") pod \"konnectivity-agent-mbdgw\" (UID: \"89345827-ec40-475a-b830-4d6c1df489c5\") " pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.743290 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.743266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jxgsp\" (UniqueName: \"kubernetes.io/projected/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-kube-api-access-jxgsp\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:31.745724 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.745696 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:31.745829 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.745729 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:31.745829 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.745744 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:31.745829 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.745826 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.24580591 +0000 UTC m=+3.136220791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:31.747332 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.747304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlct2\" (UniqueName: \"kubernetes.io/projected/4974791a-9870-4f06-9c9e-14a97a8f16f7-kube-api-access-tlct2\") pod \"iptables-alerter-m9pmx\" (UID: \"4974791a-9870-4f06-9c9e-14a97a8f16f7\") " pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.747461 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.747302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74b2\" (UniqueName: \"kubernetes.io/projected/01ced172-b8c5-4042-97b6-a8813deb4542-kube-api-access-d74b2\") pod \"multus-additional-cni-plugins-gdnq8\" (UID: \"01ced172-b8c5-4042-97b6-a8813deb4542\") " pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.748389 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.748361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgghm\" (UniqueName: \"kubernetes.io/projected/d696eec3-0a3d-418d-8aa7-05a9c9e9ef26-kube-api-access-hgghm\") pod \"ovnkube-node-4rq4s\" (UID: \"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26\") " pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.748909 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.748886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85rfr\" (UniqueName: \"kubernetes.io/projected/a7987cab-cae1-4625-84c9-b135a5b3b6e7-kube-api-access-85rfr\") pod \"node-resolver-nhw5h\" (UID: 
\"a7987cab-cae1-4625-84c9-b135a5b3b6e7\") " pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.749676 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.749654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6bw\" (UniqueName: \"kubernetes.io/projected/4f1d6917-4b83-407b-a119-822474f666a8-kube-api-access-2f6bw\") pod \"node-ca-dx2h6\" (UID: \"4f1d6917-4b83-407b-a119-822474f666a8\") " pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.750169 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.750144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qgx8\" (UniqueName: \"kubernetes.io/projected/95faf3e2-9a40-4a30-bcf9-eb791cd16272-kube-api-access-2qgx8\") pod \"tuned-7s8dl\" (UID: \"95faf3e2-9a40-4a30-bcf9-eb791cd16272\") " pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.751533 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.751516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krp57\" (UniqueName: \"kubernetes.io/projected/ba49942b-2640-414e-a0c5-df74f0eda02d-kube-api-access-krp57\") pod \"multus-zsj2z\" (UID: \"ba49942b-2640-414e-a0c5-df74f0eda02d\") " pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.834326 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.834326 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mvt\" (UniqueName: \"kubernetes.io/projected/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kube-api-access-z4mvt\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-socket-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-dbus\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-etc-selinux\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-registration-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834527 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-kubelet-config\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.834723 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-device-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834723 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-sys-fs\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834723 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-sys-fs\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.834723 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.834714 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:31.834897 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:31.834782 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.334760995 +0000 UTC m=+3.225175910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:31.834970 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.834910 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-etc-selinux\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.835078 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-socket-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.835147 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.835190 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-device-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.835190 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-kubelet-config\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.835281 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c084eef-0fdf-420d-a2d7-9e96988f8f90-registration-dir\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.835281 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.835233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b683b265-ac6f-41fa-a2b7-b634d385846f-dbus\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:31.848312 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.848282 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mvt\" (UniqueName: \"kubernetes.io/projected/9c084eef-0fdf-420d-a2d7-9e96988f8f90-kube-api-access-z4mvt\") pod \"aws-ebs-csi-driver-node-7qjtp\" (UID: \"9c084eef-0fdf-420d-a2d7-9e96988f8f90\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:31.919141 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.919088 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:31.933262 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.933229 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zsj2z" Apr 16 14:52:31.943096 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.943066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m9pmx" Apr 16 14:52:31.948700 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.948676 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:31.957327 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.957302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nhw5h" Apr 16 14:52:31.965026 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.965001 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" Apr 16 14:52:31.972652 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.972619 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dx2h6" Apr 16 14:52:31.980223 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.980196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" Apr 16 14:52:31.985795 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:31.985771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" Apr 16 14:52:32.237160 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.237074 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:32.237317 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.237218 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.237383 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.237318 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.237296975 +0000 UTC m=+4.127711851 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:32.338359 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.338327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:32.338544 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.338414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:32.338544 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338518 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:32.338634 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338598 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.338577188 +0000 UTC m=+4.228992057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:32.338634 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338528 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:32.338634 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338632 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:32.338776 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338646 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:32.338776 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:32.338702 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.33868666 +0000 UTC m=+4.229101541 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:32.354251 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.354222 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c084eef_0fdf_420d_a2d7_9e96988f8f90.slice/crio-51167439f85af6e0c877e0721e66755f1bef239859ac764f602a24a767ac7c5e WatchSource:0}: Error finding container 51167439f85af6e0c877e0721e66755f1bef239859ac764f602a24a767ac7c5e: Status 404 returned error can't find the container with id 51167439f85af6e0c877e0721e66755f1bef239859ac764f602a24a767ac7c5e Apr 16 14:52:32.355491 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.355468 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95faf3e2_9a40_4a30_bcf9_eb791cd16272.slice/crio-1637fa674caa1c445bbe3903d72a0fc28fdcadd040b9b9bc5d2a55802904ef86 WatchSource:0}: Error finding container 1637fa674caa1c445bbe3903d72a0fc28fdcadd040b9b9bc5d2a55802904ef86: Status 404 returned error can't find the container with id 1637fa674caa1c445bbe3903d72a0fc28fdcadd040b9b9bc5d2a55802904ef86 Apr 16 14:52:32.356542 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.356516 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4974791a_9870_4f06_9c9e_14a97a8f16f7.slice/crio-3bb6c15f849e5b52e5b7c50d8a8fa7b29ec44c9f16eff635b8189bbd4db91fef WatchSource:0}: Error finding container 3bb6c15f849e5b52e5b7c50d8a8fa7b29ec44c9f16eff635b8189bbd4db91fef: Status 404 returned error can't find the container with id 3bb6c15f849e5b52e5b7c50d8a8fa7b29ec44c9f16eff635b8189bbd4db91fef Apr 16 14:52:32.359915 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.359892 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1d6917_4b83_407b_a119_822474f666a8.slice/crio-6f0fab5d0a1f88fbdafd24135f3e6cc3f26420e81313ab42e486ff81e7af00a5 WatchSource:0}: Error finding container 6f0fab5d0a1f88fbdafd24135f3e6cc3f26420e81313ab42e486ff81e7af00a5: Status 404 returned error can't find the container with id 6f0fab5d0a1f88fbdafd24135f3e6cc3f26420e81313ab42e486ff81e7af00a5 Apr 16 14:52:32.360873 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.360854 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7987cab_cae1_4625_84c9_b135a5b3b6e7.slice/crio-b1448a3d98a1f97c0025b778ce17a4531433243a75daa0d5653185b1cd46cf5f WatchSource:0}: Error finding container b1448a3d98a1f97c0025b778ce17a4531433243a75daa0d5653185b1cd46cf5f: Status 404 returned error can't find the container with id b1448a3d98a1f97c0025b778ce17a4531433243a75daa0d5653185b1cd46cf5f Apr 16 14:52:32.362566 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.362227 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba49942b_2640_414e_a0c5_df74f0eda02d.slice/crio-13444e097f7ce3778e0e77edfd8f73a4e86ce3be529816a989f68e9efd2f3d79 WatchSource:0}: Error finding 
container 13444e097f7ce3778e0e77edfd8f73a4e86ce3be529816a989f68e9efd2f3d79: Status 404 returned error can't find the container with id 13444e097f7ce3778e0e77edfd8f73a4e86ce3be529816a989f68e9efd2f3d79 Apr 16 14:52:32.363193 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.363175 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ced172_b8c5_4042_97b6_a8813deb4542.slice/crio-3ce01d48195076608bd19d983ef282c0b0fc0740c10c99ed440de7807c5ac434 WatchSource:0}: Error finding container 3ce01d48195076608bd19d983ef282c0b0fc0740c10c99ed440de7807c5ac434: Status 404 returned error can't find the container with id 3ce01d48195076608bd19d983ef282c0b0fc0740c10c99ed440de7807c5ac434 Apr 16 14:52:32.363951 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.363912 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd696eec3_0a3d_418d_8aa7_05a9c9e9ef26.slice/crio-6760a025c333f44cbb548e223a66792d489d8307cbb255823b89a3227d2c9f0a WatchSource:0}: Error finding container 6760a025c333f44cbb548e223a66792d489d8307cbb255823b89a3227d2c9f0a: Status 404 returned error can't find the container with id 6760a025c333f44cbb548e223a66792d489d8307cbb255823b89a3227d2c9f0a Apr 16 14:52:32.365469 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:52:32.365451 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89345827_ec40_475a_b830_4d6c1df489c5.slice/crio-db87a90506b27e7235583783364d90d39db0853434c6fcba0cf24999f0216e83 WatchSource:0}: Error finding container db87a90506b27e7235583783364d90d39db0853434c6fcba0cf24999f0216e83: Status 404 returned error can't find the container with id db87a90506b27e7235583783364d90d39db0853434c6fcba0cf24999f0216e83 Apr 16 14:52:32.666348 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.666075 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:30 +0000 UTC" deadline="2027-12-08 11:31:47.623699612 +0000 UTC" Apr 16 14:52:32.666348 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.666279 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14420h39m14.957426206s" Apr 16 14:52:32.762050 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.762014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"6760a025c333f44cbb548e223a66792d489d8307cbb255823b89a3227d2c9f0a"} Apr 16 14:52:32.765720 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.765684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nhw5h" event={"ID":"a7987cab-cae1-4625-84c9-b135a5b3b6e7","Type":"ContainerStarted","Data":"b1448a3d98a1f97c0025b778ce17a4531433243a75daa0d5653185b1cd46cf5f"} Apr 16 14:52:32.770133 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.770064 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m9pmx" event={"ID":"4974791a-9870-4f06-9c9e-14a97a8f16f7","Type":"ContainerStarted","Data":"3bb6c15f849e5b52e5b7c50d8a8fa7b29ec44c9f16eff635b8189bbd4db91fef"} Apr 16 14:52:32.773355 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.772193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" event={"ID":"95faf3e2-9a40-4a30-bcf9-eb791cd16272","Type":"ContainerStarted","Data":"1637fa674caa1c445bbe3903d72a0fc28fdcadd040b9b9bc5d2a55802904ef86"} Apr 16 14:52:32.778662 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.778574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" event={"ID":"9c084eef-0fdf-420d-a2d7-9e96988f8f90","Type":"ContainerStarted","Data":"51167439f85af6e0c877e0721e66755f1bef239859ac764f602a24a767ac7c5e"} Apr 16 14:52:32.781652 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.781624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" event={"ID":"8bc00e80c830605c41caa74a5fa346b2","Type":"ContainerStarted","Data":"f24fa04cb864be31df6d0a87eff6540e0a4da1f7dde368bddc5b1883f180d86b"} Apr 16 14:52:32.787460 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.787418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mbdgw" event={"ID":"89345827-ec40-475a-b830-4d6c1df489c5","Type":"ContainerStarted","Data":"db87a90506b27e7235583783364d90d39db0853434c6fcba0cf24999f0216e83"} Apr 16 14:52:32.796895 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.796837 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-55.ec2.internal" podStartSLOduration=2.796820272 podStartE2EDuration="2.796820272s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:32.796330224 +0000 UTC m=+3.686745110" watchObservedRunningTime="2026-04-16 14:52:32.796820272 +0000 UTC m=+3.687235157" Apr 16 14:52:32.806894 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.798333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerStarted","Data":"3ce01d48195076608bd19d983ef282c0b0fc0740c10c99ed440de7807c5ac434"} Apr 16 14:52:32.806894 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.806576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zsj2z" event={"ID":"ba49942b-2640-414e-a0c5-df74f0eda02d","Type":"ContainerStarted","Data":"13444e097f7ce3778e0e77edfd8f73a4e86ce3be529816a989f68e9efd2f3d79"} Apr 16 14:52:32.817202 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:32.809860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dx2h6" event={"ID":"4f1d6917-4b83-407b-a119-822474f666a8","Type":"ContainerStarted","Data":"6f0fab5d0a1f88fbdafd24135f3e6cc3f26420e81313ab42e486ff81e7af00a5"} Apr 16 14:52:33.246114 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.246067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:33.246304 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.246222 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.246304 ip-10-0-139-55 kubenswrapper[2577]: E0416 
14:52:33.246291 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.246272483 +0000 UTC m=+6.136687350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:33.347096 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.347053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:33.347096 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.347109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:33.347340 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347256 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:33.347340 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347317 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.347299738 +0000 UTC m=+6.237714607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:33.347759 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347729 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:33.347759 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347749 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:33.347759 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347762 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.347956 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.347806 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:35.34779204 +0000 UTC m=+6.238206910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:33.755340 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.755247 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:33.755770 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.755404 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:33.755950 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.755911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:33.756048 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.756027 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:33.756113 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.756104 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:33.756257 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:33.756237 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:33.832622 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.831258 2577 generic.go:358] "Generic (PLEG): container finished" podID="f471d1950d411efb876ec3531c30b142" containerID="2a96c7fc6f5cb90d9964de60c41d371e040119ff1c5d1000a272223bb458e084" exitCode=0 Apr 16 14:52:33.832622 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:33.832358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" event={"ID":"f471d1950d411efb876ec3531c30b142","Type":"ContainerDied","Data":"2a96c7fc6f5cb90d9964de60c41d371e040119ff1c5d1000a272223bb458e084"} Apr 16 14:52:34.842482 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:34.842443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" event={"ID":"f471d1950d411efb876ec3531c30b142","Type":"ContainerStarted","Data":"3f47d4df292971d0939d82f7033b679b80240c5744dff8799376efba7f273bf5"} Apr 16 14:52:34.855319 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:34.855234 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-55.ec2.internal" podStartSLOduration=4.85521141 podStartE2EDuration="4.85521141s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:34.85458798 +0000 UTC m=+5.745002872" watchObservedRunningTime="2026-04-16 14:52:34.85521141 +0000 UTC m=+5.745626306" Apr 16 14:52:35.263191 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.263104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:35.263355 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.263304 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:35.263408 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.263369 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.263350653 +0000 UTC m=+10.153765531 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:35.364254 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.364215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:35.364420 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.364277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:35.364492 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364425 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:35.364492 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364485 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.364469121 +0000 UTC m=+10.254883986 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:35.364847 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364829 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:35.364847 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364852 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:35.365032 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364864 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:35.365032 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.364906 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:39.364892427 +0000 UTC m=+10.255307300 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:35.755746 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.755666 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:35.755883 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.755796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:35.755883 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.755804 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:35.755883 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:35.755849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:35.756004 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.755978 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:35.756092 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:35.756063 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:37.755132 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:37.755094 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:37.755589 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:37.755220 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:37.755589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:37.755232 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq"
Apr 16 14:52:37.755589 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:37.755383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e"
Apr 16 14:52:37.755589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:37.755434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:37.755589 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:37.755498 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d"
Apr 16 14:52:39.298826 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.298790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq"
Apr 16 14:52:39.299309 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.298985 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:39.299309 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.299059 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.299038215 +0000 UTC m=+18.189453086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:39.399981 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.399923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:39.400154 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.399995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8"
Apr 16 14:52:39.400154 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400106 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:39.400154 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400134 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:39.400154 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400148 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:39.400304 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400208 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.400189833 +0000 UTC m=+18.290604698 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:39.400446 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400112 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:39.400560 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.400505 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.400474935 +0000 UTC m=+18.290889801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:39.753979 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.753875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8"
Apr 16 14:52:39.754138 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.754011 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f"
Apr 16 14:52:39.754419 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.754360 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq"
Apr 16 14:52:39.754555 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.754469 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e"
Apr 16 14:52:39.754555 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:39.754513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:39.754663 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:39.754577 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d"
Apr 16 14:52:41.753576 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:41.753535 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq"
Apr 16 14:52:41.754039 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:41.753661 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8"
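Every "Error syncing pod" entry above has the same root cause: the runtime reports NetworkReady=false until a CNI config file appears in /etc/kubernetes/cni/net.d/ (on this cluster, multus writes it once the SDN pods come up). A minimal sketch of that check, assuming only that the runtime looks for any *.conf/*.conflist/*.json file in the configured conf dir; the path is taken from the error text, and the code is illustrative, not CRI-O's:

package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the errors above
	var files []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(confDir, pat))
		files = append(files, matches...)
	}
	if len(files) == 0 {
		// This is the state the node is in throughout the entries above.
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Println("CNI config present:", files)
}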
pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:41.754039 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:41.753763 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:41.754039 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:41.753808 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:41.754039 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:41.753874 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:43.753773 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:43.753733 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:43.754275 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:43.753856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:43.754275 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:43.753890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:43.754275 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:43.753853 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:43.754275 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:43.754005 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:43.754275 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:43.754070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:45.753687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:45.753649 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:45.753687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:45.753687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:45.754207 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:45.753748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:45.754207 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:45.753853 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:45.754207 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:45.753895 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:45.754207 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:45.753977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:47.364478 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.364433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:47.364970 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.364593 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.364970 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.364675 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.364653465 +0000 UTC m=+34.255068330 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:47.465017 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.464972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:47.465017 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.465027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465142 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465161 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465184 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465194 2577 projected.go:194] Error preparing data for projected volume kube-api-access-8hzzm for pod openshift-network-diagnostics/network-check-target-tt7s8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465202 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret podName:b683b265-ac6f-41fa-a2b7-b634d385846f nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.465187168 +0000 UTC m=+34.355602034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret") pod "global-pull-secret-syncer-m29v8" (UID: "b683b265-ac6f-41fa-a2b7-b634d385846f") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:47.465251 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.465236 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm podName:367d90fd-ba99-4dee-9cbe-ed7ac607159d nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.465223086 +0000 UTC m=+34.355637949 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8hzzm" (UniqueName: "kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm") pod "network-check-target-tt7s8" (UID: "367d90fd-ba99-4dee-9cbe-ed7ac607159d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:47.753490 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.753452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:47.753490 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.753477 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:47.753724 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:47.753593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:47.753724 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.753598 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:47.753724 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.753675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:47.753847 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:47.753766 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.753967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:49.754287 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.754338 2577 util.go:30] "No sandbox for pod can be found. 
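Note the retry windows on the nestedpendingoperations entries: the 14:52:39 failures schedule the next attempt for 14:52:47 (durationBeforeRetry 8s), and the 14:52:47 failures schedule 14:53:03 (durationBeforeRetry 16s). That is kubelet's per-operation exponential backoff for volume mounts: the delay doubles on each consecutive failure up to a cap. A toy model of the doubling; the constants are illustrative (upstream kubelet starts near 500ms and caps at roughly two minutes), not copied from kubelet:

package main

import (
	"fmt"
	"time"
)

const (
	initialBackoff = 500 * time.Millisecond // illustrative starting delay
	maxBackoff     = 2*time.Minute + 2*time.Second
)

// nextBackoff doubles the delay after each consecutive failure, capped.
func nextBackoff(d time.Duration) time.Duration {
	d *= 2
	if d > maxBackoff {
		return maxBackoff
	}
	return d
}

func main() {
	d := initialBackoff
	for i := 1; i <= 10; i++ {
		fmt.Printf("failure %2d -> next retry in %v\n", i, d)
		d = nextBackoff(d) // ..., 4s, 8s, 16s, ... as in the entries above
	}
}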
Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.754338 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:49.754397 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d"
Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.754428 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8"
Apr 16 14:52:49.754596 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:49.754491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f"
Apr 16 14:52:49.870485 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.870452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"96517d4856ea3a31f8b71de3116525b9cec3da37d786f688698529cc78b555bc"}
Apr 16 14:52:49.871770 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.871729 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nhw5h" event={"ID":"a7987cab-cae1-4625-84c9-b135a5b3b6e7","Type":"ContainerStarted","Data":"02ee45bd5c58b12252a510ca177ea236c0db3ae3cb72655c5063b19e84695539"}
Apr 16 14:52:49.873274 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.873239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" event={"ID":"95faf3e2-9a40-4a30-bcf9-eb791cd16272","Type":"ContainerStarted","Data":"f04828a18504b4763a5ea7e50167bf7e6fe6a878d4001ddbc0c1d8b6289079d8"}
Apr 16 14:52:49.874814 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.874791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" event={"ID":"9c084eef-0fdf-420d-a2d7-9e96988f8f90","Type":"ContainerStarted","Data":"72a05fe65a2cf18a494287134e2ddbbdccb245b85441d8221726062fded6ed22"}
Apr 16 14:52:49.876236 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.876216 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mbdgw" event={"ID":"89345827-ec40-475a-b830-4d6c1df489c5","Type":"ContainerStarted","Data":"ec7eafbe6937badbaa40bd9f202b31aad1591091d1194b4c4bc1bba2d9693754"}
Apr 16 14:52:49.877508 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.877479 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerStarted","Data":"d2865053170ce7e97d99e631ff045084d70d8974e6a6db26e957d6da59264ee9"}
Apr 16 14:52:49.878753 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.878732 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zsj2z" event={"ID":"ba49942b-2640-414e-a0c5-df74f0eda02d","Type":"ContainerStarted","Data":"32531ecca4d7a4edd4453a4710046692aef1ff3dd9aafa48e22d34ddc80a1cc2"}
Apr 16 14:52:49.880117 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.880097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dx2h6" event={"ID":"4f1d6917-4b83-407b-a119-822474f666a8","Type":"ContainerStarted","Data":"9f218783ecc18c850a4e39c8b1ee78735b7fb0bdde444812c3d13faf43b44dd8"}
Apr 16 14:52:49.883522 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.883488 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nhw5h" podStartSLOduration=2.761550294 podStartE2EDuration="19.883475661s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.362580465 +0000 UTC m=+3.252995343" lastFinishedPulling="2026-04-16 14:52:49.484505841 +0000 UTC m=+20.374920710" observedRunningTime="2026-04-16 14:52:49.883071127 +0000 UTC m=+20.773486003" watchObservedRunningTime="2026-04-16 14:52:49.883475661 +0000 UTC m=+20.773890547"
Apr 16 14:52:49.905810 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.905770 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7s8dl" podStartSLOduration=2.77273676 podStartE2EDuration="19.905756527s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.358147371 +0000 UTC m=+3.248562237" lastFinishedPulling="2026-04-16 14:52:49.491167134 +0000 UTC m=+20.381582004" observedRunningTime="2026-04-16 14:52:49.894781797 +0000 UTC m=+20.785196685" watchObservedRunningTime="2026-04-16 14:52:49.905756527 +0000 UTC m=+20.796171412"
Apr 16 14:52:49.917315 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.917246 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mbdgw" podStartSLOduration=3.801071226 podStartE2EDuration="20.917231977s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.368202707 +0000 UTC m=+3.258617577" lastFinishedPulling="2026-04-16 14:52:49.484363462 +0000 UTC m=+20.374778328" observedRunningTime="2026-04-16 14:52:49.905557265 +0000 UTC m=+20.795972151" watchObservedRunningTime="2026-04-16 14:52:49.917231977 +0000 UTC m=+20.807646854"
Apr 16 14:52:49.917455 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.917435 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dx2h6" podStartSLOduration=2.79465015 podStartE2EDuration="19.917427894s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.361641149 +0000 UTC m=+3.252056024" lastFinishedPulling="2026-04-16 14:52:49.484418902 +0000 UTC m=+20.374833768" observedRunningTime="2026-04-16 14:52:49.916959743 +0000 UTC m=+20.807374633" watchObservedRunningTime="2026-04-16 14:52:49.917427894 +0000 UTC m=+20.807842782"
Apr 16 14:52:49.977779 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:49.977731 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zsj2z" podStartSLOduration=3.77601283 podStartE2EDuration="20.977713424s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.364615777 +0000 UTC m=+3.255030664" lastFinishedPulling="2026-04-16 14:52:49.566316379 +0000 UTC m=+20.456731258" observedRunningTime="2026-04-16 14:52:49.977415285 +0000 UTC m=+20.867830175" watchObservedRunningTime="2026-04-16 14:52:49.977713424 +0000 UTC m=+20.868128309"
Apr 16 14:52:50.717462 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.717420 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:50.884030 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.883984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" event={"ID":"9c084eef-0fdf-420d-a2d7-9e96988f8f90","Type":"ContainerStarted","Data":"cf40d45a3ce086bde4c1e11df719965f028e96dd5a0722472eeb9d8cf7dc03b4"}
Apr 16 14:52:50.885367 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.885342 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="d2865053170ce7e97d99e631ff045084d70d8974e6a6db26e957d6da59264ee9" exitCode=0
Apr 16 14:52:50.885473 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.885421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"d2865053170ce7e97d99e631ff045084d70d8974e6a6db26e957d6da59264ee9"}
Apr 16 14:52:50.888187 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log"
Apr 16 14:52:50.888539 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888494 2577 generic.go:358] "Generic (PLEG): container finished" podID="d696eec3-0a3d-418d-8aa7-05a9c9e9ef26" containerID="2c08035c2f51db763e2175d30e6f7e6b7955da1f786a5354637805bcf2fa5305" exitCode=1
Apr 16 14:52:50.888639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"d90349d9e2539ab1ff0e0c9567bd76e62cf5e691b5f09872d67f6bab5b22800c"}
Apr 16 14:52:50.888639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"fd110a9559f56e0610a65d6f70e0eec129ec857abfded7631d04f81d7521b706"}
Apr 16 14:52:50.888639 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"42f8f8bc2f2f97efa1bfea775ad1b834a9c728dfd539853d31009e84f247d87d"}
Apr 16 14:52:50.888871 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"82c3c12609f6125cd4c471d2b46a94eaaa2ffdd452b6a2e1c03234f16db2351a"}
Apr 16 14:52:50.888871 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.888665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerDied","Data":"2c08035c2f51db763e2175d30e6f7e6b7955da1f786a5354637805bcf2fa5305"}
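The pod_startup_latency_tracker entries encode a simple relation: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling), i.e. startup latency with pull time excluded. For node-resolver-nhw5h above: 19.88s end to end, 17.12s of it pulling, leaving 2.76s. A quick check of the arithmetic with the figures copied from that entry (using the m=+... monotonic offsets reproduces the logged value exactly):

package main

import "fmt"

func main() {
	// Figures from the node-resolver-nhw5h entry; m=+... values are kubelet's
	// monotonic-clock offsets since process start.
	const (
		e2e       = 19.883475661 // podStartE2EDuration (running 14:52:49.88, created 14:52:30)
		pullStart = 3.252995343  // firstStartedPulling, m=+...
		pullEnd   = 20.374920710 // lastFinishedPulling, m=+...
	)
	slo := e2e - (pullEnd - pullStart)
	fmt.Printf("podStartSLOduration = %.9f s\n", slo) // prints 2.761550294, as logged
}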
Apr 16 14:52:50.889864 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.889839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m9pmx" event={"ID":"4974791a-9870-4f06-9c9e-14a97a8f16f7","Type":"ContainerStarted","Data":"a59a30f6ee6ec5b41e51cceb753cce08c55d58265d3ff59659ea347725ab03e9"}
Apr 16 14:52:50.912854 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:50.912800 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m9pmx" podStartSLOduration=4.787843115 podStartE2EDuration="21.912784938s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.359420361 +0000 UTC m=+3.249835242" lastFinishedPulling="2026-04-16 14:52:49.484362186 +0000 UTC m=+20.374777065" observedRunningTime="2026-04-16 14:52:50.912516709 +0000 UTC m=+21.802931595" watchObservedRunningTime="2026-04-16 14:52:50.912784938 +0000 UTC m=+21.803199823"
Apr 16 14:52:51.700775 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.700668 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:50.717446479Z","UUID":"4c85e3a1-0972-422f-8e72-150eea4d2940","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:51.703204 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.703177 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:51.703204 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.703210 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:52:51.753484 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.753454 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8"
Apr 16 14:52:51.753484 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.753481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq"
Apr 16 14:52:51.753721 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:51.753480 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8"
Apr 16 14:52:51.753721 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:51.753575 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d"
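The plugin_watcher / RegisterPlugin / csi_plugin sequence is kubelet's plugin-registration handshake: a registrar sidecar drops a socket into /var/lib/kubelet/plugins_registry/, kubelet's watcher dials it, calls GetInfo, validates the driver, and reports the result back. A sketch of the registrar side under the standard pluginregistration v1 API (socket path, driver name, endpoint, and version are taken from the log; this is illustrative, not the EBS driver's actual registrar):

package main

import (
	"context"
	"net"

	"google.golang.org/grpc"
	registerapi "k8s.io/kubelet/pkg/apis/pluginregistration/v1"
)

type registrar struct{}

// GetInfo is what kubelet calls after the plugin watcher spots the socket.
func (registrar) GetInfo(ctx context.Context, _ *registerapi.InfoRequest) (*registerapi.PluginInfo, error) {
	return &registerapi.PluginInfo{
		Type:              registerapi.CSIPlugin,
		Name:              "ebs.csi.aws.com",
		Endpoint:          "/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock",
		SupportedVersions: []string{"1.0.0"}, // matches "versions: 1.0.0" above
	}, nil
}

// NotifyRegistrationStatus is kubelet reporting whether registration stuck.
func (registrar) NotifyRegistrationStatus(ctx context.Context, s *registerapi.RegistrationStatus) (*registerapi.RegistrationStatusResponse, error) {
	return &registerapi.RegistrationStatusResponse{}, nil
}

func main() {
	l, err := net.Listen("unix", "/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock")
	if err != nil {
		panic(err)
	}
	srv := grpc.NewServer()
	registerapi.RegisterRegistrationServer(srv, registrar{})
	_ = srv.Serve(l)
}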
pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:51.753814 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:51.753781 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:52.897619 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:52.897415 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 14:52:52.898096 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:52.897963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"6809202659d36deba64a62b202b1854ba5d8d2563f10ae8c83827fd92647eeb5"} Apr 16 14:52:52.900002 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:52.899973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" event={"ID":"9c084eef-0fdf-420d-a2d7-9e96988f8f90","Type":"ContainerStarted","Data":"2cd00698b73fd98713ea22cdbdfc3e816191c9738dbce31102aae01b0d51091d"} Apr 16 14:52:52.928889 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:52.928834 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7qjtp" podStartSLOduration=3.468258868 podStartE2EDuration="22.928819711s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.357779063 +0000 UTC m=+3.248193941" lastFinishedPulling="2026-04-16 14:52:51.818339918 +0000 UTC m=+22.708754784" observedRunningTime="2026-04-16 14:52:52.928437636 +0000 UTC m=+23.818852521" watchObservedRunningTime="2026-04-16 14:52:52.928819711 +0000 UTC m=+23.819234596" Apr 16 14:52:53.753553 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:53.753524 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:53.753774 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:53.753644 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:53.753774 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:53.753652 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:53.753881 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:53.753769 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:53.753881 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:53.753802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:53.754005 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:53.753881 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:54.105469 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:54.105390 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:54.106216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:54.106169 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:55.753401 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.753211 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:55.754065 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.753238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:55.754065 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:55.753478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:55.754065 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:55.753529 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:55.754065 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.753267 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:55.754065 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:55.753642 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:55.908132 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.908105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 14:52:55.908494 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.908470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"d8257f546d68cd8b094167cb96ae3f563a9c60bb97953151e24b1624cb531058"} Apr 16 14:52:55.908753 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.908732 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:55.908974 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.908956 2577 scope.go:117] "RemoveContainer" containerID="2c08035c2f51db763e2175d30e6f7e6b7955da1f786a5354637805bcf2fa5305" Apr 16 14:52:55.910287 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.910262 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="0500f968d4e97bcf2c549742459bb0078b6ac60e4b8513f43ae67c36fd0829a0" exitCode=0 Apr 16 14:52:55.910378 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.910295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"0500f968d4e97bcf2c549742459bb0078b6ac60e4b8513f43ae67c36fd0829a0"} Apr 16 14:52:55.924399 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:55.924380 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:56.306330 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.306296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:56.306517 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.306414 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:52:56.306860 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.306843 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mbdgw" Apr 16 14:52:56.911491 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.911461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m29v8"] Apr 16 14:52:56.911986 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.911613 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:56.911986 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:56.911725 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:56.912593 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.912567 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rlnq"] Apr 16 14:52:56.912702 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.912688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:56.912819 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:56.912791 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:56.914481 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.914452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerStarted","Data":"ea91796e3a8af50724e7ebf9f91ccf5eeaf8bfc4a6cbfb2b686dda78b643df28"} Apr 16 14:52:56.915593 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.915566 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tt7s8"] Apr 16 14:52:56.915694 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.915675 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:56.915782 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:56.915759 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:56.918615 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.918596 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 14:52:56.919008 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.918921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" event={"ID":"d696eec3-0a3d-418d-8aa7-05a9c9e9ef26","Type":"ContainerStarted","Data":"987f31206b278380fa09deac0a7c3768124d30d8b3078921ce79075c64ca30e8"} Apr 16 14:52:56.919386 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.919359 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:56.919533 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.919514 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:56.936248 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.936223 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:52:56.960239 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:56.960195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" podStartSLOduration=10.691933951 podStartE2EDuration="27.960180232s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.366873812 +0000 UTC m=+3.257288682" lastFinishedPulling="2026-04-16 14:52:49.635120083 +0000 UTC m=+20.525534963" observedRunningTime="2026-04-16 14:52:56.959954645 +0000 UTC m=+27.850369555" watchObservedRunningTime="2026-04-16 14:52:56.960180232 +0000 UTC m=+27.850595112" Apr 16 14:52:57.922458 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:57.922425 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="ea91796e3a8af50724e7ebf9f91ccf5eeaf8bfc4a6cbfb2b686dda78b643df28" exitCode=0 Apr 16 14:52:57.922851 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:57.922499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"ea91796e3a8af50724e7ebf9f91ccf5eeaf8bfc4a6cbfb2b686dda78b643df28"} Apr 16 14:52:58.753533 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:58.753280 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:52:58.753694 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:58.753297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:52:58.753694 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:58.753648 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:52:58.753694 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:58.753314 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:52:58.753694 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:58.753676 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:52:58.753863 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:52:58.753824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:52:58.928162 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:58.928075 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="8c52bd4ccad68a5cc6dc48e8bc04ff29643202956c561f40a090dfd2d1413803" exitCode=0 Apr 16 14:52:58.928162 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:52:58.928135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"8c52bd4ccad68a5cc6dc48e8bc04ff29643202956c561f40a090dfd2d1413803"} Apr 16 14:53:00.753816 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:00.753777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:53:00.753816 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:00.753801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:00.753816 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:00.753788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:53:00.754444 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:00.753894 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tt7s8" podUID="367d90fd-ba99-4dee-9cbe-ed7ac607159d" Apr 16 14:53:00.754444 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:00.753994 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-m29v8" podUID="b683b265-ac6f-41fa-a2b7-b634d385846f" Apr 16 14:53:00.754444 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:00.754051 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:53:02.439191 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.439158 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-55.ec2.internal" event="NodeReady" Apr 16 14:53:02.439757 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.439316 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:53:02.478651 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.478621 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"] Apr 16 14:53:02.483306 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.483277 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"] Apr 16 14:53:02.483468 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.483447 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.485739 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.485715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 14:53:02.485860 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.485773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 14:53:02.486271 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.486252 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 14:53:02.486383 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.486251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 14:53:02.486457 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.486411 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"] Apr 16 14:53:02.486520 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.486500 2577 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:53:02.486520 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.486500 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"
Apr 16 14:53:02.488539 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.488518 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 14:53:02.488648 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.488575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-5sw7n\""
Apr 16 14:53:02.489520 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.489501 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f5df9cf-4lq8j"]
Apr 16 14:53:02.489647 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.489631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.491660 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.491642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 14:53:02.491982 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.491963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 14:53:02.492257 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.491963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 14:53:02.492984 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.492965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.494372 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.494349 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 14:53:02.495308 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.495290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"]
Apr 16 14:53:02.495429 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.495410 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:53:02.495631 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.495617 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:53:02.495784 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.495772 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:53:02.496007 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.495975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"]
Apr 16 14:53:02.496007 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.496005 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hzw2z\""
Apr 16 14:53:02.502789 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.501979 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"]
Apr 16 14:53:02.508213 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.508188 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nqtsn"]
Apr 16 14:53:02.509977 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.509955 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:53:02.511919 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.511898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f5df9cf-4lq8j"]
Apr 16 14:53:02.512103 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.512044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqtsn"
Apr 16 14:53:02.512455 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.512435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nqtsn"]
Apr 16 14:53:02.514489 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.514221 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:53:02.514489 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.514277 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\""
Apr 16 14:53:02.514489 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.514227 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:53:02.515625 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.515604 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:53:02.575743 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/caf51d6d-515c-4900-b0f6-a0458a558256-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.575743 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.575960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.575960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/11bb9de9-92f3-4fb1-9653-666f48e5e018-tmp\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"
Apr 16 14:53:02.575960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkdgc\" (UniqueName: \"kubernetes.io/projected/11bb9de9-92f3-4fb1-9653-666f48e5e018-kube-api-access-pkdgc\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"
Apr 16 14:53:02.575960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.575960 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/11bb9de9-92f3-4fb1-9653-666f48e5e018-klusterlet-config\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.575976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/27195efe-d90d-405b-94c1-f7863e41efcd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576077 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576119 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx89w\" (UniqueName: \"kubernetes.io/projected/caf51d6d-515c-4900-b0f6-a0458a558256-kube-api-access-tx89w\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"
Apr 16 14:53:02.576176 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtfhk\" (UniqueName: \"kubernetes.io/projected/27195efe-d90d-405b-94c1-f7863e41efcd-kube-api-access-gtfhk\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"
Apr 16 14:53:02.576530 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576185 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.576530 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.576202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m7vs\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j"
Apr 16 14:53:02.594696 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.594664 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lbn5l"]
Apr 16 14:53:02.598075 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.598050 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.600964 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.600920 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:53:02.601178 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.601158 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:53:02.601325 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.601310 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:53:02.608888 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.608866 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lbn5l"] Apr 16 14:53:02.676564 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.676523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/caf51d6d-515c-4900-b0f6-a0458a558256-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.676761 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.676576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.676761 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.676623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.676761 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.676664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/11bb9de9-92f3-4fb1-9653-666f48e5e018-tmp\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.676761 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.676700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkdgc\" (UniqueName: \"kubernetes.io/projected/11bb9de9-92f3-4fb1-9653-666f48e5e018-kube-api-access-pkdgc\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.677082 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677168 ip-10-0-139-55 
kubenswrapper[2577]: I0416 14:53:02.677101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/11bb9de9-92f3-4fb1-9653-666f48e5e018-klusterlet-config\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.677168 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/11bb9de9-92f3-4fb1-9653-666f48e5e018-tmp\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/27195efe-d90d-405b-94c1-f7863e41efcd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/caf51d6d-515c-4900-b0f6-a0458a558256-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 
14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx89w\" (UniqueName: \"kubernetes.io/projected/caf51d6d-515c-4900-b0f6-a0458a558256-kube-api-access-tx89w\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b10a07b-1617-44d5-b186-97458e04e5e0-tmp-dir\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtfhk\" (UniqueName: \"kubernetes.io/projected/27195efe-d90d-405b-94c1-f7863e41efcd-kube-api-access-gtfhk\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 
kubenswrapper[2577]: I0416 14:53:02.677670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m7vs\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.677687 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bghw\" (UniqueName: \"kubernetes.io/projected/df57e009-9151-4b90-8c22-bedd6e86b057-kube-api-access-7bghw\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b10a07b-1617-44d5-b186-97458e04e5e0-config-volume\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27vc\" (UniqueName: \"kubernetes.io/projected/7b10a07b-1617-44d5-b186-97458e04e5e0-kube-api-access-j27vc\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.677775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.678047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.678282 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.678298 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:02.678713 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.678362 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.178343125 +0000 UTC m=+34.068757991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:02.679097 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.678745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.681952 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.681901 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.682079 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.681963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.682134 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.682100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/11bb9de9-92f3-4fb1-9653-666f48e5e018-klusterlet-config\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.682227 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.682208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.682428 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.682406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-ca\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.683073 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.683050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: 
\"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.683825 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.683783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/27195efe-d90d-405b-94c1-f7863e41efcd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" Apr 16 14:53:02.684210 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.684143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/caf51d6d-515c-4900-b0f6-a0458a558256-hub\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.685754 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.685729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkdgc\" (UniqueName: \"kubernetes.io/projected/11bb9de9-92f3-4fb1-9653-666f48e5e018-kube-api-access-pkdgc\") pod \"klusterlet-addon-workmgr-77c998d4f7-wkl9c\" (UID: \"11bb9de9-92f3-4fb1-9653-666f48e5e018\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.685863 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.685767 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.686148 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.686107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx89w\" (UniqueName: \"kubernetes.io/projected/caf51d6d-515c-4900-b0f6-a0458a558256-kube-api-access-tx89w\") pod \"cluster-proxy-proxy-agent-d989dbd46-shlwr\" (UID: \"caf51d6d-515c-4900-b0f6-a0458a558256\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:02.687459 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.687437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m7vs\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:02.687547 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.687466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtfhk\" (UniqueName: \"kubernetes.io/projected/27195efe-d90d-405b-94c1-f7863e41efcd-kube-api-access-gtfhk\") pod \"managed-serviceaccount-addon-agent-8898fcc9-t9tfv\" (UID: \"27195efe-d90d-405b-94c1-f7863e41efcd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" Apr 16 14:53:02.753638 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.753599 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:02.753821 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.753722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:53:02.754187 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.754162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:53:02.756396 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:53:02.756521 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:02.756521 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756386 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:02.756521 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:02.756521 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756383 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j5fbx\"" Apr 16 14:53:02.756763 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.756746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:53:02.778392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778362 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.778392 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b10a07b-1617-44d5-b186-97458e04e5e0-tmp-dir\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.778559 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:02.778559 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.778518 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:02.778657 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.778572 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.278555455 +0000 UTC m=+34.168970323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:02.778657 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.778580 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:02.778657 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:02.778635 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:03.278615902 +0000 UTC m=+34.169030782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:02.778818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bghw\" (UniqueName: \"kubernetes.io/projected/df57e009-9151-4b90-8c22-bedd6e86b057-kube-api-access-7bghw\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:02.778818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b10a07b-1617-44d5-b186-97458e04e5e0-config-volume\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.778818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j27vc\" (UniqueName: \"kubernetes.io/projected/7b10a07b-1617-44d5-b186-97458e04e5e0-kube-api-access-j27vc\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.778818 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.778751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b10a07b-1617-44d5-b186-97458e04e5e0-tmp-dir\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.779320 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.779301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b10a07b-1617-44d5-b186-97458e04e5e0-config-volume\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.788919 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.788878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27vc\" (UniqueName: \"kubernetes.io/projected/7b10a07b-1617-44d5-b186-97458e04e5e0-kube-api-access-j27vc\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:02.789115 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.789095 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bghw\" (UniqueName: \"kubernetes.io/projected/df57e009-9151-4b90-8c22-bedd6e86b057-kube-api-access-7bghw\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:02.807344 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.807320 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:02.824751 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.824719 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" Apr 16 14:53:02.831989 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:02.831961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:53:03.182708 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.182604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:03.182868 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.182801 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:03.182868 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.182821 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:03.183021 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.182903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.182885458 +0000 UTC m=+35.073300322 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:03.283535 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.283493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:03.283535 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.283543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:03.283777 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.283651 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:03.283777 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.283654 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:03.283777 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.283717 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.283701071 +0000 UTC m=+35.174115935 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:03.283777 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.283735 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:04.283726345 +0000 UTC m=+35.174141212 (durationBeforeRetry 1s). 
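
[Annotation] The m=+34.168970323-style suffix on every retry deadline is Go's monotonic clock reading: a time.Time captured with time.Now() carries both a wall reading and a monotonic one, and its String() form appends m=±seconds measured from the process's monotonic base. Here m=+34 means the kubelet process (PID 2577) had been running about 34 seconds when the deadline was computed, which places its start near 14:52:29. A self-contained illustration:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	start := time.Now()
    	time.Sleep(1500 * time.Millisecond)

    	t := time.Now()
    	// Prints something like:
    	//   2026-04-16 14:53:03.278 +0000 UTC m=+1.500123456
    	// The m=+ value counts seconds since the process started, not wall
    	// time, so ordering stays correct even if the wall clock steps.
    	fmt.Println(t)
    	fmt.Println(t.Sub(start)) // computed from the monotonic readings: ~1.5s
    }
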
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:03.384846 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.384804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:53:03.385060 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.384999 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:03.385127 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:03.385089 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:53:35.385068396 +0000 UTC m=+66.275483274 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : secret "metrics-daemon-secret" not found Apr 16 14:53:03.486301 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.486261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:03.486867 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.486327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:53:03.489113 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.489089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b683b265-ac6f-41fa-a2b7-b634d385846f-original-pull-secret\") pod \"global-pull-secret-syncer-m29v8\" (UID: \"b683b265-ac6f-41fa-a2b7-b634d385846f\") " pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:53:03.491447 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.491423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzzm\" (UniqueName: \"kubernetes.io/projected/367d90fd-ba99-4dee-9cbe-ed7ac607159d-kube-api-access-8hzzm\") pod \"network-check-target-tt7s8\" (UID: \"367d90fd-ba99-4dee-9cbe-ed7ac607159d\") " pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:03.666434 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.666393 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:03.679334 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:03.679296 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m29v8" Apr 16 14:53:04.192838 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.192799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:04.193057 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.192990 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:04.193057 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.193010 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:04.193173 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.193075 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:06.193055235 +0000 UTC m=+37.083470098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:04.294128 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.294089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:04.294128 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.294135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:04.294371 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.294240 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:04.294371 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.294256 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:04.294371 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.294316 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:06.294298919 +0000 UTC m=+37.184713782 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:04.294371 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:04.294336 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:06.294325315 +0000 UTC m=+37.184740187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:04.574213 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.573957 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr"] Apr 16 14:53:04.575798 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.575587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c"] Apr 16 14:53:04.585737 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.585712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tt7s8"] Apr 16 14:53:04.590892 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.590872 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m29v8"] Apr 16 14:53:04.594817 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.594796 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv"] Apr 16 14:53:04.622472 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:53:04.622437 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11bb9de9_92f3_4fb1_9653_666f48e5e018.slice/crio-b9a15ef1bab9d0cf0417478fb4bdf18839f27380f8832434aecd1b7e9416f2ec WatchSource:0}: Error finding container b9a15ef1bab9d0cf0417478fb4bdf18839f27380f8832434aecd1b7e9416f2ec: Status 404 returned error can't find the container with id b9a15ef1bab9d0cf0417478fb4bdf18839f27380f8832434aecd1b7e9416f2ec Apr 16 14:53:04.623363 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:53:04.623337 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaf51d6d_515c_4900_b0f6_a0458a558256.slice/crio-d565990c14f9085a1243ad0ca92a493ce195018e3a36eb24541f284d6f923e40 WatchSource:0}: Error finding container d565990c14f9085a1243ad0ca92a493ce195018e3a36eb24541f284d6f923e40: Status 404 returned error can't find the container with id d565990c14f9085a1243ad0ca92a493ce195018e3a36eb24541f284d6f923e40 Apr 16 14:53:04.624200 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:53:04.624100 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367d90fd_ba99_4dee_9cbe_ed7ac607159d.slice/crio-217bffc74a9f70134031fc8ae78653e83db2c500a8ea5cd5d9f97aa4d0995976 WatchSource:0}: Error finding container 217bffc74a9f70134031fc8ae78653e83db2c500a8ea5cd5d9f97aa4d0995976: Status 404 returned error can't find the container with 
id 217bffc74a9f70134031fc8ae78653e83db2c500a8ea5cd5d9f97aa4d0995976 Apr 16 14:53:04.626262 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:53:04.626166 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27195efe_d90d_405b_94c1_f7863e41efcd.slice/crio-62a2e2771775e1cffb5d090c8e5f904e062d48b509ee236ccd7e01dfd169aef8 WatchSource:0}: Error finding container 62a2e2771775e1cffb5d090c8e5f904e062d48b509ee236ccd7e01dfd169aef8: Status 404 returned error can't find the container with id 62a2e2771775e1cffb5d090c8e5f904e062d48b509ee236ccd7e01dfd169aef8 Apr 16 14:53:04.940606 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.940575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tt7s8" event={"ID":"367d90fd-ba99-4dee-9cbe-ed7ac607159d","Type":"ContainerStarted","Data":"217bffc74a9f70134031fc8ae78653e83db2c500a8ea5cd5d9f97aa4d0995976"} Apr 16 14:53:04.941648 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.941618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerStarted","Data":"d565990c14f9085a1243ad0ca92a493ce195018e3a36eb24541f284d6f923e40"} Apr 16 14:53:04.944035 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.944013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerStarted","Data":"b825d4edaaee0b2a6781bb4a559f898293a0420f8edcd701b53f212afb93610d"} Apr 16 14:53:04.945050 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.945023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m29v8" event={"ID":"b683b265-ac6f-41fa-a2b7-b634d385846f","Type":"ContainerStarted","Data":"f8aabd318643bdaf72031f37e1a4694b28384779bc6b7be6eac3329f0f23a589"} Apr 16 14:53:04.946111 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.946089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" event={"ID":"27195efe-d90d-405b-94c1-f7863e41efcd","Type":"ContainerStarted","Data":"62a2e2771775e1cffb5d090c8e5f904e062d48b509ee236ccd7e01dfd169aef8"} Apr 16 14:53:04.946975 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:04.946956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" event={"ID":"11bb9de9-92f3-4fb1-9653-666f48e5e018","Type":"ContainerStarted","Data":"b9a15ef1bab9d0cf0417478fb4bdf18839f27380f8832434aecd1b7e9416f2ec"} Apr 16 14:53:05.961424 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:05.960430 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="b825d4edaaee0b2a6781bb4a559f898293a0420f8edcd701b53f212afb93610d" exitCode=0 Apr 16 14:53:05.961424 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:05.960486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"b825d4edaaee0b2a6781bb4a559f898293a0420f8edcd701b53f212afb93610d"} Apr 16 14:53:06.215642 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:06.215567 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:06.215785 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.215719 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:06.215785 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.215740 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:06.215894 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.215801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.21578088 +0000 UTC m=+41.106195748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:06.316316 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:06.316274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:06.316442 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:06.316416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:06.316571 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.316550 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:06.316663 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.316613 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.316595551 +0000 UTC m=+41.207010418 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:06.317048 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.317027 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:06.317141 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:06.317081 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:10.317065965 +0000 UTC m=+41.207480832 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:06.975606 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:06.974628 2577 generic.go:358] "Generic (PLEG): container finished" podID="01ced172-b8c5-4042-97b6-a8813deb4542" containerID="77da1f14660bad94e9fb3f10af0a953dcae9091de74f8f94249deb3a40b0c4b6" exitCode=0 Apr 16 14:53:06.975606 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:06.974729 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerDied","Data":"77da1f14660bad94e9fb3f10af0a953dcae9091de74f8f94249deb3a40b0c4b6"} Apr 16 14:53:07.982199 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:07.982163 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" event={"ID":"01ced172-b8c5-4042-97b6-a8813deb4542","Type":"ContainerStarted","Data":"6a42b68b3e4c8e65335f5ceb4820b6ba96a7ad8809a065eac2577368e7af4dbf"} Apr 16 14:53:08.006012 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:08.005909 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gdnq8" podStartSLOduration=5.698097684 podStartE2EDuration="38.005894597s" podCreationTimestamp="2026-04-16 14:52:30 +0000 UTC" firstStartedPulling="2026-04-16 14:52:32.364909566 +0000 UTC m=+3.255324437" lastFinishedPulling="2026-04-16 14:53:04.672706485 +0000 UTC m=+35.563121350" observedRunningTime="2026-04-16 14:53:08.004980681 +0000 UTC m=+38.895395569" watchObservedRunningTime="2026-04-16 14:53:08.005894597 +0000 UTC m=+38.896309465" Apr 16 14:53:10.252147 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:10.252105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:10.252534 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.252269 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:10.252534 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.252293 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:10.252534 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.252364 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.252344357 +0000 UTC m=+49.142759233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:10.353044 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:10.353002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:10.353217 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:10.353054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:10.353217 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.353163 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:10.353217 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.353196 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:10.353320 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.353233 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.353209813 +0000 UTC m=+49.243624680 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:10.353320 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:10.353249 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:18.353242048 +0000 UTC m=+49.243656911 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:14.999362 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:14.999301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m29v8" event={"ID":"b683b265-ac6f-41fa-a2b7-b634d385846f","Type":"ContainerStarted","Data":"a95f78d75f1ab5705cd6b5f2957f4ac18d07ccda98563019c448820c550366a3"} Apr 16 14:53:15.000847 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.000813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" event={"ID":"27195efe-d90d-405b-94c1-f7863e41efcd","Type":"ContainerStarted","Data":"c7c2d283cf17e091903cc783942d71ef50def7a72415bd482c8ab9b15b1408e1"} Apr 16 14:53:15.002272 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.002247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" event={"ID":"11bb9de9-92f3-4fb1-9653-666f48e5e018","Type":"ContainerStarted","Data":"03564f1a3b45c14c07012da0348c565095a4b8ecbb49e7620987770ad9bc1221"} Apr 16 14:53:15.002694 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.002675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:15.004078 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.004047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tt7s8" event={"ID":"367d90fd-ba99-4dee-9cbe-ed7ac607159d","Type":"ContainerStarted","Data":"bfc29a0bb5bd32bdc4e974dad20d748e3ed46b102eb300d0323c6fe1c157b240"} Apr 16 14:53:15.004216 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.004199 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:53:15.004491 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.004460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:53:15.005451 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.005434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerStarted","Data":"216b1e434996acd7a110b9542920ae7697221b5fd8854a949146cc3c960c1f96"} Apr 16 14:53:15.014195 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.014155 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m29v8" podStartSLOduration=34.02457714 podStartE2EDuration="44.014144969s" podCreationTimestamp="2026-04-16 14:52:31 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.651570794 +0000 UTC m=+35.541985663" lastFinishedPulling="2026-04-16 14:53:14.641138613 +0000 UTC m=+45.531553492" observedRunningTime="2026-04-16 14:53:15.013797928 +0000 UTC m=+45.904212814" watchObservedRunningTime="2026-04-16 14:53:15.014144969 +0000 UTC m=+45.904559855" Apr 16 14:53:15.028838 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.028788 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" podStartSLOduration=32.023844157 podStartE2EDuration="42.028777972s" podCreationTimestamp="2026-04-16 14:52:33 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.62461834 +0000 UTC m=+35.515033221" lastFinishedPulling="2026-04-16 14:53:14.629552169 +0000 UTC m=+45.519967036" observedRunningTime="2026-04-16 14:53:15.027705808 +0000 UTC m=+45.918120695" watchObservedRunningTime="2026-04-16 14:53:15.028777972 +0000 UTC m=+45.919192835" Apr 16 14:53:15.042790 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.042737 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" podStartSLOduration=32.064484212 podStartE2EDuration="42.042721647s" podCreationTimestamp="2026-04-16 14:52:33 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.651562785 +0000 UTC m=+35.541977653" lastFinishedPulling="2026-04-16 14:53:14.629800206 +0000 UTC m=+45.520215088" observedRunningTime="2026-04-16 14:53:15.042548946 +0000 UTC m=+45.932963832" watchObservedRunningTime="2026-04-16 14:53:15.042721647 +0000 UTC m=+45.933136534" Apr 16 14:53:15.057659 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:15.057612 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tt7s8" podStartSLOduration=36.078614455 podStartE2EDuration="46.057597233s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.651642175 +0000 UTC m=+35.542057038" lastFinishedPulling="2026-04-16 14:53:14.630624938 +0000 UTC m=+45.521039816" observedRunningTime="2026-04-16 14:53:15.057352106 +0000 UTC m=+45.947766993" watchObservedRunningTime="2026-04-16 14:53:15.057597233 +0000 UTC m=+45.948012119" Apr 16 14:53:18.016223 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.016182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerStarted","Data":"a0d872ab6e76077c6fe868512719935e096057c355f57ca573bb4e7ba46bf401"} Apr 16 14:53:18.016223 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.016221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerStarted","Data":"fc149ebfa1b8a09ea1f3a4586856f7615132f164145774620e195fe889da2b83"} Apr 16 14:53:18.032949 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.032890 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" podStartSLOduration=32.432936343 podStartE2EDuration="45.032879042s" podCreationTimestamp="2026-04-16 14:52:33 +0000 UTC" firstStartedPulling="2026-04-16 14:53:04.651439034 +0000 UTC m=+35.541853901" lastFinishedPulling="2026-04-16 14:53:17.251381733 +0000 UTC m=+48.141796600" observedRunningTime="2026-04-16 14:53:18.031823484 +0000 UTC m=+48.922238396" watchObservedRunningTime="2026-04-16 14:53:18.032879042 +0000 UTC m=+48.923293944" Apr 16 14:53:18.319674 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.319584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:18.319816 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.319713 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:18.319816 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.319730 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:18.319816 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.319796 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.319779881 +0000 UTC m=+65.210194746 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:18.420544 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.420513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:18.420651 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:18.420554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:18.420651 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.420639 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:18.420651 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.420642 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:18.420740 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.420693 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.420679578 +0000 UTC m=+65.311094442 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:18.420740 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:18.420704 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:34.420698661 +0000 UTC m=+65.311113524 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:28.945828 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:28.945799 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4rq4s" Apr 16 14:53:34.338688 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:34.338647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:53:34.339085 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.338796 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:34.339085 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.338816 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:53:34.339085 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.338876 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:06.338859257 +0000 UTC m=+97.229274130 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:53:34.439760 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:34.439728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:53:34.439875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:34.439830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:53:34.439911 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.439878 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:34.439911 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.439905 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:34.439993 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.439965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:06.439925685 +0000 UTC m=+97.330340549 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:53:34.439993 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:34.439980 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:06.439973448 +0000 UTC m=+97.330388312 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:53:35.447406 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:35.447358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:53:35.447872 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:35.447533 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:35.447872 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:53:35.447615 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.447594879 +0000 UTC m=+130.338009748 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : secret "metrics-daemon-secret" not found Apr 16 14:53:46.010686 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:53:46.010651 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tt7s8" Apr 16 14:54:06.367281 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:54:06.367234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:54:06.367696 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.367381 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:06.367696 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.367400 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:54:06.367696 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.367468 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.36745191 +0000 UTC m=+161.257866775 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:54:06.468289 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:54:06.468256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:54:06.468426 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:54:06.468298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:54:06.468426 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.468390 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:06.468426 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.468398 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:06.468546 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.468441 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:55:10.468426028 +0000 UTC m=+161.358840893 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:54:06.468546 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:06.468453 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.46844731 +0000 UTC m=+161.358862174 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:54:39.513915 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:54:39.513857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:54:39.514436 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:39.514036 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:39.514436 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:54:39.514106 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs podName:731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e nodeName:}" failed. No retries permitted until 2026-04-16 14:56:41.514088064 +0000 UTC m=+252.404502928 (durationBeforeRetry 2m2s). 
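
The durationBeforeRetry values for these mount operations double on every failure: 4s, 8s, 16s, 32s, 1m4s, and now 2m2s, where they stay for as long as the secrets remain missing. This looks like the kubelet's exponential backoff for pending volume operations with a cap at 2m2s; the cap here is inferred from the observed plateau rather than taken from kubelet source. A sketch that reproduces the schedule:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 4 * time.Second                       // first retry delay seen above
	const maxDelay = 2*time.Minute + 2*time.Second // plateau observed in this log
	for i := 0; i < 8; i++ {
		fmt.Println(delay) // 4s 8s 16s 32s 1m4s 2m2s 2m2s 2m2s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
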
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs") pod "network-metrics-daemon-7rlnq" (UID: "731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e") : secret "metrics-daemon-secret" not found Apr 16 14:55:05.556135 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:05.556078 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" podUID="746d3597-20e2-4f59-a894-1bd13270f82a" Apr 16 14:55:05.561210 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:05.561189 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nqtsn" podUID="df57e009-9151-4b90-8c22-bedd6e86b057" Apr 16 14:55:05.609776 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:05.609744 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lbn5l" podUID="7b10a07b-1617-44d5-b186-97458e04e5e0" Apr 16 14:55:05.773464 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:05.773436 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7rlnq" podUID="731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e" Apr 16 14:55:06.266537 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:06.266504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:55:06.266537 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:06.266538 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:55:06.266724 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:06.266651 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:55:06.580523 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:06.580458 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nhw5h_a7987cab-cae1-4625-84c9-b135a5b3b6e7/dns-node-resolver/0.log" Apr 16 14:55:07.379053 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:07.379028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dx2h6_4f1d6917-4b83-407b-a119-822474f666a8/node-ca/0.log" Apr 16 14:55:10.444680 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:10.444642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") pod \"image-registry-f5df9cf-4lq8j\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:55:10.445096 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.444788 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:55:10.445096 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.444808 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f5df9cf-4lq8j: secret "image-registry-tls" not found Apr 16 14:55:10.445096 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.444874 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls podName:746d3597-20e2-4f59-a894-1bd13270f82a nodeName:}" failed. No retries permitted until 2026-04-16 14:57:12.444858435 +0000 UTC m=+283.335273299 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls") pod "image-registry-f5df9cf-4lq8j" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a") : secret "image-registry-tls" not found Apr 16 14:55:10.546064 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:10.546027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:55:10.546064 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:10.546069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:55:10.546276 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.546181 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:55:10.546276 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.546190 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:55:10.546276 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.546243 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert podName:df57e009-9151-4b90-8c22-bedd6e86b057 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:57:12.546230448 +0000 UTC m=+283.436645312 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert") pod "ingress-canary-nqtsn" (UID: "df57e009-9151-4b90-8c22-bedd6e86b057") : secret "canary-serving-cert" not found Apr 16 14:55:10.546276 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:10.546257 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls podName:7b10a07b-1617-44d5-b186-97458e04e5e0 nodeName:}" failed. No retries permitted until 2026-04-16 14:57:12.546249909 +0000 UTC m=+283.436664773 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls") pod "dns-default-lbn5l" (UID: "7b10a07b-1617-44d5-b186-97458e04e5e0") : secret "dns-default-metrics-tls" not found Apr 16 14:55:15.003261 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.003202 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" podUID="11bb9de9-92f3-4fb1-9653-666f48e5e018" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.6:8000/readyz\": dial tcp 10.132.0.6:8000: connect: connection refused" Apr 16 14:55:15.288169 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.288135 2577 generic.go:358] "Generic (PLEG): container finished" podID="27195efe-d90d-405b-94c1-f7863e41efcd" containerID="c7c2d283cf17e091903cc783942d71ef50def7a72415bd482c8ab9b15b1408e1" exitCode=255 Apr 16 14:55:15.288364 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.288214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" event={"ID":"27195efe-d90d-405b-94c1-f7863e41efcd","Type":"ContainerDied","Data":"c7c2d283cf17e091903cc783942d71ef50def7a72415bd482c8ab9b15b1408e1"} Apr 16 14:55:15.288575 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.288552 2577 scope.go:117] "RemoveContainer" containerID="c7c2d283cf17e091903cc783942d71ef50def7a72415bd482c8ab9b15b1408e1" Apr 16 14:55:15.289495 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.289476 2577 generic.go:358] "Generic (PLEG): container finished" podID="11bb9de9-92f3-4fb1-9653-666f48e5e018" containerID="03564f1a3b45c14c07012da0348c565095a4b8ecbb49e7620987770ad9bc1221" exitCode=1 Apr 16 14:55:15.289544 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.289507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" event={"ID":"11bb9de9-92f3-4fb1-9653-666f48e5e018","Type":"ContainerDied","Data":"03564f1a3b45c14c07012da0348c565095a4b8ecbb49e7620987770ad9bc1221"} Apr 16 14:55:15.289844 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:15.289828 2577 scope.go:117] "RemoveContainer" containerID="03564f1a3b45c14c07012da0348c565095a4b8ecbb49e7620987770ad9bc1221" Apr 16 14:55:16.293473 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:16.293440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8898fcc9-t9tfv" event={"ID":"27195efe-d90d-405b-94c1-f7863e41efcd","Type":"ContainerStarted","Data":"7583f00a355d3066d30406c92cedd0dd8331885d6eab4389b5481130a41af569"} Apr 16 14:55:16.295003 ip-10-0-139-55 kubenswrapper[2577]: 
I0416 14:55:16.294982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" event={"ID":"11bb9de9-92f3-4fb1-9653-666f48e5e018","Type":"ContainerStarted","Data":"6f088e6b937b79c1744d71b93077097869342af8d6a53b717be44a129a6327bb"} Apr 16 14:55:16.295242 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:16.295224 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:55:16.295878 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:16.295860 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-77c998d4f7-wkl9c" Apr 16 14:55:16.753177 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:16.753140 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:55:26.880831 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.880794 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xbx5f"] Apr 16 14:55:26.883577 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.883556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:26.885783 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.885763 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:26.886704 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.886682 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:26.886819 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.886751 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8pzvz\"" Apr 16 14:55:26.886979 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.886962 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:26.887254 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.887239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:26.893406 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.893376 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xbx5f"] Apr 16 14:55:26.978049 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.978020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e012d2f1-e273-4443-8305-4cca55b420d4-crio-socket\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:26.978049 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.978060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggl9d\" (UniqueName: \"kubernetes.io/projected/e012d2f1-e273-4443-8305-4cca55b420d4-kube-api-access-ggl9d\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " 
pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:26.978266 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.978156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e012d2f1-e273-4443-8305-4cca55b420d4-data-volume\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:26.978266 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.978182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e012d2f1-e273-4443-8305-4cca55b420d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:26.978266 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:26.978211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e012d2f1-e273-4443-8305-4cca55b420d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.078890 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.078857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggl9d\" (UniqueName: \"kubernetes.io/projected/e012d2f1-e273-4443-8305-4cca55b420d4-kube-api-access-ggl9d\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079039 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.078924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e012d2f1-e273-4443-8305-4cca55b420d4-data-volume\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079145 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e012d2f1-e273-4443-8305-4cca55b420d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079204 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e012d2f1-e273-4443-8305-4cca55b420d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079262 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e012d2f1-e273-4443-8305-4cca55b420d4-data-volume\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " 
pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079312 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e012d2f1-e273-4443-8305-4cca55b420d4-crio-socket\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079383 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e012d2f1-e273-4443-8305-4cca55b420d4-crio-socket\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.079655 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.079638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e012d2f1-e273-4443-8305-4cca55b420d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.083051 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.083035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e012d2f1-e273-4443-8305-4cca55b420d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.087053 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.087025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggl9d\" (UniqueName: \"kubernetes.io/projected/e012d2f1-e273-4443-8305-4cca55b420d4-kube-api-access-ggl9d\") pod \"insights-runtime-extractor-xbx5f\" (UID: \"e012d2f1-e273-4443-8305-4cca55b420d4\") " pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.193057 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.192955 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xbx5f" Apr 16 14:55:27.313650 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.313617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xbx5f"] Apr 16 14:55:27.317733 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:55:27.317704 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode012d2f1_e273_4443_8305_4cca55b420d4.slice/crio-1eb8e3bc5ccbed011ba1774debeb11e08d91f5be470296f8aa9441d9731b5f00 WatchSource:0}: Error finding container 1eb8e3bc5ccbed011ba1774debeb11e08d91f5be470296f8aa9441d9731b5f00: Status 404 returned error can't find the container with id 1eb8e3bc5ccbed011ba1774debeb11e08d91f5be470296f8aa9441d9731b5f00 Apr 16 14:55:27.325822 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:27.325790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xbx5f" event={"ID":"e012d2f1-e273-4443-8305-4cca55b420d4","Type":"ContainerStarted","Data":"1eb8e3bc5ccbed011ba1774debeb11e08d91f5be470296f8aa9441d9731b5f00"} Apr 16 14:55:28.330066 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:28.330026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xbx5f" event={"ID":"e012d2f1-e273-4443-8305-4cca55b420d4","Type":"ContainerStarted","Data":"2c8637c93bbd5937e233e34e97e6fd33ccd4e161679fe578a8e1a0242f656e91"} Apr 16 14:55:28.330415 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:28.330071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xbx5f" event={"ID":"e012d2f1-e273-4443-8305-4cca55b420d4","Type":"ContainerStarted","Data":"53e87117cae2d2cd54b7546d3a2accb3c41a4a9fad39128cea27a5200e45ddf7"} Apr 16 14:55:30.336829 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:30.336789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xbx5f" event={"ID":"e012d2f1-e273-4443-8305-4cca55b420d4","Type":"ContainerStarted","Data":"912d2b42faa148833642a957dfff18f43c754ac8403b8202217f98218ffe5900"} Apr 16 14:55:30.356860 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:30.356807 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xbx5f" podStartSLOduration=2.401283478 podStartE2EDuration="4.356792316s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.369407299 +0000 UTC m=+178.259822163" lastFinishedPulling="2026-04-16 14:55:29.324916137 +0000 UTC m=+180.215331001" observedRunningTime="2026-04-16 14:55:30.356791481 +0000 UTC m=+181.247206366" watchObservedRunningTime="2026-04-16 14:55:30.356792316 +0000 UTC m=+181.247207203" Apr 16 14:55:48.929677 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:48.929642 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f5df9cf-4lq8j"] Apr 16 14:55:48.930155 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:48.929837 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" podUID="746d3597-20e2-4f59-a894-1bd13270f82a" Apr 16 14:55:49.383405 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.383377 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:55:49.387368 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.387341 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:55:49.545873 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.545824 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.545873 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.545888 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546152 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.545921 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546152 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.545968 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546152 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.545997 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546152 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.546052 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546152 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.546082 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m7vs\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs\") pod \"746d3597-20e2-4f59-a894-1bd13270f82a\" (UID: \"746d3597-20e2-4f59-a894-1bd13270f82a\") " Apr 16 14:55:49.546429 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.546398 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:49.546487 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.546402 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:49.546487 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.546418 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:49.548479 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.548455 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs" (OuterVolumeSpecName: "kube-api-access-2m7vs") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "kube-api-access-2m7vs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:49.548594 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.548507 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:49.548594 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.548529 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:55:49.548594 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.548550 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "746d3597-20e2-4f59-a894-1bd13270f82a" (UID: "746d3597-20e2-4f59-a894-1bd13270f82a"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646785 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-installation-pull-secrets\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646815 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2m7vs\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-kube-api-access-2m7vs\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646828 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-trusted-ca\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646841 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-bound-sa-token\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646853 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/746d3597-20e2-4f59-a894-1bd13270f82a-image-registry-private-configuration\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646866 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/746d3597-20e2-4f59-a894-1bd13270f82a-registry-certificates\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:49.646875 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:49.646879 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/746d3597-20e2-4f59-a894-1bd13270f82a-ca-trust-extracted\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:50.385436 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:50.385407 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f5df9cf-4lq8j" Apr 16 14:55:50.423495 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:50.423458 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f5df9cf-4lq8j"] Apr 16 14:55:50.427547 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:50.427517 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-f5df9cf-4lq8j"] Apr 16 14:55:50.554429 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:50.554392 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/746d3597-20e2-4f59-a894-1bd13270f82a-registry-tls\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 14:55:51.756557 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:51.756523 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="746d3597-20e2-4f59-a894-1bd13270f82a" path="/var/lib/kubelet/pods/746d3597-20e2-4f59-a894-1bd13270f82a/volumes" Apr 16 14:55:53.331095 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.331061 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-llxsj"] Apr 16 14:55:53.335322 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.335297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.337745 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.337720 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:53.338791 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.338769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:53.340005 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.339988 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:55:53.340068 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.340030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:55:53.340123 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.340113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:53.340191 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.340175 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:53.340322 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.340306 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mkbd4\"" Apr 16 14:55:53.475993 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.475954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-textfile\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.475993 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.475994 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-metrics-client-ca\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476200 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476065 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg9qz\" (UniqueName: \"kubernetes.io/projected/e3f766f2-936e-41b9-8875-5e4b71e56887-kube-api-access-cg9qz\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476200 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-sys\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476200 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-wtmp\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476200 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-accelerators-collector-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476341 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-root\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476341 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476219 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.476341 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.476260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.576920 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.576884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-textfile\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577054 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.576926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-metrics-client-ca\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577054 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg9qz\" (UniqueName: \"kubernetes.io/projected/e3f766f2-936e-41b9-8875-5e4b71e56887-kube-api-access-cg9qz\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577054 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-sys\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577054 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-wtmp\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577226 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-accelerators-collector-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577226 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-sys\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577226 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-root\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-textfile\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577217 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-root\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-wtmp\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577347 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577512 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:53.577349 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:55:53.577512 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:55:53.577410 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls podName:e3f766f2-936e-41b9-8875-5e4b71e56887 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:54.077392861 +0000 UTC m=+204.967807728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls") pod "node-exporter-llxsj" (UID: "e3f766f2-936e-41b9-8875-5e4b71e56887") : secret "node-exporter-tls" not found Apr 16 14:55:53.577593 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-metrics-client-ca\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.577593 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.577560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-accelerators-collector-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.579581 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.579565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:53.589388 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:53.589318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg9qz\" (UniqueName: \"kubernetes.io/projected/e3f766f2-936e-41b9-8875-5e4b71e56887-kube-api-access-cg9qz\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:54.080622 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:54.080585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:54.083077 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:54.083056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e3f766f2-936e-41b9-8875-5e4b71e56887-node-exporter-tls\") pod \"node-exporter-llxsj\" (UID: \"e3f766f2-936e-41b9-8875-5e4b71e56887\") " pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:54.244674 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:54.244620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-llxsj" Apr 16 14:55:54.252961 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:55:54.252915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3f766f2_936e_41b9_8875_5e4b71e56887.slice/crio-71a27ad5019359471cfd680912385f359a56e8d07fd1eca8fa82096b7011e6c7 WatchSource:0}: Error finding container 71a27ad5019359471cfd680912385f359a56e8d07fd1eca8fa82096b7011e6c7: Status 404 returned error can't find the container with id 71a27ad5019359471cfd680912385f359a56e8d07fd1eca8fa82096b7011e6c7 Apr 16 14:55:54.396230 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:54.396141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llxsj" event={"ID":"e3f766f2-936e-41b9-8875-5e4b71e56887","Type":"ContainerStarted","Data":"71a27ad5019359471cfd680912385f359a56e8d07fd1eca8fa82096b7011e6c7"} Apr 16 14:55:55.400427 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:55.400390 2577 generic.go:358] "Generic (PLEG): container finished" podID="e3f766f2-936e-41b9-8875-5e4b71e56887" containerID="88a0f5365258d9ce9d622f79059b2c0668b0ad79dcb2bef36d5efe4e152de07c" exitCode=0 Apr 16 14:55:55.400857 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:55.400460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llxsj" event={"ID":"e3f766f2-936e-41b9-8875-5e4b71e56887","Type":"ContainerDied","Data":"88a0f5365258d9ce9d622f79059b2c0668b0ad79dcb2bef36d5efe4e152de07c"} Apr 16 14:55:56.404841 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:56.404805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llxsj" event={"ID":"e3f766f2-936e-41b9-8875-5e4b71e56887","Type":"ContainerStarted","Data":"e55010d724b924e13f8a9c8fd79bc70c673d272cd01819a5df9bebd6ca881354"} Apr 16 14:55:56.404841 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:56.404838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-llxsj" event={"ID":"e3f766f2-936e-41b9-8875-5e4b71e56887","Type":"ContainerStarted","Data":"630fb4c91d1059bc9a430e9375059af9388f9152e3957da6ea175d4fc06202a7"} Apr 16 14:55:56.425113 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:55:56.425066 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-llxsj" podStartSLOduration=2.777922847 podStartE2EDuration="3.425050829s" podCreationTimestamp="2026-04-16 14:55:53 +0000 UTC" firstStartedPulling="2026-04-16 14:55:54.254744416 +0000 UTC m=+205.145159280" lastFinishedPulling="2026-04-16 14:55:54.901872395 +0000 UTC m=+205.792287262" observedRunningTime="2026-04-16 14:55:56.423567376 +0000 UTC m=+207.313982262" watchObservedRunningTime="2026-04-16 14:55:56.425050829 +0000 UTC m=+207.315465716" Apr 16 14:56:02.833075 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:02.833032 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" podUID="caf51d6d-515c-4900-b0f6-a0458a558256" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:56:12.833381 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:12.833346 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" podUID="caf51d6d-515c-4900-b0f6-a0458a558256" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:56:22.833029 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:22.832977 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" podUID="caf51d6d-515c-4900-b0f6-a0458a558256" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:56:22.833497 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:22.833061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" Apr 16 14:56:22.833551 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:22.833521 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a0d872ab6e76077c6fe868512719935e096057c355f57ca573bb4e7ba46bf401"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:56:22.833613 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:22.833598 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" podUID="caf51d6d-515c-4900-b0f6-a0458a558256" containerName="service-proxy" containerID="cri-o://a0d872ab6e76077c6fe868512719935e096057c355f57ca573bb4e7ba46bf401" gracePeriod=30 Apr 16 14:56:23.478673 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:23.478635 2577 generic.go:358] "Generic (PLEG): container finished" podID="caf51d6d-515c-4900-b0f6-a0458a558256" containerID="a0d872ab6e76077c6fe868512719935e096057c355f57ca573bb4e7ba46bf401" exitCode=2 Apr 16 14:56:23.478820 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:23.478707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerDied","Data":"a0d872ab6e76077c6fe868512719935e096057c355f57ca573bb4e7ba46bf401"} Apr 16 14:56:23.478820 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:23.478747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-d989dbd46-shlwr" event={"ID":"caf51d6d-515c-4900-b0f6-a0458a558256","Type":"ContainerStarted","Data":"8b697f43c3fc94320ec5b01db681909feb02c9b89b5742f20250fe382830defa"} Apr 16 14:56:24.727589 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:24.727545 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nhw5h_a7987cab-cae1-4625-84c9-b135a5b3b6e7/dns-node-resolver/0.log" Apr 16 14:56:41.562873 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:41.562820 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: \"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:56:41.565452 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:41.565425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e-metrics-certs\") pod \"network-metrics-daemon-7rlnq\" (UID: 
\"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e\") " pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:56:41.656811 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:41.656768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kc8hc\"" Apr 16 14:56:41.664733 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:41.664699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7rlnq" Apr 16 14:56:41.786775 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:41.786738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7rlnq"] Apr 16 14:56:41.790051 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:56:41.790014 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731df22b_c6b3_4d5d_ae5c_60a8d6d4d99e.slice/crio-68802a097cda44a6d9d566e7848a048345a55310aebf0729a47e2da841d45848 WatchSource:0}: Error finding container 68802a097cda44a6d9d566e7848a048345a55310aebf0729a47e2da841d45848: Status 404 returned error can't find the container with id 68802a097cda44a6d9d566e7848a048345a55310aebf0729a47e2da841d45848 Apr 16 14:56:42.529077 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:42.529035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rlnq" event={"ID":"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e","Type":"ContainerStarted","Data":"68802a097cda44a6d9d566e7848a048345a55310aebf0729a47e2da841d45848"} Apr 16 14:56:43.533020 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:43.532982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rlnq" event={"ID":"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e","Type":"ContainerStarted","Data":"bedcf949bf4fff3e91152160f8bea68aed3cfda78d3ae3da4a2477db3256e2cd"} Apr 16 14:56:43.533020 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:43.533016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7rlnq" event={"ID":"731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e","Type":"ContainerStarted","Data":"916f0676532b3e20aed9fb0cf301fc8b0f9019e4be6ea10f91586f790a9b664f"} Apr 16 14:56:43.549487 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:56:43.549442 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7rlnq" podStartSLOduration=253.68019189 podStartE2EDuration="4m14.549427948s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:56:41.792217211 +0000 UTC m=+252.682632076" lastFinishedPulling="2026-04-16 14:56:42.66145327 +0000 UTC m=+253.551868134" observedRunningTime="2026-04-16 14:56:43.548984927 +0000 UTC m=+254.439399814" watchObservedRunningTime="2026-04-16 14:56:43.549427948 +0000 UTC m=+254.439842834" Apr 16 14:57:09.267130 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:57:09.267059 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nqtsn" podUID="df57e009-9151-4b90-8c22-bedd6e86b057" Apr 16 14:57:09.267130 ip-10-0-139-55 kubenswrapper[2577]: E0416 14:57:09.267059 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-dns/dns-default-lbn5l" podUID="7b10a07b-1617-44d5-b186-97458e04e5e0" Apr 16 14:57:09.602817 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:09.602740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:57:09.602974 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:09.602740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:12.587770 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.587737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:57:12.588227 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.587785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:12.590299 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.590273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b10a07b-1617-44d5-b186-97458e04e5e0-metrics-tls\") pod \"dns-default-lbn5l\" (UID: \"7b10a07b-1617-44d5-b186-97458e04e5e0\") " pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:12.590401 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.590326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df57e009-9151-4b90-8c22-bedd6e86b057-cert\") pod \"ingress-canary-nqtsn\" (UID: \"df57e009-9151-4b90-8c22-bedd6e86b057\") " pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:57:12.606004 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.605983 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vzfs6\"" Apr 16 14:57:12.606703 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.606690 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-8jkdg\"" Apr 16 14:57:12.614018 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.613998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:12.614093 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.614016 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nqtsn" Apr 16 14:57:12.740750 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.740722 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nqtsn"] Apr 16 14:57:12.743318 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:57:12.743292 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf57e009_9151_4b90_8c22_bedd6e86b057.slice/crio-386de1cf3b5bb8b445be86b6ef54a2ae007e252a9056a00f361ce234a959a30d WatchSource:0}: Error finding container 386de1cf3b5bb8b445be86b6ef54a2ae007e252a9056a00f361ce234a959a30d: Status 404 returned error can't find the container with id 386de1cf3b5bb8b445be86b6ef54a2ae007e252a9056a00f361ce234a959a30d Apr 16 14:57:12.756998 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:12.756977 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lbn5l"] Apr 16 14:57:12.759604 ip-10-0-139-55 kubenswrapper[2577]: W0416 14:57:12.759581 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b10a07b_1617_44d5_b186_97458e04e5e0.slice/crio-655e24b625c8332ea9e70d288b2565b32ee485526bd442cfd265a7edd396d8de WatchSource:0}: Error finding container 655e24b625c8332ea9e70d288b2565b32ee485526bd442cfd265a7edd396d8de: Status 404 returned error can't find the container with id 655e24b625c8332ea9e70d288b2565b32ee485526bd442cfd265a7edd396d8de Apr 16 14:57:13.614006 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:13.613966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbn5l" event={"ID":"7b10a07b-1617-44d5-b186-97458e04e5e0","Type":"ContainerStarted","Data":"655e24b625c8332ea9e70d288b2565b32ee485526bd442cfd265a7edd396d8de"} Apr 16 14:57:13.615260 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:13.615228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nqtsn" event={"ID":"df57e009-9151-4b90-8c22-bedd6e86b057","Type":"ContainerStarted","Data":"386de1cf3b5bb8b445be86b6ef54a2ae007e252a9056a00f361ce234a959a30d"} Apr 16 14:57:15.621417 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.621378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbn5l" event={"ID":"7b10a07b-1617-44d5-b186-97458e04e5e0","Type":"ContainerStarted","Data":"fd6b9699c2e05d59d6782e133a77d973988292c78f6a53bfdd766645305d6adf"} Apr 16 14:57:15.621417 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.621416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lbn5l" event={"ID":"7b10a07b-1617-44d5-b186-97458e04e5e0","Type":"ContainerStarted","Data":"098643cdbccc953bd890c612fc908b957b47ccc31b91ea51bb2223409586c762"} Apr 16 14:57:15.621971 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.621514 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:15.622712 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.622692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nqtsn" event={"ID":"df57e009-9151-4b90-8c22-bedd6e86b057","Type":"ContainerStarted","Data":"db7621fc1c99ee9e7dc518e233dce87c2bcb0f31523770575656be32fab011bd"} Apr 16 14:57:15.648972 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.648907 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-dns/dns-default-lbn5l" podStartSLOduration=251.803477037 podStartE2EDuration="4m13.648871629s" podCreationTimestamp="2026-04-16 14:53:02 +0000 UTC" firstStartedPulling="2026-04-16 14:57:12.761200891 +0000 UTC m=+283.651615755" lastFinishedPulling="2026-04-16 14:57:14.606595481 +0000 UTC m=+285.497010347" observedRunningTime="2026-04-16 14:57:15.647810346 +0000 UTC m=+286.538225235" watchObservedRunningTime="2026-04-16 14:57:15.648871629 +0000 UTC m=+286.539286515" Apr 16 14:57:15.662202 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:15.662153 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nqtsn" podStartSLOduration=251.797686476 podStartE2EDuration="4m13.662139207s" podCreationTimestamp="2026-04-16 14:53:02 +0000 UTC" firstStartedPulling="2026-04-16 14:57:12.745297466 +0000 UTC m=+283.635712330" lastFinishedPulling="2026-04-16 14:57:14.609750194 +0000 UTC m=+285.500165061" observedRunningTime="2026-04-16 14:57:15.661302571 +0000 UTC m=+286.551717458" watchObservedRunningTime="2026-04-16 14:57:15.662139207 +0000 UTC m=+286.552554123" Apr 16 14:57:25.627924 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:25.627893 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lbn5l" Apr 16 14:57:29.638518 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:29.638486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 14:57:29.639300 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:29.639274 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 14:57:29.649000 ip-10-0-139-55 kubenswrapper[2577]: I0416 14:57:29.648978 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 15:02:29.660233 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:02:29.660194 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:02:29.660761 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:02:29.660249 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:03:24.398291 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.398250 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-fjpph"] Apr 16 15:03:24.401542 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.401524 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:24.404017 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.403986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 15:03:24.404125 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.404070 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-sbr5p\"" Apr 16 15:03:24.404125 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.404111 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:03:24.404877 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.404861 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:03:24.404963 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.404862 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:03:24.410588 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.410564 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fjpph"] Apr 16 15:03:24.449797 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.449761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l8ps\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-kube-api-access-4l8ps\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:24.450042 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.449821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:24.550456 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.550404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l8ps\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-kube-api-access-4l8ps\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:24.550657 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.550481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:24.550657 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:03:24.550606 2577 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 15:03:24.550657 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:03:24.550629 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-fjpph: secret "keda-admission-webhooks-certs" not found Apr 16 15:03:24.550840 ip-10-0-139-55 
kubenswrapper[2577]: E0416 15:03:24.550704 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates podName:5164d40e-3fda-42ba-9ae1-27a4318b2aa9 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:25.050680589 +0000 UTC m=+655.941095454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates") pod "keda-admission-cf49989db-fjpph" (UID: "5164d40e-3fda-42ba-9ae1-27a4318b2aa9") : secret "keda-admission-webhooks-certs" not found Apr 16 15:03:24.565019 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:24.564991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l8ps\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-kube-api-access-4l8ps\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:25.054054 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.054001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:25.056655 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.056634 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5164d40e-3fda-42ba-9ae1-27a4318b2aa9-certificates\") pod \"keda-admission-cf49989db-fjpph\" (UID: \"5164d40e-3fda-42ba-9ae1-27a4318b2aa9\") " pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:25.312429 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.312329 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:25.437338 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.437292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fjpph"] Apr 16 15:03:25.439960 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:03:25.439919 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5164d40e_3fda_42ba_9ae1_27a4318b2aa9.slice/crio-93ac1df348c8844c778314ab1dde4924912f4ef871dfc1fd4cded9a64f5792f3 WatchSource:0}: Error finding container 93ac1df348c8844c778314ab1dde4924912f4ef871dfc1fd4cded9a64f5792f3: Status 404 returned error can't find the container with id 93ac1df348c8844c778314ab1dde4924912f4ef871dfc1fd4cded9a64f5792f3 Apr 16 15:03:25.441241 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.441225 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:03:25.541716 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:25.541677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fjpph" event={"ID":"5164d40e-3fda-42ba-9ae1-27a4318b2aa9","Type":"ContainerStarted","Data":"93ac1df348c8844c778314ab1dde4924912f4ef871dfc1fd4cded9a64f5792f3"} Apr 16 15:03:29.554433 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:29.554394 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fjpph" event={"ID":"5164d40e-3fda-42ba-9ae1-27a4318b2aa9","Type":"ContainerStarted","Data":"90b3933c387a1fe2c5f0a176d1ea5d6058d3f062f6d2708670a637f64276b601"} Apr 16 15:03:29.554893 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:29.554508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:03:29.573477 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:29.573433 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-fjpph" podStartSLOduration=1.833759412 podStartE2EDuration="5.573417326s" podCreationTimestamp="2026-04-16 15:03:24 +0000 UTC" firstStartedPulling="2026-04-16 15:03:25.441386531 +0000 UTC m=+656.331801397" lastFinishedPulling="2026-04-16 15:03:29.181044437 +0000 UTC m=+660.071459311" observedRunningTime="2026-04-16 15:03:29.571794464 +0000 UTC m=+660.462209349" watchObservedRunningTime="2026-04-16 15:03:29.573417326 +0000 UTC m=+660.463832211" Apr 16 15:03:50.559714 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:03:50.559678 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-fjpph" Apr 16 15:04:31.819954 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.819842 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j"] Apr 16 15:04:31.821896 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.821880 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:31.824185 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.824164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:04:31.824299 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.824164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:04:31.825017 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.824999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 15:04:31.825197 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.825000 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-z7p7g\"" Apr 16 15:04:31.834406 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.834382 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j"] Apr 16 15:04:31.919103 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.919070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:31.919282 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:31.919123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fq7\" (UniqueName: \"kubernetes.io/projected/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-kube-api-access-96fq7\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.020179 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.020136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.020348 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.020199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96fq7\" (UniqueName: \"kubernetes.io/projected/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-kube-api-access-96fq7\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.020348 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:04:32.020239 2577 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 15:04:32.020348 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:04:32.020316 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert podName:54a8c02a-ce22-4a1e-bbe6-5a08c52ececd nodeName:}" failed. No retries permitted until 2026-04-16 15:04:32.520298456 +0000 UTC m=+723.410713322 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert") pod "llmisvc-controller-manager-68cc5db7c4-wdv5j" (UID: "54a8c02a-ce22-4a1e-bbe6-5a08c52ececd") : secret "llmisvc-webhook-server-cert" not found Apr 16 15:04:32.031641 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.031610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fq7\" (UniqueName: \"kubernetes.io/projected/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-kube-api-access-96fq7\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.524411 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.524364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.527698 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.527664 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54a8c02a-ce22-4a1e-bbe6-5a08c52ececd-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-wdv5j\" (UID: \"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.731732 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.731696 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:32.852847 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:32.852817 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j"] Apr 16 15:04:32.855785 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:04:32.855756 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54a8c02a_ce22_4a1e_bbe6_5a08c52ececd.slice/crio-2314561b14eecc8470b6b047114c8bd73d8c685a2723dc32aa377b76d089ddbb WatchSource:0}: Error finding container 2314561b14eecc8470b6b047114c8bd73d8c685a2723dc32aa377b76d089ddbb: Status 404 returned error can't find the container with id 2314561b14eecc8470b6b047114c8bd73d8c685a2723dc32aa377b76d089ddbb Apr 16 15:04:33.721568 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:33.721528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" event={"ID":"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd","Type":"ContainerStarted","Data":"2314561b14eecc8470b6b047114c8bd73d8c685a2723dc32aa377b76d089ddbb"} Apr 16 15:04:34.725093 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:34.725064 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" event={"ID":"54a8c02a-ce22-4a1e-bbe6-5a08c52ececd","Type":"ContainerStarted","Data":"5f9429afb3f957f8c6c2c010e77c12ef5672e2b17fc9976f896706e533507883"} Apr 16 15:04:34.725447 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:34.725184 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:04:34.742217 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:04:34.742175 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" podStartSLOduration=2.005251886 podStartE2EDuration="3.742163208s" podCreationTimestamp="2026-04-16 15:04:31 +0000 UTC" firstStartedPulling="2026-04-16 15:04:32.856872713 +0000 UTC m=+723.747287576" lastFinishedPulling="2026-04-16 15:04:34.59378403 +0000 UTC m=+725.484198898" observedRunningTime="2026-04-16 15:04:34.741533973 +0000 UTC m=+725.631948859" watchObservedRunningTime="2026-04-16 15:04:34.742163208 +0000 UTC m=+725.632578110" Apr 16 15:05:05.730104 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:05.730069 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-wdv5j" Apr 16 15:05:40.620136 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.620091 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-tlkcq"] Apr 16 15:05:40.623190 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.623174 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:40.625501 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.625480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 15:05:40.625601 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.625583 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-n8dlz\"" Apr 16 15:05:40.632364 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.632341 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tlkcq"] Apr 16 15:05:40.710712 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.710682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:40.710865 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.710735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkxw\" (UniqueName: \"kubernetes.io/projected/abe97d4d-ac00-4e82-9640-3778e78c1453-kube-api-access-crkxw\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:40.811302 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.811260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crkxw\" (UniqueName: \"kubernetes.io/projected/abe97d4d-ac00-4e82-9640-3778e78c1453-kube-api-access-crkxw\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:40.811483 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.811319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:40.811483 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:05:40.811419 2577 secret.go:189] Couldn't get secret 
kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 15:05:40.811555 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:05:40.811488 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert podName:abe97d4d-ac00-4e82-9640-3778e78c1453 nodeName:}" failed. No retries permitted until 2026-04-16 15:05:41.311467716 +0000 UTC m=+792.201882597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert") pod "odh-model-controller-696fc77849-tlkcq" (UID: "abe97d4d-ac00-4e82-9640-3778e78c1453") : secret "odh-model-controller-webhook-cert" not found Apr 16 15:05:40.821804 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:40.821764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkxw\" (UniqueName: \"kubernetes.io/projected/abe97d4d-ac00-4e82-9640-3778e78c1453-kube-api-access-crkxw\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:41.314675 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:41.314635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:41.317271 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:41.317243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/abe97d4d-ac00-4e82-9640-3778e78c1453-cert\") pod \"odh-model-controller-696fc77849-tlkcq\" (UID: \"abe97d4d-ac00-4e82-9640-3778e78c1453\") " pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:41.533144 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:41.533089 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:41.658152 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:41.658113 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tlkcq"] Apr 16 15:05:41.661686 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:05:41.661659 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe97d4d_ac00_4e82_9640_3778e78c1453.slice/crio-a96e8254bdd84accaf59184859d01f506ea4ba8a59aa2d17d6180026fbae47f4 WatchSource:0}: Error finding container a96e8254bdd84accaf59184859d01f506ea4ba8a59aa2d17d6180026fbae47f4: Status 404 returned error can't find the container with id a96e8254bdd84accaf59184859d01f506ea4ba8a59aa2d17d6180026fbae47f4 Apr 16 15:05:41.898821 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:41.898727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tlkcq" event={"ID":"abe97d4d-ac00-4e82-9640-3778e78c1453","Type":"ContainerStarted","Data":"a96e8254bdd84accaf59184859d01f506ea4ba8a59aa2d17d6180026fbae47f4"} Apr 16 15:05:44.908788 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:44.908752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tlkcq" event={"ID":"abe97d4d-ac00-4e82-9640-3778e78c1453","Type":"ContainerStarted","Data":"f9448285f5e987b9b4bbf67b27303cc2613653c0c43bc2697c809c6596d58d0d"} Apr 16 15:05:44.909288 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:44.908902 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:05:44.927589 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:44.927538 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-tlkcq" podStartSLOduration=2.385078702 podStartE2EDuration="4.927521778s" podCreationTimestamp="2026-04-16 15:05:40 +0000 UTC" firstStartedPulling="2026-04-16 15:05:41.662899345 +0000 UTC m=+792.553314209" lastFinishedPulling="2026-04-16 15:05:44.205342416 +0000 UTC m=+795.095757285" observedRunningTime="2026-04-16 15:05:44.926379204 +0000 UTC m=+795.816794092" watchObservedRunningTime="2026-04-16 15:05:44.927521778 +0000 UTC m=+795.817936664" Apr 16 15:05:55.913655 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:05:55.913624 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-tlkcq" Apr 16 15:06:07.277258 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.277224 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:07.280429 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.280411 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.282612 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.282591 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 15:06:07.282612 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.282610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\"" Apr 16 15:06:07.284837 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.284818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:07.410845 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.410811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqfg\" (UniqueName: \"kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.411046 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.410853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.511845 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.511807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqfg\" (UniqueName: \"kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.511845 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.511845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.512279 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.512259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.520341 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.520317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqfg\" (UniqueName: \"kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg\") pod \"seaweedfs-tls-custom-ddd4dbfd-vvlxr\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.590035 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.589921 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:07.706806 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.706776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:07.709875 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:06:07.709849 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5717970c_f7fe_4a09_b9ac_8ce0da3b80b4.slice/crio-4ccc7ad79406fe003dcf399cac007cef311019ee9b5dbbf68d264a224c9a4f45 WatchSource:0}: Error finding container 4ccc7ad79406fe003dcf399cac007cef311019ee9b5dbbf68d264a224c9a4f45: Status 404 returned error can't find the container with id 4ccc7ad79406fe003dcf399cac007cef311019ee9b5dbbf68d264a224c9a4f45 Apr 16 15:06:07.968029 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:07.967924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" event={"ID":"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4","Type":"ContainerStarted","Data":"4ccc7ad79406fe003dcf399cac007cef311019ee9b5dbbf68d264a224c9a4f45"} Apr 16 15:06:10.977268 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:10.977229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" event={"ID":"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4","Type":"ContainerStarted","Data":"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b"} Apr 16 15:06:10.991431 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:10.991354 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" podStartSLOduration=1.382415813 podStartE2EDuration="3.991336254s" podCreationTimestamp="2026-04-16 15:06:07 +0000 UTC" firstStartedPulling="2026-04-16 15:06:07.711316746 +0000 UTC m=+818.601731612" lastFinishedPulling="2026-04-16 15:06:10.320237187 +0000 UTC m=+821.210652053" observedRunningTime="2026-04-16 15:06:10.990784627 +0000 UTC m=+821.881199524" watchObservedRunningTime="2026-04-16 15:06:10.991336254 +0000 UTC m=+821.881751141" Apr 16 15:06:11.936313 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:11.936274 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:12.982007 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:12.981966 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" podUID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" containerName="seaweedfs-tls-custom" containerID="cri-o://cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b" gracePeriod=30 Apr 16 15:06:14.216444 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.216422 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:14.362702 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.362625 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data\") pod \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " Apr 16 15:06:14.362702 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.362662 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hqfg\" (UniqueName: \"kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg\") pod \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\" (UID: \"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4\") " Apr 16 15:06:14.363847 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.363815 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data" (OuterVolumeSpecName: "data") pod "5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" (UID: "5717970c-f7fe-4a09-b9ac-8ce0da3b80b4"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:06:14.364866 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.364849 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg" (OuterVolumeSpecName: "kube-api-access-6hqfg") pod "5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" (UID: "5717970c-f7fe-4a09-b9ac-8ce0da3b80b4"). InnerVolumeSpecName "kube-api-access-6hqfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:06:14.464096 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.464067 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hqfg\" (UniqueName: \"kubernetes.io/projected/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-kube-api-access-6hqfg\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:06:14.464096 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.464092 2577 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4-data\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:06:14.987783 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.987746 2577 generic.go:358] "Generic (PLEG): container finished" podID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" containerID="cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b" exitCode=0 Apr 16 15:06:14.987978 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.987800 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" Apr 16 15:06:14.987978 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.987834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" event={"ID":"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4","Type":"ContainerDied","Data":"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b"} Apr 16 15:06:14.987978 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.987864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr" event={"ID":"5717970c-f7fe-4a09-b9ac-8ce0da3b80b4","Type":"ContainerDied","Data":"4ccc7ad79406fe003dcf399cac007cef311019ee9b5dbbf68d264a224c9a4f45"} Apr 16 15:06:14.987978 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.987879 2577 scope.go:117] "RemoveContainer" containerID="cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b" Apr 16 15:06:14.996720 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.996702 2577 scope.go:117] "RemoveContainer" containerID="cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b" Apr 16 15:06:14.997012 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:06:14.996991 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b\": container with ID starting with cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b not found: ID does not exist" containerID="cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b" Apr 16 15:06:14.997071 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:14.997020 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b"} err="failed to get container status \"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b\": rpc error: code = NotFound desc = could not find container \"cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b\": container with ID starting with cd1063e18ae285a7b3e7b3b4ed713c17cacdb3b87ac0efc6d84dde10a013386b not found: ID does not exist" Apr 16 15:06:15.009866 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.009842 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:15.011833 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.011815 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vvlxr"] Apr 16 15:06:15.038898 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.038873 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8z479"] Apr 16 15:06:15.039168 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.039153 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" containerName="seaweedfs-tls-custom" Apr 16 15:06:15.039240 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.039172 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" containerName="seaweedfs-tls-custom" Apr 16 15:06:15.039292 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.039242 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" containerName="seaweedfs-tls-custom" Apr 16 15:06:15.043251 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.043234 
2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.045252 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.045232 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 15:06:15.045353 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.045335 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 15:06:15.045423 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.045404 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9zz8m\"" Apr 16 15:06:15.051328 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.051309 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8z479"] Apr 16 15:06:15.171124 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.171098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.171233 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.171128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ce9f8d49-8f9b-4bca-b103-95e9e0513091-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.171233 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.171159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmq2\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-kube-api-access-srmq2\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.276161 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.272178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.276161 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.272241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ce9f8d49-8f9b-4bca-b103-95e9e0513091-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.276161 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.272296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srmq2\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-kube-api-access-srmq2\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.276161 ip-10-0-139-55 
kubenswrapper[2577]: I0416 15:06:15.275794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.276161 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.276138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ce9f8d49-8f9b-4bca-b103-95e9e0513091-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.279704 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.279682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmq2\" (UniqueName: \"kubernetes.io/projected/ce9f8d49-8f9b-4bca-b103-95e9e0513091-kube-api-access-srmq2\") pod \"seaweedfs-tls-custom-5c88b85bb7-8z479\" (UID: \"ce9f8d49-8f9b-4bca-b103-95e9e0513091\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.352536 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.352510 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" Apr 16 15:06:15.469747 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.469723 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8z479"] Apr 16 15:06:15.472315 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:06:15.472287 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce9f8d49_8f9b_4bca_b103_95e9e0513091.slice/crio-b47c337091f7edf5416b996995740aefb03ead2a0873ab705c1ba2485c735ecb WatchSource:0}: Error finding container b47c337091f7edf5416b996995740aefb03ead2a0873ab705c1ba2485c735ecb: Status 404 returned error can't find the container with id b47c337091f7edf5416b996995740aefb03ead2a0873ab705c1ba2485c735ecb Apr 16 15:06:15.757308 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.757280 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5717970c-f7fe-4a09-b9ac-8ce0da3b80b4" path="/var/lib/kubelet/pods/5717970c-f7fe-4a09-b9ac-8ce0da3b80b4/volumes" Apr 16 15:06:15.992077 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.992047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" event={"ID":"ce9f8d49-8f9b-4bca-b103-95e9e0513091","Type":"ContainerStarted","Data":"041908100f3d84ae1a49b627ef75d087d7c003b67885b8389fafe14ec3adbe9f"} Apr 16 15:06:15.992077 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:15.992080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" event={"ID":"ce9f8d49-8f9b-4bca-b103-95e9e0513091","Type":"ContainerStarted","Data":"b47c337091f7edf5416b996995740aefb03ead2a0873ab705c1ba2485c735ecb"} Apr 16 15:06:16.007906 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:06:16.007830 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8z479" podStartSLOduration=0.700892553 podStartE2EDuration="1.007812672s" podCreationTimestamp="2026-04-16 15:06:15 +0000 UTC" firstStartedPulling="2026-04-16 15:06:15.473538777 +0000 UTC m=+826.363953640" lastFinishedPulling="2026-04-16 
15:06:15.780458888 +0000 UTC m=+826.670873759" observedRunningTime="2026-04-16 15:06:16.006635997 +0000 UTC m=+826.897050884" watchObservedRunningTime="2026-04-16 15:06:16.007812672 +0000 UTC m=+826.898227559" Apr 16 15:07:29.677579 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:07:29.677508 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:07:29.678834 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:07:29.678817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:12:29.695508 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:12:29.695470 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:12:29.697403 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:12:29.697381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:17:29.721456 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:17:29.721426 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:17:29.723957 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:17:29.723917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:21:11.356806 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.356775 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:21:11.359664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.359646 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:11.361967 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.361921 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:21:11.367711 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.367683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:21:11.476084 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.476047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-xklbk\" (UID: \"aac49458-692f-47ba-8dba-bef2409a7ffd\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:11.576685 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.576649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-xklbk\" (UID: \"aac49458-692f-47ba-8dba-bef2409a7ffd\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:11.577027 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.577008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-xklbk\" (UID: \"aac49458-692f-47ba-8dba-bef2409a7ffd\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:11.670629 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.670561 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:11.789769 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.789701 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:21:11.792390 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:21:11.792356 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaac49458_692f_47ba_8dba_bef2409a7ffd.slice/crio-d526708d7805c9ae558efee77a9158dd12c9a279980d7a80fc1395e64fc7e8b2 WatchSource:0}: Error finding container d526708d7805c9ae558efee77a9158dd12c9a279980d7a80fc1395e64fc7e8b2: Status 404 returned error can't find the container with id d526708d7805c9ae558efee77a9158dd12c9a279980d7a80fc1395e64fc7e8b2 Apr 16 15:21:11.794077 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:11.794062 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:21:12.373575 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:12.373533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerStarted","Data":"d526708d7805c9ae558efee77a9158dd12c9a279980d7a80fc1395e64fc7e8b2"} Apr 16 15:21:17.388647 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:17.388609 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerStarted","Data":"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb"} Apr 16 15:21:21.401388 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:21.401351 2577 generic.go:358] "Generic (PLEG): container finished" podID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerID="e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb" exitCode=0 Apr 16 15:21:21.401837 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:21.401435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerDied","Data":"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb"} Apr 16 15:21:33.444565 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:33.444533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerStarted","Data":"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9"} Apr 16 15:21:34.447760 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:34.447726 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:21:34.449025 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:34.448995 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:21:34.465342 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:34.465295 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podStartSLOduration=1.945942179 podStartE2EDuration="23.465280585s" podCreationTimestamp="2026-04-16 15:21:11 +0000 UTC" firstStartedPulling="2026-04-16 15:21:11.794187005 +0000 UTC m=+1722.684601869" lastFinishedPulling="2026-04-16 15:21:33.313525409 +0000 UTC m=+1744.203940275" observedRunningTime="2026-04-16 15:21:34.463565225 +0000 UTC m=+1745.353980110" watchObservedRunningTime="2026-04-16 15:21:34.465280585 +0000 UTC m=+1745.355695471" Apr 16 15:21:35.450824 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:35.450785 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:21:45.450810 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:45.450765 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:21:55.451091 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:21:55.451042 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:22:05.451384 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:05.451340 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:22:15.451874 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:15.451842 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:22:22.841688 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:22.841588 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:22:22.842215 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:22.841885 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" containerID="cri-o://733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9" gracePeriod=30 Apr 16 15:22:22.894606 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:22.894576 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:22:22.926103 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:22.926074 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:22:22.926230 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:22.926181 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:23.007294 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.007266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l\" (UID: \"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:23.108586 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.108496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l\" (UID: \"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:23.108874 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.108853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l\" (UID: \"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:23.236210 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.236177 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:23.353966 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.353914 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:22:23.357236 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:22:23.357209 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76bcb15_d7b8_4d8f_bca0_c8a8ed3eac84.slice/crio-e2fa71475085f791afea24105270784b5b0ee8291190d8dc8ab01408040f83a7 WatchSource:0}: Error finding container e2fa71475085f791afea24105270784b5b0ee8291190d8dc8ab01408040f83a7: Status 404 returned error can't find the container with id e2fa71475085f791afea24105270784b5b0ee8291190d8dc8ab01408040f83a7 Apr 16 15:22:23.574066 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.574034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerStarted","Data":"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75"} Apr 16 15:22:23.574213 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:23.574074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerStarted","Data":"e2fa71475085f791afea24105270784b5b0ee8291190d8dc8ab01408040f83a7"} Apr 16 15:22:25.269873 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.269852 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:22:25.426143 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.426037 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location\") pod \"aac49458-692f-47ba-8dba-bef2409a7ffd\" (UID: \"aac49458-692f-47ba-8dba-bef2409a7ffd\") " Apr 16 15:22:25.434206 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.434173 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aac49458-692f-47ba-8dba-bef2409a7ffd" (UID: "aac49458-692f-47ba-8dba-bef2409a7ffd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:22:25.527446 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.527411 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aac49458-692f-47ba-8dba-bef2409a7ffd-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:22:25.584993 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.584958 2577 generic.go:358] "Generic (PLEG): container finished" podID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerID="733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9" exitCode=0 Apr 16 15:22:25.585169 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.585023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerDied","Data":"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9"} Apr 16 15:22:25.585169 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.585040 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" Apr 16 15:22:25.585169 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.585068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk" event={"ID":"aac49458-692f-47ba-8dba-bef2409a7ffd","Type":"ContainerDied","Data":"d526708d7805c9ae558efee77a9158dd12c9a279980d7a80fc1395e64fc7e8b2"} Apr 16 15:22:25.585169 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.585088 2577 scope.go:117] "RemoveContainer" containerID="733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9" Apr 16 15:22:25.593233 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.593213 2577 scope.go:117] "RemoveContainer" containerID="e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb" Apr 16 15:22:25.599811 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.599793 2577 scope.go:117] "RemoveContainer" containerID="733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9" Apr 16 15:22:25.600071 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:22:25.600054 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9\": container with ID starting with 733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9 not found: ID does not exist" containerID="733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9" Apr 16 15:22:25.600146 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.600084 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9"} err="failed to get container status \"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9\": rpc error: code = NotFound desc = could not find container \"733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9\": container with ID starting with 733810ab85d3e5273e2994e76c30161e6551d3c2b33e1fad7aa90c7e882494c9 not found: ID does not exist" Apr 16 15:22:25.600146 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.600108 2577 scope.go:117] "RemoveContainer" containerID="e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb" Apr 16 15:22:25.600299 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:22:25.600283 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb\": container with ID starting with e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb not found: ID does not exist" containerID="e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb" Apr 16 15:22:25.600348 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.600305 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb"} err="failed to get container status \"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb\": rpc error: code = NotFound desc = could not find container \"e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb\": container with ID starting with e13a684d02e73cb5794c2e818a8c6d5650a8e422a68b152269335e018df92ceb not found: ID does not exist" Apr 16 15:22:25.603876 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.603857 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:22:25.607681 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.607663 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-xklbk"] Apr 16 15:22:25.756516 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:25.756488 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" path="/var/lib/kubelet/pods/aac49458-692f-47ba-8dba-bef2409a7ffd/volumes" Apr 16 15:22:28.594759 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:28.594727 2577 generic.go:358] "Generic (PLEG): container finished" podID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerID="c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75" exitCode=0 Apr 16 15:22:28.595244 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:28.594803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerDied","Data":"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75"} Apr 16 15:22:29.599731 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.599690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerStarted","Data":"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4"} Apr 16 15:22:29.600251 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.599964 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:22:29.601308 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.601276 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:22:29.617264 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.617216 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podStartSLOduration=7.61720119 podStartE2EDuration="7.61720119s" podCreationTimestamp="2026-04-16 15:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:22:29.615678497 +0000 UTC m=+1800.506093385" watchObservedRunningTime="2026-04-16 15:22:29.61720119 +0000 UTC m=+1800.507616075" Apr 16 15:22:29.740041 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.740014 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:22:29.742036 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:29.742013 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:22:30.602521 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:30.602481 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:22:40.602569 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:40.602508 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:22:50.602759 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:22:50.602705 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:23:00.602800 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:00.602749 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:23:10.603710 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:10.603680 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:23:14.635913 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.635878 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:23:14.636293 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.636186 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" containerID="cri-o://798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4" gracePeriod=30 Apr 16 15:23:14.729677 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.729648 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:23:14.729907 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.729896 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" Apr 16 15:23:14.729970 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.729910 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" Apr 16 15:23:14.729970 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.729926 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="storage-initializer" Apr 16 15:23:14.729970 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.729953 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="storage-initializer" Apr 16 15:23:14.730060 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.730007 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aac49458-692f-47ba-8dba-bef2409a7ffd" containerName="kserve-container" Apr 16 15:23:14.732915 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.732899 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:14.746146 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.746119 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:23:14.893184 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.893082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ft682\" (UID: \"a2c4226d-7185-4d04-a84c-9d98df69c2bd\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:14.994005 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.993972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ft682\" (UID: \"a2c4226d-7185-4d04-a84c-9d98df69c2bd\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:14.994388 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:14.994364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ft682\" (UID: \"a2c4226d-7185-4d04-a84c-9d98df69c2bd\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:15.042706 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:15.042676 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:15.166386 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:15.166362 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:23:15.168693 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:23:15.168666 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c4226d_7185_4d04_a84c_9d98df69c2bd.slice/crio-59e7b3cfad183627e57816f06a095f0f3c532b2146ee1484f161006fe1a75aab WatchSource:0}: Error finding container 59e7b3cfad183627e57816f06a095f0f3c532b2146ee1484f161006fe1a75aab: Status 404 returned error can't find the container with id 59e7b3cfad183627e57816f06a095f0f3c532b2146ee1484f161006fe1a75aab Apr 16 15:23:15.729883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:15.729843 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerStarted","Data":"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f"} Apr 16 15:23:15.729883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:15.729888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerStarted","Data":"59e7b3cfad183627e57816f06a095f0f3c532b2146ee1484f161006fe1a75aab"} Apr 16 15:23:17.183638 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.183614 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:23:17.310486 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.310405 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location\") pod \"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84\" (UID: \"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84\") " Apr 16 15:23:17.319652 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.319621 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" (UID: "e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:17.411812 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.411787 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:23:17.735845 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.735813 2577 generic.go:358] "Generic (PLEG): container finished" podID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerID="798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4" exitCode=0 Apr 16 15:23:17.736011 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.735883 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" Apr 16 15:23:17.736011 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.735896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerDied","Data":"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4"} Apr 16 15:23:17.736011 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.735946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l" event={"ID":"e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84","Type":"ContainerDied","Data":"e2fa71475085f791afea24105270784b5b0ee8291190d8dc8ab01408040f83a7"} Apr 16 15:23:17.736011 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.735962 2577 scope.go:117] "RemoveContainer" containerID="798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4" Apr 16 15:23:17.743993 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.743969 2577 scope.go:117] "RemoveContainer" containerID="c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75" Apr 16 15:23:17.750624 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.750604 2577 scope.go:117] "RemoveContainer" containerID="798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4" Apr 16 15:23:17.750840 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:23:17.750823 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4\": container with ID starting with 798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4 not found: ID does not exist" 
containerID="798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4" Apr 16 15:23:17.750890 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.750848 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4"} err="failed to get container status \"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4\": rpc error: code = NotFound desc = could not find container \"798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4\": container with ID starting with 798c06ca335b428024fbee91ef67f9f172dbe18dc46354e570d9fe270a5835e4 not found: ID does not exist" Apr 16 15:23:17.750890 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.750866 2577 scope.go:117] "RemoveContainer" containerID="c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75" Apr 16 15:23:17.751117 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:23:17.751100 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75\": container with ID starting with c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75 not found: ID does not exist" containerID="c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75" Apr 16 15:23:17.751167 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.751123 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75"} err="failed to get container status \"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75\": rpc error: code = NotFound desc = could not find container \"c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75\": container with ID starting with c730afd78f9bd0669c9c132752247fbefabeec520311a9188dc73b906cac7d75 not found: ID does not exist" Apr 16 15:23:17.756991 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.756974 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:23:17.759965 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:17.759925 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-bc22l"] Apr 16 15:23:19.742758 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:19.742725 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerID="6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f" exitCode=0 Apr 16 15:23:19.743144 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:19.742782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerDied","Data":"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f"} Apr 16 15:23:19.757036 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:19.756994 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" path="/var/lib/kubelet/pods/e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84/volumes" Apr 16 15:23:26.770300 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:26.770265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" 
event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerStarted","Data":"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8"} Apr 16 15:23:26.770642 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:26.770517 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:23:26.771776 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:26.771741 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:23:26.786676 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:26.786636 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podStartSLOduration=5.911003087 podStartE2EDuration="12.786624011s" podCreationTimestamp="2026-04-16 15:23:14 +0000 UTC" firstStartedPulling="2026-04-16 15:23:19.74390718 +0000 UTC m=+1850.634322043" lastFinishedPulling="2026-04-16 15:23:26.619528103 +0000 UTC m=+1857.509942967" observedRunningTime="2026-04-16 15:23:26.78556945 +0000 UTC m=+1857.675984337" watchObservedRunningTime="2026-04-16 15:23:26.786624011 +0000 UTC m=+1857.677038961" Apr 16 15:23:27.773365 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:27.773321 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:23:37.773901 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:37.773856 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:23:47.773650 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:47.773600 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:23:57.774104 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:23:57.774053 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:24:07.773990 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:07.773918 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:24:17.773774 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:17.773729 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.20:8080: connect: connection refused" Apr 16 15:24:27.773903 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:27.773862 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:24:36.753122 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:36.753072 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:24:46.754224 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:46.754194 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:24:55.744331 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.744300 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:24:55.744999 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.744571 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" containerID="cri-o://19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8" gracePeriod=30 Apr 16 15:24:55.823144 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823107 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:24:55.823368 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823356 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="storage-initializer" Apr 16 15:24:55.823410 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823370 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="storage-initializer" Apr 16 15:24:55.823410 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823388 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" Apr 16 15:24:55.823410 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823394 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" Apr 16 15:24:55.823510 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.823452 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e76bcb15-d7b8-4d8f-bca0-c8a8ed3eac84" containerName="kserve-container" Apr 16 15:24:55.826183 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.826163 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:24:55.834123 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.834102 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:24:55.925457 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:55.925424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-zt4tv\" (UID: \"ecab4a0e-b0ce-4836-8acd-769cd8e727cc\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:24:56.026281 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:56.026200 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-zt4tv\" (UID: \"ecab4a0e-b0ce-4836-8acd-769cd8e727cc\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:24:56.026532 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:56.026515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-zt4tv\" (UID: \"ecab4a0e-b0ce-4836-8acd-769cd8e727cc\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:24:56.137262 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:56.137231 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:24:56.256177 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:56.256145 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:24:56.258708 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:24:56.258680 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecab4a0e_b0ce_4836_8acd_769cd8e727cc.slice/crio-0c2a865651e49b4dd4f2e2fc8e1301275a436df94d411dd4f59bed0dd817389c WatchSource:0}: Error finding container 0c2a865651e49b4dd4f2e2fc8e1301275a436df94d411dd4f59bed0dd817389c: Status 404 returned error can't find the container with id 0c2a865651e49b4dd4f2e2fc8e1301275a436df94d411dd4f59bed0dd817389c Apr 16 15:24:56.753224 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:56.753184 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:24:57.022749 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:57.022658 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerStarted","Data":"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120"} Apr 16 15:24:57.022749 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:57.022696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerStarted","Data":"0c2a865651e49b4dd4f2e2fc8e1301275a436df94d411dd4f59bed0dd817389c"} Apr 16 15:24:59.279619 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:59.279597 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:24:59.353285 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:59.353199 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location\") pod \"a2c4226d-7185-4d04-a84c-9d98df69c2bd\" (UID: \"a2c4226d-7185-4d04-a84c-9d98df69c2bd\") " Apr 16 15:24:59.353504 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:59.353478 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2c4226d-7185-4d04-a84c-9d98df69c2bd" (UID: "a2c4226d-7185-4d04-a84c-9d98df69c2bd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:59.453768 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:24:59.453730 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2c4226d-7185-4d04-a84c-9d98df69c2bd-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:25:00.032231 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.032135 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerID="19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8" exitCode=0 Apr 16 15:25:00.032365 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.032228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerDied","Data":"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8"} Apr 16 15:25:00.032365 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.032271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" event={"ID":"a2c4226d-7185-4d04-a84c-9d98df69c2bd","Type":"ContainerDied","Data":"59e7b3cfad183627e57816f06a095f0f3c532b2146ee1484f161006fe1a75aab"} Apr 16 15:25:00.032365 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.032286 2577 scope.go:117] "RemoveContainer" containerID="19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8" Apr 16 15:25:00.032365 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.032241 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682" Apr 16 15:25:00.040511 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.040493 2577 scope.go:117] "RemoveContainer" containerID="6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f" Apr 16 15:25:00.048573 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.048551 2577 scope.go:117] "RemoveContainer" containerID="19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8" Apr 16 15:25:00.048972 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:25:00.048921 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8\": container with ID starting with 19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8 not found: ID does not exist" containerID="19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8" Apr 16 15:25:00.049150 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.049110 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8"} err="failed to get container status \"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8\": rpc error: code = NotFound desc = could not find container \"19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8\": container with ID starting with 19004af5c560e41b31a6b1d89e254d9e01226b16ac40ac4744fe31844d2fc0f8 not found: ID does not exist" Apr 16 15:25:00.049285 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.049263 2577 scope.go:117] "RemoveContainer" containerID="6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f" Apr 16 15:25:00.049612 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:25:00.049584 2577 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f\": container with ID starting with 6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f not found: ID does not exist" containerID="6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f" Apr 16 15:25:00.049680 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.049620 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f"} err="failed to get container status \"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f\": rpc error: code = NotFound desc = could not find container \"6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f\": container with ID starting with 6461c364acd8fe21fe5810e5567643bfaf85750a20d5b64bb4b818a94aa9d95f not found: ID does not exist" Apr 16 15:25:00.050455 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.050436 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:25:00.054091 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:00.054070 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ft682"] Apr 16 15:25:01.036639 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:01.036607 2577 generic.go:358] "Generic (PLEG): container finished" podID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerID="8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120" exitCode=0 Apr 16 15:25:01.037137 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:01.036687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerDied","Data":"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120"} Apr 16 15:25:01.757740 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:01.757708 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" path="/var/lib/kubelet/pods/a2c4226d-7185-4d04-a84c-9d98df69c2bd/volumes" Apr 16 15:25:02.041383 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:02.041302 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerStarted","Data":"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2"} Apr 16 15:25:02.041772 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:02.041692 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:25:02.042713 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:02.042687 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:02.057420 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:02.057380 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podStartSLOduration=7.057365394 podStartE2EDuration="7.057365394s" 
podCreationTimestamp="2026-04-16 15:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:25:02.056512496 +0000 UTC m=+1952.946927381" watchObservedRunningTime="2026-04-16 15:25:02.057365394 +0000 UTC m=+1952.947780280" Apr 16 15:25:03.044751 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:03.044708 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:13.045580 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:13.045528 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:23.045487 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:23.045382 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:33.045243 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:33.045196 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:43.045553 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:43.045505 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:25:53.045258 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:25:53.045212 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:26:03.045768 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:03.045718 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:26:04.754186 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:04.754140 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:26:14.754530 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:14.754492 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" 
podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 16 15:26:24.755817 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:24.755784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:26:26.882769 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.882730 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:26:26.883173 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.883038 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" containerID="cri-o://96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2" gracePeriod=30 Apr 16 15:26:26.933637 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.933602 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:26:26.934014 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.933995 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="storage-initializer" Apr 16 15:26:26.934067 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.934020 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="storage-initializer" Apr 16 15:26:26.934067 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.934036 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" Apr 16 15:26:26.934067 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.934047 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" Apr 16 15:26:26.934165 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.934139 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2c4226d-7185-4d04-a84c-9d98df69c2bd" containerName="kserve-container" Apr 16 15:26:26.937282 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.937261 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:26.945992 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:26.945966 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:26:27.121069 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.121031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt\" (UID: \"84ec3308-c52f-4c97-a9e1-0da802f3e2f5\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:27.222098 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.222012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt\" (UID: \"84ec3308-c52f-4c97-a9e1-0da802f3e2f5\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:27.222430 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.222402 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt\" (UID: \"84ec3308-c52f-4c97-a9e1-0da802f3e2f5\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:27.248710 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.248668 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:27.371242 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.371214 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:26:27.373849 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:26:27.373819 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ec3308_c52f_4c97_a9e1_0da802f3e2f5.slice/crio-cd3b9d9fe43bd320d778affb5cc48ad87d6a0132a61b87e5e72811e5044d3a63 WatchSource:0}: Error finding container cd3b9d9fe43bd320d778affb5cc48ad87d6a0132a61b87e5e72811e5044d3a63: Status 404 returned error can't find the container with id cd3b9d9fe43bd320d778affb5cc48ad87d6a0132a61b87e5e72811e5044d3a63 Apr 16 15:26:27.375664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:27.375642 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:26:28.272756 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:28.272717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerStarted","Data":"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0"} Apr 16 15:26:28.272756 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:28.272761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerStarted","Data":"cd3b9d9fe43bd320d778affb5cc48ad87d6a0132a61b87e5e72811e5044d3a63"} Apr 16 15:26:30.519260 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:30.519237 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:26:30.650529 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:30.650436 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location\") pod \"ecab4a0e-b0ce-4836-8acd-769cd8e727cc\" (UID: \"ecab4a0e-b0ce-4836-8acd-769cd8e727cc\") " Apr 16 15:26:30.650775 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:30.650749 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ecab4a0e-b0ce-4836-8acd-769cd8e727cc" (UID: "ecab4a0e-b0ce-4836-8acd-769cd8e727cc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:30.751303 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:30.751268 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecab4a0e-b0ce-4836-8acd-769cd8e727cc-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:26:31.282485 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.282441 2577 generic.go:358] "Generic (PLEG): container finished" podID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerID="96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2" exitCode=0 Apr 16 15:26:31.282664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.282582 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" Apr 16 15:26:31.282664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.282576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerDied","Data":"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2"} Apr 16 15:26:31.282770 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.282701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv" event={"ID":"ecab4a0e-b0ce-4836-8acd-769cd8e727cc","Type":"ContainerDied","Data":"0c2a865651e49b4dd4f2e2fc8e1301275a436df94d411dd4f59bed0dd817389c"} Apr 16 15:26:31.282770 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.282720 2577 scope.go:117] "RemoveContainer" containerID="96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2" Apr 16 15:26:31.284049 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.284024 2577 generic.go:358] "Generic (PLEG): container finished" podID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerID="1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0" exitCode=0 Apr 16 15:26:31.284152 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.284081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerDied","Data":"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0"} Apr 16 15:26:31.291566 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.291029 2577 scope.go:117] "RemoveContainer" containerID="8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120" Apr 16 15:26:31.298383 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.298366 2577 scope.go:117] "RemoveContainer" containerID="96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2" Apr 16 15:26:31.298649 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:26:31.298629 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2\": container with ID starting with 96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2 not found: ID does not exist" containerID="96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2" Apr 16 15:26:31.298706 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.298658 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2"} err="failed to get container status \"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2\": rpc error: code = NotFound desc = could not find container \"96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2\": container with ID starting with 96d81f251b1fa690270fa74542d1c0cc8554ed4d7a81a1e260355817066c99e2 not found: ID does not exist" Apr 16 15:26:31.298706 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.298675 2577 scope.go:117] "RemoveContainer" containerID="8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120" Apr 16 15:26:31.298894 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:26:31.298876 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120\": container with ID starting with 8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120 not found: ID does not exist" containerID="8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120" Apr 16 15:26:31.299014 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.298901 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120"} err="failed to get container status \"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120\": rpc error: code = NotFound desc = could not find container \"8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120\": container with ID starting with 8d65ad8ad7915013414bf730d5f566173537795a7e9cfb90b24ae0b4e454c120 not found: ID does not exist" Apr 16 15:26:31.317561 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.317535 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:26:31.320196 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.320175 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-zt4tv"] Apr 16 15:26:31.756595 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:31.756565 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" path="/var/lib/kubelet/pods/ecab4a0e-b0ce-4836-8acd-769cd8e727cc/volumes" Apr 16 15:26:32.291500 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:32.291462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerStarted","Data":"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb"} Apr 16 15:26:32.291763 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:32.291747 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:26:32.293291 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:32.293259 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:26:32.307469 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:32.307424 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podStartSLOduration=6.307411378 podStartE2EDuration="6.307411378s" podCreationTimestamp="2026-04-16 15:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:32.305799475 +0000 UTC m=+2043.196214384" watchObservedRunningTime="2026-04-16 15:26:32.307411378 +0000 UTC m=+2043.197826263" Apr 16 15:26:33.294413 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:33.294368 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:26:43.295132 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:43.295079 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:26:53.295085 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:26:53.295042 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:27:03.294976 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:03.294887 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:27:13.294643 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:13.294591 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:27:23.295299 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:23.295243 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:27:29.760081 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:29.760055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:27:29.766592 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:29.766554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:27:33.294839 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:33.294786 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.22:8080: connect: connection refused" Apr 16 15:27:42.753385 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:42.753347 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 16 15:27:52.754846 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:52.754816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:27:57.998018 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:57.997981 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:27:57.998391 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:27:57.998324 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" containerID="cri-o://e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb" gracePeriod=30 Apr 16 15:28:01.937080 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:01.937057 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:28:02.010581 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.010538 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location\") pod \"84ec3308-c52f-4c97-a9e1-0da802f3e2f5\" (UID: \"84ec3308-c52f-4c97-a9e1-0da802f3e2f5\") " Apr 16 15:28:02.010919 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.010890 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84ec3308-c52f-4c97-a9e1-0da802f3e2f5" (UID: "84ec3308-c52f-4c97-a9e1-0da802f3e2f5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:28:02.111817 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.111733 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84ec3308-c52f-4c97-a9e1-0da802f3e2f5-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:28:02.530948 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.530909 2577 generic.go:358] "Generic (PLEG): container finished" podID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerID="e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb" exitCode=0 Apr 16 15:28:02.531129 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.530966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerDied","Data":"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb"} Apr 16 15:28:02.531129 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.530996 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" Apr 16 15:28:02.531129 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.531007 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt" event={"ID":"84ec3308-c52f-4c97-a9e1-0da802f3e2f5","Type":"ContainerDied","Data":"cd3b9d9fe43bd320d778affb5cc48ad87d6a0132a61b87e5e72811e5044d3a63"} Apr 16 15:28:02.531129 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.531023 2577 scope.go:117] "RemoveContainer" containerID="e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb" Apr 16 15:28:02.539081 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.539061 2577 scope.go:117] "RemoveContainer" containerID="1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0" Apr 16 15:28:02.545685 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.545661 2577 scope.go:117] "RemoveContainer" containerID="e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb" Apr 16 15:28:02.545907 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:28:02.545889 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb\": container with ID starting with e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb not found: ID does not exist" containerID="e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb" Apr 16 15:28:02.546095 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.545919 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb"} err="failed to get container status \"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb\": rpc error: code = NotFound desc = could not find container \"e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb\": container with ID starting with e3d0f119b90a5a6c092723e0d0d7d6e5d6b3f42b0e6f34fd79c0df9812f391eb not found: ID does not exist" Apr 16 15:28:02.546095 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.546061 2577 scope.go:117] "RemoveContainer" containerID="1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0" Apr 16 15:28:02.546333 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:28:02.546318 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0\": container with ID starting with 1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0 not found: ID does not exist" containerID="1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0" Apr 16 15:28:02.546373 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.546338 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0"} err="failed to get container status \"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0\": rpc error: code = NotFound desc = could not find container \"1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0\": container with ID starting with 1b093ff7d4f4d72bca55d4e299f20cd00ebed404a269298a63f71ae14ad271e0 not found: ID does not exist" Apr 16 15:28:02.551021 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.551000 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:28:02.553458 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:02.553439 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-24hvt"] Apr 16 15:28:03.756554 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:28:03.756521 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" path="/var/lib/kubelet/pods/84ec3308-c52f-4c97-a9e1-0da802f3e2f5/volumes" Apr 16 15:29:46.472757 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.472715 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473063 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="storage-initializer" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473078 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="storage-initializer" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473094 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="storage-initializer" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473102 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="storage-initializer" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473112 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473120 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473136 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473142 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473186 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="84ec3308-c52f-4c97-a9e1-0da802f3e2f5" containerName="kserve-container" Apr 16 15:29:46.473227 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.473193 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecab4a0e-b0ce-4836-8acd-769cd8e727cc" containerName="kserve-container" Apr 16 15:29:46.476057 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.476041 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:29:46.478311 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.478289 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:29:46.482247 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.482223 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:29:46.580518 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.580471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hkrrp\" (UID: \"d5418baf-df6b-477a-b1e8-ac1cd947fc09\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:29:46.681926 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.681874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hkrrp\" (UID: \"d5418baf-df6b-477a-b1e8-ac1cd947fc09\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:29:46.682287 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.682262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hkrrp\" (UID: \"d5418baf-df6b-477a-b1e8-ac1cd947fc09\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:29:46.787288 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.787249 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:29:46.903564 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:46.903529 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:29:46.906960 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:29:46.906916 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5418baf_df6b_477a_b1e8_ac1cd947fc09.slice/crio-cfe675992c627414f5c80d94ec7b01bf9d53a3b9060407d062dda9f3efd087c6 WatchSource:0}: Error finding container cfe675992c627414f5c80d94ec7b01bf9d53a3b9060407d062dda9f3efd087c6: Status 404 returned error can't find the container with id cfe675992c627414f5c80d94ec7b01bf9d53a3b9060407d062dda9f3efd087c6 Apr 16 15:29:47.811141 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:47.811105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerStarted","Data":"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d"} Apr 16 15:29:47.811141 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:47.811143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerStarted","Data":"cfe675992c627414f5c80d94ec7b01bf9d53a3b9060407d062dda9f3efd087c6"} Apr 16 15:29:50.819417 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:50.819382 2577 generic.go:358] "Generic (PLEG): container finished" podID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerID="32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d" exitCode=0 Apr 16 15:29:50.819800 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:29:50.819460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerDied","Data":"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d"} Apr 16 15:30:12.896728 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:12.896645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerStarted","Data":"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7"} Apr 16 15:30:12.897166 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:12.897059 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:30:12.898307 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:12.898273 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:30:12.912771 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:12.912719 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podStartSLOduration=5.245904077 podStartE2EDuration="26.912704341s" podCreationTimestamp="2026-04-16 15:29:46 
+0000 UTC" firstStartedPulling="2026-04-16 15:29:50.820714649 +0000 UTC m=+2241.711129512" lastFinishedPulling="2026-04-16 15:30:12.487514894 +0000 UTC m=+2263.377929776" observedRunningTime="2026-04-16 15:30:12.91201538 +0000 UTC m=+2263.802430266" watchObservedRunningTime="2026-04-16 15:30:12.912704341 +0000 UTC m=+2263.803119227" Apr 16 15:30:13.900116 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:13.900080 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:30:23.900270 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:23.900226 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:30:33.900827 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:33.900775 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:30:43.900454 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:43.900403 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:30:53.901044 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:30:53.900999 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:31:03.900258 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:03.900217 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:31:13.900659 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:13.900614 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:31:23.900681 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:23.900581 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:31:24.754138 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:24.754096 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 16 15:31:34.755908 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:34.755875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:31:36.659724 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.659692 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:31:36.660155 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.659955 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" containerID="cri-o://6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7" gracePeriod=30 Apr 16 15:31:36.750343 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.750312 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:31:36.752675 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.752654 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:36.763922 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.763895 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:31:36.809827 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.809793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v\" (UID: \"ee62cb3f-8613-4117-860b-8f3361f45ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:36.910853 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.910756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v\" (UID: \"ee62cb3f-8613-4117-860b-8f3361f45ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:36.911175 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:36.911156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v\" (UID: \"ee62cb3f-8613-4117-860b-8f3361f45ec7\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:37.062103 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:37.062072 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:37.182176 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:37.182098 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:31:37.185121 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:31:37.185088 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee62cb3f_8613_4117_860b_8f3361f45ec7.slice/crio-9cf6a32eb3910cefde68a6c6b4f72798907e63fc9fb833d794e0fdee0a145675 WatchSource:0}: Error finding container 9cf6a32eb3910cefde68a6c6b4f72798907e63fc9fb833d794e0fdee0a145675: Status 404 returned error can't find the container with id 9cf6a32eb3910cefde68a6c6b4f72798907e63fc9fb833d794e0fdee0a145675 Apr 16 15:31:37.186921 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:37.186905 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:31:38.133986 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:38.133949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerStarted","Data":"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a"} Apr 16 15:31:38.133986 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:38.133985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerStarted","Data":"9cf6a32eb3910cefde68a6c6b4f72798907e63fc9fb833d794e0fdee0a145675"} Apr 16 15:31:41.144434 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.144399 2577 generic.go:358] "Generic (PLEG): container finished" podID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerID="62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a" exitCode=0 Apr 16 15:31:41.144857 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.144452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerDied","Data":"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a"} Apr 16 15:31:41.601868 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.601845 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:31:41.650538 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.650442 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location\") pod \"d5418baf-df6b-477a-b1e8-ac1cd947fc09\" (UID: \"d5418baf-df6b-477a-b1e8-ac1cd947fc09\") " Apr 16 15:31:41.650778 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.650754 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5418baf-df6b-477a-b1e8-ac1cd947fc09" (UID: "d5418baf-df6b-477a-b1e8-ac1cd947fc09"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:31:41.751859 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:41.751808 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5418baf-df6b-477a-b1e8-ac1cd947fc09-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:31:42.148415 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.148367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerStarted","Data":"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5"} Apr 16 15:31:42.148827 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.148740 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:31:42.149708 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.149676 2577 generic.go:358] "Generic (PLEG): container finished" podID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerID="6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7" exitCode=0 Apr 16 15:31:42.149816 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.149719 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerDied","Data":"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7"} Apr 16 15:31:42.149816 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.149738 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" Apr 16 15:31:42.149816 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.149741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp" event={"ID":"d5418baf-df6b-477a-b1e8-ac1cd947fc09","Type":"ContainerDied","Data":"cfe675992c627414f5c80d94ec7b01bf9d53a3b9060407d062dda9f3efd087c6"} Apr 16 15:31:42.149816 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.149767 2577 scope.go:117] "RemoveContainer" containerID="6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7" Apr 16 15:31:42.150252 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.150229 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:31:42.157514 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.157498 2577 scope.go:117] "RemoveContainer" containerID="32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d" Apr 16 15:31:42.164470 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.164451 2577 scope.go:117] "RemoveContainer" containerID="6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7" Apr 16 15:31:42.164729 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:31:42.164698 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7\": container with ID starting with 
6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7 not found: ID does not exist" containerID="6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7" Apr 16 15:31:42.164814 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.164730 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7"} err="failed to get container status \"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7\": rpc error: code = NotFound desc = could not find container \"6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7\": container with ID starting with 6737596257da0d0c93159a063e2e9e9ecc67c71cc440ef33c01d9e44cdcf67d7 not found: ID does not exist" Apr 16 15:31:42.164814 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.164808 2577 scope.go:117] "RemoveContainer" containerID="32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d" Apr 16 15:31:42.165074 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:31:42.165056 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d\": container with ID starting with 32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d not found: ID does not exist" containerID="32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d" Apr 16 15:31:42.165131 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.165081 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d"} err="failed to get container status \"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d\": rpc error: code = NotFound desc = could not find container \"32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d\": container with ID starting with 32ec1e119a6c4bd35a2b543ad36ffcb1c7be676c90f299fb88ad184156f47c1d not found: ID does not exist" Apr 16 15:31:42.167142 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.167101 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podStartSLOduration=6.167089376 podStartE2EDuration="6.167089376s" podCreationTimestamp="2026-04-16 15:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:31:42.165070198 +0000 UTC m=+2353.055485083" watchObservedRunningTime="2026-04-16 15:31:42.167089376 +0000 UTC m=+2353.057504262" Apr 16 15:31:42.177306 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.177282 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:31:42.181333 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:42.181311 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hkrrp"] Apr 16 15:31:43.153901 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:43.153858 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:31:43.756765 ip-10-0-139-55 
kubenswrapper[2577]: I0416 15:31:43.756733 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" path="/var/lib/kubelet/pods/d5418baf-df6b-477a-b1e8-ac1cd947fc09/volumes" Apr 16 15:31:53.154259 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:31:53.154215 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:03.154099 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:03.154043 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:13.154863 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:13.154817 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:23.154441 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:23.154394 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:29.782724 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:29.782690 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:32:29.785911 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:29.785880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:32:33.153998 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:33.153956 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:43.154415 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:43.154367 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:32:53.155729 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:53.155695 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:32:56.920645 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.920587 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:32:56.921405 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.920882 2577 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" containerID="cri-o://8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5" gracePeriod=30 Apr 16 15:32:56.972629 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.972593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:32:56.972872 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.972860 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" Apr 16 15:32:56.972919 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.972875 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" Apr 16 15:32:56.972919 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.972900 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="storage-initializer" Apr 16 15:32:56.972919 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.972907 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="storage-initializer" Apr 16 15:32:56.973051 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.973000 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5418baf-df6b-477a-b1e8-ac1cd947fc09" containerName="kserve-container" Apr 16 15:32:56.975843 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.975827 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:32:56.983977 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.983915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:32:56.998371 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:56.998347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-zvdk7\" (UID: \"aaf3603a-c163-4052-8df5-b77befb7b854\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:32:57.098779 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:57.098738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-zvdk7\" (UID: \"aaf3603a-c163-4052-8df5-b77befb7b854\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:32:57.099137 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:57.099116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-zvdk7\" (UID: \"aaf3603a-c163-4052-8df5-b77befb7b854\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:32:57.287431 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:57.287399 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:32:57.411845 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:57.411820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:32:57.414390 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:32:57.414365 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaf3603a_c163_4052_8df5_b77befb7b854.slice/crio-44c465efac97239b98bc1e70cef8b4f3edb73b2375b7ad4817e6e35345c38263 WatchSource:0}: Error finding container 44c465efac97239b98bc1e70cef8b4f3edb73b2375b7ad4817e6e35345c38263: Status 404 returned error can't find the container with id 44c465efac97239b98bc1e70cef8b4f3edb73b2375b7ad4817e6e35345c38263 Apr 16 15:32:58.356304 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:58.356262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerStarted","Data":"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19"} Apr 16 15:32:58.356304 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:32:58.356310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerStarted","Data":"44c465efac97239b98bc1e70cef8b4f3edb73b2375b7ad4817e6e35345c38263"} Apr 16 15:33:01.365273 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.365239 2577 generic.go:358] "Generic (PLEG): container finished" podID="aaf3603a-c163-4052-8df5-b77befb7b854" containerID="32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19" exitCode=0 Apr 16 15:33:01.365664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.365313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerDied","Data":"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19"} Apr 16 15:33:01.555475 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.555454 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:33:01.630183 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.630149 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location\") pod \"ee62cb3f-8613-4117-860b-8f3361f45ec7\" (UID: \"ee62cb3f-8613-4117-860b-8f3361f45ec7\") " Apr 16 15:33:01.630465 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.630440 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ee62cb3f-8613-4117-860b-8f3361f45ec7" (UID: "ee62cb3f-8613-4117-860b-8f3361f45ec7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:33:01.730570 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:01.730535 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee62cb3f-8613-4117-860b-8f3361f45ec7-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:33:02.369443 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.369404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerStarted","Data":"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71"} Apr 16 15:33:02.369869 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.369731 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:33:02.370838 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.370780 2577 generic.go:358] "Generic (PLEG): container finished" podID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerID="8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5" exitCode=0 Apr 16 15:33:02.370838 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.370838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerDied","Data":"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5"} Apr 16 15:33:02.371026 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.370856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" event={"ID":"ee62cb3f-8613-4117-860b-8f3361f45ec7","Type":"ContainerDied","Data":"9cf6a32eb3910cefde68a6c6b4f72798907e63fc9fb833d794e0fdee0a145675"} Apr 16 15:33:02.371026 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.370854 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v" Apr 16 15:33:02.371026 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.370870 2577 scope.go:117] "RemoveContainer" containerID="8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5" Apr 16 15:33:02.371386 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.371363 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:02.378451 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.378433 2577 scope.go:117] "RemoveContainer" containerID="62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a" Apr 16 15:33:02.385045 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.385003 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podStartSLOduration=6.384989276 podStartE2EDuration="6.384989276s" podCreationTimestamp="2026-04-16 15:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:33:02.383578711 +0000 UTC m=+2433.273993598" watchObservedRunningTime="2026-04-16 15:33:02.384989276 +0000 UTC m=+2433.275404206" Apr 16 15:33:02.385794 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.385779 2577 scope.go:117] "RemoveContainer" containerID="8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5" Apr 16 15:33:02.386179 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:33:02.386148 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5\": container with ID starting with 8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5 not found: ID does not exist" containerID="8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5" Apr 16 15:33:02.386273 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.386197 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5"} err="failed to get container status \"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5\": rpc error: code = NotFound desc = could not find container \"8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5\": container with ID starting with 8da105efa6bdf4fd2dc0c973203f496f5121923ed5c10f25d1cd8885bf10c8e5 not found: ID does not exist" Apr 16 15:33:02.386273 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.386217 2577 scope.go:117] "RemoveContainer" containerID="62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a" Apr 16 15:33:02.386516 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:33:02.386498 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a\": container with ID starting with 62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a not found: ID does not exist" containerID="62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a" Apr 16 15:33:02.386569 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.386525 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a"} err="failed to get container status \"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a\": rpc error: code = NotFound desc = could not find container \"62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a\": container with ID starting with 62fa9dd726ea4361402062260b0d279b470f0450b3c2812e389d999f811ef71a not found: ID does not exist" Apr 16 15:33:02.394388 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.394366 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:33:02.396342 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:02.396320 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-hxq4v"] Apr 16 15:33:03.374860 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:03.374827 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:03.756727 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:03.756697 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" path="/var/lib/kubelet/pods/ee62cb3f-8613-4117-860b-8f3361f45ec7/volumes" Apr 16 15:33:13.375385 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:13.375342 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:23.375089 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:23.375041 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:33.374888 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:33.374839 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:43.375190 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:43.375143 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:33:53.374986 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:33:53.374917 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:34:03.375261 ip-10-0-139-55 
kubenswrapper[2577]: I0416 15:34:03.375213 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:34:13.375188 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:13.375139 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:34:22.753969 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:22.753918 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:34:27.670042 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.670000 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:34:27.672554 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.670418 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" containerID="cri-o://b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71" gracePeriod=30 Apr 16 15:34:27.752124 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752085 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"] Apr 16 15:34:27.752376 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752363 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="storage-initializer" Apr 16 15:34:27.752420 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752378 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="storage-initializer" Apr 16 15:34:27.752420 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752396 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" Apr 16 15:34:27.752420 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752401 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" Apr 16 15:34:27.752511 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.752449 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee62cb3f-8613-4117-860b-8f3361f45ec7" containerName="kserve-container" Apr 16 15:34:27.755196 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.755171 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:27.765336 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.765292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"] Apr 16 15:34:27.845785 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.845737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9\" (UID: \"ca521a49-05b0-4948-8874-26a6263616c5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:27.946607 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.946512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9\" (UID: \"ca521a49-05b0-4948-8874-26a6263616c5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:27.946918 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:27.946895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9\" (UID: \"ca521a49-05b0-4948-8874-26a6263616c5\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:28.067024 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:28.066988 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:28.191240 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:28.191215 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"] Apr 16 15:34:28.194594 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:34:28.194555 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca521a49_05b0_4948_8874_26a6263616c5.slice/crio-b58585b80229acec4495aa42bd78a250ff1f3c011300ebe4329387110613aff1 WatchSource:0}: Error finding container b58585b80229acec4495aa42bd78a250ff1f3c011300ebe4329387110613aff1: Status 404 returned error can't find the container with id b58585b80229acec4495aa42bd78a250ff1f3c011300ebe4329387110613aff1 Apr 16 15:34:28.615037 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:28.615004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerStarted","Data":"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"} Apr 16 15:34:28.615037 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:28.615038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerStarted","Data":"b58585b80229acec4495aa42bd78a250ff1f3c011300ebe4329387110613aff1"} Apr 16 15:34:32.628104 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:32.628066 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca521a49-05b0-4948-8874-26a6263616c5" containerID="30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e" exitCode=0 Apr 16 15:34:32.628476 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:32.628114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerDied","Data":"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"} Apr 16 15:34:32.753119 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:32.753072 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 15:34:33.503906 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.503883 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:34:33.584789 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.584695 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location\") pod \"aaf3603a-c163-4052-8df5-b77befb7b854\" (UID: \"aaf3603a-c163-4052-8df5-b77befb7b854\") " Apr 16 15:34:33.585110 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.585085 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aaf3603a-c163-4052-8df5-b77befb7b854" (UID: "aaf3603a-c163-4052-8df5-b77befb7b854"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:34:33.632429 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.632390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerStarted","Data":"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"} Apr 16 15:34:33.632873 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.632626 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:34:33.633879 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.633854 2577 generic.go:358] "Generic (PLEG): container finished" podID="aaf3603a-c163-4052-8df5-b77befb7b854" containerID="b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71" exitCode=0 Apr 16 15:34:33.634008 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.633904 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" Apr 16 15:34:33.634008 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.633918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerDied","Data":"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71"} Apr 16 15:34:33.634008 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.633960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7" event={"ID":"aaf3603a-c163-4052-8df5-b77befb7b854","Type":"ContainerDied","Data":"44c465efac97239b98bc1e70cef8b4f3edb73b2375b7ad4817e6e35345c38263"} Apr 16 15:34:33.634008 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.633977 2577 scope.go:117] "RemoveContainer" containerID="b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71" Apr 16 15:34:33.641778 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.641759 2577 scope.go:117] "RemoveContainer" containerID="32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19" Apr 16 15:34:33.648578 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.648561 2577 scope.go:117] "RemoveContainer" containerID="b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71" Apr 16 15:34:33.648822 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:34:33.648804 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71\": container with ID starting with b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71 not found: ID does not exist" containerID="b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71" Apr 16 15:34:33.648875 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.648831 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71"} err="failed to get container status \"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71\": rpc error: code = NotFound desc = could not find container \"b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71\": container with ID starting with b643ebac88d046c9622f0369c9b16ce7d2694109fcf2c853d4f96c448d0f6d71 not found: ID does not exist" Apr 16 15:34:33.648875 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.648850 2577 scope.go:117] "RemoveContainer" containerID="32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19" Apr 16 15:34:33.649394 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:34:33.649377 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19\": container with ID starting with 32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19 not found: ID does not exist" containerID="32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19" Apr 16 15:34:33.649429 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.649398 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19"} err="failed to get container status \"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19\": rpc 
error: code = NotFound desc = could not find container \"32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19\": container with ID starting with 32a0036995efb7a17afd8f4262cb6dbe5f8048f8f40b3270d28f51c7155f6e19 not found: ID does not exist" Apr 16 15:34:33.655805 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.655767 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podStartSLOduration=6.655755431 podStartE2EDuration="6.655755431s" podCreationTimestamp="2026-04-16 15:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:34:33.654236276 +0000 UTC m=+2524.544651162" watchObservedRunningTime="2026-04-16 15:34:33.655755431 +0000 UTC m=+2524.546170314" Apr 16 15:34:33.664956 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.664910 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:34:33.667150 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.667127 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-zvdk7"] Apr 16 15:34:33.685201 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.685171 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaf3603a-c163-4052-8df5-b77befb7b854-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:34:33.757173 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:34:33.757144 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" path="/var/lib/kubelet/pods/aaf3603a-c163-4052-8df5-b77befb7b854/volumes" Apr 16 15:35:04.639402 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:04.639358 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:35:14.638624 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:14.638577 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:35:24.638667 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:24.638617 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:35:34.638180 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:34.638145 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podUID="ca521a49-05b0-4948-8874-26a6263616c5" 
containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.26:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:35:44.642489 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:44.642453 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" Apr 16 15:35:47.868489 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.868447 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"] Apr 16 15:35:47.869078 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.868686 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container" containerID="cri-o://e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5" gracePeriod=30 Apr 16 15:35:47.929247 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929161 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"] Apr 16 15:35:47.929427 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929407 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="storage-initializer" Apr 16 15:35:47.929427 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929418 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="storage-initializer" Apr 16 15:35:47.929585 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929437 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" Apr 16 15:35:47.929585 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929442 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" Apr 16 15:35:47.929585 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.929491 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaf3603a-c163-4052-8df5-b77befb7b854" containerName="kserve-container" Apr 16 15:35:47.932321 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.932299 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:47.939662 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:47.939637 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"]
Apr 16 15:35:48.001176 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.001137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk\" (UID: \"41f451d4-4303-405b-ae43-90d9c95ac40d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:48.101927 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.101884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk\" (UID: \"41f451d4-4303-405b-ae43-90d9c95ac40d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:48.102277 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.102254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk\" (UID: \"41f451d4-4303-405b-ae43-90d9c95ac40d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:48.242546 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.242513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:48.362613 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.362581 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"]
Apr 16 15:35:48.365216 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:35:48.365185 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f451d4_4303_405b_ae43_90d9c95ac40d.slice/crio-3ac526953ba233493cfe461d34c34e73596ec8b0ef02cf154161aa5f75c27285 WatchSource:0}: Error finding container 3ac526953ba233493cfe461d34c34e73596ec8b0ef02cf154161aa5f75c27285: Status 404 returned error can't find the container with id 3ac526953ba233493cfe461d34c34e73596ec8b0ef02cf154161aa5f75c27285
Apr 16 15:35:48.835405 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.835370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerStarted","Data":"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"}
Apr 16 15:35:48.835405 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:48.835408 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerStarted","Data":"3ac526953ba233493cfe461d34c34e73596ec8b0ef02cf154161aa5f75c27285"}
Apr 16 15:35:52.413069 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.413047 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"
Apr 16 15:35:52.535234 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.535195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location\") pod \"ca521a49-05b0-4948-8874-26a6263616c5\" (UID: \"ca521a49-05b0-4948-8874-26a6263616c5\") "
Apr 16 15:35:52.535537 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.535513 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca521a49-05b0-4948-8874-26a6263616c5" (UID: "ca521a49-05b0-4948-8874-26a6263616c5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:35:52.636445 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.636372 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca521a49-05b0-4948-8874-26a6263616c5-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\""
Apr 16 15:35:52.847532 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.847499 2577 generic.go:358] "Generic (PLEG): container finished" podID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerID="d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a" exitCode=0
Apr 16 15:35:52.847707 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.847574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerDied","Data":"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"}
Apr 16 15:35:52.848876 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.848852 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca521a49-05b0-4948-8874-26a6263616c5" containerID="e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5" exitCode=0
Apr 16 15:35:52.849030 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.848909 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"
Apr 16 15:35:52.849030 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.848967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerDied","Data":"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"}
Apr 16 15:35:52.849030 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.848996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9" event={"ID":"ca521a49-05b0-4948-8874-26a6263616c5","Type":"ContainerDied","Data":"b58585b80229acec4495aa42bd78a250ff1f3c011300ebe4329387110613aff1"}
Apr 16 15:35:52.849030 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.849016 2577 scope.go:117] "RemoveContainer" containerID="e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"
Apr 16 15:35:52.857199 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.857064 2577 scope.go:117] "RemoveContainer" containerID="30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"
Apr 16 15:35:52.864446 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.864431 2577 scope.go:117] "RemoveContainer" containerID="e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"
Apr 16 15:35:52.864785 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:35:52.864729 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5\": container with ID starting with e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5 not found: ID does not exist" containerID="e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"
Apr 16 15:35:52.864872 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.864794 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5"} err="failed to get container status \"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5\": rpc error: code = NotFound desc = could not find container \"e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5\": container with ID starting with e622ac31bc5615c6d46e3e8c69b463bf2411c6508d136ab15e014acb004da5c5 not found: ID does not exist"
Apr 16 15:35:52.864872 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.864816 2577 scope.go:117] "RemoveContainer" containerID="30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"
Apr 16 15:35:52.865107 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:35:52.865091 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e\": container with ID starting with 30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e not found: ID does not exist" containerID="30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"
Apr 16 15:35:52.865147 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.865113 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e"} err="failed to get container status \"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e\": rpc error: code = NotFound desc = could not find container \"30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e\": container with ID starting with 30b541ec5b3e77bd67b4aa084562588cd9661d79c3b464531929f46c0cdfb46e not found: ID does not exist"
Apr 16 15:35:52.873277 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.873253 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"]
Apr 16 15:35:52.878463 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:52.878440 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-68rw9"]
Apr 16 15:35:53.757357 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:53.757328 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca521a49-05b0-4948-8874-26a6263616c5" path="/var/lib/kubelet/pods/ca521a49-05b0-4948-8874-26a6263616c5/volumes"
Apr 16 15:35:53.853160 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:53.853127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerStarted","Data":"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"}
Apr 16 15:35:53.853381 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:53.853362 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:35:53.868906 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:35:53.868865 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podStartSLOduration=6.868851708 podStartE2EDuration="6.868851708s" podCreationTimestamp="2026-04-16 15:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:35:53.867408536 +0000 UTC m=+2604.757823422" watchObservedRunningTime="2026-04-16 15:35:53.868851708 +0000 UTC m=+2604.759266655"
Apr 16 15:36:24.858919 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:36:24.858873 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 15:36:34.858367 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:36:34.858325 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 15:36:44.858416 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:36:44.858369 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 15:36:54.858154 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:36:54.858107 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.27:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 15:37:04.861425 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:04.861388 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:37:08.059121 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.059083 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"]
Apr 16 15:37:08.059497 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.059392 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container" containerID="cri-o://c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba" gracePeriod=30
Apr 16 15:37:08.115356 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115312 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:37:08.115589 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115577 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="storage-initializer"
Apr 16 15:37:08.115634 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115591 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="storage-initializer"
Apr 16 15:37:08.115634 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115610 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container"
Apr 16 15:37:08.115634 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115616 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container"
Apr 16 15:37:08.115721 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.115659 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca521a49-05b0-4948-8874-26a6263616c5" containerName="kserve-container"
Apr 16 15:37:08.118320 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.118301 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:08.125980 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.125954 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:37:08.186614 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.186575 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm\" (UID: \"674a310c-1dfe-4e7c-b51c-58a8753b534c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:08.287907 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.287862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm\" (UID: \"674a310c-1dfe-4e7c-b51c-58a8753b534c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:08.288254 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.288235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm\" (UID: \"674a310c-1dfe-4e7c-b51c-58a8753b534c\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:08.428757 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.428671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:08.546879 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.546837 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:37:08.549759 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:37:08.549729 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod674a310c_1dfe_4e7c_b51c_58a8753b534c.slice/crio-3b133199ea3af1d1d82fcd9dcc141aa8fb33bdc184480809908420a9b99fafb3 WatchSource:0}: Error finding container 3b133199ea3af1d1d82fcd9dcc141aa8fb33bdc184480809908420a9b99fafb3: Status 404 returned error can't find the container with id 3b133199ea3af1d1d82fcd9dcc141aa8fb33bdc184480809908420a9b99fafb3
Apr 16 15:37:08.551442 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:08.551415 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:37:09.052318 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:09.052286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerStarted","Data":"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"}
Apr 16 15:37:09.052318 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:09.052325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerStarted","Data":"3b133199ea3af1d1d82fcd9dcc141aa8fb33bdc184480809908420a9b99fafb3"}
Apr 16 15:37:12.799741 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:12.799721 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:37:12.922761 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:12.922724 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location\") pod \"41f451d4-4303-405b-ae43-90d9c95ac40d\" (UID: \"41f451d4-4303-405b-ae43-90d9c95ac40d\") "
Apr 16 15:37:12.923085 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:12.923062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41f451d4-4303-405b-ae43-90d9c95ac40d" (UID: "41f451d4-4303-405b-ae43-90d9c95ac40d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:37:13.023872 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.023841 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41f451d4-4303-405b-ae43-90d9c95ac40d-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\""
Apr 16 15:37:13.064019 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.063984 2577 generic.go:358] "Generic (PLEG): container finished" podID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerID="5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05" exitCode=0
Apr 16 15:37:13.064175 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.064069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerDied","Data":"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"}
Apr 16 15:37:13.065601 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.065575 2577 generic.go:358] "Generic (PLEG): container finished" podID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerID="c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba" exitCode=0
Apr 16 15:37:13.065689 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.065638 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerDied","Data":"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"}
Apr 16 15:37:13.065689 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.065663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk" event={"ID":"41f451d4-4303-405b-ae43-90d9c95ac40d","Type":"ContainerDied","Data":"3ac526953ba233493cfe461d34c34e73596ec8b0ef02cf154161aa5f75c27285"}
Apr 16 15:37:13.065689 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.065668 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"
Apr 16 15:37:13.065689 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.065682 2577 scope.go:117] "RemoveContainer" containerID="c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"
Apr 16 15:37:13.073495 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.073357 2577 scope.go:117] "RemoveContainer" containerID="d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"
Apr 16 15:37:13.081389 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.081369 2577 scope.go:117] "RemoveContainer" containerID="c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"
Apr 16 15:37:13.081644 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:37:13.081626 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba\": container with ID starting with c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba not found: ID does not exist" containerID="c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"
Apr 16 15:37:13.081698 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.081653 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba"} err="failed to get container status \"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba\": rpc error: code = NotFound desc = could not find container \"c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba\": container with ID starting with c29dd962212647129e58a4cfad53518a988c3c7ebc296f34be7f6a76ed9834ba not found: ID does not exist"
Apr 16 15:37:13.081698 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.081671 2577 scope.go:117] "RemoveContainer" containerID="d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"
Apr 16 15:37:13.081924 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:37:13.081895 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a\": container with ID starting with d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a not found: ID does not exist" containerID="d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"
Apr 16 15:37:13.081985 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.081959 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a"} err="failed to get container status \"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a\": rpc error: code = NotFound desc = could not find container \"d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a\": container with ID starting with d6b0ce38b8a477aa0430c63a5290d075785f006aa4d53c9bcbb0971a1f44e68a not found: ID does not exist"
Apr 16 15:37:13.089784 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.089758 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"]
Apr 16 15:37:13.091636 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.091616 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-vpbbk"]
Apr 16 15:37:13.758787 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:13.758748 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" path="/var/lib/kubelet/pods/41f451d4-4303-405b-ae43-90d9c95ac40d/volumes"
Apr 16 15:37:14.070106 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:14.070013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerStarted","Data":"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"}
Apr 16 15:37:14.070551 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:14.070316 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:37:14.087479 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:14.087425 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podStartSLOduration=6.087411338 podStartE2EDuration="6.087411338s" podCreationTimestamp="2026-04-16 15:37:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:37:14.086108272 +0000 UTC m=+2684.976523158" watchObservedRunningTime="2026-04-16 15:37:14.087411338 +0000 UTC m=+2684.977826298"
Apr 16 15:37:29.801827 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:29.801746 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log"
Apr 16 15:37:29.805750 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:29.805725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log"
Apr 16 15:37:45.075200 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:45.075149 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 15:37:55.074383 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:37:55.074338 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 15:38:05.074014 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:05.073960 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 15:38:15.074794 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:15.074744 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 15:38:25.074223 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:25.074173 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.28:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 15:38:35.078400 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:35.078354 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:38:38.282815 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:38.282778 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:38:38.283244 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:38.283152 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container" containerID="cri-o://c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6" gracePeriod=30
Apr 16 15:38:44.115724 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.115701 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:38:44.201893 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.201858 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location\") pod \"674a310c-1dfe-4e7c-b51c-58a8753b534c\" (UID: \"674a310c-1dfe-4e7c-b51c-58a8753b534c\") "
Apr 16 15:38:44.202220 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.202197 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "674a310c-1dfe-4e7c-b51c-58a8753b534c" (UID: "674a310c-1dfe-4e7c-b51c-58a8753b534c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:38:44.302784 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.302707 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/674a310c-1dfe-4e7c-b51c-58a8753b534c-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\""
Apr 16 15:38:44.318084 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.318055 2577 generic.go:358] "Generic (PLEG): container finished" podID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerID="c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6" exitCode=0
Apr 16 15:38:44.318231 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.318121 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"
Apr 16 15:38:44.318231 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.318132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerDied","Data":"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"}
Apr 16 15:38:44.318231 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.318170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm" event={"ID":"674a310c-1dfe-4e7c-b51c-58a8753b534c","Type":"ContainerDied","Data":"3b133199ea3af1d1d82fcd9dcc141aa8fb33bdc184480809908420a9b99fafb3"}
Apr 16 15:38:44.318231 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.318185 2577 scope.go:117] "RemoveContainer" containerID="c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"
Apr 16 15:38:44.326388 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.326370 2577 scope.go:117] "RemoveContainer" containerID="5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"
Apr 16 15:38:44.333390 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.333368 2577 scope.go:117] "RemoveContainer" containerID="c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"
Apr 16 15:38:44.333649 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:38:44.333630 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6\": container with ID starting with c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6 not found: ID does not exist" containerID="c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"
Apr 16 15:38:44.333725 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.333661 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6"} err="failed to get container status \"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6\": rpc error: code = NotFound desc = could not find container \"c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6\": container with ID starting with c1f3df58ef47ba978e0296721f836a0199a2b9f931f96cfe1066bcee86cc4dd6 not found: ID does not exist"
Apr 16 15:38:44.333725 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.333688 2577 scope.go:117] "RemoveContainer" containerID="5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"
Apr 16 15:38:44.333917 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:38:44.333902 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05\": container with ID starting with 5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05 not found: ID does not exist" containerID="5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"
Apr 16 15:38:44.333995 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.333923 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05"} err="failed to get container status \"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05\": rpc error: code = NotFound desc = could not find container \"5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05\": container with ID starting with 5cf3dedbf3a36e5a3f0d4514d1f3a8f239feec4fb512996519910bc8f6f8bd05 not found: ID does not exist"
Apr 16 15:38:44.346324 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.346299 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:38:44.352233 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:44.352210 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-sbblm"]
Apr 16 15:38:45.756749 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:38:45.756717 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" path="/var/lib/kubelet/pods/674a310c-1dfe-4e7c-b51c-58a8753b534c/volumes"
Apr 16 15:42:29.819261 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:42:29.819235 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log"
Apr 16 15:42:29.823393 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:42:29.823371 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log"
Apr 16 15:44:58.670660 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670623 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670894 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670906 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670916 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670922 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670928 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="storage-initializer"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670949 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="storage-initializer"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670957 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="storage-initializer"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.670962 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="storage-initializer"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.671005 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="41f451d4-4303-405b-ae43-90d9c95ac40d" containerName="kserve-container"
Apr 16 15:44:58.672987 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.671035 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="674a310c-1dfe-4e7c-b51c-58a8753b534c" containerName="kserve-container"
Apr 16 15:44:58.673872 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.673856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:44:58.676305 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.676286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\""
Apr 16 15:44:58.682178 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.682141 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:44:58.805196 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.805152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-fjkvt\" (UID: \"4218662c-3fe9-4684-bbc8-9809f0b13e63\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:44:58.906462 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.906426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-fjkvt\" (UID: \"4218662c-3fe9-4684-bbc8-9809f0b13e63\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:44:58.906825 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.906804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-fjkvt\" (UID: \"4218662c-3fe9-4684-bbc8-9809f0b13e63\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:44:58.985596 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:58.985566 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:44:59.105413 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:59.105379 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:44:59.108517 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:44:59.108486 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4218662c_3fe9_4684_bbc8_9809f0b13e63.slice/crio-292871ee9e5c546566ac1c96a6e5a63305a67a217141c673385ad366109e9ced WatchSource:0}: Error finding container 292871ee9e5c546566ac1c96a6e5a63305a67a217141c673385ad366109e9ced: Status 404 returned error can't find the container with id 292871ee9e5c546566ac1c96a6e5a63305a67a217141c673385ad366109e9ced
Apr 16 15:44:59.110247 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:59.110228 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:44:59.303446 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:59.303365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerStarted","Data":"d94f3efd97c764957c7bd4a665e9940e49f134e443264a3772e552acb3173dbc"}
Apr 16 15:44:59.303446 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:44:59.303398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerStarted","Data":"292871ee9e5c546566ac1c96a6e5a63305a67a217141c673385ad366109e9ced"}
Apr 16 15:45:04.318081 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:04.318045 2577 generic.go:358] "Generic (PLEG): container finished" podID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerID="d94f3efd97c764957c7bd4a665e9940e49f134e443264a3772e552acb3173dbc" exitCode=0
Apr 16 15:45:04.318507 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:04.318125 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerDied","Data":"d94f3efd97c764957c7bd4a665e9940e49f134e443264a3772e552acb3173dbc"}
Apr 16 15:45:09.334736 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:09.334700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerStarted","Data":"d16aeeb15c7cc6cd320c52196bca32e7f59720671d1eae9f729a15343675df33"}
Apr 16 15:45:09.335171 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:09.334963 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:45:09.336306 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:09.336280 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 15:45:09.356426 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:09.356383 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" podStartSLOduration=7.399569932 podStartE2EDuration="11.356368174s" podCreationTimestamp="2026-04-16 15:44:58 +0000 UTC" firstStartedPulling="2026-04-16 15:45:04.319264993 +0000 UTC m=+3155.209679857" lastFinishedPulling="2026-04-16 15:45:08.276063231 +0000 UTC m=+3159.166478099" observedRunningTime="2026-04-16 15:45:09.356008237 +0000 UTC m=+3160.246423124" watchObservedRunningTime="2026-04-16 15:45:09.356368174 +0000 UTC m=+3160.246783060"
Apr 16 15:45:10.338065 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:10.338025 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 15:45:20.339715 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:20.339685 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:45:40.116107 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.116074 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:45:40.116604 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.116425 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container" containerID="cri-o://d16aeeb15c7cc6cd320c52196bca32e7f59720671d1eae9f729a15343675df33" gracePeriod=30
Apr 16 15:45:40.185680 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.185644 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"]
Apr 16 15:45:40.188852 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.188830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:40.199089 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.199067 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"]
Apr 16 15:45:40.289797 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.289764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr\" (UID: \"15019466-0dc5-41a6-aade-0613180ccd2e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:40.391108 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.391025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr\" (UID: \"15019466-0dc5-41a6-aade-0613180ccd2e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:40.391397 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.391374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr\" (UID: \"15019466-0dc5-41a6-aade-0613180ccd2e\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:40.499159 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.499120 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:40.613851 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:40.613809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"]
Apr 16 15:45:40.617301 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:45:40.617271 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15019466_0dc5_41a6_aade_0613180ccd2e.slice/crio-774f6809c839186e3a1215bf23d650c6bbfdc6a64b276a4329956e87d4b74335 WatchSource:0}: Error finding container 774f6809c839186e3a1215bf23d650c6bbfdc6a64b276a4329956e87d4b74335: Status 404 returned error can't find the container with id 774f6809c839186e3a1215bf23d650c6bbfdc6a64b276a4329956e87d4b74335
Apr 16 15:45:41.422952 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:41.422893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerStarted","Data":"d05975c5ddd60c8e90e6b298c8b3cc9967a857568b2cdc89f3c19b91b874f520"}
Apr 16 15:45:41.423342 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:41.422961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerStarted","Data":"774f6809c839186e3a1215bf23d650c6bbfdc6a64b276a4329956e87d4b74335"}
Apr 16 15:45:45.434607 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:45.434525 2577 generic.go:358] "Generic (PLEG): container finished" podID="15019466-0dc5-41a6-aade-0613180ccd2e" containerID="d05975c5ddd60c8e90e6b298c8b3cc9967a857568b2cdc89f3c19b91b874f520" exitCode=0
Apr 16 15:45:45.434607 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:45.434576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerDied","Data":"d05975c5ddd60c8e90e6b298c8b3cc9967a857568b2cdc89f3c19b91b874f520"}
Apr 16 15:45:46.439477 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:46.439434 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerStarted","Data":"2aace5dbc7238bd53f3aaf650172f6ce91939cea69c71b9c311c9350e27d8e48"}
Apr 16 15:45:46.439885 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:46.439746 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:45:46.441143 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:46.441117 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 15:45:46.455023 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:46.454973 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" podStartSLOduration=6.454957431 podStartE2EDuration="6.454957431s" podCreationTimestamp="2026-04-16 15:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:45:46.453123307 +0000 UTC m=+3197.343538193" watchObservedRunningTime="2026-04-16 15:45:46.454957431 +0000 UTC m=+3197.345372311"
Apr 16 15:45:47.442715 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:47.442674 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 15:45:57.443992 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:45:57.443951 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:46:10.338386 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.338332 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 15:46:10.505712 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.505677 2577 generic.go:358] "Generic (PLEG): container finished" podID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerID="d16aeeb15c7cc6cd320c52196bca32e7f59720671d1eae9f729a15343675df33" exitCode=137
Apr 16 15:46:10.505883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.505745 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerDied","Data":"d16aeeb15c7cc6cd320c52196bca32e7f59720671d1eae9f729a15343675df33"}
Apr 16 15:46:10.748924 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.748900 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:46:10.815692 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.815603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location\") pod \"4218662c-3fe9-4684-bbc8-9809f0b13e63\" (UID: \"4218662c-3fe9-4684-bbc8-9809f0b13e63\") "
Apr 16 15:46:10.825906 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.825873 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4218662c-3fe9-4684-bbc8-9809f0b13e63" (UID: "4218662c-3fe9-4684-bbc8-9809f0b13e63"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:46:10.916743 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:10.916703 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4218662c-3fe9-4684-bbc8-9809f0b13e63-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\""
Apr 16 15:46:11.139842 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.139760 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"]
Apr 16 15:46:11.140086 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.140062 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" containerID="cri-o://2aace5dbc7238bd53f3aaf650172f6ce91939cea69c71b9c311c9350e27d8e48" gracePeriod=30
Apr 16 15:46:11.202481 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202446 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"]
Apr 16 15:46:11.202713 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202700 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container"
Apr 16 15:46:11.202752 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202715 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container"
Apr 16 15:46:11.202752 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202722 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="storage-initializer"
Apr 16 15:46:11.202752 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202728 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="storage-initializer"
Apr 16 15:46:11.202838 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.202779 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" containerName="kserve-container"
Apr 16 15:46:11.204661 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.204645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"
Apr 16 15:46:11.212856 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.212828 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"]
Apr 16 15:46:11.320463 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.320416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-4rt76\" (UID: \"cfead25d-3a9d-436e-baaf-42506d23e34d\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"
Apr 16 15:46:11.421268 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.421176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-4rt76\" (UID: \"cfead25d-3a9d-436e-baaf-42506d23e34d\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"
Apr 16 15:46:11.421627 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.421553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-4rt76\" (UID: \"cfead25d-3a9d-436e-baaf-42506d23e34d\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"
Apr 16 15:46:11.510927 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.510888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt" event={"ID":"4218662c-3fe9-4684-bbc8-9809f0b13e63","Type":"ContainerDied","Data":"292871ee9e5c546566ac1c96a6e5a63305a67a217141c673385ad366109e9ced"}
Apr 16 15:46:11.510927 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.510914 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"
Apr 16 15:46:11.511169 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.510971 2577 scope.go:117] "RemoveContainer" containerID="d16aeeb15c7cc6cd320c52196bca32e7f59720671d1eae9f729a15343675df33"
Apr 16 15:46:11.514391 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.514369 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"
Apr 16 15:46:11.521424 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.521398 2577 scope.go:117] "RemoveContainer" containerID="d94f3efd97c764957c7bd4a665e9940e49f134e443264a3772e552acb3173dbc"
Apr 16 15:46:11.531290 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.531259 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:46:11.532298 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.532258 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-fjkvt"]
Apr 16 15:46:11.640616 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.640580 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"]
Apr 16 15:46:11.643820 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:46:11.643792 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfead25d_3a9d_436e_baaf_42506d23e34d.slice/crio-539c4d38ca28f8d06f30aef9724cd313243413246866e0e667540b463d9fc59b WatchSource:0}: Error finding container 539c4d38ca28f8d06f30aef9724cd313243413246866e0e667540b463d9fc59b: Status 404 returned error can't find the container with id 539c4d38ca28f8d06f30aef9724cd313243413246866e0e667540b463d9fc59b
Apr 16 15:46:11.758121 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:11.758082 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4218662c-3fe9-4684-bbc8-9809f0b13e63" path="/var/lib/kubelet/pods/4218662c-3fe9-4684-bbc8-9809f0b13e63/volumes"
Apr 16 15:46:12.519403 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:12.519362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerStarted","Data":"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e"}
Apr 16 15:46:12.519403 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:12.519409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerStarted","Data":"539c4d38ca28f8d06f30aef9724cd313243413246866e0e667540b463d9fc59b"}
Apr 16 15:46:15.530712 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:15.530681 2577 generic.go:358] "Generic (PLEG): container finished" podID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerID="5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e" exitCode=0
Apr 16 15:46:15.531106 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:15.530741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerDied","Data":"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e"}
Apr 16 15:46:41.639671 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.639632 2577 generic.go:358] "Generic (PLEG): container finished" podID="15019466-0dc5-41a6-aade-0613180ccd2e" containerID="2aace5dbc7238bd53f3aaf650172f6ce91939cea69c71b9c311c9350e27d8e48" exitCode=137
Apr 16 15:46:41.639671 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.639663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerDied","Data":"2aace5dbc7238bd53f3aaf650172f6ce91939cea69c71b9c311c9350e27d8e48"}
Apr 16 15:46:41.843451 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.843413 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"
Apr 16 15:46:41.872528 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.872494 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location\") pod \"15019466-0dc5-41a6-aade-0613180ccd2e\" (UID: \"15019466-0dc5-41a6-aade-0613180ccd2e\") "
Apr 16 15:46:41.884401 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.884357 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15019466-0dc5-41a6-aade-0613180ccd2e" (UID: "15019466-0dc5-41a6-aade-0613180ccd2e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:46:41.973967 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:41.973853 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15019466-0dc5-41a6-aade-0613180ccd2e-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\""
Apr 16 15:46:42.644535 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.644462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" event={"ID":"15019466-0dc5-41a6-aade-0613180ccd2e","Type":"ContainerDied","Data":"774f6809c839186e3a1215bf23d650c6bbfdc6a64b276a4329956e87d4b74335"}
Apr 16 15:46:42.644535 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.644515 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr" Apr 16 15:46:42.645056 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.644518 2577 scope.go:117] "RemoveContainer" containerID="2aace5dbc7238bd53f3aaf650172f6ce91939cea69c71b9c311c9350e27d8e48" Apr 16 15:46:42.656112 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.655757 2577 scope.go:117] "RemoveContainer" containerID="d05975c5ddd60c8e90e6b298c8b3cc9967a857568b2cdc89f3c19b91b874f520" Apr 16 15:46:42.669171 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.669125 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"] Apr 16 15:46:42.673266 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:42.673209 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-s2rdr"] Apr 16 15:46:43.758104 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:46:43.758071 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" path="/var/lib/kubelet/pods/15019466-0dc5-41a6-aade-0613180ccd2e/volumes" Apr 16 15:48:09.281072 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:09.281033 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:48:09.281072 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:09.281058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:48:10.909779 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:10.909740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerStarted","Data":"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679"} Apr 16 15:48:10.910200 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:10.909971 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" Apr 16 15:48:10.911232 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:10.911207 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 15:48:10.926124 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:10.926051 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" podStartSLOduration=5.274402598 podStartE2EDuration="1m59.926039409s" podCreationTimestamp="2026-04-16 15:46:11 +0000 UTC" firstStartedPulling="2026-04-16 15:46:15.531837931 +0000 UTC m=+3226.422252795" lastFinishedPulling="2026-04-16 15:48:10.18347474 +0000 UTC m=+3341.073889606" observedRunningTime="2026-04-16 15:48:10.924097325 +0000 UTC m=+3341.814512215" watchObservedRunningTime="2026-04-16 15:48:10.926039409 +0000 UTC m=+3341.816454295" Apr 16 15:48:11.912760 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:11.912725 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 15:48:21.914446 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:21.914417 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" Apr 16 15:48:33.670773 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.670735 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"] Apr 16 15:48:33.671210 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.671008 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="kserve-container" containerID="cri-o://64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679" gracePeriod=30 Apr 16 15:48:33.741596 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741553 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:48:33.741817 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741806 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="storage-initializer" Apr 16 15:48:33.741865 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741819 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="storage-initializer" Apr 16 15:48:33.741865 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741829 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" Apr 16 15:48:33.741865 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741834 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" Apr 16 15:48:33.742010 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.741883 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="15019466-0dc5-41a6-aade-0613180ccd2e" containerName="kserve-container" Apr 16 15:48:33.778124 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.778090 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:33.782024 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.781998 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:48:33.796074 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.796043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rnkmt\" (UID: \"73dec0be-6723-4968-bc04-27bcc07d9206\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:33.897191 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.897156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rnkmt\" (UID: \"73dec0be-6723-4968-bc04-27bcc07d9206\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:33.897603 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:33.897579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rnkmt\" (UID: \"73dec0be-6723-4968-bc04-27bcc07d9206\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:34.087968 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:34.087897 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:34.206728 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:34.206698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:48:34.209669 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:48:34.209641 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73dec0be_6723_4968_bc04_27bcc07d9206.slice/crio-9cd205b95db25398d67815d3e64c06cee437fbd48a5d16b86520ef40b6572dfb WatchSource:0}: Error finding container 9cd205b95db25398d67815d3e64c06cee437fbd48a5d16b86520ef40b6572dfb: Status 404 returned error can't find the container with id 9cd205b95db25398d67815d3e64c06cee437fbd48a5d16b86520ef40b6572dfb Apr 16 15:48:34.974358 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:34.974313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerStarted","Data":"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf"} Apr 16 15:48:34.974358 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:34.974362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerStarted","Data":"9cd205b95db25398d67815d3e64c06cee437fbd48a5d16b86520ef40b6572dfb"} Apr 16 15:48:36.222529 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.222503 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" Apr 16 15:48:36.314214 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.314132 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location\") pod \"cfead25d-3a9d-436e-baaf-42506d23e34d\" (UID: \"cfead25d-3a9d-436e-baaf-42506d23e34d\") " Apr 16 15:48:36.314541 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.314516 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cfead25d-3a9d-436e-baaf-42506d23e34d" (UID: "cfead25d-3a9d-436e-baaf-42506d23e34d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:48:36.415129 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.415088 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfead25d-3a9d-436e-baaf-42506d23e34d-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:48:36.980700 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.980668 2577 generic.go:358] "Generic (PLEG): container finished" podID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerID="64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679" exitCode=0 Apr 16 15:48:36.980883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.980731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerDied","Data":"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679"} Apr 16 15:48:36.980883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.980745 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" Apr 16 15:48:36.980883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.980759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76" event={"ID":"cfead25d-3a9d-436e-baaf-42506d23e34d","Type":"ContainerDied","Data":"539c4d38ca28f8d06f30aef9724cd313243413246866e0e667540b463d9fc59b"} Apr 16 15:48:36.980883 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.980774 2577 scope.go:117] "RemoveContainer" containerID="64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679" Apr 16 15:48:36.989167 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.989145 2577 scope.go:117] "RemoveContainer" containerID="5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e" Apr 16 15:48:36.996692 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.996671 2577 scope.go:117] "RemoveContainer" containerID="64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679" Apr 16 15:48:36.997002 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:48:36.996982 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679\": container with ID starting with 64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679 not found: ID does not exist" containerID="64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679" Apr 16 15:48:36.997059 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.997013 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679"} err="failed to get container status \"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679\": rpc error: code = NotFound desc = could not find container \"64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679\": container with ID starting with 64b594fc48b8e475a49575df7aaddd86609909b25ab3a6224a087cbbd8741679 not found: ID does not exist" Apr 16 15:48:36.997059 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.997032 2577 scope.go:117] "RemoveContainer" containerID="5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e" Apr 16 15:48:36.997282 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:48:36.997262 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e\": container with ID starting with 5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e not found: ID does not exist" containerID="5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e" Apr 16 15:48:36.997323 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:36.997288 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e"} err="failed to get container status \"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e\": rpc error: code = NotFound desc = could not find container \"5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e\": container with ID starting with 5444eae5c82ddeaa83746d4b1b9a3bd6380df38b4993f9cd3ceb2a5169cb671e not found: ID does not exist" Apr 16 15:48:37.000186 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:37.000157 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"] Apr 16 15:48:37.002241 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:37.002218 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-4rt76"] Apr 16 15:48:37.757251 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:37.757221 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" path="/var/lib/kubelet/pods/cfead25d-3a9d-436e-baaf-42506d23e34d/volumes" Apr 16 15:48:38.988716 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:38.988682 2577 generic.go:358] "Generic (PLEG): container finished" podID="73dec0be-6723-4968-bc04-27bcc07d9206" containerID="1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf" exitCode=0 Apr 16 15:48:38.989118 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:38.988756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerDied","Data":"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf"} Apr 16 15:48:59.050864 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:59.050821 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerStarted","Data":"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990"} Apr 16 15:48:59.051363 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:59.051227 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:48:59.052407 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:59.052375 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:48:59.067724 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:48:59.067676 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podStartSLOduration=6.733648461 podStartE2EDuration="26.067660997s" podCreationTimestamp="2026-04-16 15:48:33 +0000 UTC" firstStartedPulling="2026-04-16 15:48:38.990020925 +0000 UTC m=+3369.880435790" lastFinishedPulling="2026-04-16 15:48:58.324033459 +0000 UTC m=+3389.214448326" observedRunningTime="2026-04-16 15:48:59.065578324 +0000 UTC m=+3389.955993209" watchObservedRunningTime="2026-04-16 15:48:59.067660997 +0000 UTC m=+3389.958075883" Apr 16 15:49:00.053808 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:49:00.053763 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:49:10.054531 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:49:10.054481 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:49:20.054202 ip-10-0-139-55 
kubenswrapper[2577]: I0416 15:49:20.054109 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:49:30.054549 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:49:30.054503 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:49:40.054260 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:49:40.054209 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:49:50.054152 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:49:50.054105 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 15:50:00.054426 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:00.054391 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:50:03.873073 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:03.873040 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:50:03.873549 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:03.873385 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" containerID="cri-o://368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990" gracePeriod=30 Apr 16 15:50:07.619765 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:07.619730 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:50:07.706696 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:07.706593 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location\") pod \"73dec0be-6723-4968-bc04-27bcc07d9206\" (UID: \"73dec0be-6723-4968-bc04-27bcc07d9206\") " Apr 16 15:50:07.707031 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:07.707003 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73dec0be-6723-4968-bc04-27bcc07d9206" (UID: "73dec0be-6723-4968-bc04-27bcc07d9206"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:50:07.807210 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:07.807166 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73dec0be-6723-4968-bc04-27bcc07d9206-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:50:08.243232 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.243200 2577 generic.go:358] "Generic (PLEG): container finished" podID="73dec0be-6723-4968-bc04-27bcc07d9206" containerID="368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990" exitCode=0 Apr 16 15:50:08.243422 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.243274 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" Apr 16 15:50:08.243422 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.243302 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerDied","Data":"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990"} Apr 16 15:50:08.243422 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.243341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt" event={"ID":"73dec0be-6723-4968-bc04-27bcc07d9206","Type":"ContainerDied","Data":"9cd205b95db25398d67815d3e64c06cee437fbd48a5d16b86520ef40b6572dfb"} Apr 16 15:50:08.243422 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.243357 2577 scope.go:117] "RemoveContainer" containerID="368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990" Apr 16 15:50:08.251408 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.251390 2577 scope.go:117] "RemoveContainer" containerID="1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf" Apr 16 15:50:08.257427 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.257401 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:50:08.258839 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.258825 2577 scope.go:117] "RemoveContainer" containerID="368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990" Apr 16 15:50:08.259133 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:50:08.259117 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990\": container with ID starting with 368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990 not found: ID does not exist" containerID="368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990" Apr 16 15:50:08.259183 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.259140 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990"} err="failed to get container status \"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990\": rpc error: code = NotFound desc = could not find container \"368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990\": container with ID starting with 368b6d29bde08b3d53b2a778c291e399f8c37cf78a87aaf1e806def54d1f9990 not found: ID does not exist" Apr 16 15:50:08.259183 ip-10-0-139-55 kubenswrapper[2577]: I0416 
15:50:08.259158 2577 scope.go:117] "RemoveContainer" containerID="1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf" Apr 16 15:50:08.259390 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:50:08.259374 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf\": container with ID starting with 1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf not found: ID does not exist" containerID="1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf" Apr 16 15:50:08.259437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.259394 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf"} err="failed to get container status \"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf\": rpc error: code = NotFound desc = could not find container \"1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf\": container with ID starting with 1eac19f2f6b770c682b09b471c627b660999434d24d9ea7df13a632d092985bf not found: ID does not exist" Apr 16 15:50:08.262617 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:08.262596 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rnkmt"] Apr 16 15:50:09.756694 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:50:09.756661 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" path="/var/lib/kubelet/pods/73dec0be-6723-4968-bc04-27bcc07d9206/volumes" Apr 16 15:51:24.515954 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.515901 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516164 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="kserve-container" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516177 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="kserve-container" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516187 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="storage-initializer" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516193 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="storage-initializer" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516201 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="storage-initializer" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516206 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="storage-initializer" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516219 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" Apr 16 15:51:24.516437 ip-10-0-139-55 
kubenswrapper[2577]: I0416 15:51:24.516225 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516269 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfead25d-3a9d-436e-baaf-42506d23e34d" containerName="kserve-container" Apr 16 15:51:24.516437 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.516277 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="73dec0be-6723-4968-bc04-27bcc07d9206" containerName="kserve-container" Apr 16 15:51:24.519166 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.519148 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:24.521440 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.521413 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:51:24.528320 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.528293 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:51:24.665669 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.665629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-sqv4p\" (UID: \"0e853e10-498d-48f7-8313-3e1d301515f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:24.766562 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.766461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-sqv4p\" (UID: \"0e853e10-498d-48f7-8313-3e1d301515f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:24.766898 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.766874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-sqv4p\" (UID: \"0e853e10-498d-48f7-8313-3e1d301515f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:24.830614 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.830583 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:24.954796 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.954746 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:51:24.957552 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:51:24.957526 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e853e10_498d_48f7_8313_3e1d301515f8.slice/crio-bff69f98746a423882713d73455107f29cfe6b79936167565f6ddb0974c96f3d WatchSource:0}: Error finding container bff69f98746a423882713d73455107f29cfe6b79936167565f6ddb0974c96f3d: Status 404 returned error can't find the container with id bff69f98746a423882713d73455107f29cfe6b79936167565f6ddb0974c96f3d Apr 16 15:51:24.959756 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:24.959740 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:51:25.454138 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:25.454102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerStarted","Data":"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5"} Apr 16 15:51:25.454138 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:25.454139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerStarted","Data":"bff69f98746a423882713d73455107f29cfe6b79936167565f6ddb0974c96f3d"} Apr 16 15:51:29.467629 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:29.467590 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e853e10-498d-48f7-8313-3e1d301515f8" containerID="a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5" exitCode=0 Apr 16 15:51:29.468040 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:29.467639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerDied","Data":"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5"} Apr 16 15:51:30.472552 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:30.472518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerStarted","Data":"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d"} Apr 16 15:51:30.472955 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:30.472798 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:51:30.474022 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:30.473998 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:51:30.487838 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:30.487787 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podStartSLOduration=6.48777538 podStartE2EDuration="6.48777538s" podCreationTimestamp="2026-04-16 15:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:51:30.486493841 +0000 UTC m=+3541.376908727" watchObservedRunningTime="2026-04-16 15:51:30.48777538 +0000 UTC m=+3541.378190267" Apr 16 15:51:31.475071 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:31.475030 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:51:41.475179 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:41.475122 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:51:51.475664 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:51:51.475610 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:52:01.475177 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:01.475126 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:52:11.475472 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:11.475423 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:52:21.475353 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:21.475260 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 15:52:31.476135 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:31.476106 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:52:34.651334 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:34.651300 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:52:34.651789 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:34.651542 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" containerID="cri-o://1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d" 
gracePeriod=30 Apr 16 15:52:38.492474 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.492448 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:52:38.606489 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.606393 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location\") pod \"0e853e10-498d-48f7-8313-3e1d301515f8\" (UID: \"0e853e10-498d-48f7-8313-3e1d301515f8\") " Apr 16 15:52:38.606732 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.606706 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e853e10-498d-48f7-8313-3e1d301515f8" (UID: "0e853e10-498d-48f7-8313-3e1d301515f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:52:38.655906 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.655877 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e853e10-498d-48f7-8313-3e1d301515f8" containerID="1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d" exitCode=0 Apr 16 15:52:38.656031 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.655959 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerDied","Data":"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d"} Apr 16 15:52:38.656031 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.655994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" event={"ID":"0e853e10-498d-48f7-8313-3e1d301515f8","Type":"ContainerDied","Data":"bff69f98746a423882713d73455107f29cfe6b79936167565f6ddb0974c96f3d"} Apr 16 15:52:38.656031 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.656010 2577 scope.go:117] "RemoveContainer" containerID="1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d" Apr 16 15:52:38.656140 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.655970 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p" Apr 16 15:52:38.663879 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.663860 2577 scope.go:117] "RemoveContainer" containerID="a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5" Apr 16 15:52:38.670909 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.670888 2577 scope.go:117] "RemoveContainer" containerID="1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d" Apr 16 15:52:38.671182 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:52:38.671160 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d\": container with ID starting with 1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d not found: ID does not exist" containerID="1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d" Apr 16 15:52:38.671240 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.671188 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d"} err="failed to get container status \"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d\": rpc error: code = NotFound desc = could not find container \"1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d\": container with ID starting with 1836a756a870ef1d867c89dc1df5e6ffe63d7e26a98611bb776f595a1a51871d not found: ID does not exist" Apr 16 15:52:38.671240 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.671205 2577 scope.go:117] "RemoveContainer" containerID="a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5" Apr 16 15:52:38.671428 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:52:38.671410 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5\": container with ID starting with a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5 not found: ID does not exist" containerID="a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5" Apr 16 15:52:38.671471 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.671434 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5"} err="failed to get container status \"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5\": rpc error: code = NotFound desc = could not find container \"a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5\": container with ID starting with a80246fc5e22c1e4928f52a36068d80b151f34c1ea999da95252a37e8f1557d5 not found: ID does not exist" Apr 16 15:52:38.675418 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.675396 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:52:38.681196 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.681176 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-sqv4p"] Apr 16 15:52:38.706978 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:38.706954 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0e853e10-498d-48f7-8313-3e1d301515f8-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:52:39.756861 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:52:39.756832 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" path="/var/lib/kubelet/pods/0e853e10-498d-48f7-8313-3e1d301515f8/volumes" Apr 16 15:53:09.298046 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:09.297999 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:53:09.299852 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:09.299824 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:53:34.899954 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.899890 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:53:34.900361 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.900160 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" Apr 16 15:53:34.900361 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.900172 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" Apr 16 15:53:34.900361 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.900183 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="storage-initializer" Apr 16 15:53:34.900361 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.900190 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="storage-initializer" Apr 16 15:53:34.900361 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.900242 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e853e10-498d-48f7-8313-3e1d301515f8" containerName="kserve-container" Apr 16 15:53:34.904302 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.904282 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:34.906565 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.906540 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tsb5d\"" Apr 16 15:53:34.911540 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:34.911515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:53:35.013607 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.013564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-thlnv\" (UID: \"71b0365a-c4fa-40de-8077-783ad87c2226\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:35.114986 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.114922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-thlnv\" (UID: \"71b0365a-c4fa-40de-8077-783ad87c2226\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:35.115335 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.115315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-thlnv\" (UID: \"71b0365a-c4fa-40de-8077-783ad87c2226\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:35.215206 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.215124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:35.341111 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.341088 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:53:35.343826 ip-10-0-139-55 kubenswrapper[2577]: W0416 15:53:35.343797 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b0365a_c4fa_40de_8077_783ad87c2226.slice/crio-f134b6afdfe82b4ad01116970a97bb6214320cfc2cf8c893dfa0a6352d1d34dd WatchSource:0}: Error finding container f134b6afdfe82b4ad01116970a97bb6214320cfc2cf8c893dfa0a6352d1d34dd: Status 404 returned error can't find the container with id f134b6afdfe82b4ad01116970a97bb6214320cfc2cf8c893dfa0a6352d1d34dd Apr 16 15:53:35.817297 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.817261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerStarted","Data":"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd"} Apr 16 15:53:35.817297 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:35.817299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerStarted","Data":"f134b6afdfe82b4ad01116970a97bb6214320cfc2cf8c893dfa0a6352d1d34dd"} Apr 16 15:53:39.829675 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:39.829643 2577 generic.go:358] "Generic (PLEG): container finished" podID="71b0365a-c4fa-40de-8077-783ad87c2226" containerID="5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd" exitCode=0 Apr 16 15:53:39.830089 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:39.829687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerDied","Data":"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd"} Apr 16 15:53:40.833669 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:40.833631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerStarted","Data":"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52"} Apr 16 15:53:40.834076 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:40.833947 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:53:40.835156 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:40.835130 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:53:40.848832 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:40.848790 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podStartSLOduration=6.848779581 podStartE2EDuration="6.848779581s" podCreationTimestamp="2026-04-16 15:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:53:40.847586851 +0000 UTC m=+3671.738001738" watchObservedRunningTime="2026-04-16 15:53:40.848779581 +0000 UTC m=+3671.739194467" Apr 16 15:53:41.837547 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:41.837461 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:53:51.838480 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:53:51.838429 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:54:01.838380 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:01.838319 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:54:11.838296 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:11.838240 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:54:21.838259 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:21.838212 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:54:31.838188 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:31.838145 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 15:54:41.838954 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:41.838897 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:54:45.019470 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:45.019438 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:54:45.020064 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:45.019686 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" containerID="cri-o://82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52" gracePeriod=30 Apr 16 15:54:48.666260 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:48.666237 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:54:48.767123 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:48.767025 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location\") pod \"71b0365a-c4fa-40de-8077-783ad87c2226\" (UID: \"71b0365a-c4fa-40de-8077-783ad87c2226\") " Apr 16 15:54:48.767364 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:48.767341 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "71b0365a-c4fa-40de-8077-783ad87c2226" (UID: "71b0365a-c4fa-40de-8077-783ad87c2226"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:54:48.867710 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:48.867672 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71b0365a-c4fa-40de-8077-783ad87c2226-kserve-provision-location\") on node \"ip-10-0-139-55.ec2.internal\" DevicePath \"\"" Apr 16 15:54:49.025694 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.025605 2577 generic.go:358] "Generic (PLEG): container finished" podID="71b0365a-c4fa-40de-8077-783ad87c2226" containerID="82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52" exitCode=0 Apr 16 15:54:49.025694 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.025672 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" Apr 16 15:54:49.025887 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.025693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerDied","Data":"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52"} Apr 16 15:54:49.025887 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.025728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv" event={"ID":"71b0365a-c4fa-40de-8077-783ad87c2226","Type":"ContainerDied","Data":"f134b6afdfe82b4ad01116970a97bb6214320cfc2cf8c893dfa0a6352d1d34dd"} Apr 16 15:54:49.025887 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.025743 2577 scope.go:117] "RemoveContainer" containerID="82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52" Apr 16 15:54:49.034008 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.033988 2577 scope.go:117] "RemoveContainer" containerID="5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd" Apr 16 15:54:49.041162 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.041141 2577 scope.go:117] "RemoveContainer" containerID="82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52" Apr 16 15:54:49.041437 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:54:49.041419 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52\": container with ID starting with 82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52 not found: ID does not exist" 
containerID="82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52" Apr 16 15:54:49.041478 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.041445 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52"} err="failed to get container status \"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52\": rpc error: code = NotFound desc = could not find container \"82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52\": container with ID starting with 82017c1235b7eeb15c431f8ed7e1a153eb00d2ea73423563041cd9cd3e8ffa52 not found: ID does not exist" Apr 16 15:54:49.041478 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.041465 2577 scope.go:117] "RemoveContainer" containerID="5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd" Apr 16 15:54:49.041701 ip-10-0-139-55 kubenswrapper[2577]: E0416 15:54:49.041676 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd\": container with ID starting with 5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd not found: ID does not exist" containerID="5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd" Apr 16 15:54:49.041799 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.041705 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd"} err="failed to get container status \"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd\": rpc error: code = NotFound desc = could not find container \"5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd\": container with ID starting with 5af8e8334276d8901eefe099ee0320af0e6a7b093975527d40339a3af3f9bffd not found: ID does not exist" Apr 16 15:54:49.045013 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.044988 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:54:49.047783 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.047762 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-thlnv"] Apr 16 15:54:49.757832 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:54:49.757797 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" path="/var/lib/kubelet/pods/71b0365a-c4fa-40de-8077-783ad87c2226/volumes" Apr 16 15:58:09.315174 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:58:09.315147 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 15:58:09.317309 ip-10-0-139-55 kubenswrapper[2577]: I0416 15:58:09.317289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 16:01:01.235264 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:01.235230 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m29v8_b683b265-ac6f-41fa-a2b7-b634d385846f/global-pull-secret-syncer/0.log" Apr 16 16:01:01.338869 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:01.338831 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-mbdgw_89345827-ec40-475a-b830-4d6c1df489c5/konnectivity-agent/0.log" Apr 16 16:01:01.509552 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:01.509467 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-55.ec2.internal_8bc00e80c830605c41caa74a5fa346b2/haproxy/0.log" Apr 16 16:01:05.217174 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:05.217143 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llxsj_e3f766f2-936e-41b9-8875-5e4b71e56887/node-exporter/0.log" Apr 16 16:01:05.237905 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:05.237883 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llxsj_e3f766f2-936e-41b9-8875-5e4b71e56887/kube-rbac-proxy/0.log" Apr 16 16:01:05.265155 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:05.265133 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-llxsj_e3f766f2-936e-41b9-8875-5e4b71e56887/init-textfile/0.log" Apr 16 16:01:08.289198 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289139 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm"] Apr 16 16:01:08.289557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289406 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="storage-initializer" Apr 16 16:01:08.289557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289418 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="storage-initializer" Apr 16 16:01:08.289557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289425 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" Apr 16 16:01:08.289557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289432 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" Apr 16 16:01:08.289557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.289482 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="71b0365a-c4fa-40de-8077-783ad87c2226" containerName="kserve-container" Apr 16 16:01:08.292135 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.292120 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.294343 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.294318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"kube-root-ca.crt\"" Apr 16 16:01:08.295202 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.295181 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gvdx8\"/\"default-dockercfg-sldsr\"" Apr 16 16:01:08.295202 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.295191 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gvdx8\"/\"openshift-service-ca.crt\"" Apr 16 16:01:08.299069 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.298953 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm"] Apr 16 16:01:08.411001 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.410972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-lib-modules\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.411001 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.411007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trbt\" (UniqueName: \"kubernetes.io/projected/2df317a3-3c75-4f97-b9bc-b3848e46286b-kube-api-access-8trbt\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.411214 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.411041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-sys\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.411214 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.411070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-proc\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.411214 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.411087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-podres\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.511887 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.511857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-lib-modules\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " 
pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.511887 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.511890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8trbt\" (UniqueName: \"kubernetes.io/projected/2df317a3-3c75-4f97-b9bc-b3848e46286b-kube-api-access-8trbt\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.511918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-sys\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.511961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-proc\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.512036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-sys\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.512066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-lib-modules\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.512081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-proc\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512131 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.512104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-podres\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.512330 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.512189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2df317a3-3c75-4f97-b9bc-b3848e46286b-podres\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.519465 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.519435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8trbt\" (UniqueName: \"kubernetes.io/projected/2df317a3-3c75-4f97-b9bc-b3848e46286b-kube-api-access-8trbt\") pod \"perf-node-gather-daemonset-vfzjm\" (UID: \"2df317a3-3c75-4f97-b9bc-b3848e46286b\") " pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.602501 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.602412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:08.719216 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.719183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm"] Apr 16 16:01:08.721881 ip-10-0-139-55 kubenswrapper[2577]: W0416 16:01:08.721854 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2df317a3_3c75_4f97_b9bc_b3848e46286b.slice/crio-64bd3acdbb61c0c19ed0f9184554949fee83f1608181f278a0623eb3032cb1f3 WatchSource:0}: Error finding container 64bd3acdbb61c0c19ed0f9184554949fee83f1608181f278a0623eb3032cb1f3: Status 404 returned error can't find the container with id 64bd3acdbb61c0c19ed0f9184554949fee83f1608181f278a0623eb3032cb1f3 Apr 16 16:01:08.723524 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.723505 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:01:08.911392 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.911317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lbn5l_7b10a07b-1617-44d5-b186-97458e04e5e0/dns/0.log" Apr 16 16:01:08.928601 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:08.928562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lbn5l_7b10a07b-1617-44d5-b186-97458e04e5e0/kube-rbac-proxy/0.log" Apr 16 16:01:09.011996 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.011964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nhw5h_a7987cab-cae1-4625-84c9-b135a5b3b6e7/dns-node-resolver/0.log" Apr 16 16:01:09.041268 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.041233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" event={"ID":"2df317a3-3c75-4f97-b9bc-b3848e46286b","Type":"ContainerStarted","Data":"283d39edb8885a608d0074e6e301d1c7ac528fc7dfd66842a0e274f7804f43b4"} Apr 16 16:01:09.041436 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.041270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" event={"ID":"2df317a3-3c75-4f97-b9bc-b3848e46286b","Type":"ContainerStarted","Data":"64bd3acdbb61c0c19ed0f9184554949fee83f1608181f278a0623eb3032cb1f3"} Apr 16 16:01:09.041436 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.041382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:09.056039 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.055998 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" podStartSLOduration=1.055984421 podStartE2EDuration="1.055984421s" podCreationTimestamp="2026-04-16 16:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:09.054548392 
+0000 UTC m=+4119.944963279" watchObservedRunningTime="2026-04-16 16:01:09.055984421 +0000 UTC m=+4119.946399307" Apr 16 16:01:09.428883 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:09.428840 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dx2h6_4f1d6917-4b83-407b-a119-822474f666a8/node-ca/0.log" Apr 16 16:01:10.447066 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:10.447032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nqtsn_df57e009-9151-4b90-8c22-bedd6e86b057/serve-healthcheck-canary/0.log" Apr 16 16:01:10.928944 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:10.928898 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xbx5f_e012d2f1-e273-4443-8305-4cca55b420d4/kube-rbac-proxy/0.log" Apr 16 16:01:10.945887 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:10.945861 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xbx5f_e012d2f1-e273-4443-8305-4cca55b420d4/exporter/0.log" Apr 16 16:01:10.963500 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:10.963477 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xbx5f_e012d2f1-e273-4443-8305-4cca55b420d4/extractor/0.log" Apr 16 16:01:12.854963 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:12.854902 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-wdv5j_54a8c02a-ce22-4a1e-bbe6-5a08c52ececd/manager/0.log" Apr 16 16:01:13.268401 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:13.268364 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-tlkcq_abe97d4d-ac00-4e82-9640-3778e78c1453/manager/0.log" Apr 16 16:01:13.374086 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:13.374053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-8z479_ce9f8d49-8f9b-4bca-b103-95e9e0513091/seaweedfs-tls-custom/0.log" Apr 16 16:01:15.054179 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:15.054151 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gvdx8/perf-node-gather-daemonset-vfzjm" Apr 16 16:01:18.929476 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:18.929388 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/kube-multus-additional-cni-plugins/0.log" Apr 16 16:01:18.950172 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:18.950142 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/egress-router-binary-copy/0.log" Apr 16 16:01:18.999331 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:18.999299 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/cni-plugins/0.log" Apr 16 16:01:19.018470 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.018439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/bond-cni-plugin/0.log" Apr 16 16:01:19.044667 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.044643 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/routeoverride-cni/0.log" Apr 16 16:01:19.072342 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.072317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/whereabouts-cni-bincopy/0.log" Apr 16 16:01:19.089628 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.089597 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gdnq8_01ced172-b8c5-4042-97b6-a8813deb4542/whereabouts-cni/0.log" Apr 16 16:01:19.174500 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.174475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zsj2z_ba49942b-2640-414e-a0c5-df74f0eda02d/kube-multus/0.log" Apr 16 16:01:19.239699 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.239671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rlnq_731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e/network-metrics-daemon/0.log" Apr 16 16:01:19.257700 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.257675 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7rlnq_731df22b-c6b3-4d5d-ae5c-60a8d6d4d99e/kube-rbac-proxy/0.log" Apr 16 16:01:19.972305 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.972273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-controller/0.log" Apr 16 16:01:19.986425 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:19.986395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/0.log" Apr 16 16:01:20.023791 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.023753 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovn-acl-logging/1.log" Apr 16 16:01:20.055557 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.055523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/kube-rbac-proxy-node/0.log" Apr 16 16:01:20.082815 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.082773 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 16:01:20.097614 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.097587 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/northd/0.log" Apr 16 16:01:20.114671 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.114647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/nbdb/0.log" Apr 16 16:01:20.132877 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.132852 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/sbdb/0.log" Apr 16 16:01:20.330809 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:20.330774 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4rq4s_d696eec3-0a3d-418d-8aa7-05a9c9e9ef26/ovnkube-controller/0.log" Apr 16 16:01:21.984905 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:21.984849 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tt7s8_367d90fd-ba99-4dee-9cbe-ed7ac607159d/network-check-target-container/0.log" Apr 16 16:01:22.863068 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:22.863034 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-m9pmx_4974791a-9870-4f06-9c9e-14a97a8f16f7/iptables-alerter/0.log" Apr 16 16:01:23.495179 ip-10-0-139-55 kubenswrapper[2577]: I0416 16:01:23.495150 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7s8dl_95faf3e2-9a40-4a30-bcf9-eb791cd16272/tuned/0.log"